Researchers Release Unique Dataset Looking at Distracted Drivers
Distracted driving – whether texting or simple absent-mindedness – claims thousands of lives a year. Researchers from the University of Houston and the Texas A&M Transportation Institute have produced an extensive dataset examining how drivers react to different types of distractions, part of an effort to devise strategies for making driving safer.
In a paper published Aug. 15 in the journal Scientific Data, the researchers make the dataset publicly available for the first time and describe how they collected the information.
The study was conducted on a driving simulator with 68 volunteers, all of whom had a valid driver’s license and normal or corrected-to-normal vision. Drivers were tracked with both thermal and visual cameras, along with palm sensors, sensors to measure heart rate and breathing rate, and an eye tracking system.
Ioannis Pavlidis, Eckhard Pfeiffer Professor and director of the Computational Physiology Lab at UH, said the study is the first to tackle all three types of distracting elements – sensorimotor, such as texting; cognitive, such as being absorbed in thought; and emotional, such as driving while upset.
Texting, the researchers found, led to far more dangerous driving, while a “sixth sense” appeared to protect those suffering emotional upset or absent-mindedness. Texting interfered with that sixth sense, letting drivers drift out of their traffic lanes. The researchers reported this result in the journal Scientific Reports last year, using a subset of the data they collected.
Additional investigation showed that “eye tracking and breathing rate proved useful metrics for measuring the impact of texting while driving,” Pavlidis said. “But that wasn’t helpful in cases of emotional or cognitive distractions.” However, he said the researchers found heart rate signals captured via wearable sensors and perinasal perspiration captured via miniature thermal imagers were able to track all forms of distraction – a result that is reported in the current Scientific Data paper.
That and other findings provide the groundwork for future safety systems, said Robert Wunderlich, director of the Center for Transportation Safety at the Texas A&M Transportation Institute. Given the widespread use of smartwatches capable of measuring heart rate, he said, the result opens the way for universal sensing of all forms of distraction at its consequential source – the driver’s sympathetic nervous system.
The potential market for interventions is huge. According to the National Highway Traffic Safety Administration, 3,477 people were killed and 391,000 were injured in motor vehicle crashes involving distracted drivers in 2015. Texas Gov. Greg Abbott signed a law banning texting while driving earlier this summer, leaving just three states that have not banned the practice.
The experiment worked like this: Volunteers drove the same segment of highway four times in a high-fidelity driving simulator – once with no distraction and once each with cognitive, emotional and sensorimotor distraction. They were monitored via standoff and wearable sensors, which recorded perspiration, heart rate, breathing rate, gaze and facial expressions to capture the drivers’ state as they were overloaded by multitasking.
At the same time, the simulator’s computer recorded driving performance variables including speed, acceleration, braking force, steering angle and lane position.
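To give a sense of how such per-drive recordings might be explored, here is a minimal Python sketch. The file names and column names (heart_rate, lane_position and so on) are illustrative assumptions, not the actual schema of the released dataset.

```python
# Minimal sketch of examining one drive's recordings.
# File names and column names are illustrative assumptions,
# not the actual schema of the released dataset.
import pandas as pd

# Hypothetical per-drive files: one for physiological signals, one for
# simulator-recorded driving performance, each with a "time" column.
physio = pd.read_csv("subject_01_texting_physiology.csv")   # e.g., heart_rate, breathing_rate, perinasal_perspiration
driving = pd.read_csv("subject_01_texting_driving.csv")     # e.g., speed, acceleration, brake_force, steering_angle, lane_position

# Align the two streams on their timestamps (nearest-sample merge).
combined = pd.merge_asof(
    physio.sort_values("time"),
    driving.sort_values("time"),
    on="time",
)

# A simple summary: lane-position variability is often used as a proxy
# for how much the driver drifted during a distracted drive.
print("Lane position std. dev.:", combined["lane_position"].std())
print("Mean heart rate:", combined["heart_rate"].mean())
```

Comparing the same summaries across a volunteer’s baseline and distracted drives is one straightforward way to put the dataset’s driving-performance and physiological channels side by side.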
In addition to Pavlidis, authors on the paper include Panagiotis Tsiamyrtzis of the Athens University of Economics and Business, who spearheaded the data analysis and validation; Salah Taamneh and Ashik Khatri of UH; Malcolm Dcosta of Elizabeth City State University; Pradeep Buddharaju of the University of Houston-Clear Lake; Michael Manser and Robert Wunderlich of the Texas A&M Transportation Institute; and Thomas Ferris of Texas A&M University.
Along with the publication of the paper in Scientific Data, the researchers released the full dataset on the Open Science Framework (OSF) data repository.
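As a rough illustration, the project’s file listing can be retrieved programmatically through OSF’s public v2 API. The sketch below assumes only the project identifier from the link at the end of this article and the standard OSF pagination scheme.

```python
# Sketch of listing the files stored with the OSF project
# (https://osf.io/c42cn/) via the public OSF v2 API.
import requests

url = "https://api.osf.io/v2/nodes/c42cn/files/osfstorage/"
while url:
    page = requests.get(url, timeout=30).json()
    for item in page["data"]:
        attrs = item["attributes"]
        print(attrs["name"], attrs.get("size"))
    # Follow pagination links until the listing is exhausted.
    url = page["links"].get("next")
```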
This experiment represents an emerging form of multimodal design, where an abundance of highly quantitative variables is measured continuously, providing a 360-degree view of the studied conditions. Pavlidis noted that these designs are now possible because of technological advances in wearable and imaging sensors, as well as the emergence of robust computational algorithms.
The deluge of data such multimodal experiments produce requires sophisticated curation and complete openness, he said, not only for purposes of reproducibility but also as a means to investigate the dataset’s full potential.
- Jeannie Kever, University Media Relations
Citation: Taamneh, S. et al. A multimodal dataset for various forms of distracted driving. Sci. Data 4:170110 doi: 10.1038/sdata.2017.110 (2017).
Link: http://dx.doi.org/10.1038/sdata.2017.110
The full dataset is available at https://osf.io/c42cn/.