Self-driving cars are racing toward becoming an everyday reality, with Congress considering recently introduced bills that would allow the US National Highway Traffic Safety Administration (NHTSA) to oversee the deployment of autonomous vehicles on public roads, pre-empting individual states from issuing local laws. Researchers at Children’s Hospital of Philadelphia are both applauding the cool technology … and urging the study of the associated human factors to ensure the safe deployment of these vehicles.
“Research is just emerging in this field,” said Helen Loeb, PhD, senior scientist and biomedical research engineer at the Center for Injury Research and Prevention (CIRP). “Tech is way ahead of everyone. Autonomous cars are out there on the road, but drivers pretty much have no training and a lack of legislation as well. There are many things that you would think should be at least studied, and they’re not. That is our focus: Understanding how to deploy this new technology safely.”
Questions of safety are increasingly important for both teens and adults. Dr. Loeb, whose background combines aerospace engineering and robotics, is actively working to address them in both populations: What training or legislation can help all drivers safely use autonomous vehicles? What safety issues have not yet been considered? Can we trust teen drivers, with their limited driving experience, to supervise these systems?
Dr. Loeb and her team of researchers (including Aditya Belwadi, PhD, research scientist at CHOP; Chelsea Ward, Driving Simulator Core Coordinator; and engineering students at the University of Pennsylvania) recently completed a study that leveraged the CIRP driving simulator to investigate self-driving technology. For the study’s first component, eight teenagers and four adults answered questions about their perception of self-driving, including what they knew about the technology, whether they trusted it, how often they drove, and more.
Next, the 12 participants buckled their seat belts to engage with a simulation that would assess their reflexes when presented with an emergency. Drivers could either manually control the vehicle or engage the autopilot. On autopilot, they could take their hands off the wheel, take their feet off the pedals, and sit back to watch the road, just as in the self-driving mode found in Tesla cars. The simulator’s controls were designed to replicate those of Tesla vehicles, with drivers pulling a lever to engage or deactivate the autopilot.
“We wanted to study the ability of young people and more mature drivers to focus, notice dangerous situations, and take over if something went wrong with the autopilot,” Dr. Loeb said.
While driving, the software presented the drivers with two emergency scenarios. In one, the drivers found themselves on a highway with their self-driving car veering toward an unplanned exit; the research team measured how quickly each driver responded to the malfunction by switching back to manual control and steering back onto the road. In the other, the participants drove on a two-way road and, at some point, the autopilot swerved toward an oncoming car.
“Our scenarios were based on real-life situations we observed on YouTube videos,” Dr. Loeb said. “We programmed them in the simulator so we could see who is able to avoid the crash and who is not.”
The data still need to be analyzed, but Dr. Loeb believes that the crash rate for both scenarios is right where she’d like it to be: Crash risk was high, but in every case the crash was avoidable through a swift evasive maneuver.
Dr. Loeb is now writing up the results, which will be published within the year. This research will hopefully help inform key questions, such as where hands should be placed on the steering wheel, how effective warning systems are, and whether drivers become complacent with the technology, that could assist in the development of future legislation about self-driving cars. The research may also provide manufacturers with insight into how alerts should be designed to work most effectively. As part of this work, Dr. Loeb will study various types of alerts that go off when the autopilot is about to malfunction and the human should take over, including visual, audio, and haptic alerts.
“Our focus here is Human Factors: We want to understand how people interact with the technology, and maybe inform the development of training programs,” Dr. Loeb said. “We train people to drive, but anybody can go out, buy a car with self-driving features, and try it out for the first time on the highway. It’s going to be a little disorienting. We need to think about this carefully.”
For now, Dr. Loeb says the next step, unfolding in the next year or so, will involve enrolling a larger study cohort of 36 people or more, in order to get a clearer picture of how drivers perceive and handle self-driving technology.