People are bad at taking over from autonomous cars
Kelsey D. Atherton
10:50 AM, Jan 27, 2017
Professor Neville Stanton in a University of Southampton driving simulator. Image courtesy of University of Southampton.

Humans trust robots with their lives, and they probably shouldn't. A new study published today shows that people aren't great at taking control back from autonomous vehicles, or at handing control over when they need to. Autopilot for cars promises to save lives, but those promises will mean little if designers don't account for human error from the start.

Published in the journal Human Factors, the study “Takeover Time in Highly Automated Vehicles: Noncritical Transitions to and From Manual Control” examines what happens when humans need to take control of a car that's driving itself.

The authors observed 26 men and women (ages 20 to 52) in simulated driving at 70 mph, with and without a nondriving secondary (i.e., distracting) task, and recorded response times as the drivers took over or relinquished control of the automated system. A takeover request was issued at random intervals ranging from 30 to 45 seconds during normal motorway-driving conditions. The authors found that drivers engaged in a secondary task prior to a control transition took longer to respond, posing a safety hazard.

Takeover is a pretty key part of how autopilots work in cars right now. Last May, a Tesla Model S driver died when both he and his car failed to spot a tractor-trailer crossing the highway. A National Highway Traffic Safety Administration investigation into the crash did not find sufficient grounds for a product recall. It did note, however, that despite the clear limitations of the autopilot feature, “other jurisdictions have raised concerns about Tesla's use of the name 'Autopilot.'”

Autopilots have been legally tricky from the start. There's a case on autopilot liability from 1947, when a Navy pilot crashed into a small plane. In that case, Ferguson v. Bombardier Services Corp, the court ruled that despite the autopilot, the human flying the plane had a responsibility to monitor the craft, and he was ultimately held responsible.

I learned about Ferguson v. Bombardier Services Corp from Robots in American Law, a study published by University of Washington assistant law professor Ryan Calo. I spoke to Calo about the latest study.

“The pilot got sued by the survivors of the victims,” Calo said today. “That's one very clear-cut case where a person was bad at monitoring and taking control from a rudimentary autopilot, and it resulted in an accident, for which a human got blamed.”

There is a body of work on handoff, the specific moment when control of a vehicle transfers from autopilot to human or the reverse. In 2013, robotics researcher Missy Cummings noted that “at precisely the time when the automation needs assistance, the operator could not provide it and may actually have made the situation worse.”

“It's a known problem,” says Calo. “It's part of the reason that human-robot interactions need to be part of what we fund.”

As for the study at hand, the researchers found that in non-emergency situations, it can take anywhere from 2 to 26 seconds for a human driver to take back control from an autopilot. That's a huge range, and the authors recommend that, rather than aiming at an average response time, autopilot designers take into account how widely human reaction times vary, and design from there.
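To see why designing around the average falls short, here's a minimal back-of-the-envelope sketch in Python. The 2-to-26-second range comes from the study, but the sample times below are illustrative values, not the study's raw data: a takeover budget sized to the mean response covers only the faster drivers, while one sized to the full observed range covers everyone.

    # Illustrative sketch: a takeover budget set at the *average* response
    # time leaves the slowest drivers unprotected. Sample times are made up
    # to span the study's reported 2-26 second range; they are not real data.

    takeover_times = [2.1, 3.4, 4.0, 5.2, 6.8, 7.5, 9.1, 12.3, 18.6, 25.7]  # seconds

    mean_time = sum(takeover_times) / len(takeover_times)
    worst_case = max(takeover_times)

    # How many of these drivers would respond within a mean-sized budget?
    covered = sum(t <= mean_time for t in takeover_times)
    print(f"mean response: {mean_time:.1f}s, covers {covered}/{len(takeover_times)} drivers")

    # A budget sized to the slowest observed response covers all of them.
    print(f"worst observed response: {worst_case:.1f}s")

In this toy sample, a budget pegged to the mean of roughly 9.5 seconds covers only 7 of the 10 drivers; the slowest would need almost three times as long.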

It might be hard to design autopilots in a way that accounts for every human, but it's probably a lot easier to design autopilots with humans in mind than to try to standardize the behavior of every human driver.
