Doug Hines, CEO of a software company in Decatur, GA, has logged hundreds of miles in his Tesla. In addition to the obvious perks of owning an all-electric car—little maintenance, no exhaust, and sheer fun behind the wheel—there was one he hadn't expected: the unfailing generosity of people willing to offer up their home chargers to a stranger, often for free.
Elon Musk is a man who makes the future happen. He's building solar panels and electric cars that can run on the clean energy those panels create. He's helping humanity become an interplanetary species by making spaceflight dramatically cheaper. But his idea to build a tunnel to avoid traffic congestion in Los Angeles seems uncharacteristically outdated.
Humans trust robots with their lives, and they probably shouldn't. A new study published today shows that people aren't great at taking control back from autonomous vehicles, or handing it off when need be. Autopilot for cars promises to save lives, but those promises will mean little if such systems can't account for human error from the start.
“The Automatic Emergency Braking (AEB) or Autopilot systems may not function as designed, increasing the risk of a crash.” It's a simple sentence, delivered with the calm finality of bureaucratic certainty. It is a literal post-mortem, the bottom-line-up-front from the National Highway Traffic Safety Administration's investigation into the first fatal crash of an autonomous car—one made by Tesla Motors. The investigation into the crash closed today, and it will likely cast a long shadow over the future of self-driving cars, which have long been heralded as potentially life-saving devices.
Human drivers are imperfect pilots, placed in command of a couple thousand pounds of fast-moving metal. We're just not equipped for the task: eyes that evolved pointing forward (to better navigate our ancestral home in the trees) give us a narrow field of vision. Even with well-positioned mirrors, a human driving a car through three-dimensional space is bound to have blind spots. But what if the car itself didn't? What if cars could sense where they were and communicate that information to other cars? Suddenly, a dense road of imperfectly piloted vehicles would become a smart, safe network, with the cars themselves constantly pinpointing one another in space and time.
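The vehicle-to-vehicle idea described above can be sketched in miniature. This is a hypothetical toy model, not any real V2V protocol; the class names, message format, and distance threshold are all invented for illustration:

```python
import math

# Toy model of the idea above: each car "broadcasts" its position,
# and every other car checks whether a neighbor is dangerously close.
# All names and thresholds here are invented for illustration.

SAFE_DISTANCE_M = 10.0  # hypothetical alert threshold, in meters

class Car:
    def __init__(self, car_id, x, y):
        self.car_id = car_id
        self.x = x
        self.y = y

    def broadcast(self):
        """The message a real V2V system would send over the air."""
        return {"id": self.car_id, "x": self.x, "y": self.y}

    def check_neighbors(self, messages):
        """Return IDs of other cars closer than the safe distance."""
        too_close = []
        for msg in messages:
            if msg["id"] == self.car_id:
                continue  # skip our own broadcast
            dist = math.hypot(msg["x"] - self.x, msg["y"] - self.y)
            if dist < SAFE_DISTANCE_M:
                too_close.append(msg["id"])
        return too_close

cars = [Car("A", 0, 0), Car("B", 3, 4), Car("C", 50, 50)]
messages = [c.broadcast() for c in cars]
print(cars[0].check_neighbors(messages))  # car B is 5 m from car A
```

In a real deployment the broadcasts would go out over dedicated radio (and would carry speed, heading, and timestamps), but the core logic is the same: every car knows where every nearby car is, with no mirrors involved.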
Long before we have fully autonomous cars – ones in which we can kick back and catch up on Westworld while a faithful robot delivers us safely to work – we'll live in a world of shared responsibility. Tesla's Autopilot and other advanced semi-autonomous systems on the way require the driver to stay alert, ready to step in and take the wheel should a situation demand it.