A ridesharing service like Uber or Lyft seems like it should help stem drunk driving by offering an easy, cheap option for tipsy customers to get home at the end of the night. Uber even claims on its website that ride-hailing options like it “are helping to curb drunk driving.” But new research shows that Uber's presence in a city leads to a decline in alcohol-related crashes only inconsistently, and the question remains far from settled.
Elon Musk revealed more about his plans for the Boring Company on Friday, showing a video that features a utopian vision in which sleds whisk cars through buried tunnels at speeds of around 124 miles per hour. Designed to alleviate traffic problems in cities like Los Angeles, the concept includes vertical entrance and exit points to a subterranean network. But while the idea seems sexy, experts are skeptical of its feasibility, and of whether it's even the right approach to solving the gridlock woes of modern metropolises.
Doug Hines, CEO of a software company in Decatur, GA, has logged hundreds of miles in his Tesla. In addition to the obvious perks of owning an all-electric car (little maintenance, no exhaust, and the sheer fun of driving it), there was one he hadn't expected: the unfailing generosity of strangers willing to offer up their home chargers, often for free.
Elon Musk is a man who makes the future happen. He's building solar panels, and electric cars that can run on the clean energy they create. He's helping humanity become an interplanetary species by making spaceflight dramatically cheaper. But his idea to build a tunnel to avoid traffic congestion in Los Angeles seems uncharacteristically outdated.
Humans trust robots with their lives, and they probably shouldn't. A new study published today shows that people aren't great at taking control back from autonomous vehicles, or at handing control over when needed. Autopilot for cars promises to save lives, but those promises will mean little if the systems can't account for human error from the start.
“The Automatic Emergency Braking (AEB) or Autopilot systems may not function as designed, increasing the risk of a crash.” It's a simple sentence, delivered with the calm finality of bureaucratic certainty. It is a literal post-mortem, the bottom-line-up-front from the National Highway Traffic Safety Administration's investigation into the first fatal crash of an autonomous car—one made by Tesla Motors. The investigation into the crash closed today, and it will likely cast a long shadow over the future of self-driving cars, which have long been heralded as potentially life-saving devices.