The autonomous car will solve many human problems. It will help reduce emissions and prevent accidents, saving lives on both counts. It will also empower aging populations and lower the risk of drunk driving.
But to thrive, autonomous cars may need a less hostile environment.
Humans and autonomous cars approach the rules of the road differently. The accidents involving autonomous cars have made this plain: to date, every one has been the fault of another human driver. One could argue that had these autonomous cars been programmed to drive less politely (rolling through stops instead of braking hard, and cutting corners on turns) those rear-end collisions wouldn't have happened. These observations have led the designers of these systems to consider how to accommodate the dominant population of human drivers, who generally drive contrary to Google's more politely programmed system.
So, how should these cars be programmed to navigate the streets?
“It’s a constant debate inside our group,” Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, told MSN. “And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”
This has led programmers to debate whether they should make their cars more human by allowing them to commit the minor infractions ordinary human drivers do. Of course, that raises a dilemma: what happens if such reprogramming causes an accident?
Author and computer scientist Jerry Kaplan says, “We’re going to need new kinds of laws that deal with the consequences of well-intentioned autonomous actions that robots take.”
The environment the autonomous car lives in is a hostile one. Rajkumar told MSN he took the lab's autonomous Caddy out for a test drive. All was going well until it had to "merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon." The car's cameras and sensors registered the traffic whizzing past, but the system didn't trust the other (human) drivers to let it in. The human supervisor had to take over.
The autonomous car may not be able to thrive in the environment it was born into. Reaching full autonomy depends on an ecosystem that supports it, and right now that ecosystem doesn't exist.
That leaves us with the following question: Who should change — the law-abiding computer drivers or the law-ignoring human ones?
Natalie has been writing professionally for about 6 years. After graduating from Ithaca College with a degree in Feature Writing, she snagged a job at PCMag.com where she had the opportunity to review all the latest consumer gadgets. Since then she has become a writer for hire, freelancing for various websites. In her spare time, you may find her riding her motorcycle, reading YA novels, hiking, or playing video games. Follow her on Twitter: @nat_schumaker
Photo Credit: GLENN CHAPMAN / Getty Staff