While flying cars may still be absent from our city streets, the future of driverless cars gets one step closer each day.
It seems to be a bit of a rocky road, however, as accidents involving Google’s driverless cars have been in the headlines recently.
Google now posts monthly reports on its autonomous vehicle accidents. The company claims that in the six years since starting the driverless car project, its cars have been involved in only 16 minor accidents over 2 million miles of autonomous and manual driving. Google doesn’t blame its cars, though: It blames humans, and it now says it is “teaching their cars to behave like humans.”
Accidents Will Happen
Accidents have happened every year since the project’s beginning. Each year prior to 2015 saw no more than two accidents, but 2015 alone had five. Examples of Google crashes over the years include:
- 2010: A Google Prius vehicle got rear-ended while stopped at a traffic light in manual mode.
- 2011: A Google Prius rear-ended a car that was sitting in traffic while in manual mode – the driver was running an errand and was not testing the vehicle.
- 2012: A Google Prius stopped at a traffic light was rear-ended by another car.
- 2013: A vehicle veered into the side of a Google Lexus in autonomous mode while on the highway.
- 2014: The rear bumper of a Google Lexus was struck by another vehicle while the Lexus was waiting to make a right turn.
- 2015: A Google Lexus was rear-ended, causing the first human injuries in the testing – the three Google employees on board complained of minor whiplash.
While Google may be proud of its relatively small accident record and the fact that none of the crashes were caused by its technology, accidents are still accidents. In the most recent one, people actually got hurt. Human drivers aren’t going anywhere anytime soon, so what will be done to prevent further incidents?
The answer may worry you. As Google indicated above, it must program its cars to accommodate the organic, sometimes aggressive tendencies of human drivers. Navigating open roads by algorithm alone was the easy part – it’s now up to Google to make its autonomous vehicles a bit less robotic.
Matt Richtel and Conor Dougherty of the New York Times brought up another point that Google will have to consider: “On a recent outing with New York Times journalists, the Google driverless car took two evasive maneuvers that simultaneously displayed how the car errs on the cautious side, but also how jarring that experience can be.”
Who’s at Fault?
One thing Google has failed to address is who is to blame if a human is injured in an accident involving an autonomous vehicle. A survey conducted by the Mirror suggests that maybe the cars are to blame: “More than one person in four – 26 percent – would blame the car maker if a self-driving car was involved in a collision, while nearly one in five – 18 percent – think the responsibility lies with the person in the driving seat.”
The publication also noted, however, that driving laws still apply to autonomous vehicles. You need insurance and a license to drive one. You cannot drive one drunk. And it’s you who gets penalized if the car is speeding.
Some manufacturers, however, are taking a more cautious route. Volvo, for example, was one of the first companies to accept full liability for accidents involving its autonomous cars – due in part to how widely state laws vary in handling autonomous vehicles.
Autonomous vehicle technology is still in its early stages, and although it is a hot subject of debate right now, hopefully someday we can all zip to the store without having to lift a finger. After all, these cars are projected not only to make driving easier, but also to save lives, cause fewer crashes, and help the economy.
Would you climb into one?