Self-driving cars are supposed to be safer than human-driven ones. Driver error causes roughly 90% of car accidents, of which there are over 2 million a year. So, in theory, take the driver out of the equation, and 90% of those accidents would disappear. But that still leaves the 10%, something the first pedestrian fatality involving a self-driving car illustrated all too starkly this week. The victim, out walking in the early morning hours in Tempe, AZ, was struck by one of Uber’s experimental self-driving cars. And underlying the tragedy is a larger question: Who’s to blame? Nobody knows, and nobody really wants to accept responsibility.
In most cases, car accidents come down to what lawyers call “traditional negligence”: texting while driving, blowing through stop lights, pulling a left out of a right turn lane, and all the other stupid things people do that leave you screaming obscenities at them. But in the rare cases where the brakes fail, the accelerator sticks, or the headlights short out, that’s product liability; it’s the car’s fault, in other words. The advantage here is that one or the other is relatively easy to prove, and the proliferation of tools like dashcams, diagnostic computers, and crash data recorders has made it easier still. But once the car is doing the driving, that distinction becomes much harder to draw. If someone’s behind the wheel but not controlling the vehicle, is it the driver’s fault for not noticing what’s happening and taking control? Or is it the car’s fault for not doing its job?
Nobody really knows the answer. Some carmakers, like Volvo, have stepped up to accept liability for their cars, and insurers believe the burden of responsibility will ultimately land on the manufacturers. But the question will likely fall to politicians and the courts, and lawmakers, at least, have barely weighed in: a 2017 law that passed Congress unanimously says only that drivers might still be on the hook, even if it’s a product failure. And self-driving car companies insist, every time, that it’s not their technology at fault, it’s the nut behind the wheel.
This isn’t new, of course. Auto companies spent decades insisting that the driver, not the design of their cars, was to blame, even as car crashes became the number one cause of accidental death in America. Safety features like seat belts and airbags, which we take for granted now, were bitterly fought over every step of the way.
This is a glaring problem, because self-driving cars are not nearly as good at their jobs as the hype claims. They are essentially robots, and robots don’t deal well with rapidly changing environments full of unpredictable variables. In other words, the road is probably the worst place for a robot, and sure enough, dealing with humans, or even something as simple as graffiti on a sign, has been a struggle for them. For all the tech industry’s insistence that this first fatality is just an outlier and that self-driving cars are still much safer, the simple fact is that this isn’t going to be the last fatality, and the families of those lost are going to deserve answers that self-driving car companies may not have.
The auto industry is most likely to solve this problem by slowly adding robotic features while keeping the driver at the wheel. You can see it already in driver-assistance features like automatic braking, lane assist, and automatic parking. Cadillac has even introduced a “super cruise control” that’s likely the future of highway driving, where travel on large highways becomes automated and the more delicate problems are left to the human driver.
In the end, that might be the only solution: even as cars get better at driving themselves, the buck still needs to stop with a human being. Leaving aside whether the cars themselves are ready to wander the road with no human controlling them, it’s likely we, as a society, aren’t ready to give up the wheel just yet. And perhaps that’s for the best. One thing a robot will never be able to do is make amends, and in some cases, that’s what people need more than anything.