In a few decades, human-operated consumer vehicles may be a thing of the past. Planes will be self-directed, trains and even farm equipment will be automated, and we’ll all enjoy the peace of a self-driving car. It’s a pretty appealing concept. Is a long road trip on the horizon? Just jump in the back seat and let your Tesla’s A.I. take the wheel.
Nissan’s CEO recently stated that he expects the first self-driving vehicles to be on traditional roads by 2020. Google, meanwhile, already operates many automated cars on its campus.
The goal is to create a safer, saner experience on America’s roadways. The human race is distracted and angry; it’s about time we hand over the reins to a neutral, respectful autopilot program.
However, the rise of self-driving cars will open up a brand-new controversy. Namely, what happens when a robot is involved in an accident? We’ve spent decades drawing up the specifics for personal injury law in vehicular collisions.
We’ve all danced with insurance companies over fault, responsibility, and injury severity. But how does that change when a computer is behind the wheel? It’s a complicated question, and at this point, it’s only a matter of time before we confront it head-on.
The vast majority of accidents involving self-driving cars are not the fault of the A.I. itself. The automated vehicle follows the rules of the road, and some fallible human rear-ends it or merges into its lane. However, that’s not always the case. Last Valentine’s Day, a Google self-driving car slammed into a bus while trying to negotiate a right turn.
It wasn’t a glitch. The car’s software made an error in judgment that caused a collision – an implicit confirmation that the self-driving era will not be utterly free of car accidents. Occasionally, there will still be a need to get into the legal weeds.
There’s nothing fun about being in an accident. Still, there is evidence that getting involved in a collision with a self-driving car might be more beneficial to the victim than a conventional personal injury case. Bryant Walker Smith, a law professor at the University of South Carolina, has taught a course on the burgeoning sector of automated vehicles and told the Associated Press that “assuming you’re not dead, you’re in a much better position than if you’d been hit by an ordinary, human-driven vehicle.”
This is because self-driving cars will shift the legal nexus away from the traditional questions surrounding driver negligence. Instead, these cases will be directed toward product liability. “Whereas today’s crash liability regime is based largely on the liability of individual drivers under negligence, tomorrow’s may be premised on the liability of manufacturers. So this would be under product liability broadly,” he writes in a paper called Automated Driving and Product Liability. “This shift will also create new issues for the judges and juries evaluating the resulting crash claims—as well as for the lawyers negotiating to avoid such trials.”
That’s advantageous for a legal client because you’ll be dealing directly with a corporate entity. After all, it’s difficult to prove negligence in a personal injury case. Many of the specifics are left to unknowable conjecture. How can you prove that a driver took his eyes off the road at an inopportune moment? Do we know for sure he was looking at a cell phone?
However, this becomes a lot easier when you’re dealing with the cold, hard calculations of a self-driving vehicle. If the data shows that a car on autopilot miscalculated an off-ramp merge, you quickly eliminate any guesswork. There’s also speculation that the manufacturers of self-driving vehicles will want to err on the side of caution, handing out generous settlements to keep the industry looking pristine.
Volvo, Google, and Mercedes-Benz have all already stated that they’ll take responsibility for any accidents involving their vehicles. Going to court over the specifics of a computer’s decision-making doesn’t look good from the outside. We believe there is little chance that liability for a self-driving car accident will fall on the automated vehicle’s occupants. But let’s say an accident happens because a self-driving car’s tire blows out and the car swerves into traffic, and an investigation reveals that the owner neglected straightforward tire maintenance. Who pays?
In that case, is the owner at fault? If I, a regular driver, failed to maintain the conditions necessary to keep my vehicle safe, I would be liable. Why should it be any different for an automated vehicle? But again, we know that discretion will be left to a judge.
We know this is all speculative. And we know that despite all the money being poured into the self-driving industry, the legal landscape remains a Wild West. Eventually, legislators and courts will establish the statutes and precedents governing these cases. But your best bet till then?
Hire a good lawyer and seek fair compensation. At-fault drivers and companies should pay car accident victims the maximum compensation, and you can start by receiving a free legal consultation with a local traffic accident lawyer.
You can learn more by contacting the Los Angeles personal injury attorneys at Ehline Law Firm today by dialing (213) 596-9642.
Michael Ehline is an inactive U.S. Marine and world-famous legal historian. Michael helped draft the Cruise Ship Safety Act and has won some of U.S. history’s largest motorcycle accident settlements. Together with his legal team, Michael and the Ehline Law Firm collect damages on behalf of clients. We pride ourselves on being available to answer your most pressing and difficult questions 24/7. We are proud sponsors of the Paul Ehline Memorial Motorcycle Ride and a Service Disabled Veteran Operated Business (SDVOB). We are ready to fight.
Go here for More Verdicts and Settlements.
Downtown Los Angeles Office
633 West 5th Street #2890
Los Angeles, CA 90071
3838 W. Carson Street, Ste 334
Torrance, CA 90503