Ehline Law Blog about Self Driving Cars


Dec 19, 2019

NHTSA Opens Investigation into Yet Another Self Driving Car Crash

More Doubts Over Faulty Tech

The December 7th Tesla crash in Connecticut resulted in yet another federal investigation. The National Highway Traffic Safety Administration announced that it was taking a closer look at the incident, including Tesla's promises about Autopilot. True self-driving capability still doesn't exist, yet Tesla buyers seem to think it is a real feature. In its current form, it's dangerous and has led to a number of severe accidents. ExtremeTech had a good writeup of the crash and the impending investigation. I hope that Tesla lawyered up for this one. The accumulation of inquiries is growing, as is public attention.

Luckily, no one was killed in this crash. However, that is due more to circumstance than to the Autopilot system. The Tesla was severely damaged in the collision. Furthermore, the feature is being used by drivers led to believe by Tesla that it is safe. It is not safe, and Tesla's insistence that it represents the tech of the future is misplaced at best. In other ways, it represents negligence.

Self-driving tech is not ready for prime time. It is not even close to being so. As a result, the rush toward self-driving cars will leave consumers disappointed-- or injured in a car crash. Instead of being several months away, self-driving vehicles are likely decades away. In the meantime, tests of the system and use of existing programs result in too many accidents.

A Legal Perspective

Both parties involved in the crash may have legal recourse. The Tesla driver may be able to hold the car manufacturer accountable for pushing an incomplete product. The driver of the other vehicle may also have a strong case against the Tesla driver for negligence for keeping the car in Autopilot, and may have a suit against the carmaker for the same reason as above. The product does not perform as advertised-- read here for more info and updates. When other companies do the same thing, they get sued for fraud-- and lose.

Dec 10, 2019

Connecticut Tesla Crash Latest Roadblock for Self Driving Cars

Yet Another Crash of a So-Called "Self-Driving Car"

A Connecticut police cruiser is in worse shape after yet another Tesla crash this week. According to State Police, a Tesla Model 3 rear-ended one of their patrol cars. The driver stated that the car was on Autopilot; he turned around to check on something, and the next thing he knew, there was a crash. The Drive also reported on the incident and posted a number of clear photos of the damage done. The driver was ticketed on charges of reckless driving and reckless endangerment. The only good thing to come out of the crash was that no one was hurt.

The crash also comes after a deadly Tesla collision that occurred while that vehicle was also on Autopilot. Tesla can promise the world, but these examples show the opposite. Drivers unaccustomed to how this new tech works are only making matters worse. In addition, Tesla's overpromising means that drivers don't know how to use these flawed systems. All of this is a recipe for more accidents, more injuries, and more deaths.

Promises Yet Unfulfilled

For all the promises of self-driving cars being just around the corner, this crash is evidence of anything but. In fact, over the last several years, such accidents have become more commonplace. Why would consumers want to risk their families or their cars on such a scheme? Tech is still catching up to the self-driving promises of big companies, and such tech is probably decades, rather than years, away. It also shows that many of the top proponents of self-driving cars are selling us short. By their promising more than they can deliver, we are the true losers. The deaths, injuries, and thousands of dollars of damage caused by these errors are not just background noise. Instead, they represent what should not be on our roads.