Self-Driving Vehicles
Crash avoidance and driving automation make cars safer, but they create legal conundrums
A lot of us grew up watching The Jetsons and Lost in Space but never thought we’d actually see self-driving cars. Now they appear to be the wave of the future. Or are they?
Many vehicles are already equipped with nifty automated features. They beep, flash warning lights, assist, or even act on their own to help drivers with steering, parking, braking, and avoiding collisions.
Crash avoidance systems make vehicles safer. A tractor-trailer equipped with an automatic braking system can now “see” an obstacle 650 feet ahead. Cameras and sensors detect the obstacle and alert the driver. If the driver fails to respond, the brakes engage to stop or slow the vehicle in time to prevent a crash. Several auto manufacturers have announced plans to offer automatic braking technology on cars as well.
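For readers who want a concrete picture of that detect-warn-brake sequence, here is a minimal sketch in Python. It is purely illustrative: the detection range, the driver-response window, and the function names are our own assumptions for this example, not any manufacturer’s actual system.

```python
# Illustrative sketch of the escalation logic described above: detect an
# obstacle ahead, warn the driver, and apply the brakes automatically if the
# driver does not respond in time. All numbers and names are assumptions.

DETECTION_RANGE_FT = 650      # roughly how far ahead the sensors can "see"
DRIVER_RESPONSE_WINDOW_S = 2  # assumed time the driver gets to react


def crash_avoidance_step(obstacle_distance_ft, driver_braking, seconds_since_alert):
    """Return the action the system takes on one sensor update."""
    if obstacle_distance_ft > DETECTION_RANGE_FT:
        return "monitor"                  # nothing in range; keep watching
    if driver_braking:
        return "assist_driver_braking"    # driver responded; support their braking
    if seconds_since_alert < DRIVER_RESPONSE_WINDOW_S:
        return "alert_driver"             # beep and flash warning lights
    return "automatic_braking"            # no response; brake to slow or stop


if __name__ == "__main__":
    # Example: obstacle detected 400 ft ahead, driver has not braked, and
    # 3 seconds have passed since the warning -> the system brakes on its own.
    print(crash_avoidance_step(400, driver_braking=False, seconds_since_alert=3))
```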
But many important legal, insurance and safety questions have arisen.
What happens when there’s a glitch in the technology?
Backing up human control with automation substantially increases an automobile’s safety. However, replacing humans with automation has pitfalls. A bug in the software could cause the system to malfunction, or a hacker could take over the system. Relying solely on the automation is risky.
Some companies are currently testing automobiles that are driverless, not just self-driving. If something goes wrong, nobody is there to take control.
Who was responsible for the Google self-driving car crash?
Google’s self-driving car made the news when it caused a crash in February. The vehicle changed lanes to maneuver around sandbags blocking its path and sideswiped a bus. Nobody was hurt in the minor collision, but someone could have been.
Google admitted some responsibility. However, the crash calls into question whether human error was solely responsible for the 17 other minor collisions involving Google self-driving cars. The company’s standby answer has always been that its technology was not to blame.
Who is responsible when something goes wrong?
In a self-driving car, the driver is still probably going to be held liable. Arguing that you didn’t know what your automatic brakes were going to do will not relieve you of responsibility.
The manufacturer might also be liable for defects, but might have a good defense that the driver ultimately remains responsible for controlling the vehicle. However, as automation becomes more common, auto manufacturers might acquire more responsibility for failures in the system, or for not installing up-to-date technology.
And a car with no driver further complicates the liability question. In all likelihood, the owner and the manufacturer could be held accountable for a driverless car crash.
Will we be driving self-driving vehicles soon?
Don’t hold your breath. “Why buy an autonomous vehicle if you have to maintain control?” asked Adrian Lund, president of the Insurance Institute for Highway Safety. He predicts that self-driving vehicles will not make it to market because of legal quandaries and objections from insurance companies.
Our team of personal injury professionals will continue to update this blog as NHTSA guidelines, Texas law, and other legal issues evolve. We have been representing individuals and families injured in auto accidents for over 35 years and offer free legal consultations to victims all over Texas.