
Quick Settlement of Uber Fatality Collision Compelling

Dashcam video showed a dark, straight road in Tempe, Arizona, with broken white lane lines ticking by and only distant brake lights ahead. And then, as the headlights illuminated a particularly deep shadow, the video revealed something else: white shoes, then blue jeans, a black shirt, a red bicycle.

The March 19 video released to media by the Tempe Police Department preserved a grim but long-predicted moment — the first time a computer-driven vehicle killed a pedestrian. Uber Technologies Inc.’s rapid settlement a week and a half later with the crash victim’s family heralds change for personal injury litigation, as well.

As vehicles become more automated and collect more objective data — and crash evidence — from their surroundings, more motor-vehicle personal injury cases will settle quickly, attorneys say. But as in the Uber crash, the increasing complexity of these vehicles and reduced human control highlight uncertainties about who might be legally liable for damage they cause.

In other words, these cases won’t be about driver negligence, but about the failure of a product, Smith said.

“As decisions shift from individual drivers to vehicles or their designers, liability likewise shifts,” he said. “Manufacturers are likely to have a bigger slice of what is likely to be a smaller pie [of injuries].”

The list of potentially liable parties is longer for automated vehicles than for traditional ones because of the wider array of components involved. For instance, in the Uber crash, the car was a Volvo XC90. Other manufacturers produced sensors that fed data about the vehicle’s surroundings into Uber’s self-driving vehicle software. An Uber safety driver sat behind the wheel.

Uber and the attorney for the victim’s family have declined to comment on the settlement. The National Transportation Safety Board said it was investigating whether the safety driver or the vehicle could have detected the pedestrian.

Investigators will examine the Volvo, the accident site, the video, the technology in the test vehicle and all electronic data stored on the test vehicle or transmitted to Uber. They will also collect information about the victim and the Uber driver. The NTSB said it would release a comprehensive report with its findings.

Experts are eagerly awaiting the agency’s report, which Smith said could open the door to more lawsuits — particularly if it finds that subcomponents of Uber’s driving system failed. However, the video shows Uber’s safety driver, who could have overridden the automatic driving system, looking away from the road until just before the crash.

“In that crash, there could certainly be subsequent litigation on behalf of the victim, and possibly by Uber in recovering costs, but that seems pretty unlikely given the facts that have currently been revealed,” Smith said.

Veteran automotive attorneys said they expect cases involving self-driving car crashes to play out differently than those involving traditional cars because of the abundance of evidence the vehicles produce. Video and sensor data would give a huge advantage to lawyers trying to figure out how a crash occurred.


But in a future where automated cars are commonplace, there will almost certainly be video and other hard evidence, he said.


More realistic clients would mean quicker settlements. And the technology would potentially cause fewer crashes, as well. The result would be less strain on the court system, he said.

The nonprofit Consumer Watchdog, however, is not convinced that the adoption of robotic vehicles will reduce the rate of crashes. The group called for a national moratorium on autonomous vehicle testing on public highways until the NTSB releases its report.


Driverless cars behave differently than humans do, Simpson said. For example, they might make a full stop while turning right at a stop sign rather than rolling through, as most people would — and this might, in turn, cause more fender-benders.

Simpson raised the specter of legal and ethical conundrums, particularly in cases where a crash just can’t be avoided.


Whether manufacturers may be held to account for such decisions is yet another difficult question, one that could drag manufacturers, suppliers and insurers into court as liability shifts toward the components involved.

Uber’s quick settlement of the Tempe crash denied courts the opportunity to delve into questions bound to arise as this new technology takes hold. But that’s probably for the best.

Congress is better suited to provide answers that would apply in all states, rather than having the courts hash it out piecemeal, he said.

Federal legislators should work out standards for autonomous cars with regard to insurance and liability, he said. He expects insurers to begin offering products geared toward companies or fleet managers rather than human drivers.


Federal legislators’ efforts thus far to impose safety and cybersecurity standards on self-driving cars have stalled. The House’s SELF DRIVE Act and Senate’s AV Start Act, which would preempt state regulations on driverless cars and instead create federal standards for their design and construction, were introduced last year but went nowhere.

