Automated Driving Systems and Crash Avoidance Technology are in the fast lane to becoming our daily driving reality. While the aim of these technologies is to make travel from point A to point B simpler and safer, the legal and liability issues are anything but simple.

In addition to self-driving systems and crash avoidance, there are even vehicle-to-vehicle systems that allow equipped vehicles to exchange information as they travel.

Although each of these systems is aimed at safety, the question remains:
Who is liable when something goes wrong?

Crash avoidance technology includes features such as forward collision warning, forward collision brake assist, lane departure prevention, adaptive headlights, and more. These technologies are in broad use in newer cars today.

Automated Driving Systems (ADS) are designated by levels 0-5 according to the amount of control the system exercises in operating the vehicle (the division of responsibility is summarized in the code sketch after this list):

  • Level 0 – No robot. Full human control at all times.
  • Level 1 – Robot can control either lateral or longitudinal movement, but not both at once (for example, Lane Keeping Support or adaptive cruise control).
  • Level 2 – Robot can control both lateral and longitudinal vehicle movement, but cannot monitor the roadway and respond sufficiently to perform the entire dynamic driving task. The human driver is expected to monitor and respond to road conditions.
  • Level 3 – Robot can control all functions required to perform the entire dynamic driving task and is expected to monitor and respond to road conditions. The human is not the driver when the robot is engaged, but must remain available to retake control after a request to intervene is issued, “within sufficient time for a typical person to respond appropriately to the driving situation at hand.”
  • Level 4 – Robot can control all functions required to perform the entire dynamic driving task through the ADS, including responding to a system failure (“the fallback”), but only in some conditions. The human is just a passenger when the ADS is engaged.
  • Level 5 – Robot can control all functions required to perform the entire dynamic driving task through the ADS, including responding to a system failure (“the fallback”), under all conditions.
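
For readers who think in code, the division of responsibility described above can be captured in a short sketch. This is purely illustrative: the AdsLevel enum and the helper functions below are hypothetical names of our own, not part of SAE J3016 or any manufacturer's software, and they simply restate the level descriptions in the list.

    # Illustrative sketch only: maps the ADS levels above to who is expected
    # to monitor the road and who serves as the fallback when the system fails.
    from enum import IntEnum

    class AdsLevel(IntEnum):
        NO_AUTOMATION = 0           # Level 0: full human control at all times
        DRIVER_ASSISTANCE = 1       # Level 1: lateral or longitudinal support, not both
        PARTIAL_AUTOMATION = 2      # Level 2: lateral and longitudinal; human monitors
        CONDITIONAL_AUTOMATION = 3  # Level 3: system drives; human must take over on request
        HIGH_AUTOMATION = 4         # Level 4: system is the fallback, but only in some conditions
        FULL_AUTOMATION = 5         # Level 5: system is the fallback under all conditions

    def human_must_monitor(level: AdsLevel) -> bool:
        """At Levels 0-2 the human driver is expected to monitor the roadway."""
        return level <= AdsLevel.PARTIAL_AUTOMATION

    def human_is_fallback(level: AdsLevel) -> bool:
        """Through Level 3 the human remains the fallback when the system cannot cope."""
        return level <= AdsLevel.CONDITIONAL_AUTOMATION

    if __name__ == "__main__":
        for level in AdsLevel:
            print(f"Level {int(level)}: human monitors={human_must_monitor(level)}, "
                  f"human is fallback={human_is_fallback(level)}")

Roughly speaking, the lower the level at which the human is still expected to monitor or take over, the more the liability analysis points toward the driver; the higher the level, the more it points toward the product.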

With so many different levels of control available, it becomes more difficult to ascertain who is truly in control of the vehicle. Drivers who purchase vehicles equipped with these systems have consumer expectations of increased safety while driving. But if something goes wrong, it will have to be determined whether the issue is one of product liability, driver liability, or driver negligence.

The manufacturer may make claims about increased safety and collision avoidance, and warranties may be available. Legally speaking, the company may be held responsible if something goes wrong, either for malpractice (promising a standard of care that was not met) or for misrepresentation (promising increased safety). But the driver may also be held strictly liable, as in the case of dog bites: the owner assumes responsibility for the product and its use.

We will certainly see such cases played out in court in the future. Just over a year ago in Tempe, Arizona, there was an incident in which an Uber test vehicle with a self-driving system in control struck and killed a pedestrian. The National Transportation Safety Board (NTSB) investigated the incident and found that Uber had turned off the vehicle’s factory-installed collision avoidance system and was using its own self-driving system. The NTSB also determined that the Uber driver was on a cell phone immediately prior to the collision. The self-driving system first detected the pedestrian about six seconds before impact and determined at 1.3 seconds prior to impact that emergency braking was needed; however, Uber had not enabled emergency braking maneuvers while its system was in control, leaving intervention to the driver, who did not react in time. The pedestrian’s surviving family members reached a confidential settlement with Uber.

The NTSB has also investigated several cases (5/2016, 3/2018, 3/2019) involving Tesla’s Autopilot system. The NTSB found that the manner in which Autopilot monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

It remains to be seen which direction the courts and juries will lean, but in the meantime, don’t assume that your ADS is going to offer full and complete protection physically AND legally!