The first fatality involving a "self-driving" vehicle occurred in a pedestrian accident in Tempe, Arizona on March 18, 2018. A pedestrian, walking a bicycle across a 4-lane roadway at night, was struck by a self-driving UBER traveling at 38 MPH. Although the pedestrian had walked across three roadway lanes in front of the UBER, neither the vehicle nor the human safety driver detected her prior to impact. The UBER was approximately 450 feet from the point of impact when the pedestrian started to cross the road. The operator of the car did not have her eyes on the road or her hands on the steering wheel before the crash occurred.
The National Transportation Safety Board (NTSB) reported that the radar and LIDAR systems registered the pedestrian 6 seconds before impact. The system first classified the pedestrian as an unknown object, then as a vehicle, then as a bicycle. Not until 1.3 seconds before impact did the system determine that an emergency braking maneuver was needed.
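Simple stopping-distance arithmetic illustrates why that delay mattered. The sketch below is not from the NTSB report; it uses the reported 38 MPH speed and detection times, plus an assumed hard-braking deceleration of about 7 m/s² (a typical dry-pavement figure), to compare how far away the vehicle was at each moment with the distance it needed to stop.

```python
# Illustrative stopping-distance arithmetic based on the NTSB timeline above.
# Assumptions (not from the report): ~7 m/s^2 hard-braking deceleration.

MPH_TO_MPS = 0.44704
speed = 38 * MPH_TO_MPS              # ~17.0 m/s at impact speed

decel = 7.0                          # assumed hard-braking rate, m/s^2
braking_distance = speed**2 / (2 * decel)   # v^2 / (2a)

dist_at_first_detection = speed * 6.0   # distance from impact at 6.0 s
dist_at_brake_request = speed * 1.3     # distance when braking was deemed needed

print(f"braking distance needed:   {braking_distance:.1f} m")
print(f"distance at 6.0 s out:     {dist_at_first_detection:.1f} m")
print(f"distance at 1.3 s out:     {dist_at_brake_request:.1f} m")
```

Under these assumptions the car was roughly 100 meters from the pedestrian at first detection but needed only about 21 meters to stop; by the time the system called for braking, barely 22 meters remained, leaving no margin for any reaction delay.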
The Volvo used by UBER in its testing of the fully autonomous vehicle was equipped with a pedestrian detection system that, integrated with the automatic emergency braking (AEB) system, would have prevented the vehicle from hitting the pedestrian. However, the emergency braking capability was disabled whenever the vehicle was under computer control, in order to limit erratic vehicle behavior.
Experts have stated that the LIDAR should have detected the pedestrian much earlier because she was moving the entire time. The accident was avoidable.
So why did this tragedy occur? Did the software algorithm misinterpret the situation? Did the LIDAR fail? The consensus among experts is that there was a failure at all levels: not only with the technology, but also with UBER's testing methods and procedures.
The expectation that driving responsibility can be safely passed from machine to human in a split second has proved unrealistic. This is called the "handoff problem." Studies have shown that drivers rapidly lose their ability to focus on the road when the technology is driving the vehicle. Several manufacturers have decided to bypass the human altogether by removing the steering wheel and gas pedal.
Public highways are not proving grounds for companies to use for testing self-driving vehicles. Test vehicles are known to struggle with basic maneuvers that humans handle easily. UBER documents indicate that its autonomous vehicles have had difficulty driving through construction zones, driving next to tall vehicles like big rigs, navigating streets in busy city centers around pedestrians, cyclists, streetcars, and regular vehicles, and picking up and dropping off passengers.
Self-driving tech companies often introduce software before it is fully ready. When heavy vehicles travel at speed on public roads, the technology must include proper safeguards. To make self-driving vehicles and their testing programs safe, companies need to recognize their limitations, correct the flaws, and make safety a top priority.
Contact Van Blois & Associates if you are involved in a vehicle accident. Attorney R. Lewis Van Blois was selected by Best Lawyers as the Product Liability Litigation Lawyer of the year for 2018 for Northern California.