
Driving With No Hands: How Tesla’s Autonomous Vehicle System Escapes Liability

Published on Aug 28, 2016

As driver-assistance technologies have developed, liability for vehicle accidents has increasingly become a topic of discussion. Exactly who should be responsible when these technologies fail and a person or property is harmed? Tesla Motors is widely considered one of the companies pioneering autonomous vehicle operation. Its system, named "Autopilot," can control steering, braking, and acceleration when activated. It uses a combination of cameras and sensors around the vehicle to determine its proximity to other objects and to map the road ahead. Tesla owners have posted numerous videos online demonstrating Autopilot in use.

Autopilot was added as an optional feature to Tesla vehicles in October 2015. Tesla CEO Elon Musk has cautioned that the system is in a beta testing phase and strongly advises against using it without keeping hands on the steering wheel and maintaining complete awareness of driving conditions.

While this system has captured the attention of many Tesla owners and non-owners alike, it has recently come under scrutiny. The National Highway Traffic Safety Administration ("NHTSA") opened an investigation into the Autopilot system after a driver was killed in an accident while the system was engaged. The vehicle went under the trailer of a truck on a highway as the truck turned in front of it. This accident and a handful of others have led critics to suggest that Autopilot should be disabled, or even that Tesla should be held liable.

The NHTSA classifies autonomous features within vehicles using a tiered level system. Level 0 describes a standard car, in which the driver is in complete and sole control of the vehicle. Level 4 is the maximum designation, in which vehicles are considered fully self-driving. Tesla's Autopilot system likely places its vehicles in the Level 2 category, which reads:

Combined Function Automation (Level 2): This level involves automation of at least two primary control functions designed to work in unison to relieve the driver of control of those functions. An example of combined functions enabling a Level 2 system is adaptive cruise control in combination with lane centering.

Tesla has been clear that liability rests with the driver when the Autopilot system is engaged. Autopilot must be activated manually: the driver must pull a steering-wheel stalk twice. Tesla's position is that, as with other systems such as cruise control, pulling the stalk is the driver's manifestation that driving conditions are safe enough for the system to be used. When Autopilot detects that it is unable to scan the road ahead, a series of warnings tells the driver to regain control of the steering wheel. If those warnings are ignored, Autopilot automatically decelerates the vehicle. While the system may not be perfect, the safety value gained from self-driving systems is immense. Indeed, this appears to be the NHTSA's own stance: the agency's head administrator agrees that accidents should not slow the development of self-driving systems.
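The handoff sequence described above (repeated warnings, then automatic deceleration if the driver does not respond) can be sketched as a simple state machine. Everything in this sketch, including the state names, the warning threshold, and the escalation order, is a hypothetical illustration for the reader, not Tesla's actual implementation.

```python
from enum import Enum, auto

class AutopilotState(Enum):
    ENGAGED = auto()         # system steering, braking, accelerating
    WARNING = auto()         # road unreadable; driver told to take over
    DECELERATING = auto()    # warnings ignored; car slows itself
    DRIVER_CONTROL = auto()  # driver has regained control

# Hypothetical threshold: how many ignored warnings before the
# system begins slowing the car on its own.
MAX_IGNORED_WARNINGS = 3

def step(state, road_scannable, hands_on_wheel, warnings_ignored):
    """One tick of a simplified handoff state machine.

    Returns (new_state, new_warning_count). Illustrative only.
    """
    if state is AutopilotState.ENGAGED:
        if not road_scannable:
            # Road ahead cannot be scanned: start warning the driver.
            return AutopilotState.WARNING, warnings_ignored + 1
        return AutopilotState.ENGAGED, 0
    if state is AutopilotState.WARNING:
        if hands_on_wheel:
            # Driver regained control; liability remains with the driver.
            return AutopilotState.DRIVER_CONTROL, 0
        if warnings_ignored >= MAX_IGNORED_WARNINGS:
            # Warnings ignored long enough: decelerate automatically.
            return AutopilotState.DECELERATING, warnings_ignored
        return AutopilotState.WARNING, warnings_ignored + 1
    return state, warnings_ignored
```

The point of the sketch is the legal framing, not the engineering: at every transition the driver is given the opportunity (and the obligation) to resume control, which is what lets Tesla place liability on the driver.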

While Tesla's Autopilot system does not transfer liability away from the driver, other companies such as Google, Mercedes-Benz, and Volvo have stated that they will accept liability for accidents caused by the autonomous cars they currently have under development. An important distinction between those future vehicles and Tesla's Autopilot system is that they will be fully autonomous cars falling under Level 4 of the NHTSA's system. While these companies have offered to accept liability, they would likely have no choice but to do so. Under Level 4, "the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip." Accordingly, if driver control is not expected, a court is unlikely to find the driver liable for an accident caused by a fully autonomous vehicle.

While Tesla currently escapes liability, its plans for a fully autonomous fleet will likely require it to accept liability for those vehicles as well. Until fully autonomous vehicles are available for purchase, drivers will remain liable for the vehicles they control.

Khalif Timberlake is a third-year law student at Wake Forest University School of Law and a first-year student at Wake Forest University School of Business. He holds a Bachelor of Arts in both English and International Affairs. Upon graduation, he intends to practice corporate and employment law.
