Cars with varying degrees of autonomy are already part of our reality. In the fourth quarter of 2016, Tesla alone delivered 12,700 of its flagship Model S vehicles and 9,500 of its Model X vehicles. According to the U.S. Insurance Institute for Highway Safety, there will be 3.5 million self-driving vehicles by 2025 and 4.5 million by 2030. The Ontario government embraced innovation in this sector of the automobile industry when it became the first province in Canada to allow the testing of autonomous vehicles on its roads, as of January 2016.1

Many of us are wondering how all of this will affect the way we investigate liability and litigate motor vehicle collisions, particularly in the absence of government regulation of autonomous cars. The investigation into a fatal crash between a Tesla Model S and a tractor-trailer in Florida on May 7, 2016, has provided some insight into this question.

The incident involved a 2015 Tesla Model S that was being operated in Autopilot mode at the time of the collision. It crashed into a tractor-trailer that was crossing its path at an uncontrolled intersection. The Tesla technically had the right of way. The vehicle's Automatic Emergency Braking (AEB) system did not provide any warning or engage automated braking prior to the collision, and the Tesla driver took no braking, steering or other evasive action because he had been distracted for a period exceeding 7 seconds.

The investigation was conducted by the U.S. Department of Transportation's National Highway Traffic Safety Administration and concluded on January 19, 2017.2 No defects were identified in the design or performance of the vehicle's AEB or Autopilot systems, and no recall of the Model S was warranted.

The factors that led to the conclusion that Tesla's automated systems did not malfunction, and did not increase the risk of an accident in the Florida incident, included the information about system limitations contained in the vehicle's owner's manual, the user interface and associated warnings and alerts, and the driver monitoring system intended to help the driver remain engaged in the driving task at all times. The key takeaway from the investigation is that drivers must understand the limitations of these types of vehicles before they get on the road and decide to disengage from the driving environment.

Based on the information contained in the investigation report, the following are some questions to consider if and when faced with a lawsuit involving a collision with an autonomous (or semi-autonomous) vehicle:

1. Back to Basics: Who had the right of way?

Until all vehicles on the road are fully autonomous, the Rules of the Road will continue to be an important consideration when evaluating and apportioning liability. The driver of the tractor-trailer in the fatal Florida collision was charged with failing to yield the right of way to the Tesla driver.

2. Was the Autopilot engaged at the time of the crash?

The automated vehicles currently on the road allow the operator to switch in and out of the automated driving mode. Tesla vehicles log data to the manufacturer's servers, providing extensive information about how the car's various features are used and how engaged the driver is at any given time.

If fault is at issue, an attempt to obtain this data should be made, given the sheer amount of information that is stored by the manufacturer. In the fatal Florida crash, the investigators secured the release of the data through a subpoena. Insurers investigating the circumstances of a loss pursuant to the Ontario Automobile Policy may have to do the same, as it is not immediately clear whether simple consent from the vehicle owner would be sufficient authority to access data that the manufacturer may consider proprietary in nature. Within the context of litigation, defence counsel would turn to Rule 30.10 of the Rules of Civil Procedure, which governs the production of documents from a non-party.

In some instances, the data may not be accessible due to the extent of damage to the vehicle. In a recent fatal Tesla crash in China, the damage to the vehicle was so extensive that the car was physically incapable of transmitting log data to Tesla's servers.

3. Was avoidance of the particular type of collision within the expected performance capabilities of the automated system?

Tesla's AEB system, at least for the time being, is designed to avoid or mitigate rear-end collisions. Braking for crossing-path collisions, cut-ins and cut-outs is currently outside the expected performance capabilities of the system. The responsibility rests with the driver to keep a proper lookout and take steps to avoid a collision.

4. Did the collision occur on a dry, straight road outside of an urban centre?

Tesla's owner's manual indicates that the Traffic-Aware Cruise Control (TACC) part of its Autopilot "is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets." The manual also contains warnings about use on winding roads with sharp curves, on slippery surfaces, in poor weather conditions, and near pedestrians and cyclists.

If Autopilot was not engaged at the time of the accident because the vehicle was being operated outside the environments recommended by the owner's manual, driver error and municipal liability considerations will continue to dominate the assessment.

5. At what point on the road did the crash occur?

Highway entrance and exit ramps, construction zones, and road junctions and intersections are not suitable environments for the current generation of automated driving systems.

6. What was the automation level of the Autopilot?

The current generation of autonomous vehicles has Level 1 or Level 2 automation, but this will change in the future. Level 1 and Level 2 automation requires the operator's continuous attention to monitor the driving environment and take immediate control when necessary. Both the TACC and Autosteer features used in Tesla vehicles are intended for use only on highways and limited-access roads, with a fully attentive driver.

7. If the automation level required engagement of the driver, was the driver fully attentive?

The fatal Florida crash involved a period of extended driver distraction of at least 7 seconds. This level of distraction is not common, but it was foreseeable and was considered by Tesla over the course of the Model S's development. The vehicle contained a system that monitored the driver's engagement through the placement of his hands on the steering wheel, as well as through other signs of engagement. Some manufacturers are reportedly developing and testing systems that track drivers' eye movements in order to ensure engagement.

Reviewing the particular model's specifications will therefore be integral to determining whether the driver was distracted. If the driver was distracted, human error will remain a factor in the assessment of negligence.

Other practical tips to consider during an investigation include:

  • review the owner's manual,
  • review the notes for new software releases by the automobile's manufacturer,
  • review the user agreement that is required before Autosteer is enabled for the first time, and
  • inquire into any dialog boxes that appear every time that Autosteer is activated.

Conclusion

While autonomous vehicles have the potential to improve road safety, reduce traffic congestion, and produce many environmental benefits, the focus at present is understandably on addressing the new and unique safety challenges that these vehicles create.

There is currently no proven, practical way to test the safety of autonomous vehicles, nor is there agreement on how safe they should be before being allowed on the roads. Experts are of the view that real-world driving is an essential ingredient for improving their safety at a quicker pace.3 Unleashing vehicles that may not be completely safe onto the roads may, in turn, lead to interesting questions about the standard of care and the need for regulation.

Mercedes, Volvo, and Google have agreed to accept liability when their autonomous cars malfunction and cause accidents,4 but the Tesla example shows that there is fine print to that agreement. Manufacturers are being careful and forthcoming about their vehicles' limitations in owner's manuals in order to shield themselves from liability. They store a great deal of information that monitors the actions of both the vehicle and the operator, and at the research and development stage they account for many common and uncommon human errors.

In the end, one of the biggest factors in ensuring a smooth integration of autonomous vehicles into real traffic appears to be the operator's willingness to understand the limitations of these vehicles and act accordingly. If drivers do not hold up their end of the bargain, the well-known statistic that 90% of crashes are caused by human error will continue to hold true even in cases involving self-driving cars.

Footnotes

1 Ministry of Transportation of Ontario, "Automated Vehicles Coming to Ontario Roads", News Release (November 28, 2016)

2 U.S. Department of Transportation National Highway Traffic Safety Administration, "Preliminary Evaluation Report", https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF (January 19, 2017)

3 The RAND Corporation, "Challenges and Approaches to Realizing Autonomous Vehicle Safety", Testimony Submitted to the House Energy and Commerce Committee, Subcommittee on Digital Commerce and Consumer Protection (February 14, 2017)

4 Michael Ballaban, "Mercedes, Google, Volvo To Accept Liability When Their Autonomous Cars Screw Up", Jalopnik (October 7, 2015)

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.