
Tesla’s Autopilot is, by all accounts, a very cool feature.  According to the company:

Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control.  Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, as well as preventing the car from wandering off the road.

If the car is driving itself – or even if the car is helping to drive itself – a question arises: who is responsible when something goes wrong?  The Wall Street Journal asked this very question back in May – before Tesla’s Autopilot had been released.  Unfortunately, it didn’t really provide an answer:

“Tesla is venturing into the mushy middle of automation, where the human still performs part of the driving task, the computer performs other parts.”

***

“There is an incredible amount of trust being put in [the] human driver that they are paying attention, and there is a lot of evidence where that trust is unfounded.”

Who’s Responsible when a Driverless Car Crashes? [Mike Ramsey at The Wall Street Journal]

Yes.  There is mounting evidence that such trust is unfounded.

One issue arises if the feature simply fails.  If the vehicle overrides the driver’s commands and turns into oncoming traffic, or even simply runs off the road, the liability question seems pretty straightforward: the product failed, and Tesla would be liable in either a negligence or product liability action.

The issue is murkier where the driver’s conduct combines with Autopilot’s failure to contribute to a collision.  Take a look at these close calls:

In this second video, the passenger makes a very interesting comment (at about 1:20): “It’s weird how quickly I went from being slightly nervous about it to having a lot of trust in it.”

And then there’s this idiot who isn’t even sitting in the driver’s seat as his Tesla cruises at highway speeds:

https://youtu.be/3gax8BnQpuA

Unfortunately, there aren’t really rules or regulations specifically designed for autonomous or semi-autonomous driving.  Such vehicles are legal in the US simply because they’ve never been outlawed.  As noted at Wired, the technology is difficult to regulate because it’s new, it’s complicated, and it’s being developed by several competing interests.  The limited rules that have been implemented are sporadic and inconsistent.

So, without specific rules governing self-driving vehicles, we’re forced to take a step back and look at the rules that are already in place.  One of the questions is what happens when a Tesla owner is riding – not driving – along using its Autopilot mode: Could Tesla be held responsible?  The answer is yes.

Courts across the country have found misuse to be reasonably anticipated.

[W]here there is both a design defect and misuse of the product, each of which contributes to an accident, the misuse does not become an intervening cause if the misuse was foreseeable.  The realities of the intended and actual use are well known to the manufacturer and to the public and these realities should be squarely faced by the manufacturer and the court.

Jarrell v. Fort Worth Steel & Mfg. Co., 666 S.W.2d 838, 836 (Mo. App. 1984).

Under California law, misuse is a defense only where the misuse was so highly extraordinary as to be unforeseeable.  Chavez v. Glock, Inc., 207 Cal.App. 4th 1283, 1308 (2012).

You see, Tesla’s Autopilot feature is designed such that driver attention is necessary for its safe and proper use, yet the system does not require it.  Tesla’s announcement of the feature – and the company’s official policy – is that continued driver input and interaction is necessary.  However, Tesla did not program Autopilot to actually enforce that requirement.

The potential misuse of the Autopilot feature is not just foreseeable, it’s been acknowledged by Tesla.  Just days after Tesla released Autopilot, a group of drivers set off across the country covering 2,994 miles in just 57 hours (including the time plugged into Supercharger stations to “fuel” up).  The group used Autopilot 96% of the time at speeds around 90 mph.  After learning of the feat, Tesla founder Elon Musk tweeted his congratulations.

A Tesla spokesperson went on to say:

“It’s so cool to see Model S owners get out there and use this groundbreaking technology.  The more people who use it, the better it will get.  Having said that, today’s Autopilot features are designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable.  Drivers can’t abdicate responsibility, we expect the driver to be present and prepared to take over at any time.”

And this kind of double-speak is really at the heart of the problem.  If the Autopilot system is designed to be a “hands-on” experience, then the system should require hands-on use.  The very nature of the system lures drivers into a false sense of confidence and encourages the kind of distraction it should seek to prevent.

Update:

Looks like Elon agrees:

© Copyright 2015 Brett A. Emison

Follow @BrettEmison on Twitter.
