Brett Emison

Tesla’s Autopilot: What Happens When Things Go Wrong?


Tesla Autopilot

Tesla’s autopilot feature is, by all accounts, a very cool feature.  According to the company:

Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control.  Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, as well as preventing the car from wandering off the road.

If the car is driving itself – or even if the car is helping to drive itself – a question arises: who is responsible when something goes wrong?  The Wall Street Journal asked this very question back in May – before Tesla’s Autopilot had been released.  Unfortunately, it didn’t really provide an answer:

“Tesla is venturing into the mushy middle of automation, where the human still performs part of the driving task, the computer performs other parts.”


“There is an incredible amount of trust being put in [the] human driver that they are paying attention, and there is a lot of evidence where that trust is unfounded.”

Who’s Responsible when a Driverless Car Crashes? [Mike Ramsey at The Wall Street Journal]

Yes.  There is mounting evidence that such trust is unfounded.

One issue is what happens if the feature simply fails.  If the vehicle overrides the driver’s commands and turns into oncoming traffic, or even simply runs off the road, the liability question seems pretty straightforward: the product failed, and Tesla would be liable under either a negligence or product liability action.

The issue is murkier where the driver’s conduct might combine with Autopilot’s failure to cause a collision.  Take a look at these close calls:

In this second video, the passenger makes a very interesting comment (at about 1:20): “It’s weird how quickly I went from being slightly nervous about it to having a lot of trust in it.”

And then there’s this idiot who isn’t even sitting in the driver’s seat as his Tesla cruises at highway speeds:

Unfortunately, there aren’t really rules or regulations specifically designed for autonomous or semi-autonomous driving.  Such vehicles are legal in the US simply because they’ve never been outlawed.  As noted at Wired, the technology is difficult to regulate because it’s new, it’s complicated, and it’s being developed by several competing interests.  The limited rules that have been implemented are sporadic and inconsistent.

So, without specific rules governing self-driving vehicles, we’re forced to take a step back and look at the rules that are already in place.  One of the questions is what happens when a Tesla owner is riding – not driving – along using its Autopilot mode: Could Tesla be held responsible?  The answer is yes.

Courts across the country have found misuse to be reasonably anticipated.

[W]here there is both a design defect and misuse of the product, each of which contributes to an accident, the misuse does not become an intervening cause if the misuse was foreseeable.  The realities of the intended and actual use are well known to the manufacturer and to the public and these realities should be squarely faced by the manufacturer and the court.

Jarrell v. Fort Worth Steel & Mfg. Co., 666 S.W.2d 838, 836 (Mo. App. 1984).

Under California law, misuse is a defense only where the misuse was so highly extraordinary as to be unforeseeable.  Chavez v. Glock, Inc., 207 Cal.App. 4th 1283, 1308 (2012).

You see, Tesla’s Autopilot feature is designed such that driver attention is necessary for its safe and proper use, but the system does not actually require it.  Tesla’s announcement of the feature – and official policy from the company – is that continued driver input and interaction is necessary.  However, Tesla did not program Autopilot to enforce that requirement.
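The gap described here is ultimately a software-policy choice.  To make the point concrete, here is a minimal sketch of what “enforcing the requirement” could look like in code – all names, thresholds, and logic below are hypothetical illustrations, not Tesla’s actual implementation:

```python
# Hypothetical sketch only: one way a driver-assist system could enforce
# "hands-on" use in software. The thresholds and function names are
# invented for illustration and do not describe Tesla's real logic.

WARN_AFTER_S = 10       # alert the driver after 10 s with no hands detected
DISENGAGE_AFTER_S = 30  # hand back control (with alerts) after 30 s

def enforcement_action(seconds_hands_off: float) -> str:
    """Return the action a hands-on policy would take for a given
    stretch of time with no steering-wheel input detected."""
    if seconds_hands_off >= DISENGAGE_AFTER_S:
        return "disengage"  # escalate: slow the car, require manual control
    if seconds_hands_off >= WARN_AFTER_S:
        return "warn"       # audible and visual "hold the wheel" alert
    return "ok"             # driver appears engaged; no action needed
```

For example, `enforcement_action(5)` returns `"ok"`, `enforcement_action(15)` returns `"warn"`, and `enforcement_action(45)` returns `"disengage"`.  The point is not the particular numbers; it is that a check like this is simple to build, which bears directly on whether the misuse was preventable.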

The potential misuse of the Autopilot feature is not just foreseeable, it’s been acknowledged by Tesla.  Just days after Tesla released Autopilot, a group of drivers set off across the country covering 2,994 miles in just 57 hours (including the time plugged into Supercharger stations to “fuel” up).  The group used Autopilot 96% of the time at speeds around 90 mph.  After learning of the feat, Tesla founder Elon Musk tweeted his congratulations.

A Tesla spokesperson went on to say:

“It’s so cool to see Model S owners get out there and use this groundbreaking technology.  The more people who use it, the better it will get.  Having said that, today’s Autopilot features are designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable.  Drivers can’t abdicate responsibility, we expect the driver to be present and prepared to take over at any time.”

And this kind of double-speak is really at the heart of the problem.  If the Autopilot system is designed to be a “hands-on” experience, then the system should require hands-on use.  The very nature of the system lures drivers into a false sense of confidence and encourages the very kind of distraction it should seek to prevent.


Looks like Elon agrees:

© Copyright 2015 Brett A. Emison

Follow @BrettEmison on Twitter.



  1. Guy says:

    What BS! I am so tired of poor accountability in this country. I own a Tesla P90D and I know with certainty that the Autopilot feature is there for me but I am still the pilot. I am responsible if I plow into a wall or into anyone or anything. Elon’s got balls of steel to offer this technology to the masses knowing how many out there are just plain stupid. Certainly too stupid to be entrusted with anything this amazing. If you die using Autopilot, Tesla isn’t to blame and the unfortunate fact is tabloids grab ahold of marvels like this and ask the question “who would be to blame”. How remedial and expectant. Get a life and look in the mirror and you will see first hand who is responsible for your safety. Thank you Tesla. I’m rocking my P90D in the 808. Aloha.

  2. Brett Emison says:

    Guy – this post is about accountability. Accountability for the driver – I never said the driver wasn’t accountable; I just said the driver wasn’t solely accountable – and Tesla’s accountability.

    BTW – I think you’re right about Tesla and Elon Musk. They have created a ground-breaking product and are pushing advanced technology like no other company. But with that comes responsibility to do so in a manner that is safe not only for their drivers, but for others on the road.

    When Tesla knows that its ground-breaking product is being misused and when Tesla has the means to prevent that misuse, it has a duty to do so.

  3. Carl Reese says:

    Fox News covers Record breaking couple. The autonomous record is just one of 6 transcontinental records the couple has set. https://www.youtube.com/watch?v=OPTb9IgREM0&feature=youtu.be

  4. Carl Reese says:

    Time Magazine’s “theDrive” camera crew covers the behind the scenes of the record autonomous/EV run.