When Shawn Hudson decided to buy a Tesla Model S last year, Autopilot was a key selling point. His commute was brutal—125 miles each way, nearly all of it on the highway—and he figured the semi-autonomous driving system would make his life easier. Over 98,000 miles of driving, he used it regularly, letting the computer keep the car in its lane and away from other cars.
“I was sold,” he told reporters at a press conference Tuesday morning. He would relax during the long ride, checking his phone and sending emails.
That changed one Friday morning a few weeks ago, during his daily drive from his home in Winter Garden to his job at a Nissan dealership in Fort Pierce, Florida. Driving at about 80 mph in the left lane of the Florida Turnpike, with Autopilot engaged, Hudson crashed into a disabled, empty Ford Fiesta.
And in a lawsuit filed today against the automaker, his lawyers write: “Tesla has duped consumers, including Hudson, into believing that the autopilot system it offers with Tesla vehicles at additional cost can safely transport passengers at highway speeds with minimal input and oversight from those passengers.”
According to the lawsuit, filed in Florida’s Ninth Judicial Circuit Court, Hudson “suffered severe permanent injuries.” Those include possibly fractured vertebrae, Hudson’s lawyer Mike Morgan, of Morgan & Morgan, said during the press conference. The suit seeks $15,000 in damages (Morgan & Morgan did not immediately reply to a request to confirm that unexpectedly conservative number is correct) and accuses Tesla of negligence, breach of the implied warranty of fitness for a particular purpose, and violation of Florida’s Deceptive and Unfair Trade Practices Act, among other things. It also names the owner of the disabled Fiesta as a defendant, accusing him of negligence.
“We have no reason to believe that Autopilot malfunctioned or operated other than as designed,” Tesla said in a statement. “When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.”
Before using Autopilot for the first time, drivers must agree to watch the road and keep their hands on the wheel. The car monitors the latter and issues warnings when drivers go more than a few seconds without touching the wheel.
This is just the latest in a string of crashes in which Autopilot-equipped Tesla cars have hit stationary vehicles. At least three Teslas have hit stopped fire trucks in 2018 alone. It’s a known weakness of the feature, even noted in Tesla’s manual: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.” Like, say, a Ford Fiesta just sitting there in the left lane.
Similar systems offered by Tesla competitors, including Volvo, have the same shortcoming. It’s a function of how they use radar data. The sensor picks up everything from speed signs to Mack trucks, so engineers curtail false positives—slamming your brakes for no reason can be very dangerous—by focusing on the stuff that’s moving. That’s part of the reason all these automakers, including Tesla, insist that drivers pay constant attention to the road and keep their hands near or on the steering wheel.
Hudson says that at the time of the impact, he was looking at his phone. And his crash makes clear that a disconnect between what a car can do and what a human believes it can do can have serious results.
Over the past few years, Tesla has been accused of not addressing that disconnect, most notably by the National Transportation Safety Board. In response, Tesla reduced how long you can keep your hands off the wheel without getting a warning to a few seconds. After a few warnings, Autopilot turns off and can’t be re-engaged until you restart the car. (Cadillac and Audi have developed systems that track the driver’s head position and gaze, respectively, more sophisticated ways to verify they’re watching the road.)
According to the lawsuit, Tesla’s salespeople—who are Tesla employees; the automaker doesn’t sell cars through franchised dealers—encouraged Hudson’s confidence in Autopilot. One sales rep said “a consumer can purchase an autopilot upgrade with any Tesla vehicle that will allow the vehicle to drive itself from one point to another, with minimal user input or oversight,” the suit reads. It continues: “Tesla’s sales representative further advised Hudson that, in the event the vehicle detects a hazard, the autopilot system is designed to also alert passengers so that they could take control of the vehicle if necessary.”
Autopilot, though, cannot recognize obstacles it wasn’t built to handle—like a stationary car in its path at highway speed—much less alert the driver to them. Again, that’s why it’s so important that the human remain alert at all times.
Tesla spokespeople did not answer WIRED’s questions about these specific allegations and about how Tesla trains its sales staff to explain Autopilot to consumers.
Whatever the truth of the matter, it’s unlikely this will be the last time someone questions the design of Autopilot and similar systems, and what responsibility their makers should assume for ensuring they’re not taken for granted as they proliferate. Tesla sold 83,500 cars in the third quarter of this year—more than it did in all of 2016. Autopilot—which it offers for $5,000—is one of its most popular features.