A Fatal Crash Renews Concerns Over Tesla’s ‘Autopilot’ Claim

The company offers a feature called “Full Self-Driving Capability.” But it doesn’t make the car self-driving.
Photograph: Salwan Georges/The Washington Post/Getty Images

Tesla offers a $10,000 feature called Full Self-Driving Capability. It includes futuristic goodies like the ability to summon the car via app in a parking lot and to detect and react to traffic lights and stop signs. FSD, as Tesla enthusiasts call it, includes Autopilot, a feature that “automatically” drives on highways, changing lanes, keeping the car within its lane, and holding a consistent distance from other vehicles.

But even people who shell out for Full Self-Driving don’t own a self-driving car, and vehicles with Autopilot can’t automatically pilot themselves. Lengthy blocks of text in Tesla owners’ manuals describe when, where, and how the features should be used: by a fully attentive driver who is holding the steering wheel and is “mindful of road conditions and surrounding traffic.” The features shouldn’t be used near city streets, construction zones, bicyclists, or pedestrians, the manual says. As Tesla puts it on its website, in slightly smaller print: “The currently enabled features require active driver supervision and do not make the vehicle autonomous.”

The nuance may have been lost on two men who died in the Houston suburb of Spring, Texas, on Saturday evening when their 2019 Model S reportedly slammed into a tree and caught fire. Local firefighters reportedly spent four hours cooling the car to keep the flames at bay. In an interview with The Wall Street Journal, Harris County constable Mark Herman said that while local authorities’ preliminary investigation is not complete, they believe no one was behind the wheel of the vehicle when it crashed. “We’re almost 99.9 percent sure,” he said. The bodies of the two victims were reportedly found in the passenger and back seats. Herman told Reuters that the men were talking about Autopilot before they hopped in the car. “We have witness statements from people that said they left to test-drive the vehicle without a driver and to show the friend how it can drive itself,” he said.

Tesla didn’t respond to questions about the incident, but CEO Elon Musk tweeted Monday evening that “data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.” Constable Herman did not respond to phone calls, but he told Reuters late Monday that law enforcement will serve search warrants on Tesla to secure data related to the crash.

Both the National Highway Traffic Safety Administration—the government agency that oversees car safety—and the National Transportation Safety Board—an independent agency that investigates noteworthy incidents—have dispatched teams to Texas to study the crash. “We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information,” NHTSA said in a statement. It will likely be weeks, if not months, before results of an investigation are released.

Still, the incident again highlights the yawning gap between Tesla’s marketing of its technology and the more limited capabilities described in its in-car dialog boxes and owners’ manuals.

There’s a cottage industry of videos on platforms like YouTube and TikTok in which people try to “fool” Autopilot into driving without an attentive driver in the front seat; some show people “sleeping” in the back or behind the wheel. Tesla owners have even demonstrated that, once the driver’s seat belt is buckled, someone can prompt a car in Autopilot mode to travel for a few seconds with no one behind the wheel.

Tesla—and particularly Musk—has a mixed history of public statements about Full Self-Driving and Autopilot. A Tesla on Autopilot issues visual and audible warnings if its sensors do not detect the pressure of the driver’s hands on the wheel every 30 or so seconds, and it will come to a stop if it doesn’t sense hands for a minute. But during a 60 Minutes appearance in 2018, Musk sat behind the wheel of a moving Model 3, leaned back, and put his hands in his lap. “Now you’re not driving at all,” the anchor said with surprise.

This month, Musk told the podcaster Joe Rogan, “I think Autopilot’s getting good enough that you won’t need to drive most of the time unless you really want to.” The CEO has also repeatedly given optimistic assessments of his company’s progress in autonomous driving. In 2019 he promised Tesla would have 1 million robotaxis on the road by the end of 2020. But in the fall of 2020, company representatives wrote to the California Department of Motor Vehicles to assure the agency that Full Self-Driving will “remain largely unchanged in the future,” and that it will remain an “advanced driver-assistance feature” rather than an autonomous one.

Thus far, FSD has been released only to the 1,000 or so participants in the company’s beta testing program. “Still be careful, but it’s getting mature,” Musk tweeted last month to FSD Beta testers.

At least three people have died in crashes involving Autopilot. After an investigation into a fatal 2018 crash in Mountain View, California, the NTSB asked the federal government and Tesla to ensure that drivers can engage Tesla’s automated features only in the conditions they were designed for. It also recommended that Tesla install a more robust monitoring system to make sure drivers are paying attention to the road. General Motors, for example, allows its comparable Super Cruise feature to operate only on premapped highways, and a driver-facing camera checks whether the driver’s eyes are pointed toward the road.

A spokesperson for NHTSA says the agency has opened investigations into 28 crashes involving Teslas.

Data released by Tesla suggests the vehicles are safer than the average US car. On Saturday, just hours before the fatal Texas crash, Musk tweeted that Teslas with Autopilot engaged are about one-tenth as likely to crash as the average vehicle, as measured by federal data. But experts point out that the comparison isn’t quite apt: Autopilot is supposed to be used only on highways, while the federal data covers all kinds of driving conditions. And Teslas are heavy luxury cars, which tend to fare better in a crash.

Updated 4-27-21, 2:15pm ET: This story was updated to clarify that firefighters spent four hours cooling the car, not extinguishing flames.

