
The Government Is Finally Catching Up With Tesla's Wild Autopilot Claims

After years of looking the other way, regulators might finally be getting around to caring about Tesla's deceptive self-driving claims.
Don't do what Donny Don't does. Screenshot: YouTube

Tesla, the world's most frustrating company, simultaneously makes what are widely regarded as the best electric vehicles, operates the most functional and comprehensive charging network, and sells the world's most dangerous and widely abused driver-assist features. Thanks to years of the company's misleading marketing of the "Autopilot" and "Full Self-Driving" packages, as well as frequent wild claims from its extremely online CEO Elon Musk (such as his 2019 prediction that there would be one million Tesla robotaxis on the road by 2020), owners perceive these features to be far more capable than they actually are.


After years of looking the other way, it's possible that the government is finally, just maybe, going to do something about Tesla's massive beta test, in which we are all test subjects.

On Monday, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into 11 cases in which Teslas operating on Autopilot crashed into emergency vehicles. NHTSA has previously disclosed it is also investigating 30 other Tesla crashes, in which 10 people died, most involving Autopilot or FSD.

NHTSA's investigations alone indicate a new degree of seriousness from the agency under the Biden administration, but Tesla faces criticism from elsewhere in the government, too. On Wednesday, Senators Richard Blumenthal and Ed Markey sent a letter to Federal Trade Commission chief Lina Khan asking the commission to open its own investigation into Tesla's deceptive marketing practices around Autopilot and FSD. The letter cites a video Tesla posted to YouTube in 2019, viewed 18 million times, showing someone "driving" the car without touching the wheel for more than a minute, in violation of Tesla's own stated safety policies.

Even taking Tesla's policies more or less at face value (and setting aside the highly publicized ways Teslas have for years been easily tricked into driving on their own for extended periods, bugs Tesla could fix with a software update), Tesla has always tried to have it both ways. It promotes these driver-assist features as if they basically drive the car themselves (the names are "Autopilot" and "Full Self-Driving," after all), and you can pay $10,000 for the privilege of using them, a premium price for what's being sold as a premium experience. But in the fine legal print, the company says these features are no more reliable than the Level 2 driver-assist systems available from virtually every other manufacturer, and that the driver must still pay close attention at all times. Some drivers tragically find this out the hard way, like George McGee, a man in Florida who reached down to pick up his phone thinking Autopilot was in control, only for his car to promptly slam into another vehicle, killing a woman. When police arrived, he referred to the car's capabilities as "stupid cruise control."

Whether anything will come of these investigations remains to be seen, or, in the FTC's case, whether an investigation will be opened at all. But if the last five years or so have taught us anything, it's that Tesla won't stop until someone makes them.