Two men were killed after their Tesla crashed into a tree Friday night in Spring, Texas, according to local news reports. Unlike in most crashes, no one was in the driver’s seat.
Local police told KPRC 2 that their investigation revealed "no one was driving" the 2019 Tesla Model S at the time of the crash: one man was in the front passenger seat, the other in the rear seat, and no one was in the driver’s seat. In an interview with the New York Times, local police also said the men's wives overheard them talking about the Autopilot feature and wanting to go for a ride. It is unclear whether the car was on “Autopilot” or whether the men were using a beta version of what Tesla calls “Full Self Driving.”
Local police also told reporters they had to use 32,000 gallons of water "because the vehicle's batteries kept reigniting" and at one point called Tesla to ask for help.
The incident highlights two problems. First, it shows that some people are letting their cars drive them around without anyone sitting in the driver’s seat. Both Autopilot and the Full Self Driving beta are supposed to ensure that the driver is paying attention, with their hands on or near the wheel, ready to take over if something goes wrong. That does not appear to have happened here. Second, it shows that fire departments around the country do not know how to extinguish electric-vehicle battery fires.
For years, Tesla has marketed its Autopilot feature as if it were fully autonomous, even though its owner's manual states that both Autopilot and "Full Self Driving" require the driver to keep their hands on or near the wheel. Just this year, Musk went on the Joe Rogan Experience to sing the feature's praises, saying, "I think Autopilot's getting good enough that you won't need to drive most of the time unless you really want to."
The reality, however, is that Tesla itself admits these features are nowhere near autonomous driving. In its owner's manual, the company states that both “require active driver supervision and do not make the vehicle autonomous.” Federal authorities have also warned that the company has essentially rolled out an incomplete version of the technology and is testing it on drivers, at great risk.
The National Highway Traffic Safety Administration told the New York Times it is investigating dozens of crashes in which the supposed self-driving technology failed to prevent collisions, some resulting in serious injuries or death.
In response to the crash, the TeslaMotors subreddit, which often simply boosts the brand, banned discussion of crashes as well as “over-the-top cynicism.”