On Friday, two men died in Spring, Texas, when their 2019 Tesla Model S crashed into a tree. The crash quickly made national news when authorities revealed that "no one was driving" and that neither body was found in the driver's seat, immediately arousing suspicion that Tesla's "Autopilot" or "Full Self-Driving" systems had claimed two more victims. Since then, authorities have said they "are certain" no one was driving at the time of the crash.
Tesla and Elon Musk were silent on the issue until Monday, when Musk replied to an account with 478 followers that had replied to the Wall Street Journal Twitter account regarding the story. Musk claimed "Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."
There is an awful lot of speculation online about the exact circumstances of this crash. Two federal agencies are investigating: the National Transportation Safety Board, whose investigators are generally regarded as the best crash investigators in the world, and the National Highway Traffic Safety Administration, which has steadfastly ignored the NTSB's pleas to rein in what that agency has described as Tesla testing new and potentially risky technology on the public without adequate oversight.
Tesla's semi-autonomous software and marketing have been called out as dangerous and misleading before. Both Autopilot and "Full Self-Driving" (which is not self-driving software) have extensively documented flaws, both in how well they function and in how easily their safety features can be circumvented. Musk himself has repeatedly acknowledged and even celebrated use cases where people take advantage of those flaws, such as the time he appeared to reference a viral video of a couple having sex in a Tesla with Autopilot engaged.
The premise of Musk's tweet about the crash in Texas is that if Autopilot was functioning perfectly, this never could have happened. But the premise of a Tesla with a tree trunk in the passenger's seat is that something went horribly wrong.
Whatever the exact circumstances that resulted in this tragedy, these two adult men somehow believed they could leave the driver's seat of their Tesla while it was driving. Regardless of what systems were or weren't engaged, it's hard to think of more damning evidence that there is a fatal gap between what Tesla wants its customers to think its cars can do and what they can actually do.