How Tesla’s ‘Self-Driving’ Beta Testers Protect the Company From Critics

Every so often, a clip of a Tesla running experimental beta driver assistance software goes viral. But who are the people behind the wheel? And why does it matter?

On September 14, YouTuber Galileo Russell posted a video titled "Tesla FSD V10 Monorail Test 5X 🚝 👀." In the video, Russell engages Full Self-Driving Beta—which is not, in fact, self-driving, but a "Level 2" driver-assistance system—on the roads under the Seattle monorail, whose support structure runs along the middle of the road. Russell is excited by how well the car handles this unusual road layout, which, to his mind, shows tremendous improvement.

Russell is part of a community of YouTubers with access to FSD Beta, a pre-release version of the software Tesla sells as a subscription for $200 a month. These YouTubers, called beta testers, post videos of themselves driving their Teslas around their cities, showing what FSD Beta does and doesn't do well. They're especially interested in "edge cases," scenarios drivers—and artificial intelligence—don't encounter often, to see if FSD Beta can handle them or if it gets stuck or makes potentially dangerous mistakes. Like Russell, they often use these edge cases to gauge the software's progress over time.

But not everyone shared Russell's excitement over version 10. Hedge fund owner Taylor Ogan clipped a portion of that video for Twitter that showed Russell's Tesla swerving to take an unexpected right turn across a crosswalk, into the path of several pedestrians. Russell hit the brakes and took the wheel to prevent a catastrophe. He exclaimed, "Whoa! Shit," raised an apologetic hand to the pedestrians, said "sorry," then said, "Damn."

Ogan's tweet of this incident went viral. The Twitter reaction to the clip was overwhelmingly one of horror; after all, it sure looks like a Tesla, running controversial beta software on public roads, almost steering directly into pedestrians in a crosswalk.

After Ogan's tweet went viral, he got a direct message from Russell, according to a screenshot shared by Ogan. "Yo," Russell said, "Please delete your tweet. You can link to my video but not rip my video and give me no credit. That is illegal. I will follow up with your employer and am contacting twitter." 

Twitter did end up removing the clip from Ogan's tweet. Ogan defended posting it to Motherboard by citing fair use, and said he linked to the full video in a follow-up tweet in the thread. In any event, the clip continues to circulate on Twitter.

Motherboard has learned that every FSD Beta tester signs a non-disclosure agreement in order to be a member of the Early Access Program (EAP), a community of Tesla aficionados the company selects. This NDA, the language of which Motherboard confirmed with multiple beta testers, specifically prohibits EAP members from speaking to the media or giving test rides to the media. It also says: "Do remember that there are a lot of people that want Tesla to fail; Don't let them mischaracterize your feedback and media posts." It also encourages EAP members to "share on social media responsibly and selectively...consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared." 

Though the clip was removed from Ogan's tweet, others, including David Zipper, a visiting fellow at the Harvard Kennedy School who frequently writes about the automotive industry and government regulation for CityLab and Slate, have shared their own versions of Russell's video. In other words, a video that was supposed to demonstrate FSD Beta's improvement was going viral as evidence of how dangerous it is.

"Not sure why FSD is such a safety hazard—especially for pedestrians and cyclists?" Zipper asked in a tweet linking to the clip. "Check out this video, posted last week."

In a different world, these FSD Beta test videos would be sequestered in a quiet corner of the internet just like any other product-testing, unboxing, or brand-obsessed subculture. The people who care about them would care a lot, and everyone else would hardly even know they existed.

But one of the only things Tesla supporters and critics agree on is that these videos are important. No matter what you think of Tesla, they provide the only window into just how good (or bad) FSD is.

Owners of the vehicles opt in to being beta testers, but all the other people on the road, sidewalks, or bike lanes do not. They are all—including, for example, the pedestrians trying to cross the street in Seattle—non-consenting subjects in the experiment, whether they realize it or not. Tesla doesn't release any data on FSD Beta's performance. And, so far, federal and state regulators have been completely absent, including the National Highway Traffic Safety Administration (NHTSA), which Zipper reported has clear jurisdiction to regulate FSD. Instead, by doing nothing, NHTSA has tacitly permitted Tesla to turn American roads into testing grounds. 

As a result, these FSD Beta testing videos are more than entertainment for Tesla supporters like Russell and his fans. They're also important for people like Zipper, who are concerned that a private company is using public roads to beta test potentially dangerous software. 

"How else can we really sound the alarm but by pointing out these videos, which are one of the best pieces of evidence we have, that show what seem to be blatantly unsafe things happening to a Tesla, or the Teslas doing plainly unsafe things, when it's operating under FSD?" Zipper asked. "And so that's why I distributed [the video] because, you know, I've looked pretty hard at what Tesla's doing, and at the, the ADAS [advanced driver assistance systems] space, and to me, there's real risks here to safety that I don't think people fully appreciate." 

Other companies are beta testing experimental driving software on public roads, too: Amazon through its Zoox subsidiary, Alphabet through Waymo, and General Motors through Cruise, to name a few. But, with the exception of a few truly driverless Waymo vehicles, these companies have paid, trained safety drivers in the vehicles, who are functionally the equivalent of test pilots. They play incredibly important roles as stewards of public safety. According to government safety officials, it was the inattentiveness of one of these drivers in an Uber test vehicle that contributed, along with faulty software and a toxic company culture, to the 2018 death of Elaine Herzberg in Tempe, Arizona.

The biggest difference between what Tesla is doing and what the other self-driving car aspirants are doing is that Tesla has replaced paid, trained safety drivers with fans and investors. Because of the NDA that beta testers sign, it's difficult to speak on the record with people who have used FSD. Russell and two other beta testers declined to speak to Motherboard on the record for this article. Four other beta testers did not respond to Motherboard's interview requests. One agreed to an interview, citing the NDA, on the condition that Motherboard not reveal their identity and not ask specifically about FSD's capabilities. Motherboard sent an email to Tesla's long-abandoned inbox for media inquiries but, predictably, received no reply; the company no longer responds to requests for comment.

The idea referenced in the NDA that lots of people want Tesla to "fail" is a constant refrain among many Tesla supporters—a suggestion they have obviously gotten from Tesla and Musk himself over the years—as is the idea that Tesla critics hold Ford stock, work for oil companies, are shorting Tesla's stock, or are otherwise financially incentivized to spread misinformation about or generally hate on Tesla. Zipper said that, at least in his case, that's not true; he has no financial stake in Tesla or any of its competitors. For his part, Ogan says he owns a Tesla with FSD.

The same cannot be said for the beta testers themselves. In fact, it's not clear how much of this NDA language is even necessary, because beta testers tend to be the company's biggest fans. There is no application for the EAP; Tesla chooses its members based on criteria known only to the company, but intense brand loyalty appears to be the common theme. In an interview with The TeslaLife Show, Rafael Santoni, who is known for his YouTube channel TesLatino, said he became an FSD beta tester when "Tesla reached out to me." He didn't know why for sure, but said it may be because he has "several relationships" with Tesla, including in his role as a Tesla Club president. All the beta testers seem to be Tesla investors, fan club presidents, or otherwise people who have publicly professed deep brand loyalty to Tesla, and many invest in Elon Musk's other ventures, too, like SpaceX. The vast majority also post referral links in their YouTube and social media profiles, which earn them Tesla perks like free Supercharger miles if someone uses them, not to mention whatever advertising or other revenue they receive from running a prominent YouTube channel hyping Tesla's technology, thanks to privileged access to beta software the company could revoke at any time.

But it would also be naive to suggest beta tester motivations are purely monetary. I know about some of the beta testers' investment choices—and even the mere fact that they are beta testers—because it is typically the first fact they list in their social media profiles. Being Tesla fans, beta testers, and investors is a core part of their online identities. Of the beta testers who declined to be interviewed, most cited a fear not of being sued by Tesla for violating the NDA, but of losing beta access. Two even said their main motivation was avoiding making Tesla somehow look bad.

This highlights the paradox of this strange dynamic. A group of evangelists who believe Tesla is on the cusp of changing the world by making roads dramatically safer are posting videos to spread the good word and document its progress. And it is those videos that provide the best evidence critics have to argue the company is actually making roads more dangerous. It is a fitting dynamic, considering the quasi-religious overtones that echo in any debate about Tesla. Two groups of people watch the same video and see very different things.

Russell ended up deleting the monorail test video and, in his next video, explained why. "Tesla doesn’t want us sharing all of the clips of the videos, just, like, when it looks good, because they know people take it out of context, so I’m kind of mad at myself for leaving that in the video," he explained. "But I also wanted to show ya’ll for five tries what happened on the monorail test, and so like, not only the good stuff, you know? But unfortunately the internet can’t handle that."

How FSD Beta testers like Russell react to clips going viral is, in the absence of government oversight, an important question. In the TeslaLife interview, Santoni said "90 percent or more of the time that I'm spending testing the FSD software is not published because it would be too much work." And if people like him start self-censoring even more than they already do, it will further distort the public image of what the software is or isn't capable of. The word that constantly comes up in this equation is "context."

Russell used "context" three times in his explanation for why he deleted his own video. And the beta tester Motherboard spoke to said, "A lot of people will see just one video...which shows a very, very bad experience. However, the software is progressing so fast that even though that was just a few months ago, people might think that's the latest video, they may never look at the technology again, because the video is bad."

What both Russell and the other beta tester are alluding to is not only that people don't see the entire video a clip is from, but also that they miss the progression of the software over time. This, broadly speaking, is how beta testers form their opinions about the software. They have a baseline of what the software can and can't handle—such as the monorail test or a specific tricky intersection in their cities—and each time a new version is released, they compare its performance to that baseline. This is, after all, how most software beta tests work, but most software beta tests don't involve multi-ton vehicles moving on public roads.

"I think these videos are a great idea," the beta tester told Motherboard, "It can show people exactly how bad the technology can be and how good it can be...and that just shows the world where we're going so they're ready when it does become public, and they know what they're getting before they've actually ever experienced it."

But this appeal to "context" belies a fundamental misunderstanding of how semi-autonomous technology like FSD Beta works, said Steve Casner, a NASA researcher who has spent decades studying how humans interact with computer-controlled machines. One of the difficulties in talking about semi-autonomous technology, Casner said, is that it requires people to think completely differently about driving, because computer programs don't "think" or "learn" the way humans do.

"The one thing that really worries me about drivers is the way they build confidence in the car," Casner told Motherboard. To illustrate, he told me about a recent conversation he had with a friend who owns a Tesla. The friend sent Casner a clip of a Tesla in San Francisco moving over to provide a lane-splitting motorcycle more room. An impressive maneuver, Casner acknowledged, demonstrating what appears to be situational awareness. But then the friend kept going, drawing all kinds of inferences about what Teslas are now capable of if it can move over for motorcycles in traffic.

"That's where you got to stop," Casner recalls himself telling his friend. "The car doesn't think the way you do." The car does not see motorcycles and traffic. As Casner put it in a recent article for The Conversation, "Sure, today’s artificial intelligence systems can beat the world’s greatest chess masters, but they can also miss a flashing firetruck that’s sitting right in front of them." (Or, in Tesla's case, mistake the moon for a yellow traffic light.) "We humans draw upon a powerful and flexible commonsense understanding of the world. Our cars possess nothing like that. They know the world as data compiled from video footage."

It sounds counterintuitive, but just because a Tesla passes the monorail test today doesn't mean it is necessarily a better driver than it was yesterday, or that it will pass the monorail test tomorrow. Just because it moved over for a motorcycle in one instance doesn't mean a Tesla has "learned" to avoid motorcycles always and forever, in every driving context. And it takes little more than watching a few videos on YouTube to illustrate this point.

For example, while Russell was raving about FSD Beta 10.0, Kim Paquette in Newport, Rhode Island, was having a worse experience with the very same software. "V10 has not been good for me," she wrote in the video description. "It is failing at tasks it was able to do previously, like sharing road [sic] for oncoming traffic on narrow streets, unprotected lefts with diagonal cross traffic." Across the 18-minute video, she had to intervene several times per minute. Tesla pushed an update less than a week later, but Paquette said the improvements were small.

This may be perplexing if you expect the software to learn about the world the way humans do, but to Casner it is totally expected. He said we must resist the temptation of thinking "the car's getting better at this, the car understands this sort of situation," because "that's a human categorization, the car doesn't make human categorizations."

For now, the beta testers appear to be pretty decent safety drivers, at least so far as we can see. They are generally alert with hands near the wheel at all times and intervene quickly when needed. Ironically, this is likely because FSD beta is still so bad the drivers know it can’t handle basic situations. Casner says semi-autonomous technology becomes most dangerous when people start trusting it. And both Casner and Zipper are worried about what will happen if the software becomes just good enough that people start trusting it. 

Soon, many more people will have access to FSD Beta. Tesla owners are now able to request access by pressing a button. Musk said that “If driving behavior is good for 7 days, beta access will be granted.” Tesla later published a page explaining how it will calculate these “Safety Scores” over 30-day intervals. The company will use five metrics—forward collision warnings per 1,000 miles, hard braking, aggressive turning, unsafe following, and the number of Autopilot disengagements because the driver isn’t paying attention—to create a score from 0 to 100, with higher numbers being “safer.”
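
Tesla's exact scoring math isn't detailed here, but as a rough illustration of how five penalty-style metrics could be folded into a single 0-to-100 number, here is a minimal sketch. The equal weighting, normalization thresholds, and function name are hypothetical assumptions for illustration, not Tesla's published formula.

```python
# Hypothetical illustration only: the weights and thresholds below are
# assumptions, not Tesla's actual Safety Score formula.
def safety_score(collision_warnings_per_1k_miles, hard_braking_rate,
                 aggressive_turning_rate, unsafe_following_rate,
                 forced_autopilot_disengagements):
    """Combine five penalty metrics into a 0-100 score (higher = 'safer')."""
    # Normalize each metric to a 0-1 penalty; each is weighted equally here.
    penalties = [
        min(collision_warnings_per_1k_miles / 100.0, 1.0),
        min(hard_braking_rate, 1.0),
        min(aggressive_turning_rate, 1.0),
        min(unsafe_following_rate, 1.0),
        min(forced_autopilot_disengagements / 10.0, 1.0),
    ]
    # Map the average penalty linearly onto the 0-100 scale.
    return round(100 * (1 - sum(penalties) / len(penalties)), 1)

# Example: a driver with few warnings and no disengagements scores near 100.
print(safety_score(2, 0.05, 0.02, 0.1, 0))
```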

Casner said this is a more sensible approach to granting beta access, because “these are known ’tells’ for good and bad drivers.” However, he cautioned that anyone can pretend to be a good driver for short periods of time to game such tests.

In fact, it took less than 48 hours for people to start tweeting about themselves gaming the Safety Score, with such maneuvers as not braking for cyclists and driving purposelessly around the block over and over.

Casner said this was utterly predictable. “They knew people would try to maximize their scores,” he told Motherboard, “so they let them.”

One of the reasons, I suspect, that beta testers so often raise the question of "context" is that it lets them avoid wrestling with any particular question. There is always more context to be had. The context of a video clip on Twitter is the full YouTube video. The context of a 20-minute test drive is five test drives. The context of those five test drives is five drives done in different cities, at different times of day, in different traffic and environmental conditions. The context of those is the hundreds, even thousands, of videos made over the last year of FSD Beta. And the context of those countless videos is the data only Tesla has on how the car actually performs, a higher authority to which beta testers can always appeal, ending every debate back where it started: at the intractable difference of opinion over whether Elon Musk is a force for good or bad, a debate that is itself always being drawn into a contest of more context. In other words, one can always make an appeal to the need for more context.

But Zipper believes this is a fundamentally mistaken way of thinking about the safety of this software. "I think it's absurd," he said of the suggestion that it is all about context. "I don't care if 99 percent of the time the vehicle is driven safely. If one percent of the time the full self-driving is directing the car to oncoming traffic or into a pedestrian or cyclist, we shouldn't be having FSD anywhere."

To illustrate his point, he referred to a moment in Russell's video. As Russell spoke about why Teslas are the future and how FSD beta is improving so impressively, lamenting that people take his videos out of context, he had to stop talking so he could concentrate. The car was trying to drive into a curb cutout for a bus stop that was only about 100 feet long. It mistook the cutout for a lane. 

"This is gonna be funny," Russell said, then grabbed the wheel. "Ha! Whoa, whoa, whoa, what are you doing?" He asked the car as it tried to drive over the curb.

"OK, that wasn't good," he admitted. "But, um, yeah," he transitioned. "Tesla doesn't want us sharing all of the clips of the videos, just like when it looks good, because they know people take it out of context."

"So I’m on better behavior now," Russell said. He added that he's going to be "more selective with what I post. Because that’s the game."