For 45 minutes on Friday, Tesla CEO and X owner Elon Musk live-streamed a drive in a Tesla controlled by an unreleased version of the company’s Full Self-Driving (FSD) system. He left the Tesla Engineering HQ in Palo Alto and cruised through the city on FSD v12, which is currently unavailable even to beta testers and which Musk has called “mind-blowing.” Twenty minutes into the journey, the self-driving car lurched forward into an intersection and attempted to run a red light.
“Ohhh, intervention!” Musk can be heard saying. The term refers to any time a driver has to take control back from a self-driving system. “Sorry,” he said, and then laughed.
According to Musk, v12 runs on neural nets (deep-learning AI models) instead of hand-written code; Tesla trained the driving system on hours of footage of traffic lights, turns, and cars on the road. The drive went smoothly for the most part, but at one intersection where Musk intended the car to go straight, the left-turn arrow went green while the light for through traffic stayed red. Despite not having the go-ahead, Musk’s Tesla attempted to roll through the red light.
The model needs to watch more footage of controlled and uncontrolled left turns, Musk said.
“So with that intervention we just had, the solution is essentially to feed the network a bunch more video of traffic lights,” Musk said in the video. “That was a controlled left turn where there was a green light for the left turn but not a green light to go straight. And, so, we’ll feed it a bunch of video of controlled left turns and then it’ll work.”
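The fix Musk describes is data curation rather than a code change: gather more clips of the failure scenario and fold them into the next training run. Below is a minimal, hypothetical sketch of that idea in Python; the clip catalog, scenario tags, and function names are invented for illustration and do not reflect Tesla’s actual pipeline or data formats.

```python
import random
from dataclasses import dataclass


@dataclass
class Clip:
    """One video clip in a hypothetical training catalog."""
    clip_id: str
    scenario: str  # invented tag, e.g. "controlled_left_turn"


def build_training_mix(catalog, boost_scenario, boost_factor=3, seed=0):
    """Oversample clips of one scenario so the next training run sees more of it.

    Illustrates data-centric iteration in general, not Tesla's method.
    """
    rng = random.Random(seed)
    boosted = [c for c in catalog if c.scenario == boost_scenario]
    # Keep the full catalog and add extra copies of the weak scenario.
    mix = list(catalog) + boosted * (boost_factor - 1)
    rng.shuffle(mix)
    return mix


if __name__ == "__main__":
    catalog = [
        Clip("a1", "controlled_left_turn"),
        Clip("b2", "uncontrolled_left_turn"),
        Clip("c3", "straight_through_green"),
        Clip("d4", "controlled_left_turn"),
    ]
    mix = build_training_mix(catalog, "controlled_left_turn")
    print(f"{len(mix)} clips in next training mix")
```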
Dan O’Dowd, a software CEO who wants to ban Tesla’s FSD due to safety concerns, noted the moment on X. “Watch this @Tesla FSD Beta v12 try to run a red light, forcing Elon Musk to take over. ‘Intervention!’ @ElonMusk your defective and dangerous software has no place on our public roads and should be banned immediately,” O’Dowd tweeted.
As usually happens when someone criticizes Musk on the platform he owns, an army of reply guys with blue checkmarks descended on O’Dowd. They claimed that a single intervention over the course of the drive was a good result and that v12 is still in beta. “Focusing on the 1 negative out of the entire 40 minutes video showcasing mind-blowing improvements…oh Dan,” one blue-check user with 119 followers wrote.
“Mrs. Lincoln, other than that, how did you like the play?” O’Dowd responded.
A community note was added to O’Dowd’s original tweet. “The version of the software demonstrated here is not available to the public,” it reads.
The safety of Tesla’s Autopilot and Full Self-Driving systems has come under increasing scrutiny in recent years. Tesla is facing multiple lawsuits in the U.S. this year in which plaintiffs allege that Autopilot was to blame for drivers’ deaths.
To be fair to Tesla’s self-driving system, making a left turn at an intersection is one of the deadliest and most complicated things a human driver does in a car. A disproportionate share of traffic accidents involve left-hand turns. Problems with left-hand turns are so common that UPS has a policy against them, a restriction that has saved the company fuel and prevented accidents.
Tesla did not respond to Motherboard’s request for comment.