Elon Musk has long used his mighty Twitter megaphone to amplify the idea that Tesla’s automated driving software isn’t just safe — it’s safer than anything a human driver can achieve.
That campaign kicked into overdrive last fall when the electric-car maker expanded its Full Self-Driving “beta” program from a few thousand people to a fleet that now numbers more than 100,000. The $12,000 feature purportedly lets a Tesla drive itself on highways and neighborhood streets, changing lanes, making turns and obeying traffic signs and signals.
As critics scolded Musk for testing experimental technology on public roads without trained safety drivers as backups, Santa Monica investment manager and vocal Tesla booster Ross Gerber was among the allies who sprang to his defense.
“There hasn’t been one accident or injury since FSD beta launch,” he tweeted in January. “Not one. Not a single one.”
To which Musk responded with a single word: “Correct.”
In fact, by that time dozens of drivers had already filed safety complaints with the National Highway Traffic Safety Administration over incidents involving Full Self-Driving — and at least eight of them involved crashes. The complaints are in the public domain, in a database on the NHTSA website.
One driver reported FSD automatically “jerked right toward a semi truck” before accelerating into a median post, causing a wreck.
“The car went into the wrong lane” with FSD engaged “and I was hit by another driver in the lane next to my car,” another said.
YouTube and Twitter are rife with videos that reveal FSD misbehavior, including a recent post that appears to show a Tesla steering itself into the path of an oncoming train. The driver yanks the steering wheel to avert a head-on crash.
It’s nearly impossible for anyone but Tesla to say how many FSD-related crashes, injuries or deaths have occurred; NHTSA is investigating several recent fatal crashes in which FSD may have been engaged. The agency recently ordered automakers to report serious crashes involving automated and semi-automated technology, but it has yet to release crash-by-crash detail to the public.
Robot cars operated by companies such as Cruise, Waymo, Argo and Zoox are equipped with over-the-air software that reports crashes to the company immediately. Tesla pioneered such software in passenger cars, but the company, which does not maintain a media relations office, did not respond to questions about whether it receives automated crash reports from cars running FSD. Carmakers without over-the-air software must rely on public reports and communications with drivers and service centers to judge whether an NHTSA report is necessary.
Attempts to reach Musk were also unsuccessful.
Gerber said he was not aware of the crash reports in NHTSA’s database when he posted his tweet, but believed the company would have known about any collisions. “Due to the fact that Tesla records everything that happens, Tesla’s aware of each incident,” he said. He said it was possible the drivers were at fault in the crashes but he had not reviewed the reports himself.
Accurate public statistics on automated car crashes currently don’t exist because police officers who write up crash reports only have the drivers’ statements to go by. “We’re not experts on how to pull that kind of data,” said Amber Davis, spokesperson for the California Highway Patrol. “At the end of the day, we’re asking for best recollections about how [a crash] happened.”
Exactly what data a Tesla vehicle’s automated driving system collects and transmits back to headquarters is known only to Tesla, notes Mahmood Hikmet, head of research and development at autonomous shuttle company Ohmio. He said Musk’s definition of a crash or an accident might differ from how an insurance company or an average person might define it. NHTSA requires crash reports for fully or partly automated vehicles only if someone is injured, or an air bag is deployed, or a car must be towed away.
The FSD crash reports were first unearthed by FSD critic Taylor Ogan, who runs Snow Bull Capital, a China-oriented hedge fund. The Times separately downloaded and evaluated the data to verify Ogan’s findings.
The data — covering a period from Jan. 1, 2021, to Jan. 16, 2022 — show dozens of safety complaints about FSD, including many reports of phantom braking, in which a car’s automatic emergency braking system slams on the brakes for no apparent reason.
Below are excerpts from the eight reports of crashes in which FSD was engaged:
- Southampton, NY: A Model 3 traveling at 60 miles per hour collided with an SUV parked on the highway shoulder. The Tesla drove itself “straight through the side of the SUV, ripping off the car mirror.” The driver called Tesla to say “our car had gone crazy.”
- Houston: A Model 3 was traveling at 35 mph “when suddenly the car jumped over the curb, causing damage to the bumper, to the wheel and a flat tire.” The crash “appeared to be caused by a discolored patch in the road that gave the FSD the false perception of an obstacle which it tried to avoid.” Rejecting a warranty claim, a Tesla service center charged $2,332.37 and said it wouldn’t return the car until the bill was paid.
- Brea, Calif.: “While taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my car.” The car “by itself took control and forced itself into the incorrect lane … putting everyone involved at risk. Car is severely damaged on the driver side.”
- Collettsville, NC: “The road curved to the left and as the car took the turn it took too wide of a turn and veered off the road…. The right side of car went up and over beginning of rock incline. The front right tire blew out and only the side air bags deployed (both sides.) The car traveled about 500 yards along the road and then turned itself off.” The estimated damages were $28,000 to $30,000.
- Troy, Mo.: A Tesla was turning through a curve when “suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods, causing significant damage to the vehicle.”
- Jackson, Mo.: A Model 3 “jerked right toward a semi truck, then jerked left toward the posts in the median as it was accelerating and FSD would not turn off.… We owned this car for 11 days when our wreck happened.”
- Hercules, Calif.: “Phantom braking” caused the Tesla to suddenly stop, and “the vehicle behind me didn’t react.” A rear-end collision caused “serious damage to the vehicle.”
- Dallas: “I was driving on full self driving assistance … a car was in my blind spot so I tried to take over the car by tugging the wheel. The car sounded an alarm indicating I was going to crash into the left hand median. I believe I was fighting with the car to regain control of the car and ended up hitting the left median which ricochet[ed] the car all the way to the right, hitting the median.”
Critics say that the name Full Self-Driving is a misnomer, and that no car available for sale to an individual in the US can drive itself. FSD “is entirely a fantasy,” said New York University professor Meredith Broussard, author of the book “Artificial Unintelligence,” published by MIT Press. “And it’s a safety nightmare.”
California regulations bar a company from advertising a car as full self-driving when it’s not. The state Department of Motor Vehicles is conducting a review of Tesla’s marketing, a review now well into its second year.
DMV head Steve Gordon has declined to speak publicly about the matter since May 2021. On Wednesday, the department said, “The review is ongoing. Will let you know when we have something to share.”