It is important to note that this is with safety drivers. A professional driver plus their most advanced "Robotaxi" FSD version, under test with careful scrutiny, is 4x worse than the average non-professional driver alone, averaging 57,000 miles per minor collision.
Yet it is quite odd how Tesla also reports that untrained customers using old versions of FSD with outdated hardware average 1,500,000 miles per minor collision [1], roughly a 26x difference, when there are no penalties for incorrect reporting.
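A quick back-of-envelope check on the gap between those two figures, using only the mileage numbers quoted above:

```python
# Miles per minor collision, as quoted above.
robotaxi_miles = 57_000         # supervised Robotaxi pilot
customer_fsd_miles = 1_500_000  # Tesla's customer-FSD figure [1]

ratio = customer_fsd_miles / robotaxi_miles
print(f"{ratio:.1f}x")  # ~26.3x, i.e. roughly a 2,500% gap
```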
The comparison to human crash rates needs more context. These low-speed incidents (1-4 mph backing into a fixed object) rarely get reported in human driver statistics because they usually do not involve police reports or injuries. The NHTSA SGO database counts all ADS incidents regardless of severity, while human driver baselines come from reported incidents.
That said, the redaction issue is the real story. Waymo publishes detailed narratives. Zoox publishes detailed narratives. Tesla marks everything confidential. When every other company is transparent and one is not, that tells you something about what they are finding in the data. You cannot independently assess fault or system failure, which makes any comparison meaningless.
WarmWash today at 8:27 PM
The problem Tesla faces, and that their investors are unaware of, is that just because you have a Model Y that has driven you around for thousands of miles without incident does not mean Tesla has autonomous driving solved.
Tesla needs their FSD system to be driving hundreds of thousands of miles without incident. Not the 5,000 miles Michael FSD-is-awesome-I-use-it-daily Smith posts incessantly on X about.
There is this mismatch where overrepresented people who champion FSD say it's great and has no issues, when in reality none of them are remotely close to putting in enough miles to cross the "it's safe to deploy" threshold.
A fleet of robotaxis will do more FSD miles in an afternoon than your average Tesla fanatic will do in a decade. I can promise you that Elon was sweating hard during each of the few unsupervised rides they have offered.
lateforwork today at 7:51 PM
Tesla's Robotaxis are bringing a bad name to the entire field of autonomous driving. The average consumer isn't going to make a distinction between Tesla vs. Waymo. When they hear about these Robotaxi crashes, they will assume all robotic driving is crash prone, dangerous and irresponsible.
Traster today at 7:28 PM
As I said in earlier threads about this, it's difficult to draw statistical comparisons with humans because there's so little data. Having said that, it is clear that this system just isn't ready, and it's kind of wild that a couple of those crashes would have been easily preventable with parking sensors that come as standard equipment on almost every other car.
In some spaces we still have rule of law - when xAI started doing the deepfake nude thing we kind of knew no one in the US would do anything but jurisdictions like the EU would. And they are now. It's happening slowly but it is happening. Here though, I just don't know if there's any institution in the US that is going to look at this for what it is - an unsafe system not ready for the road - and take action.
jgalt212 today at 11:37 PM
TSLA investors don't care (as long as Musk is still there to keep them believing). Years of bad news, and the stock is only 10% off its all-time highs.
I'm not an Elon fan at all, and I'm highly skeptical of Tesla's robotaxi efforts in general, but the context here is that only one of these seems like a true crash?
I'm curious how crashes are reported for humans, because it sounds like 3 of the 5 examples listed happened at like 1-4 mph, and the fourth probably wasn't Tesla's fault (it was stationary at the time). The most damning one was a collision with a fixed object at a whopping 17 mph.
Tesla sucks, but this feels like clickbait.
vessenes today at 7:53 PM
Interesting crash list. A bunch of low speed crashes, one bus hit the Tesla while the Tesla was stationary, and one 17mph into static object (ouch).
For those complaining about Tesla's redactions - fair and good. That said, Tesla formed its media strategy at a time when gas car companies and shorts bought ENTIRE MEDIA ORGs just to trash them to back their short. Their hopefulness about a good showing on the media side died with Clarkson and co faking dead batteries in a roadster test -- so, yes, they're paranoid, but also, they spent years with everyone out to get them.
whimsicalism today at 10:28 PM
I'm no tesla lover, but I doubt that 4mph backing into an object is something that would be reported in a human context so I'm not sure a '4x' number is really comparative vs. sensationalized.
maxdo today at 9:16 PM
Electrek, as always.
```
The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.
```
So in reality: one crash with a fixed object, and the rest is... questionable, and not a "crash" as you portray it. Such statistics would not even make it into human crash reports, since they would be filed as non-driving incidents, parking lot bumps, etc.
pnw today at 10:21 PM
Electrek is a highly biased source, the editor has a grudge against Elon and Tesla. It's really unfortunate since it used to be one of the best EV sites.
DivingForGold today at 11:10 PM
Elon discarded any thoughts of LiDAR years ago. Motor Trend said the lack of LiDAR and reliance on cameras has led to specific, recurring issues that critics argue make it less "practical" for true, unsupervised autonomy.
DoesntMatter22 today at 11:32 PM
Not too bad. It'll only improve from here, and some of the accidents are reversing into poles and whatnot, most of which isn't counted in human accident statistics.
nelsonic today at 10:27 PM
Did anyone actually read the article before commenting? The crashes were all minor. No injuries. If anything this shows Tesla making an effort to report everything. A 2mph bump isn’t a “crash” it’s barely anything. The 17mph collision may have caused some minor damage to the “fixed object” but not clear from the article.
fabian2k today at 8:03 PM
It's impressive how bad they are at hiring safety drivers. This is not even measuring how good the Robotaxi itself is; right now it's only measuring how good Tesla is at running this kind of test. This is not inspiring any confidence.
Though maybe the safety drivers are good enough for the major stuff, and the software is just bad enough at low-speed, short-distance collisions, where the drivers don't notice as easily that the car is doing something wrong before it happens.
iknowstuff today at 10:45 PM
Such slop. First, they take NHTSA SGO "crashes" which explicitly includes basically any physical impact with property damage e.g. 1–2 mph “backed into a pole/tree”.
Then they compare that numerator to Tesla’s own “minor collision” benchmark — which is not police-reported fender benders; it’s a telemetry-triggered “collision event” keyed to airbag deployment or delta-V ≥ 8 km/h. Different definitions. Completely bogus ratio.
Any comparison to police-reported crashes is hilariously stupid for obvious reasons.
On top of that, the denominator is hand-waved ("~800k paid miles extrapolated"), which is extra sketchy because SGO crashes can happen during non-paid repositioning/parking while "paid miles" excludes those segments. And we’re talking 14 events in one geofenced, early rollout in Austin so your confidence interval is doing backflips. If you want a real claim vs humans, do matched Austin exposure, same reportable-crash criteria, severity stratification, and show uncertainty bands.
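To put a rough number on how wide that interval is, here's a sketch (my assumptions: the ~14 SGO events and the hand-waved ~800k-mile denominator from the article, taken at face value) of the exact Poisson 95% interval on miles per reportable event, using only the standard library:

```python
import math

def poisson_cdf(k: int, mu: float) -> float:
    """P(X <= k) for X ~ Poisson(mu), via direct summation of terms."""
    term = math.exp(-mu)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def invert_cdf(k: int, target: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Bisect for mu such that P(X <= k | mu) == target (CDF decreases in mu)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if poisson_cdf(k, mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

events, miles = 14, 800_000  # both disputed numbers, taken at face value
# Exact (Garwood) 95% bounds on the expected event count over those miles:
mu_lo = invert_cdf(events - 1, 0.975)  # lower bound, ~7.65
mu_hi = invert_cdf(events, 0.025)      # upper bound, ~23.49
print(f"expected events per 800k mi: {mu_lo:.2f} to {mu_hi:.2f}")
print(f"miles per event: {miles / mu_hi:,.0f} to {miles / mu_lo:,.0f}")
```

With only 14 events, the 95% interval on miles-per-event spans roughly 34k to 105k, about a 3x range, before you even touch the definitional mismatches above.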
But what you get instead is clickbait so stop falling for this shit please HN.
ProfessorZoom today at 9:02 PM
Is there any place online to read the incident reports? For example, for Waymo in CA there's a gov page to read them; I read 9 of them and they were all not the fault of Waymo, so I'm wondering how many of these crashes are similar (i.e., at a red light and someone rear-ends them).
legitster today at 9:40 PM
Also keep in mind all of the training and data and advanced image processing has only ever been trained on cities with basically perfect weather conditions for driving (maybe with the exception of fog in San Francisco).
We are still a long, long, long way off from someone feeling comfortable jumping in an FSD cab on a rainy night in New York.
leesec today at 9:46 PM
Funny to see the comments here vs the thread the other day where a Waymo hit a child.
There's no real discussion to be had on any of this. Just people coming in to confirm their biases.
As for me, I'm happy to make and take bets on Tesla beating Waymo. I've heard all these arguments a million times. Bet some money
bdangubic today at 11:13 PM
at this point much bigger news story might be “robotaxi made it from point A to point B in a straight line on a vacant parking lot (supervised)”
smileson2 today at 8:22 PM
ill stick to the bus
guywithahat today at 10:37 PM
This is something Electrek does regularly and isn't unique to this article, but I don't like how they suggest the Tesla crash reports are doing something shady by following the reporting guidelines. Tesla is reporting things by the book, and when Electrek doesn't like how the laws are laid out, they blame Tesla. Electrek wants Tesla to publish separate press notes, and since Tesla doesn't, Electrek takes its frustration out at the expense of the article's integrity, which is worse for everyone.
nova22033 today at 9:00 PM
He's going to fix this by having Grok redefine "widespread".
Tesla CEO Elon Musk said at the World Economic Forum in Davos that the company’s robotaxis will be “widespread” in the U.S. by the end of 2026.
jeffbee today at 9:08 PM
Their service is way worse than you think, in every way. The actual unsupervised Robotaxi service doesn't cover a geofenced area of Austin, like Waymo does. It traverses a fixed route along South Congress Avenue, like a damned bus.
ggm today at 10:28 PM
Given how minor these are, you'd think they'd get ahead of the conspiracy theories with full disclosure.
yieldcrv today at 9:36 PM
Waymo is licensing out their "Driver" software to cars that fit the specification
if Tesla drops the ego they could obtain Waymo software and track record on future Tesla hardware
Grimblewald today at 10:24 PM
I spew Elon hate every chance I get, and I maintain I am being too kind to him.
lbrito today at 9:54 PM
Now imagine if all those billions in taxes had been used to build real transit infrastructure instead of subsidizing Tesla.
chinathrow today at 8:40 PM
Well, how about time to take them off the roads then?
pengaru today at 8:06 PM
It's a fusion of jazz and funk!
hermitcrab today at 8:27 PM
"Tesla remains the only ADS operator to systematically hide crash details from the public through NHTSA’s confidentiality provisions."
Given the way Musk has lied and lied about Tesla's autonomous driving capabilities, that can't be much of a surprise to anyone.
ModernMech today at 9:30 PM
Honestly I thought everyone was clear how this was going to go after the initial decapitation from 2016, but it seems like everyone's gonna allow these science experiments to keep causing damage until someone actually regulates them with teeth.
anonym29 today at 9:05 PM
This data seems very incomplete and potentially misleading.
>The new crashes include [...] a crash with a bus while the Tesla was stationary
Doesn't this imply that the bus driver hit the stationary Tesla, which would make the human bus driver at fault and the party responsible for causing the accident? Why should a human driver hitting a Tesla be counted against Tesla's safety record?
It's possible that the Tesla could've been stopped in a place where it shouldn't have, like in the middle of an intersection (like all the Waymos did during the SF power outage), but there aren't details being shared about each of these incidents by Electrek.
>The new crashes include [...] a collision with a heavy truck at 4 mph
The chart shows only that the Tesla was driving straight at 4mph when this happened, not whether the Tesla hit the truck or the truck hit the Tesla.
Again, it's entirely possible that the Tesla hit the truck, but why aren't these details being shared? This seems like important data to consider when evaluating the safety of autonomous systems - whether the autonomous system or human error was to blame for the accident.
I appreciate that Electrek at least gives a mention of this dynamic:
>Tesla fans and shareholders hold on to the thought that the company’s robotaxis are not responsible for some of these crashes, which is true, even though that’s much harder to determine with Tesla redacting the crash narrative on all crashes, but the problem is that even Tesla’s own benchmark shows humans have fewer crashes.
Aren't these crash details / "crash narrative" a matter of public record and investigations? By e.g. either NHTSA, or by local law enforcement? If not, shouldn't it be? Why should we, as a society, rely on the automaker as the sole source of information about what caused accidents with experimental new driverless vehicles? That seems like a poor public policy choice.
outside1234 today at 8:32 PM
Just imagine how bad it is going to be when they take the human driver out of the car.
No idea how these things are being allowed on the road. Oh wait, yes I do. $$$$
LightBug1 today at 8:17 PM
Move fast and hospitalise people.
arein3 today at 8:02 PM
A minor fender-bender is not a crash
The "4x worse than humans" framing is misleading; I bet it's better than humans, by a good margin.
small_model today at 7:43 PM
The source is a well-known anti-Tesla, anti-Musk site; the owner has a psychotic hatred of Tesla and Elon after running a balanced clickbait site for years. Ignore.
ArchieScrivener today at 8:37 PM
Good, who cares. Autonomous driving is an absolute waste of time. We need autodrone transport for civilian traffic. The skies have been waiting.