Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.
Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.
Oh boy, and they just removed “steering wheel nag” in a recent update. I can’t imagine that will have any unintended consequences.
Not really. They only removed the unprompted nag. If you're not constantly keeping a hand on the wheel and looking at the road, it nags more and will pull the car over if you keep ignoring it.
If you turn off the internal driver monitoring camera, you can’t engage FSD or even use lane assist.
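For anyone who hasn't used it, here's a toy sketch of that escalation in code. The thresholds and names are completely made up; this illustrates the nag-then-lockout behavior described above, not Tesla's actual implementation:

```python
# Toy sketch of the nag -> louder nag -> lockout escalation described
# above. Thresholds and names are invented for illustration; this is
# not Tesla's actual logic.
from enum import Enum, auto

class Enforcement(Enum):
    OK = auto()
    NAG = auto()
    URGENT_NAG = auto()
    PULL_OVER_AND_DISENGAGE = auto()

def enforcement_level(seconds_inattentive: float, camera_enabled: bool) -> Enforcement:
    """Map how long the driver has been inattentive to an enforcement step."""
    if not camera_enabled:
        # Per the comment above: no driver-monitoring camera, no FSD at all.
        raise RuntimeError("FSD unavailable: driver monitoring disabled")
    if seconds_inattentive < 3:
        return Enforcement.OK
    if seconds_inattentive < 8:
        return Enforcement.NAG
    if seconds_inattentive < 15:
        return Enforcement.URGENT_NAG
    return Enforcement.PULL_OVER_AND_DISENGAGE
```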
As someone with more than a basic understanding of technology and how self-driving works, I would think the end user would take special care driving in fog, since the car relies on cameras to identify roads and objects. This is clearly user error.
Lol
Linkin.park-numb.mp3.exe
Leaving room for user error in this sort of situation is unacceptable at Tesla's scale and with their engineering talent, hamstrung as it is by their deranged leadership.
If you are in the driver’s seat, you are 100% responsible for what your car does. If you let it drive itself into a moving train, that’s on you.
I cannot fathom how anyone can honestly believe Tesla is entirely faultless in any of this, completely and totally free of any responsibility whatsoever.
I’m not gonna say they’re 100% responsible but they are at least 1% responsible.
If Tesla is at fault for an inattentive driver ignoring the myriad warnings he got to remain attentive when he enabled FSD and allowing the 2 ton missile he’s sitting in to nearly plow into a train, then Dodge has to be responsible for the Challenger being used to plow into those protestors in Charlottesville.
God fucking damn it, why do you people insist on making me defend fucking Tesla?!
This is clearly user error.
When it’s been advertised to the user as “full self driving”, is it?
Furthermore, can't the car recognize that visibility is low and alert the user and/or refuse to go into self-driving?
When FSD is active in subpar circumstances, there are many quite loud alerts about how it is degraded, and the car will slow down. That video was pretty foggy; I'd say the dude wasn't paying attention.
I came up on a train crossing Sunday evening in the dark, something I hadn't encountered in FSD before, so I decided to just hit the brakes myself. The car saw the crossing arms as blinking stoplights, so it probably wouldn't have stopped?
Either way that dude was definitely not paying attention.
When it’s been advertised to the user as “full self driving”, is it?
I wouldn’t believe an advertisement.
I wouldn’t trust Musk with my life either.
But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.
If a user does as advertised and something goes wrong, I do believe it's the advertiser who is liable.
But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.
Keyword: presumably.
If the product doesn't do what it says it does, that's the product's/manufacturer's fault, not the user's fault. Wtf lol, how is this even a debate?
Right. But can you blame the user for trusting the advertisement?
Full Self Demolition
Every couple of months there’s a new story like this. And yet we’re supposed to believe this system is ready for use…
It is ready because Musk needs it to be ready. Watch out, this comment may bring morale down, and Elron will be forced to… cry like a baby 😆
Didn’t he recently claim Tesla robotaxi is only months away?
Well, I suppose he didn't say how many months, but the implication was less than a year, which has been his claim every year since 2016. He said that Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robotaxis when they weren't driving, charging a small fee, "by next year"… in 2019. Back then he promised 1 million robotaxis nationwide in under a year. Recently he gave the date of August 8 to reveal a new robotaxi model. So, by Cybertruck estimates, I would say a Tesla robotaxi is a possibility by late 2030.
He is just spewing shit to keep the stock price afloat, as usual.
He also said they were ready to manufacture the 2nd-generation Tesla Roadster "now," which was back in 2014. No points for guessing that, as of yet (despite taking in millions of dollars in preorders), they have not produced a single one.
Given this very early and still quite relevant warning, I’m astounded that anyone is dumb enough to believe any promise Elon makes about anything.
Every couple of months you hear about an issue like this, just like you hear about every airline malfunction. It ignores the base rate of safe performance, which is very high.
FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.
This isn't actually true. The Tesla Full Self-Driving issues we hear about in the news are the ones that result in fatal and near-fatal accidents, but the forums are chock-full of reports from owners of the thing malfunctioning on a regular basis.
It IS actually true. It does goofy stuff in some situations, but on the whole it's a little better than your typical relatively inexperienced driver. It sometimes gets it wrong about when to be assertive and when to wait, it thinks there's enough space for a courteous merge when there isn't (it does some Chicago-style merges sometimes), it follows the lines on the road like they're gospel, and it doesn't always properly judge how to come to a smooth and comfortable stop. These are annoying things, but not outrageous, provided you're paying attention like you're obliged to do.
I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can’t reboot without having a disparaging article written about it.
Also fuck elon, because I don’t think it gets said enough.
Seriously, you sound like a Mac user in the '90s. "It only crashes 8 or 9 times a day, it's so much better than it used to be. It's got so many great features that I'm willing to deal with a little inconvenience…" The difference being that when a Mac crashes it just loses some data and has to reboot, but when a Tesla crashes, people die.
These are serious rate differences man.
Every driver, and even Tesla, will tell you it's a work in progress, and you'd be hard-pressed to find someone who has had an accident with it. I'd be willing to bet money that IF you find someone who has had an accident, they have a driving record that's shitty without it too.
If you want to talk stats, let’s talk stats, but “It seems like Tesla is in the news a lot for near crashes” is a pretty weak metric, even from your armchair.
typical relatively inexperienced driver
Look at the rates at which teenagers crash; this is an indictment.
provided you are paying attention
It was advertised as fully autonomous, dude. People wouldn't have this much of a hard-on for trashing it if it wasn't so oversold.
This fully-autonomous argument is beat to death already. Every single Tesla owner knows you're supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you're distracted, but ultimately it's always the driver who's responsible. FSD is no different.
If it's not fully capable of self-driving, then maybe they shouldn't call it Full Self-Driving.
You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn’t murder!
Cute.
Here’s some actual information
People are terrible at probability estimation, and even with two fatal accidents a month FSD is likely still safer than most of the people on the road per million miles driven.
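To make the base-rate point concrete, here's a back-of-the-envelope comparison. The FSD mileage and crash counts below are deliberately invented placeholders; only the roughly 1.3 fatalities per 100M vehicle-miles human baseline is close to a published US average:

```python
# Back-of-the-envelope base-rate comparison. The FSD figures here are
# invented placeholders; only the human baseline (~1.3 fatalities per
# 100M vehicle-miles in the US) is approximately a real statistic.
fsd_miles_per_month = 100_000_000   # assumed FSD miles driven per month
fsd_fatal_crashes_per_month = 2     # "two scary headlines a month"

human_rate = 1.3  # approx. US fatalities per 100M vehicle-miles

fsd_rate = fsd_fatal_crashes_per_month / (fsd_miles_per_month / 100_000_000)
print(f"FSD (assumed):  {fsd_rate:.1f} fatal crashes per 100M miles")
print(f"Human baseline: {human_rate:.1f} fatalities per 100M miles")
# Headlines count numerators; without the denominator (miles driven),
# two incidents a month tells you nothing about the actual rate.
```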
I see you’ve decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.
Whatever you say, Mr. Dahmer-joke-instead-of-content. I see that was really all in good faith, and maybe I unintentionally hurt your feelings by citing a source on base-rate biases?
What data would you like me to bring for discussion, since you've been so open thus far? Do you want me to bring some data showing that Teslas spend more time not having accidents than having accidents? I'm happy to go do some homework to enrich this interaction.
It’s not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing a number, the point is that memorable bad press and bad stats are not the same.
In what way is it not ready to use? Do cars have some other driver-assist features that are foolproof? You're not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.
Of what words is FSD an acronym?
You’re not supposed to blindly trust any of those. Why would FSD be an exception?
Because that’s how Elon (and by extension Tesla) market it. Full self driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.
And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.
Full Self Driving (Beta), nowadays Full Self Driving (Supervised).
Which of those names inspires enough trust to put your life in its hands?
It’s not in fine print. It’s told to you when you purchase FSD and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone it starts nagging at you eventually locking you out of the feature. Why would they put driver monitoring system in place if you’re supposed to put blind faith into it?
That is such an old, beat-up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.
It isn’t Full Self Driving if it is supervised.
It’s especially not Full Self Driving if it asks you to intervene.
It is false advertising at best, deadly at worst.
Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, the design should be inherently safe, preventing situations like this from happening even with an idiot at the wheel.
ESP isn't idiot-proof either, just to name one such feature that's been available for decades. It assists the driver but doesn't replace them.
Hell, cars themselves are not idiot proof.
No, the standards people are applying to it are the bare minimum for a full self driving system like what musk claims.
It's a level 2 self-driving system, which by definition requires driver supervision. It's even stated in the name. What are the standards it doesn't meet?
Because it’s called Full Self Drive and Musk has said it will be able to drive without user intervention?
The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.
The car maybe not, but the marketing sure does
Marketing besides the naming we have already established, and Elon himself masturbating to it? Is there some other marketing that pushes this narrative? Because I certainly have not seen it.
It’s called Full Self Driving (Supervised)
Yeah, it will be able to drive without driver intervention eventually. At least that's their goal. Right now, however, it's level 2, and no one is claiming otherwise.
In what way is it not ready to use?
Full Self Driving (sike!)
It’s unreasonable for FSD to see a train? … that’s 20ft tall and a mile long? Am I understanding you correctly?
Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.
Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.
Also, the car did see the train. It just clearly didn't understand what it was or how to react to it. That's why the car has a driver who does. I'm sure this exact edge case will be added to the training data so this doesn't happen again. Stuff like this takes ages to iron out. FSD is not a finished product; it's under development, receives constant updates, and keeps improving. That's why it's classified as level 2 and not level 5.
Yes. It's unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn't mean it's obvious to the AI.
In what way is it not ready to use?
To me it seems you just spent three paragraphs answering your own question.
can’t even see 50 meters ahead
didn’t understand what it was and how to react to it
FSD is not a finished product. It’s under development
doesn’t mean it’s obvious to the AI
If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”
You can’t see 50 meters ahead in that fog.
Completely true. And I would dictate my driving characteristics based on that fact.
I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.
I agree. In fact, I'm surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.
LIDAR can
Yeah, there's a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don't see why cameras alone wouldn't be sufficient. The issue here is not that it didn't see the train (it's on video, after all) but that it didn't know how to react to it.
What's so "new" about this concern? I'd probably be able to afford a house if I had a dollar for every article I saw about Teslas wrecking or nearly wrecking because of FSD.
I don't see any information about the crossing. Was it a crossing without gates? The sensors must've picked those up when driving towards it. If so, it's a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.
Not being able to identify a railroad crossing without a gate is a failing of the car not the train. Gated crossings are not guaranteed, nor should they be because they don’t make sense for every situation in which roads and tracks cross.
they don’t make sense for every situation in which roads and tracks cross.
The video in the article shows lowered arms flashing. Very visible with plenty of time to stop despite the foggy conditions. It just didn’t.
Stuff like that happens when you opt for visual obstacle detection instead of lidar.
Ah. I've read it, but I have media turned off, so I didn't see the video. Thanks for the clarification!
Yep of course!
When you look at the development notes on self-driving at Tesla, anyone with a brain wouldn't trust that shit, not even a little bit. Most of what they did was placate Musk's petty whims and delusions. Any real R&D issues were basically glossed over or given quick software fixes.
Demonstrate what you mean because it really sounds like you’re describing what you feel should be true to justify your emotions about the bad Twitter man.
And to be clear, I mean link the documents you claim to have read and the parts of them you claim demonstrate this.
You just need to Google "Tesla self-driving development engineers and Elon Musk" and you'll find lots of articles. Here's one of the few that wasn't paywalled.
https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8
This is an article from 2021 about a book researched in 2019.
Yeah, during development of the Tesla self driving system.
Read the development notes from the first years of any technology you use. The research you’re “referencing” is six years old at this point.
What’s next? You going to criticize an iPod Nano to make a point about the broken screen on your iPhone 8? Criticize Google assistant from 2019 to harangue OpenAI?
Look at what six years of development means: https://youtu.be/qTDlRLeDxxM?si=dFZzLcO_a8wfy2QS
Are there notes available to read?
Here's one of the few articles that wasn't paywalled when I pulled up a Google search on it. It's really not hard to find; the stories are all over the place.
https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8
I hope Tesla owners only get themselves killed.
What a horrible thing to say, especially since Elon and Tesla have only relatively recently turned to absolute shit. There are a lot of Tesla drivers who don't support what he has done to the company and all that.
Here you are advocating for the death of people because they purchased a vehicle. A lot of people bought Teslas as they were one of the better EVs at the time during Tesla’s climb to their peak (which they have since fallen very far from). They too deserve death?
Here you are advocating for the death of people because they purchased a vehicle.
No; I'm expressing the same sentiment that I express for motorcycle riders who refuse to wear a helmet. I really, genuinely don't care if they beat their brains out on the front bumper of a Hyundai, but I don't think they get to force a Hyundai driver to hose brains off their car.
Teslas are death traps. Their owners can make that choice for themselves but I don’t think they get to make it for others, which is what they try to do every time they turn on that self-driving feature.
AI said kill this guy right here
It noticed he didn’t like Elon’s tweet that day.
You know… I’d believe Elon is petty enough to actually put something like that in.
He fortunately avoided the train, but unfortunately still owns a Tesla.
So this wasn't even a turn towards an intersection or anything complicated. The Tesla can't even tell there's a 12" steel wall in front of it. Fucking pathetic.
How would it though? It probably didn’t have any images like this in the train-ing data.
I guess some sort of radar could identify solid objects in nearly any condition… Hmm…
The new models with Hardware 4 (at least Models S and X) have a radar, but then again humans can manage without, so I have no doubt that a vision-based system will be more than sufficient in the end.
This is showing it works, or no? I can't tell, and there isn't audio; it seems like it stopped correctly.
Definitely shows it working.
Yeah, I was thinking maybe the weird flashing lights on the screen were being pointed to as not working right, or something to that effect? Idk lol, no context provided at all.
10x safer than a human!
*a drunken human
I don’t want to disagree, but I would like a source to support this claim
Musk has multiple times stated that FSD is safer than human driving. I'm not gonna bother finding the vids as I'm at work.
That exclamation point in the comment you replied to should be your hint that it’s sarcasm.
No, Musk said this at one point at some press conference
You're not the person I was replying to, but okay… what has that got to do with the sarcasm of the comment I was referring to?
Feels like these things were more capable a decade ago when they had radar.
Not that they should be called "full self driving" either then or now, but at least radar can deal with fog better than regular-ass cameras.
For now, cars need more than computer vision to navigate, because cameras by themselves don't help a car spatially orient itself in its environment. What might help? I think the consensus is that the cameras need a 360-degree view of the surroundings, and the car needs a method for making sense of those inputs without that understanding being the focus of attention.
It seems Teslas do add sensors in appropriate locations to be able to do that, but there’s some disconnect in reconciling the information: https://www.notateslaapp.com/news/1452/tesla-guide-number-of-cameras-their-locations-uses-and-how-to-view. A multi-modal sensing system would bypass reliance on getting everything right via CV.
Think of yourself focusing on an object in the distance and moving toward it: while you're using your eyes to look at it, you're subconsciously computing relative distance and speed as you approach it. It's your subconscious memory of your 3D spatial orientation that helps you make corrections and adjustments to your speed and approach. Beyond better hardware that can reconcile these different inputs, relying on multiple sensor modalities would be the most robust approach for autonomous vehicles.
Humans essentially keep track of their body in 3D space and time without thinking about it, and actually most multicellular organisms have learned to do this in some manner.
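As a concrete illustration of that "reconciling" step, here's a minimal inverse-variance fusion sketch. The sensor readings and variances are made up, and this is one textbook fusion technique, not Tesla's actual pipeline:

```python
# Toy inverse-variance fusion of two independent range estimates.
# All numbers are made up for illustration; this is not Tesla's pipeline.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Fuse two independent Gaussian estimates of the same distance.

    Weights each estimate by the inverse of its variance, so the
    less noisy sensor dominates. Returns (fused_estimate, fused_variance).
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# In fog, camera depth gets noisy (large variance) while radar stays
# tight, so the fused estimate leans heavily on the radar reading.
camera_range_m, camera_var = 80.0, 400.0   # camera: 80 m, very uncertain in fog
radar_range_m, radar_var = 62.0, 4.0       # radar: 62 m, confident

distance, variance = fuse(camera_range_m, camera_var, radar_range_m, radar_var)
print(f"fused distance: {distance:.1f} m (variance {variance:.2f})")
# -> about 62.2 m; a camera-only system would have to trust the 80 m guess
```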