Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a train crossing with no apparent deceleration. He insists the car was in Full Self-Driving mode and never slowed on its own.

  • Furbag@lemmy.world
    1 year ago

    Oh boy, and they just removed “steering wheel nag” in a recent update. I can’t imagine that will have any unintended consequences.

    • WoahWoah@lemmy.world
      1 year ago

      Not really. They just removed unprompted nag. If you’re not constantly keeping a hand on the wheel and looking at the road, it nags more and will pull you over if you ignore it.

      If you turn off the internal driver monitoring camera, you can’t engage FSD or even use lane assist.
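
      The escalation described above can be sketched as a tiny state machine. This is a hypothetical illustration of the policy the comment describes, not Tesla’s actual code; the thresholds and action names are invented.

```python
# Hypothetical driver-monitoring escalation policy (invented thresholds):
# sustained inattention escalates from a visual nag to an audible nag to
# forced disengagement, and any sign of attention resets the clock.
from dataclasses import dataclass


@dataclass
class MonitorState:
    inattentive_seconds: float = 0.0


def escalation_step(state: MonitorState, eyes_on_road: bool,
                    hands_on_wheel: bool, dt: float) -> str:
    """Advance the monitor by dt seconds and return the action to take."""
    if eyes_on_road or hands_on_wheel:
        state.inattentive_seconds = 0.0
        return "none"
    state.inattentive_seconds += dt
    if state.inattentive_seconds < 3.0:
        return "visual_warning"
    if state.inattentive_seconds < 8.0:
        return "audible_warning"
    return "disengage_and_slow"
```

      The design point: unprompted periodic nags can be removed without removing enforcement, because the enforcement is driven by observed inattention rather than by a timer.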

  • ElPenguin@lemmynsfw.com
    1 year ago

    As someone with more than a basic understanding of technology and how self driving works, I would think the end user would take special care driving in fog since the car relies on cameras to identify the roads and objects. This is clearly user error.

    • Noxy@yiffit.net
      1 year ago

      Leaving room for user error in this sort of situation is unacceptable at Tesla’s scale and with their engineering talent, as hamstrung as it is by their deranged leadership.

      • SaltySalamander@fedia.io
        1 year ago

        If you are in the driver’s seat, you are 100% responsible for what your car does. If you let it drive itself into a moving train, that’s on you.

        • Noxy@yiffit.net
          1 year ago

          I cannot fathom how anyone can honestly believe Tesla is entirely faultless in any of this, completely and totally free of any responsibility whatsoever.

          I’m not gonna say they’re 100% responsible but they are at least 1% responsible.

          • SaltySalamander@fedia.io
            1 year ago

            If Tesla is at fault for an inattentive driver ignoring the myriad warnings he got to remain attentive when he enabled FSD and allowing the 2 ton missile he’s sitting in to nearly plow into a train, then Dodge has to be responsible for the Challenger being used to plow into those protestors in Charlottesville.

            God fucking damn it, why do you people insist on making me defend fucking Tesla?!

    • tb_@lemmy.world
      1 year ago

      This is clearly user error.

      When it’s been advertised to the user as “full self driving”, is it?

      Furthermore, can’t the car recognize that visibility is low and alert the user and/or refuse to go into self-driving?
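
      A gate like the one being asked for is straightforward to express. This is a hypothetical sketch with invented parameters, not anything Tesla ships: refuse to engage when estimated visibility is shorter than the stopping distance at the current speed.

```python
# Hypothetical visibility gate: only allow engagement if the car can see
# farther than it needs to stop. Reaction time and deceleration values
# are illustrative assumptions, not any manufacturer's numbers.
def stopping_distance_m(speed_mps: float, reaction_s: float = 1.5,
                        decel_mps2: float = 6.0) -> float:
    """Reaction distance plus braking distance at a given deceleration."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)


def may_engage_fsd(estimated_visibility_m: float, speed_mps: float) -> bool:
    """Engage only when visibility exceeds the required stopping distance."""
    return estimated_visibility_m > stopping_distance_m(speed_mps)
```

      At highway speed (30 m/s) this needs roughly 120 m of visibility, so 50 m of fog would refuse engagement; at low speed the same fog would pass.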

      • darganon@lemmy.world
        1 year ago

        There are many quite loud alerts about degraded performance when FSD is active in subpar conditions, and the car will slow down. That video was pretty foggy; I’d say the dude wasn’t paying attention.

        I came up on a train Sunday evening in the dark, which I hadn’t had happen in FSD before, so I decided to just hit the brakes. It saw the crossing arms as blinking stoplights, so it probably wouldn’t have stopped?

        Either way, that dude was definitely not paying attention.

        • tb_@lemmy.world
          1 year ago

          I wouldn’t trust Musk with my life either.

          But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

          If a user does as is advertised and something goes wrong I do believe it’s the advertiser who is liable.

          • 0x0@programming.dev
            1 year ago

            But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

            Keyword presumably.

            • jaybone@lemmy.world
              1 year ago

              If the product doesn’t do what it says it does, that’s the product/manufacturer’s fault, not the user’s fault. Wtf lol how is this even a debate.

  • FangedWyvern42@lemmy.world
    1 year ago

    Every couple of months there’s a new story like this. And yet we’re supposed to believe this system is ready for use…

    • darki@lemmy.world
      1 year ago

      It is ready because Musk needs it to be ready. Watch out, this comment may bring the morale down, and Elron will be forced to … Cry like a baby 😆

      • Buffalox@lemmy.world
        1 year ago

        Didn’t he recently claim Tesla robotaxi is only months away?
        Well I suppose he didn’t say how many months, but the implication was less than a year, which has been his claim every year since 2016.

        • dustyData@lemmy.world
          1 year ago

          He said that Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robotaxis when they weren’t using their car and charge a small fee, by next year… in 2019. Back then he promised 1 million robotaxis nationwide in under a year. Recently he gave the date of August 8 to reveal a new robotaxi model. So, by Cybertruck estimates, I would say a Tesla robotaxi is a possibility by late 2030.

          He is just spewing shit to keep the stock price afloat, as usual.

          • dual_sport_dork 🐧🗡️@lemmy.world
            1 year ago

            He also said they were ready to manufacture the 2nd generation Tesla Roadster “now,” which was back in 2014. No points for guessing that as of yet (despite taking in millions of dollars in preorders) they have not produced a single one.

            Given this very early and still quite relevant warning, I’m astounded that anyone is dumb enough to believe any promise Elon makes about anything.

    • dream_weasel@sh.itjust.works
      1 year ago

      Every couple of months you hear about an issue like this, just like you hear about every airline malfunction. That ignores the base rate of correct performance, which is very high.

      FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.

      • buddascrayon@lemmy.world
        1 year ago

        This isn’t actually true. The Tesla full self driving issues we hear about in the news are the ones that result in fatal and near fatal accidents, but the forums are chock full of reports from owners of the thing malfunctioning on a regular basis.

        • dream_weasel@sh.itjust.works
          1 year ago

          It IS actually true. It does goofy stuff in some situations, but on the whole is a little better than your typical relatively inexperienced driver. It gets it wrong about when to be assertive and when to wait sometimes, it thinks there’s enough space for a courteous merge but there isn’t (it does some Chicago style merges sometimes), it follows the lines on the road like they are gospel, and doesn’t always properly estimate how to come to a smooth and comfortable stop. These are annoying things, but not outrageous provided you are paying attention like you’re obliged to do.

          I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can’t reboot without having a disparaging article written about it.

          Also fuck elon, because I don’t think it gets said enough.

          • buddascrayon@lemmy.world
            1 year ago

            Seriously you sound like a Mac user in the '90s. “It only crashes 8 or 9 times a day, it’s so much better than it used to be. It’s got so many great features that I’m willing to deal with a little inconvenience…” Difference being that when a Mac crashes it just loses some data and has to reboot but when a Tesla crashes people die.

            • dream_weasel@sh.itjust.works
              1 year ago

              These are serious rate differences man.

              Every driver, and even Tesla, will tell you it’s a work in progress, and you’d be hard-pressed to find someone who has had an accident with it. I’d be willing to bet money that IF you find someone who has had an accident, they have a driving record that’s shitty without it too.

              If you want to talk stats, let’s talk stats, but “It seems like Tesla is in the news a lot for near crashes” is a pretty weak metric, even from your armchair.

          • bane_killgrind@lemmy.ml
            1 year ago

            typical relatively inexperienced driver

            Look at the rates at which teenagers crash; this is an indictment.

            provided you are paying attention

            It was advertised as fully autonomous dude. People wouldn’t have this much of a hard-on for trashing it if it wasn’t so oversold.

            • Thorny_Insight@lemm.ee
              1 year ago

              This fully autonomous argument is beat to death already. Every single Tesla owner knows you’re supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you’re distracted, but ultimately it’s always the driver who’s responsible. FSD is no different.

              • Pazuzu@midwest.social
                1 year ago

                If it’s not fully capable of self driving then maybe they shouldn’t call it full self driving

      • lolcatnip@reddthat.com
        1 year ago

        You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn’t murder!

          • lolcatnip@reddthat.com
            1 year ago

            I see you’ve decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.

            • dream_weasel@sh.itjust.works
              1 year ago

              Whatever you say Mr Dahmer joke instead of content. I see that was really all in good faith and maybe I unintentionally hurt your feelings by citing a source on base rate biases?

              What data would you like me to bring for discussion since you’ve been so open thus far? Do you want me to bring some data showing that teslas spend more time not having accidents than having accidents? I’m happy to go do some homework to enrich this interaction.

              It’s not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing a number, the point is that memorable bad press and bad stats are not the same.

    • Thorny_Insight@lemm.ee
      1 year ago

      In what way is it not ready to use? Do cars have some other driver-assistance features that are foolproof? You’re not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

      • assassin_aragorn@lemmy.world
        1 year ago

        You’re not supposed to blindly trust any of those. Why would FSD be an exception?

        Because that’s how Elon (and by extension Tesla) market it. Full self driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

        And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.

        • Thorny_Insight@lemm.ee
          1 year ago

          Full Self Driving (Beta), nowadays Full Self Driving (Supervised)

          Which of those names inspires enough trust to put your life in its hands?

          It’s not in fine print. It’s told to you when you purchase FSD, and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone it starts nagging at you, eventually locking you out of the feature. Why would they put a driver monitoring system in place if you’re supposed to put blind faith into it?

          That is such an old, beat-up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.

          • Honytawk@lemmy.zip
            1 year ago

            It isn’t Full Self Driving if it is supervised.

            It’s especially not Full Self Driving if it asks you to intervene.

            It is false advertisement at best, deadly at worst.

          • assassin_aragorn@lemmy.world
            1 year ago

            Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

            • Thorny_Insight@lemm.ee
              1 year ago

              ESP is not idiot-proof either, just to name one such feature that’s been available for decades. It assists the driver but doesn’t replace them.

              Hell, cars themselves are not idiot proof.

      • Holyginz@lemmy.world
        1 year ago

        No, the standards people are applying to it are the bare minimum for a full self-driving system like what Musk claims.

        • Thorny_Insight@lemm.ee
          1 year ago

          It’s a level 2 self driving system which by definition requires driver supervision. It’s even stated in the name. What are the standards it doesn’t meet?
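
          For reference, the level numbers being argued about here come from the SAE J3016 taxonomy. A minimal summary as data (paraphrased, not the standard’s exact wording):

```python
# SAE J3016 driving-automation levels, paraphrased. At levels 0-2 the
# human must supervise at all times; only at level 3+ does the system
# take over monitoring, and then only within defined conditions.
SAE_LEVELS = {
    0: ("No automation", "driver supervises"),
    1: ("Driver assistance", "driver supervises"),
    2: ("Partial automation", "driver supervises"),
    3: ("Conditional automation", "system drives; driver takes over on request"),
    4: ("High automation", "system drives within its operational domain"),
    5: ("Full automation", "system drives everywhere"),
}


def requires_constant_supervision(level: int) -> bool:
    """True for the levels where the human driver must always monitor."""
    return level <= 2
```

          By this taxonomy a level 2 system requiring supervision is definitional, whatever one thinks of the product name.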

      • ammonium@lemmy.world
        1 year ago

        Because it’s called Full Self Drive and Musk has said it will be able to drive without user intervention?

        • dream_weasel@sh.itjust.works
          1 year ago

          The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

            • dream_weasel@sh.itjust.works
              1 year ago

              Marketing besides the naming we have already established and Elon himself masturbating to it? Is there some other marketing that pushes this narrative, because I certainly have not seen it.

        • Thorny_Insight@lemm.ee
          1 year ago

          It’s called Full Self Driving (Supervised)

          Yeah, it will be able to drive without driver intervention eventually. At least that’s their goal. Right now, however, it’s level 2 and no one is claiming otherwise.

          In what way is it not ready to use?

      • Piranha Phish@lemmy.world
        1 year ago

        It’s unreasonable for FSD to see a train? … that’s 20ft tall and a mile long? Am I understanding you correctly?

        Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

        • Thorny_Insight@lemm.ee
          1 year ago

          Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.

          Also, the car did see the train. It just clearly didn’t understand what it was and how to react to it. That’s why the car has a driver who does. I’m sure this exact edge case will be added to the training data so that this doesn’t happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It’s under development and receives constant updates and keeps improving. That’s why it’s classified as level 2 and not level 5.

          Yes. It’s unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn’t mean it’s obvious to the AI.

          • Piranha Phish@lemmy.world
            1 year ago

            In what way is it not ready to use?

            To me it seems you just spent three paragraphs answering your own question.

            can’t even see 50 meters ahead

            didn’t understand what it was and how to react to it

            FSD is not a finished product. It’s under development

            doesn’t mean it’s obvious to the AI

            If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”

              • Piranha Phish@lemmy.world
                1 year ago

                Completely true. And I would dictate my driving characteristics based on that fact.

                I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.

                • Thorny_Insight@lemm.ee
                  1 year ago

                  I agree. In fact I’m surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.

                • Thorny_Insight@lemm.ee
                  1 year ago

                  Yeah, there’s a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don’t see why cameras alone wouldn’t be sufficient. The issue here is not that it didn’t see the train (it’s on video, after all) but that it didn’t know how to react to it.

  • helpmyusernamewontfi@lemmy.today
    1 year ago

    What’s so “new” about this concern? I’d probably be able to afford a house if I had a dollar for every article I saw on Teslas wrecking or nearly wrecking because of FSD.

  • Akasazh@feddit.nl
    1 year ago

    I don’t see any information about the crossing. Was it a crossing without gates? The sensors must’ve picked those up when driving towards it. If so, it’s a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.

  • buddascrayon@lemmy.world
    1 year ago

    When you look at the development notes on self-driving at Tesla, anyone with a brain wouldn’t trust that shit, not even a little bit. Most of what they did was placate Musk’s petty whims and delusions. Any real R&D issues were basically glossed over or given quick software fixes.

    • Fades@lemmy.world
      1 year ago

      What a horrible thing to say, especially since Elon and Tesla have only relatively recently turned to absolute shit. There are a lot of Tesla drivers that don’t support what he has done to the company and all that.

      Here you are advocating for the death of people because they purchased a vehicle. A lot of people bought Teslas as they were one of the better EVs at the time during Tesla’s climb to their peak (which they have since fallen very far from). They too deserve death?

      • Captain Aggravated@sh.itjust.works
        1 year ago

        Here you are advocating for the death of people because they purchased a vehicle.

        No; I’m expressing the same sentiment that I express for motorcycle riders that refuse to wear a helmet. I really, genuinely don’t care if they beat their brains out on the front bumper of a Hyundai, but I don’t think they get to force a Hyundai driver to hose brains off their car.

        Teslas are death traps. Their owners can make that choice for themselves but I don’t think they get to make it for others, which is what they try to do every time they turn on that self-driving feature.

    • Neato@ttrpg.network
      1 year ago

      So not even turning towards an intersection with a train or anything complicated. Tesla can’t even tell there’s a 12" steel wall in front of it. Fucking pathetic.

      • rsuri@lemmy.world
        1 year ago

        How would it though? It probably didn’t have any images like this in the train-ing data.

          • Thorny_Insight@lemm.ee
            1 year ago

            The new models with hardware 4 (at least Models S and X) have radar, but then again humans manage without, so I have no doubt that a vision-based system will be more than sufficient in the end.

    • Jakeroxs@sh.itjust.works
      1 year ago

      Is this showing it works or no? I can’t tell and there isn’t audio; it seems like it would have stopped correctly.

  • Noxy@yiffit.net
    1 year ago

    Feels like these things were more capable a decade ago when they had radar.

    Not that they should be called “full self driving” either then or now, but at least radar can deal with fog better than regular-ass cameras.

  • nifty@lemmy.world
    1 year ago

    For now, cars need more than computer vision to navigate, because adding cameras by itself doesn’t help a car spatially orient itself in its environment. What might help? I think the consensus is that the cameras need a 360-degree view of the surroundings, and the car needs a method for making sense of those inputs without that understanding being the focus of attention.

    It seems Teslas do add sensors in appropriate locations to be able to do that, but there’s some disconnect in reconciling the information: https://www.notateslaapp.com/news/1452/tesla-guide-number-of-cameras-their-locations-uses-and-how-to-view. A multi-modal sensing system would bypass reliance on getting everything right via CV.

    Think of focusing on an object in the distance and moving toward it: while you’re using your eyes to look at it, you’re subconsciously computing relative distance and speed as you approach. It’s your subconscious memory of your 3D spatial orientation that helps you make corrections and adjustments to your speed and approach. Outside of better hardware that can reconcile these different inputs, relying on multiple sensor inputs would make for the most robust approach for autonomous vehicles.

    Humans essentially keep track of their body in 3D space and time without thinking about it, and actually most multicellular organisms have learned to do this in some manner.
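
    The subconscious distance-and-speed computation described above has a classic computer-vision analogue: even a single camera can estimate time-to-contact from how fast an object’s apparent size grows (tau = s / (ds/dt)), with no absolute distance needed. A minimal toy sketch:

```python
# Monocular time-to-contact (TTC) from image "looming": for an object
# approached at constant speed, apparent size s is proportional to 1/d,
# and s / (ds/dt) works out to exactly distance/speed, i.e. the seconds
# remaining until contact, without ever knowing the distance itself.
def time_to_contact_s(size_prev_px: float, size_now_px: float,
                      dt_s: float) -> float:
    """Estimate seconds until contact from two apparent sizes dt_s apart."""
    growth_rate = (size_now_px - size_prev_px) / dt_s  # ds/dt in px/s
    if growth_rate <= 0:
        return float("inf")  # shrinking or constant: not approaching
    return size_now_px / growth_rate
```

    For example, an obstacle growing from 100 px to 110 px over 0.1 s yields a TTC of 1.1 s; a shrinking obstacle yields infinity. Real systems fuse many such cues, which is the multi-modal point the comment makes.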