The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • elgordino@fedia.io · 8 months ago

    If anyone was somehow still thinking RoboTaxi is ever going to be a thing: no, it’s not, because of reasons like this.

    • testfactor@lemmy.world · 8 months ago

      It doesn’t have to not hit pedestrians. It just has to hit fewer pedestrians than the average human driver.

      • snooggums@lemmy.world · 8 months ago

        That is the minimum outcome for an automated safety feature to count as an improvement over human drivers.

        But if everyone else is using something you refused to adopt that would likely have avoided someone’s death, while you misname your feature to mislead customers, then you are in legal trouble.

        When it comes to automation, you need to be far better than humans because there will be a higher level of scrutiny. Kind of like how planes are massively safer than driving on average, but any incident where someone could have died gets a massive amount of attention.

      • ContrarianTrail@lemm.ee · 8 months ago

        Exactly. The current rate is 80 deaths per day in the US alone. Even if we had self-driving cars proven to be 10 times safer than human drivers, we’d still see 8 news articles a day about people dying because of them. Taking this as ‘proof’ that they’re not safe is setting an impossible standard and effectively advocating for 30,000 yearly deaths, as if it’s somehow better to be killed by a human than by a robot.
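        The arithmetic here is internally consistent; a minimal sketch in Python, taking the comment’s ~80 deaths per day as given rather than as an official statistic:

        ```python
        # Assumed figures from the comment above, not official statistics:
        # ~80 US road deaths per day, and a hypothetical self-driving fleet
        # that is 10 times safer than human drivers.
        HUMAN_DEATHS_PER_DAY = 80
        SAFETY_FACTOR = 10

        robot_deaths_per_day = HUMAN_DEATHS_PER_DAY / SAFETY_FACTOR
        human_deaths_per_year = HUMAN_DEATHS_PER_DAY * 365

        print(robot_deaths_per_day)   # 8.0   -> the "8 news articles a day"
        print(human_deaths_per_year)  # 29200 -> roughly the "30,000 yearly deaths"
        ```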

          • Billiam@lemmy.world · 8 months ago

            If you get killed by a robot, you can at least die knowing your death was the logical option and not a result of drunk driving, road rage, poor vehicle maintenance, panic, or any other of the dozens of ways humans are bad at decision-making.

            • sugar_in_your_tea@sh.itjust.works · 8 months ago

              It doesn’t even need to be logical, just statistically reasonable. You’re literally a statistic anytime you interact with any form of AI.

      • elgordino@fedia.io · 8 months ago

        It needs to be way way better than ‘better than average’ if it’s ever going to be accepted by regulators and the public. Without better sensors I don’t believe it will ever make it. Waymo had the right idea here if you ask me.

        • sugar_in_your_tea@sh.itjust.works · 8 months ago

          But why is that the standard? Shouldn’t “equivalent to average” be the standard? Because if self-driving cars can be at least as safe as a human, they can be improved to be much safer, whereas humans won’t improve.

          • medgremlin@midwest.social · 8 months ago

            I’d accept that if the makers of self-driving cars could be tried for vehicular manslaughter the same way a human would be. Humans carry civil and criminal liability, and at the moment, the companies that produce these things have only nominal civil liability. If Musk could go to prison for his self-driving cars killing people the same way a regular driver would, I’d be willing to lower the standard.