Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla barreling towards a train crossing with no apparent deceleration. Doty insisted the car was in Full Self-Driving mode at the time.

  • itsonlygeorge@reddthat.com · 7 months ago

    Tesla opted not to use LIDAR as part of its sensor package and instead relies solely on cameras, which are not enough to determine accurate location data for other cars, trains, etc.

    This is what you get when billionaires cheap out on their products.

    • Imalostmerchant@lemmy.world · 7 months ago

      I never understood Musk’s reasoning for this decision. From my recollection it was basically “how do you decide who’s right when lidar and camera disagree?” And it felt so insane to say that the solution to conflicting data is not to figure out which sensor is right, but to listen to only one of them.
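(A minimal sketch of the standard answer to “who’s right when sensors disagree”: weight each measurement by its inverse variance, as in a one-step Kalman update, rather than discarding a sensor outright. All numbers and variances below are illustrative assumptions, not values from any real vehicle.)

```python
def fuse(camera_est, camera_var, lidar_est, lidar_var):
    """Minimum-variance combination of two range estimates (meters).

    Each estimate is weighted by its inverse variance, so the noisier
    sensor contributes less instead of being ignored entirely.
    """
    w_cam = 1.0 / camera_var
    w_lid = 1.0 / lidar_var
    fused = (w_cam * camera_est + w_lid * lidar_est) / (w_cam + w_lid)
    fused_var = 1.0 / (w_cam + w_lid)
    return fused, fused_var

# Camera says the obstacle is 40 m away but is noisy in fog (variance 25 m^2);
# lidar says 32 m with variance 1 m^2. The fused estimate leans on the
# better sensor instead of picking one and discarding the other.
est, var = fuse(40.0, 25.0, 32.0, 1.0)
print(round(est, 2), round(var, 2))  # estimate sits close to 32 m
```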

      • wirehead@lemmy.world · 7 months ago

        I mean, I think he’s a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

        When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren’t even using parallax to judge distances. LIDAR, meanwhile, is not going to see the text on a sign or the reflective stripes on a truck; it gets confused in different ways than the eye, is absorbed by different wavelengths, and can be jammed if someone wants to. Thus, if we were content to wait until the self-driving car was actually safe before throwing it out into the world, we’d probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

        Except there are some huge problems that the human visual cortex makes look easy. Because “all situations” means things like “understanding from visual cues that there’s a kid playing in the street, so I’m going to assume they’ll do something dumb” or “some guy put a warning sign on the road and it’s got really bad handwriting.”

        Thus, the real problem is that he’s shipping a patently unsafe product without even LIDAR as harm reduction; and the various failure modes of the LIDAR-equipped self-driving cars show that those aren’t safe either.

      • Jakeroxs@sh.itjust.works · 7 months ago

        Also, LIDAR is more expensive than cameras, which means a higher end-user price, as far as I remember.

      • smonkeysnilas@feddit.de · 7 months ago

        I mean, the decision was stupid from an engineering point of view, but the reasoning is not entirely off. Basically it follows the biological example: if humans can drive without LIDAR, using only their eyes, then that is proof it is possible somehow. It’s only that current computer vision and AI tech is way worse than humans. Elon chose to ignore this, basically arguing that it is merely a software problem for his developers to figure out. I guess in reality it is a bit more complex.

    • skyspydude1@lemmy.world · 7 months ago

      Not only that, but they took out the radar, which, while it has its own flaws, would have had no issue seeing the train through the fog. They claimed it was because they had “solved vision” and didn’t need it anymore, but that’s bullshit, and their engineering team knew it. They were in the middle of sourcing a new radar, but because of supply chain limitations (like everyone in 2021) with both their old and potential new suppliers, keeping it would have broken their “infinite growth” narrative and fElon wouldn’t have gotten his insane pay package. They knew for a fact the removal would significantly hurt performance, but did it anyway so the line could go up.

      While no automotive company’s hands are particularly clean, the sheer level of willful negligence at Tesla is absolutely astonishing. I have seen and heard so many stories about their shitty engineering practices that the only impressive thing is how relatively few people have died as a direct result of their lax attitude towards basic safety.

    • Wrench@lemmy.world · 7 months ago

      LIDAR would have similarly been degraded in the foggy conditions that this occurred in. Lasers are light too.

      While I do think Tesla holds plenty of responsibility for its intentionally misleading branding of FSD, as well as the cost-saving decision to omit lidar and radar, this particular instance boils down to yet another shitty and irresponsible driver.

      You should not be relying on FSD over train tracks. You should not be allowing FSD to go faster than conditions allow. Dude was tearing down the road in thick fog, way faster than was safe for the conditions.

      • Rekorse@lemmy.dbzer0.com · 7 months ago

        A Tesla driver might get the impression that the car’s “opinion” is better than their own, which could cause them to hesitate before intervening, or to allow the car to drive in a way they are uncomfortable with.

        The misinformation about the car reaches the level of negligence because even smart people are being duped by this.

        Honestly, I think some people just don’t believe someone could lie so publicly, loudly, and often; they assume it must be something else besides a grift.

      • Pazuzu@midwest.social · edited · 7 months ago

        Maybe it shouldn’t be called Full Self Driving if it’s not fully capable of self-driving.

      • FreddyDunningKruger@lemmy.ml · 7 months ago

        One of the first things you learn to get your driver’s license is the basic speed law: you must not drive faster than conditions allow. If only Full Self Driving followed that law and reduced its max speed accordingly.
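(The basic speed law has a simple quantitative reading: never drive faster than lets you stop within the distance you can see. A minimal sketch, with illustrative reaction-time and braking numbers that are assumptions, not any manufacturer’s real parameters:)

```python
import math

def max_safe_speed(visibility_m, reaction_s=1.5, decel_mps2=5.0):
    """Largest speed v (m/s) such that total stopping distance fits in view:
    v * reaction_s + v**2 / (2 * decel_mps2) <= visibility_m.
    Solves the quadratic v^2/(2a) + t*v - d = 0 for the positive root."""
    return decel_mps2 * (
        -reaction_s + math.sqrt(reaction_s**2 + 2 * visibility_m / decel_mps2)
    )

# In 50 m of foggy visibility the cap works out to roughly 16 m/s
# (about 58 km/h), regardless of the posted limit.
print(round(max_safe_speed(50.0), 1))
```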

        • Rekorse@lemmy.dbzer0.com · 7 months ago

          If you took that rule strictly, you should not allow FSD to drive at all, since at any speed it’s more dangerous than the person driving it (assuming an average driver who’s not intoxicated).