Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a train crossing with no apparent deceleration. He insisted the car was in Full Self-Driving mode at the time and never slowed on its own.

  • Thorny_Insight@lemm.ee

    In what way is it not ready to use? Do cars have some other driver-assistance features that are foolproof? You’re not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

    • ammonium@lemmy.world

      Because it’s called Full Self Driving and Musk has said it will be able to drive without user intervention?

      • dream_weasel@sh.itjust.works

        The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

          • dream_weasel@sh.itjust.works

            Marketing besides the naming, which we have already established, and Elon himself masturbating to it? Is there some other marketing that pushes this narrative? Because I certainly have not seen it.

      • Thorny_Insight@lemm.ee

        It’s called Full Self Driving (Supervised)

        Yeah, it will be able to drive without driver intervention eventually. At least that’s their goal. Right now, however, it’s Level 2 and no one is claiming otherwise.

        In what way is it not ready to use?

    • Piranha Phish@lemmy.world

      It’s unreasonable for FSD to see a train? … that’s 20ft tall and a mile long? Am I understanding you correctly?

      Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

      • Thorny_Insight@lemm.ee

        Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.

        Also, the car did see the train. It just clearly didn’t understand what it was and how to react to it. That’s why the car has a driver who does. I’m sure this exact edge case will be added to the training data so that this doesn’t happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It’s under development and receives constant updates and keeps improving. That’s why it’s classified as level 2 and not level 5.

        Yes. It’s unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn’t mean it’s obvious to the AI.
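
        To illustrate the gap between seeing an object and knowing how to react to it, here is a minimal, purely hypothetical sketch of a perception-plus-planning step. The class names, thresholds, and structure are invented for illustration; this is not Tesla’s actual code:

            from dataclasses import dataclass

            @dataclass
            class Detection:
                label: str         # what the classifier thinks the object is
                confidence: float  # how sure it is
                distance_m: float  # estimated distance ahead

            # Hypothetical planner: it only reacts to object classes it was trained on.
            KNOWN_OBSTACLES = {"car", "truck", "pedestrian", "cyclist"}

            def plan(detections, speed_mps):
                """Return 'brake' if a known obstacle is close, otherwise 'continue'."""
                for d in detections:
                    time_to_collision_s = d.distance_m / max(speed_mps, 0.1)
                    if d.label in KNOWN_OBSTACLES and time_to_collision_s < 3.0:
                        return "brake"
                return "continue"

            # The camera "sees" something (there is a detection), but a train crossing
            # in fog was never in the training data, so it comes back labelled
            # "unknown" -- and this naive planner drives on.
            print(plan([Detection("unknown", 0.3, 60.0)], speed_mps=25.0))  # -> continue

        The only point of the sketch is that detection and response are separate problems: an obstacle can be visible on camera and still be ignored if the planner was never trained to handle that class.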

        • Piranha Phish@lemmy.world

          In what way is it not ready to use?

          To me it seems you just spent three paragraphs answering your own question.

          can’t even see 50 meters ahead

          didn’t understand what it was and how to react to it

          FSD is not a finished product. It’s under development

          doesn’t mean it’s obvious to the AI

          If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”

            • Piranha Phish@lemmy.world

              Completely true. And I would dictate my driving characteristics based on that fact.

              I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.

              • Thorny_Insight@lemm.ee

                I agree. In fact I’m surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.

              • Thorny_Insight@lemm.ee

                Yeah, there’s a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don’t see why cameras alone wouldn’t be sufficient. The issue here is not that it didn’t see the train - it’s on video, after all - but that it didn’t know how to react to it.

    • Holyginz@lemmy.world

      No, the standards people are applying to it are the bare minimum for a full self-driving system like what Musk claims.

      • Thorny_Insight@lemm.ee

        It’s a Level 2 self-driving system, which by definition requires driver supervision. It’s even stated in the name. What are the standards it doesn’t meet?

    • assassin_aragorn@lemmy.world

      You’re not supposed to blindly trust any of those. Why would FSD be an exception?

      Because that’s how Elon (and by extension Tesla) markets it: Full Self Driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

      And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.

      • Thorny_Insight@lemm.ee

        Full Self Driving (Beta), nowadays Full Self Driving (Supervised)

        Which of those names inspires enough trust to put your life in its hands?

        It’s not in fine print. It’s told to you when you purchase FSD, and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone it starts nagging at you, eventually locking you out of the feature. Why would they put a driver-monitoring system in place if you’re supposed to put blind faith in it?

        That is such an old, beat-up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.
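
        The nag-then-lockout behaviour described above amounts to a simple escalation loop. Here is a rough sketch of that kind of driver-monitoring logic; the thresholds and names are made up for illustration, and this is not Tesla’s implementation:

            # Hypothetical thresholds, purely illustrative -- not Tesla's numbers.
            WARN_AFTER_S = 3.0      # start nagging after this much inattention
            LOCKOUT_AFTER_S = 10.0  # disable the feature after this much inattention
            MAX_STRIKES = 3         # repeated offences lock you out for the drive

            def monitoring_action(seconds_inattentive, strikes):
                """One tick of a sketched driver-monitoring escalation loop."""
                if seconds_inattentive >= LOCKOUT_AFTER_S or strikes >= MAX_STRIKES:
                    return "disable_fsd"  # feature lockout, as described above
                if seconds_inattentive >= WARN_AFTER_S:
                    return "nag"          # audible/visual warning to watch the road
                return "ok"

            print(monitoring_action(5.0, strikes=0))   # -> nag
            print(monitoring_action(12.0, strikes=0))  # -> disable_fsd

        However it is built in practice, the escalation itself is the design choice: the system assumes supervision and withdraws itself when it doesn’t get it.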

        • Honytawk@lemmy.zip

          It isn’t Full Self Driving if it is supervised.

          It’s especially not Full Self Driving if it asks you to intervene.

          It is false advertising at best, deadly at worst.

          • Thorny_Insight@lemm.ee

            It’s misleading advertising for sure. At no point have I claimed otherwise.

            The meaning of what qualifies as “full self driving” is still up for debate, however. There are worse human drivers on the roads than what the current version of FSD is capable of. It’s by no means flawless, but it’s much better than most people realize. It’s a vehicle capable of self-driving, even if not fully.

        • assassin_aragorn@lemmy.world

          Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be an inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

          • Thorny_Insight@lemm.ee

            ESP is not idiot-proof either, just to name one such feature that’s been available for decades. It assists the driver but doesn’t replace them.

            Hell, cars themselves are not idiot-proof.

          • VirtualOdour@sh.itjust.works

            Yeah, and cars should have a system to stop idiots from doing dumb things. The best we have is a license, so if that’s good enough for cars without added safety features, it’s good enough for cars with them.