Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads

“It affects all of us because we are essentially experiments in public roads.”

  • cm0002@lemmy.world

    I lost all trust in their ‘Autopilot’ the day I read that Musk had said (paraphrasing), “All we need are cameras; there’s no need for secondary/tertiary LIDAR or other expensive setups.”

    Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!

    • Ottomateeverything@lemmy.world

      or other expensive setups

      As much as I lost trust in his bullshittery a long time ago, his need to mention the cost of critical safety systems is what stuck out to me the most here. That’s how you know the priorities are backwards.

        • AlexWIWA@lemmy.ml

          Hell, every iPhone has lidar, and the pro models have two lidar cameras. The tech is not very expensive, especially not for an $80,000 car.

          My partner’s econobox has lidar for its cruise control, but Tesla can’t seem to figure out how to make it work.

          • Sondermotor@lemmy.world

            Hell, every iPhone has lidar, and the pro models have two lidar cameras. The tech is not very expensive, especially not for an $80,000 car.

            Around the time Elon made the claim, lidar for automotive purposes was quite expensive. That additional cost would have made the self-driving product a lot less desirable. Upselling cruise control into “self driving” earned them a lot of money.

            Funnily enough, in all the other areas where Tesla took the expensive option, the cult retail investors claimed these were brilliant decisions, because economies of scale would kick in and make it cheaper in the long run.

            Lidar was obviously exempt from any such scale and future tech improvements, because reasons.

            My partner’s econobox has lidar for its cruise control, but Tesla can’t seem to figure out how to make it work.

            It could be very expensive for Tesla to start using lidar, because they’ve sold a lot of cars on the promise that they already have the hardware for self driving. Retrofitting a million cars would not only cost a lot in gear and labor, it would put additional stress on an already poor service network.

            They have painted themselves into a corner. All because leadership thought self driving was a more or less solved problem almost a decade ago.

            • AlexWIWA@lemmy.ml

              Good point. I thought Teslas had radar for a while though, and they took it out?

              Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.

              And they could have added radar and sonar to assist the cameras, at least. The radar couldn’t give 3D data, but it could say “yo bro, that’s a solid object, not the skyline.”

              Good point on the promises though. They really fucked themselves with Elon’s claims.

              • Sondermotor@lemmy.world

                I thought Teslas had radar for a while though, and they took it out?

                They decided radar was superfluous at one point during the pandemic, by sheer coincidence right around the time supply chains were getting fucked. Hitting delivery targets was more important than safety.

                And they could have added radar and sonar to assist the cameras, at least. The radar couldn’t give 3D data, but it could say “yo bro, that’s a solid object, not the skyline.”

                They did do that. It can be pretty difficult to make sense of conflicting data like that, and Tesla may have decided not to bother solving such issues, hoping that less sensor data would be easier to interpret.

                This is what Elon had to say about Tesla’s sophisticated radar data interpretation capabilities in 2016:

                In fact, an additional level of sophistication – we are confident that we can use the radar to look beyond the car in front of you by bouncing the radar signal off the road and around the car. We are able to process that echo by using the unique signature of each radar pulse as well as the time of flight of the photon to determine  that what we are seeing is in fact an echo in front of the car that’s in front of you. So even if there’s something that was obscured directly both in vision and radar, we can use the bounce effect of the radar to look in front of that car and still brake.

                It takes things to another level of safety.

                I guess the ability to see around the car in front of you got lost in some software update along the way. Otherwise, removing radar necessarily meant reducing the safety of the system, or Elon lied in 2016.

                Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.

                It depends on what you want to do with the sensors. Somewhat accurately mapping what’s immediately in front of the car, to slightly improve speed matching and false positive/negative rates for emergency braking, comes a lot cheaper than the capability to fully map the surroundings fast and accurately enough for a computer to make correct decisions.

      • frozen@lemmy.frozeninferno.xyz

        Skimping on cost is how disasters happen. Ask Richard Hammond. “Spared no expense” my ass, hire more than 2 programmers, you cheap fuck.

        Edit: This was supposed to be a Jurassic Park reference, but my dumb ass mixed up John Hammond and Richard Hammond. That’s what I get for watching Top Gear and reading at the same time.

        • eric@lemmy.world

          Were Richard Hammond’s many crashes a result of cost skimping? If so, I had no idea. Could you elaborate?

          • gravitas_deficiency@sh.itjust.works

            I was under the impression that Hammond’s serious crashes were a combination of bad luck and getting a bit too spicy when driving in some already-risky situations. I, too, would appreciate some corroboration.

            • eric@lemmy.world

              Same here. I did a little googling and can’t find any corroborating evidence, but I also learned that Hammond’s Grand Tour insurance premiums are now more expensive than Top Gear’s budgets were for entire specials.

              • gravitas_deficiency@sh.itjust.works

                I mean… given that he has had two very well documented and life-threateningly catastrophic crashes in the course of making car shows… the insurance company underwriting his policies isn’t out of line.

                • eric@lemmy.world

                  I figured insuring him would be expensive, but it’s more the magnitude of his premiums that shocked me.

        • MonkeMischief@lemmy.today

          As someone who hasn’t much watched Top Gear, I was cracking up at your Jurassic Park reference until I saw your edit and was like “Wait a minute.”

          Top Gear? Jurassic Park? Either way: Hold on to your butts.

          😆

    • mosiacmango@lemm.ee

      The crazier and stupider shit is that part of his justification was “people drive and they only have eyes. We should be able to do the same.”

      It’s a stunningly idiotic justification, and yet here we are, with millions of these “eyes only” Teslas on the road.

        • Ranvier@sopuli.xyz

          I can add more: we don’t only have five senses; that’s elementary school propaganda. Here are all the ones I can think of that we use while driving.

          1. Vision
          2. Hearing
          3. Tactile feedback from the wheel and pedals. You could break this down further into skin pressure receptors plus receptors of muscle tension, though the muscle tension and stretch receptors are also involved in number 4
          4. Proprioception, where your limbs and body are in space
          5. Rotational acceleration (semi circular canals)
          6. Linear acceleration (utricle and saccule)
          7. Smell. Okay, this might be a stretch, but some engine issues can be smelly

          And that doesn’t even consider higher-order processing and the actual integration of all these things, which AI, despite all its recent gains, can’t match: the brain integrates all that information and copes with novel stimuli. Point is, Elon: add more sensors to your dang cars so they’re less likely to kill people. People aren’t even perfect at driving, so why would we limit a car to only our senses anyway? So dumb.

      • Akasazh@feddit.nl

        Reminds me of Mao not brushing his teeth, because tigers didn’t brush theirs either.

          • Akasazh@feddit.nl

            Nah that would be silly.

            He did deflower a generation of young girls though, who would have suffered the agony of his presumed halitosis.

    • JohnEdwa@sopuli.xyz

      Ah, but you see, his reasoning is: what if the camera and lidar disagree, then what? With a camera-only system, there is only one truth and no conflicts!

      Like when the camera sees the broad side of a white truck as clear skies and slams right into it; there was never any conflict anywhere, everything went just as it was suppo… wait, shit.

      • brbposting@sh.itjust.works

        sees the broad side of a white truck as clear skies and slams right into it

        RIP Joshua Brown:

        The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.

        • girthero@lemmy.world

          “he went so fast through my trailer I didn’t see him”

          Lidar would still prevail over stupidity in this situation. It does a better job detecting massive objects cars can’t go through.

      • DreadPotato@sopuli.xyz

        what if the camera and lidar disagree, then what?

        This (sensor fusion) is a valid issue in mobile robotics. Adding more sensors doesn’t necessarily improve stability or reliability.

        • ZapBeebz_@lemmy.world

          After a point, yes. However, that point only comes once you’re adding more than a second sensor type to the system. The correct answer is to work a weighting system into your algorithm, so the car can decide which sensor it trusts not to kill the driver. I.e., if the LIDAR sees the broadside of a trailer and the camera doesn’t, the car should believe the LIDAR over the camera, since applying the brakes is likely safer than speeding into the obstacle at 60 mph.
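
          To make that concrete, here’s a toy sketch of such a weighting policy. The sensor names, trust weights, and threshold are all invented for illustration; a real stack would fuse raw measurements long before any binary brake decision.

          ```python
          # Toy sketch only: a crude trust-weighted vote, not anyone's real stack.
          from dataclasses import dataclass

          @dataclass
          class Detection:
              sensor: str        # "lidar" or "camera" (hypothetical names)
              obstacle: bool     # did this sensor report an obstacle ahead?
              confidence: float  # the sensor's own confidence, 0.0 to 1.0

          # Assumed trust weights for obstacle detection; a real system would
          # tune these per scenario (weather, speed, sensor health).
          TRUST = {"lidar": 0.9, "camera": 0.6}

          def should_brake(detections, threshold=0.5):
              """Brake when trust-weighted evidence for an obstacle outweighs
              the discounted evidence against it; biased toward braking,
              since a missed obstacle is worse than an unneeded stop."""
              if not detections:
                  return False
              pro = sum(TRUST[d.sensor] * d.confidence
                        for d in detections if d.obstacle)
              con = sum(TRUST[d.sensor] * d.confidence
                        for d in detections if not d.obstacle)
              return pro >= con * threshold

          # The trailer case: lidar sees the broadside, camera reads it as sky.
          print(should_brake([Detection("lidar", True, 0.95),
                              Detection("camera", False, 0.80)]))  # True -> brake
          ```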

          • DreadPotato@sopuli.xyz

            Yes, the solution is fairly simple in theory; implementing it is significantly harder, which is why this is not a trivial issue to solve in robotics.

            I’m not saying their decision was the right one, just that the argument that multiple sensors create noise in the decision-making is a completely valid one.

            • lightnsfw@reddthat.com

              Doesn’t seem too complicated… if ANY of the sensors sees something in the way that the system can’t resolve, then it should stop the vehicle / force the driver to take over.

              • DreadPotato@sopuli.xyz

                Then you have a very unreliable system that stops without actual reason all the time, causing immense frustration for the user. Is it safe? I guess; cars that don’t move generally are. Is it functional? No, not at all.

                I’m not advocating unsafe implementations here, I’m just pointing out that your suggestion doesn’t actually solve the issue, as it leaves a solution that’s not functional.
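
                As a back-of-the-envelope sketch of that unreliability (the false-positive rates here are invented, and independence between sensors is assumed):

                ```python
                # Invented rates: probability of at least one unresolvable
                # false detection per hour of driving, per sensor type.
                fp_per_hour = {"camera": 0.5, "radar": 0.8, "ultrasonic": 0.3}

                p_no_alarm = 1.0
                for p in fp_per_hour.values():
                    p_no_alarm *= (1.0 - p)  # assumes independent sensors

                # "Stop if ANY sensor objects" trips when at least one fires:
                print(f"P(spurious stop per hour) = {1.0 - p_no_alarm:.2f}")  # 0.93
                ```

                Every sensor type added pushes that number up, which is exactly the frustration described above.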

                • lightnsfw@reddthat.com

                  If they’re using sensors so unreliable that they get false positives all the time, the system isn’t going to be functional in the first place.

              • Kogasa@programming.dev

                “Seeing an obstacle” is a high-level abstraction. Sensor fusion is a lower-level problem. It’s fundamentally kind of tricky to get coherent information out of multiple sensors looking partially at the same thing in different ways. Not impossible, but the basic model is less “just check each camera” and more sheaves.
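
                For a taste of what the lower-level problem looks like, here is the simplest textbook case: fusing two noisy range estimates of one object by inverse-variance weighting, essentially a one-step scalar Kalman update. The numbers are invented; real pipelines are far messier.

                ```python
                def fuse(z1, var1, z2, var2):
                    """Inverse-variance weighted mean of two noisy
                    measurements of the same quantity."""
                    w1, w2 = 1.0 / var1, 1.0 / var2
                    est = (w1 * z1 + w2 * z2) / (w1 + w2)
                    return est, 1.0 / (w1 + w2)

                # Lidar: 48.0 m, low noise; camera-inferred depth: 55.0 m, noisy.
                est, var = fuse(48.0, 0.25, 55.0, 9.0)
                print(f"fused range: {est:.1f} m, variance {var:.2f}")  # ~48.2 m, 0.24
                ```

                The estimate leans toward whichever sensor is less noisy; knowing those variances online, per object, per frame, is where it gets hard.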

    • chitak166@lemmy.world

      To be fair, humans have proven all you need are visual receptors to navigate properly.

      • Maalus@lemmy.world

        To be fair, current computers / AI / whatever marketing name you call them aren’t as good as human brains.

        • chitak166@lemmy.world

          No, but they can be improved to the point where all that’s necessary are cameras and the means to control the vehicle.

              • Maalus@lemmy.world

                No, they don’t, and that’s the entire point in all of this. Tesla autopilot sucks, and it will keep sucking and killing people. But fanboys like you would rather “look to the future” than look realistically at the present.

      • 0xD@infosec.pub

        Visual receptors… and 3-dimensional vision, with all the required processing and decision-making behind it, based on those visual stimuli, lol.

      • cm0002@lemmy.world
        1. And how many vehicle accidents and deaths are there today? Proof that humans suck at driving, maybe

        2. No, we don’t; we use sight, sound, and touch/feeling to drive, at a minimum

        • chitak166@lemmy.world

          Touch? Sure, barely. But you can drive without being able to hear.

          I’d also wager you can get a license if you have that rare disease that prevents you from feeling. Since, you know, we use touch and hearing so little to drive.

          But hey? Maybe I’m wrong. Maybe you can provide a source that says you can’t get licensed if you have that disease or if you’re deaf. That would prove your point. Otherwise, it proves mine.

      • ImFresh3x@sh.itjust.works

        Uhhhh…

        …any level 4 car, actually, according to the federal government and all the agencies that regulate this stuff.

        NAVYA, Volvo/Audi, Mercedes, Magna, Baidu, Waymo.

        Tesla isn’t even trying to go past level 3 at this point.

        • Gargantu8@lemmy.world

          Why the fuck did I get down voted for looking for an ev Tesla alternative… This place makes no sense. Are you all Tesla fan boys or what?

          • chakan2@lemmy.world

            It sounded like sarcasm rather than an honest question. Like “Find me a better autopilot” rather than “What manufacturer would you recommend for autopilot?”

            • Gargantu8@lemmy.world

              It… sounded?? How does my question not appear genuine? I literally just asked a question. I want to buy an EV with excellent software in the next few years. That’s it. No sarcasm. Would prefer not a Tesla.

              • n3m37h@lemmy.dbzer0.com

                I took it as condescending, just poorly written. Also, lots of Tesla fanboys on here. Glad you’re not one.

                • Gargantu8@lemmy.world

                  So condescending to ask for a better technology option with multiple sensors. /s

      • AlexWIWA@lemmy.ml

        A 2014 Infiniti can drive itself more safely on the highway than a Tesla. The key here is that they didn’t lie about the car’s capabilities, so they didn’t encourage complacency.

        In the city though, yeah you’ll need to look at other level 4 cars.

      • chakan2@lemmy.world

        What brand of car has better autopilot with other sensors?

        All of them. The other automakers didn’t fire their engineers during a hissy fit.

    • lefaucet@slrpnk.net

      Not to be a hard-on about it, but if the cameras have any problem, Autopilot ejects gracefully and hands it over to the driver.

      I ain’t no Elon dick rider, but I’ve got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor was.

      Lidar has severe problems too. I’ve used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.

      Cameras will eventually be great! Really, they already are, but they’ll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain’t done learning yet.

      That driver really should have been paying attention. The car fucking tells you to all the time.

      If a camera has a problem the whole system aborts.

      In the future this will mean the car will pull over, but it’s, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human who is paying attention.

      • BaronDoggystyleVonWoof@lemmy.world

        So I drive a Tesla as well. Quite often I get the message that the camera is blocked by something (like sun, fog, or heavy rain).

        You can’t have a reliable self driving system if that is the case.

        Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

        • DreadPotato@sopuli.xyz

          Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

          Of course it is; functionally, both the camera and lidar solutions work in vector space. The big difference is that a camera feed holds a lot more information, beyond simple vector space, to feed the AI training with than a lidar feed ever will.

      • NeoNachtwaechter@lemmy.world

        any problem autopilot ejects gracefully and hands it over to the driver.

        Gracefully? LMAO

        You can come back when it gives at least 3 minutes of warning in advance, so that I can wake up, get my hands out of the woman, climb into the driver’s seat, find my glasses somewhere, look around to see where we are, and then tell that effing autopilot that it’s okay and it is allowed to disengage now!

        • anotherandrew@lemmy.mixdown.ca

          Yes, that’s exactly how autopilots in airplanes work too… 🙄

          I think camera FSD will get there, but I also think additional sensors (perhaps not necessarily lidar) are needed to increase safety, and, like the article states, a shitload more testing is needed before it’s allowed on public roads. But let’s be reasonable about how the autopilot can disengage.

          • chakan2@lemmy.world

            I think camera FSD will get there

            Tesla’s won’t. Musk fired all his engineers. Mercedes has a better driving record these days.

          • NeoNachtwaechter@lemmy.world

            how autopilots in airplanes work

            That comparison was interesting for some people before we had autonomy levels defined for cars. Nobody wants to hear it anymore.

      • AlexWIWA@lemmy.ml

        Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.

        The AI argument falls short too, because that same AI will be better if it starts off with mostly complete 3D data from lidar and sonar.
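
        Toy numbers, but the difference is easy to show: a lidar or radar return is a direct time-of-flight measurement, while camera depth must be inferred from learned cues, parallax, or stereo, all of which are estimates. The stereo-disparity formula below is just one illustrative example with made-up constants.

        ```python
        C = 299_792_458.0  # speed of light, m/s

        def tof_range(round_trip_s):
            """Lidar/radar range: half the echo's round trip at c. Measured."""
            return C * round_trip_s / 2.0

        def stereo_depth(focal_px, baseline_m, disparity_px):
            """Camera depth from stereo disparity: inferred, so a one-pixel
            matching error shifts the estimate, worse with distance."""
            return focal_px * baseline_m / disparity_px

        print(f"{tof_range(333e-9):.1f} m")              # ~49.9 m from a 333 ns echo
        print(f"{stereo_depth(1000.0, 0.3, 6.0):.1f} m vs "
              f"{stereo_depth(1000.0, 0.3, 5.0):.1f} m") # 50.0 vs 60.0 m for 1 px off
        ```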

        • lefaucet@slrpnk.net

          Lidar and sonar are way lower resolution.

          Sonar has a hard time telling the difference between a manhole cover, a large highway sign, and a brick wall.

          • AlexWIWA@lemmy.ml

            Okay? The resolution apparently doesn’t help, because Teslas are hitting everything. Sonar can look ahead several cars, and lidar gives 3D data. Combining those with a camera is the only way to do things safely. And lidar is definitely not low resolution.

      • elephantium@lemmy.world

        ejects gracefully and hands it over to the driver

        This is exactly the problem. If I’m driving, I need to be alert to the driving tasks and what’s happening on the road.

        If I’m not driving because I’m using autopilot, … I still need to be alert to the driving tasks and what’s happening on the road. It’s all of the work with none of the fun of driving.

        Fuck that. What I want is a robot chauffeur, not a robot version of everyone’s granddad who really shouldn’t be driving anymore.

        • lefaucet@slrpnk.net

          After many brilliant people trying for decades, it seems you can’t get the robot chauffeur without several billion miles of actual driving data, sifted and sorted into what is safe, good driving and what is not.

      • chakan2@lemmy.world

        ejects gracefully and hands it over to the driver.

        Just in time to slam you into an emergency vehicle at 80… but hey, autopilot wasn’t on during the impact, so it’s not Musk’s fault.

        • lefaucet@slrpnk.net

          Nah, with hands on the wheel and eyes on the road, the driver, who agreed to pay attention, will have disengaged the system long before it gets to that point.

          The system’s super easy to disengage.

          It’s also getting better every year.

          5 years ago my car could barely change lanes on the highway. Now it navigates left turns at 5-way lighted intersections in big-city traffic, with idiots blocking the intersection and suicidal cyclists running red lights, as well as it used to change lanes on the highway. And highway lane changes are extremely reliable; I can’t remember my last lane-change disengagement. Same car; just better software.

          I bet 5 years from now it’ll be statistically safer than humans… maybe not this same car. Hope it’s my car too, but it’s unclear whether that processor is sufficient…

          Anyway, it’ll keep improving from there.

      • Noxy@yiffit.net

        good thing regular cameras aren’t affected by reflective surfaces

        oh wait

    • jimmydoreisalefty@lemmy.world

      Proof by looking at internal information and data.

      From the NYT: “The data leaked by Krupski included lists of Tesla employees, often featuring their social security numbers, in addition to thousands of accident reports, and internal Tesla communications. Handelsblatt and others have used these internal memos and emails as the basis for stories on the dangers of Autopilot and the reasons for the three-year delay in Cybertruck deliveries.”

      • CmdrShepard@lemmy.one

        How does any of that prove the claim? Surely independent crash data would show these vehicles are involved in many more accidents than other vehicles if it’s true, but that doesn’t seem to be the case.

    • NeoNachtwaechter@lemmy.world

      Is it really whistleblowing

      It is, and it is important.

      Employees are usually bound by loyalty and contract not to reveal internal information. But public knowledge often needs confirmation; otherwise it is only rumours.

  • Routhinator@startrek.website

    Gee, thanks for reporting on the obvious, Jalopnik.

    We knew this. And even this whistleblower report is old.

    What a garbage news outlet.

  • linearchaos@lemmy.world

    Unfortunately, this is one of those things that you can’t meaningfully develop or test on closed private streets. They need the scale, the public traffic, and the idiots and the drunkards and the kids speeding. The only thing that’s going to stop them from working on autopilot is that it’s no longer financially reasonable to keep going. Even a couple handfuls of deaths aren’t going to stop them.

    • Ottomateeverything@lemmy.world

      Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets.

      Even if we hold this to be true (and I disagree in large part), the point is that Tesla’s systems aren’t at that stage yet. Failing to recognize lights correctly during live demos and such is absolutely something you can test and develop on closed streets or in a lab. Teslas shouldn’t be allowed on roads until they’re actually at a point where there are no glaring flaws. And then they should be allowed only in smaller numbers.

      • linearchaos@lemmy.world

        Do you really think they didn’t test that before they got to this point?

        I’m willing to bet they had been through that intersection hundreds of times before and never seen this. It’s not like it can’t detect a stoplight and they’re just out there randomly running through them all.

        Of the millions of variables around them, something blinded it to the light this time. The footage from that run has probably been reviewed ad nauseam by now, and it has done more to help them find the problem than anything they could have done sitting in a closed warehouse making guesses about a car that otherwise never fails to detect a red light.

        edit: look, keep smacking that downvote, but it’s not going to change anything. I hate Musk too, but we’re going to keep making progress toward automated driving unless it becomes more dangerous than existing drivers. In the next generation or so, most driving will become automated and deaths by automobile will drop significantly. Old and young people will get where they need to go. You cannot automate driving without driving in the real world. If you think they haven’t been doing this in simulation for a decade, you’re on crack.

        • pivot_root@lemmy.world

          I still wouldn’t trust the company with a CEO who unilaterally decided that not having redundant systems makes for a better product.

          • linearchaos@lemmy.world

            I absolutely don’t trust the CEO. I don’t even need to trust the company, there are a dozen others trying to work out the same problem.

    • TheGrandNagus@lemmy.world

      That’s true, but I think the issue people have with “AutoPilot” is about marketing.

      Tesla brands their cars’ solution as a full replacement for human interaction, and word from Musk, other Tesla employees, media personalities close to Tesla, and fanboys all makes out like the car drives itself and the only reason you need a driver in place is to satisfy laws.

      It’s bullshit. They know exactly what they’re doing when they do the above, when they call their system “AutoPilot”, when Musk makes claims his cars can travel from one side of the US to the other without human interaction (only to never actually do it, of course!), and sells car upgrades as Full Self Driving support.

      If they branded it as Assisted Driving, Advanced Cruise Control, Smart Cruise, or something along those lines, like all the other carmakers do with their similar systems, I’d be less inclined to blame Tesla when there’s an unfortunate incident. I think most would agree with me, too.

      But Tesla markets and encourages, both officially and unofficially, that their cars have the ability to drive themselves, look after themselves, and that you’re safe when using the system. It’s a lie and I’m absolutely astounded they’ve had little more than a series of slaps on the wrist for it in most markets.

      • linearchaos@lemmy.world

        100% accurate.

        They want people to use it so they get data from it. Accidents and deaths will happen… honestly, they’ll always happen… they happen now without it, it’s just more acceptable because it’s human error. Road safety is absolutely awful.

        The reason they get away with it is lobbying, money, and political favors. They got where they are by greasing a whole shit ton of wheels with dump trucks of money.

        Shitty means, but pretty righteous ends.

      • Takumidesh@lemmy.world

        Tbf, Teslas are the only cars that actually know you are on your phone and/or not paying attention.

    • Imgonnatrythis@sh.itjust.works

      Should a couple handfuls of deaths stop them, if, as you said, you can’t test it any other way? Autopilot systems could already be saving thousands of lives if more widely deployed, and the lack of good, reliable autopilot systems has the opportunity cost of blood on our hands. Human drivers are well established to be dangerous. Testing and release of autopilot systems should be done as safely as possible, but to think the first decade or so of these systems will be flawless seems unreasonable.

    • gregorum@lemm.ee

      The fact is that most technology that we take for granted today went through a similar evolutionary phase with public use before they became as safe as they are now, especially cars themselves. For well over a century, the automobile has made countless leaps and bounds in safety improvements due to data gathered from public use studies.

      We learn by doing.

      • kingthrillgore@lemmy.ml

        That’s fine, but Waymo, Cruise, et al. do trials on closed courses and in cooperation with states to assure a high degree of public safety. Tesla is testing without asking regulators.

        • gregorum@lemm.ee

          Do they? I actually, and honestly, have very little to no knowledge of how these companies gather their data, which is why I did not mention them. Can you provide any links to information about them? I honestly would like to learn more.

  • WoahWoah@lemmy.world

    Tesla “autopilot” averages one airbag deployment every five million miles.

    The average driver in the U.S. averages one every 600,000 miles.

    Idk. Doesn’t seem like it works perfectly, but it does seem to work pretty well.
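
    The naive arithmetic behind those two figures, with no adjustment for where and when Autopilot actually gets used:

    ```python
    # Raw ratio of the two figures above; no adjustment for conditions.
    miles_per_airbag_autopilot = 5_000_000
    miles_per_airbag_us_average = 600_000
    ratio = miles_per_airbag_autopilot / miles_per_airbag_us_average
    print(f"{ratio:.1f}x more miles per deployment")  # ~8.3x, naively
    ```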

    • noride@lemm.ee

      The comparison is a little flat when you consider that Autopilot has minimum viable weather and road-condition requirements to activate (no snow or hail, etc.), while human drivers must endure and perform optimally in all road and weather conditions.

    • linearchaos@lemmy.world

      Man, that’s some interesting brigading we have going on here. You throw facts at them and they just explode.

  • CmdrShepard@lemmy.one

    He says this, yet they’re already out on the roads logging millions of miles without any outsized danger. Just because we get sensational headlines about a driver behind the wheel who crashed into the side of a semi doesn’t mean they’re any more dangerous than any other car. AFAIK Tesla still has far fewer wrecks than many others. These driving aids have a lot of room for improvement, but they only need to be better than an average driver in order to reduce accidents.

    • 1984@lemmy.today

      I think it’s very likely that Tesla especially would go ahead with technology that is dangerous in certain situations, as long as it only happens rarely.

      We all know what kind of a man Elon is.

      You would not see the same in other established car brands.

      Elon is the kind of man who would break not only eggs, but the chickens and the chicken farmers, to make his omelet, and if people get hurt, he would blame it on them for being stupid.

      I would never trust a Tesla because I obviously don’t trust Elon and nobody should.

    • Ghostalmedia@lemmy.world

      It’s not just about what he’s saying. It’s about the internal data he’s leaking to back up the claims.

    • JWayn596@lemmy.world

      Nah, it’s because they decided to use cameras instead of LiDAR and then tried to make it autonomous instead of a driver aid.

      AI is at its best when it’s opening up productivity and freedom to think critically or leisurely, the same way sticky notes help someone study.

      • CmdrShepard@lemmy.one

        Autopilot is just advanced cruise control. I think you’re conflating it with FSD which is their autonomous driving feature.

    • pedz@lemmy.ca

      I know it’s not the answer you’re looking for but, what is safer for pedestrians, cyclists and other drivers, is to have less cars on the roads. Buses can move dozens of people with a single trained professional driver. Trains can move hundreds. It’s illogical to try to push for autonomous cars for individuals when we already have “self driving” technologies that are much much safer and much more efficient.

      • Cold_Brew_Enema@lemmy.world

        You anti car people find any way to insert your views into a conversation. Let me guess, you also do Crossfit?

        • pedz@lemmy.ca

          Being “anti car” is good for people who love cars. More public transit means less traffic, less congestion, less demand for gas, and generally just more space for people who actually like to drive cars.

          Plus, if some people don’t want to drive a car and just want to get places, maybe don’t get a car? There’s already safe and proven “technology” to do that. I understand the added safety bonus of “autonomous” cars but let’s be real, it’s not advertised as something to boost the safety of everyone around, it’s advertised as “autopilot” or even worse, “Full Self Driving”.

          I am certainly anti car, but pointing out the flaws in “FSD” or “autonomous cars” and how it’s being falsely marketed to people is also on topic and is not exactly “inserting my views”. People can still love cars and use them, just don’t BS us with the “FSD” and “autonomous” spiel.

    • Ghostalmedia@lemmy.world

      Depends on the Autopilot feature.

      I was test-driving a Model 3, and Summon almost ran over a little kid in the parking lot until my wife ran in front of the car.

      At least when my car’s collision sensors misread something, my eyeballs are there for redundancy.

    • JohnEdwa@sopuli.xyz

      Someone paying proper attention probably would be. But a huge chunk of accidents happen because idiots are looking at their phones or falling asleep at the wheel, and at least self-driving cars, even Teslas on Autopilot, won’t do that.

      • Honytawk@lemmy.zip
        11 months ago

        No, they just relinquish control to a sleepy driver without a warning whenever they are about to crash.

        • anotherandrew@lemmy.mixdown.ca

          We aren’t at the point yet — with any self-drive car — where you should be behind the wheel unless you’re absolutely capable of taking over in seconds.

        • JohnEdwa@sopuli.xyz

          If you are referring to Autopilot, yeah, technically it does that: it turns off once it realises it can’t do anything anymore to avoid the collision, so that it doesn’t speed off afterwards due to damaged sensors or glitches, etc. But the whole “autopilot turns off so it doesn’t show in statistics” thing was a blatant lie, as Tesla counts all crashes where it was on in the seconds before the crash.

            • JohnEdwa@sopuli.xyz

              We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. https://www.tesla.com/en_eu/VehicleSafetyReport

              In the case the crash happened later than 5 seconds after Autopilot was disabled, or it was never used in the first place, it would be in the “Tesla vehicles not using autopilot technology” part of the data.
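
              Read literally, the quoted policy amounts to something like this sketch (the field names are invented):

              ```python
              AP_WINDOW_S = 5.0  # deactivation within 5 s of impact still counts

              def in_autopilot_bucket(active_at_impact, seconds_since_ap_off):
                  """Which bucket a counted crash lands in, per the quoted
                  policy: 'Autopilot' if it was active at impact or shut off
                  within 5 s beforehand; otherwise the other bucket."""
                  if active_at_impact:
                      return True
                  return (seconds_since_ap_off is not None
                          and seconds_since_ap_off <= AP_WINDOW_S)

              print(in_autopilot_bucket(False, 3.0))   # True: counted against Autopilot
              print(in_autopilot_bucket(False, 12.0))  # False: 'not using autopilot'
              ```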

              As for automatically detecting not-crashes, that’s a bit harder to do, don’t ya think?