Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying that its robotaxis “made contact” with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
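
Waymo didn’t share technical details beyond that phrase, but the failure mode it describes is easy to picture: if a motion predictor extrapolates a vehicle’s path along its detected heading, a pickup being towed rear-first travels in the opposite direction from the way its nose points, so every forecast lands on the wrong side of the truck’s actual position. Here is a purely illustrative sketch of that mismatch (none of this is Waymo’s code):

```python
import math

def predict_position(x, y, heading_rad, speed, dt):
    """Naive forecast that projects motion along the detected heading."""
    return (x + speed * dt * math.cos(heading_rad),
            y + speed * dt * math.sin(heading_rad))

speed, dt = 5.0, 2.0  # m/s, seconds

# A pickup towed rear-first: its nose (the detected heading) points
# back toward the robotaxi, 180 degrees from its direction of travel.
forecast = predict_position(0, 0, math.radians(180), speed, dt)
reality = predict_position(0, 0, math.radians(0), speed, dt)

print(forecast)  # (-10.0, ~0.0): the model expects the truck over here...
print(reality)   # (10.0, 0.0): ...while it actually ends up over there
```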

  • bstix@feddit.dk

    The company says the truck was being towed improperly

    Shit happens on the road. It’s still not a great idea to drive into it.

    The company developed and validated a fix for its software to prevent similar incidents

    So their plan is to fix one accident at a time…

    • DoomBot5@lemmy.world

      Rules are written in blood. Once you figure out all the standard cases, you can only try to predict as many edge cases as you can think of. You can’t make something foolproof, because there will always be a greater fool who comes along.

  • Overzeetop@lemmy.world

    The description of an unexpected (impossible?) orientation for an on-road obstacle works as an excuse right up to the point where you realize that the software should, explicitly, not run into anything at all. That’s got to be, like, the first law of (robotic) vehicle piloting.

    It was just lucky that it happened twice; otherwise, Alphabet likely would have shrugged it off as some unimportant, random event.

    • dan1101@lemm.ee

      Billionaires get to alpha test their software on public roads and everyone is at risk.

      • nivenkos@lemmy.world

        It’s great though - that’s how you get amazing services and technological advancement.

        I wish we had that. In Europe you’re just stuck paying 50 euros for a taxi in major cities (where the taxi drivers block the roads, etc., to maintain their monopolies).

        Meanwhile in the USA you guys have VR headsets, bioluminescent houseplants and self-driving cars (not to mention the $100k+ salaries!). It’s incredible.

        • vaultdweller013@sh.itjust.works

          Most of us are in poverty. I don’t know when it happened, but we’re in another gilded age, and just like the last one, underneath the gold is rusty iron.

          • Patches@sh.itjust.works

            Bruh in the US of A the grass is greener because it’s made of polypropylene and spray-painted green. Just don’t smell it, or look too hard.

  • Chozo@kbin.social

    After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it.

    Having worked at Waymo for a year troubleshooting daily builds of the software, this sounds to me like they may be trying to test riskier, “human” behaviors. Normally, the car won’t accelerate at all if the lidar detects an object in front of it, no matter what it thinks the object is or what direction it’s moving in. So the fact that this failsafe was somehow overridden makes me think they’re trying to add more “What would a human driver do in this situation?” options to the car’s decision-making process. I’m guessing somebody added something along the lines of “assume the object will have started moving by the time you’re closer to that position” and forgot to add a backup safety mechanism for the case where the object doesn’t start moving.

    I’m pretty sure the dev team also has safety checklists that they go through before pushing out any build, to make sure that every failsafe is accounted for, so that’s a pretty major fuckup to have slipped through the cracks (if my theory is even close to accurate). But luckily, it’s a very easily fixed fuckup. They’re lucky this situation was just “comically stupid” instead of “harrowing tragedy”.
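
    In rough pseudocode, the kind of logic I’m picturing looks like this (every name here is hypothetical; this is my speculation, not Waymo’s actual code):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        predicted_to_move: bool  # planner's forecast: "it'll be gone by then"
        still_present: bool      # what the lidar actually reports right now

    def plan_speed(obstacle: Obstacle, ego_speed: float) -> float:
        # The risky "human" heuristic: trust the forecast and keep rolling.
        if obstacle.predicted_to_move:
            return ego_speed
        return 0.0  # otherwise stop for the detected object

    def plan_speed_with_failsafe(obstacle: Obstacle, ego_speed: float,
                                 lidar_range_m: float) -> float:
        # The backup check I suspect was missing: no matter what the
        # predictor believes, brake if the object is still physically there.
        if obstacle.still_present and lidar_range_m < 15.0:
            return 0.0
        return plan_speed(obstacle, ego_speed)

    # The stalled, towed pickup: forecast says "it'll move", lidar says it hasn't.
    stalled = Obstacle(predicted_to_move=True, still_present=True)
    print(plan_speed(stalled, 8.0))                      # 8.0 -> drives into it
    print(plan_speed_with_failsafe(stalled, 8.0, 12.0))  # 0.0 -> brakes
    ```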

  • rsuri@lemmy.world

    In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane.

    See? Waymo robotaxis don’t just take you where you need to go, they also dispense swift road justice.