Waymo issued a recall after two robotaxis crashed into the same pickup truck

Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying that its robotaxis “made contact” with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to a “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
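Waymo hasn’t published the mechanics of the fix, but the failure mode can be illustrated with a toy sketch (hypothetical code, not Waymo’s actual system): a naive motion predictor that extrapolates along a vehicle’s heading will send a backwards-towed truck the wrong way, and a persistent gap between the heading and the observed direction of travel is the signal that the heading can’t be trusted.

```python
import math

def predict_from_heading(pos, heading, speed, dt):
    # Naive predictor: assume the vehicle moves in the direction it faces.
    return (pos[0] + speed * math.cos(heading) * dt,
            pos[1] + speed * math.sin(heading) * dt)

def orientation_mismatch(heading, velocity):
    # Angle between where the vehicle faces and where it actually moves.
    motion_dir = math.atan2(velocity[1], velocity[0])
    return abs((heading - motion_dir + math.pi) % (2 * math.pi) - math.pi)

# A pickup towed backwards: it faces east (heading 0 rad) but moves west.
pos, heading, velocity = (0.0, 0.0), 0.0, (-10.0, 0.0)
speed = math.hypot(*velocity)

naive = predict_from_heading(pos, heading, speed, 1.0)  # (10.0, 0.0): wrong way
actual = (pos[0] + velocity[0], pos[1] + velocity[1])   # (-10.0, 0.0)

# If heading and motion persistently disagree (here by ~pi radians), fall
# back to extrapolating the observed velocity instead of the heading.
if orientation_mismatch(heading, velocity) > math.pi / 2:
    prediction = actual
else:
    prediction = naive
```

In a real tracker this check would run over many frames (hence “persistent”), but even this sketch shows why a single-frame heading-based prediction points a towed vehicle in exactly the wrong direction.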

  • GiveMemes@jlai.lu · 9 months ago · +20/−17

    Get your beta tests off my tax dollar funded roads pls. Feel free to beta test on a closed track.

    • Chozo@kbin.social · 9 months ago · +26/−4

They’ve already been testing on private tracks for years. There comes a point where, eventually, something new is used for the first time on a public road. Regardless, even with idiotic crashes like this one, they’re still safer than human drivers.

      I say my tax dollar funded DMV should put forth a significantly more stringent driving test and auto-revoke the licenses of anybody who doesn’t pass, before I’d want SDCs off the roads. Inattentive drivers are one of the most lethal things in the world, and we all just kinda shrug our shoulders and ignore that problem, but then we somehow take issue when a literal supercomputer on wheels with an audited safety history far exceeding any human driver has two hiccups over the course of hundreds of millions of driven miles. It’s just a weird outlook, imo.

      • fiercekitten@lemm.ee · 9 months ago · +2/−3

        People have been hit and killed by autonomous vehicles on public streets due to bad practices and bad software. Those cases aren’t hiccups, those are deaths that shouldn’t have happened and shouldn’t have been able to happen. If a company can’t develop its product and make it safe without killing people first, then it shouldn’t get to make the product.

        • Chozo@kbin.social · 9 months ago · +3/−1

People have been hit and killed by human drivers at much, much higher rates than by SDCs. Those aren’t hiccups either, and those are deaths that shouldn’t have happened, as well. The miles-driven-per-collision ratio between humans and SDCs isn’t even comparable. Human drivers are an order of magnitude more dangerous, and there’s an order of magnitude more human drivers than SDCs in the cities where these fleets are deployed.

          By your logic, you should agree that we should be revoking licenses and removing human drivers from the equation, because people are far more dangerous than SDCs are. If we can’t drive safely without killing people, then we shouldn’t be licensing people to drive, right?

          • fiercekitten@lemm.ee · 9 months ago · +2/−2

            I’m all for making the roads safer, but these companies should never have the right to test their products in a way that gets people killed, period. That didn’t happen in this article, but it has happened, and that’s not okay.

            • Chozo@kbin.social · 9 months ago (edited) · +5/−1

              People shouldn’t drive in a way that gets people killed. Where’s the outrage for the problem that we’ve already had for over a century and done nothing to fix?

              A solution is appearing, and you’re rejecting it.