Waymo issued a recall after two robotaxis crashed into the same pickup truck - eviltoast

Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying that its robotaxis “made contact” with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
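
For illustration only, here is a minimal sketch (in Python, with hypothetical names and values, not Waymo’s actual code) of how a predictor that assumes a vehicle moves in the direction its body points can be fooled by a backwards-facing towed truck:

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedVehicle:
    x: float        # position (m)
    y: float
    heading: float  # body orientation (radians)
    speed: float    # measured speed of the object (m/s)

def predict_position(v: TrackedVehicle, dt: float) -> tuple[float, float]:
    """Naive predictor: assumes the vehicle travels the way its body points."""
    return (v.x + v.speed * math.cos(v.heading) * dt,
            v.y + v.speed * math.sin(v.heading) * dt)

# A pickup towed backwards: its body points east (heading 0),
# but the tow truck is actually dragging it west at 5 m/s.
towed_pickup = TrackedVehicle(x=0.0, y=0.0, heading=0.0, speed=5.0)
print(predict_position(towed_pickup, dt=2.0))  # (10.0, 0.0): predicted 10 m east
# In reality the pickup ends up 10 m to the west, so the planner's picture of
# future free space is wrong: a persistent orientation mismatch.
```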

  • bstix@feddit.dk · +78 / -9 · 9 months ago

    The company says the truck was being towed improperly

    Shit happens on the road. It’s still not a great idea to drive into it.

    The company developed and validated a fix for its software to prevent similar incidents

    So their plan is to fix one accident at a time…

    • DoomBot5@lemmy.world · +21 / -2 · 9 months ago

      Rules are written in blood. Once you figure out all the standard cases, you can only try to predict as many edge cases as you can think of. You can’t make something foolproof, because there will always be a greater fool who comes along.

      • bstix@feddit.dk · +15 / -2 · 9 months ago

        Unexpected or not, it should do its best to stop or avoid the obstacle, not drive into it.

        An autonomous vehicle shouldn’t ever be able to actively drive forward into anything. It’s basic collision detection that ought to brake the car here. If something is in the position the car wants to drive to, it simply shouldn’t drive there. There’s no reason to blame the obstacle for being towed incorrectly…
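
        As a toy illustration of that point (not a description of Waymo’s actual planner; the braking and margin numbers are assumptions), a last-resort check could simply refuse to keep driving forward whenever something is currently detected within the car’s stopping distance, no matter what the predictor expects that object to do:

        ```python
        def safe_to_advance(obstacle_distances_m, own_speed_mps,
                            max_decel_mps2=6.0, margin_m=2.0):
            """False if any detected obstacle sits within our stopping distance."""
            stopping_m = own_speed_mps ** 2 / (2 * max_decel_mps2) + margin_m
            return all(d > stopping_m for d in obstacle_distances_m)

        # At 10 m/s with something reported 7 m ahead: brake, don't keep driving.
        print(safe_to_advance([7.0], own_speed_mps=10.0))  # False
        ```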

        • NotMyOldRedditName@lemmy.world · +7 · 9 months ago (edited)

          In this case it thought the vehicle had a different trajectory due to how it was improperly set up.

          The car probably thought it wasn’t going to hit it until it was too late and the trajectory calculation proved incorrect.

          Every vehicle on the road is a few moments away from crashing if we calculate that incorrectly. It doesn’t matter if it knows it’s there.

          • bstix@feddit.dk · +2 · 9 months ago

            Same thing applies to a human driver. Most accidents happen because the driver makes a wrong assumption. The key to safe driving is not getting in situations where driving is based on assumptions.

            Trajectory calculation is definitely an assumption and shouldn’t be allowed to override whatever sensor is checking for obstructions ahead of the car.

            • NotMyOldRedditName@lemmy.world · +2 · 9 months ago

              The car can’t move without trajectory calculations though.

              If the car ahead of you pulls forward when the light goes green, your car can start moving forward as well, keeping in mind the lead car’s trajectory and speed.

              If the rule were just “don’t hit an object in your path”, the car wouldn’t move forward until the lead car was halfway down the block.

              The car knew the truck was there in this case; it wasn’t a failure to detect. Due to a programming failure, it thought it was safe to move because it expected the truck not to be there.

              If you’re following a vehicle at a proper distance and it slams on the brakes, you should be able to stop in time, because you’ve calculated its trajectory and a safe speed behind it. But if that same vehicle slams on the brakes and goes into reverse, well… good luck.

              It’s all assumptions, even assuming the detection is accurate in the first place.
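
              A rough sketch of that distinction (illustrative values only, nothing from the article): a following rule that uses the lead vehicle’s speed lets the car start rolling at a green light, while the same rule says a stopped obstacle at highway speed needs a very large gap.

              ```python
              def required_gap_m(own_mps, lead_mps, react_s=0.5,
                                 decel_mps2=6.0, margin_m=2.0):
                  """Gap needed to still stop if the lead vehicle brakes hard.
                  Depends on BOTH assumed trajectories (speeds)."""
                  own_stop = own_mps * react_s + own_mps**2 / (2 * decel_mps2)
                  lead_stop = lead_mps**2 / (2 * decel_mps2)
                  return max(own_stop - lead_stop, 0.0) + margin_m

              # Green light: lead car already at 5 m/s, we are at 3 m/s.
              print(required_gap_m(3.0, 5.0))   # ~2 m, so we can start moving too
              # Obstacle effectively stationary while we do ~60 mph (27 m/s):
              print(required_gap_m(27.0, 0.0))  # ~76 m, i.e. treat it as a fixed obstacle
              ```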

              • bstix@feddit.dk · +1 · 9 months ago

                If you’re following a vehicle at a proper distance and it slams on the brakes, you should be able to stop in time, because you’ve calculated its trajectory and a safe speed behind it.

                You don’t need to calculate their trajectory. It’s enough to know your own.

                If a heavy box falls off a truck and stops dead in front of you, you need to be able to stop. That box has no trajectory, so it’s an error to include other vehicles’ trajectories in the safe-distance calculation.

                Traffic can move through an intersection closely by calculating a safe distance, which may be smaller than the legal definition but still large enough to stop for anything suddenly appearing on the road. The only thing needed is that the distance is calculated from your own speed and the visually confirmed positions of other objects. It can absolutely be done regardless of the speed or direction of other vehicles.
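
                For instance, a minimal sketch of that own-speed-only rule (the reaction time and braking rate are assumed values, not figures from the article):

                ```python
                def safe_gap_m(own_mps, react_s=0.5, decel_mps2=6.0, margin_m=2.0):
                    """Clear distance needed ahead, assuming whatever is there
                    could be stationary (a dropped box, a stalled truck)."""
                    return own_mps * react_s + own_mps**2 / (2 * decel_mps2) + margin_m

                print(safe_gap_m(13.4))  # ~50 km/h in the city: keep roughly 24 m clear
                ```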

                Anyway, a backwards-facing truck is a weird thing to misinterpret. Trucks sometimes face backwards for whatever reason.

                It would be interesting to know how the self-driving car would react to a wrong-way driver.

                • NotMyOldRedditName@lemmy.world · +2 / -1 · 9 months ago (edited)

                  You don’t need to calculate their trajectory. It’s enough to know your own.

                  This doesn’t make sense. It’s why I was saying the car won’t move at a stoplight when it turns green until the lead car is halfway down the street.

                  If the car is 2.5 seconds ahead of me at 60 mph on the highway, it’s only 2.5 seconds ahead of me if the other car is also doing 60 mph. If that car is doing 0 mph, then I’m going to crash into it.

                  It needs to know how fast the obstacle is going and in what direction, calculate its rate of acceleration or deceleration, and extrapolate from there.
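
                  As a rough sketch of that argument (illustrative numbers only): the same 2.5-second gap leaves either no closing speed at all or only 2.5 seconds to react, depending entirely on what the lead vehicle is doing.

                  ```python
                  def time_to_collision_s(gap_m, own_mps, lead_mps):
                      """Seconds until the gap closes at current speeds (inf if not closing)."""
                      closing = own_mps - lead_mps
                      return gap_m / closing if closing > 0 else float("inf")

                  gap = 27.0 * 2.5  # a 2.5 s gap at ~60 mph (27 m/s) is about 67 m
                  print(time_to_collision_s(gap, 27.0, 27.0))  # inf: lead also doing 60 mph
                  print(time_to_collision_s(gap, 27.0, 0.0))   # 2.5 s: lead stopped, brake now
                  ```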

                  • bstix@feddit.dk · +1 / -1 · 9 months ago

                    2.5 seconds at 60 mph is more than enough to come to a full stop. If the car in front of you dropped an anvil (traveling at 0 mph) on the road, you could stop before crashing into the anvil. You do not need to drive into the other car’s projected path.
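
                    Rough numbers to sanity-check that, under assumed braking performance rather than measured figures: 60 mph is about 27 m/s, so a 2.5-second gap is roughly 67 m, while a short reaction delay plus hard braking at about 7 m/s² needs on the order of 65 m. It fits, though without a huge margin.

                    ```python
                    v = 26.8                            # 60 mph in m/s
                    gap = 2.5 * v                       # following distance: ~67 m
                    stop = 0.5 * v + v**2 / (2 * 7.0)   # 0.5 s reaction + braking at 7 m/s^2: ~65 m
                    print(round(gap, 1), round(stop, 1), gap > stop)  # 67.0 64.7 True
                    ```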

    • Tetsuo@jlai.lu · +15 / -2 · 9 months ago

      Honestly, I think only trial and error will let us get a proper autonomous car.

      And I still think autonomous cars will save many more lives than they endanger once they become reliable.

      But for now this is bound to happen…

      To be clear, they are still responsible for these cars and the safety of others. They didn’t test properly.

      They should be trying every edge case they can think of.

      A large screen on the side of a truck? What if a car is displayed on it? Would the car’s sensors notice the difference?

      A farmer dropped a hay bale on the road? It got flattened by rain? Does the car understand that this might not be safe to drive on or brake on?

      There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

      But even if you try everything there will be mistakes and fatalities.

      • threelonmusketeers@sh.itjust.works · +1 · 9 months ago

        There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

        Do you think “better than human drivers” is sufficient for deployment on public roads, or do you think the bar should be higher?

        • Tetsuo@jlai.lu · +2 · 9 months ago

          Honestly, I’m pragmatic: if fewer people die in accidents involving autonomous cars, then yes.

          The thing is, we shouldn’t be trusting the manufacturers for these stats. They have to be reported by a government agency or something.

          Similarly, autonomous car software should have to be certified by an independent organization before being deployed, and the same goes for updates to the software. Otherwise we would get deadly updates from time to time.

          If we deploy and handle autonomous cars with the same safety approach as in aviation, I’m sure this transition can be done fairly safely.

    • Chozo@kbin.social · +16 / -4 · 9 months ago

      So their plan is to fix one accident at a time…

      Well how else would you do it?

      • bstix@feddit.dk · +32 / -4 · 9 months ago

        You drive a car and can’t quite figure out what is happening in front of you.

        Do you:

        • A: Turn up the music and plow right through.
        • B: Slow down (potentially to a full stop) and assess the situation.
        • C: Slow down, close your eyes and continue driving slowly into the obstacle
        • D: Sound the horn and flash the lights

        From the description offered in the article the car chose C, which is wrong.

        • lengau@midwest.social · +18 / -3 · 9 months ago

          Given the millions of global road deaths annually I think B is probably the least popular answer.

          • Tetsuo@jlai.lu · +1 / -1 · 9 months ago

            Honestly, slowing down too much can easily create an accident that wouldn’t have happened otherwise.

            Not every situation can be handled by slowing down.

            If that’s the default behavior on a high-speed road, it could be deadly for the car behind you.

        • Chozo@kbin.social · +9 / -8 · 9 months ago

          I wasn’t asking about the car’s logic algorithm; we all know that the SDC made an error, since it [checks notes] hit another car. We already know it didn’t do the correct thing. I was asking how else you think the developers should be working on the software other than one thing at a time. That seemed like a weird criticism.

          • bstix@feddit.dk · +13 / -7 · 9 months ago

            Sorry, I didn’t answer your question. Consider the following instead:

            Your self-driving car has crashed into a goddamn tow truck towing a backwards-facing pickup truck.

            Do you:

            • A: Program your car to deal differently with fucking backwards-facing trucks on tow trucks
            • B: Go back to question one and make your self-driving car pass a simple theory test.

            According to the article the company has chosen A, which is wrong.

      • Turun@feddit.de · +1 · 9 months ago (edited)

        Ideally they wouldn’t need actual accidents to find errors, but would discover such issues in QA and automated testing. Not hitting anything sounds like a manageable goal, to be honest.
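
        For example, the kind of scenario regression test being described might look roughly like this; the simulator API (sim, load, run) is hypothetical, not Waymo’s actual tooling.

        ```python
        # Hypothetical scenario-based regression test (assumed simulator API).
        def test_towed_pickup_angled_across_lane(sim):
            scenario = sim.load("towed_pickup_angled_across_center_turn_lane")
            result = sim.run(scenario, ego_policy="current_release", duration_s=30)
            assert result.collisions == 0               # never make contact
            assert result.min_gap_to_obstacles_m > 1.0  # and keep a safety margin
        ```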