Tesla driver who killed 2 people while using autopilot must pay $23,000 in restitution without having to serve any jail time
  • TheGrandNagus@lemmy.world · 11 months ago

    Oh no, it’s even worse than that.

    It’s the CEO and other staff repeatedly speaking of the system as if it’s basically fully capable, and as if a driver is only required for legal reasons. Even claiming the car could drive from one side of the US to the other without driver interaction (only to never actually demonstrate that, of course).

    It’s the company never correcting people when they call it a self driving system.

    It’s the company saying they’re ready for autonomous taxis and claiming owners’ cars will make money for them while they aren’t driving them.

    It’s calling their software subscription “Full Self-Driving.”

    It’s honestly staggering to me that they’re able to get away with this shit.

    • meleecrits@lemmy.world · 11 months ago

      I love my Model 3, but everything you said is spot on. Autopilot is a great driver assist, but it is nowhere near autonomous driving. I was using it on the highway and was passing a truck on the left. The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes. Fortunately, I realized what had gone wrong and quickly accelerated so as not to become a hazard to the cars behind me.

      Using Autopilot as anything more than a nice dynamic cruise control setting is putting your life, and other lives, in danger.
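
      If I had to guess, the system was comparing the truck’s position against a straight-ahead projection of my path instead of the curving lane. Here’s a toy sketch of that failure mode (hypothetical numbers and logic, obviously nothing like Tesla’s actual code):

      ```python
      # Toy model: is a detected vehicle "in my path"? A straight-path
      # predictor mistakes a curving road for a merge; a curvature-aware
      # one does not. Hypothetical logic, not any vendor's real code.

      LANE_HALF_WIDTH = 1.8  # meters, roughly half a highway lane

      def offset_from_ego_path(obj_x, obj_y, curvature):
          """Lateral offset of an object from the predicted ego path.

          obj_x: distance ahead (m); obj_y: lateral position (m, left positive);
          curvature: 1/turn-radius (1/m, left positive). The path is a circular
          arc approximated by y = 0.5 * curvature * x**2.
          """
          path_y = 0.5 * curvature * obj_x ** 2
          return obj_y - path_y

      def looks_like_cut_in(obj_x, obj_y, curvature):
          return abs(offset_from_ego_path(obj_x, obj_y, curvature)) < LANE_HALF_WIDTH

      # Truck 40 m ahead in the lane to my right, road bending left (radius ~250 m).
      # In straight-line sensor coordinates the truck sits almost dead ahead (-0.4 m),
      # because the whole road, truck included, is sweeping across my nose.
      x, y, c = 40.0, -0.4, 1 / 250

      print(looks_like_cut_in(x, y, curvature=0.0))  # True  -> straight-path model brakes
      print(looks_like_cut_in(x, y, curvature=c))    # False -> curvature-aware model carries on
      ```

      Feed the in-path check the wrong path, and a perfectly law-abiding truck “merges” into you. That would explain exactly the brake slam I felt.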

      • Neato@kbin.social · 11 months ago

        Holy shit. If my car did that even once, I’d be a nervous wreck just thinking about using it again.

        • Wrench@lemmy.world · 11 months ago

          I give Teslas more room because I’ve been brake-checked by them on empty roads before. These ghost-braking problems are prevalent.

        • snooggums@kbin.social · 11 months ago

          I’ve had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not slamming on the brakes, but firmly enough to confuse the hell out of me.

          Every time it was confusing, and now I just don’t use it if the road is anything but open and clear.

          • buran@lemmy.world · 11 months ago (edited)

            Honda’s sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar check to see if the obstruction is real.

            My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing systems is a different one (different vehicle make, so it’s entirely possible).
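
            Even a crude cross-check would catch the shadow case: only let the system auto-brake when the camera’s “obstacle” is confirmed by radar. A rough sketch of what I mean (made-up structure, obviously not Honda’s actual code):

            ```python
            # Sketch of a camera/radar agreement gate before automatic braking.
            # Hypothetical structure; no relation to Honda Sensing's implementation.
            from dataclasses import dataclass

            @dataclass
            class Detection:
                distance_m: float      # range to the suspected obstacle
                camera_score: float    # vision confidence, 0..1 (shadows can score high!)
                radar_confirmed: bool  # did radar return an object at a matching range?

            def should_auto_brake(d: Detection) -> bool:
                if d.camera_score < 0.8:
                    return False            # vision isn't even confident
                if not d.radar_confirmed:
                    return False            # camera-only hit: likely a shadow or stain
                return d.distance_m < 60.0  # both sensors agree and it's close: brake

            bridge_shadow = Detection(distance_m=35.0, camera_score=0.93, radar_confirmed=False)
            stopped_car   = Detection(distance_m=35.0, camera_score=0.93, radar_confirmed=True)

            print(should_auto_brake(bridge_shadow))  # False -> no phantom braking
            print(should_auto_brake(stopped_car))    # True  -> real obstacle, brake
            ```

            The trade-off is that radar is famously bad with stationary objects, which may be exactly why some vendors don’t gate on it.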

        • KptnAutismus@lemmy.world · 11 months ago (edited)

          i barely trust the lane-keeping assistant in my friend’s car. imagine going 70+ km/h and the car suddenly jerking the steering to the left or right because you weren’t exactly in the middle of your lane.

          fuck modern assistants IMO. i can use the steering wheel just fine, and people have been able to for a hundred years.

          • Pennomi@lemmy.world · 11 months ago

            Considering that driving is (statistically) the most dangerous thing the average person does, I wouldn’t really say that people use the steering wheel just fine.

            It’s just that computers are currently worse at it than humans.

            • KptnAutismus@lemmy.world · 11 months ago

              agreed. if “autopilot” becomes a better driver than the average person, then it has a right to exist.

                • Pennomi@lemmy.world · 11 months ago

                  I’m not entirely sure I trust the statistics that are available, for a couple of reasons (and feel free to correct me if I’m wrong):

                  1. They are self-reported by the manufacturer
                  2. Systems like Autopilot will revert to manual control when they detect a situation they can’t handle, which means the system has the luxury of “not being at fault during the crash” when it may have caused the situation 5 seconds before (see the toy example below)
                  3. They compare against all vehicles instead of just vehicles that have similar non-self-driving but effective safety features
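
                  Point 2 is really a question of the accounting window. Here’s a toy example of how much the counting rule alone can swing the numbers (made-up data; for what it’s worth, NHTSA’s crash-reporting order uses a 30-second window):

                  ```python
                  # Toy illustration of how the attribution window changes crash counts.
                  # Made-up data; the point is the counting rule, not the values.
                  crashes = [
                      # seconds between the system disengaging and impact (None = never engaged)
                      {"id": 1, "disengaged_before_impact_s": 2.0},   # handed back 2 s before crash
                      {"id": 2, "disengaged_before_impact_s": None},  # plain human crash
                      {"id": 3, "disengaged_before_impact_s": 0.5},
                      {"id": 4, "disengaged_before_impact_s": 45.0},  # driver took over long before
                  ]

                  def automation_involved(crashes, window_s):
                      """Count a crash against the automation if it was engaged
                      within window_s seconds of impact."""
                      return sum(
                          1 for c in crashes
                          if c["disengaged_before_impact_s"] is not None
                          and c["disengaged_before_impact_s"] <= window_s
                      )

                  print(automation_involved(crashes, window_s=0))   # 0 -> "autopilot was off" framing
                  print(automation_involved(crashes, window_s=30))  # 2 -> wider, more honest window
                  ```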
            • FaceDeer@kbin.social · 11 months ago

              I wouldn’t even say that without seeing statistics to back it up. The news doesn’t cover routine traffic accidents, but if one Tesla screws up one thing, that story is front-page news. Don’t rely on anecdotes and emotions.

          • merc@sh.itjust.works · 11 months ago

            i can use the steering wheel just fine, and people have been able to for a hundred years.

            People have been bad at it for a hundred years. I’m not saying that people should necessarily be using auto-steering that keeps them in the middle of their lanes, but they should at least be using systems that beep at them when they stray out of their lane.

            The bar for self-driving technology isn’t some amazing perfect computer that never makes a mistake. It’s the average driver. The average driver is bad.

            • KptnAutismus@lemmy.world · 11 months ago

              we can do two things (these are not mutually exclusive):

              - take further control away from drivers and make them dependent on a computer, which can always misunderstand a situation and still leave the driver responsible for it.

              - educate drivers properly, at least in the US. americans have historically been bad at driving and have also been known to be undereducated.

              • merc@sh.itjust.works · 11 months ago

                I’m all for more driver education, and for stricter licensing requirements like they have in Europe. Having said that, eventually computers are going to have to take over.

                It’s pretty absurd that we’re handing control over multi-ton devices traveling at tens of meters per second to fallible, bored, easily distracted humans. The safer cars get, the safer drivers feel. The safer drivers feel, the less they feel they need to concentrate on driving.

                Safe driving just will never be a skill that humans will be good at. The tasks that humans are good at that require concentration are tasks that are challenging and remain challenging. Think playing a sport where there’s always action and you have to react. Humans are bad at tasks that are mostly routine and boring, but if your concentration lapses you can cause a catastrophe. Those are the kinds of tasks where people get bored so they start glancing away, reading a book or looking at a smartphone, or whatever. For driving to be engaging, it has to be non-boring, which means non-safe. The safer it gets, the more boring it gets, so people stop paying the required attention. There’s just no winning.

        • burliman@lemm.ee · 11 months ago

          That’s the bar automated driving is held to. It messes up once, you never trust it again, and the news spins the failure far and wide.

          Your uncle doing the same thing just triggers you to yell at him; the guy behind you flips you off, your uncle apologizes, you’re nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some might blame his entire demographic. But we would not say that no human should drive again until the problem is fixed, the way we do with automated cars.

          I do get the difference between those, and I do think they should keep making automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that’s fair, and we will never accept anything but perfect, but then we may never have automated cars. And as someone who shares the road with human drivers every day, I find that very sad.

          • maynarkh@feddit.nl · 11 months ago

            There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks shit, he or his insurance pays. Autopilot doesn’t.

            By your analogy, it’s like putting a ton of learner drivers on the road with unqualified instructors, not telling the instructors that they’re supposed to be instructors, but rather that they’re taking a taxi ride. Except it’s somehow their responsibility. And of course pocketing both the instruction and taxi fees.

            The bar for self-driving cars to be accepted is not that high. The only requirement is that they take the blame when they mess up, like every other driver.

            • burliman@lemm.ee · 11 months ago

              Yeah, for sure. Like I said, I get the difference. But ultimately we are talking about injury prevention. Even if automated cars caused one fewer death per mile than human drivers, we would still think they are terrible, despite the lives saved on net.

              And even if they caused only one death per year, we’d hear about it and might still think they are terrible.

          • Neato@kbin.social · 11 months ago

            The difference is that Tesla said it was autopilot when it’s really not. It’s also clearly not ready for primetime. And auto regulators have pretty strict requirements about reliability and safety.

            While it’s true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we’ve vastly lowered the safety of cars overall. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.

            I’m 100% for autonomous cars taking over entirely. But Tesla isn’t really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already hold them partially liable: this case caused a recall of the feature.

            • Staiden@lemmy.dbzer0.com · 11 months ago (edited)

              But the vaporware salesman said fully automatic driving was 1 year away! In 2018, 2019, 2020, 2021… He should be held responsible. The guy once said that to further technology, some people will die, and that’s just the price we pay. It was in a comment about going to Mars, but we should take that into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I’d be held responsible, so why does he get away with it?

          • SlopppyEngineer@discuss.tchncs.de · 11 months ago

            Except Tesla’s uncle has brain damage and doesn’t really learn from the situation, so he’ll do it again, and he has clones of himself driving thousands of other cars.

      • Damage@slrpnk.net · 11 months ago

        Something like that happened to me while using adaptive cruise control on a rental Jeep Renegade: it slammed the brakes twice on the highway for no clear reason. I deactivated it before it could try a third time.

      • LordKitsuna@lemmy.world · 11 months ago

        The auto cruise on the Priuses at work does this a lot. If the freeway curves to the left or something, it will panic and think I’m about to hit the cars in the lane next to me that are also going through the curve.

      • NeoNachtwaechter@lemmy.world · 11 months ago

        The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes.

        Even dynamic cruise control must never make such dangerous mistakes!

        You should demand that they fix this under warranty, and they should prove that it will never happen again.

        • LordKitsuna@lemmy.world · 11 months ago

          Almost all of them do it. The one freshest in my mind is the Prius, because my work uses them as fleet cars, so I drive them a lot. If the highway curves kind of hard to either the left or the right, sometimes it will panic and think you’re about to hit the car in the lane next to you, because they’re technically in front of you, and so it will try to brake.

          Thankfully there is an option to turn off the automatic braking; it will just start screaming instead.