  • lukeb28@lemm.ee
    9 months ago (edited)

    It depends on the bias of the programmer, or the choice would just be random, given the impossibility of making a correct one. If we haven’t been able to solve the problem, a robot never will unless it knows something we don’t (not an option in this hypothetical) or is able to take an action we could not.

    It’s such an absurd situation that I don’t think it’s constructive to consider. In reality there are always more options than a binary choice, and likely even more for a machine that can consider so many more inputs so much faster.

    In the end, an accident is just that: an accident. No matter how carefully you consider all possibilities and design contingencies, there is always risk in everything. After an accident, we assess what happened, update our assumptions about the probability of the event repeating, and make changes to reduce the odds of it happening again.

    That said, if someone makes a mistake that leads to the robot switching the track from an empty one to one with people on it, that’s not an accident — someone fucked up royally.