[HN] Self-driving cars less likely to detect children and people of color - eviltoast
  • Endomlik@reddthat.com
    1 year ago

    Seems this will always be the case. Small objects are harder to detect than large ones, and high-contrast objects are easier to detect than low-contrast ones. Even if detection gets 1000x better, these gaps will persist. Do you introduce artificial error to make things fair?

  • AutoTL;DR@lemmings.worldB
    1 year ago

    This is the best summary I could come up with:


    As the artificial intelligence revolution ramps up, one trend is clear: Bias in the training of AI systems is resulting in real-world discriminatory practices.

    A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person’s race, gender, and age.
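    A per-group comparison like the one described could be sketched roughly as follows. This is an illustrative sketch only: the group labels, the detection records, and the miss-rate metric are assumptions for demonstration, not the researchers' actual methodology or data.

    ```python
    # Illustrative sketch: comparing per-group miss rates of a pedestrian detector.
    # Groups and records below are made-up examples, not the study's data.
    from collections import defaultdict

    def miss_rates(records):
        """records: list of (group, detected) pairs; returns miss rate per group."""
        totals = defaultdict(int)
        misses = defaultdict(int)
        for group, detected in records:
            totals[group] += 1
            if not detected:
                misses[group] += 1
        return {g: misses[g] / totals[g] for g in totals}

    records = [
        ("adult", True), ("adult", True), ("adult", False), ("adult", True),
        ("child", True), ("child", False), ("child", False), ("child", True),
    ]
    rates = miss_rates(records)
    # A simple disparity measure: difference in miss rates between two groups.
    disparity = rates["child"] - rates["adult"]
    ```

    A study would of course use real detector outputs and far larger samples; the point is that the bias claim reduces to a measurable gap between per-group error rates.
    
    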

    “[…] Now they might face severe injury,” Jie Zhang, a computer scientist at King’s College London and a member of the research team, said in a statement.

    This trend is a result of biases already present in the open-source AI systems that many companies use to build the detectors.

    The research team called on lawmakers to regulate self-driving car software to prevent bias in their detection systems.

    “It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately,” the study reads.


    The original article contains 376 words, the summary contains 140 words. Saved 63%. I’m a bot and I’m open source!