In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black - eviltoast

Was this AI trained on an unbalanced data set (only Black folks)? Or has it only been used to identify photos of Black people? I have so many questions: some technical, some about media sensationalism.

  • cobra89@beehaw.org · 1 year ago (edited)

    IMO, the systemic racism is the fact that the models aren’t accurate for people of color, yet the AI is being put to use on them anyway. If the AI were this bad at identifying white people, do we really think it would be in active use for arresting people?

    It’s not the fact that the technology is much worse at identifying people of color that is the issue; it’s that it’s being used anyway despite that.

    And if the response is "oh, they were just being careless and didn’t realize it was doing that," then it’s egregious that they never even checked for it.
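
    For what it’s worth, the kind of pre-deployment check being talked about here is not exotic. Below is a minimal, hypothetical sketch (the data and group labels are entirely made up, and no real vendor’s API is assumed) of computing the false-positive match rate per demographic group, which is exactly the disparity audits of these systems look for:

    ```python
    # Hypothetical per-group error-rate check. The records are synthetic:
    # (group, predicted_match, actual_match) triples from an imagined
    # evaluation set -- NOT data from any real facial recognition system.
    from collections import defaultdict

    results = [
        ("A", True, False), ("A", True, True), ("A", True, False), ("A", False, False),
        ("B", True, True), ("B", False, False), ("B", False, False), ("B", True, True),
    ]

    def false_positive_rate(records):
        """False matches divided by all true non-matches, per group."""
        fp = defaultdict(int)   # predicted a match where none existed
        neg = defaultdict(int)  # all true non-matches seen for the group
        for group, predicted, actual in records:
            if not actual:
                neg[group] += 1
                if predicted:
                    fp[group] += 1
        return {g: fp[g] / neg[g] for g in neg}

    rates = false_positive_rate(results)
    print(rates)  # {'A': 0.6666666666666666, 'B': 0.0}
    ```

    In this toy data, group A is falsely matched two times out of three while group B never is; a gap like that in a real evaluation is the red flag that should stop a rollout.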

    • nieceandtows@programming.dev · 1 year ago

      That part I can agree with. These issues should have been fixed before the system was rolled out. The fact that they don’t care is very telling.