- cross-posted to:
- technology@lemmy.world
cross-posted from: https://lemmy.zip/post/1174411
Archived version: https://archive.ph/T9zeg
Archived version: https://web.archive.org/web/20230807184110/https://arstechnica.com/information-technology/2023/08/innocent-pregnant-woman-jailed-amid-faulty-facial-recognition-trend/
This wasn’t just a bad facial recognition issue. During the photo lineup, the detective deliberately used a years-old photograph of the victim that more closely matched the person in the surveillance video, even though she had a current photo available. She also knew the woman in the surveillance video was definitely not eight months pregnant, so it couldn’t possibly be the woman they identified, but she arrested the victim anyway.
Yes, the facial recognition gave a false positive, but any reasonable person would have instantly recognized that they had the wrong person. The detective was either incompetent or a liar.
Incompetent. You don’t even need an undergraduate degree to be a detective in Detroit:
This technology is going to be a nightmare when combined with decades of deliberately selecting for low IQ in the police hiring process.
It seems obvious to check whether the woman in the surveillance footage is also pregnant. But if you have poor critical reasoning skills and you’re only looking for confirmation, I can totally see how the pregnancy could be missed.
Incompetence.
Cops will be cops