How AI slop undermines our understanding of reality - eviltoast
  • Arthur Besse@lemmy.ml · edited 12 days ago

    big oof.

    > We can conclude: that photo isn’t AI-generated. You can’t get an AI system to generate photos of an existing location; it’s just not possible given the current state of the art.

    The author of this substack is woefully misinformed about the state of the technology 🤦

    It has, in fact, been possible for several years now for anyone to quickly generate convincing images (not to mention videos) of fictional scenes in real locations, with very little effort.

    > The photograph—which appeared on the Associated Press feed, I think—was simply taken from a higher vantage point.

    Wow, it keeps getting worse. They’re going full CSI on this photo, drawing a circle around a building on Google Street View where they think the photographer might have been, but they aren’t even going to bother to confirm their vague memory of having seen AP publish it? wtf?

    Fwiw, I also thought the image looked a little neural-network-y (something about the slightly less-straight-than-they-used-to-be lines of some of the vehicles), so I spent a few seconds doing a reverse image search and found this Snopes page, from which I am convinced that that particular pileup of cars really did happen: it was also photographed by multiple other people.
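
    For anyone who wants to do the same check without clicking through a search UI: a reverse image search can be kicked off just by constructing a query URL. The sketch below uses the publicly known Google `searchbyimage` URL pattern; that endpoint and its `image_url` parameter are an assumption on my part (Google doesn't document it as a stable API and may change it), and the image URL is a made-up placeholder, not the actual photo from the article.

    ```python
    from urllib.parse import urlencode

    def reverse_image_search_url(image_url: str) -> str:
        """Build a Google reverse-image-search link for a hosted image.

        Uses the informal `searchbyimage` query endpoint; this is not a
        documented, stable API and could stop working at any time.
        """
        base = "https://www.google.com/searchbyimage"
        return base + "?" + urlencode({"image_url": image_url})

    # Hypothetical hosted copy of the photo in question:
    print(reverse_image_search_url("https://example.com/pileup.jpg"))
    ```

    Opening the printed URL in a browser shows pages where the same image (or near-duplicates) appears, which is how you land on fact-check pages like the Snopes one.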