AI trained on photos from kids’ entire childhood without their consent - eviltoast
  • Peter Bronez@hachyderm.io · 5 months ago

    @along_the_road what’s the alternative scenario here?

    You could push to remove some public information from common crawl. How do you identify what public data is _unintentionally_ public?

    Assume we solve that problem. Now the open datasets, and the models developed on them, are weaker. They’re specifically weaker at recognizing children as things that exist in the world. Do we want that? What if it reduces the performance of cars’ emergency braking systems? CSAM filters? Family photo organization?

    • kent_eh@lemmy.ca · 5 months ago

      what’s the alternative scenario here?

      Parents could stop uploading pictures of their kids everywhere in a vain attempt to attract attention to themselves.

      That would be good.

      • Peter Bronez@hachyderm.io · 5 months ago

        @kent_eh exactly.

        The alternative is “if you want your content to be private, share it privately.”

        If you transmit your content to anyone who sends you a GET request, you lose control of that content. The recipient has the bits.
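
        A minimal sketch of that point, assuming a hypothetical URL: any client that can issue a GET request walks away with its own copy of the bytes, and nothing in HTTP lets the publisher revoke that copy afterward.

            # Python: fetch a publicly served image with a plain GET request.
            # The URL is hypothetical, standing in for any publicly shared photo.
            import urllib.request

            url = "https://example.com/family-photos/kid.jpg"

            with urllib.request.urlopen(url) as resp:
                data = resp.read()  # the recipient now has the bits

            with open("local-copy.jpg", "wb") as f:
                f.write(data)  # and can keep or reuse them; HTTP offers no recall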

        It would be nice to extend the core technology to better reflect your intent, perhaps by embedding license metadata in the images, the way a LICENSE.txt file travels with source code. That’s still quite weak, as we saw with Do Not Track.
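
        A minimal sketch of what that embedding could look like, using Pillow and the standard EXIF Copyright tag (0x8298). The filenames and the license string are assumptions, and, as with Do Not Track, nothing stops a crawler from ignoring or stripping the field.

            # Python + Pillow: embed an advisory license string in a JPEG's EXIF data.
            # Purely declarative; any downstream tool can ignore or strip it.
            from PIL import Image

            COPYRIGHT_TAG = 0x8298  # standard EXIF "Copyright" tag

            img = Image.open("kid.jpg")  # hypothetical input file
            exif = img.getexif()
            exif[COPYRIGHT_TAG] = "Personal photo; not licensed for ML training."
            img.save("kid-tagged.jpg", exif=exif)

        Like a LICENSE.txt, this only states intent; honoring it is entirely up to the reader.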