Stanford researchers find Mastodon has a massive child abuse material problem - eviltoast

Not a good look for Mastodon - what can be done to automate the removal of CSAM?

    • mindbleach@lemmy.world · 1 year ago

      ‘Everyone but you agrees with me!’ Bullshit.

      ‘Nobody wants this stuff that whole servers exist for.’ Self-defeating bullshit.

      ‘You just don’t understand.’ Not an argument.

      • balls_expert@lemmy.blahaj.zone · 1 year ago

        Okay, the former then.

        Let’s just think about it: how do you think it would go if you went outside and asked people about pornographic drawings of children? How long until you found someone who thinks like you, outside your internet bubble?

        “Nobody wants this stuff that whole servers…”

        There are also servers dedicated to actual child abuse material involving real children. Do you think that argument has any value with that tidbit of information tacked onto it?

        • mindbleach@lemmy.world · 1 year ago

          Ask a stranger about anything pornographic and see how it goes.

          This is rapidly going from pointless to stupid. Suffice it to say: stop pretending drawings are ever as bad as actual child abuse.