"A Billion Nazis at the Table" - The Fediverse model proves contextual moderation by real humans is both easy and affordable. The presence of Nazis on corporate social media implies at least a tacit a - eviltoast

“If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible.”

  • Touching_Grass@lemmy.world · 1 year ago

    I think it's a numbers game. If the Fediverse had the numbers, it would be plagued with all the same issues. But it's a little fish in a big pond.

    • JustinHanagan@kbin.social (OP) · 1 year ago

      If a Fediverse instance grew so big that it couldn’t moderate itself and had a lot of spam/Nazis, presumably other instances would just defederate, yeah? Unless an instance is ad-supported, what’s the incentive to grow beyond one’s ability to stay under control?
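
      A minimal sketch of the defederation mechanism being described (made-up domain and function names, not any real server's code): every incoming federated activity names the instance it came from, so a defederated instance can simply be dropped at the door.

      ```python
      # Hypothetical illustration of defederation as an inbound filter.
      BLOCKED_INSTANCES = {"spam-and-nazis.example"}  # admin-maintained blocklist


      def accept_activity(activity: dict) -> bool:
          """Return True only if the activity's home instance is not defederated."""
          actor = activity.get("actor", "")  # e.g. "https://spam-and-nazis.example/u/someone"
          domain = actor.split("/")[2] if "//" in actor else ""
          return domain not in BLOCKED_INSTANCES


      incoming = {"type": "Create", "actor": "https://spam-and-nazis.example/u/someone"}
      if not accept_activity(incoming):
          pass  # drop silently: nothing from that instance reaches local users
      ```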

        • fubo@lemmy.world · 1 year ago

          “questionable pictures”

          We need to keep distinguishing “actual, real-life child-abuse material” from “weird/icky porn”. Fediverse services have been used to distribute both, but they represent really different classes of problem.

          Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.

          Weird/icky porn is not typically illegal, but it’s something many people don’t want to support or be associated with. Instance owners have a right to say “I don’t want my instance used to host weird/icky porn.” Other instance owners can say “I quite like the porn that you find weird/icky, please post it over here!”

          Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because it is a live threat to anyone who gets it on their computer.
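
          To make that distinction concrete, here is a sketch of the two handling paths (illustrative names only, not an implementation anyone ships): illegal material triggers a mandatory report-and-preserve path, while merely unwanted content is an ordinary local moderation decision.

          ```python
          from enum import Enum, auto


          class Verdict(Enum):
              ILLEGAL = auto()              # e.g. real-life CSAM
              AGAINST_LOCAL_RULES = auto()  # legal, but this instance won't host it
              ALLOWED = auto()


          def handle_upload(verdict: Verdict) -> str:
              if verdict is Verdict.ILLEGAL:
                  # non-negotiable path: refuse to serve, preserve evidence, report
                  return "reject + preserve evidence + report to authorities"
              if verdict is Verdict.AGAINST_LOCAL_RULES:
                  # discretionary path: each instance draws this line for itself
                  return "remove locally; another instance may choose to host it"
              return "publish"
          ```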

            • fubo@lemmy.world · 1 year ago

              You’d be surprised by how much of the Internet was built by furries, BDSM folk, and other people whose porn a lot of folks think is weird and icky.

              Also, you seem to have misunderstood the gist of my comment, or I wasn’t clear enough. The tools to deal with CSAM will of necessity be a lot stronger than content moderation that’s driven by users’ preferences of what they’d like not to see.

                • fubo@lemmy.world · 1 year ago

                  I’m talking about the necessities of moderation policy.

                  The things you think it’s “suspect” that I’m not saying? Those are things I think are obviously true and don’t need restating. Yes, child abuse is very bad. We know that. I don’t need to say it over again, because everyone already knows it. I’m talking specifically about the needs of moderation here.

                  I’m pointing at the necessary distinction between “you personally morally object to that material” and “that material will cause the law to come down on you and your users and anyone who peers with you”.

                  You should have the ability to keep both of those off your server, but the latter is way more critical.


                  “White knighting”? Delete your account.