Online Safety Bill: Algorithms that lead boys to Andrew Tate content targeted
  • tony@lemmy.hoyle.me.uk · 1 year ago

    My theory is that society has a suppressing effect on these things… It’s not socially acceptable to be a Nazi, or to mistreat people you don’t like, so those views stay hidden.

    Algorithms do the opposite. Now someone with Nazi tendencies is surrounded by like-minded people and encouraged. Posts hating trans people get pushed by algorithms because they drive engagement: even if all the initial responses are negative, it’s still engagement as far as the algorithm is concerned, and it will boost the ‘popular’ post (see the sketch below).
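
    To make that concrete, here’s a minimal Python sketch of a sentiment-blind engagement ranking. The Post fields and the weights in engagement_score are made up for illustration; no real platform publishes its formula, but the shape of the problem is the same: every interaction, hostile or not, pushes a post up.

```python
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    upvotes: int = 0
    downvotes: int = 0
    replies: int = 0
    shares: int = 0


def engagement_score(post: Post) -> int:
    # Every interaction counts upward; the score has no notion of whether
    # the engagement was approval or outrage.
    return post.upvotes + post.downvotes + 2 * post.replies + 3 * post.shares


feed = [
    Post("Uncontroversial gardening tips", upvotes=40, replies=5),
    Post("Inflammatory rage-bait", upvotes=3, downvotes=50, replies=120, shares=10),
]

# The rage-bait post tops the feed purely because angry replies and
# downvotes still register as engagement.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):4d}  {post.title}")
```

    Run it and the rage-bait post scores 323 against the gardening post’s 50, even though almost all of its interactions were people objecting to it.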

    Things like Lemmy and Mastodon don’t do that, and they end up nicer places as a result.

    • Mr_Will@feddit.uk · 1 year ago

      You’re mostly right about society, but the problem isn’t algorithms, it’s echo chambers. The KKK wasn’t driven by an algorithm but still radicalised people in the same way: once you’re able to find a bubble within society that accepts your views, it’s very easy for your views to grow more extreme. It doesn’t matter whether that’s fascism, racism, communism, no-fap or hydrohomies - the mechanisms work the same way.

      Reddit was arguably no more algorithm-led than Lemmy or Mastodon, but that hasn’t prevented the rise of a whole list of hate-fueled subs over there. The root problem is that people with Nazi tendencies find pro-Nazi content engaging. The algorithm isn’t pushing it on them; it’s just delivering what they want.