New Study: At Least 15% of All Reddit Content is Corporate Trolls Trying to Manipulate Public Opinion
  • egeres@lemmy.world · 56 upvotes · 4 months ago

    I’ve said this before, but we also need to be cautious about this on Lemmy and devise ways to empower mods and the community to fight back. I’m not entirely sure how, since it’s a very complex problem.

    • Dave@lemmy.nz · 37 upvotes · 4 months ago

      I am convinced this is already happening. One example is the endless stream of new accounts posting ibtimes links.

      There are also propaganda websites posted regularly by new accounts (especially ones sowing disinformation about Russia’s war on Ukraine).

      Basically, be wary of anything where it’s the account’s first post. Often they make accounts and don’t use them for months so they look older (see the sketch at the end of this comment).

      I also think astroturfing is happening here, but at a lower rate than on Reddit.

      Like you, I have no idea how we can counter this at scale.
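
      A minimal sketch of that “aged but idle” check, assuming hypothetical account fields rather than any real Lemmy API:

      ```python
      from datetime import datetime, timedelta

      def looks_like_sleeper(created: datetime, first_post: datetime,
                             total_posts: int) -> bool:
          # Accounts registered long before their first visible activity,
          # with almost no history since, fit the "made and parked" pattern.
          dormant = first_post - created
          return dormant > timedelta(days=90) and total_posts <= 3
      ```

      The 90-day and 3-post thresholds are arbitrary illustrations; real cutoffs would need tuning against actual spam waves.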

      • maxinstuff@lemmy.world · 9 upvotes · 4 months ago

        The same critical thinking should apply here as on all other platforms.

        A link posted to an article on a company’s public blog published in the last 24hrs? Almost certainly viral marketing.

      • Bwaz@lemmy.world · 8 upvotes · 4 months ago

        It might help if a poster’s number of posts and signup date were listed at the top of each post or comment. It wouldn’t be a fix, but it might help weed out freshly sprouted autotrolls.
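
        A minimal sketch of that badge, with made-up field names (this is not real Lemmy UI code):

        ```python
        from datetime import datetime, timezone

        def author_badge(signup_date: datetime, post_count: int) -> str:
            # Compose the "post count + signup age" label suggested above.
            # signup_date must be timezone-aware.
            age_days = (datetime.now(timezone.utc) - signup_date).days
            badge = f"{post_count} posts · account {age_days}d old"
            if age_days < 30 or post_count < 5:  # arbitrary illustrative cutoffs
                badge += " ⚠ new/low-activity"
            return badge
        ```

        The 30-day / 5-post cutoffs are placeholders; the point is just surfacing the signal next to every post.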

        • blind3rdeye@lemm.ee · 5 upvotes · 4 months ago

          There are a lot of subreddits which routinely award hundreds or thousands of upvotes for repetitive, low-value posts. … This is a cog in the well-tuned machine of new accounts being created and matured to look ‘real’ for when they are used for advertising / manipulation later down the line.

          In the early months of a new account, it is easier to spot. E.g. if you see a post on a game subreddit with a title like “Exciting to try this game, any tips get started?”, you might click the profile and see that their entire history is a bunch of low-effort discussion starters: “Name a band from the 80s that everyone has forgotten”; “What’s the most misunderstood concept in maths?”; “What’s the most underrated (movie / band / drug / car / tourist attraction / whatever suits the topic of the subreddit)?”

          A heap of threads like that, on a new account with a very generic name (adjective-noun-numbers is a common pattern), posting on a variety of subreddits… is highly suspicious. But it gets harder to recognise as the account gets older and has a longer history - at which point it is ready to be sold / used for its next purpose.
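
          The name-shape half of that pattern is easy to approximate in code. A rough sketch, with made-up phrase lists and thresholds:

          ```python
          import re

          # Silhouette of "adjective-noun-numbers" names, e.g. "Silent_Panda_4821".
          GENERIC_NAME = re.compile(r"^[A-Z]?[a-z]+[_-]?[A-Z]?[a-z]+[_-]?\d{1,6}$")

          # A few discussion-bait phrasings; a real filter would need a longer list.
          LOW_EFFORT = re.compile(r"(what'?s the most|name a|any tips)", re.IGNORECASE)

          def suspicion_score(username: str, titles: list[str]) -> float:
              # Fraction of post titles that look like generic engagement bait,
              # bumped if the username matches the generic shape.
              if not titles:
                  return 0.0
              score = sum(bool(LOW_EFFORT.search(t)) for t in titles) / len(titles)
              if GENERIC_NAME.match(username):
                  score += 0.25
              return min(score, 1.0)
          ```

          A score like this could only ever be a prompt for human review - plenty of real users pick adjective-noun-number names.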

        • Dave@lemmy.nz · 2 upvotes · 4 months ago

          Yes, definitely. Perhaps it could be highlighted if it’s one of their first few posts or the account is new.

    • Starayo@lemm.ee · 30 upvotes · edited · 4 months ago

      It’s bloody difficult.

      I used to mod on /r/videos years and years back. We had this one guy who wasn’t very active in the day-to-day mod stuff, but was respected because he’d disappear for a few months and then reappear with a huge post in our modding sub going “so these are all spammers/malicious actors, here are their profiles, the accounts were created in these waves, here’s where they’ve copied existing posts / the identical generic comments and things they use to get around our posting requirements, the targets they’ve been promoting, etc.” Just huge pages of thoroughly researched proof.

      This was well before there was broad awareness of situations like Russia manipulating social media - back then it was usually those viral-video outfits that buy up rights to videos and handle licensing and promotion. It’s why, for a long time, any licensed videos from places like ViralHog were outright banned: they were constantly trying to manipulate reddit posts in bad faith, and even trying to socially engineer the mod team in modmail, so any video that mentioned a licensing deal in its description was automatically banned from posting.

      If we hadn’t had that one guy spotting the patterns, most of it would have slipped by easily. Unfortunately, he did eventually disappear for good. No clue what happened to him - hope he just cut out social media or something. But with the spamming and astroturfing stuff… even after fighting it for years, I can’t tell you how to counter it beyond “have more of that guy”.
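
      Some of what that one guy did by hand could be partially automated. A rough sketch of two of the tells he relied on - registration waves and identical comments across accounts - with made-up thresholds:

      ```python
      from collections import defaultdict
      from datetime import datetime

      def creation_waves(signups: dict[str, datetime],
                         window_hours: int = 6, threshold: int = 10):
          # Bucket accounts by signup time; unusually full buckets suggest
          # a coordinated registration wave.
          buckets = defaultdict(list)
          for user, created in signups.items():
              buckets[int(created.timestamp()) // (window_hours * 3600)].append(user)
          return [users for users in buckets.values() if len(users) >= threshold]

      def copied_comments(comments: list[tuple[str, str]], min_accounts: int = 3):
          # Identical comment bodies reused by several different accounts.
          by_text = defaultdict(set)
          for user, text in comments:
              by_text[text.strip().lower()].add(user)
          return {t: u for t, u in by_text.items() if len(u) >= min_accounts}
      ```

      Neither check replaces the judgment in those hand-researched write-ups; they just narrow down what a human has to look at.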

    • Audacious@sh.itjust.works · 11 upvotes · 4 months ago

      Most, if not all, game subreddits, product subreddits, and company subreddits are secretly or openly controlled by their respective corpos. Keeping communities as third-party forums is a must-have, IMO.

    • AhismaMiasma@lemm.ee · 1 upvote · 4 months ago

      I agree, this is a very complex issue. As a community we should come together and brainstorm ideas while quenching our thirst with a nice can of Diet Pepsi, the zero-sugar alternative to being thirsty!