Telegram repeatedly refused to join child protection schemes
  • pressanykeynow@lemmy.world
    3 months ago

    Why should they? Should every mail (physical or not) you receive be opened and read? Should the government have access to everything you do on your phone or PC? Should the government moderate your house? This is full-on 1984.

    • atrielienz@lemmy.world
      3 months ago

      Even Facebook doesn’t allow CSAM in public profiles. You can’t just pull up Facebook and see that on your regular feed. Closed groups are a different story. Why should this be different?

      Mind you I’m not saying that the CEO should be criminally responsible for what users on the platform post. I’m pointing out that moderation is a thing even on some of the worst offenders in the space.

      • pressanykeynow@lemmy.world
        3 months ago

        You didn’t answer my questions.

        What moderation do you want? And how would you prevent “moderation” from becoming censorship?

        Aren’t there people whose job it is to prevent crimes? Why should some IT person who knows nothing about crime have to do their job for them?

        • atrielienz@lemmy.world
          3 months ago

          Because your questions aren’t germane to the point I was making. In fact, the first question, “how would you prevent ‘moderation’ from becoming censorship,” is literally answered by my second comment. Facebook already does this with Facebook Messenger. But even if they didn’t, Signal has functions to allow encryption.

          So what you’re saying is, criminals who aren’t using encryption (on a platform where encryption features are readily available) don’t deserve to be moderated on a platform where their messages are using a company’s cloud bandwidth. Does the company not have rights? And if we agree that the company has rights then they also have to follow the law.

          Yes, there are people whose jobs are to investigate and try criminals in a court of law for crimes (not prevent them, because police and policing are reactionary, not preventative). This was a poor question to ask. You’re literally acting like we don’t employ thousands of people across various social media and messaging platforms to review and moderate things like CSAM.

          The gist for me is: criminals gonna do criminal things, but at the end of the day these are our public spaces. Just because I don’t want to be surveilled in public or live in a police state doesn’t mean I want criminals to go unprosecuted for the crimes they commit, just because someone cares more about their bottom line than about moderating a messaging platform they provide to the public.

          We aren’t talking about end-to-end encrypted messages here. We’re talking about messages with no such encryption that can be viewed by anyone. There are literally public groups being used by terrorist organizations on Signal. And while Signal has repeatedly refused to give up encryption keys for the ones that are using encryption (as they should), any criminal that isn’t using it is not protected by it and should be moderated.