Apple faces renewed pressure to protect child safety: ‘Child sexual abuse is stored on iCloud. Apple allows it.’
  • afk_strats@lemmy.blahaj.zone · 1 year ago

    This title is misleading clickbait for an article advocating intrusive data scanning, which, by the way, cannot be completely automated.
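
    Purely to illustrate that last point: below is a minimal, hypothetical sketch of threshold-based perceptual-hash scanning. This is not Apple's NeuralHash or Microsoft's PhotoDNA; the average-hash function, the known_hashes database, and the threshold are all made up for illustration. Because matching is fuzzy, anything the scan flags is only a candidate match, and a human still has to review it before any action is taken.

    ```python
    # Hypothetical sketch of perceptual-hash scanning (NOT Apple's actual system).
    # Shows why flagged images are only candidates that still need human review.

    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Downscale to a size x size grayscale image, hash pixels against the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Count differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def scan(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
        """Return True if the image is a *candidate* match against a (hypothetical)
        database of hashes of known illegal images. Benign images can land within
        the threshold too, so a reviewer must confirm every hit."""
        h = average_hash(path)
        return any(hamming(h, k) <= threshold for k in known_hashes)
    ```

    The fuzzy threshold is the whole reason this can't be fully automated: tighten it and you miss re-encoded or cropped copies, loosen it and you flag innocent photos, which is exactly where the human reviewers come in.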

    Here’s a snippet of the iCloud TOS, which specifically forbids CSAM on iCloud:

    You agree that you will NOT use the Service to:

    a. upload, download, post, email, transmit, store, share, import or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy…

    Further down, the same TOS specifically calls out that such content may be identified or removed by Apple.

    Again, I’m not defending Apple, but I’d rather not have them or an army of underpaid contractors searching through people’s pictures as a form of corporate law enforcement because “think of the children.” This is a systemic problem that can be addressed without invading EVERYONE’s privacy.