Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone
Date: 24 Jul 2024, 22:35:02
Organization: People for the Ethical Treatment of Pirates
Message-ID: <lgdac6F3c6aU4@mid.individual.net>
User-Agent: slrn/1.0.3 (Darwin)
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
> Andrew <andrew@spam.net> wrote:
>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC):
>>> The NSPCC should really be complaining at how ineffectual the tech
>>> companies are rather than complain at Apple for not sending millions
>>> of photos to already overwhelmed authorities.
>> For all that is in the news stories, it could be ZERO convictions
>> resulted.
>>
>> Think about that.
>>
>> Is it worth everyone's loss of privacy for maybe zero gain in child
>> safety?
>
> Apple's solution wouldn't have resulted in any additional loss of
> privacy
Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy violations.
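To illustrate the false-positive point: perceptual hashing schemes (this is a generic sketch, not Apple's actual NeuralHash algorithm) match images whose hashes are merely *close*, not identical, so two unrelated images can fall within the match threshold. The hash values and threshold below are made up for demonstration:

```python
# Generic perceptual-hash matching sketch (NOT Apple's NeuralHash).
# Matching on "close enough" hashes is what makes false positives
# possible: a different image can land within the threshold.

def hamming(a: int, b: int) -> int:
    """Count the bits that differ between two hash values."""
    return bin(a ^ b).count("1")

THRESHOLD = 10  # hypothetical: hashes within 10 bits are a "match"

known_bad_hash = 0b1010_1100_1111_0000_1010_1100_1111_0000  # made-up value
innocent_photo_hash = known_bad_hash ^ 0b0110  # a different image whose
                                               # hash differs by only 2 bits

# The innocent photo "matches" despite being a different image:
is_match = hamming(known_bad_hash, innocent_photo_hash) <= THRESHOLD
print(is_match)  # True, i.e. a false positive
```

Apple's design required multiple matches before any human review, which lowers but cannot eliminate the false-positive risk.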
> plus it only affected customers of icloud. Don't like it? Don't use
> icloud. Simple.
That much is true. Only images uploaded to iCloud would have been
examined by the algorithm.
-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR