Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone, alt.privacy
Date: 25 Jul 2024, 16:41:59
Organization: People for the Ethical Treatment of Pirates
Message-ID: <lgfa27Fcg0lU3@mid.individual.net>
User-Agent : slrn/1.0.3 (Darwin)
On 2024-07-25, Andrew <andrew@spam.net> wrote:
> Jolly Roger wrote on 24 Jul 2024 21:29:07 GMT :
>
> What matters is the percentage
No, words have meanings, and zero means zero. And there is a
higher-than-zero number of pedophiles who have been caught due to
CSAM scanning. Unfortunately, there is also a higher-than-zero number
of innocent people whose privacy was violated in the process.
> While I support blah blah blah
Nothing you can say will change the fact that a greater-than-zero number
of people have been convicted as a result of CSAM scanning - just like
nothing you can say will convince me that CSAM scanning can be done
without violating the privacy of innocent people. Things like this
should not happen:
Google AI flagged parents’ accounts for potential abuse over nude photos
of their sick kids
<https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>
And while Apple did their best to prevent such things from happening
with their proposal, they could not guarantee it would not happen, which
is why they ended up scrapping the proposal.
> Nothing else has any meaning.
Nah.
--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR