Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone
Date: 24 Jul 2024, 16:39:57
Organisation : People for the Ethical Treatment of Pirates
Message-ID : <lgclidFbd6U2@mid.individual.net>
User-Agent : slrn/1.0.3 (Darwin)
On 2024-07-24, Andrew <andrew@spam.net> wrote:
> Chris wrote on Wed, 24 Jul 2024 07:05:03 -0000 (UTC) :
>>> Everyone on this planet should have a right to basic privacy.
>> And they do. That right is not absolute, however. Just like everyone
>> has a right to freedom until they are convicted of a serious crime
>> and are sent to prison. Even *suspects* of serious crimes are held in
>> prison before conviction.
>
> Given, for all we know, absolutely zero pedophiles were caught and
> convicted by the Meta & Google (and even Apple) system, the safety
> gained is zero.
Nah, there are news articles all the time about shit like this:

Man Stored Child Pornography on Google Account, Sentenced to 14 Years
in Federal Prison
<https://www.justice.gov/usao-wdtx/pr/man-stored-child-pornography-google-account-sentenced-14-years-federal-prison>
But there are also plenty of horrendous stories of false matches:

Google AI flagged parents’ accounts for potential abuse over nude photos
of their sick kids
<https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>
Apple's proposal went further than any other to protect its customers'
privacy and to reduce the possibility of false matches, but it still
fell short and put people's privacy at risk, which is why Apple ended
up scrapping it.
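
For the curious, the key false-match safeguard in Apple's published
design was a threshold: matching happened on-device against hashes of
*known* CSAM, and nothing was flagged for human review until an account
accumulated roughly 30 matches. Here's a rough sketch of just that
threshold idea (the hash values, names, and the plain-integer hash type
below are made up for illustration; the real system used NeuralHash
with private set intersection and threshold secret sharing, not
anything this simple):

  // Illustrative sketch only: hash values and names are hypothetical.
  // A perceptual hash, reduced to a plain integer for demonstration.
  typealias PerceptualHash = UInt64

  // Hashes of known CSAM images (made-up values).
  let knownHashes: Set<PerceptualHash> = [0xDEADBEEF, 0xCAFEBABE]

  // Apple's published design required on the order of 30 matches
  // before anything was escalated, so a stray false positive or two
  // could never trigger a report on its own.
  let reviewThreshold = 30

  // Count how many photos in a library match the known-hash set.
  func matchCount(of libraryHashes: [PerceptualHash]) -> Int {
      libraryHashes.filter { knownHashes.contains($0) }.count
  }

  // Only escalate once the match count crosses the threshold.
  func shouldEscalateForHumanReview(_ libraryHashes: [PerceptualHash]) -> Bool {
      matchCount(of: libraryHashes) >= reviewThreshold
  }

  // Example: two matches stay far below the threshold.
  let photoLibrary: [PerceptualHash] = [0xDEADBEEF, 0x12345678, 0xCAFEBABE]
  print(shouldEscalateForHumanReview(photoLibrary))  // false

The Verge story above is essentially what happens without a threshold
like that: one or two misclassified photos of a sick kid were enough to
get parents reported and their accounts nuked.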
-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR