Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone
Date: 24 Jul 2024, 16:34:16
Organisation : People for the Ethical Treatment of Pirates
Message-ID : <lgcl7oFbd6U1@mid.individual.net>
References : 1 2 3 4 5 6
User-Agent : slrn/1.0.3 (Darwin)
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> Apple's proposal was to match personal photos with *known* CSAM
>> images.
>
> Correct.
>
>> It would do nothing to detect *new* CSAM images.
>
> Also correct.
>
>> And it could not prevent false positive matches.
>
> Incorrect.
Nope. I am correct. It absolutely could not prevent false matches.
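To see why, here's a minimal illustrative sketch (not Apple's actual
NeuralHash, just generic threshold-based perceptual-hash matching with
made-up hash values): any threshold loose enough to catch re-encoded
copies of a known image is also loose enough to catch some unrelated
image that happens to hash nearby.

```python
# Illustrative sketch only -- NOT Apple's NeuralHash. It shows why
# threshold-based perceptual-hash matching admits false positives.

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_hash(photo_hash: int, known_hashes: set,
                       threshold: int = 4) -> bool:
    """Flag a photo whose hash lies within `threshold` bits of any
    known hash. The slack tolerates re-encoding and resizing, but it
    equally tolerates an unrelated image that happens to land close."""
    return any(hamming_distance(photo_hash, k) <= threshold
               for k in known_hashes)

KNOWN = {0xDEADBEEFDEADBEEF}  # hypothetical known-image hash

# A re-encoded copy of the known image (hash off by 2 bits): flagged.
print(matches_known_hash(0xDEADBEEFDEADBEEC, KNOWN))  # True

# An unrelated image whose hash happens to differ by only 3 bits:
# also flagged -- a false positive the design cannot prevent.
print(matches_known_hash(0xDEADBEEFDEADBEE8, KNOWN))  # True
```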

> It is designed to avoid false positives, although nothing is 100%
> perfect.

If it has even a 0.1% false-match rate, then someone's privacy has been
violated.
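To put a number on it: at platform scale even a tiny per-photo
false-match rate flags an enormous number of innocent photos. Both
figures below are hypothetical illustration values, not Apple's
published rates.

```python
# Back-of-envelope arithmetic with hypothetical numbers (neither
# figure is an Apple-published rate).
false_match_rate = 0.001         # 0.1% of scanned photos match falsely
photos_scanned = 1_000_000_000   # assume one billion photos scanned

expected_false_matches = false_match_rate * photos_scanned
print(f"{expected_false_matches:,.0f} photos wrongly flagged")
# -> 1,000,000 photos wrongly flagged
```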
>> Everyone on this planet should have a right to basic privacy.
>
> And they do.

Tell that to the people whose private photos are scanned and who are
falsely accused of a crime they didn't commit because an imperfect
algorithm got it wrong.
-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR