Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone
Date: 24 Jul 2024, 16:47:08
Organization: People for the Ethical Treatment of Pirates
Message-ID: <lgclvsFbd6U3@mid.individual.net>
References: 1 2 3 4 5 6 7 8
User-Agent: slrn/1.0.3 (Darwin)
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> True. Unlike others, Apple's proposal was to only scan images on
>> device that were being uploaded to Apple's servers, and only match
>> hashes of them to a database of hashes matching known CSAM images.
>> And only after multiple matches reached a threshold would Apple
>> investigate further.
>
> All correct.
>
>> Yet even with those precautions, there was still a realistic chance
>> of false positives
>
> The rate was deterministic and tunable.

If the rate was anything other than ZERO, then people's privacy was at
risk.
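
To put rough numbers on that (a back-of-the-envelope sketch of my own,
not Apple's published analysis): if each uploaded photo has an
independent false-match probability p against the hash database, and an
account is only flagged after t matches, the odds of a false flag
across n uploads are a binomial tail. You can tune p and t to make the
number tiny, but it only reaches zero if p itself is zero.

---
import Foundation

// Back-of-the-envelope sketch (not Apple's math): probability of at
// least `threshold` false matches among `uploads` photos when each
// photo independently false-matches with probability `perImageRate`.
func falseFlagProbability(perImageRate p: Double,
                          uploads n: Int,
                          threshold t: Int) -> Double {
    // log C(n, k) via lgamma, to avoid overflow for large n
    func logChoose(_ n: Int, _ k: Int) -> Double {
        lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
    }
    var tail = 0.0
    for k in t...n {
        // log of the binomial term C(n,k) * p^k * (1-p)^(n-k)
        let logTerm = logChoose(n, k)
            + Double(k) * log(p)
            + Double(n - k) * log(1 - p)
        tail += exp(logTerm)
    }
    return tail
}

// Example inputs are invented: a one-in-a-million per-photo rate,
// 20,000 photos in a library, and a 30-match threshold.
print(falseFlagProbability(perImageRate: 1e-6, uploads: 20_000, threshold: 30))
---

Nonzero is nonzero, no matter how you tune it.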
>> and invasion of privacy, which is why they scrapped the proposal.
>
> No.

Yes.

> They scrapped it because it wasn't worth pursuing. As a business it
> was of no benefit to them and the noisy reaction was enough to put
> them off. There wasn't any "invasion of privacy". At least no more
> than there currently is in the US.
Incorrect. Apple's statement makes it clear that their decision to scrap
CSAM scanning was based on the feedback they received from security and
privacy professionals:
---
“After extensive consultation with experts to gather feedback on child
protection initiatives we proposed last year, we are deepening our
investment in the Communication Safety feature that we first made
available in December 2021,” the company told WIRED in a statement. “We
have further decided to not move forward with our previously proposed
CSAM detection tool for iCloud Photos. Children can be protected without
companies combing through personal data, and we will continue working
with governments, child advocates, and other companies to help protect
young people, preserve their right to privacy, and make the internet a
safer place for children and for us all.”
---
For those unaware, the Communication Safety feature is not the same
thing at all: rather than scanning photos being uploaded to iCloud to
match against known CSAM photo hashes, Communication Safety for Messages
is opt-in and analyzes image attachments users send and receive on their
devices to determine whether a photo contains nudity. The feature is
designed so Apple never gets access to the messages, the end-to-end
encryption that Messages offers is never broken, and Apple doesn't even
learn that a device has detected nudity. Everything happens on device
and the feature is only available on children's devices, where the
parent can optionally enable it.
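
Roughly, the data flow looks like this (hypothetical names of my own;
this is not Apple's actual API, just the shape of the feature as Apple
describes it): a classifier runs locally, its verdict only decides
whether the image is blurred behind a warning, and neither the image
nor the verdict is sent anywhere.

---
import Foundation

// Hypothetical sketch of an opt-in, on-device nudity check in the
// spirit of Communication Safety. None of these names are Apple's;
// the point is the data flow: the image and the result never leave
// the device, and Apple is never contacted.

protocol NudityClassifier {
    // Runs entirely on device.
    func likelyContainsNudity(_ imageData: Data) -> Bool
}

struct IncomingAttachment {
    let imageData: Data
}

enum DisplayDecision {
    case showNormally
    case blurWithWarning  // the child sees a warning and chooses whether to view
}

func decideDisplay(for attachment: IncomingAttachment,
                   featureEnabledByParent: Bool,
                   classifier: NudityClassifier) -> DisplayDecision {
    // Opt-in: if the parent hasn't enabled the feature, nothing is analyzed.
    guard featureEnabledByParent else { return .showNormally }

    // Local classification only: no network call, no report to Apple,
    // and the end-to-end encrypted message content is never exposed.
    return classifier.likelyContainsNudity(attachment.imageData)
        ? .blurWithWarning
        : .showNormally
}
---

Unlike the scrapped iCloud proposal, nothing in this flow ever produces
a report for human review.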
-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR