Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: jollyroger (at) *nospam* pobox.com (Jolly Roger)
Newsgroups: misc.phone.mobile.iphone
Date: 30 Jul 2024, 00:35:29
Organization: People for the Ethical Treatment of Pirates
Message-ID: <lgqna1F5ej4U1@mid.individual.net>
References: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
User-Agent: slrn/1.0.3 (Darwin)
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> Yes, but even in Apple's case, there's a small chance of a false
>> positive match. And were that to happen, there is a danger of an
>> innocent person's privacy being violated.
>
> In every case there's a chance of FPs. Apple would have had a lower FPR
> than *the current* system.
>
> Given the choice I'm in favour of the better, evidence-based method.
You're wrong. The choices now are:
- Use systems and services where CSAM scanning disregards your privacy.
- Use systems and services that do no CSAM scanning of private content.
The latter happens to be Apple's systems and services (with the singular
exception of email).
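
As a side note on the false-positive exchange above: the key design
difference is the reporting threshold. Apple's threat-model summary set an
initial threshold of roughly 30 matches before anything would be surfaced
for human review, while conventional server-side scanning can act on a
single hash hit. Here is a rough, illustrative sketch of why that matters;
the per-image false-match rate and library size are made-up numbers, not
anyone's published figures, and treating matches as independent is a
simplification:

from math import exp

def p_account_flagged(n_photos, p_false_match, threshold, max_k=200):
    # Poisson approximation to the binomial tail: probability of at least
    # `threshold` false matches across `n_photos`, assuming matches are
    # independent events.
    lam = n_photos * p_false_match      # expected number of false matches
    term = exp(-lam)                    # P(exactly 0 matches)
    tail = term if threshold == 0 else 0.0
    for k in range(1, max_k + 1):
        term *= lam / k                 # P(exactly k) from P(exactly k-1)
        if k >= threshold:
            tail += term
    return tail

n, p = 20_000, 1e-6   # hypothetical library size and per-image FP rate

print(p_account_flagged(n, p, threshold=1))   # ~0.02: about 1 in 50 libraries
print(p_account_flagged(n, p, threshold=30))  # ~4e-84: effectively never

Under these assumptions a report-on-first-match policy flags roughly one in
fifty innocent libraries of that size, while a 30-match threshold makes an
innocent flag practically unobservable. That is the sense in which the
account-level FPR of Apple's design could have been far lower.
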
> You're in favour of the worse system
Nope. I have never said that. I'm in favor of no CSAM scanning of
private content.
>> Nope. I don't support any scanning of private content.
>
> Yet it's already happening so why not support the better method?
Speak for yourself. It's certainly not happening to my private content.
>> I agree - still not good enough for me though.
>
"Perfect is the enemy of the good"
>
> By seeking perfection you and others are allowing and enabling child
> abuse.
Nah. There is no child abuse occurring in my private content, and my
decision not to use or support privacy-violating technology isn't
harming anyone.
> Apple only shelved it for PR reasons, which is a real shame.
You don't know all of Apple's motivations. What we know is Apple shelved
it after gathering feedback from industry experts. And many of those
experts were of the opinion that even with Apple's precautions, the risk
of violating people's privacy was too great.
>
> That wasn't the consensus. The noisy tin-foil brigade drowned out any
> possible discussion.
Not true. There was plenty of collaboration and discussion. Here's what
Apple said about their decision:
---
"Child sexual abuse material is abhorrent and we are committed to
breaking the chain of coercion and influence that makes children
susceptible to it," Erik Neuenschwander, Apple's director of user
privacy and child safety, wrote in the company's response to Heat
Initiative. He added, though, that after collaborating with an array of
privacy and security researchers, digital rights groups, and child
safety advocates, the company concluded that it could not proceed with
development of a CSAM-scanning mechanism, even one built specifically to
preserve privacy.
"Scanning every user's privately stored iCloud data would create new
threat vectors for data thieves to find and exploit," Neuenschwander
wrote. "It would also inject the potential for a slippery slope of
unintended consequences. Scanning for one type of content, for instance,
opens the door for bulk surveillance and could create a desire to search
other encrypted messaging systems across content types."
---
> Apple should have simply implemented it like google are doing, but
> badly.
No thanks. I like the Apple that protects my privacy. Apparently you
don't.
>>> What you're disregarding is that the alternative hasn't been more
>>> privacy for people, it's been less privacy and more errors. A
>>> lose-lose situation.
>>> You want to live in a world that doesn't exist.
>> Nah. I simply choose not to engage with those alternatives for things
>> I consider private.
>
> Head-in-sand mentality.
My private photos are private, and that's the way I like it. There is no
sand here.
--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR