On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC):
The NSPCC should really be complaining about how ineffectual the
tech companies are rather than complaining at Apple for not
sending millions of photos to already-overwhelmed authorities.
For all that is in the news stories, the result could have been
ZERO convictions.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss
of privacy.
Actually, Apple could not guarantee that, and there was a
non-zero chance that false positive matches would result in
privacy violations.
True. The balance of risk was proportionate, however. Much more so
than the current system.
Absolutely. I'm just of the opinion that if one innocent person is
harmed, that's one too many. Would you want to be that unlucky
innocent person who has to deal with charges, a potential criminal
sexual violation on your record, and all that comes with it? I
certainly wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated.
You're entering into Confucian territory. If nothing is triggered, is
anyone's privacy infringed?
You're claiming innocent photos would never match, but there is a
possibility of false matches inherent in the algorithm, no matter how
small.
Do you want a stranger looking at photos of your sick child?
That wouldn't happen with Apple's method.
It would. If a sufficient number of images matched Apple's algorithms
(which are not perfect and allow for false matches), a human being would
be looking at those photos of your naked sick child. How else do you
think Apple would determine whether the images in question are or are
not CSAM? And what happens when that stranger decides "You know what? I
think these photos are inappropriate even if they don't match known CSAM"?
What if that stranger came to the conclusion that those photos are
somehow classifiable as sexual or abusive in some way? Would you want
to have to argue your case in court because of it?
That's a lot of ifs and steps.
Yes, but it's possible.
No-one is going to be charged for a dubious
photo of their own child. There are much bigger fish to fry and put
in jail.
You're wrong. It has already happened:
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
Him as a Criminal
<https://archive.is/78Pla#selection-563.0-1075.217>
Read the whole article to get a glimpse of what innocent people who
fall victim to this invasive scanning go through.
Do you think these parents and their child consider their privacy to be
violated? How would you feel if your intimate photos were added to the
PhotoDNA CSAM database because they were incorrectly flagged?
---
In 2021, the CyberTipline reported that it had alerted authorities
to “over 4,260 potential new child victims.” The sons of Mark and Cassio
were counted among them.
---
A lot of really bad things can happen to good people:
---
“This would be problematic if it were just a case of content
moderation and censorship,” Ms. Klonick said. “But this is doubly
dangerous in that it also results in someone being reported to law
enforcement.” It could have been worse, she said, with a parent
potentially losing custody of a child. “You could imagine how this might
escalate,” Ms. Klonick said.
---
...AND since they were comparing images against KNOWN CSAM, false
positives would naturally be very few to begin with.
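(Aside, since the mechanics keep coming up: below is a rough Python
sketch of how threshold-based matching against hashes of known images
works in principle. The hash function, the distance cutoff, and the
exact threshold are illustrative assumptions, not Apple's actual
NeuralHash parameters, though Apple's public description did put the
review threshold on the order of 30 matches.)

    # Illustrative sketch only -- not Apple's actual NeuralHash/PhotoDNA code.
    # Photos are reduced to perceptual hashes and compared against hashes of
    # KNOWN CSAM; nothing is surfaced to a human reviewer until the number of
    # matches crosses a threshold.
    from typing import Iterable, Set

    MATCH_THRESHOLD = 30      # assumed; Apple's public figure was on this order
    MAX_HAMMING_DISTANCE = 4  # assumed closeness cutoff for a "match"

    def hamming_distance(a: int, b: int) -> int:
        # Number of differing bits between two 64-bit perceptual hashes.
        return bin(a ^ b).count("1")

    def is_known_match(photo_hash: int, known_hashes: Set[int]) -> bool:
        # A photo only "matches" if its hash is very close to a known hash,
        # which is why false positives on ordinary photos are rare, though
        # not impossible.
        return any(hamming_distance(photo_hash, k) <= MAX_HAMMING_DISTANCE
                   for k in known_hashes)

    def needs_human_review(photo_hashes: Iterable[int],
                           known_hashes: Set[int]) -> bool:
        # Only accounts whose match count crosses the threshold get escalated.
        matches = sum(1 for h in photo_hashes if is_known_match(h, known_hashes))
        return matches >= MATCH_THRESHOLD

The disagreement above is over what happens when is_known_match fires on
an innocent photo often enough to cross that threshold.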
Yes, but one is one too many in my book.
How many children are you prepared to let be abused to protect YOUR
privacy?
Now you're being absurd. My right to privacy doesn't cause any children
to be abused.
Apple was wise to shelve this proposal. And I am happy to see that
they embraced more private features such as Communication Safety,
which works without violating customers' privacy.
It wasn't violating anyone's privacy. For the umpteenth time. It
actually preserved people's privacy by design.
While it went further than the rest to protect people's privacy, there
was still room for error and for innocent people to be harmed. That's
the truth you seem to want to disregard, and that's why Apple shelved
it.