Jolly Roger <jollyroger@pobox.com> wrote:
> On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
>> No-one is going to be charged for a dubious photo of their own
>> child. There are much bigger fish to fry and get into jail.
>
> You're wrong. It has already happened:
>
> A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
> Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>
I explicitly said "charged". No-one got charged. The law is working
just fine. It's the tech, as I've been arguing all along, that's the
problem.
> Read the whole article to get a glimpse of what innocent people go
> through who fall victim to this invasive scanning.
>
> Do you think these parents and their child consider their privacy to
> be violated? How would you feel if your intimate photos were added to
> the PhotoDNA CSAM database because they were incorrectly flagged?
This wasn't PhotoDNA, which is what Apple's system was similar to. It
was Google's AI method, designed to "recognize never-before-seen
exploitative images of children", which is where the real danger sits.

It is designed to identify new abuse images from the pixel data alone,
so all hits will be massively enriched for things that look like
abuse. A human reviewer won't be able to accurately identify the
(likely innocent) motivation for taking the photo and, "to be safe",
will pass it on to someone else to make the decision, i.e. law
enforcement. Law enforcement will have access to much more information
and will see it's an obvious mistake, as seen in your article.
Apple's system was more like hashing the image data and comparing
hashes, where false positives are down to algorithmic randomness. The
pixel data, when viewed by a human, won't look anything like CSAM, so
it's an easy decision to make.
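
To make the contrast concrete, here's a rough sketch in Python. It's
illustrative only - the hash function and "classifier" below are fake
stand-ins I've made up, not Apple's NeuralHash or Google's actual
model - but it shows the structural difference: one flags only matches
against known images, the other scores every photo on its pixels.

import hashlib

# Dummy values standing in for a database of hashes of *known* images.
KNOWN_HASHES = {"9f86d081884c7d65", "60303ae22b998861"}

def fake_perceptual_hash(photo: bytes) -> str:
    # Stand-in for a perceptual hash (PhotoDNA, NeuralHash). A SHA-256
    # prefix is not perceptual; it just keeps the sketch runnable.
    return hashlib.sha256(photo).hexdigest()[:16]

def fake_classifier_score(photo: bytes) -> float:
    # Stand-in for an AI model scoring pixel content; here just a dummy
    # number derived from the first byte.
    return (photo[0] if photo else 0) / 255

def apple_style_flag(photo: bytes) -> bool:
    # Flags only when the photo matches a specific, already-known image,
    # so a reviewer has a concrete known image to compare against.
    return fake_perceptual_hash(photo) in KNOWN_HASHES

def google_style_flag(photo: bytes, threshold: float = 0.9) -> bool:
    # Scores every photo on pixel content alone; a never-before-seen but
    # innocent photo (e.g. one taken for a doctor) can cross the line.
    return fake_classifier_score(photo) >= threshold

print(apple_style_flag(b"family photo"))      # False: no known-hash match
print(google_style_flag(b"\xff some photo"))  # True: dummy score 1.0 >= 0.9
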
What's crucial here is that Google are looking for new stuff - which
is always problematic - whereas Apple's approach was not. The search
space when looking for known, existing images is far smaller, and the
impact of false positives much, much smaller too.
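
To put rough numbers on that - and these rates are invented purely to
show the shape of the argument, not measured from either system
(Apple's own documentation claimed roughly a one-in-a-trillion chance
per account per year, with around 30 matches needed before any human
review) - the asymmetry looks like this:

photos_scanned = 1_000_000_000   # say, a billion photos scanned

# Open-ended classifier hunting for new material: even a 0.1% false
# positive rate means a million innocent photos sent for human review.
classifier_false_positive_rate = 1e-3
print(f"classifier: {photos_scanned * classifier_false_positive_rate:,.0f} innocent photos flagged")

# Matching against a fixed database of known-image hashes: false
# positives are rare random collisions, effectively zero in practice.
hash_false_positive_rate = 1e-12
print(f"hash match: {photos_scanned * hash_false_positive_rate:.3f} innocent photos flagged")

The absolute numbers are made up; the point is the asymmetry between
an approach that generates queues of innocent-but-ambiguous photos and
one that almost never surfaces anything at all.
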
>> How many children are you prepared to be abused to protect YOUR
>> privacy?
>
> Now you're being absurd. My right to privacy doesn't cause any
> children to be abused.
That's what you'd like to think, yet the reality is that not only is
it harder to identify perpetrators, but also, ironically, your
position ensures more people get erroneously labelled.
>>> Apple was wise to shelve this proposal. And I am happy to see that
>>> they embraced more private features such as the Safe Communication
>>> feature which is done without violating customers' privacy.
>>
>> It wasn't violating anyone's privacy. For the umpteenth time. It
>> actually preserved people's privacy by design.
>
> While it went further than the rest to protect people's privacy,
> there was still room for error and for innocent people to be harmed.
> That's the truth you seem to want to disregard, and that's why Apple
> shelved it.
My truth is that the Apple method was a significant improvement. Plus
if people didn't like it - unreasonably - they didn't have to use
iCloud.

Apple only shelved it for PR reasons, which is a real shame.
What you're disregarding is that the alternative hasn't been more
privacy for people, it's been less privacy and more errors. A
lose-lose situation.

You want to live in a world that doesn't exist.