On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
> On 2024-07-26 15:14, Jolly Roger wrote:
>> On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
>>>>> On 2024-07-26 09:11, Jolly Roger wrote:
>>>>>> On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
>>>>>>> On 24/07/2024 22:35, Jolly Roger wrote:
>>>>>>>> On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
>>>>>>>>> Andrew <andrew@spam.net> wrote:
>>>>>>>>>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC):
>>>>>>>>>>> The NSPCC should really be complaining at how ineffectual
>>>>>>>>>>> the tech companies are rather than complain at Apple for not
>>>>>>>>>>> sending millions of photos to already overwhelmed
>>>>>>>>>>> authorities.
>>>>>>>>>> For all that is in the news stories, it could be that ZERO
>>>>>>>>>> convictions resulted.
>>>>>>>>>>
>>>>>>>>>> Think about that.
>>>>>>>>>>
>>>>>>>>>> Is it worth everyone's loss of privacy for maybe zero gain in
>>>>>>>>>> child safety?
>>>>>>>>> Apple's solution wouldn't have resulted in any additional loss
>>>>>>>>> of privacy.
>>>>>>>> Actually, Apple could not guarantee that, and there was a
>>>>>>>> non-zero chance that false positive matches would result in
>>>>>>>> privacy violations.
>>>>>>> True. The balance of risk was proportionate, however. Much more
>>>>>>> so than the current system.
>>>>>> Absolutely. I'm just of the opinion that if one innocent person
>>>>>> is harmed, that's one too many. Would you want to be that unlucky
>>>>>> innocent person who has to deal with charges, a potential
>>>>>> criminal sexual violation on your record, and all that comes with
>>>>>> it? I certainly wouldn't.
>>>>> Except that Apple's system wouldn't automatically trigger charges.
>>>>>
>>>>> An actual human would review the images in question...
>>>> And at that point, someone's privacy may be violated. Do you want a
>>>> stranger looking at photos of your sick child? What if that
>>>> stranger came to the conclusion that those photos are somehow
>>>> classifiable as sexual or abusive in some way? Would you want to
>>>> have to argue your case in court because of it?
>>> Yes. At that point...
>>>
>>> ...if and only if the person is INNOCENT...
>>>
>>> ...someone's privacy is unnecessarily violated.
>>>
>>> And it's a stretch to imagine that:
>>>
>>> 1. Innocent pictures would be matched with KNOWN CSAM images, AND;
>> No, it's not. There was a margin of error in the proposed matching
>> algorithms.
> I'm not saying it's impossible. Just very unlikely.

Yes, but one is one too many in my book.
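For anyone curious where that "margin of error" comes from: perceptual
hashes deliberately map visually similar images to the same (or nearby)
fingerprints, so two genuinely different photos can collide. Here's a
minimal Python sketch of the idea using a classic average hash as a toy
stand-in -- Apple's proposal actually used a neural model called
NeuralHash, and the filenames and threshold below are invented purely
for illustration:

from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash (aHash): shrink to 8x8 grayscale, then set
    one bit per pixel based on whether it beats the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A "match" is a small distance, not exact equality -- so unrelated
# images can land inside the threshold by chance. That chance is the
# false-positive case being argued about here.
THRESHOLD = 5  # illustrative value, not Apple's

h_known = average_hash("known_csam.jpg")  # hypothetical filenames
h_yours = average_hash("sick_child.jpg")
if hamming(h_known, h_yours) <= THRESHOLD:
    print("flagged for human review")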
> (the logical AND)
>
>>> 2. A person reviewing those images after they've been flagged
>>> wouldn't notice they don't actually match; AND
>> That decision is a human one, and humans make mistakes and have
>> biased beliefs that can lead them to make faulty decisions.
> We've already had two very statistically unlikely events that had to
> happen to get to this point.
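Very unlikely per photo, sure -- but multiply those odds by the scale
iCloud operates at and the expected number of wrongly flagged photos
stops being zero. Back-of-the-envelope, with every figure below an
assumption for illustration (not Apple's published projections):

p_false_match = 1e-6    # assumed odds an innocent photo matches known CSAM
p_review_error = 1e-2   # assumed odds the human reviewer then gets it wrong
photos_per_year = 1e9   # assumed photos scanned across all users

p_both = p_false_match * p_review_error    # the logical AND: 1e-08 per photo
expected_wrong = photos_per_year * p_both  # ~10 photos per year at scale

print(f"per-photo odds of both events: {p_both:.0e}")
print(f"expected wrongly escalated photos per year: {expected_wrong:.0f}")

Tiny per-photo odds and a non-zero count at scale are not mutually
exclusive.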
>>> 3. The owner of those images at that point would be charged when
>>> they could then show that they were in fact innocent images.
>> Innocent people shouldn't have to prove anything to anyone.
> But innocent people do get searched...

Search warrants require probable cause and are signed by a judge.
> And yet you are fine with innocent people's privacy being violated
> when a search warrant is issued erroneously.

Totally different scenario.