On 2024-07-23, Chris <ithinkiam@gmail.com> wrote:
badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Jolly Roger wrote:
On 2024-07-23, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Apple declined to comment on the NSPCC's accusation, instead
pointing The Guardian to a statement it made when it shelved the
CSAM scanning plan. Apple said it opted for a different
strategy that “prioritizes the security and privacy of [its]
users.” The company told Wired in August 2022 that "children
can be protected without companies combing through personal
data."
This is one reason many people choose Apple over alternatives.
iPhone. The preferred mobile device of child molesters.
This could be a new marketing ploy someday!
Privacy for everyone is important.
Sorry, I can't agree with that. Some people give up their right to
privacy when they harm others or society. The laws are there for
everyone, not just those who choose to follow them.
Agreed.
No-one is forced to use icloud. If they didn't like the policy, they
could go elsewhere.
True. Unlike others, Apple's proposal was to only scan images on device
that were being uploaded to Apple's servers, and only match hashes of
them to a database of hashes matching known CSAM images. And only after
multiple matches reached a threshold would Apple investigate further.
Yet even with those precautions, there was still a realistic chance of
false positives and invasion of privacy, which is why they scrapped the
proposal.
Compare to Google and Meta, who are more than happy to share millions of
people's private photos with law enforcement, which apparently is just
fine.
And they do so with zero privacy protections in place.