Apple had a system ready to go, but a pile of brainless morons complained about their "privacy" being invaded (which it wasn't), so Apple was forced to abandon it. All this report achieves is to acknowledge that all those other companies listed above are less stringent about their users' privacy.

Apple has been accused of underreporting the prevalence of child sexual
abuse material (CSAM) on its platforms. The National Society for the
Prevention of Cruelty to Children (NSPCC), a child protection charity in
the UK, says that Apple reported just 267 worldwide cases of suspected CSAM
to the National Center for Missing & Exploited Children (NCMEC) last year.
That pales in comparison to the 1.47 million potential cases that Google
reported and 30.6 million reports from Meta. Other platforms that reported
more potential CSAM cases than Apple in 2023 include TikTok (590,376), X
(597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
Interactive Entertainment (3,974). Every US-based tech company is required
to pass along any possible CSAM cases detected on their platforms to NCMEC,
which directs cases to relevant law enforcement agencies worldwide.
As The Guardian, which first reported on the NSPCC's claim, points out,
Apple services such as iMessage, FaceTime and iCloud all have end-to-end
encryption, which stops the company from viewing the contents of what users
share on them. However, WhatsApp has E2EE as well, and that service
reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.
“There is a concerning discrepancy between the number of UK child abuse
image crimes taking place on Apple’s services and the almost negligible
number of global reports of abuse content they make to authorities,”
Richard Collard, the NSPCC's head of child safety online policy, said.
“Apple is clearly behind many of their peers in tackling child sexual abuse
when all tech firms should be investing in safety and preparing for the
rollout of the Online Safety Act in the UK.”
Apple declined to comment on the NSPCC's accusation, instead pointing The
Guardian to a statement it made when it shelved the CSAM scanning plan.
Apple said it opted for a different strategy that “prioritizes the security
and privacy of [its] users.” The company told Wired in August 2022 that
"children can be protected without companies combing through personal
data."
https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html
The messages displayed come from Usenet.