On 23/07/2024 05:35 Your Name <YourName@YourISP.com> wrote:

The reality is that most users didn't even know about the plans and wouldn't have cared even if they did. As usual, it was a minority of loud-mouthed whiners who complained because they were scared someone at Apple would see that they'd been taking photos of cute wabbits ... which wasn't even possible, if the whiners actually understood anything. :-\
badgolferman wrote:

Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) last year.

That pales in comparison to the 1.47 million potential cases that Google reported and 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony Interactive Entertainment (3,974). Every US-based tech company is required to pass along any possible CSAM cases detected on their platforms to NCMEC, which directs cases to relevant law enforcement agencies worldwide.

As The Guardian, which first reported on the NSPCC's claim, points out, Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has E2EE as well, and that service reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

"There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple's services and the almost negligible number of global reports of abuse content they make to authorities," Richard Collard, the NSPCC's head of child safety online policy, said. "Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK."

Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to a statement it made when it shelved the CSAM scanning plan. Apple said it opted for a different strategy that "prioritizes the security and privacy of [its] users." The company told Wired in August 2022 that "children can be protected without companies combing through personal data."

https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html

Apple had a system ready to go, but a pile of brainless morons complained about their "privacy" being invaded (which it wasn't), so Apple was forced to abandon it. All this report achieves is to acknowledge that all those other companies listed above are less stringent about their users' privacy.

Many people threatened to throw away or even burn their iPhones if Apple went ahead with the scheme. People don't want their actions policed on THEIR phones. This really scared Apple, and they immediately did a 180-degree turn.