Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
From: loralandclinton (at) *nospam* invalid.co (Chips Loral)
Newsgroups: misc.phone.mobile.iphone, alt.privacy
Date: 29 Jul 2024, 23:11:52
Organisation : A noiseless patient Spider
Message-ID : <v8943a$lrfd$1@dont-email.me>
User-Agent : Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101 Firefox/91.0 SeaMonkey/2.53.18.2
Alan wrote:
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
>
Your not comprehending the difference between zero percent of Apple reports
and zero total convictions is how I know you zealots have subnormal IQs.
>
Not at all. My position hasn't changed. You, however, have had about three
different positions on this thread and keep getting confused which one
you're arguing for. lol.
>
Au contraire
>
Because I only think logically, my rather sensible position has never
changed, Chris. The fact you "think" it has changed is simply because you
don't know the difference between the percentage of convictions based on
the number of reports and the total number of convictions.
>
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
>
Specifically....
>
a. If Apple's reporting rate is low, yet their conviction rate
(relative to the number of reports they file) is high, then they are
NOT underreporting images.
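The distinction being argued here (a per-report conviction rate versus a total conviction count) can be sketched in a few lines of Python. The numbers below are made up purely to illustrate the arithmetic; they are not real figures from Apple, NCMEC, or anyone else.

```python
# Illustrative only: report and conviction counts are hypothetical,
# chosen just to show how the two metrics can diverge.

def conviction_rate(convictions: int, reports: int) -> float:
    """Convictions as a fraction of reports filed."""
    if reports == 0:
        raise ValueError("rate is undefined with zero reports")
    return convictions / reports

# Provider A files few reports, but most of them lead to convictions:
rate_a = conviction_rate(convictions=20, reports=25)       # 0.8

# Provider B files vastly more reports, few of which go anywhere:
rate_b = conviction_rate(convictions=20, reports=20000)    # 0.001

# Both providers yield the same TOTAL convictions (20), yet their
# per-report conviction rates differ by orders of magnitude --
# the two metrics measure different things.
assert rate_a > rate_b
```

A low report volume with a high per-report conviction rate and a high report volume with the same total convictions describe very different reporting behavior, which is the point being contested in the thread.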
Apple's reporting rate is ZERO, because they're not scanning images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming.
Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/

Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my Documents folder, some HEIC files taken with an iPhone and copied to the Mac using the Image Capture program (used for dumping photos from an iOS device attached with a USB cable).
I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a lot of Apple OS-level apps because I’m not interested in transmitting any of my data whatsoever to Apple over the network - mostly because Apple turns over customer data on over 30,000 customers per year to US federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little Snitch told me that macOS is now connecting to Apple APIs via a program named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third-party network filtering software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.