Alan wrote:
That discusses a system that Apple disabled.

On 2024-07-29 15:11, Chips Loral wrote:
Bullshit.

Alan wrote:
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC):
>> You not comprehending the difference between zero percent of Apple reports
>> versus zero total convictions is how I know you zealots own subnormal IQs.
> Not at all. My position hasn't changed. You, however, have had about three
> different positions on this thread and keep getting confused which one
> you're arguing for. lol.
Au contraire.

Because I only think logically, my rather sensible position has never
changed, Chris, and the fact you "think" it has changed is simply that you
don't know the difference between the percentage of convictions based on
the number of reports, and the total number of convictions.

When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.

Specifically:

a. If Apple's reporting rate is low, and yet their conviction rate
(based on the number of reports) is high, then they are NOT
underreporting images.
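
To make the arithmetic concrete, here is a minimal Python sketch of the distinction; both figures are made up purely for illustration and come from neither Apple nor NCMEC:

    # Hypothetical figures, chosen only to illustrate the distinction.
    reports_filed = 250        # reports a provider files in a year
    convictions = 200          # convictions that result from those reports

    conviction_rate = convictions / reports_filed
    print(f"Conviction rate per report: {conviction_rate:.0%}")  # 80%

    # A provider filing ten times more reports could still show a lower
    # rate, so the total report count alone says nothing about whether
    # images are being underreported.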
Apple's reporting rate is ZERO, because they're not doing scanning of images of any kind.
After getting caught.

You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming.
Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my Documents folder, some HEIC files taken with an iPhone and copied to the Mac using the Image Capture program (used for dumping photos from an iOS device attached with a USB cable).
I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a lot of Apple OS-level apps because I’m not interested in transmitting any of my data whatsoever to Apple over the network - mostly because Apple turns over customer data on over 30,000 customers per year to US federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little Snitch told me that macOS is now connecting to Apple APIs via a program named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third-party network filtering software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.
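
For anyone who wants to check this without Little Snitch, a rough equivalent is to list the open network sockets of the daemon by name. A minimal Python sketch using lsof, which ships with macOS (run it while browsing images, and as root if you want to see system daemons' sockets):

    import subprocess

    def connections_for(process_name: str) -> str:
        # -i: network files only; -n/-P: skip DNS and port-name lookups;
        # -c: filter by command name (prefix match)
        result = subprocess.run(
            ["lsof", "-i", "-n", "-P", "-c", process_name],
            capture_output=True, text=True,
        )
        return result.stdout

    out = connections_for("mediaanalysisd")
    print(out or "no open network connections observed")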
'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the results to an Apple server. This claim was made by a cybersecurity researcher named Jeffrey Paul. However, after conducting a thorough analysis of the process, it has been determined that this is not the case.'
https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
Apple’s new iPhone photo-scanning feature is highly controversial. You might want to consider the only current option to stop Apple from scanning your photos.
Apple's new photo-scanning feature will scan photos stored in iCloud to see whether they match known Child Sexual Abuse Material (CSAM). The problem for many of us is that we often have hundreds of photos of our children and grandchildren, and who knows how good or bad the new scanning technology is? Apple claims the odds of a false positive are one in a trillion, and there is an appeals process in place. That said, one mistake from this AI, just one, could send an innocent person to jail and destroy their life.
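
For scale, a back-of-the-envelope check of what that claimed rate would imply; the account count below is an assumption for illustration, not an Apple figure:

    p_false_flag = 1 / 1e12   # Apple's claimed odds, per account per year
    accounts = 1e9            # assumption: ~1 billion iCloud photo accounts

    expected = p_false_flag * accounts
    print(f"Expected falsely flagged accounts per year: {expected:.4f}")
    # ~0.001 -- about one innocent account per thousand years, IF the
    # claimed rate holds in practice, which is exactly what critics doubt.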
Apple has many other features as part of these upgrades to protect children, and we like them all, but photo-scanning sounds like a problem waiting to happen.
Here are all of the "features" that come with anti-CSAM, expected to roll out with iOS 15 in the fall of 2021.
Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes (a simplified sketch of this matching step follows this list).
Siri and Search: Siri and Search will provide additional resources to help children and parents stay safe online and get help with unsafe situations.
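
In outline, the iCloud Photos step above is a set-membership test performed on-device before upload. Apple's real design uses NeuralHash perceptual hashes, blinded hash databases, and a server-side match threshold; the Python sketch below swaps in a plain SHA-256 lookup against an in-memory set purely to show the shape of the matching step, and every name in it is hypothetical:

    import hashlib
    from pathlib import Path

    # Hypothetical stand-in for the NCMEC-derived hash database.
    KNOWN_CSAM_HASHES: set[str] = {"d2c2e5..."}  # placeholder entry

    def file_hash(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flag_before_upload(path: Path) -> bool:
        # In Apple's design a match attaches an encrypted "safety voucher"
        # to the upload; a report happens only past a server-side threshold.
        return file_hash(path) in KNOWN_CSAM_HASHES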
Now that you understand how anti-CSAM works, the only current way to avoid having your photos scanned by this system is to disable iCloud Photos. Photos are scanned when they are uploaded to the cloud, so the way to avoid the scan is not to upload them.
This adds an interesting problem. The majority of iPhone users use iCloud to back up their photos (and everything else). If you disable iCloud, you will need to back up your photos manually. If you have a PC or Mac, you can always copy them to your computer and back them up. You can also consider using another cloud service for backups.
Let's talk about disabling iCloud and also removing any photos you already have uploaded. You will have 30 days to recover your photos if you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.
You'll want to back up your photos and disable iCloud, then verify that no photos were left on Apple's servers.
Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable iCloud Photos
First, we can disable the uploading of iCloud photos while keeping all other backups, including your contacts, calendars, notes, and more.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Photos.
Uncheck iCloud Photos.
You will be prompted to decide what to do with your current photos.
If you have the space on your phone, you can click on Download Photos & Videos, and your photos will all be on your iPhone, ready to back up somewhere else.
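
Once the photos are back on a computer, the manual backup can be as simple as mirroring the exported folder. A small Python sketch; both paths are assumptions, so point them at your actual export folder and backup drive:

    import shutil
    from pathlib import Path

    src = Path.home() / "Pictures" / "iPhone-export"   # assumed export folder
    dst = Path("/Volumes/Backup/iPhone-export")        # assumed backup drive

    shutil.copytree(src, dst, dirs_exist_ok=True)      # mirror the folder
    print(f"Backed up {sum(1 for _ in dst.rglob('*'))} items to {dst}")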
Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server
While all of your photos should be deleted from Apple's server, we should verify that.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Manage Storage.
Click on Photos.
Click on Disable & Delete.
https://discussions.apple.com/thread/254538081?sortBy=rank
https://www.youtube.com/watch?v=K_i8rTiXTd8
How to disable Apple scanning your photos in iCloud and on device. The new iOS 15 update will scan iPhone photos and alert authorities if any of them contain CSAM. Apple Messages also gets an update to scan and warn parents if it detects an explicit image being sent or received. This video discusses the new Apple update, privacy implications, how to disable iPhone photo scanning, and offers a commentary on tech companies and the issue of privacy and electronic surveillance.