

Apple Updates ‘Child Safety’ Webpage to Remove Mention of CSAM Fingerprint Matching, But Feature May Still Be Forthcoming ★

Wednesday, 15 December 2021

Two of the three safety features, which released earlier this week with iOS 15.2, are still present on the page, which is titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed.

When reached for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.

Crucially, Apple’s statement does not say the feature has been canceled entirely. Documents outlining how the functionality works are still live on Apple’s site.

Now that some of the new child safety features are shipping with this week’s iOS 15.2 update (machine-learning-based nude/sexually-explicit image detection in Messages, and “Expanded guidance in Siri, Spotlight, and Safari Search”), Apple has updated the page to state which features are currently shipping.

I think the CSAM fingerprinting, in some form, is still forthcoming, because I suspect Apple wants to change iCloud Photos storage to use end-to-end encryption.
Concede for the moment that CSAM identification needs to happen somewhere, for a large cloud service like iCloud. If that identification takes place server-side, then the service cannot use E2E encryption - it can’t identify what it can’t decrypt. If the sync service does use E2E encryption - which I’d love to see iCloud Photos do - then such matching has to take place on the device side. Doing that identification via fingerprinting against a database of known and vetted CSAM imagery is far more private than using machine learning. I also continue not to agree, at all, with the “slippery slope” argument, which goes along the lines of “authoritarian regimes around the world will force Apple to add non-CSAM image fingerprints to the database”. Machine learning algorithms are far more ripe for that sort of abuse than fingerprint matching. Machine learning can be crazy smart; fingerprint matching, by design, is a bit simplistic. Apple’s Photos app already uses very clever machine learning to identify the content of photos in your library.
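To make the fingerprinting-versus-machine-learning distinction concrete, here is a minimal sketch of what fingerprint matching amounts to. This is a toy illustration, not Apple’s actual NeuralHash and safety-voucher protocol: the 64-bit fingerprints, the Hamming-distance threshold, and the matchesKnownImage helper are all invented for the example. What it shows is the simplicity described above: matching can only flag images whose fingerprints are already in a vetted database, and on an E2E-encrypted service the check would have to run on-device, before the photo is encrypted for upload.

```swift
// Toy sketch of fingerprint matching. NOT Apple's NeuralHash/PSI protocol;
// the fingerprints, threshold, and database below are invented for illustration.

struct Fingerprint {
    let bits: UInt64  // real perceptual hashes are longer; 64 bits keeps the toy simple

    // Hamming distance: the number of bit positions where two fingerprints differ.
    func distance(to other: Fingerprint) -> Int {
        (bits ^ other.bits).nonzeroBitCount
    }
}

// The vetted database: fingerprints of specific, known images. Matching can
// only ever flag images already on this list; unlike a machine-learning
// classifier, it cannot "recognize" novel content it was never given.
let knownDatabase: [Fingerprint] = [
    Fingerprint(bits: 0xDEAD_BEEF_CAFE_F00D),  // placeholder values
    Fingerprint(bits: 0x0123_4567_89AB_CDEF),
]

// A small threshold tolerates re-encoding or resizing of a known image, while
// a visually unrelated photo is vanishingly unlikely to land within a few bits.
func matchesKnownImage(_ candidate: Fingerprint, threshold: Int = 4) -> Bool {
    knownDatabase.contains { $0.distance(to: candidate) <= threshold }
}

// On an E2E-encrypted service this check must run on-device, before the photo
// is encrypted for upload, because the server only ever sees ciphertext.
let photoFingerprint = Fingerprint(bits: 0xDEAD_BEEF_CAFE_F00F)  // 1 bit from a known entry
print(matchesKnownImage(photoFingerprint))  // prints: true
```

That deliberate dumbness is the design choice: a fingerprint matcher has no opinion about what a photo depicts, and can only answer “is this, within a few bits, one of these specific known images?”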
