Technology

Apple Drops Contentious Proposal to Scan iCloud and iOS Devices for Images of Child Abuse

Apple is scrapping its plans for a controversial tool that would have scanned iPhones, iPads, and iCloud photos for evidence of child sexual abuse material (CSAM), following complaints from critics about the feature’s potential privacy consequences.

Apple first announced the feature in 2021, with the aim of helping to fight child exploitation and addressing safety concerns the digital industry has increasingly embraced.

However, after receiving harsh criticism, the company delayed the rollout, saying it would “need more time over the upcoming months to gather feedback and make adjustments before launching these critically needed child safety measures.”

In a public statement Wednesday, Apple said it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired. (Apple did not respond to CNN’s request for comment.)

After seeking input from experts on its child protection measures, the company has decided to refocus its efforts on expanding its Communication Safety feature, which it first made available in December 2021.

The Communication Safety tool is an optional parental control feature that blurs sexually inappropriate image attachments in iMessage and alerts children and their parents when such files are received or transmitted.

Apple was criticized in 2021 for its plan to offer a different tool that would scan iOS devices and iCloud photos for child abuse imagery. The program, according to the company at the time, would convert images stored on iPhones and iPads into unreadable hashes, or complex numerical identifiers.

Those hashes would then be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple’s iCloud storage service.
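
Apple has not released NeuralHash as an implementable specification, so the Swift sketch below is only a conceptual illustration of the hash-matching step described above, not Apple’s method. It substitutes an ordinary cryptographic hash (SHA-256 from CryptoKit) for Apple’s proprietary perceptual NeuralHash, and the `knownHashes` set and `hashForImage` helper are hypothetical names introduced for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the database of known hashes. In Apple's proposal
// these would be perceptual NeuralHash values supplied by NCMEC, not SHA-256.
let knownHashes: Set<String> = [
    "<placeholder hash value>"
]

// Hash an image's raw bytes. A real perceptual hash is designed so that
// visually similar images produce the same value; SHA-256 is used here only
// to keep the sketch self-contained and runnable.
func hashForImage(_ imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Matching step: an image is flagged only if its hash appears in the known set.
func isFlaggedAsKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(hashForImage(imageData))
}
```

Apple’s actual proposal also wrapped this comparison in additional cryptographic safeguards, such as requiring a threshold of matches before any human review, none of which are represented in this sketch.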

Many child safety and security experts applauded the effort in principle, recognizing the responsibility a company bears for the products and services it creates. But they also called the plan “deeply concerning,” largely because part of Apple’s checking process for child abuse images would take place directly on users’ devices.

In a PDF document explaining the technology, which it named NeuralHash, Apple sought to allay concerns that governments could also pressure it to add non-child-abuse photographs to the hash list.

“Apple will refuse any such demands,” it stated. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

Alongside the announcement of a number of new security improvements, Apple disclosed that it had abandoned its plans for the tool.

Apple plans to expand end-to-end encryption of iCloud data to cover backups, photographs, notes, chat histories, and other services, a move that protects user data but may heighten tensions with law enforcement officials around the world.

The feature, called Advanced Data Protection, will allow customers to keep certain data more secure from hackers, governments, and spies, even in the event of an Apple data breach, the company said.
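
At its core, end-to-end encryption means data is encrypted with keys held only on the user’s devices, so the server, and anyone who breaches it, stores nothing but ciphertext. The Swift sketch below illustrates that general principle with CryptoKit’s AES-GCM; it is a toy example under that assumption, not Apple’s Advanced Data Protection design, and the `DeviceVault` type is a hypothetical name.

```swift
import Foundation
import CryptoKit

// Toy illustration of client-side ("end-to-end") encryption: the key is
// generated and kept on the user's device, so a server would only ever see
// ciphertext. This is not Apple's Advanced Data Protection implementation.
struct DeviceVault {
    // In a production system this key would live in the keychain or Secure Enclave.
    private let key = SymmetricKey(size: .bits256)

    // Encrypt data before it leaves the device.
    func seal(_ plaintext: Data) throws -> Data {
        let box = try AES.GCM.seal(plaintext, using: key)
        // `combined` is non-nil for the default nonce size used here.
        return box.combined!
    }

    // Only a device holding the key can decrypt what the server stored.
    func open(_ ciphertext: Data) throws -> Data {
        let box = try AES.GCM.SealedBox(combined: ciphertext)
        return try AES.GCM.open(box, using: key)
    }
}
```

Because the key never leaves the device in this model, data held server-side remains unreadable even if the storage provider itself is compromised.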