Apple is apparently abandoning its controversial proposal to scan users’ iCloud-stored images for child sexual abuse material (CSAM), amid an ongoing battle over user privacy.
The scanning capabilities, announced in August 2021, were intended to flag unlawful content while preserving privacy. Digital rights groups, however, criticized the plans, arguing that the surveillance capabilities were susceptible to abuse.
A month later, Apple put the plans on hold. Now, more than a year after announcing the CSAM-detection tool, the company says it has no intention of moving forward with it.
According to the company, it is instead developing new features that better balance user privacy and child protection. These controls will let parents manage their child’s contacts, content, and screen time, and will provide a carefully curated app store.
Apple believes the most effective way to combat the online exploitation of children is to prevent it before it occurs, and it cited the December 2021 rollout of additional safety features as the catalyst for this shift.
The company says it will develop additional tools to help maintain the balance between user privacy and child safety.
For example, the communication safety feature in Messages warns children when sensitive photos are sent or received, and expanded guidance is available in Siri, Spotlight, and Safari search.
The company is also working on updates to communication safety in Messages, including protections against nudity in videos and other threats to children’s safety. Apple says it is collaborating with child safety experts to streamline the reporting of incidents to law enforcement.
The company also announced on Wednesday that it will now offer end-to-end encryption for nearly all user data stored in its global cloud storage system, making it harder for hackers, spies, and law enforcement agencies to access sensitive customer data.