Apple Delays New Child Safety Features Following Photo-Scanning Scandal

Andrew Heinzman

Review Geek


An iPhone 12's lock screen. Justin Duino

Apple recently announced a ton of child safety features for iOS 15, including a tool that automatically checks your iPhone for child sexual abuse material (CSAM). Such tools are commonly used in cloud storage and messaging services, but Apple’s push for on-device scanning led to major pushback on social media and in the press. As a result, Apple will delay all of its new child safety features.

In a statement to 9to5Mac, Apple says it’s “decided to take additional time over the coming months to collect input and make improvements” to its new child safety features, namely the CSAM scanner. It acknowledges that “feedback from customers, advocacy groups, researchers and others” led to this change of plans.

Still, Apple claims that its CSAM scanning system “is designed with user privacy in mind.” Before your photos are stored in iCloud, your iPhone compares their hashes against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations. Matched images are then stored in iCloud with a “safety voucher,” basically an invisible flag that only Apple can track.

If your iCloud account contains several CSAM-matched images, then Apple will review said images manually. Confirmed child sex abuse images are then reported to the NCMEC. Apple says that this system is more secure than cloud-only scanning technologies, as images are only visible to the company if they’re flagged before leaving your iPhone.
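To make the flow above concrete, here’s a heavily simplified Swift sketch of the idea: hash each photo on-device before upload, attach a “safety voucher” only on a match, and surface an account for human review only past a threshold. Apple’s actual system relies on a perceptual “NeuralHash,” private set intersection, and threshold secret sharing rather than the plain SHA-256 lookup and counter used here, and every name and number in the sketch (knownCSAMHashes, manualReviewThreshold, and so on) is illustrative, not Apple’s implementation.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the on-device matching flow described above.
// Apple's real pipeline uses a perceptual NeuralHash, private set
// intersection, and threshold secret sharing; this substitutes a plain
// SHA-256 lookup and a simple counter purely to illustrate the idea.

struct SafetyVoucher {
    let photoID: UUID
    let hashHex: String   // the matched hash, attached to the upload
}

// Stand-in for the hash database supplied by NCMEC and other organizations.
let knownCSAMHashes: Set<String> = ["<example digest>"]

// Illustrative threshold; not Apple's published figure.
let manualReviewThreshold = 30

/// Hex-encoded SHA-256 digest of a photo's raw bytes.
func hashHex(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Runs before upload: attaches a "safety voucher" only when the photo's
/// hash appears in the known-CSAM database; otherwise returns nil.
func makeVoucher(photoID: UUID, photoData: Data) -> SafetyVoucher? {
    let digest = hashHex(of: photoData)
    guard knownCSAMHashes.contains(digest) else { return nil }
    return SafetyVoucher(photoID: photoID, hashHex: digest)
}

/// Returns true once an account has accumulated enough vouchers to
/// warrant manual review (and, if confirmed, a report to NCMEC).
func shouldEscalateForManualReview(vouchers: [SafetyVoucher]) -> Bool {
    vouchers.count >= manualReviewThreshold
}
```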

Screenshots of iOS 15's new child safety features, which are now delayed. One planned feature hides potentially sexually explicit images from kids in iMessage and alerts parents if such images are opened. Apple

But privacy advocates worry that Apple’s CSAM scanner will catch false positives, potentially exposing private images to strangers or opening a backdoor for governments and bad actors. The technology could also set a troubling precedent: will Apple eventually scan phones for drugs or other subject matter that may be of interest to law enforcement?

We still don’t know how Apple plans to “improve” its CSAM scanner. But to be honest, the company probably didn’t expect to see any backlash in the first place. Big names like Google already use CSAM-scanning technology in their cloud and messaging services, and Apple itself scans for CSAM in iCloud Mail.

Regardless of where you stand, it’s disappointing to see Apple delay its new child safety tools, including a Messages feature that warns kids not to open potentially explicit photo or video attachments (it doesn’t stop kids from opening such attachments, but alerts parents if they do). Maybe these features will arrive with privacy improvements a few months after iOS 15, but for now, Apple’s plans remain unclear.

Source: Apple via 9to5Mac
