Apple is delaying its child safety features

Kris Holt


Apple says it's delaying the rollout of Child Sexual Abuse Material (CSAM) detection tools "to make improvements," following pushback from critics. The features include one that analyzes iCloud Photos for known CSAM, which has caused concern among privacy advocates.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple told 9to5Mac in a statement. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
