Apple backs down on CSAM features


After widespread criticism, Apple is backing down on its planned CSAM (child protection) features and will "take more time" to consult.

Apple backs down on CSAM (Child Protection) features

In an email sent to various Apple-focused media outlets and other publications, Apple says it has decided to postpone the features following the reaction to its original announcement.

“Last month we announced plans for features designed to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” Apple said in a statement.

“Based on feedback from customers, advocacy groups, researchers, and others,” the statement continues, “we have decided to take additional time over the coming months to collect input and make improvements before launching these critically important child safety features.”

There are no further details on how the company will consult to “collect input,” or with whom it will work.

Apple originally announced its CSAM features on August 5, 2021, saying they would debut later that year. The features include detecting child sexual abuse images stored in iCloud Photos and, separately, blocking potentially harmful images in Messages sent to kids.

Industry insiders and high-profile names like Edward Snowden responded with an open letter asking Apple not to implement these features. The objection was that the features could be repurposed for surveillance.

However, the complaints, both public and private, continued. Apple’s Craig Federighi eventually acknowledged publicly that Apple had misjudged how it announced the new features.

“We wish this had come out a little more clearly for everyone, because we feel very positive and strongly about what we are doing, and we can see that it has been widely misunderstood,” Federighi said.