On Apple’s Child Abuse Technology


This week has been marked by the news that Apple will use its own technology to analyze its users' photos in search of child sexual abuse material. Data derived from our photos will be sent to the cloud and compared against a database to check whether anything is related to child abuse.

The news has been widely commented on by technology leaders around the world. Most consider it a mistake: it seems that what is on your iPhone no longer always stays there, since it is sent for analysis without your permission, which many see as a violation of privacy.

This new detection tool, called NeuralHash, can identify known child abuse images stored on an iPhone without decrypting them, and it will be rolled out in the United States later this year.

Remember that Google, Microsoft, Dropbox and other large cloud services already scan material stored on their servers for child abuse content, both in Google Photos and in file storage services. The difference is that the scan now happens on the phone, not in the cloud. A code (a hash) of each photo will be uploaded without permission to the cloud to be analyzed, although this will only apply to users who upload their photos to iCloud (the default option on iPhones). That is why it is hard to understand why Apple does not simply scan what it already holds in the cloud instead of reaching into the iPhone to find the material.
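The hash-and-compare flow described above can be sketched in a few lines. NeuralHash itself is a proprietary neural perceptual hash (it tolerates small edits such as resizing or recompression); the sketch below substitutes an exact SHA-256 digest purely to illustrate the lookup against a known-bad database, and all names and byte strings are hypothetical.

```python
import hashlib

def photo_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: an exact cryptographic digest. The real
    # system uses a perceptual hash so near-duplicates still match.
    return hashlib.sha256(image_bytes).hexdigest()

# Database of known-bad hashes; in Apple's design this list is supplied
# by child-safety organizations and is opaque to the device owner.
known_bad = {photo_hash(b"known-abuse-image-bytes")}

def scan(image_bytes: bytes) -> bool:
    """Return True if the photo's hash matches the database."""
    return photo_hash(image_bytes) in known_bad

print(scan(b"known-abuse-image-bytes"))  # True: a match is recorded
print(scan(b"ordinary-holiday-photo"))   # False: no match, nothing revealed
```

The key property is that only the hash, not the photo itself, is compared: the device never needs to "look at" image content in a human-readable sense to produce a match.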

Apple defends itself by saying that these new features preserve privacy, but critics are more concerned with what the system could be turned into in the future. If we open the door to scanning private photos, we open it to new possibilities down the road. If the United States government asks Apple to search for weapons, or faces, a single click would be enough to change the purpose of the technology.


After the announcement, many believe that Apple will soon announce end-to-end encryption for iCloud, although for the moment these are only rumors; today Apple can analyze and obtain data from iCloud without obstacles.

Opinions about NeuralHash

Let’s see some opinions:

– David Forsyth, chair of the Department of Computer Science at the University of Illinois:

Apple’s approach preserves privacy better than anyone I know of. The accuracy of the matching system, combined with the threshold, makes it highly unlikely that images other than known images of child abuse will be revealed.
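Forsyth's point about "the matching system, combined with the threshold" can be made concrete with a rough back-of-the-envelope sketch: a single chance match reveals nothing, and an account is only flagged for review once the number of matches crosses a threshold. The false-positive rate and threshold value below are assumptions for illustration, not figures from Apple's documentation.

```python
from math import exp, factorial

# Illustrative numbers only: neither the per-image false-positive rate
# nor the threshold value is taken from Apple's published materials.
FALSE_POSITIVE_RATE = 1e-6   # assumed chance one benign photo matches
THRESHOLD = 30               # assumed matches required before review

def prob_false_flag(n_photos: int, p: float = FALSE_POSITIVE_RATE,
                    t: int = THRESHOLD) -> float:
    """Poisson-approximated probability that at least t of n benign
    photos match by pure chance; terms beyond t+50 are negligible."""
    lam = n_photos * p
    return sum(exp(-lam) * lam**k / factorial(k) for k in range(t, t + 50))

# Even a library of 100,000 benign photos is astronomically unlikely
# to produce 30 chance matches: this is the effect of the threshold.
print(prob_false_flag(100_000) < 1e-30)  # True
```

Under these assumed numbers, the expected count of chance matches in 100,000 photos is only 0.1, so requiring 30 of them pushes the false-flag probability far below anything practically observable, which is what makes the threshold the load-bearing privacy safeguard in the design.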

– Will Cathcart, head of WhatsApp:

I read the information that Apple published yesterday and I am concerned. I think this is the wrong approach and a setback for the privacy of people around the world. People have asked if we will adopt this system for WhatsApp. The answer is no.

– John Clark, President of the NCMEC (National Center for Missing & Exploited Children):

Apple’s Extended Protection for Kids is a game changer. The reality is that privacy and child protection can coexist.

Be that as it may, we are at a turning point on this issue. Analyzing private information to compare it with criminal material is a huge risk that can easily get out of hand. Let's hope it is worth it.