Content Scanners and Parental Controls: Who Controls Google?


There has already been criticism of Google's child-protection measures. Now a case has emerged in which the technology fails – but so do the parents, says Eva-Maria Weiß.


It is never a good idea to send unencrypted nude photos of a child from your smartphone – not even to a doctor who asks for them. But that is exactly what happened in the USA, with consequences. Parents there sent pictures of their sons' genitals ahead of a medical examination. Google responded by suspending all activity on the associated accounts. Parents especially should know that Google constantly scans content on users' devices. Apple does something similar in iMessage: if the iPhone detects nudity, the image is blurred and a warning appears.

Now you might think: great, Apple and Google are reacting, and that matches my idea of child protection. And of course a doctor has to see the genitals in order to make a diagnosis. But this line of thinking falls short in several respects.

Google does not only scan its customers' devices when they are synchronized with the cloud; it is also very fond of collecting information. So child X has illness Y. Google knows which child it is – that much can be gleaned from the countless other photos. Google could immediately feed this information into its health database. That could become interesting later, once child X has grown into adult X. Healthcare is the "next big thing" for big tech companies: Google, for example, analyzes millions of patient records from a US hospital chain, and Amazon bought a chain of primary care clinics. Such things happen with good intentions, but unfortunately also with great economic interest.

Nor is this about the flu or a broken leg; it is explicitly about genitals – images that are, unpleasantly, coveted by people with bad intentions. In the US case, the parents lost their Google accounts, and one father explained that he set up a Hotmail account as a result. That makes it doubtful the parents are even remotely aware of the problem. The unsettling thought used to be that the man developing the analogue photos might make his own prints and sell them. Today it is companies that retain the data and do business with it. Health data is particularly sensitive and therefore particularly worthy of protection. Neither Google nor any other big tech company should have access to it.

Law enforcement investigated in both cases – against the parents, on suspicion of child abuse. According to the New York Times report, the investigations were dropped. Of course they were. And who exactly is not interested in that outcome? Google. Block accounts automatically? Yes. Review the suspensions? No. Apparently there is not enough capacity for that – even while, in the background, every picture of every user continues to be screened. The company has repeatedly stated that humans perform a kind of final review. In these two cases, that does not appear to have happened.

And who actually oversees Google in this scanning and blocking? Google acts as a private company that can block accounts at will. It is also conceivable that Google will now permanently store the parents as potential criminals. It may be naïve of the parents to run so many of their activities solely through Google, but it also shows how little awareness of the issue there is.

The photo of the child in the bathtub that used to hang on Grandma and Grandpa's fridge may have been cute – not every nude picture needs to be censored. But Google, Microsoft and Meta are really not the right places to upload and distribute such photos. That still needs to be made clear to a broader audience.