
Investigation opens into TikTok child sexual abuse material case


The US Department of Homeland Security has opened a new investigation into CSAM (child sexual abuse material) on TikTok, after such material was published, both publicly and privately, on the video-sharing network.

In addition, the platform has increasingly been used by abusers for so-called grooming: the practice of befriending a child online with the intention of later abusing them, whether online or offline.

Investigation into CSAM on TikTok

The Financial Times reports that TikTok moderators have so far been unable to keep up with the volume of videos posted, meaning abusive material has already appeared in the public feeds of some profiles.

Abusers have also taken advantage of a privacy feature that TikTok currently offers:

The US Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the case. The Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators, according to a person with knowledge of the matter.

One pattern the Financial Times verified with law enforcement and child safety groups involves content being sourced and traded through private accounts whose passwords are shared with victims and other sexual predators. Keywords are commonly placed in public videos, usernames, and bios, while the illegal content itself is uploaded using the app’s ‘Only Me’ feature, which makes videos visible only to those logged into the profile.

Seara Adair, a child safety advocate, reported this trend to US law enforcement after spotting the content on TikTok.

Privacy shortcomings

TikTok is also accused of not being as proactive as other social networks, which are more effective at detecting and preventing these types of grooming attempts:

“It’s a perfect place for predators to meet, target and hook children,” said Erin Burke, chief of the child exploitation investigations unit at Homeland Security’s cybercrime division. She also called the app a ‘platform of choice’ for this behavior. […]

Burke claimed that international companies like TikTok were less motivated to collaborate with US law enforcement. ‘We want [social media companies] to proactively ensure the safety of children. I can’t say that they are doing it, and I can say that many American companies are,’ she added.

The use of the platform by sexual predators is of particular concern, since its predominant demographic is adolescents.

TikTok said it does collaborate with law enforcement “when necessary”:

“TikTok has zero tolerance for child sexual abuse material,” the company said. ‘When we find any attempt to publish, obtain or distribute [child sexual abuse material], we remove content and ban accounts and devices. We also immediately notify NCMEC and collaborate with law enforcement as needed.’
