Meta, the company behind platforms such as Facebook and Instagram, has just announced a new set of measures to protect teenagers on both Facebook and Instagram, building on steps it has already taken in the same direction.
The most notable is that new Facebook accounts created by users under 16 (or under 18 in certain countries) will now default to more private settings, some time after Instagram applied the same defaults to its new accounts.
Existing teen users will be encouraged, through a series of prompts, to make their account settings as private as possible, bringing them in line with new accounts.
New tools and platforms, as well as new educational material
Meanwhile, Meta is testing new ways to protect teens from adults they have no real-life connection to, including removing the message button on teens’ Instagram accounts when viewed by “suspicious” adults, meaning adults who have been blocked or reported by a young person.
Teens are also being offered a series of tools to report anything that makes them feel uncomfortable while using Meta’s apps, and they will receive notifications inviting them to use these tools.
For example, Meta is prompting teens to report accounts after blocking someone, and is sending them safety notices with information on how to handle inappropriate messages from adults.
Finally, new tools are also coming to stop the spread of teens’ intimate images, especially when those images are used to extort them, an activity known as sextortion.
In collaboration with the National Center for Missing & Exploited Children (NCMEC), Meta is creating a global platform for teens who are concerned that their intimate images may be posted online without their consent.
Meta says that:
This platform will be similar to the work we have done to prevent the non-consensual sharing of intimate adult images. It will allow us to help prevent intimate images of a teen from being posted online and used by other companies in the tech industry. We have been working closely with NCMEC, experts, academics, parents, and victim advocates globally to help develop the platform and ensure it is responsive to the needs of teens so they can regain control of their content in these horrific situations.
In addition, Meta says it is working with Thorn and its NoFiltr brand to create new educational materials that will feature content “that reduces the shame and stigma surrounding intimate images and empowers teens to seek help and take back control if you have shared or are experiencing sextortion.”
More information/image credit: Meta