Meta has removed 135,000 Instagram accounts for ‘sexualizing children’


On Wednesday, Meta introduced new safety features specifically tailored for teen users. These features include enhanced direct messaging protections designed to prevent the spread of “exploitative content.”


To help teens identify potential scammers, Instagram will now show them more information about their chat partners, including when the account was created, along with other safety tips. Teens can also block and report an account in a single action.

In a recent update, the company revealed that in June alone, teens blocked accounts 1 million times and reported another 1 million after encountering a Safety Notice.

This policy is part of Meta’s broader initiative to protect teens and children on its platforms. This initiative is in response to increasing scrutiny from policymakers who have accused the company of failing to adequately safeguard young users from sexual exploitation.

Earlier this year, Meta removed nearly 135,000 Instagram accounts that were sexualizing children on the platform. These accounts were identified as leaving sexualized comments on, or requesting sexual images from, adult-managed accounts that featured children.

The takedown also included 500,000 Instagram and Facebook accounts linked to the original profiles.
