Instagram hopes that blurring nudity in messages will make teens safer

Parents have long raised concerns over the types of images their children can be exposed to on Instagram, and now the social media brand is taking a new step to try to fight sexual exploitation on the app. In a major change to its interface, Instagram will begin automatically blurring nude images in direct messages.

The change, announced by Instagram’s parent company Meta on April 11, is part of a series of new tools designed to minimize child sexual abuse and exploitation across social media brands. The tools will “help protect young people from sextortion and intimate image abuse” and also “make it more difficult for potential scammers and criminals to find and interact with teens,” Meta said in a press release. The company is also testing new ways to “help people spot potential sextortion scams.” 
