Parents have long raised concerns over the types of images their children can be exposed to on Instagram, and now the social media brand is taking a new step to fight sexual exploitation on the app. In a major change to its interface, Instagram will begin automatically blurring nude images in direct messages.
The change, announced by Instagram’s parent company Meta on April 11, is part of a series of new tools designed to minimize child sexual abuse and exploitation across social media brands. The tools will “help protect young people from sextortion and intimate image abuse” and also “make it more difficult for potential scammers and criminals to find and interact with teens,” Meta said in a press release. The company is also testing new ways to “help people spot potential sextortion scams.”