Instagram CEO Adam Mosseri told a Senate subcommittee investigating the harmful effects of the photo-sharing app on teenagers that “keeping young people safe online is not just about one company” and called for industry-wide solutions.
“With teens using multiple platforms, it is critical that we address youth online safety as an industry challenge and develop industry-wide solutions and standards,” Mosseri told lawmakers.
He called for age verification tools at the phone level and said the social media platforms need an industry body to determine how to verify the age of minors, how to design specific experiences, and what kind of parental controls are needed.
The hearing before the Senate subcommittee on Consumer Protection, Product Safety, and Data Security comes after damaging revelations, first reported by The Wall Street Journal in September, that top executives at Instagram had been warned by researchers about the platform’s potential harmful impacts.
Internal documents from Instagram’s parent company Meta, formerly known as Facebook, showed researchers found in 2019 that Instagram made body image issues worse for 1 in 3 teen girls. Another presentation showed that among a small group of teen girls who reported having suicidal thoughts, 13% of British users and 6% of American users attributed the feeling to Instagram.
The documents, leaked by former Facebook employee Frances Haugen, who also shared them with lawmakers on the Senate subcommittee, were provided to a consortium of news organizations, including CBS News, in October through a congressional source.
“Facebook’s own researchers have been warning management, including yourself Mr. Mosseri, for years about Instagram’s harmful impacts on teens’ mental health and well-being,” Senator Richard Blumenthal, the chairman of the subcommittee, said in his opening statement. “Facebook knew, it did the research and the studies, but it continued to profit.”
Mosseri noted, as Meta has previously argued, that on the majority of issues like anxiety and depression, research indicated that teenage users felt Instagram made it easier for them to cope. He also said that Meta spends more resources than competitors on safety measures and said internal research projects have inspired safety changes.
He hailed a recent policy change that automatically sets the accounts of users under 16 to private, but Senator Marsha Blackburn, the ranking member on the subcommittee, pointed out that her staff was able to create a dummy account as a 15-year-old that didn’t default to private mode. Mosseri admitted that Instagram forgot to implement the updated feature for the web version of the app and promised to “correct that quickly.”
On Tuesday, ahead of the hearing, Mosseri announced that Instagram will take a “stricter approach” to content it recommends to teens on the app, stop users from tagging or mentioning teens who don’t follow them, and nudge teenage users toward different topics “if they’ve been dwelling on one topic for a long time.”
The company also announced a new “Take a Break” feature that will remind users to close the app once they’ve been scrolling for a certain amount of time, but Blumenthal dismissed the recent changes, saying they “fall way short” of what he and other senators are looking to accomplish.
Lawmakers on the Senate subcommittee have held several hearings investigating the negative impacts of social media since the release of the Facebook documents. The Senate panel, which has heard from Haugen as well as Facebook’s global head of public safety Antigone Davis in recent months, promised on Wednesday that legislation to regulate the industry is coming.
“We fully share the goal of protecting kids and teens online but what we aren’t sure about is how the half-measures you’ve introduced are going to get us to the point where we need to be,” Blackburn said.
Despite lawmakers’ “deep reservations” about Instagram, Mosseri told them the company wants their input and is working to keep users safe. But senators drilled down on specific elements of potential bills to understand what kind of regulatory reforms Instagram would support.
Mosseri said he would back federal legislation requiring social media companies to give 13 to 15-year-olds the right to have their data expunged but would not commit to banning targeted ads for younger users. He said it’s “valuable” for ads to be relevant for all audiences and argued instead that measures limiting the targeting abilities of advertisers on the platforms are necessary.
He agreed that users should have the right to experience Instagram with a chronological feed, so the content ranking algorithms aren’t recommending posts for them to engage with.
Mosseri also proposed creating an independent industry body that sets standards for social media companies with input from civil society groups, policy makers and parents. He said the standards should be approved by policymakers who can also decide if the companies are adhering to the policies.
When asked if the U.S. Attorney General should have the power to enforce the policies and sue the companies if they fail to adhere, Mosseri said the enforcement of these regulations should happen at the federal level and added that “without enforcement” his proposal is “just words.”
For Blumenthal, the enforcement aspect was key because Section 230 of the Communications Decency Act shields social media companies from liability for content posted on their platforms by other users. If a user is harmed by content they engage with on the platforms, they currently do not have the right to seek damages from the social media company.
Mosseri said social media companies should have to “earn” some of the protections provided to them by Section 230.
Lawmakers also questioned Mosseri about his plans to build an Instagram for kids between the ages of 10 and 13. Just days after the damaging reports from The Wall Street Journal, Mosseri announced Instagram was pausing its plans to build a platform dedicated to kids. On Wednesday, he said the project was still on pause but did not commit to completely abandoning the idea.
Mosseri argued that kids have access to cellphones and want to be on platforms like Instagram that aren’t designed for them. Instagram recently said that it deleted more than 850,000 accounts it found to be tied to users under 13 in the third quarter of this year.
“We know that they want to be on platforms like Instagram,” Mosseri said. “The idea was to give parents the option to give their child an age-appropriate version of Instagram.”
Mosseri said he wants to hear from parents, policymakers, and researchers on how to create a product suitable for younger audiences.
He said the ultimate decision to move forward or completely stop building a version of Instagram for kids will be left up to him, but did not say if there is a timeline in place to reach that verdict.