A new Senate bill would require social media platforms to include tools aimed at making their apps less addictive to children and teens and less likely to suggest potentially harmful content.


What You Need To Know

  • A new Senate bill would require social media platforms to include tools aimed at making their apps less addictive to children and teens and less likely to suggest potentially harmful content

  • Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., on Wednesday introduced the Kids Online Safety Act

  • The bipartisan legislation would require social media platforms to provide minors with settings to protect their personal information, disable addictive product features and opt out of algorithmic recommendations, among other things

  • The legislation is in response to a series of explosive articles last year alleging Facebook’s parent company, Meta, had internal research data showing its Instagram app caused mental health problems, anxiety over body image and suicidal thoughts in some young people, but publicly downplayed the negative impacts

Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., on Wednesday introduced the Kids Online Safety Act.

The bipartisan legislation would require social media platforms to provide minors with settings to protect their personal information, disable addictive product features and opt out of algorithmic recommendations. The apps also would be required to default to their most restrictive settings.

The bill also seeks new parental controls and a dedicated channel for reporting harmful content. In addition, it would require companies to undergo an independent audit assessing the risks to children, and watchdog groups would be granted access to the companies’ data on their platforms’ potential harms to minors.

“This measure makes kids’ safety an internet priority,” Blumenthal said in a news release. “Big Tech has brazenly failed children and betrayed its trust, putting profits above safety.”

Blumenthal is the chairman of a Senate subcommittee focused on consumer protection and data security issues, and Blackburn is the panel’s top-ranking Republican.

Their legislation responds to a series of explosive articles last year, beginning with reporting by The Wall Street Journal, alleging that Facebook’s parent company, Meta, had internal research data showing its Instagram photo-sharing app caused mental health problems, anxiety over body image and suicidal thoughts in some young people, but publicly downplayed the negative impacts.

The reports were based on tens of thousands of pages of documents leaked by a former Facebook employee, Frances Haugen.

The Senate subcommittee held five hearings on the impact of social media on children and teens, including one in which Haugen testified.

“Protecting our kids and teens online is critically important, particularly since COVID increased our reliance on technology,” Blackburn said in Wednesday’s news release. “In hearings over the last year, Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech’s unwillingness to change. The Kids Online Safety Act will address those harms by setting necessary safety guardrails for online platforms to follow that will require transparency and give parents more peace of mind.”

A Meta spokesperson told Spectrum News the company is reviewing the bill. Google, which owns YouTube, and Twitter did not respond to requests Wednesday about the legislation.

Haugen told the subcommittee in October that Meta has put its “astronomical profits before people” and that algorithms used by Facebook and Instagram tend to recommend content that is more likely to elicit a strong emotional reaction from people, including posts featuring anorexia or fanning the flames of ethnic violence.

Meta previously has argued that Haugen has mischaracterized the research, that the company is committed to stopping harmful content and that many teens have reported Instagram has helped them more than hurt them.

With the exception of YouTube Kids, major social media platforms require users to be at least 13 years old, although Meta has acknowledged that some users lie about their age and says it removes them when it becomes aware they’re underage.

Instagram paused work in September on an Instagram Kids app after backlash, including from attorneys general from 44 states and territories, who wrote a letter urging it to abandon its plan. 

Instagram announced in December that it is rolling out a number of new features aimed at protecting teens. One asks them to take a break if they’ve been on the app for a certain amount of time, another nudges them toward different topics if they’ve been dwelling on one subject for too long, and another blocks users from tagging or mentioning teens who don’t follow them. In an effort to reduce the pressure teens feel online, Instagram also introduced an option last year for hiding "like" counts. And Meta says new parental tools and an education hub are coming soon.

Violators of the Kids Online Safety Act could face penalties under the Federal Trade Commission Act, and state attorneys general could also file civil lawsuits against the companies.

“The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content—and hold Big Tech accountable for deeply dangerous algorithms,” Blumenthal said. “Algorithms driven by eyeballs and dollars will no longer hold sway.”

Note: This article was updated to include Meta's response to the bill.
