The U.S. surgeon general is warning there is not enough evidence to show that social media is safe for children and teens — and is calling on tech companies, parents and caregivers to take “immediate action to protect kids now.”
With young people’s social media use “near universal” but its true impact on mental health not fully understood, Dr. Vivek Murthy is asking tech companies to share data with researchers and the public, increase transparency, and prioritize users’ health and safety when designing their products.
“I recognize technology companies have taken steps to try to make their platforms healthier and safer, but it’s simply not enough,” Murthy told The Associated Press in an interview. “You can just look at the age requirements, where platforms have said 13 is the age at which people can start using their platforms. Yet 40% of kids 8 through 12 are on social media. How does that happen if you’re actually enforcing your policies?”
To comply with federal regulation — the Children’s Online Privacy Protection Act — social media companies already ban kids under 13 from signing up for their platforms. But children have been shown to easily get around the bans, both with and without their parents’ consent.
Other measures social platforms have taken to address concerns about children’s mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to keep watching.
It’s not that the companies are unaware of the harms their platforms are causing. Meta, for instance, studied the effects of Instagram on teens’ mental health years ago and found that the peer pressure generated by the visually focused app led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts in teens — especially in girls. One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.
The research was revealed in 2021 by whistleblower Frances Haugen. Meta sought at the time to downplay the harmful effects of its platform on teens, but it put on hold its work on a kids’ version of Instagram, which the company said was meant mainly for tweens aged 10 to 12.
“The bottom line is we do not have enough evidence to conclude that social media is, in fact, sufficiently safe for our kids. And that’s really important for parents to know,” said Murthy, who’s been traveling around the country talking to parents and young people about the youth mental health crisis. “The most common question I get from parents is whether social media is safe for their kids.”