Lawmakers Want Social Media Companies to Stop Getting Kids Hooked
Alexis Tapia opens TikTok every morning when she wakes up and every night before she goes to bed. The 16-year-old from Tucson, Arizona, says she has a complicated relationship with the social media app. Most of what flashes across her screen makes her smile, like funny videos that poke fun at the weirdness of puberty. She truly enjoys the app—until she has trouble putting it down. “There are millions of videos that pop up,” she says, describing the #ForYou page, the endless stream of content that acts as TikTok’s home screen. “That makes it really hard to get off. I say I’m going to stop, but I don’t.”
Scrutiny of kids and screens, particularly where teens are concerned, has intensified over the past months. Last fall, former Facebook product manager turned whistleblower Frances Haugen told a US Senate subcommittee that the company’s own research showed that some teens reported negative, addiction-like experiences on its photo-sharing service, Instagram. The damage was most pronounced among teenage girls. “We need to protect the kids,” said Haugen in her testimony.
Proposals to “protect the kids” have sprung up across the US, attempting to curb social media’s habit-forming allure on its youngest users. A bill in Minnesota would prevent platforms from using recommendation algorithms for children. In California, a proposal would allow parents to sue social media companies for addicting their kids. And in the US Senate, a sweeping bill called the Kids Online Safety Act would require social media companies, among other things, to create tools that allow parents to monitor screen time or turn off attention-sucking features like autoplay.
Social media’s negative impact on children and teens has worried parents, researchers, and lawmakers for years. But this latest surge in public interest seems to be ignited in the peculiar crucible of the Covid-19 pandemic: Parents who were able to shelter at home watched as their children’s social lives and school lives became entirely mediated by technology, raising concerns about time spent on screens. The fear and isolation of the past two years hit teens hard and has exacerbated what the US surgeon general recently called “devastating” mental health challenges facing adolescents.
The kids have been through the wringer. Could cracking down on social media help make the internet a better place for them?
Supporters of the new legislation have likened the mental health harms Big Tech inflicts on kids to the dangers of cigarettes. “We’re at a place with social media companies and teenagers not unlike where we were with tobacco companies, where they were marketing products to kids and not being straightforward with the public,” says Jordan Cunningham, the California Assembly member spearheading AB 2408, along with Assembly member Buffy Wicks. The bill would allow parents to sue platforms like Instagram, TikTok, and Snap if their child is harmed by a social media addiction. Social media companies aren’t financially incentivized to slow kids’ scroll, and “public shame only gets you so far,” Cunningham says.
But unlike the physical damage of tobacco, the exact relationship between social media use and kids’ mental health remains disputed. One high-profile study that tracked increases in rates of teenage depression, self-harm, and suicide in the US since 2012 proposed “heavy digital media use” as a contributing factor. But still other research has found that frequent social media use is not a strong risk factor for depression. Even the internal documents revealed by Haugen resist any simple interpretation: Facebook’s study had a sample size of only 40 teens, over half of whom reported that Instagram also helped counter feelings of loneliness. It’s also difficult to untangle the mental health harms of social media from other psychological harms in a child’s life, like health fears during an ongoing pandemic or the threat of school shootings, which leave a lasting psychological toll on students.
There isn’t a scientific consensus on what a social media addiction is, either. “I am concerned that the medical and psychological communities are still figuring out what defines a digital behavioral ‘addiction’ versus other terms like problematic media use,” says Jenny Radesky, who researches children, parenting, and digital media use at the University of Michigan C. S. Mott Children’s Hospital. In addition to her research, Radesky helps shape the American Academy of Pediatrics’ policy agenda on kids and technology. She also works with Designed With Kids in Mind, a campaign to raise awareness of how design techniques shape children’s online experiences.
Radesky advocates for a more nuanced interpretation of the relationship between social media and young people’s mental health. “People who are trying to ‘protect kids’ within digital spaces often are a bit paternalistic about it,” she says. Well-intentioned adults often regard kids as objects to be protected, not subjects of their own experience. Instead of focusing on minutes spent on screens, she suggests, it’s worth asking how kids build norms around technology. How are they integrating it with the rest of their lives and relationships? How can parents, policymakers, and voters take that into account?
But not every parent is in a position to engage in a real dialogue with their kids about screen time. This poses an equity issue: Those who work multiple jobs, for example, may not be able to provide guardrails on screen time, and their children may be more prone to overuse than children of affluent parents.