Beyond TikTok ban: How one state is grappling with teens and scrolling
New York
The TikTok law that recently sailed through Congress was primarily driven by national security concerns. Left unaddressed: a wider strategy on social media and teen mental health.
Federal and state lawmakers face decisions on whether to make tech companies more responsible for safety and well-being issues, or to put that onus on parents and guardians.
Why We Wrote This
Teens and officials recognize social media can have both positive and harmful effects on mental health. New York state is seeking a middle ground on finding solutions.
Will American teens lose their access to TikTok? Should they?
A new law – which would ban the video app, a platform especially popular with youth, unless its Chinese owner, ByteDance, sells it – moves the former question closer to an answer. But the latter remains less clear.
Public officials are increasingly concerned about teenage screen time and social media use. And, though with mixed feelings, teens are, too.
Grace Jung is one of them. The high school sophomore readily acknowledges that, alongside homework, time on her phone plays a pretty big part in keeping her up at night. “Every time I try turning it off, there’s another app that’s trying to get me on it,” she says.
At her New York City school, “confessions” groups are making a comeback. Classmates have been posting anonymously on Facebook and Instagram with jabs and observations about how their peers look, who they’re friends with, the latest rumors. Grace cites one friend who’s suffered as a result of comments about her online.
“This could have been avoided or regulated in the first place, so she wouldn’t have to go through this in real life,” she says. “I feel like people always have to walk on eggshells.”
Help might be on the way. Through legislation and lawsuits, her home state is becoming a testing ground for how public officials can address rising concerns about the effects of unchecked digital immersion on young people’s development.
As with parallel efforts in other states, it’s not yet clear if actions emerging here will succeed in addressing families’ concerns. But officials nationwide see the links between social media and adolescent well-being as an increasingly urgent issue, especially after a 2023 Surgeon General’s advisory found “harmful content” exposure and “excessive and problematic” social media use have become primary drivers of youth mental health concerns.
“This is something that lawmakers have experienced firsthand themselves,” says Kris Perry, executive director of Children and Screens, a nonprofit that studies the impact of digital media on child development. “Many of them are parents or grandparents, and they are shocked and alarmed at what has unfolded in this past decade.”
Here in New York City, Mayor Eric Adams’ administration sued five social media companies earlier this year for “fueling” a youth mental health crisis. The state’s attorney general last fall joined a national lawsuit against Meta, Facebook’s parent company, over youth mental health. Meanwhile, the New York state Legislature is considering bills that could rein in the interactions social media platforms can have with minors – either by restricting corporations or by empowering parents and children.
New York’s actions mirror bipartisan efforts in a number of states to tackle the downsides of addictive social media. Lawmakers in red and blue states have passed laws regulating youth social media use, including in Arkansas, California, Florida, Texas, and Utah. Maryland passed legislation earlier this month.
Social media legislation for kids
The TikTok law that sailed through Congress this week was primarily driven by national security concerns over the Chinese government potentially reaping data on 170 million American users.
Left unaddressed in Congress: a wider strategy on social media and teen mental health. Federal and state lawmakers face a decision on whether to make tech companies more responsible for safety and well-being issues, or to put that onus on parents and guardians.
The Kids Online Safety Act, a bipartisan U.S. Senate bill reintroduced in February by Democratic Sen. Richard Blumenthal of Connecticut and Republican Sen. Marsha Blackburn of Tennessee, includes “duty of care” clauses that make platforms liable for addressing harmful content.
But civil rights groups like the American Civil Liberties Union and the Electronic Frontier Foundation have warned those provisions could lead to increased censorship and loss of First Amendment free speech rights.
The proposals on the table in New York lean more toward parental oversight. The Stop Addictive Feeds Exploitation for Kids Act, or SAFE Act, would ban the use of feed algorithms to push content at users under 18. Instead, youth would get a default chronological feed, unless parents consent otherwise. The measure would also allow parents to block their children’s access between midnight and 6 a.m. and to limit their total usage.
Grace agrees that restricting notifications could help. But she says giving parents too much blanket power to decide what their children can and can’t see could do more harm than good.
“Honestly, I don’t think I would trust my parents to regulate my social media use,” she says. “Even though parents do want what’s for the best, I don’t think they can understand what their children do.”
That’s an attitude Andy Jung, her dad, happens to share. He believes in serving as a role model on how to use the internet responsibly. But he’s not interested in weighing in on how she can or can’t use Instagram.
“I’m not 100% sure how she’s spending most of her time when she’s using her cellphone or the internet,” he says. “All I can say is I trust my daughter. That’s all I can do. I cannot monitor what she’s doing all day.”
Will constraining social media companies help?
The New York Child Data Protection Act, meanwhile, would prohibit websites from collecting or sharing any minor’s personal data without informed consent, including parental consent for those younger than 13. That’s a particularly valued step for civil rights groups, Ms. Perry says, because it has less of a chance of curtailing teens’ free speech but does more to restrict harmful algorithmic manipulation, as well as companies’ abilities to sell the information they collect.
“The less we allow companies to monetize the child’s experience, the less motivated they are to put products in front of children,” Ms. Perry says. And, perhaps, the less children will be inclined to overuse social apps.
Both New York bills are still under committee consideration. In several other states with youth social media laws, NetChoice, a tech industry group, has succeeded in suing to delay implementation.
Any move that constrains the algorithms appears bound to be unpopular among teenagers. It’s the feature that’s made apps like TikTok and YouTube so eerily proficient at presenting content that captivates.
And only 46% of teenagers support requiring parental consent for minors to create social media accounts, compared with 81% of U.S. adults, according to Pew Research Center.
Ash Farley, a high school senior in New York, says teens are well aware platforms are raking in their data. But Ash, who uses they/them pronouns, says restrictions could make it harder to find anything from content that inspires their art to online resources that help teens explore their developing identities away from prying eyes.
“You’d miss out on a lot that you’re interested in,” Ash says. “There’s not a lot of ways to get that stuff, other than social media.”
Ash’s mom, Jennifer Watters-Farley, has tried to take an approach of maximizing communication and enabling good judgment with her children.
“We try to be very candid without scaring them, but at least giving them enough information to know what to look out for and how to act, and to always let us know if something was happening that they were really uncomfortable with,” Ms. Watters-Farley says. “That we wouldn’t be upset with them – we would just want to try to help them handle it.”
Anne Marie Albano, a psychologist at Columbia University Irving Medical Center, says the ideal solution might not be to give parents or legislators more direct control over social feeds, but to ensure that parents have more resources to prepare their kids to make healthy decisions.
“Our kids have to learn how to become independent beings,” she says.