
States Eye Social Media Bans Despite Legal Roadblocks

State efforts to restrict kids' social media use have been held up in court. But lawmakers remain concerned about apps and the Internet contributing to mental health challenges.

Florida's Legislature has approved a bill that would keep most kids off of social media. (Dreamstime/TNS)
In Brief:
  • State legislators are increasingly concerned about social media making teens feel lonely, anxious and depressed.

  • Proposed restrictions face criticism and lawsuits from tech companies and free speech advocates due to potential free speech and privacy violations.

  • Opponents of these laws suggest that legislation around media literacy and digital safety might be more effective in protecting minors on social media.


Would you feel comfortable scanning your government ID before you or your teenager set up a new account on TikTok or X? In Florida, that was one of the features of a proposed social media ban for minors that GOP Gov. Ron DeSantis vetoed this month.

Studies show that social media platforms have features that are harmful to children, Florida GOP state Rep. Tyler Sirois told his colleagues on Wednesday, during House debate over an amended version of the bill. He cited algorithms that draw children toward content that contributes to depression, anorexia and "increased rates of hospital admission for self-harm."

On Wednesday, the Florida House passed the amended bill, sending it to DeSantis, who signaled he would sign it. The new version prevents children under 16 from opening social media accounts, with an exception for 14- and 15-year-olds, who could open accounts with parental permission. “This is something that I believe will save the current generation and generations to come if we’re successful,” said House Speaker Paul Renner.

But free speech groups and tech companies argue that the push to minimize harm by regulating social media use could easily lead to infringements of Internet users’ rights. “When it comes to trying to ban access to information or ban the ability of an individual to speak, there’s no way to fix that," says Carl Szabo, general counsel for NetChoice, a tech trade group that is among the groups filing lawsuits to halt states’ social media legislation. "That is essentially a violation of the First Amendment."

Last month, the Supreme Court heard arguments in a case challenging Texas and Florida laws regarding content moderation requirements for social media companies. Although that is a separate issue, the skepticism expressed by the justices points to the problems states may face in trying to restrict social media use by children. Justices Elena Kagan and Brett Kavanaugh both suggested the laws violate the First Amendment.

Differing State Approaches

There are two types of legislation centered around protecting minors on (or from) social media. Red states such as Utah and Arkansas have passed overt bans blocking children under a certain age from using social media at all. These laws mandate that platforms verify users' ages before granting access. This month, Utah passed a bill to allow parents to sue social media companies if their platforms cause their children harm.

In blue states including Connecticut and Maryland, bills would require platforms to make sure that they don't harm children or expose them to harmful content. Legislators aim to protect teenagers and children from harm of all forms on social media. Some legislators want tech companies to redesign their platforms so access by minors can be restricted, while others, concerned about exploitation and bullying, want tech companies to document harmful content and behavior and come up with ways to mitigate risk to kids.

A bill with bipartisan support in Congress would require social media and video game platforms to turn on their strongest privacy settings for children, while allowing them to opt out of the personalized algorithms that can lead to endless suggestions for continued scrolling. It would also impose a "duty of care" on companies to prevent harm to minors.

Millions of teenagers turn to the Internet as their primary form of social interaction and communication, often spending nearly five hours a day on social media. Much of this time is unmonitored. There are guardrails built into social media apps — TikTok, for example, has a restricted mode for users between 13 and 18 — but part of the point of social media is that it's not a moderated space.

If these laws survive court scrutiny, platforms will have to become far more proactive about managing the content hosted on their sites and who opens accounts, or risk liability and fines. But bans on social media use by minors have drawn lawsuits that have blocked them from taking effect. In Utah, the original ban's implementation date of March 1 was pushed back to October following legal challenges brought by NetChoice and tech companies.

Last year in California, a federal judge blocked the state from enforcing its law requiring platforms to assess whether their content could harm children. The judge found there was a likelihood that the law would violate the First Amendment. First Amendment concerns have also led to temporary injunctions in Arkansas and Ohio.

Given the legal hurdles, what avenues short of bans or broad restrictions are available for states seeking to protect children? Education for both children and parents might be a workable solution, suggests Josh Withrow, a technology fellow at R Street Institute, a center-right think tank. For students, that could come in the form of media literacy requirements added to the curriculum, an approach taken last year by Delaware and New Jersey.

Withrow suggests educating parents about third-party programs and apps designed to keep children safe online, as well as the controls built into social media platforms themselves. “It’s just a matter of bridging that knowledge gap and making sure parents know [their options]," Withrow says.

Zina Hutton is a staff writer for Governing. She has been a freelance culture writer, researcher and copywriter since 2015. In 2021, she started writing for Teen Vogue. At Governing, Zina focuses on state and local finance, workforce, education, and management and administration news.