Sewell had been confiding in a bot based on a Game of Thrones character, a bot advertised as “AI that feels alive.” When Sewell expressed suicidal thoughts, Megan told a congressional subcommittee, the bot did not warn him, redirect him to an adult or urge him to seek help. Instead, Megan said it asked him to “come home to me.”
His mom wants parents and policymakers to understand that this was not an accident or a glitch — that this product was released without adequate safety guardrails or testing.
Megan should not sit alone in her grief, and she is not alone in her experience. The daily headlines project a terrifying narrative: AI chatbots are having sexualized conversations with minors as well as encouraging self-harm. Social media is rewiring attention spans and degrading reading and memory. One in 12 children is sexually exploited or abused online.
As a country, we have to decide what we are willing to tolerate. State lawmakers have an opportunity to act with courage to protect our kids.
Social psychologist and author Jonathan Haidt has said many times that we overprotect kids in the real world and radically underprotect them in virtual spaces. He’s right. Our children’s brains are still forming. Their neurons are literally being rewired by the pace, intensity and unfiltered stimuli of the digital world, and we are witnessing the consequences: more anxiety; more depression; more self-harm; deteriorating attention spans; even physical ailments like “text neck,” “pinkie pain” and vision problems from screens held close to their faces for prolonged periods.
Many states have begun to push back by banning distracting devices from classrooms, a policy that new research shows is yielding positive results. Teachers are seeing improved focus. Kids are talking to each other again. Libraries report book checkouts skyrocketing when the phones disappear.
Success in keeping phones out of classrooms should embolden lawmakers to take further action to ensure kids stay safe. After all, the digital world doesn’t stop at the schoolhouse door. AI companions don’t care if a child is 12 or 17. They don’t know the difference between loneliness and crisis. They don’t have consciences or moral bearings. They have only pattern recognition and optimization toward engagement. AI doesn’t think. It certainly doesn’t protect.
While some AI companies are making necessary changes, states should take the lead on acting quickly. At a minimum, we must continue to keep phones out of schools; distraction-free learning should become the default policy in K-12 education.
We also need guardrails on the tech and AI used in school settings, to ensure that the tools students use improve educational outcomes and keep them safe, and we must have absolute age restrictions on harmful online content no matter where our young people access it. All social media platforms should require age verification and parental permission to opt in, and AI companion bots should be prohibited for all minors. Pornography should never be accessible to them.
Finally, we must require a duty of care when tech interacts with minors. If a company’s product directs a child toward self-harm, sexual content or addictive loops, the company should bear liability for those outcomes.
Protecting children should not be controversial, and it must become a priority for every state. Washington should contend with issues of AI when it comes to commerce, national security and ensuring America’s leadership position. State lawmakers should take swift action when it comes to child safety. They’ve already demonstrated with phone-free schools that swift, bipartisan progress is possible.
AI holds enormous promise for education when it comes to accelerating learning, closing achievement gaps and making teachers’ jobs easier. But innovation cannot come at the cost of children’s lives. We owe our kids more than outrage after tragedy. We owe them guardrails before it strikes.
Jeb Bush, a former governor of Florida, serves as chairman of ExcelinEd in Action, an advocacy-focused nonprofit.
Governing's opinion columns reflect the views of their authors and not necessarily those of Governing's editors or management.