Six-bill package would protect kids from digital harm.

This story was originally published in Rhode Island Current, a publication partner of Ocean State Stories.

PROVIDENCE — What were you doing on Nov. 4, 2023? Amanda Zimmer remembers. She went for a walk that night, and left her phone at home. When she returned, a text message was waiting.

“The message came from my younger child,” Zimmer said, “and it stated, ‘Owen is dead. Owen committed suicide. Dad is yelling, please call me.’”

On Tuesday afternoon, Zimmer recalled that night as she stood in the Rhode Island House of Representatives lounge. She held up a framed picture of Owen, her son, who died at age 17, the day after he had gone to a concert with his mom. 

“We had the best time,” Zimmer said of her final outing with her son.

It was Owen’s life online, and what Zimmer said she later found on his devices and social media accounts, that led her to believe digital platforms routed him toward the extremist and pro-suicide content that eventually nudged him to take his own life. 

“It basically infected him like a virus,” Zimmer, an East Greenwich therapist and licensed social worker, said. “The subtle and sporadic changes that started happening: Talking about conspiracy theories, outrageous propaganda, some right-leaning, far-right ideas like you hear of from the manosphere.”

Zimmer’s account was the emotional center of a legislative event promoting a package of bills that seek to more tightly corral young people’s digital lives. But the half-dozen bills also stretch beyond the factors that Zimmer believes influenced her son’s death. 

Rhode Island’s legislative package aims to address a spectrum of threats kids can encounter online, from the dopamine-spiking user experience design of social media platforms to the potentially lethal sycophancy of AI chatbots.

Rep. Tina Spears, the Charlestown Democrat who served as the event’s emcee, made it clear to the dozens of people in attendance that she is aiming to build a coalition against the platforms — and their makers — for the reckless behaviors she believes represent “a threat to our children.”

“These platforms are the cause…the vehicles that perpetrators are using to exploit our children and harm our children,” Spears said. “I want an army of moms. I want a coalition of advocates, policymakers, our legal system, our law enforcement, to go after these bad actors and put the guardrails we need in place.”

Age gating and AI controls

The bills in the package include two efforts from Spears, with one, H7953, a redux from last year that would institute “age gating” for social media use, and bar people under age 18 from accessing platforms without parental consent. Typically, this is accomplished through an age verification process.

Spears’ second bill mirrors Senate legislation by Sen. Lori Urso, a Pawtucket Democrat, which would require that AI chatbot operators institute additional safeguards and measures to address when users express suicidal thoughts, self-harm, or other injurious behavior. The bill also requires that AI chatbots routinely remind users that the chatbot is not human. 

Urso’s bill is also included in the Senate’s health care agenda this year. She spoke at Tuesday’s press event and noted a February 2025 comment made by Vice President JD Vance: “The AI future is not going to be won by hand-wringing about safety.”

“I think that really underscores the position of laxity that the federal government has taken with regard to this technology,” Urso told the crowd. “And the motivation that provides the states to really step up to protect citizens.”

Caldwell aims for more parental input

Rep. Justine Caldwell, an East Greenwich Democrat, has three bills in the tranche. One bill would set standards for school-issued devices and software, and another would create a legislative study commission to examine the role third-party vendors and platforms play in a public school education. Caldwell explained in her turn at the podium that she recently received an email that her eighth-grade son is part of a class action settlement against an educational tech company for a privacy breach involving his and other students’ data. 

“Even as an admittedly probably overly involved parent in my kids’ lives…I had never even heard of that company’s products in our schools,” Caldwell said.

Another Caldwell bill would force platforms to shut off open messaging features by default for young users, as well as require that parents OK children’s financial transactions or purchases on gaming and social media platforms. 

Rep. Megan Cotter, an Exeter Democrat, speaks during a State House press event on Tuesday, March 31, 2026, about her bill to require child-safety considerations in the design of online platforms likely to be accessed by minors – Photo by Alexander Castro/Rhode Island Current

Child-safe design laws face challenges

H7632 is the lone bill from Rep. Megan Cotter, an Exeter Democrat, and it focuses specifically on platform design. Like Spears’ age verification bill, it revives an effort Cotter made last year, and would impose upon online service providers a requirement that minors’ safety be considered in platform design. 

The bill “is based on a simple but powerful idea,” Cotter said. “If companies create a platform that children are likely to use, it should be designed with the best interests of our children in mind, because right now, the digital world is not built in that way.”

This attempt to regulate and make less manipulative what’s broadly known as user experience (UX) design follows similar legislation in states like California, Maryland, Nebraska, Vermont and South Carolina. On the Senate side, the legislation is sponsored by the chamber’s president, Sen. Valarie Lawson. The House bill also has a sliver of bipartisan support thanks to sponsorship from Minority Leader Mike Chippendale, a Foster Republican. 

Rhode Island Deputy Attorney General Adi Goldstein spoke at the event on behalf of AG Peter Neronha, and noted the office started to go up against big tech in 2023 by joining a 42-state, bipartisan lawsuit against Meta which alleged that the company knowingly crafted and deployed features which harm young users and lure them into compulsive, addictive use, while simultaneously “assuring the public that their product is safe to use,” Goldstein said.

That litigation, the deputy AG said, is ongoing, as are legal battles in other states where legislation similar to Spears’ has been challenged in court after becoming law. Attempts to prescribe new parameters for social media have seen age-verification fights in Arkansas, Ohio and Georgia, while the Californian statute which influenced Cotter’s UX bill generated a lengthy court battle, NetChoice v. Bonta, that saw a mixed decision emerge from the U.S. Court of Appeals for the Ninth Circuit on March 12.

Neither side secured a full victory: The court declined to fully block the child-safe design law, but maintained that the statute’s weak point was its language — specifically, that terms like “materially detrimental,” “well-being,” and “best interests of children” are not clear enough to be enforced.

A smartphone displays Instagram’s Explore page. Lawmakers and advocates say algorithmically generated feeds can steer young users toward harmful content – Photo by Alexander Castro/Rhode Island Current

Even Stephen King isn’t this dark

After the press event, Spears said she wishes the media would do more “deep dives” on the present-day dangers of youths’ tech use.

“The companies are saying that they’re doing everything that they need to, while data tells us everything’s going in the wrong direction for children,” Spears said.

It’s true that, in recent years, a well-established risk of digital life like cyberbullying has mutated far beyond garden-variety taunts and into intricate systems of criminality and harassment. The FBI and U.S. Department of Justice, for instance, continue to chip away at the diffuse leadership of multinational groups like 764, arresting and charging leaders of this ultra-extremist group. Its tactics involve manipulating young people over apps like Discord, Roblox and Minecraft into creating child sexual abuse material, harming or killing themselves or others, or, as one survivor recounted to Wired in 2024, beheading a pet hamster.

“I don’t think Stephen King is dark enough to come up with some of the stuff that these kids are coming up with,” a DOJ trial attorney told ABC News in December 2025. 

Spears said she was hopeful her legislation would pass this year. “I think it’s time. It’s beyond time,” she said.

In the meantime, kids are spending a lot of time online. Margaret Holland McDuff, CEO of Family Service Rhode Island, noted at Tuesday’s event that young people are spending about nine hours a day in online environments on average. At the same time, anxiety and depression rates, as well as suicidal ideation, have “sharply risen in the past decade,” Holland McDuff said.

‘This is happening in your state’

Zimmer recounted her son’s story in testimony sent to a U.S. Senate committee in December 2025, where she cautioned that she was not for a blanket ban on technology, which Owen loved, but rather fighting against the fact that “algorithms make active choices about what to show to whom.” 

“When an algorithm first addicts and then decides to recommend suicide content to a depressed teenage boy, that is not passive hosting,” Zimmer wrote. 

Zimmer told the crowd that Owen was one of those kids who spent a lot of time online. She said he was “on the spectrum,” and he entertained “deep passions that come, and then they go.” As a kid, Owen was obsessed with foreign flags, then the Geico gecko, then Coca-Cola. He eventually came to love working with computers and gaming. He even built a gaming rig with his grandfather.   

Minecraft — and what Zimmer described as a not uncommon pastime among young men, watching other young men “yell about Minecraft” — became a focus in Owen’s teenage life. He wanted to be a Minecraft YouTuber. 

“He would practice his ‘Hey, guys,’ because they always start the same way at their intro,” Zimmer said, describing a genre trope of gaming videos on YouTube.

Zimmer learned later that Owen was self-harming. She said kids hide these behaviors because online groomers teach them to, threatening to release explicit or sensitive materials if their victims don’t comply.  

“I’m sharing my story with you to let you know that it’s not somewhere out there in Michigan or in Toronto or Ohio,” Zimmer said. “This is right here. This is happening in your state.”

If you are in crisis or need someone to talk to, you can call or text the National Suicide Prevention Lifeline at 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor.

Amanda Zimmer holds a framed photo of her son, Owen, while speaking during a March 31, 2026, State House press event on legislation aimed at protecting children from digital harm – Photo by Alexander Castro/Rhode Island Current