Nearly 20% of American children are online for four hours or more a day, yet Trump’s FTC is retreating from social media regulation

In an online world where you’re more likely to interact with bots than real people, and where children grow more tech-savvy by the day (many can handle a phone better than they can ride a bike), social media platforms are looking for ways to balance users’ privacy with the safety of their underage users. Unfortunately, these two goals often conflict, and without government oversight these companies have little incentive to do more than maintain the status quo.

That was the case until recently, when the issue spilled into public view and a growing number of […] took the matter to court. Now, in an effort to proactively keep underage users safe online while also safeguarding the data collected about everyone, companies are exploring new ways to verify the age of their users. But the lack of federal regulation only sharpens the paradox: social media companies may collect data on users of all ages in the name of keeping children safe.

The Federal Trade Commission (FTC) issued a statement this week allowing social media companies to collect children’s personal data without parental consent in the name of age verification, carving out an exception to the Children’s Online Privacy Protection Rule (COPPA), which until now has made data collection from children under 13 off-limits. Given that COPPA was designed to protect sensitive data, the FTC is essentially giving social media companies free rein to collect whatever information they deem necessary for age verification.

“Privacy can sometimes be a double-edged sword,” said Johnny Ayers, CEO and founder of the AI-powered identity verification company Socure. “There is a very dangerous naivety associated with identity fraud, liveness, and deepfake detection.”

“You can’t collect biometric data from a child,” he said. “So how do you verify that someone is 13 without collecting any data to prove it?”

The FTC is calling the policy change a step in the right direction, but psychologists and privacy experts alike warn that it lets companies overreach in their data collection, undermining any so-called privacy measures, and that for children the damage has already been done.

“These platforms were developed for adults. They were made for adults, but kids are using them. It was never intentional, like, ‘What’s the product for kids?’ It was an afterthought, which means we’re now trying to fix the problems,” Debra Boeldt, a psychologist at the family online safety company Aura, said. “A lot of these companies are trying to help right now, but they don’t have the resources or the evidence-based, trained people to think about and plan for it.”

She supervises the clinical research team at Aura, an online safety solution for individuals and families to protect their identities and those of their children in an increasingly digital world. The company uses AI to monitor families’ online activities and can even recognize keyboard inputs to determine if a child is using harmful language or platforms.

Boeldt is a clinical psychologist with a background in child development. Her team found that nearly one in five children under the age of 13 spend four or more hours online every day, which is leading to higher levels of depression and anxiety among the youngest internet users.

The findings even coined a term, “compulsive unlocking”: children wake up around 7 a.m., following a biological clock similar to a smoker’s, and check their phones almost religiously. The company also found that girls were 17% more likely to experience anxiety from the pressure to stay digitally available and connected.

Kids are playing digital whack-a-mole

Efforts by social media companies to remove children from their platforms will be difficult because children know how to get around the restrictions.

“This is just their normal social space,” Boeldt said, adding that any attempts “will be like playing whack-a-mole,” with underage users simply moving on to the next platform.

“Maybe your TikTok is taken away. But then you go to […]. Or you go on Discord and start chatting with people there,” she said. “That’s one of the challenges… kids are very smart and will find ways around things.”

Boeldt pointed to Instagram’s recent announcement that it will soon start monitoring accounts it believes belong to children for self-harm language. Parents will receive an alert if their children repeatedly search for suicide or self-harm terms on the platform. The move comes as Instagram’s parent company, Meta, faces accusations of creating a social media environment that intentionally harms and addicts young users.

“These alerts are designed to make sure parents know if their teen is repeatedly trying to search for this kind of content and to provide them with the resources they need to support their teen,” the company said in a release.

However, kids have already found ways to bypass content filters on social media platforms like TikTok and Instagram, using words like “unalive” or referring to “PDF files” to mean more sinister things.

This creates a problem, Boeldt said, because any attempt to stop children from using certain terms will just lead to the creation of new vocabulary, which will then require new monitoring efforts, resulting in an endless cycle.

“When I saw this about Instagram and self-harm, my first thought was, ‘How good is their model? How well will they be able to detect this?’” she added.

Boeldt believes that government regulation is the only way to truly force companies to ensure the online safety of their users. “These companies aren’t held to a certain standard” that would prevent children from accessing their platforms—moreover, these companies “benefit from having kids on their platforms. More users mean more ads.”

“In the end, it actually takes a lot of money and resources to do this.”