A post collectively written by the Shared Hope staff:
Social media, now more than ever, has become a constant in young adolescents’ lives. According to one study conducted in 2025, nearly 70% of children between the ages of 11 and 15 have at least one social media account, with a growing field of competing options from TikTok and Instagram to Snapchat and YouTube.1 While some social media use can be beneficial in connecting us to different parts of the world, there is increasing concern about children’s systematic overuse of and reliance on these platforms.
Studies show that children’s brains are not fully developed until around age 25. Between birth and adulthood, a surge of adaptations rewires a child’s brain in ways that are directly dependent on, among other internal influences, its environment. During early adolescence, there is a surge of “synaptic pruning,” in which the brain removes connections it identifies as unneeded for its environment. As adolescence progresses, pruning continues while increased myelination of synapses speeds communication along the connections that are used most frequently. During this time, developing brains are especially prone to influence, because their connections depend directly on the information the brain continuously perceives as relevant. 2
Platforms’ awareness of this feature of developing brains should encourage them to add protective measures around how children interact with, and are targeted by, material on their sites. If platforms instead played to these vulnerabilities of developing brains to further promote and facilitate habitual reliance on social media for their own profit and user engagement, then these companies should no longer be celebrated for their ingenuity, but instead be held responsible for the foreseeable consequences they have inflicted on children.
In the past few years, free speech, social media liability, and the protection of children have, at times, seemed at odds with one another, with courts and legislators attempting to strike a balance between corporate and individual duties of care. With increasing litigation and mounting pressure on those capable of implementing change, the ways social media platforms operate, from how they collect children’s data to the methods they choose to protect children against predatory exploitation, appear to be under crushing pressure to change.
Various prominent cases have been filed in recent years, both by state attorneys general and by individuals affected by social media. In 2023, plaintiff K.G.M. filed a lawsuit against several social media companies, including Meta Platforms, Inc. and Google, claiming that these companies intentionally designed and deployed features in their social media platforms to addict users to their products. K.G.M. claimed that she started using social media at an early age, opening an Instagram account at age 9 and using YouTube at age 6. She claimed that these platforms addicted her to their products and that the products caused her depression, anxiety, and eating disorders. The trial began in February 2026. The jury deliberated for nine days before finding Meta and Google liable on all counts. The two companies were ordered to pay a combined $3 million in damages, plus an additional $3 million in punitive damages.3
Another lawsuit, filed by the New Mexico Attorney General in 2023, alleged that Meta misled users about the safety of its products and knowingly maintained a design that enabled child exploitation on its sites. The state claimed that Meta had violated New Mexico consumer protection statutes, deceived the public about child safety on its platforms, engaged in deceptive trade practices, and acted “unconscionably” toward minors. This case similarly went to trial in February 2026. The jury deliberated for one day before reaching a verdict holding Meta liable on all counts, finding that Meta willfully violated New Mexico’s Unfair Practices Act. Meta was ordered to pay $375 million in damages. 4
Some experts in the area of social media and child protection suggest that one federal statute, Section 230 of the Communications Decency Act of 1996, has enabled companies to evade liability for conduct resulting from the content provided to users on their platforms. 5 6 Termed “the 26 words that created the internet,” Section 230(c)(1) states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”7 This establishes that providers of internet services cannot be held liable for speech published by a third party.
While Section 230 effectively addressed free speech concerns at the time of the internet’s creation, platforms’ use of the statute as a shield against liability for their content placement has become the focal point of several debates.
Matthew Bergman of the Social Media Victims Law Center made this point during his witness testimony at a Senate hearing in early March. Mr. Bergman, along with other advocates for holding social media companies liable for their effects on young users, argues that social media is a product. Likewise, the algorithms deployed in these products, which continuously serve users content they may find interesting based on their monitored interactions, are argued to be a deliberate design feature of the product. Many advocates in this area argue that when the deliberate design of the product feeds young users exploitative or harmful information, that is conduct for which social media companies should be held liable, especially when facilitating this harm is intended to maintain engagement and maximize profits. 8
As Section 230 turned 30 years old in 2026, two congressional hearings were held in March to revisit the statute’s effectiveness and the previously unforeseen risks it may pose to individuals’ rights and safety. The two primary concerns are how to balance the free speech of user content with the need to prevent children’s exposure to harmful content, and how to avoid creating another avenue for the facilitation of child exploitation material.
Mr. Bergman offered a potential solution to Congress during his testimony. He stated that Congress’s original intent for this statute “sought to maximize user control over what information was received by individuals” and to “empower parents and embolden law enforcement.” “Clarifying the original legislative intent to [] encourage the development of technologies that maximize user control of what information is received by individuals,” along with “vigorous enforcement of federal criminal law to deter and punish trafficking and obscenity, stalking, and harassment,” are, Mr. Bergman states, ways Congress can take effective measures today to strike a balance between free speech and child protection. 9
Today, we are facing issues and avenues of child sexual exploitation that were not imagined when online communication was first introduced. In 2023, Shared Hope International began a campaign about internet safety titled The White Van Campaign. It consists of an educational PSA video, along with a toolkit aimed at community members and those in protective capacities around children. It aims to show that child exploitation is not synonymous with the stereotypical windowless “white van” that has been warned about for decades. In today’s generation, with areas of the digital world rapidly advancing and, inadvertently, creating more avenues for abuse, parents and children should be aware of the dangers that are facilitated online under the guise of typical digital interactions.
For more information about Shared Hope International’s White Van Campaign, watch the PSA video or review the internet safety toolkit here.
References:
6. WATCH: Meta, TikTok and other social media CEOs testify in Senate hearing on child exploitation, PBS News.
7. 47 U.S. Code § 230 – Protection for private blocking and screening of offensive material, Legal Information Institute.
8. Liability or Deniability? Platform Power as Section 230 Turns 30, U.S. Senate Committee on Commerce, Science, & Transportation.
9. Liability or Deniability? Platform Power as Section 230 Turns 30, U.S. Senate Committee on Commerce, Science, & Transportation.