Social media platforms have become central tools in child sex trafficking cases across the United States. Predators exploit direct messages, friend connections, weak age verification systems, and, increasingly, artificial intelligence and deepfake technology to access and manipulate minors.
Advocates at the National Center on Sexual Exploitation say this exploitation continues largely because US law shields Big Tech companies from liability. For nearly 30 years, Section 230 of the Communications Decency Act has protected platforms from being held legally responsible for user-generated content, even in cases involving child sex trafficking and abuse. Critics argue the law must be updated to give companies a clear duty to prevent harm.
Profiting from exploitation
One young woman was trafficked for a year by a stranger she met on Instagram. He was later convicted and sentenced to 40 years in prison. Instagram, however, was not held accountable due to Section 230 protections.
In another case, a 15-year-old girl connected on Facebook with a man with whom she shared mutual friends. He used Messenger to groom her before trafficking, raping, and abusing her. Again, the platform faced no liability.
On Snapchat, a 13-year-old boy was extorted into sending sexual images. When he was 16, those images began circulating widely on Twitter (now X). He reported them, but the platform refused to remove them. An opinion piece in The Hill states:
The boy reported the images and sex trafficking to Twitter. Twitter responded by telling him that it had reviewed the images but would not remove them, and they continued to circulate while Twitter profited.
A former head of safety at Instagram testified that Meta allowed accounts accused of sex trafficking to receive up to 17 strikes before removal. The New Mexico Attorney General alleged that Snapchat ignored reports of sextortion and failed to implement meaningful age verification, even while acknowledging that its features connected minors with adults.
The accountability gap
Social media platforms profit from engagement, including interactions that can lead to the exploitation of children and adults. But courts have interpreted Section 230 broadly, shielding tech companies and limiting survivors' ability to hold them accountable when grooming and trafficking occur on their services. Advocates argue this legal shield removes incentives for platforms to invest meaningfully in safety.
Momentum for reform is growing. Section 230 was named the primary target of the National Center on Sexual Exploitation's 2025 Dirty Dozen List. The opinion piece states plainly:
The only real solution is for Congress to repeal Section 230. Momentum is growing, as the Sunset Section 230 Act was introduced by a bipartisan group of senators at the end of 2025. Until Section 230 is struck down, tech platforms have no incentive to make their platforms safe.
Child protection should never be treated as secondary to profit. As cases continue to surface, the need for stronger accountability and prevention measures becomes harder to ignore. Without reform, children remain vulnerable in the digital spaces they use every day.