Meta Faces Fresh Scrutiny Over Alleged “17-Strike” Rule for Sex-Trafficking Violations
Unsealed court documents allege Meta enforced a “17-strike” rule that allowed accounts to accumulate 16 sex-trafficking violations before facing suspension, a policy plaintiffs say prioritized engagement over user safety.

MENLO PARK – Meta is facing fresh scrutiny after newly unsealed court filings alleged that Facebook and Instagram tolerated repeated sex-trafficking violations under an internal “17-strike” enforcement policy.
According to testimony from former Instagram head of safety and well-being Vaishnavi Jayakumar, Meta allowed accounts reported for the “trafficking of humans for sex” to accumulate 16 separate violations before they were permanently suspended. She described the threshold as “a very, very high strike level,” far beyond what she had seen elsewhere in the industry.
The allegations are part of a broader U.S. lawsuit by school districts and other plaintiffs accusing Meta of systematically putting engagement growth ahead of child safety and user protection.
Court filings say that when Jayakumar joined Meta in 2020, she discovered that accounts flagged for sex-trafficking activity were not subject to immediate removal despite the company’s public “zero-tolerance” stance on human exploitation.
Instead, the plaintiffs say, internal rules allowed:
Up to 16 violations for prostitution, sexual solicitation or sex-trafficking–related offenses
Suspension only at the 17th recorded violation
No clear, dedicated mechanism for users to report child sexual abuse content, even as the platform made it easy to report spam or copyright issues.
Plaintiffs argue that this gave repeat offenders a long runway to continue exploiting the platform, and that parents and the public were never told such a high threshold existed.
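To make the alleged threshold concrete, the sketch below is a minimal, hypothetical Python illustration of how a strike-counting enforcement policy of this kind would behave. The Account class, the record_violation function, and the threshold constant are assumptions made for illustration, based only on the behavior described in the filings; they are not Meta's actual systems or code.

```python
# Hypothetical sketch of a strike-threshold enforcement policy,
# modeled only on the behavior alleged in the court filings.
from dataclasses import dataclass

SUSPENSION_THRESHOLD = 17  # alleged: suspension only at the 17th violation


@dataclass
class Account:
    account_id: str
    strikes: int = 0
    suspended: bool = False


def record_violation(account: Account) -> str:
    """Record one confirmed violation and apply the threshold policy."""
    if account.suspended:
        return "already suspended"
    account.strikes += 1
    if account.strikes >= SUSPENSION_THRESHOLD:
        account.suspended = True
        return f"suspended after violation #{account.strikes}"
    # Under the alleged policy, violations 1 through 16 leave the
    # account active and able to keep posting.
    return f"violation #{account.strikes} recorded; account remains active"


acct = Account("user-123")
for _ in range(17):
    print(record_violation(acct))
```

Run as written, this toy model records sixteen violations with no action and suspends the account only on the seventeenth, which is exactly the pattern the plaintiffs allege.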
A Meta spokesperson has rejected the characterization, saying the company has “for years removed accounts immediately if we suspect them of human trafficking or exploitation” and has expanded reporting tools around child-safety issues.
Beyond the “17 strikes” allegation, the unsealed filings describe what plaintiffs call a pattern of choosing engagement over safety:
Internal studies allegedly showed Meta’s platforms could worsen anxiety and depression, particularly in teens; some research projects were paused after results suggested harmful effects from Facebook and Instagram use.
Product teams reportedly recommended safety features such as making teen accounts private by default and limiting contact from unknown adults. According to internal messages quoted in the filing, growth teams warned that such changes would create “a potentially untenable problem with engagement and growth.”
Plaintiffs say Meta failed to disclose these risks to parents, regulators, or Congress, even while executives publicly emphasized their commitment to teen safety.
The “17-strike” rule, they argue, is emblematic of a deeper structural problem: safety fixes were allegedly delayed, watered down or shelved whenever internal modeling showed they might reduce time spent on the apps.
The multidistrict litigation, brought by U.S. school districts and other plaintiffs, paints a sweeping picture of neglect:
Difficult reporting: In 2020, there was reportedly no straightforward option within Instagram’s interface to report child sexual abuse content, even as users could easily flag much less serious issues like spam.
Systemic tolerance: Sex-trafficking content was “both difficult to report and widely tolerated,” with predators allegedly able to use Meta platforms to recruit and exploit minors.
Misleading the public: Filings say Meta downplayed internal evidence of harm when speaking to lawmakers and the public, including halting or burying research that showed causal links between Facebook use and deteriorating mental health.
Meta has broadly denied that it puts profits over safety, pointing to investments in content moderation, machine-learning tools to detect abuse, and partnerships with law-enforcement and child-protection organizations.
If the 17-strike threshold is substantiated in court, the implications for Meta could be severe:
Corporate responsibility: Allowing repeated trafficking-related violations before enforcement would contradict Meta’s public commitments to “zero tolerance” for exploitation and could strengthen claims of negligence or willful disregard.
Regulatory risk: Lawmakers have already seized on the revelations; U.S. Senator Marsha Blackburn, for example, publicly highlighted the “17x” policy as evidence that platforms cannot be trusted to self-police harms to children.
Reputational damage: For parents, advertisers and regulators, the idea that sex traffickers could receive 16 chances before suspension reinforces the perception that Meta treats safety as secondary to growth.
Precedent for other platforms: The case could shape how courts and regulators expect social networks to structure enforcement—especially regarding repeat violations involving serious crimes.
The controversy highlights a core tension in modern social media: platforms optimized for growth struggle to control abuse at scale, especially when stronger enforcement may reduce engagement.
Key questions raised by the filings include:
How many resources should platforms be required to devote to proactively detecting human trafficking and child exploitation?
Should “three-strikes”-style rules be codified in regulation for the most serious offenses, rather than left to internal policy?
How transparent should companies be about their enforcement thresholds and failure rates?
For now, the “17 strikes” allegation has become a powerful symbol of what critics see as a broken safety culture inside one of the world’s most influential tech companies. Whether the courts agree will depend on how judges and juries evaluate Meta’s internal documents, testimony from executives and former employees, and the company’s record of responding to abuse on its platforms.