What do employees at pornography companies that profit from videos depicting child sexual abuse think about their work? Thanks to an archival error in a Federal District Court in Alabama, thousands of internal Pornhub documents—previously sealed—have been made public, shedding light on this question.
Most of these documents date from 2020 or earlier and reveal disturbing attitudes among some employees, including casual laughter about the content hosted on the site.
One employee joked in a message, “I hope I never get in trouble for having those videos on my computer LOOOOL.”
Others expressed concern; one sent a message stating, “There is A LOT of very obvious and disturbing CSAM here,” referring to child sexual abuse material.
An internal report from May 2020 revealed that Pornhub had 706,000 videos flagged by users for depicting minors being raped or assaulted or for other violations. The documents suggest the platform often waited until a video had been reported at least 16 times before reviewing it for removal.
The company also made it harder to report problematic videos by allowing only registered users to file complaints. An internal note remarked this change “will greatly reduce the overall volume of reports.”
Pornhub and its affiliated "tube" sites such as Redtube, Tube8, and YouPorn do not produce sexual videos themselves. Instead, they provide platforms for users to upload content.
Representatives from Pornhub declined to comment on the documents from the legal discovery process, which are publicly accessible through court websites, or on ongoing litigation. They emphasized that since the period covered by these documents, the company has strengthened its policies to remove non-consensual material and stated significant improvements have been made.
Nonetheless, these documents expose the company’s previous behind-the-scenes approach: an unrelenting pursuit of market share with little regard for the well-being of those depicted in the videos.
The files underscore how profit motives overshadowed ethical considerations, and suggest that despite recent reforms, Pornhub has not done enough to remove videos depicting the sexual abuse of minors.
In one exchange, an employee advises a colleague not to copy an administrator when reporting videos involving child sexual abuse. The other responds, “You don’t want to know how much CP we’ve ignored in the last five years.” CP stands for child pornography.
A private memorandum acknowledged that videos showing apparent child sexual abuse had been viewed 684 million times before removal.
The internal memos reveal executives obsessed with maximizing revenue by attracting the widest audience possible—including pedophiles. One memo proposed banning certain words from video descriptions—like “infant” and “kid”—while recommending the site continue allowing terms such as “brutal,” “childhood,” “force,” “snuff,” “forced,” “minor,” and “drunk.”
Another note stated that the accounts of users who uploaded videos featuring minors should not be deleted because “the user made money.”
These documents were submitted during a civil lawsuit filed by a woman from Alabama who, starting at age 16, was secretly filmed performing sexual acts—including at least one instance of being drugged and sexually assaulted. These videos were published on Pornhub and garnered thousands of views.
Additional material surfaced in a related California lawsuit filed by Serena Fleites, who recorded a nude video of herself in high school at the request of a boy she liked. The video ended up on Pornhub, leading to school harassment, withdrawal from studies, self-medication with drugs, and homelessness—all while Pornhub profited from the video’s over 2.7 million views.
In late 2020, an article exposing Pornhub’s proliferation of videos depicting assaults on minors led the site to remove approximately 10 million videos—around three-quarters of its total—that lacked consent. This sparked criminal investigations, legislative hearings, and numerous lawsuits.
Beneath their dry corporate tone, the disclosures highlight Pornhub’s position as a tech giant skilled in search engine optimization. The company mastered tagging videos to dominate Google search results for queries such as “precious teen undressing” or “stop, it hurts porn video,” ensuring its content outranked competitors.
Executives showed some concern about illegal content, such as videos involving persons aged 17 or younger, and internal records document efforts to remove the most blatant child abuse videos. However, they also monitored the popularity of content featuring nude teenagers, which was a major draw. The term “teen” was sometimes the second most searched on Pornhub, after “lesbian.”
While “teen” can legally include adults 18 or 19 years old, an internal message noted the site did not block “very young teens.” It is critical to remember that minors cannot legally consent, and videos involving minors constitute sexual assault.
The memos reveal internal debates over banning phrases like “young girl,” “first anal crying,” and “abused by daddy,” ultimately deciding these terms were acceptable.
Staff also noted that Pornhub’s algorithm suggested searches like “12 years old” and “little girl.” A 2020 internal message indicated an “underage” filter yielded 183,301 videos on the site.
One discovery memo showed 155,447 videos tagged with “12yo.” Other tracked categories included “11 years old,” “degraded teen,” “under 10 years old,” and “extreme asphyxiation,” though these searches have since been removed.
To understand the human impact, several victims of Pornhub’s negligence shared their stories.
Stephanie Stewart grew up in a conservative Christian home and had never kissed a boy when, at 15, a friend took her to a stranger’s house. She has no memory after that night. Somehow, she returned home and slept for more than a day with a severe headache. Reflecting back, she is certain she was drugged.
Stewart was unaware of what had happened until her life changed dramatically. Schoolchildren called her derogatory names, while her mother, a doctor in the city, received harassing calls labeling her daughter with slurs.
“People mocked me; it was a nightmare,” Stewart recalls. She dropped out of school and later earned a GED diploma.
Eventually, someone emailed her a link to Pornhub. Clicking it made her vomit.
“I was completely shattered,” she said. The 45-minute video showed Stewart being raped by four men in their twenties or older. She never learned their identities, and they were never held accountable. At the time she saw it, the video had over a million views.
Afraid to leave home, Stewart sent multiple messages to Pornhub begging for the video’s removal, explaining she was underage and had been raped. Nothing happened. She even posted comments directly on the video stating, “I am underage, please remove this video,” but received no response.
She endured harassment from stalkers who discovered her workplace and shouted obscenities at her. The trauma led to severe depression, requiring years of therapy.
Seeking help, Stewart turned to law enforcement. “It was absolutely humiliating,” she recalled. “I had to show them the video on my phone while standing there.” Ultimately, the police said they could not assist.
Stewart filed a complaint with the National Center for Missing & Exploited Children, which succeeded in having the video removed in 2020. But the emotional scars remain.
“There are no words to describe how horrible it has been to face this,” she said. “It’s something I think about every day.”
She said the documents reveal Pornhub executives’ contempt for victims. “They don’t care. It’s money in their pockets.”
Stewart and her family debated whether to share her name publicly. Ultimately, she chose to do so, hoping her story will spur laws and safeguards to prevent similar abuses.
Following intense criminal investigations and civil suits, Pornhub has made considerable improvements in recent years by enhancing video removal processes and content controls. The site stopped allowing downloads, implemented more rigorous age and consent verification, and claims that by July it will have verified that all performers in its video library are adults who have provided consent.
However, the system can still be deceived—if a face is blurred, it’s unclear whether the consent form corresponds to the person shown. Nevertheless, the emphasis on consent appears to have reduced videos depicting rape, torture, or abuse of minors.
Solomon Friedman, a partner at Ethical Capital Partners, which acquired Pornhub two years ago, said the site has banned 60,000 words, phrases, and word-emoji combinations (such as “underage,” “rape,” and “hypnosis”) and that 20 percent of employees now work in moderation and safety roles.
Despite these advances, skepticism remains. Similar safety assurances were offered five years ago when Pornhub promoted itself as a “healthy” platform. Many executives from that era remain, and the site still appears to cater to pedophilic and sadistic interests.
While searches for terms like “minor” and “forced” are no longer available, many videos still contain words such as “hurts,” “painful,” or references to “schoolgirls” and “school.”
Pornhub appears more cautious with English-language content and U.S. users, likely due to greater legal risks. It blocks searches for “juveniles,” “youth,” and “adolescents.” However, Spanish-language searches for “joven” (young) yield abundant results, and the site even suggests “jovencita” (little young girl). It hosts many videos labeled “adolesentes” (a misspelling of adolescents) and suggests a vulgar Spanish term meaning “13-year-old girls having sex.”
Laila Mickelwait, an activist from the Justice Defense Fund and author of a book chronicling efforts to hold Pornhub accountable, says much unverified illegal content remains on the site, from which Pornhub profits.
“Illegal content continues to flourish on Pornhub,” she said. “Pornhub doesn’t seem to care.”
Mickelwait also acknowledges progress: “When they realize the risk outweighs the reward, they are forced to change.”
This dynamic suggests potential, if imperfect, paths toward solutions.
Pornography is likely here to stay, but it seems feasible to use civil and criminal penalties to compel the industry to host only videos with verified age and consent.
“The key here is money,” said Michael Bowe, a New York attorney representing Serena, referring to the financial systems supporting companies that host videos of minors. He argues that if payment processors demand higher standards, companies will comply. For this reason, he has sued Visa and two investment firms that allegedly enabled Pornhub’s exploitation of Serena. (A judge has suggested Visa might be removed from the case.)
Similarly, Justine J. Li, a tech professional and entrepreneur, founded Prune, a company that pressures web hosts, payment processors, ad networks, and domain registrars to distance themselves from sites publishing non-consensual porn.
“We just need to make it more expensive, complicated, and less profitable to operate without accountability,” she said.
Li’s interest is both personal and professional: she was a Princeton student when a sexually explicit video of her as a minor appeared on Pornhub. She was hospitalized after a suicide attempt but found healing through creating Prune, which also offers free assistance to victims seeking removal of videos from pornographic sites.
Google is also complicit, playing a central role in the business model of companies that post non-consensual images. For example, a Google search for “choking porn videos” leads directly to such content. Google also directs users to at least one website monetizing trafficking victims’ abuse (unlike Google, this article will not name or promote that site).
There are gray areas: what about 18-year-olds who look much younger, attracting pedophiles? Or videos showing painful sexual acts where the participant accepted money or drugs? Artificial intelligence is revolutionizing pornography and raising new issues: what about realistic AI-generated depictions of sexually abused children?
It is unclear where the boundaries should lie. But to understand why trusting tech companies to self-regulate is a mistake, consider Rocky Franklin, a man from Alabama sentenced in 2022 to 40 years for sexually exploiting minors. A lawsuit alleges Franklin filmed abuse of a 12-year-old and posted videos on Pornhub, which accumulated 188,000 views.
The abuser was imprisoned, yet Pornhub profited from advertising revenue while distributing the abuse nationwide.
When individuals like Harvey Weinstein, Bill Cosby, or Jeffrey Epstein are credibly accused of abusing multiple women or girls, society responds with outrage and demands harsh criminal penalties, as it should. Yet when multinational corporations like Pornhub, backed by financiers and search engines, exploit countless victims, it seems accepted as business as usual.
As these Pornhub documents illustrate, one must ask: why have we allowed companies to get away with abusing children?