Online Consent Moderation

New Approaches to Preventing Proliferation of Non-Consensual Intimate Images

Key points:

  • A single federal law criminalizing creation and distribution of non-consensual intimate imagery (NCII) would be a welcome improvement over the current state-by-state patchwork of laws, but it should be crafted carefully to avoid unintended consequences.

  • A broader definition of NCII should provide clarity around gray areas, including the revocability of consent.

  • Payment providers' use of their market clout to shut down sites such as PornHub is unlikely to reduce the proliferation of NCII. Instead, these companies should consider using their significant power to push for increased industry safeguards.

  • A centralized NCII reporting, fingerprinting, and database system would be good for victims and the industry as a whole.

What is non-consensual intimate imagery?

Broadly speaking, NCII is sexual content distributed without the consent of the people depicted. Colloquially referred to with the problematic term “revenge porn,” NCII can have profoundly damaging psychological consequences for victims, due not only to the initial act of dissemination but also to its subsequent redistribution.

Pressure on the online adult entertainment industry to improve content moderation and prevent exploitation increased significantly after a recent New York Times article detailed instances of NCII being distributed on PornHub. Following the outcry over the article, PornHub’s parent company MindGeek announced policy changes that prevent video downloads (to limit redistribution) and restrict which users can upload content. PornHub also took down several million videos, leaving only those uploaded by verified users. Even so, payment providers Visa and Mastercard barred their networks from being used to pay for PornHub subscriptions and, by extension, to pay performers and content providers on the site.

Policymakers have weighed in: Senators Josh Hawley, R-Mo., Maggie Hassan, D-N.H., Joni Ernst, R-Iowa, and Thom Tillis, R-N.C., introduced legislation that would criminalize the knowing distribution of NCII and require websites that host pornography to develop a notice-and-takedown system for NCII similar to the one spelled out for copyright infringement in the Digital Millennium Copyright Act (DMCA). Because Section 230 of the Communications Decency Act has never shielded platforms from federal criminal law, online platforms would lose immunity for hosting NCII under this proposal. Groups affiliated with sex workers have also entered the conversation, saying that Mastercard and Visa’s actions put them at risk.

We believe that there are technological solutions that PornHub and the industry at large can implement to cooperate in the fight against NCII. Payment providers and policymakers should pressure adult sites to implement these solutions instead of attempting to demonetize adult sites in a piecemeal fashion that could put victims further at risk.

Non-Consensual Intimate Imagery Policy Debates: An Overview

Currently, NCII laws are a state-by-state patchwork: some states classify distribution of NCII as a misdemeanor, others as a felony, often with an exemption for “public interest.” Without these laws, the recourse for victims is typically only copyright enforcement via the DMCA, which is problematic given that victims often do not own the copyright to the media if it was recorded by another party. Even when victims do own the copyright, notice-and-takedown systems are by design quite burdensome on the copyright holder, an experience that can re-traumatize victims of abuse with every report they have to file.

The proposed "Survivors of Human Trafficking Fight Back Act" would increase penalties and centralize the current state-level legal patchwork. While well-intentioned, previous legal attempts to address online content linked to problems such as sex trafficking, for instance the 2018 FOSTA-SESTA laws in the United States, have had negative effects on vulnerable populations and did not necessarily lead to their intended effect. Laws that expose content platforms — even those that do not primarily host adult content — to liability can lead them to over-moderate or build filters with high error rates that disparately impact certain users. Furthermore, Congress walks a fine line with regard to the Fourth Amendment (as it does with proposed updates to child exploitation regulation) and must consider the other challenges raised each time it seeks to update laws governing online harms.

Adding to the complexity, this proposed bill raises additional challenges specific to NCII.

First, the proposed bill raises questions about the scope of what counts as NCII. Notably, it encompasses two different but potentially overlapping scenarios: distributing recordings of acts that were themselves performed non-consensually, and distributing content that may have been consensually produced but is shared without consent. Most current laws focus on distribution without consent, so including recordings of non-consensual acts would be a welcome expansion. For this change to be implemented, however, Congress must clarify what “consent” means in this context: for example, can consent to distribute a video be revoked? Can someone object to the distribution of images depicting them in sex acts that never happened? An estimated 96 percent of deepfakes are pornographic. Is there some degree of realism that would trigger a victim’s ability to sue?

Second, we must consider how this change would intersect with the above-ground adult industry. Should the law cover commercially produced content? Imagine an adult performer who appeared in video content under contract with a producer and who later states that some or all of the activity in the video was non-consensual or produced under duress, something that has unfortunately been documented in the adult entertainment industry several times. In such a case, would distribution of the resulting video fall under the proposed NCII legislation?

Finally, can a person appearing in intimate imagery set the terms of the distribution of that content? Let’s say a model distributes content via a subscription site such as OnlyFans or a private video show, but a viewer then uploads that content to a site such as PornHub without the creator’s consent. Currently, the model could request takedowns via DMCA whack-a-mole, the same as copyright holders of other pirated content. Under the proposed law, such uploads could be criminalized as the knowing, non-consensual distribution of intimate imagery, with potentially serious legal consequences for the uploader.

While a single federal law criminalizing creation and distribution of NCII would be a welcome improvement over the current fragmented approach, it needs to be more carefully crafted to avoid unintended consequences.

Influence or demonetize

Policymakers and payment processors must grapple with the fundamental question of whether they want to influence sites like PornHub to be more responsible or drive them underground, pushing them toward unregulated Bitcoin transactions as a payment mechanism. There is much more these sites can do to address the problem of NCII, should they choose to act. PornHub does have channels for reporting NCII, but victims struggle to remove known instances of exploitation there and from other adult video platforms where their abuse has proliferated. In response to the Times' story, PornHub took steps to disable downloads and prevent uploads from unverified users — something many adult performers had been requesting for some time. Its actions steered toward a licensed-content "Netflix" model instead of a user-generated "YouTube" model. This is undoubtedly an improvement, as largely anonymous uploads of adult material are a content enforcement minefield.

Public pressure brought about those changes. However, the payment providers, who were never likely to be held liable for NCII themselves, have overreacted in a harmful way by attempting to cut off cash flow to the company: doing so not only largely demonetizes PornHub, but also removes above-ground mechanisms for paying legitimate content producers.

For better or worse, MindGeek’s dominance in the industry offers users a mechanism to pay centrally for content instead of relying on piracy. If MindGeek were forced to stop operating or to significantly scale back, non-consensual pornography would not disappear from the internet, nor would demand for adult content diminish. Rather, users would upload content to a wider array of smaller sites, magnifying DMCA takedown challenges for victims and increasing barriers for legitimate content producers. Eliminating mainstream payment models and pushing adult sites underground will likely exacerbate the problem of NCII as a whole. Payment providers should instead focus on using their leverage to improve NCII enforcement across the industry.

Adult content moderation and (de)centralization

To fight NCII effectively across the adult industry, it is worth considering a clearinghouse model similar to the well-choreographed, collaborative process the technology industry uses to identify and report child sexual abuse material (CSAM). That process relies on technologies including machine learning models that detect nudity or genitalia and Microsoft’s PhotoDNA perceptual hashing. Currently, many companies use the same centralized PhotoDNA service to proactively scan for known CSAM and report it to the National Center for Missing and Exploited Children (NCMEC) for tracking and investigation.
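PhotoDNA itself is proprietary, but the underlying mechanics of perceptual-hash matching are straightforward. The sketch below is a minimal illustration that uses the open-source imagehash library as a stand-in (not PhotoDNA), with a made-up hash value and an arbitrary distance threshold: an uploaded image is hashed and compared against a database of hashes of previously reported material.

```python
# Minimal sketch of perceptual-hash matching. The imagehash library is an
# open-source stand-in for proprietary systems such as PhotoDNA; the known
# hash and the distance threshold below are illustrative only.
from PIL import Image
import imagehash

# Hypothetical database of hashes of previously reported material.
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1b1a1c1e1f101")}
MAX_DISTANCE = 8  # Hamming-distance threshold for declaring a match (tunable)

def is_known_content(path: str) -> bool:
    """Hash an uploaded image and compare it against reported hashes.

    Perceptual hashes change little under re-encoding, resizing or small
    edits, so a small Hamming distance suggests the same underlying image.
    """
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

Unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce nearby values, which is what allows a shared database to catch re-encoded or lightly edited copies.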

PornHub already participates in this process for CSAM. A similar, shared system could be devised for NCII: an API for querying and storing perceptual hashes of identified NCII video content, funded by the industry and with mechanisms for interfacing with law enforcement. It would allow users to report NCII once in a central location, instead of forcing victims to identify every single upload on thousands of sites. Ideally administered by a central entity independent from any particular content distributor, it could also facilitate proactive submission of material in conjunction with partner nonprofits. Similar to current DMCA enforcement, this entity would need to create a working definition of NCII (given that no stable one currently exists in law), receive reports from victims, and handle counter-notices from people who believe a report is mistaken.
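The precise interface would be up to whatever entity administers the system, but a rough sketch, with an invented base URL, endpoints, fields, and tokens, suggests how a victim-facing reporting tool and a participating site might interact with such a clearinghouse:

```python
# Hypothetical clearinghouse client. The base URL, endpoints, fields and
# tokens are invented for illustration; no such shared service exists today.
import requests

BASE_URL = "https://ncii-clearinghouse.example.org/v1"

def report_fingerprint(perceptual_hash: str, reporter_token: str) -> str:
    """Submit the fingerprint of reported NCII once; returns a case ID."""
    resp = requests.post(
        f"{BASE_URL}/reports",
        json={"hash": perceptual_hash, "media_type": "video"},
        headers={"Authorization": f"Bearer {reporter_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["case_id"]

def check_upload(perceptual_hash: str, site_token: str) -> bool:
    """Let a participating site ask whether an upload matches reported material."""
    resp = requests.get(
        f"{BASE_URL}/matches",
        params={"hash": perceptual_hash},
        headers={"Authorization": f"Bearer {site_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["match"]
```

Reporting once and querying everywhere is the point: the victim files a single report, and every participating site can check new uploads against it before they go live.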

Success would of course rely on content sites opting in to the system. However, given the pressure from payment providers, the market share of the large players in the space, and increasingly strict NCII legislation, companies’ incentives to mitigate NCII challenges grow daily.

While PhotoDNA has been immensely useful for combating CSAM, it does have deficiencies when it comes to NCII. Most obviously, it works primarily on static images — insofar as PhotoDNA can be used on videos, it relies on extracting screenshots.
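To make the screenshot-based approach concrete, the sketch below, with an illustrative sampling interval and thresholds, hashes sampled frames from an uploaded video and counts how many fall near known fingerprints; purpose-built systems such as TMK instead compute a descriptor over the whole video.

```python
# Minimal sketch of frame-sampling video matching: hash a frame every
# FRAME_INTERVAL frames and count matches against known image hashes.
# The interval and thresholds are illustrative, not tuned values.
import cv2
import imagehash
from PIL import Image

FRAME_INTERVAL = 30   # roughly one frame per second for 30 fps video
MAX_DISTANCE = 8      # per-frame Hamming-distance threshold
MATCH_FRACTION = 0.5  # fraction of sampled frames that must match

def video_matches(path: str, known_hashes: set) -> bool:
    cap = cv2.VideoCapture(path)
    sampled = matched = index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % FRAME_INTERVAL == 0:
            sampled += 1
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            frame_hash = imagehash.phash(Image.fromarray(rgb))
            if any(frame_hash - known <= MAX_DISTANCE for known in known_hashes):
                matched += 1
        index += 1
    cap.release()
    return sampled > 0 and matched / sampled >= MATCH_FRACTION
```

Frame sampling is easy to evade with crops, overlays, or re-timing, which is why dedicated video descriptors and audio fingerprints are worth combining with it.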

PhotoDNA’s use is also restricted: due to a variety of concerns, Microsoft, which owns the PhotoDNA intellectual property, limits its use to the multinational fight against CSAM. Privately implemented hashing databases have been used to expand this scope, to mixed reviews; Facebook, for example, piloted a program that let people proactively submit intimate images so that matching uploads to the platform could be blocked as NCII.

However, these technical challenges can be addressed. In 2019, Facebook open-sourced TMK, a mechanism for detecting video similarity that is analogous to PhotoDNA. Audio fingerprinting is also relatively well established and could be used as part of a combined approach. Developing these systems from scratch requires extensive engineering time and administrative and computational resources. PornHub provides takedown services for NCII, including fingerprinting, but site-by-site measures are insufficient. A centralized database, while it does concentrate private power, is the best way to keep known exploitative material from propagating online while offering a single point of appeal for both victims and those who feel their content has been removed in error.

Recognition of the creation and distribution of NCII as a serious form of internet-facilitated abuse has been building in recent years, and the focus on adult content aggregators as a distribution channel is warranted. The adult industry should take this opportunity to work together to meet the needs of NCII victims while protecting legitimate content producers. Policymakers and payment platforms should support this approach instead of rushing to enact legislation or sanctions, however well intentioned, that could have unintended consequences.