What is it?
Exploitative content refers to imagery that may not be considered unlawful child sexual abuse material (CSAM) but which nonetheless violates the privacy of depicted children and/or is used in a sexually exploitative manner.
NCMEC's Position:
Platforms that moderate content (beyond removing plainly illegal CSAM) should prioritize child protection by restricting the distribution of legal images that violate children's privacy.
Why does it matter?
Criminal laws around the world prohibit the creation, possession, and distribution of images clearly depicting the sexual abuse and exploitation of children. However, other types of content not subject to those laws can be used in sexually exploitative ways. When children’s privacy and dignity are violated—or criminal activity is promoted—through such imagery, child safety is compromised, and children may be harmed.
What context is relevant?
Imagery considered sexually exploitative but not illegal can come from a wide variety of sources; regardless of its origin, offenders use such material for sexual purposes. Even images or videos created with no malicious intent can lead to harm.
Perhaps the most harmful imagery of this type is non-explicit imagery associated with known and documented “series” of CSAM. These images and videos—which depict a child known to have been sexually abused and exploited through CSAM creation—are not themselves sexually explicit. They may show an identifiable child’s face, distinguishing marks, and/or clothing consistent with what is visible in the associated CSAM. They may also depict the same locations and other elements present in the associated CSAM. Offenders circulate these non-explicit images to signal information to other offenders, such as specific interests or which CSAM of that child is available for distribution, without exposing the distributor or possessor to the consequences that may come from openly sharing the associated CSAM. Distributing such images and videos can also exert coercive influence or control over survivors.
Some images of children are created with a sexually exploitative purpose, even though the content falls short of crossing legal thresholds into criminality. Often promoted as merely “child modeling,” these images may sexualize children through provocative clothing, poses, or props.
Other types of imagery with completely innocent origins can also be used in sexually exploitative ways. Photos and videos created by parents—or by children themselves—to document social activities, athletic achievements, or even daily life can be misappropriated by sexually motivated offenders and used for malicious purposes, just as CSAM is used. It is not uncommon for law enforcement to find, within collections of clearly illegal CSAM seized from offenders, images originally created for innocent purposes. These might include depictions of children dressed in physically revealing or very tight athletic attire, such as swimwear or leotards for dance or gymnastics, and other images that might be sexualized by offenders. While this content is not sexual, some offenders may collect, share, or otherwise use such images with sexually exploitative intent, for example to seek sexual gratification from viewing the imagery.
Parents who routinely share personal information and images of their children via social media have been criticized for “sharenting,” which a 2023 study in Türkiye (Turkey) found could contribute to child abuse. Pediatricians in Argentina have been advised to address the harms of “sharenting”—including the possibility that publicly shared images of children could end up on CSAM-focused sites and forums—with parents and children during medical visits.
In 2018, Child Rescue Coalition (a United States-based non-governmental organization) launched an awareness and prevention campaign using the Instagram account @KidsForPrivacy to disrupt the “overexposure” of children through the distribution of legal images that could violate children’s privacy.
Many tech platforms, particularly social media and social networking services, already use a variety of content moderation strategies to detect, remove, and report illegal content, including CSAM. Similar efforts can be made to restrict the distribution of legal images that threaten the privacy and safety of children and contribute to child sexual exploitation. These measures need not result in a ban on such images, although platforms typically have the right to set such limits. Relevant restrictions might be achieved by:
- Disrupting the viral spread of images that may be legal but sexually exploitative;
- Displaying a warning or guidance to users posting images that appear to depict children in compromising situations;
- Limiting the visibility of content posted by young users to prevent public/unrestricted access;
- Adding protections against on-platform redistribution of content within private groups created for family communication; and/or
- Using cryptographic and perceptual hashing to detect and remove non-explicit images associated with known CSAM series, as sketched below.
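To make the last strategy above concrete, the minimal sketch below shows one way a platform could check an uploaded image against a shared hash list, pairing an exact cryptographic hash (SHA-256) with an open-source perceptual hash (the `imagehash` library’s pHash). The hash values, file name, and matching threshold are placeholders for illustration only; production systems typically rely on purpose-built perceptual hashes such as PhotoDNA or PDQ and on hash lists supplied under formal sharing agreements, none of which are represented here.

```python
import hashlib

from PIL import Image   # pip install Pillow
import imagehash        # pip install ImageHash

# Placeholder hash lists standing in for values a platform might receive
# through a hash-sharing program; these are NOT real hashes of any image.
KNOWN_SHA256 = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}
KNOWN_PHASHES = [imagehash.hex_to_hash("8f373714acfcf4d0")]

# Maximum Hamming distance treated as a perceptual-hash match; the right
# value is a precision/recall trade-off each platform must choose.
PHASH_MATCH_THRESHOLD = 6


def matches_known_hashes(image_path: str) -> bool:
    """Return True if the file matches a listed hash exactly (SHA-256)
    or approximately (perceptual hash within the Hamming threshold)."""
    with open(image_path, "rb") as f:
        if hashlib.sha256(f.read()).hexdigest() in KNOWN_SHA256:
            return True  # byte-identical copy of a listed file

    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= PHASH_MATCH_THRESHOLD
               for known in KNOWN_PHASHES)


if __name__ == "__main__":
    # Hypothetical upload path; a real pipeline would run this check at
    # upload time and route matches to review or removal.
    if matches_known_hashes("upload.jpg"):
        print("Match found: handle per platform policy.")
```

The exact hash catches only byte-identical copies of a listed file, while the perceptual hash tolerates resizing and re-encoding at the cost of a tunable false-positive rate, which is why the strategy above pairs the two.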
In response to reports of legal but sexually exploitative material and predatory text—such as sexual comments or personal information about an identified child victim—NCMEC routinely sends notices to online platforms. Because this material is generally not illegal, online platforms have discretion over whether and how to respond to NCMEC’s notices.
NCMEC also shares hashes of legal but sexually exploitative material through its Exploitative Hash-Sharing Initiative. Hashes added by NCMEC to this list are derived solely from images and videos reported to NCMEC’s CyberTipline by online platforms. The hashes are made available to participating online platforms for use in detecting exploitative material and removing it from their platforms.
NCMEC’s “Take It Down” service, available in more than 30 languages, can help people who (1) were under age 18 when nude, partially nude, or sexually explicit images or videos of them were taken, and (2) believe those images have been or will be shared online. More than ten online platforms, including some of the most popular and commonly used services, participate in Take It Down.
Online platforms have a legitimate business interest in promoting their services as safe for children and hostile to sexually exploitative conduct. Where legal to do so, companies could pursue voluntary initiatives to restrict the distribution of sexually exploitative content without the need for legal or government mandates.
What does the data reveal?
In 2023, NCMEC sent more than 6,000 notices about exploitative material (images and videos associated with known CSAM) and predatory text to online platforms, which removed the reported content after about four days, on average. Notably, nearly 140 notices across both categories received no response from the notified platforms.
In 2024, the number of hashes NCMEC contributed to the Exploitative Hash-Sharing Initiative surpassed 315,000, and 18 electronic service providers (ESPs) have voluntarily chosen to access this list.
What have survivors said about it?
Survivors recognize the tension inherent in this issue, which involves the malicious misappropriation of innocent imagery, and they caution against approaches that could be interpreted as “victim blaming.” Children, parents, and others should have the right to live and document their lives without prohibitions on freedom of expression. However, online platforms—which are in the business of tech innovation—should use their expertise and resources to make users’ interactions on their platforms safer by mitigating the risk of certain predatory behaviors.
Some survivors advise that businesses selling or marketing children’s underwear, swimwear, or leotards should not use child models to do so, because of the risk that those legitimate images could be used in exploitative ways. Others have noted that some platforms or apps—such as those used to administer academic tests—can prohibit (or at least obstruct) efforts to download, save, or screen-capture displayed content, and they suggest adopting similar strategies to protect certain images from rapid redistribution.
We all share a duty to protect children from harm. Unfortunately, there are those who nefariously use images of children in various states of undress for their own benefit and gain. We must be aware and mindful of the creation and distribution of such images, even if it seems to be of innocent intent.
- Survivor
What drives opposing viewpoints?
Aside from offenders who maliciously possess and distribute images and videos, there are two key perspectives that might oppose demands for online platforms to restrict the distribution of images that may violate the privacy of children.
Online platforms themselves may be concerned about the impact on user experience and the potential loss of users to platforms that do not impose such restrictions. Additionally, developing and adopting any new features carries financial costs, and companies may prefer to invest resources in other aspects of their businesses. There may also be concerns about the reliability of age-estimation technologies used to help distinguish images depicting adults from those depicting children.
Proponents of unlimited free expression may see such restrictions as infringing on users’ rights to engage in lawful behavior, given that the images in question are generally not classified as illegal. Some parents or children who are motivated to build an online audience—whether for personal or commercial gain—might balk at measures that prevent social media content from “going viral” or otherwise being widely distributed.