Briefly: App Store Liability

Introduction to the issue and NCMEC's position

What is it?

To facilitate the sale and distribution of software applications (“apps”) to users of modern computing and communication devices, various companies have launched online services, generally referred to as “app stores,” that enable developers to make their apps available to consumers. App stores typically have guidelines governing acceptable app features and design, yet apps that facilitate certain types of child sexual exploitation are sometimes approved for distribution.

NCMEC's Position:

App stores that distribute or facilitate the distribution of apps they know or reasonably should know violate the stores’ policies by uniquely enabling technology-facilitated child sexual exploitation should be subject to liability for the harm caused by these apps.

Why does it matter?

Children are sexually exploited by offenders using many different online platforms, apps, communication services, and devices. There are multiple points at which responsible actions should be taken by companies to better protect children from exploitation, including children who are not users of the companies’ products.

Operators of app stores not only create the marketplace for consumers to download and purchase various apps; they also position themselves as gatekeepers between app developers and consumers. That gatekeeping function includes creating and enforcing policies and standards about what types of apps are permitted for users of certain ages. When app store operators distribute apps they know or reasonably should know violate their own policies relating to child protection, they should be subject to shared liability for the resulting harm.

The issue in question here is narrowly focused on the responsibilities of app store operators. NCMEC has addressed civil liability for online platforms more broadly elsewhere.

What context is relevant?

Users can access a variety of services on various communication devices, including desktop and laptop computers, smartphones, and tablets, through either first-party apps (developed by the device manufacturer or operating system developer) or third-party apps (developed by another company). To facilitate distribution of apps, companies have created platform-specific services, portals, or stores, including the Google Play Store, Apple App Store, Amazon Appstore, Microsoft Store, and others. Each of these stores has published policies specifically prohibiting apps that include certain types of harmful content or functionality.

Collectively, millions of apps have been removed (after previously being approved) or rejected (upon initial review) by app store operators for violations of the stores’ policies. Apple’s App Review Guidelines prohibit “apps that may include pornography or be used to facilitate prostitution, or human trafficking and exploitation” (section 1.1.4) or “apps that solicit, promote, or encourage criminal or clearly reckless behavior…” (section 5). Google’s Play Console Child Endangerment policy prohibits “use of apps to promote predatory behavior towards children, such as…sexualization of a minor…”

Yet some apps that violate clear policies prohibiting exploitative, predatory, or harmful conduct towards children seem to have escaped timely review and removal, despite the presence of app store policies and review practices that should have prohibited the apps from ever being offered to the public.

In 2023, police in Spain investigated a case in which 20 girls were victimized by the creation and distribution of images depicting them unclothed, generated by an app that used artificial intelligence to alter existing innocuous images. Such an app would seem to violate the clear policies of major app stores. In 2024, Apple and Google each reportedly removed multiple apps after news reports of advertisements specifically promoting the apps’ ability to generate non-consensual “nude” images of adults and “nude” images of children, a use that violates the policies of both app stores.

What does the data reveal?

What have survivors said about it?

Survivors have expressed interest in seeing app store operators subject to liability in certain circumstances. Just as a restaurant or bar may be subject to liability for inappropriately serving alcohol, such as to patrons who are underage or already visibly intoxicated, app store operators should also face liability for distributing or facilitating the distribution of apps that the app store knows or reasonably should know violate their policies by enabling technology-facilitated child sexual exploitation.

While clear violations of existing policies are easily discovered, the lack of comprehensive policies, publicly available transparency reports, or enforcement mechanisms can conceal additional problems. Therefore, survivors have called for minimum standards for app store policies (e.g., requiring age verification/assurance measures at the store interface, rather than relying solely on parental controls set on the device).

Tech companies that choose to operate app stores should be held to the same standard of responsibility as any other company offering a product or service. When apps violate store policies and children are victimized as a result, the responsibility of those crimes does not end with the offender. Tech companies should not be able to profit off of apps distributed via their stores and then avoid liability when those app profits are generated by the abuse and exploitation of children.

- Survivor

What drives opposing viewpoints?

Major app store operators have policies and practices in place to screen submissions. Transparency reporting indicates that many app store operators have taken action against millions of apps for violating their policies. These operators might argue that the violating apps that escape rejection or removal are rare exceptions, and that their existing good-faith efforts to keep such apps out justify shielding app store operators from liability. Some also believe that responsibility for apps that exploit children should fall entirely on the developers of the offending apps and/or the individuals who misuse them to cause harm.