
Briefly: Online Platforms' Responsibilities

Introduction to the issue and NCMEC's position

What is it?

Online platforms operate across jurisdictions—most of which do not significantly regulate their conduct—and engage children from varied social, economic, linguistic, and cultural backgrounds. Some online platforms apply different policies, services, and practices relating to access to content, parental controls, and content moderation, among other items, based on where users are located. These differences are only partially based on different regional and local legal requirements and expectations.

NCMEC's Position:

Online platforms that allow children to create accounts or profiles, send/share content, or communicate with other users should offer certain features, including:

  • documentation in the user country’s official or common languages;
  • simple mechanisms for reporting abuse/misconduct on the platform;
  • accessible and effective notice and takedown procedures; and
  • reasonably standardized safety measures across jurisdictions.

Why does it matter?

While specific risk factors may vary, children around the world are vulnerable to harms encountered or facilitated through online platforms. For some reasons that are clear, such as compliance with regulatory obligations, and perhaps for others that are less obvious, such as profit-seeking, online platforms sometimes offer or withhold certain safety-related features depending on the market.

For example, the default privacy settings of a user’s account can impact online safety. If a child’s online activity and friends/followers/connections lists are public by default, viewable by any other user, predatory offenders have access to information that can be used to exploit children. Offenders’ use of such information has been frequently observed in cases of financially motivated sextortion. Visibility into a child’s online contacts gives offenders the leverage they need to threaten broad distribution of exploitative images, causing emotional distress that has, in some cases, resulted in victims’ deaths by suicide.

Strictly private default settings for children’s online accounts, which online platforms can establish, could help mitigate that risk.
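To make the mechanism concrete, the brief sketch below is purely illustrative: the names, the age threshold, and the visibility values are all invented for this example, not drawn from any platform’s actual system. It shows how an account-creation flow could apply private-by-default visibility to minors uniformly, independent of the user’s region:

```python
from dataclasses import dataclass

ADULT_AGE = 18  # assumed threshold; definitions of "minor" vary by jurisdiction


@dataclass
class Account:
    age: int
    region: str
    visibility: str


def default_visibility(age: int, region: str) -> str:
    """Return the default account visibility.

    'region' is deliberately ignored so the same private-by-default
    protection applies in every market, not only regulated ones.
    """
    if age < ADULT_AGE:
        return "private"  # minors: activity and contact lists hidden by default
    return "public"       # adults can still change this setting later


def create_account(age: int, region: str) -> Account:
    return Account(age=age, region=region,
                   visibility=default_visibility(age, region))


# A 17-year-old receives a private account regardless of sign-up location.
assert create_account(17, "EU").visibility == "private"
assert create_account(17, "US").visibility == "private"
assert create_account(30, "US").visibility == "public"
```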

What context is relevant?

In an analysis conducted by Fairplay with ten other organizations, researchers found geography-based disparities in the safety-related features offered by the online platforms they examined.

Published in 2022, that study highlighted, for example, that platforms set default public or private visibility settings differently based on a user’s location. In three European jurisdictions evaluated, Instagram defaulted to “private” for accounts the researchers created as 17-year-old users. In 11 other jurisdictions across the world (including one in Europe), the user was prompted to choose between public and private settings. Similarly, TikTok defaulted to “public” for new accounts in jurisdictions outside Europe. In European jurisdictions, users were prompted either to proactively select “private” or to skip the selection, in which case the account defaulted to “public.”

These differences were attributed to existing or anticipated regulatory requirements in the EU and United Kingdom, while online platforms were largely unregulated in the other regions. Fairplay criticized the general industry preference for public accounts as more aligned with companies’ commercial interests than the best interests of children, who would benefit from increased privacy and safety by default.

Topics assessed in that study that showed geography-based disparities include:

  • “age-appropriate experiences” being limited to certain jurisdictions;
  • stronger data privacy practices in some jurisdictions than in others;
  • the availability of documentation and user interface elements in official languages of jurisdictions in which services are made available;
  • default privacy settings for young users in some jurisdictions but not in others;
  • minimum age requirements, highlighting internally inconsistent messaging about jurisdiction-specific rules; and
  • accessibility of online platforms’ policies, guidelines, and support features in only some official languages of jurisdictions where their services are available.

In Australia, the Online Safety Act included the establishment of Basic Online Safety Expectations (BOSE) and empowered the eSafety Commissioner to require reporting from online platforms about their progress toward compliance with the BOSE. In response to eSafety’s regulatory demands for transparency reporting, several online platforms have submitted information about a variety of online safety topics.

In 2022 and 2023, eSafety required reporting from designated online platforms about their efforts to counter online child sexual exploitation and abuse. In 2024, eSafety initiated the first periodic reporting from selected online platforms, which are now required to report every six months about their progress implementing the BOSE. This regulatory framework applies only to the services as made available to Australian users, but the transparency reports can be useful in identifying where geography-based differences in online safety practices exist.

What have survivors said about it?

Survivors express confusion about why online platforms do not already standardize features and safety measures across jurisdictions. They also wonder what other factors (gender, race, etc.) impact safety-related differences in how online platforms treat children.

Online platforms should not bury privacy settings, support articles, abuse reporting processes, or safety policies under cumbersome, multi-step navigation or behind complicated legal jargon. To increase engagement with and understanding of important safety information and processes, platforms should present them in formats that are easy to find and relevant to users. For example, on a platform primarily intended for video interaction, safety and policy information should be presented in video as well as written documentation.


The rules, regulations and responsibilities of [online platforms] should be standardized and there should be no differentiation from country to country.

- Parent of a CSAM Victim

This responsibility undoubtedly falls on online platforms: to guard youth with the best safety and privacy standards globally and across all services.

- Survivor

What drives opposing viewpoints?

Online platforms may prefer default public settings for user accounts for several reasons, such as promoting user interaction, aiding content discovery, enabling data collection to bolster targeted advertising, expanding networks to grow the user base, and monetizing content. Additionally, private accounts may require more resources to moderate and to ensure privacy and security expectations are met.