Legal definitions
In India, the definition of a “child” or “minor” is tied to specific statutes and varies considerably among them, with upper age limits ranging from 14 to 21 years depending on the statute. The legal age of consent for sexual activity is 18 years.
“Child pornography” is clearly defined and expressly includes computer-generated or modified CSAM, though the definition rests on the term “sexually explicit conduct,” which is itself not defined. “Child sexual exploitation,” “child sexual abuse,” “enticement or grooming,” and “sextortion” are not defined, but related conduct and concepts are addressed through other laws, largely within a broad “sexual harassment” provision.
Regulatory requirements/recommendations
In India, an online platform, called an “intermediary,” is required to make reasonable efforts “to not host, display, upload, modify, publish, transmit, store, update or share” certain information, including child sexual exploitation content. Some intermediaries (those meeting certain feature and user base thresholds) are further required to “endeavour to deploy” technology-based tools to detect known or previously classified child sexual exploitation content and other illegal information, and to apply human oversight to those automated tools. Persons, including intermediaries, are required to report exploitative material to the police. Intermediaries must additionally provide authorities with “necessary” material and information relating to the source of the CSAM and the details of the device from which it originated.
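The rules do not name a specific detection technology, but one common approach to finding “known or previously classified” content is matching uploads against a database of hashes of that content. The sketch below is purely illustrative: every name in it is hypothetical, it assumes such a hash database exists, and it uses exact cryptographic hashing (which misses re-encoded copies; deployed systems typically rely on perceptual hashing for that reason). Matches are routed to a human review queue rather than acted on automatically, reflecting the oversight requirement described above.

    import hashlib
    from pathlib import Path

    # Hypothetical database of hex digests of previously classified material,
    # e.g. supplied by a hotline or law-enforcement clearing house.
    KNOWN_HASHES: set[str] = set()

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def screen_upload(path: Path, review_queue: list[Path]) -> bool:
        """Flag an exact match for human review; a moderator, not the
        tool, makes the final removal decision."""
        if sha256_of(path) in KNOWN_HASHES:
            review_queue.append(path)
            return True
        return False

Queuing matches for a human decision, instead of deleting automatically, is one straightforward way to satisfy both the detection and the human-oversight obligations at once.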
Further, intermediaries have a responsibility to remove CSAM after being notified by a court or other authority. Intermediaries must inform users of these obligations through their user agreements and at least once a year.
Online platforms are not required to use any specific technology to detect, block, or remove CSAM.
Age verification requirements/recommendations
Online platforms generally are not required to implement any method of age verification before a user can access their services. However, online gaming and gambling companies, as well as other “regulated entities,” are required to implement age verification under certain financial sector “know your customer” regulations when access or services are based on financial payments.
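The KYC regulations prescribe outcomes rather than a technical mechanism. As a minimal sketch, assuming a date of birth already verified through a KYC process and an adulthood threshold of 18 years (both assumptions for illustration; all names are hypothetical), an access gate could compute the user’s completed age as follows:

    from datetime import date

    ADULT_AGE = 18  # assumption for this sketch; the applicable statute controls

    def age_on(dob: date, today: date) -> int:
        """Completed years of age as of `today`."""
        years = today.year - dob.year
        # Subtract one if this year's birthday has not yet occurred.
        if (today.month, today.day) < (dob.month, dob.day):
            years -= 1
        return years

    def may_access(dob_from_kyc: date, today: date | None = None) -> bool:
        """Gate access using a date of birth obtained through a KYC check."""
        today = today or date.today()
        return age_on(dob_from_kyc, today) >= ADULT_AGE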
Parental consent requirements/recommendations
“Data Fiduciaries” are required to obtain verifiable parental consent in order to process a child’s personal data. Similarly, online real-money gaming platforms are required to obtain parental consent. However, there appears to be no legal requirement or recommendation that all online platforms implement methods to obtain parental consent.
Legal remedies for child victims
There are legal remedies available to child victims in India. Intermediaries are required to remove or disable access to CSAM within 24 hours of an initial user report or within 36 hours of official court notice, and they must make users aware of these obligations through the platform’s rules and regulations. “Significant” social media platforms are required to use technology to proactively remove exploitative content.
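As a minimal sketch of these timing obligations, the following computes the latest compliance time from the moment a report or notice is received; the names and structure are hypothetical, and only the 24- and 36-hour windows come from the requirements described above.

    from datetime import datetime, timedelta

    # Windows taken from the removal obligations described above.
    WINDOWS = {
        "user_report": timedelta(hours=24),
        "court_or_government_notice": timedelta(hours=36),
    }

    def removal_deadline(received_at: datetime, notice_type: str) -> datetime:
        """Latest time by which content must be removed or disabled."""
        return received_at + WINDOWS[notice_type]

    # Example: a user report received at noon must be actioned by noon the next day.
    assert removal_deadline(
        datetime(2024, 1, 1, 12, 0), "user_report"
    ) == datetime(2024, 1, 2, 12, 0)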
Victims may apply for temporary injunctions or other preliminary relief, which courts may grant at their discretion. Courts can issue other orders, such as protection orders, as they deem necessary. There is also a special judicial process for online crimes.
Child victims may seek compensation from offenders in addition to criminal penalties, and a court may award such compensation on its own motion, without an application from the child. Victims may also apply to the state for compensation if authorities cannot identify an offender; compensation rules vary by region. Child victims and their parents or guardians are entitled to notification by authorities on matters including available support, the offender’s arrest, procedural steps, and scheduled court proceedings, among others.
"Safety by Design" requirements
Online platforms in India are obligated to implement certain Safety by Design standards only if the platform is a “significant” social media platform or an online gaming platform. Such a platform must implement these standards prior to its launch.
A platform that does not follow these standards may be liable for non-compliance. Online gaming platforms have additional transparency requirements.