The National Commission for Protection of Child Rights (NCPCR) has written to the IT ministry and the information and broadcasting ministry, urging them to ensure that social media platforms comply with the Digital Personal Data Protection (DPDP) Act, especially the provisions aimed at protecting children.

This letter comes in the backdrop of a meeting the child rights body held with social media companies in August, as earlier reported by Moneycontrol. Platforms that attended the meeting included X (formerly Twitter), Instagram, WhatsApp, YouTube, Snap, Reddit and Bumble.

Moneycontrol understands that the body has written two letters: one to the Ministry of Electronics and Information Technology alone, and a second to both the IT ministry and the Information and Broadcasting ministry, as well as to social media firms.

In the second letter, sent on September 20 and also reviewed by Moneycontrol, the NCPCR said, "For the safety and security of children on platforms, the platforms must follow robust know your customer (KYC) procedures and sec 9 of DPDP Act which is an Act passed by the Parliament of India and we all have to maintain its sanctity."

In the letter, the NCPCR has asked the ministries and the social media platforms to submit an action-taken report to the commission within seven days of issuance of the letter.

The DPDP Act defines a child as someone below the age of 18, and Section 9 of the legislation mandates that platforms verify the age of such users and obtain parental consent before processing their data.

Also read: Educational, health institutions may be exempted from restrictions on children’s data processing

This development comes even as the IT ministry has reportedly "solved" the children's age-verification issue in the DPDP Rules. The rules are expected to be released for consultation by the end of this month.

While the DPDP Act, 2023, introduces a legal framework for the protection of personal data in India, the rules formulated under this Act will detail how the provisions of the Act are implemented and enforced.

What the letter said

The letter said that platforms are obligated to report instances of child sexual abuse material (CSAM) directly to local law enforcement, in accordance with Section 19 of the POCSO Act.

Platforms must obtain explicit parental or legal guardian consent before minors can engage in any online contracts, it added.

"Social media platforms must display disclaimers in English, Hindi and local/regional languages before showing any adult content...," the letter said, adding that the disclaimers should also notify parents of their legal responsibilities under the POCSO Act and the Juvenile Justice Act.

Platforms are required to report data related to child exploitation cases to the National Center for Missing and Exploited Children (NCMEC), including image or video hashes and other metadata, for the period between January and June 2024, it read.

Children's age verification issue

On July 18, the Ministry of Electronics and IT (MeitY) held a meeting with social media platforms to discuss the ongoing challenge of identifying a foolproof method for verifying children’s age under the DPDP Rules.

Despite previous efforts to implement systems using Aadhaar or Digilocker, these methods were deemed impractical, according to sources.

During the meeting, it was decided that MeitY would not impose a specific method for age verification. Instead, the industry was tasked with developing and submitting its own solutions. Over the following two weeks, industry representatives sent in various submissions.

Last week, government sources confirmed that MeitY convened a review meeting on the DPDP Rules, where they finalised a mechanism for verifying children’s age.