
WhatsApp groups for child sex abuse videos are created daily: Indian cyber specialist

Nitish Chandan, a cyber security specialist, says that despite his shocking revelations about child sexual abuse material circulated through WhatsApp, measures to stop it are still lacking.

Written by: Geetika Mantri

A few months ago, Nitish Chandan, a cyber security specialist, revealed shocking observations from his investigation into WhatsApp groups where child sexual abuse material (CSAM), that is, media depicting the sexual abuse of minors, is shared. He reported his findings to the Ministry of Home Affairs (MHA) and to WhatsApp. And while the apps disseminating links to join these WhatsApp groups have since been removed by Google and Apple from their app stores, Nitish has found in a follow-up study that such WhatsApp groups are still being created, and continue to be easy to access.

Nitish, founder of The Cyber Blog India and project manager at Cyber Peace Foundation, tells TNM that while the apps themselves have been removed, their APK files are still available online, just a web search away. This essentially means that anyone can install the apps on their phones using files with the .apk extension found online. And on the apps where invite links for such WhatsApp groups are listed, CSAM is being sought and solicited openly, at least going by the group names and descriptions.

“Some of the groups are spam that solicit people to join with links to CSAM in their group description itself. These links are mostly found loaded with malicious content and not CSAM,” the report says. Nitish explains that this malicious content could include pornography websites, links that trigger forced downloads, malware, or phishing attempts.

That said, most of these groups have display pictures that show obscene or sexual activity, often involving minors. In some of the screenshots Nitish shared with TNM, the groups are openly named “smallgirl xxx”, “I have young porn”, “lovely kid” (which has a display picture of what looks like a child being sexually assaulted), and “small young xx” (which has on display a collage of screenshots of pornographic and/or CSAM videos). The description of another group clearly mentions “categories” like “sister uncle”, “schoolgirl” and “father anti sex”.  

What is alarming is that some of the groups reach their full member capacity as soon as they are created, meaning that even if CSAM is ultimately not shared in them, it is being openly sought by hundreds of people.

For instance, Nitish joined one group for the purpose of this investigation. The admin kept demanding that members send “CP” (child porn). “When people did not respond to him on the group, I think he started messaging them personally, including myself. When I asked him why he was asking for CSAM only, he said, ‘yehi chahiye bas mujhe’ (this is all I want),” Nitish narrates.

Then, to prove that he had CSAM he would share on the group, the admin sent a screenshot of his phone gallery, which showed what appeared to be minors engaged in sexual activity.

“Many of these groups lay emphasis on the fact that only “Child Pornography” videos are to be shared and no other links or anything. The disclaimers in group descriptions say that if someone doesn’t send a specified number of videos daily, they will be removed from the group,” Nitish’s report says. “There are several groups that also solicit physical contact with both children and adults at a price with things like coverage areas, timing etc.”

Nitish says that he reported these groups to the MHA, the National Commission for Protection of Child Rights (NCPCR) and WhatsApp about 50 days ago. While he has received acknowledgement from all three, many of these groups are still active. “These are separate from the groups that are coming up every day. And while many groups are being revoked and deleted regularly, it’s difficult to say if it’s because authorities are cracking down on them. Many times, the admins of the group themselves change the invite link of the group, or change its name to say that it’s being deleted. They may then create a new group with more select members who have actively shared CSAM,” Nitish observes.

Potential solutions and challenges

According to Nitish, there are tools and technologies available to screen groups based on their profile photos and names, considering that conversations on WhatsApp are encrypted. “We can start there at least because the group icons and names themselves are a giveaway. There is tech that can be programmed to, say, flag content as inappropriate based on the amount of skin being shown in the display picture along with certain words. This, along with manual screening, can at least be a starting point to stop this,” he says.
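As a rough illustration of the kind of screening Nitish describes, the sketch below combines a keyword check over group names and descriptions with an image-based score for the display picture. The term list, the skin_score() stub and the threshold are placeholders invented for this example, not an actual moderation system; a real deployment would use a trained image classifier and a curated term list, and route flagged groups to manual review.

```python
# Illustrative sketch only: two-signal screening of WhatsApp group metadata.
# Flagged terms and the skin_score() stub are placeholders for this example.

FLAGGED_TERMS = {"placeholder_term_a", "placeholder_term_b"}  # hypothetical list

def keyword_hit(name: str, description: str) -> bool:
    """True if any flagged term appears in the group's name or description."""
    text = f"{name} {description}".lower()
    return any(term in text for term in FLAGGED_TERMS)

def skin_score(display_picture_path: str) -> float:
    """Placeholder: a real implementation would run an image classifier
    estimating how much exposed skin the display picture shows."""
    return 0.0  # stub value

def should_review(name: str, description: str, dp_path: str,
                  skin_threshold: float = 0.6) -> bool:
    """Flag a group for manual review if either signal trips."""
    return keyword_hit(name, description) or skin_score(dp_path) > skin_threshold
```

In this kind of setup, the automated check only shortlists groups; the final call, as Nitish suggests, would still rest with human reviewers.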

“Further, the WhatsApp group invite links that these apps show may not always be actually within the app. The apps may be pointing to links posted on websites, forums etc. and can be from multiple sources as well,” says Nitish, explaining why it may be difficult to eradicate these groups fully by just removing one link.

When asked if there was any difference from the time of his last investigation, apart from the apps providing these group links being removed from app stores, Nitish says that it’s difficult to say. “There is no baseline for this sort of activity, so we cannot say if it has increased or decreased. But what is evident is that such groups are being created, being sought and deleted daily. Law enforcement is also facing an issue because they either cannot get into these groups as they have reached full capacity, or are kicked out when they don’t share CSAM.”

What is WhatsApp doing?

WhatsApp presently scans unencrypted information, such as group display pictures and group descriptions, to identify child exploitative imagery. It also uses a common database of known harmful content that relies on a technology called PhotoDNA.

If a user or group profile matches an entry in the PhotoDNA database, WhatsApp bans the uploader and all group members. Similarly, if a profile photo is reported, it is added to the database, the account is banned, and its information is reported to the US-based National Center for Missing and Exploited Children (NCMEC) for coordination with law enforcement.

It appears that banning offending users is the most severe penalty WhatsApp imposes, and it reports offenders only to NCMEC, not to local law enforcement. However, in February this year, India signed an MoU with the US which would allow the National Crime Records Bureau to access reports available with NCMEC in order to remove CSAM and child exploitative imagery.

According to the PhotoDNA website, “PhotoDNA creates a unique digital signature (known as a “hash”) of an image which is then compared against signatures (hashes) of other photos to find copies of the same image. When matched with a database containing hashes of previously identified illegal images, PhotoDNA is an incredible tool to help detect, disrupt and report the distribution of child exploitation material.”

PhotoDNA was developed by Microsoft in partnership with Dartmouth College, and was later donated to NCMEC. It is used by several organisations to combat child exploitation.
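For readers curious about what this hash-and-compare approach looks like in practice, here is a minimal sketch using the open-source imagehash Python library as a stand-in; PhotoDNA itself is proprietary and considerably more robust against image alterations. The file paths and the distance threshold below are assumptions made only for the example.

```python
# Rough illustration of hash-based image matching (not PhotoDNA itself).
from PIL import Image
import imagehash

# Hashes of previously identified images (in practice, a large curated database).
# File paths here are placeholders.
known_hashes = [imagehash.phash(Image.open(p)) for p in ["known1.jpg", "known2.jpg"]]

def matches_known(image_path: str, max_distance: int = 5) -> bool:
    """Return True if the image's perceptual hash is within max_distance
    (Hamming distance) of any hash in the known database."""
    h = imagehash.phash(Image.open(image_path))
    return any(h - known <= max_distance for known in known_hashes)

if matches_known("uploaded_profile_photo.jpg"):
    print("Match found: flag account for banning and reporting")
```

The key point, reflected in both this sketch and Microsoft's description, is that only compact signatures of known images are compared, so new uploads can be checked against a database without the service ever needing to view message contents.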

When TNM reached out to WhatsApp over Nitish's report, a WhatsApp spokesperson said, "WhatsApp cares deeply about the safety of our users and we have no tolerance for child sexual abuse. We rely on the signals available to us, such as group information, to proactively detect and ban accounts suspected of sending or receiving child abuse imagery. We have carefully reviewed this report to ensure such accounts have been banned from our platform. We are constantly stepping up our capabilities to keep WhatsApp safe, including working collaboratively with other technology platforms, and we'll continue to prioritize requests from Indian law enforcement that can help confront this challenge."

(Screenshots courtesy Nitish Chandan)

(This story has been updated with WhatsApp's response.)
