Little awareness, lax policies: Why child sex abuse photos, videos persist online

It’s clear that one no longer needs to go on the dark web to solicit and share child sexual abuse material (CSAM).

Written by: Geetika Mantri

Last year, Facebook said it removed 8.7 million sexually exploitative images of children in three months. In May this year, Twitter announced that it had suspended 4,58,989 accounts for violations related to child sexual exploitation on its platform. And six months ago, WhatsApp said it had removed 1,30,000 accounts in 10 days for distributing child sexual abuse material, often referred to as child pornography.

It’s clear that one no longer needs to go on the dark web (the part of the World Wide Web that is not indexed by search engines and cannot be accessed using ordinary browsers) to solicit and share child sexual abuse material (CSAM). In fact, India is one of the biggest contributors and consumers of CSAM, even though it is completely illegal.

Creating CSAM (whether real or simulated) as well as storing it for commercial purposes is illegal under the Protection of Children from Sexual Offences (POCSO) Act (sections 13-15). Further, Section 67B of the Information Technology Act bars publishing and sharing material depicting a child in a sexually explicit act in electronic form. Browsing, downloading, advertising, promoting, exchanging and distributing such material in any form is also prohibited under the Act. The maximum punishment for these offences is seven years’ imprisonment.

So, what causes people to openly seek out and share CSAM?

Technological intervention

Nitish Chandan, a cyber security specialist, points out that while it can be difficult to determine the age of older children depicted in CSAM on these social media platforms, companies are using technology and AI to identify and remove such material, especially when it involves younger children. Further, Siddharth Pillai of Aarambh, an online portal which works on online safety and combats CSAM online, adds that ambiguity arises when the thumbnails of videos or images are merely suggestive. “When it comes to explicit CSAM or imagery featuring children, most platforms have a zero tolerance policy and respond to reports promptly,” Siddharth observes.

TNM has spoken in the past with WhatsApp and Telegram about the issue of CSAM on their platforms. Telegram gives users a dedicated option to report a group for containing CSAM. WhatsApp, on the other hand, does not let users select a reason when reporting, but it does use a technology called PhotoDNA, like its parent company Facebook, to identify sexually exploitative imagery of children.

According to its website, “PhotoDNA creates a unique digital signature (known as a “hash”) of an image which is then compared against signatures (hashes) of other photos to find copies of the same image.” Essentially, WhatsApp scans unencrypted information, such as display pictures and group information, to identify child exploitative imagery. If a user or group profile matches an entry in the PhotoDNA database, WhatsApp bans the uploader and all group members.
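In rough terms, the hash-and-compare workflow described above can be sketched as follows. This is an illustrative simplification, not PhotoDNA itself: PhotoDNA uses a proprietary perceptual hash designed to survive resizing and re-compression, whereas the sketch below uses an ordinary cryptographic hash, and all names in it are hypothetical.

```python
# Illustrative sketch only. PhotoDNA is a proprietary perceptual hash;
# SHA-256 is used here purely to show the "compute a signature and
# compare it against a database of known signatures" workflow.
import hashlib
from pathlib import Path

# Hypothetical database of signatures (hashes) of previously identified
# images; in practice this would be populated from a shared hash list.
KNOWN_HASHES: set[str] = set()


def image_signature(path: Path) -> str:
    """Compute a digital signature (hash) of an image file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_known_image(path: Path) -> bool:
    """Return True if the image's signature matches an entry in the database."""
    return image_signature(path) in KNOWN_HASHES
```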

Similarly, if a profile photo is reported, it is added to the database, the account is banned, and its information is reported to the US-based National Center for Missing and Exploited Children (NCMEC) for future coordination with law enforcement. This February, India signed an MoU with the US which would allow the National Crime Records Bureau to access reports available with NCMEC in order to remove CSAM and child exploitative imagery.

However, Nitish points out that there are not as many uploads to the PhotoDNA database from India as there are from Western countries like the US. “Here, if online CSAM is discovered, law enforcement still does not have a mandate to consult or upload to PhotoDNA. Neither does India have its own repository of images.”

Further, if any of these platforms find CSAM or child exploitative images, there is no mandate for them to find out if it originated in India or even inform Indian law enforcement about it, Nitish adds.

Demand for CSAM due to lack of awareness, deterrence

Siddharth and Nitish both agree that, like everything else, CSAM continues to persist because of its high demand. In his research, Nitish has found that people don’t even bother to mask their numbers or profile photos while asking for “cp” or child porn in WhatsApp and Telegram groups.

It appears that people are unaware that soliciting, downloading and storing CSAM is also illegal. “If people were aware, they wouldn’t be using their real numbers starting with +91, and displaying their faces and personal identifiers in the display photos. It’s likely that they feel they aren’t doing anything wrong by downloading, viewing or forwarding because they are not harming the child,” Nitish says.

Further, the demand for CSAM is very real. Aarambh, which allows people to report CSAM on its website, found that in 70% of reported cases, the videos clearly featured children below 10 years of age. “There was no longer an excuse that the kids looked older, and people could have mistaken them for being above 18,” Siddharth says.

We also know very little about the identity of the people who are creating the demand for CSAM, he adds. “Who are the ones creating and sharing most CSAM? Who are the ones demanding it? Are they aware of the law? We know very little,” Siddharth observes. Nitish believes that understanding who these people are could help, but it would have to be a government-commissioned investigation and study involving law enforcement. That being said, simply raising awareness is not enough; there has to be deterrence, he argues.

“In the US, once the NCMEC is informed of a user who is sharing CSAM, they inform the local law enforcement and then action is taken. Why can’t we have something like that here?” he says. Acting on such a tip, if law enforcement decides to investigate even one group sharing CSAM, the resulting news coverage would make people aware and act as a deterrent, even if the action taken against those who merely solicit and download CSAM, as opposed to those who create and supply it, is not very harsh.

It also doesn’t help that companies’ only focus is weeding out CSAM and child exploitation content from their own platforms. “All that happens is that the platforms remove those images and videos and/or suspend those accounts. But where’s the accountability and deterrence there? People can simply move on to other platforms which offer more anonymity, more privacy. Cleaning up one website is not an answer,” Siddharth argues.

Privacy vs intervention

A debate that keeps surfacing on issues such as these is the importance of maintaining digital privacy vis-à-vis government intervention to address social issues such as CSAM. Especially with WhatsApp, Telegram and other apps offering end-to-end encryption, the content of messages cannot be intercepted unless a recipient reports it.

While there are no easy answers to this, Nitish says that if there has to be government intervention, there has to be due process. “The question of the trade-off arises because the government is far more powerful. But if interception has to happen at some point, the questions to consider are: what are the probable causes for it? Which agencies can be authorised? And what are the process and circumstances under which it can be done?”
