Severe child sexual abuse material online has more than doubled since 2020

IWF Report

James Gerald Jr., 13, plays a game on a phone in his family’s house in St. John’s, capital of the island of Antigua, part of the country of Antigua and Barbuda.
UNICEF/UN0345667/LeMoyne

Highlights

Child sexual abuse material (CSAM) is one of the fastest-growing and increasingly complex threats to children’s safety in the digital world. As children spend more and more of their lives online, the opportunities the digital world opens up are accompanied by growing threats. 

In 2022 alone, the Internet Watch Foundation (IWF) assessed a webpage showing child sexual abuse imagery every two minutes. Its 2022 Annual Report, now published, offers insight into the scale and dynamics of child sexual abuse material online. 

Here are the key findings: 

In 2022, the IWF investigated a total of 375,230 reports suspected to contain child sexual abuse imagery, a 4% increase on 2021. Of these, 255,580 reports were confirmed to contain images or videos of children suffering sexual abuse. 

Category ‘A’ abuse on the rise: The IWF classifies the severity of abuse, with Category A material containing the most severe kinds of sexual abuse. In 2022, the IWF discovered more Category A material online than ever before, meaning the worst type of online abuse imagery is on the rise. Category B, the second most severe type, made up the highest proportion of total abuse material.

2022 continued to see a high proportion of ‘self-generated’ imagery. These are child sexual abuse images and videos created using smartphones or webcams and then shared online. In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves by someone who is not physically present in the room with the child. 

Age concerns: The 11-13 age group was the most common, while imagery of 7-10 year-olds grew to make up a third of all CSAM. Some of the most extreme sexual abuse involved the youngest children, with babies and toddlers subjected to acts including rape and sexual torture.  

In terms of gender: While imagery of girls accounted for the vast majority of child sexual abuse imagery (96%), imagery of boys increased by 137% compared to the previous year. 

In terms of region: 66% of all CSAM was hosted in European countries. 

There is positive momentum building toward action. The IWF has reported that more companies than ever before are taking up services to help rid the internet of CSAM.

Governments and international leaders are taking action as well. Among the most promising steps is the European Commission's proposed legislation to tackle the growing spread of child sexual exploitation and abuse online. It aims to steer technology platforms to detect, report and remove CSAM. This would have a huge impact on protecting children, preserving children’s privacy, and reducing the devastating harm this content causes to survivors of child sexual abuse and their families. 

Fourteen organisations, including the End Violence Partnership, have launched a global advocacy campaign to support the European Commission’s proposal. 

The Safe Online initiative at End Violence is working to provide actionable data to tackle online violence and make the internet safe for children through significant investments and global advocacy. 

Its pioneering work on evidence generation includes the large-scale, multi-country Disrupting Harm (DH) project, which aims to understand the scope and nature of online child sexual exploitation and abuse and how existing national child protection systems are responding, as well as a partnership with the Tech Coalition that is investing in knowledge to help design products and services that keep children safe online. 

Author(s)
End Violence
Publication date
Languages
English