According to multiple reports, the COVID-19 pandemic lockdowns significantly increased the spread of Child Sexual Abuse Material (CSAM) compared to pre-pandemic levels, with some platforms reporting an almost 200% rise in CSAM removals.
As a result, stricter regulations are being imposed to protect children on the Internet and to fight child abuse and the sexual solicitation of minors. In the coming years, platforms will have to become more efficient at removing illegal content; we have described the regulatory strategy to be introduced in the European Union here.
Underage users have become easier to contact online, develop relationships with, and groom as potential victims of sexual exploitation. The online grooming tactics that predators exploit consist mainly of:
The most dangerous online spaces and content for underage users appear to be online chat rooms, unsupervised accounts (a preferred target: videos of underage users dancing), and accounts or websites depicting sexualized underage models, usually wearing lingerie, bikinis, or gymnastic outfits.
Despite moderation efforts, CSAM trading and coordination are rising, with organized discussions to promote the material and recruit other predators. Content often circulates through encrypted links across various platforms and servers, protected by passwords that remain valid only briefly before expiring. While the phenomenon is global, we have observed the coordination of one type of CSAM in specific regions, with coded language in group names and online chats. In addition, fan pages that gather collections of pictures (underage users in school uniforms, minors' feet, underage models in bikinis) facilitate connections between abusers and the trading of their materials.
An alarming trend has emerged: the creation of self-generated CSAM by underage users who sell their own sexual content online. We have also observed disguised predators using emojis and hashtags associated with the “MAP” (Minor-Attracted Person) phenomenon, an umbrella term for self-identifying pedophiles who acknowledge their disorder.
Online safety for children has always been a primary concern of tech and social media platforms. Unfortunately, bad actors constantly develop new tactics to bypass moderation. For example, we came across Child Physical Abuse Material (CPAM), in other words, content depicting and glorifying the physical abuse of minors, mainly on video-sharing platforms, where users and dedicated accounts encourage others to share this type of imagery (student punishment videos, dangerous challenges, school fights between minors). Furthermore, we noticed that in most of the analyzed content, the underage victims appear close to their abusers (parents, teachers, schoolmates, and friends).
Increasingly, underage users engage in dangerous behaviors online, reflecting some of the risks they face in the real world. As a result, there is a stronger need for human and technological efforts that protect underage users’ safety.
Policy Research Lab
We recommend that platforms do any of the following:
Note that every violation spotted during our investigation has been reported to the respective platforms, which are legally obliged to collaborate with Law Enforcement and NCMEC on any content they assess as illegal.