On the eyes of a crocodile: Text-based CSAM forums on the Darkweb

Written by: Carolina Christofoletti


When we think about Child Sexual Abuse Material (CSAM) related forums, it is commonly believed that such pages explicitly advertise their criminal nature on the Darkweb. Even if this is true in some cases, setting up apparently innocent places as meeting points is sometimes perceived by criminals as a far cleverer strategy. The Italian Mafia used to meet their peers in expensive restaurants, and child sexual abusers will meet peers in places where the requirements for a search warrant are expected to be "more blurred".

The present analysis, conducted together with Law Enforcement Authorities, concerns exactly one of those cases: textual forums where the prohibition on posting anything related to CSAM is set as an irrevocable rule. Moreover, except for some profile pictures, children's photographs do not appear anywhere. Explicit profile images are immediately removed by the forum's moderators: not because they disagree with the images, but simply because, as the moderators themselves have stated, they create a security risk for everyone there. Things are meant to pass unperceived by crawlers and by human eyes: everything there is disguised. No explicit cover, no crawlable commentary, nothing. As another user suggests elsewhere, maybe this is why the forum has been kept active while many others have already been shut down.

And if investigators have doubts about that, the administrators will point them to the forum's rules: posting "CSAM is a violation of the forum's Terms of Service". Adversaries, one member posts, are here, and everyone must watch out.


The case is the perfidious intersection between the Surface Web and the Darkweb, where criminals "update" each other about what is going on. For the purposes of this study, only the public part of the forum was analyzed, for that is, in several respects, exactly where things matter most: the forum was built upon this "innocent" intelligence.

Maybe the most interesting part of this case study is precisely that this seems to be a place where criminals meet each other for guidance (or desensitization) purposes. And while some of the sections appearing in this particular forum are also common to CSA/CSAM forums where CSAM is actually traded, the targeted forum seems to have a much more socializing purpose than is usually seen in the latter.

Rather than posting pictures, the purpose of the targeted forum seems to be "keeping the communication active". What seems most important is that "members" keep constantly interacting with each other, and in a friendly manner. Curiously, the "friendliness" rule is mentioned elsewhere as being, simultaneously, also a security rule.

Moderators in this specific textual forum have, apart from the duty of acting as active threat intelligence for such a Darkweb venue, the additional task of keeping the "atmosphere" calm: do not post "anything offensive, please."

And this is also an interesting point: like any social media platform, CSAM-related forums also have moderators, a position that usually belongs to the "high hierarchy" members or to the forum's founders. Moderators are selected by the Administrators and must be especially trustworthy; as warned elsewhere, all private messages and navigation history are visible to the administrator.


This feature already seems a strange one in a Darkweb environment, except for the fact that, in the end, Darkweb pages are programmed just as Surface Web pages are.

Furthermore, while there is also a private part of the website, access to it is conditioned on a pre-approved application. Except for the age and gender preference stamped on the member's profile, all other information must be kept anonymous. In this respect, what immediately calls attention is the number of forum members whose age-preference range, stamped on their profiles, goes down and down to reach very, very young children.

As one user clearly states, "membership requests must go through an approval procedure", which is, he (or she) says, pretty much like a job interview. There must be a cover letter. How is a member's adequacy to the purpose of the forum proved? Among other things, by the fact that one is able to write the cover letter without any identifiable information.

Some metrics relating to the public and private parts of the website are also publicly visible. Rejected applicants are asked not to post anything on the public part of the forum. The forum is estimated to have thousands of unique visitors daily.

Applications are reviewed under the "Effort Rule", and everything must be written in the forum's conventional language, even though the audience is international.

As one of the moderators said, "information leaked in the cover letters means a security threat for the forum as a whole". Rejected applications are blacklisted, and deleted accounts are not allowed to submit a new membership request.

What should be done, in terms of Internet Governance, with text-only forums? What should be done with forums containing no explicit CSAM pictures? These may well be the two legal questions of the decade, where escalation grows under open, surface-level eyes.