Stopping CSAM

A Primer for Financial Institutions
What is CSAM?

Child Sexual Abuse Material, or CSAM, is a visual manifestation of the sexual abuse and exploitation of children. CSAM is widely distributed online. In 2022 alone, online platforms in the United States submitted more than 32 million reports of suspected child sexual exploitation to the National Center for Missing & Exploited Children’s CyberTipline, and that number is growing yearly. Those 32 million reports included 88 million images and videos of CSAM.

CSAM can be found in virtually any online venue – from email to social media platforms to online gaming sites. The US now holds the (disgraceful) distinction of hosting more CSAM than any other country in the world. The MIT Technology Review attributes the rapidly growing CSAM problem in the US to multiple factors, including the country’s sheer size; the fact that it is home to the highest number of data centers and secure internet servers in the world, creating fast networks with swift, stable connections that are attractive to CSAM hosting sites; and the imbalance between the vast scale of CSAM and the resources dedicated to stopping it. The U.S.’s ranking aside, CSAM remains a global problem, as is evident in Polaris’s Global Modern Slavery Directory.

CSAM isn’t just about videos or images. These videos and pictures depict actual crimes being committed against minors – children who have yet to reach the age of consent. Children who are victimized over and over again each time a video or image is shared. Children who may never be identified as victims because CSAM is a highly under-reported crime. Children who may suffer from the victimization throughout their lives with physical and mental health problems, from substance abuse and other risky behaviors, from homelessness and from other forms of abuse. Who are these children? Most studies, including one conducted in cooperation with the National Center for Missing & Exploited Children, conclude that the children are predominantly female.

A study from ECPAT International and Europol concluded that 84.2 percent of videos contain severe abuse and that the younger the victim, the more severe the abuse is likely to be. Another study found that nearly 40 percent of CSAM users have sought direct contact with children after viewing illicit images, further extending the criminality of the original act.

If you are sickened or outraged by this and afraid for the children who are or will become victims, then think about what you can do to stop CSAM, or at least curtail its distribution. This document provides some resources – by no means exhaustive – to get you started. Be forewarned that some of the information included herein is very disturbing.

Understanding the terminology

There are many terms and acronyms used to discuss CSAM and related crimes – Grooming, Revictimization, CSE, NCII, and CST, to name but a few. Two very helpful directories of these terms and acronyms are published by Thorn and by Safer, Thorn’s CSAM detection tool. Thorn was founded in 2012 by Ashton Kutcher and Demi Moore after a documentary about child sex trafficking in Cambodia led them to learn more about the problem. Through the research that followed, they concluded that child sex trafficking is as big an issue in the US as elsewhere and that technology plays a central role in it. Thorn’s mission is to “build technology to defend children from sexual abuse.” The Safer directory also provides information on the tools and methods used by technology firms to identify CSAM.

What is the relationship between CSAM and Money Laundering?

United States federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor (a person under 18 years of age) and includes it as a predicate offense for money laundering under 18 U.S.C. § 1956. The sexual exploitation that is the basis of CSAM is also included in the definition of Human Trafficking, one of the eight National AML/CFT Priorities identified by FinCEN in June 2021.

In September 2021, FinCEN issued a notice (FIN-2021-NTC3) to call attention to an increase in online child sexual exploitation (OCSE) during the COVID-19 pandemic, which was noted by multiple law enforcement agencies and was evident in SAR filings. Specifically related to SARs, FinCEN noted:

  • Between 2017 and 2020, there was a 147% increase in OCSE-related SAR filings, including a 17% year-over-year increase in 2020.
  • Convertible virtual currency (CVC) was increasingly becoming the payment method of choice for those accessing CSAM sites.
  • OCSE facilitators often attempt to conceal their illicit file-sharing and streaming activities by transferring funds via third-party payment processors.

The FinCEN notice also advised U.S. financial institutions on the proper filing of SARs related to CSAM. Other jurisdictions similarly require the filing of suspicious activity reports related to CSAM.

What are the Red Flags for Identifying Potential CSAM-Related Suspicious Activity?

Project Shadow is a public-private partnership co-led by Scotiabank and the Canadian Centre for Child Protection, supported by the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC), with participation from major financial institutions and law enforcement agencies in Canada. It has been a leader in educating the financial community on online child sexual exploitation and the associated criminal use of the financial system.

Through its research, which included analysis of the STRs submitted to FINTRAC, Project Shadow identified three distinct categories of exploitation:

  1. Money laundering indicators related to possible perpetrators who are consumers and/or facilitators of online child sexual exploitation.
  2. Money laundering indicators related to possible perpetrators who are producers of online child sexual exploitation material.
  3. Financial indicators possibly related to online child sexual exploitation in the form of luring.

The following Red Flags were identified for each of the three categories:

Consumers and/or Facilitators of Online Child Sexual Exploitation

  • Payments to or funds received through or from payment processors, including ones that deal in virtual currencies.
  • Purchases on webcam/livestreaming platforms, including those for adult entertainment.
  • Purchases on dating platforms, particularly Asian dating websites or ones that also offer adult entertainment content.
  • Purchases at adult entertainment venues and/or adult entertainment websites.
  • Payments to or purchases through a payment processor that specializes in serving high-risk merchants such as those in the adult entertainment industry—some of which appear able to conceal the merchant’s name.
  • Payments to a self-storage facility and/or for office rentals.
  • Purchases at multiple vendors of electronics, computers, and cell phones and/or payments to multiple cell phone service providers.
  • Purchases at a vendor that rents or leases computers and/or computer equipment.
  • Purchases at online gaming platforms and/or gaming stores.
  • Transactions to reload prepaid credit cards (particularly ones that deal with virtual currencies).
  • Purchases at online merchants.
  • Purchases of gift cards and/or payments made using gift cards.
  • Payments to or purchases through social media platforms, including ones that enable payment services through a payment processor.
  • Email money transfers that include a partial email address or reference with terms possibly related to child sexual exploitation.
  • Use of virtual currencies to fund a virtual currency account, convert funds and/or transfer funds to another virtual currency wallet, obtain a cryptocurrency loan or withdraw funds in cash.

Producers of Online Child Sexual Exploitation Material

  • Purchases at vendors that offer peer-to-peer (P2P) file-sharing software for videos and images, including software to share hard drive content directly over the internet.
  • Purchases at vendors that offer software for capturing video from websites or other online platforms.
  • Purchases at vendors that offer Voice-over-IP communication services.
  • Purchases at domain registration/website hosting entities.
  • Purchases at vendors specializing in equipment or software for photography or videomaking.
  • Purchases at creator-content streaming websites (e.g., membership fees or subscriptions to these sites or payment of funds to other streamers on these sites).
  • Receiving funds from a payment processor and having a profile on a creator-content streaming website (particularly a creator-content website that includes adult entertainment content with a subscription-based channel model).

Online Child Sexual Exploitation in the Form of Luring

  • Multiple purchases for accommodations (hotel/motel/peer-to-peer accommodation rentals), particularly at venues in the individual’s own city or in a nearby city.
  • Use of separate email accounts to send or receive email money transfers.
  • Email money transfers sent to multiple females, including minors.
  • Purchases at youth-oriented stores or venues (e.g., toy stores, children’s clothing stores, amusement parks, candy shops).
  • Purchases at vendors for cannabis/cannabis-related products and equipment and/or at pharmacies.
  • Payments to an online classified ad website.
  • Purchases at youth-oriented live online chat rooms.

An ACAMS infographic includes some of these same Red Flags as well as some additional indicia of suspicious behavior. Financial institutions that offer personal and business deposit accounts and/or provide payment services for individual and business customers should ask themselves whether they have deployed transaction monitoring scenarios that would identify these Red Flags; a simplified sketch of one such scenario follows.
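For illustration only, here is a minimal sketch of such a scenario in Python. The category labels, thresholds, and transaction layout are all hypothetical assumptions, not any vendor’s API or any institution’s actual rule; a production scenario would map indicators to real merchant category codes and be calibrated against false-positive rates, since many of these purchases are individually innocuous.

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical red-flag merchant categories drawn from the Project Shadow
# indicators above; a real rule would map these to merchant category codes.
RED_FLAG_CATEGORIES = {
    "high_risk_payment_processor",
    "livestreaming_platform",
    "dating_platform",
    "adult_entertainment",
    "prepaid_card_reload",
    "gift_card_purchase",
    "virtual_currency",
    "online_gaming",
}

def flag_customers(transactions, window_days=30, min_distinct_categories=3):
    """Return customer IDs whose activity spans several distinct red-flag
    categories within a rolling window. `transactions` is an iterable of
    (customer_id, timestamp, category) tuples -- a stand-in schema."""
    by_customer = defaultdict(list)
    for customer_id, ts, category in transactions:
        if category in RED_FLAG_CATEGORIES:
            by_customer[customer_id].append((ts, category))

    flagged = set()
    window = timedelta(days=window_days)
    for customer_id, hits in by_customer.items():
        hits.sort()
        for i, (start_ts, _) in enumerate(hits):
            categories_in_window = {c for ts, c in hits[i:] if ts - start_ts <= window}
            if len(categories_in_window) >= min_distinct_categories:
                flagged.add(customer_id)
                break
    return flagged

# Example: one customer touches three red-flag categories within a month.
sample = [
    ("C1", datetime(2023, 5, 1), "gift_card_purchase"),
    ("C1", datetime(2023, 5, 10), "virtual_currency"),
    ("C1", datetime(2023, 5, 20), "livestreaming_platform"),
    ("C2", datetime(2023, 5, 2), "grocery"),
]
print(flag_customers(sample))  # expected output: {'C1'}

The design point is that no single Red Flag is suspicious in isolation; effective scenarios look for combinations of indicators concentrated in time.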

CSAM users are often identified by technology providers and platforms that work closely with law enforcement and the organizations on the front lines of combatting CSAM. US law requires online platforms that identify CSAM to report it to NCMEC’s CyberTipline, which makes those reports available to law enforcement. If you would like to learn more about the role technology plays in combatting CSAM, useful sources include the WeProtect Global Alliance, which brings together experts from government, the private sector and civil society to protect children from sexual exploitation and abuse online, and the websites of major technology companies such as Microsoft, Amazon, Google, Meta and others, which describe their own practices.

While it is tempting to think that organized crime or other known bad actors are the perpetrators of CSAM, it’s important to understand that family members are often the primary offenders. For example, one report by the Canadian Centre for Child Protection, based on a study it performed, revealed that a family member was involved in 83% of abuse cases. That said, the perpetrators and users of CSAM are varied, as is evident in published US law enforcement cases.

CSAM and Crypto

As noted above, cryptocurrency has increasingly become the payment method of choice for accessing CSAM sites. An excellent study on Cryptocurrency and the Trade of Online Child Sexual Material by The International Centre for Missing & Exploited Children and Standard Chartered Bank was published in February 2021, as a follow-up to a study conducted in 2017. The study contains case studies, including one that ended with the US Department of Justice indicting a South Korean national who operated a dark web site exclusively devoted to CSAM. The site housed more than 200,000 unique video files. Users who opened an account with the website received a unique Bitcoin address; by the time the website was taken down (less than three years after it started operating), there were more than one million Bitcoin addresses hosted on the site’s server.

For financial institutions that provide services to crypto firms, the study is a useful reminder of the importance both of performing appropriate KYC on crypto exchanges and other virtual asset service providers and of maintaining robust transaction monitoring capabilities; a simple illustration of the latter follows.
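The sketch below shows the simplest form of such a control: screening outbound virtual currency transfers against a list of addresses associated with known CSAM sites, such as the site-issued Bitcoin addresses in the case described above. The address values and data layout are hypothetical assumptions for illustration; in practice, risk lists come from blockchain-analytics vendors and law enforcement referrals.

# Hypothetical addresses associated with a known CSAM site
# (illustrative values only, not real risk-list entries).
HIGH_RISK_ADDRESSES = {
    "bc1q-example-risk-address-0001",
    "bc1q-example-risk-address-0002",
}

def screen_withdrawals(withdrawals):
    """Flag withdrawals whose destination is on the risk list.
    `withdrawals` is an iterable of (tx_id, destination_address, amount_btc)."""
    alerts = []
    for tx_id, dest, amount in withdrawals:
        if dest in HIGH_RISK_ADDRESSES:
            alerts.append({
                "tx_id": tx_id,
                "address": dest,
                "amount_btc": amount,
                "reason": "destination on CSAM-associated risk list",
            })
    return alerts

# Example: one of two withdrawals hits the risk list and generates an alert.
sample = [
    ("tx-100", "bc1q-example-risk-address-0001", 0.05),
    ("tx-101", "bc1q-ordinary-address", 1.20),
]
for alert in screen_withdrawals(sample):
    print(alert)

Exact-match screening is deliberately simplistic: illicit funds typically pass through intermediate addresses, so institutions layer address clustering and fund-tracing tools from analytics vendors on top of simple list checks.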

CSAM and the Metaverse

Not surprisingly, there are a growing number of reports about child exploitation in the metaverse, such as this one from the Cyberbullying Research Center. While steps are being taken, such as using AI to detect this exploitation and making it easy to report CSAM, it’s important to remember that fighting crime in the metaverse is uncharted waters, in which the application of traditional laws and regulations may fall short of accomplishing their intended goals. It’s not unreasonable, therefore, to expect CSAM to thrive in the metaverse until legal and regulatory frameworks catch up with the virtual world.

Training and Awareness

As with any other type of money laundering, understanding what CSAM is and recognizing the Red Flags are key to identifying and reporting it. As one of the crimes included in Human Trafficking, CSAM should already be covered in your AML training; if it isn’t, add it now. Take advantage of the resources identified in this paper and those provided by the organizations and agencies referenced throughout.

Talk with your peer institutions and understand how they are addressing this challenge. There is an important role that financial institutions can play in stopping CSAM. Companies like Scotiabank and Standard Chartered – and others – are showing the way. Join them to help save the children.


About the Author:

Carol Beaumier is a Senior Managing Director in Protiviti’s Risk and Compliance solution. Prior to joining Protiviti, Carol was a Partner with Arthur Andersen, where she led the Global Regulatory Practice; a founding member of The Secura Group and leader of the firm’s Risk Management practice; and a regulator with the Office of the Comptroller of the Currency, a bureau of the U.S. Treasury Department. A consultant with more than thirty years of experience, Carol has worked extensively on regulatory issues that affect multiple industries. She is a frequent author and speaker on regulatory and other risk issues. Carol is also Chair of the Washington, DC Chapter of the Association of Certified Sanctions Specialists (ACSS) and a Member of the Advisory Council of the Anti-Human Trafficking Intelligence Initiative (ATII). It is Carol’s work with ATII that sensitized her to the magnitude of the CSAM problem and provided the impetus to develop this primer.

Note: This document is not intended as, nor should it be viewed as, legal advice. While the organizations cited in the paper are all believed to be reputable and to offer useful information, reference herein to an organization does not represent an endorsement of that organization. Readers should do their own due diligence in selecting their best sources.