CSAM (Child Sexual Abuse Material) on the Fediverse
/2024-03/session/3-c/
Convener: Jayne Samuel-Walker (@tcmuffin@toot.wales)
Participants who chose to record their names here:
- Emelia Smith (@thisismissem@hachyderm.io): IFTAS, Tech lead for their Content Classification Service
- Lexicaleigh (@lexicaleigh@toot.wales)
- Andreas Savvides (@andrs@mastodon.social)
- Marius (@fihu@norden.social): moderation team at norden.social
- Lillian Outlaw (@lillian@mastodon.iftas.org)
Tags / links to resources / technology discussed, related to this session:
- https://github.com/iftas-org/resources/tree/main/CSAM-CSE
- https://www.w3.org/wiki/ActivityPub/Primer/Detecting_and_Reporting_CSAM
- https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media
- https://github.com/facebook/ThreatExchange/
Convener notes and resources
What is CSAM?
- Child Sexual Abuse Material
- AKA Child “Pornography”
- SSA (Sibling Sexual Abuse)
Extent of CSAM
- Stanford report on the extent of CSAM on the Fediverse: https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
- Perpetrators
- In the USA
- 60% of CSAM offenders in FY 2019 were related to or otherwise maintained a position of trust over the child.
- The majority of those possessing and distributing CSAM also commit hands-on sexual offenses against children.
- In about four out of 10 cases, there was more than one victim, ranging from two to 440 children.
- Where it was possible to determine gender, about 93% of offenders were male and 7% female.
- Victims
- From ECPAT (https://ecpat.org/wp-content/uploads/2021/05/TOWARDS-A-GLOBAL-INDICATOR-ON-UNIDENTIFIED-VICTIMS-IN-CHILD-SEXUAL-EXPLOITATION-MATERIAL-Summary-Report.pdf)
- 56.2% of cases depicted were prepubescent children.
- 25.4% were pubescent children.
- 4.3% were very young children (infants and toddlers).
- 14.1% of cases featured children in multiple age categories.
- The younger the victim, the more severe the abuse was likely to be.
- 84.2% of videos and images contained severe abuse.
Jurisdictions
- Global
- United States
- Rest of the Americas
- European Union
- Interpol
- Rest of Europe
- UK
- https://assets.publishing.service.gov.uk/media/5fd78f5fe90e07662c96cf73/1704__HO__INTERIM_CODE_OF_PRACTICE_CSEA_v.2.1_14-12-2020.pdf
- https://www.police-foundation.org.uk/2022/07/turning-the-tide-on-online-child-sexual-abuse/
- https://nationalcrimeagency.gov.uk/what-we-do/crime-threats/child-sexual-abuse-and-exploitation
- Asia
- Oceania
Prevention
- Protecting children from sexual abuse: https://learning.nspcc.org.uk/child-abuse-and-neglect/child-sexual-abuse
- Key Words/Phrases
- Blocklists
Detection
- Human
- Moderators are volunteers
- Moderator Support
- IFTAS - Moderator Support: https://about.iftas.org/activities/moderator-community/
- “Tall Poppies”: https://www.tallpoppy.com/
- Instance members’ reports
- Automated
- https://www.thorn.org/about/
- Key Words/Phrases
- LLM (Large Language Model)
- Often described as AI (Artificial Intelligence)
- Privacy issues
- Hybrid
- The “Dark Web” and image obfuscation (e.g. steganography)
Resources
- IFTAS - Independent Federated Trust and Safety: https://about.iftas.org
- https://protectingchildren.google/intl/en_uk/#introduction
- https://protectingchildren.google/intl/en_uk/tools-for-partners/
- Reporting CSAM
- https://report.iwf.org.uk/org/
- https://www.childline.org.uk
- https://www.samaritans.org
- https://crimestoppers-uk.org/give-information/forms/give-information-anonymously
- https://report.cybertip.org
- https://takeitdown.ncmec.org
- https://www.bravemovement.org
- https://www.protectchildren.ca/en/programs-and-initiatives/survivor-advocacy-groups/
Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:
I will be hugely grateful for anyone who takes notes, as I’m still on holiday in west Wales and working off my iPad…so thank you 🙏
This session is a follow-up from Jaz’s session on IFTAS and I’ll be focussing on CSAM (Child Sexual Abuse Material) in the Fediverse.
Different laws for different jurisdictions
CSAM detection as described by Emelia Smith:
- Fedi instances allow IFTAS to scan their own media content via webhooks; our software then creates hashes for each media file.
- Hash comparison is handled by Thorn.org. Media files never leave IFTAS; only hashes are transferred to Thorn.
- Cryptographic hashing is cheap, but matching can be circumvented by simply rotating, warping, or colour-changing the image, or by changing its metadata.
- Perceptual hashing algorithms are the only way to reliably detect content, as media may be cropped, colourised or decolourised, etc. using other programs.
Downside: perceptual hashing is CPU-intensive and only works against known material.
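A minimal sketch of the difference between the two hashing approaches, assuming Pillow and the `imagehash` package with illustrative file names; this is not the IFTAS/Thorn pipeline itself:

```python
# Sketch: why byte-level (cryptographic) matching is easy to evade while a
# perceptual hash survives simple edits. File names are hypothetical.
import hashlib
from pathlib import Path

import imagehash
from PIL import Image

ORIGINAL = Path("upload.jpg")          # hypothetical uploaded media
ALTERED = Path("upload-altered.jpg")

img = Image.open(ORIGINAL)
# Simulate trivial evasion: shrink the image and strip its colour.
img.convert("L").resize((max(1, img.width // 2), max(1, img.height // 2))).save(ALTERED)

# Cryptographic hash: any change to the bytes yields a completely different
# digest, so a plain byte-match list misses the altered copy.
print(hashlib.sha256(ORIGINAL.read_bytes()).hexdigest())
print(hashlib.sha256(ALTERED.read_bytes()).hexdigest())

# Perceptual hash: visually similar images produce nearby hashes; subtracting
# two hashes gives the Hamming distance between them.
distance = imagehash.phash(Image.open(ORIGINAL)) - imagehash.phash(Image.open(ALTERED))
print(distance)  # a small distance suggests it is the same underlying image
```

Computing a perceptual hash still means decoding and transforming every uploaded image, which is where the CPU cost mentioned above comes from.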
There are also LLM/ML-based models for identifying CSAM and related content; these often require significantly more computing power, as well as significant tuning to ensure they work as expected without too many false positives. db0 (Fediseer, Haidra) has software for pict-rs on Lemmy that detects CSAM this way.
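As a rough, hypothetical illustration of why that tuning matters: a classifier-backed flow usually comes down to choosing score thresholds and routing anything uncertain to a human moderator rather than acting automatically. The names and numbers below are assumptions, not db0's actual implementation.

```python
# Hypothetical triage around an ML classifier's score. Thresholds are
# placeholders that would need per-model, per-instance tuning to keep
# false positives manageable. This is NOT db0's pict-rs scanner.
from dataclasses import dataclass


@dataclass
class ScanResult:
    media_id: str
    score: float  # model's estimated probability that the media is CSAM


REVIEW_THRESHOLD = 0.70    # flag for human moderator review
ESCALATE_THRESHOLD = 0.95  # additionally hide from public view pending review


def triage(result: ScanResult) -> str:
    """Route a scan result; automated deletion is deliberately not an option,
    since false positives are expected at any threshold."""
    if result.score >= ESCALATE_THRESHOLD:
        return "hide-and-queue"
    if result.score >= REVIEW_THRESHOLD:
        return "queue-for-review"
    return "no-action"
```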
StopNCII.org (Stop Non-Consensual Intimate Image Abuse): not part of the Thorn Safer datasets; uses Meta's perceptual hashing algorithms.
Mastodon: currently no easy way to integrate detection, due to the inaccessible image-upload process.
Lemmy: implementing detection with pict-rs
Pixelfed: in talks about implementation
PhotoDNA can be run on your own servers, but it requires a licence from Microsoft to use PhotoDNA's libraries, and also requires the database of hashes, which is usually tightly controlled.
Reporting UIs are usually not great, and often unable to classify reported media as CSAM. For CSAM reports, media should be blurred, greyscaled, and/or sepia-toned, and only visible after moderator interaction, so as to shield your moderators from harmful content.
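A minimal sketch of what that could look like server-side, assuming Pillow; the function name, sizes, and blur radius are illustrative and not taken from any existing project:

```python
# Sketch: generate a blurred, greyscale "safe preview" of reported media so
# moderators never see the original unless they explicitly click through.
from PIL import Image, ImageFilter


def make_safe_preview(path: str, out_path: str) -> None:
    img = Image.open(path)
    preview = (
        img.convert("L")  # greyscale strips most detail
           .resize((max(1, img.width // 4), max(1, img.height // 4)))
           .filter(ImageFilter.GaussianBlur(radius=12))
    )
    preview.save(out_path, format="JPEG", quality=60)

# The original file would only be served after an explicit "reveal" action by
# a moderator, and ideally never retained longer than reporting requires.
```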
As a lot of Mastodon instances are based in Germany, note that any CSAM you find must be deleted straight away, as you can be arrested and charged by law enforcement for holding it, even if that's for the purpose of reporting it to law enforcement.