
Moderating the Fediverse: IFTAS Update, and Open T&S Conversation


Convener: Jaz-Michael King (@jaz@mastodon.iftas.org, https://about.iftas.org)

Participants who chose to record their names here:

Notes

What we’ve had to stop: https://about.iftas.org/2025/03/03/iftas-service-shutdowns/

What we’re still doing: https://about.iftas.org/2025/04/08/staying-the-course-our-continuing-mission/

Question: Are there tags for different kinds of content? For example, "I don't want hate speech, but I'm OK with XYZ."

Question: Can you demystify lolicon? Is there an element of Orientalism?

  • Cultural differences in Japan over what counts as a depiction of a minor versus, say, a youthful elf character.
  • Challenge of decentralized community servers hosting content outside the jurisdiction in which they operate: what is legal on the server of origin may not be legal on the server of receipt.
  • Server admins don’t want the legal liability of potentially hosting content that would be illegal in their country of operation.
  • Benefit of decentralized community servers is that local community norms can be applied.

How do IFTAS budgets work?

  • Incoming funds are allocated based on the survey results. $450k in funds utilized so far; $35k left.
  • Two other grants required matching funds, and IFTAS couldn't get them matched.
  • CSAM work was the most expensive initiative, due to server costs and legal overhead (~$650k budget).
  • Hoping to raise $1.2M.

Post-mortem on IFTAS’s CSAM work: https://about.iftas.org/2025/03/27/content-classification-system-post-mortem/

Question: What about Matrix moderation?

  • 1/3 of activity was around federated moderation.

Useful links:

  • SWICG T&S https://github.com/swicg/activitypub-trust-and-safety
    • Working to enable federated moderation: servers need to be able to transfer moderation actions to each other (instead of just user content!) and make moderation data interoperable (see the sketch after this list).
    • The next activity is likely to be a co-design of moderation tooling with the Bonfire team; join IFTAS Connect to participate!
    • Needs tooling that blurs images and applies labels so moderators know what they're about to look at.
  • ROOST Moderation Tooling https://roost.tools/
    • Came out of Columbia University; has significant money and resources to build moderation tooling. Focused more on Big Tech, and heavily engaged with Bluesky. Hopefully some of it will be transferable to ActivityPub. Worth following.
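
For a concrete sense of what "transferring moderation actions" between servers could look like, here is a minimal sketch of a federated report expressed as an ActivityStreams Flag activity, the mechanism Mastodon already uses to forward reports between instances. The URLs below are placeholders, and exact field usage varies by implementation.

```python
# A minimal sketch of a federated moderation report as an
# ActivityStreams "Flag" activity. All URLs are placeholders,
# and exact field usage varies by implementation.
import json

flag_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Flag",
    # The reporting server's instance actor (not the individual
    # reporter), so the reporter stays anonymous to the remote server.
    "actor": "https://example.social/actor",
    # The account being reported, plus the specific posts at issue.
    "object": [
        "https://remote.example/users/spammer",
        "https://remote.example/users/spammer/statuses/12345",
    ],
    # Free-text reason shown to the receiving server's moderators.
    "content": "Spam: repeated unsolicited advertising.",
}

# This JSON document would be POSTed to the remote actor's inbox.
print(json.dumps(flag_activity, indent=2))
```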

Question: So when you come across CSAM, what should you do? I was on a Matrix channel that got raided by trolls who posted the most horrendous CSAM you could think of, and I didn't know what to do other than cry and call a hotline.

Anca: Have you considered charging small subscription fees for the moderation tooling that you currently provide for free? (e.g. CARIAD, DNI)

  • Jaz: It was considered. The hope was to have three years of civil society funding and then have other well-funded entities (Threads, Ghost) take over support.
    • Big companies do this themselves, so not interested in funding. Small players can’t pay.
    • RightsCon was full of stories of many organizations losing funding and struggling to survive.
    • Would rather see servers putting money into their moderators’ pockets than IFTAS’s.

Anca: Is the market really that interested in the safety of children?

  • Jaz: Regulators may not care about the fediverse yet, but they will at some point, and you'll be on their radar if you're not safeguarding children.
  • Mastodon’s age check feature is inadequate for countries with age check requirements.
  • Regulators move slowly. Regulations move slower. Legislators move slowest. And yet it's hard to keep up with all the rule changes: the US has different state-level rules, and even within the EU, individual countries layer additional requirements on top of the EU's.
  • The UK's Ofcom has already put a few sites under enforcement.

Jeremiah: In another model, where individuals operate their own servers rather than joining large communities, how does this threat model change? How far is the fediverse from email's fairly good experience, with limited spam and bad content?

  • Jaz: Another possibility is a world where you subscribe to filtering, as with ATProto.
    • Bryan: Bluesky's composable moderation system: the hope is for organizations that each focus on their specialty, like CSAM or spam, rather than trying to cover everything; it's too hard for one organization to do it all. End-user selectable, but with smart defaults (see the label sketch after this list).
    • “The devil is in the defaults” - Dave Morin of Path, mentioned by Jeremiah
  • Jon: Are there communities of practice / best practices bubbling up in the AT Discord or other places that are not super visible publicly but happening behind the scenes?
    • Bryan: Labels are still in progress; the labeler is probably not the right user-facing tool. On the community side, Bluesky controls its client and can block bad moderators, acting as a gatekeeper for its client. Interesting research is happening on how good moderators are at responding to incidents (how fast content gets taken down).
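
As a rough illustration of the composable-moderation building block Bryan describes, below is a sketch of an AT Protocol moderation label (following the com.atproto.label.defs#label shape) and how a client might map label values to behaviors with overridable defaults. The DIDs and URI are placeholders.

```python
# A rough sketch of an AT Protocol moderation label, following the
# com.atproto.label.defs#label shape: a labeler service asserts a
# value (e.g. "spam") about a piece of content, and subscribing
# clients decide how to act on it. DIDs and URIs are placeholders.
from datetime import datetime, timezone

label = {
    "src": "did:plc:labelerservice123",  # DID of the labeler issuing the label
    "uri": "at://did:plc:someuser/app.bsky.feed.post/3k2abcxyz",  # labeled subject
    "val": "spam",                       # the label value itself
    "neg": False,                        # True would retract a prior label
    "cts": datetime.now(timezone.utc).isoformat(),  # creation timestamp
}

# An end user's client might map label values to behaviors, with
# smart defaults the user can override ("the devil is in the
# defaults"):
preferences = {"spam": "hide", "porn": "warn", "gore": "warn"}
action = preferences.get(label["val"], "show")
print(f"Label {label['val']!r} from {label['src']} -> {action}")
```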

Let’s Talk! Anything trust & safety / integrity / moderation, or a few ice breakers below…

Admins/Mods: What moderation tools are you using right now, and what’s missing from them?

Admins/Mods/Hosts: What are your biggest bugaboos right now? Regulations? Harmful content? Growth?

Admins/Mods: When something harmful happens in your community, what's your process? Who's involved, and what slows you down?

Admins: If someone joined your mod team today, what would you hand them? Would they be ready for the job?

Admins/Mods: Have you ever needed help from another instance? How did you reach them, and what got in the way?

Mods: What’s the hardest part of your moderation role that no one talks about? What’s helped you cope, or what would help?

Admins/Mods/Devs: Have you ever had to handle something that felt bigger than you were prepared for? What kind of backup would have helped?

Devs: What moderation features or APIs have instance admins or moderators asked for but you haven’t built yet? Why?

Devs: How do you test new features for safety impacts, and who do you include in that process?

Jaz: IFTAS funding is drying up.

Jaz: Take the content and share it. So it lives on.