Trust & Safety in the Fediverse + Protocol Issues
/2024-09/session/1-b/
Convener: Emelia Smith (@thisismissem@hachyderm.io)
Participants who chose to record their names here:
- Jaz-Michael King (@jaz@mastodon.iftas.org)
- Andreas Savvides (@andrs.svds@threads.net)
- Crash (@crash_thepose@kolektiva.social) (moderator)
- James Theophilos (@jat823@aoir.social)
- Jennifer Moore (@jenniferplusplus@hachyderm.io)
- Simon Blackstein (@sblackst@threads.net)
- Jon Pincus (@jdp23@blahaj.zone)
- Lexicaleigh (@lexicaleigh@toot.wales)
- Cassandra (@cassandravert@indieweb.social)
- Evan Prodromou (@evan@cosocial.ca)
Notes
We approved the Trust & Safety taskforce at the last SWICG meeting; the meeting notes can be found here: https://www.w3.org/wiki/SocialCG/2024-08-02.
The Social Web Incubator Community Group's announcement of the convened T&S taskforce: https://lists.w3.org/Archives/Public/public-swicg/2024Aug/0026.html.
Flag activities are federated; however, no mechanism exists for issuing updates on them: https://github.com/w3c/activitystreams/issues/609.
- The vocabulary available for Flag activities doesn’t provide enough information to moderation teams: https://www.w3.org/TR/activitystreams-vocabulary/#dfn-flag.
- Different implementations have different moderation capabilities and different interpretations of the spec. For instance, Mastodon currently requires both the Actor and the object to be sent in the Flag activity, whilst also having no way to report non-Actors (see the sketch below): https://docs.joinmastodon.org/spec/activitypub/#Flag.
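For concreteness, here is a minimal sketch of a Flag activity in the shape Mastodon documents, written as a TypeScript object literal; all URIs are illustrative placeholders. Note that the reported Actor and the reported posts travel together in `object`, and the free-text `content` field is the only place for report details.

```typescript
// A Flag activity in the shape Mastodon documents (sketch, placeholder URIs).
// There is no structured field for a report category or severity, and no
// defined activity for later updating or resolving the report.
const flag = {
  "@context": "https://www.w3.org/ns/activitystreams",
  id: "https://reporter.example/reports/1",
  type: "Flag",
  // Mastodon sends reports from the instance actor, not the reporting
  // user, which preserves the reporter's anonymity.
  actor: "https://reporter.example/actor",
  content: "Spam account, see the attached post",
  object: [
    "https://reported.example/users/1",            // the reported Actor (required by Mastodon)
    "https://reported.example/users/1/statuses/2", // the reported content
  ],
};
```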
In moderation tools, there are techniques for ensuring moderator safety, for instance blurring and greyscaling reported media to reduce the impact of exposure (sketched below).
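A minimal sketch of that technique, assuming a browser-based moderation UI where reported media carries a hypothetical `report-media` class: media is blurred and desaturated by default, and a moderator must explicitly click to reveal it.

```typescript
// Obscure reported media by default; reveal only on an explicit click,
// and re-obscure on a second click. The class name and filter values
// are illustrative, not from any particular moderation tool.
const OBSCURED = "blur(16px) grayscale(100%)";

function protectModerator(img: HTMLImageElement): void {
  img.style.filter = OBSCURED;
  img.addEventListener("click", () => {
    img.style.filter = img.style.filter === "none" ? OBSCURED : "none";
  });
}

document
  .querySelectorAll<HTMLImageElement>(".report-media")
  .forEach(protectModerator);
```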
The Fediverse Governance report (https://fediverse-governance.github.io/) outlined a lot of the missing features, and can likely be used as a foundational reference for the taskforce.
Within reporting and sharing information, having standardised and shared vocabularies of labels is important, e.g. the DTSP labels published by IFTAS: https://connect.iftas.org/library/iftas-documentation/shared-vocabulary-labels/.
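To illustrate why a shared vocabulary helps, a report could carry a machine-readable label alongside the free-text comment; the `label` property and its value below are invented for illustration and are not part of ActivityStreams or the IFTAS vocabulary.

```typescript
// Hypothetical only: ActivityStreams defines no such property today.
// A label drawn from a shared vocabulary would let receiving moderation
// tools route and triage reports consistently across implementations.
const labelledFlag = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Flag",
  actor: "https://reporter.example/actor",
  content: "Unsolicited commercial posts",
  object: ["https://reported.example/users/1"],
  label: "spam", // invented property; value drawn from a shared label set
};
```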
Account and Content reporting workflow guidance: https://connect.iftas.org/library/tools-resources/information-for-software-developers-and-designers/#4-toc-title.
Documentation about how moderation features work is also important; at the end of the day, setting up a server tends to be the “easy part” compared to the challenges of moderation and trust and safety. There’s a fairly famous quote that “trust and safety is the product, the social interactions are a side-effect”; that is, if you don’t have trust and safety, you end up with what X has become.
Moderation is always subjective, and that’s why guides and runbooks that give a shared understanding of how you moderate are important. There are also legal liabilities with hosting certain types of content (see, for example, the current legal situations of Telegram in France and X in Brazil).
Hachyderm has worked on documenting what happens with reports in Mastodon: https://community.hachyderm.io/docs/mastodon/.
Different federation models besides “federate with everything” and “federate with a limited set of servers” might be necessary, and could potentially be called out in the protocol specification. Examples include:
- Fedifams: affinity groups based on similar politics/moderation approaches: https://kolektiva.social/@ophiocephalic/110793531238090472.
- Approval- or consent-based federation (sketched after this list): https://privacy.thenexus.today/free-fediverses-and-consent/.
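As a sketch of the consent-based model, not an existing API: a server keeps an explicit allow-list of approved peers and rejects inbound activities from everyone else, inverting the default federate-with-everything posture. The names and shapes below are hypothetical.

```typescript
// Hypothetical consent-based federation check: accept inbound activities
// only from servers the admins have explicitly approved.
const approvedPeers = new Set<string>(["a.example", "b.example"]);

function acceptInboundActivity(activity: { actor: string }): boolean {
  const origin = new URL(activity.actor).hostname;
  return approvedPeers.has(origin);
}

// e.g. acceptInboundActivity({ actor: "https://c.example/users/1" }) === false
```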