Content Policy on the Social Web
-
On Monday, Mark Zuckerberg, CEO of Meta, announced a new content policy for Meta on Threads. We are disappointed in these changes, which put vulnerable people, on and off Meta's platforms, at risk of harassment. Ideas matter, and history shows that online misinformation and harassment can lead to violence in the real world. There are good analyses of the details of the policy changes at EFF, The Verge, and Platformer.
Meta is one of many ActivityPub implementers and a supporter of the Social Web Foundation. We strongly encourage Meta's executive and content teams to bring these policies back in line with the best practices of a zero-harm social media ecosystem. Reconsidering this policy change would preserve the crucial distinction between political differences of opinion and dehumanizing harassment. The SWF is available to discuss Meta's content moderation policies and processes and how to make them more humane and responsible.
Distributed Moderation
What do these changes mean for the Fediverse? Through ActivityPub, Meta's Threads network is connected to the social web, also called the Fediverse. This is a diverse network of independent social services using different codebases and hosting different kinds of content. Threads' network of 300M users can follow and be followed by people in tens of thousands of other communities. These services are operated by a variety of entities: corporations, universities, enterprise IT, cooperatives, non-profit organizations, and self-organized volunteers.
Theoretically, this distributed structure allows people to make choices about which platforms they want to use – based not only on technical features, but also on community composition and moderation policies. Users don’t need to give up on social connections they already have with friends and family; they can stay connected across services using ActivityPub. Different communities and services can have different content policies, but people in different communities can still stay connected.
Ideally, having an account on a Fediverse service gives people the best of both worlds: they can stay connected to users and content they like, and filter out content and users that they don’t. When unwanted content from one community lands in the feeds of people in other communities, the receiving users or their moderators can react under their own local policy: removing individual text or image posts; blocking individual users; or blocking the entire sending community.
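To make these escalating reactions concrete, here is a minimal sketch, in Python, of how a receiving service might apply a local policy to incoming posts. All of the names (LocalPolicy, admit) and URLs are hypothetical; real Fediverse platforms each implement this differently.

```python
# A minimal sketch of receiving-side moderation. Names and URLs are
# hypothetical; this is not the API of any particular platform.
from dataclasses import dataclass, field


@dataclass
class LocalPolicy:
    blocked_servers: set = field(default_factory=set)  # defederated domains
    blocked_actors: set = field(default_factory=set)   # blocked account URIs
    removed_posts: set = field(default_factory=set)    # locally hidden posts

    def admit(self, post_uri: str, actor_uri: str, origin: str) -> bool:
        """Apply the three escalating local reactions described above."""
        if origin in self.blocked_servers:
            return False  # the entire sending community is blocked
        if actor_uri in self.blocked_actors:
            return False  # just this author is blocked
        if post_uri in self.removed_posts:
            return False  # a single post was removed by a local moderator
        return True


policy = LocalPolicy(blocked_servers={"spam.example"})
print(policy.admit("https://spam.example/posts/1",
                   "https://spam.example/users/alice",
                   "spam.example"))  # -> False
```

The key design point is that each community evaluates the same incoming post against its own policy; nothing about the decision has to be shared with, or approved by, the sending service.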
Practically, though, there are limitations to this flexibility. Filtering on the receiving side requires orders of magnitude more effort than moderating at the source. If a single sending service delivers bad content to users on one hundred or one thousand receiving services, each moderator on the receiving end has to clean up the mess locally. Moderators get understandably frustrated with this displacement of responsibility. A common response is to block entire servers that send bad content.
In the case of Threads, though, there are complicating factors. Threads is much, much bigger than the typical Fediverse community, and it has many high-profile users in politics, media and technology. It's also an easy onboarding service to the Fediverse for people who are used to Facebook or Instagram, meaning many of our friends, colleagues and family use it. Blocking the Threads service cuts every user on the receiving service off from all of these accounts.
Unfortunately, there’s not an easy answer for Fediverse moderators. We encourage trust and safety teams across the social web to use their best judgement and the tools available to keep users safe, connected, and informed, and also to minimize moderators’ stress and burnout. IFTAS Connect is a great community resource for connecting with other moderators to discuss these tradeoffs.
Improving Social Web Resilience
We see the challenge of a large service with a poor local content policy as a chance to strengthen the social and technical infrastructure of the Fediverse. None of these options will resolve the current problems immediately, but we hope that starting the research now will make the Fediverse more resilient in the future.
- Finer-grained filtering tools. As mentioned above, moderators on the Fediverse can automatically filter content by author or by originating service. Some platforms also let moderators filter by keywords – for example, blocking racist or homophobic slurs (see the keyword-filter sketch after this list). More difficult forms of filtering, such as detecting unacceptable images or the subtle meaning of text, require more sophisticated algorithmic filtering that many Fediverse platforms do not support. Balancing the ease of use of this kind of filter with the desire of many communities to keep final control with human moderators is a good area for future research.
- Collaborative moderation tools. Email filtering systems re-use signals received from other users: if a message is marked as spam by one user, or a few users, other users never see it. This can balance the desire for human moderation with a significantly lower total effort for moderators (see the collaborative-flag sketch after this list). Shared server blocklists are somewhat common on the Fediverse, but deeper per-user and per-post collaborative filtering is not. Balancing, once again, the specific priorities of a given community with the advantages of collaborative filtering would also require further research.
- Fact-checking and community notes. One major part of the Meta announcement was the cancellation of the fact-checking program for posts on Meta's platforms and its replacement with a community notes feature, which defers fact-checking to volunteers. Neither of these features (professional or volunteer fact-checking) is supported directly in ActivityPub. We think there's a place for a variety of fact-checking services on the Fediverse, providing annotations on Fediverse content without requiring permission from the author, the sending service, or even the receiving service (one speculative annotation shape is sketched after this list). Building the protocol features and reference implementations, as well as encouraging the participation of fact-checking services, is a good next step in this area.
- Jurisdictional boundaries. Zuckerberg mentions in his update that Meta will collaborate with the US government to resist demands for content policy changes from other governments. Regardless of the merits of any particular demand, this dispute highlights an important feature of the Fediverse: federated services can operate within specific jurisdictions and conform with their regulations. Content that crosses legal boundaries between services can be more clearly filtered or blocked to comply with local rules. We encourage national and regional governments to further investigate this structure for social networking and global connectivity.
- Data portability. Choosing which social media platform to use is an important freedom on the social web. The Fediverse supports limited data portability: users can move their followers and followed accounts to a new server almost seamlessly (see the Move-activity sketch after this list). However, this move leaves all posted content, like text and images, on the old server, as well as metadata such as likes and shares. The new LOLA protocol would allow a full move between servers. We want to see more work on implementing LOLA in Fediverse platforms.
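Below are sketches for the research directions above. First, the keyword filtering that some platforms already offer. This is a minimal illustration in Python; the term list, function name, and word-boundary matching are illustrative assumptions, and real platforms differ in their matching rules.

```python
import re

# A moderator-maintained term list; these strings are placeholders,
# not a real blocklist.
BLOCKED_TERMS = {"slur1", "slur2"}

_pattern = re.compile(
    r"\b(" + "|".join(map(re.escape, BLOCKED_TERMS)) + r")\b",
    re.IGNORECASE,
)


def passes_keyword_filter(text: str) -> bool:
    """Return True if the post contains none of the blocked terms."""
    return _pattern.search(text) is None


assert passes_keyword_filter("A friendly post")           # kept
assert not passes_keyword_filter("this contains slur1")   # filtered
```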
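Second, a collaborative-flag sketch of email-style shared moderation: once a threshold number of distinct local users report a post, it is hidden for everyone else. The threshold, names, and data structures are assumptions for illustration, not a feature of any existing Fediverse platform.

```python
# Hide a post for all local users once enough distinct users report it.
# FLAG_THRESHOLD and all identifiers here are illustrative assumptions.
FLAG_THRESHOLD = 3

reports: dict[str, set[str]] = {}  # post URI -> reporting user URIs
hidden: set[str] = set()           # post URIs hidden for all local users


def report(post_uri: str, reporter_uri: str) -> None:
    # A set of reporters ensures one user cannot flag the same post twice.
    reporters = reports.setdefault(post_uri, set())
    reporters.add(reporter_uri)
    if len(reporters) >= FLAG_THRESHOLD:
        hidden.add(post_uri)  # later readers never see this post


for i in range(3):
    report("https://example.social/posts/42",
           f"https://example.social/users/u{i}")
assert "https://example.social/posts/42" in hidden
```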
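Third, a speculative shape for a fact-checking annotation. ActivityPub today defines no fact-checking vocabulary, so this sketch simply reuses the standard ActivityStreams 2.0 Note type and inReplyTo property; the actor, URLs, and the convention of treating such a Note as an annotation are assumptions, not defined protocol features.

```python
# "Note" and "inReplyTo" are real ActivityStreams 2.0 vocabulary; the
# actor, URLs, and the annotation convention itself are hypothetical.
fact_check_annotation = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Note",
    "attributedTo": "https://factcheck.example/actors/service",
    "inReplyTo": "https://threads.example/@someone/post/123",  # annotated post
    "content": "This claim is disputed; see our full review.",
    "url": "https://factcheck.example/reviews/123",
}
```

Because the annotation only references the target post by URI, it needs no permission from the author or either service; the open question is how receiving services would discover and display such annotations.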
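Finally, the limited account migration the Fediverse supports today is built on the ActivityStreams Move activity, which platforms like Mastodon use to point followers at a new account. A sketch with placeholder URLs, showing what moves and what stays behind:

```python
# "Move" is a real ActivityStreams 2.0 activity type used for account
# migration on platforms like Mastodon; the URLs are placeholders.
# Only the social graph moves; posts, images, likes, and shares stay
# on the old server, which is the gap LOLA is meant to close.
move_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": "https://old.example/users/alice",
    "object": "https://old.example/users/alice",
    "target": "https://new.example/users/alice",
}
```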
Ultimately, the safety and well-being of people around the world should not be in the hands of any single company. Moderation policies are a competitive advantage in an open social network. We continue to encourage the use of ActivityPub, and the distributed control that it brings.
-
otto42@fosstodon.org replied to evanprodromou@socialwebfoundation.org
@evanprodromou Jesus, can you not say what the policies that are actually harmful are in the first text of your post? Actually define the problem and then define the solution to it.
-
evanprodromou@socialwebfoundation.org replied to otto42@fosstodon.org
You should definitely read the links! Those cover pretty well what the harmful changes are.
-
mpjgregoire@cosocial.ca replied to evanprodromou@socialwebfoundation.org
@evanprodromou I'm glad to see that, rather than simply criticising Meta for dropping fact-checking, this post encourages work to add fact-checking and community notes type services to the Fediverse.
-
evanprodromou@socialwebfoundation.org replied to mpjgregoire@cosocial.ca
Thanks! I think there is still a lot to do.