evanprodromou@socialwebfoundation.org
Posts
-
The Internet Doesn’t Have To Be Like This
I loved this video that the Daily Show’s Desi Lydic posted on Instagram, TikTok, and YouTube. Give it a watch:
(www.youtube.com)
Lydic talks about the dizzying changes that are happening in social media these days. Internet users over the last decade have gotten used to a small number of huge social platforms. But political changes, content policy issues, and legal platform shutdowns have upended that formerly stable structure. People can no longer count on their friends, family, colleagues and neighbours all being on the same social networking system, much less news outlets, politicians, and celebrities. So they’re racing around, trying new applications (including, as Lydic notes, the awesome Pixelfed), and seeking a place to be social again.
Why should anyone have to do this? After all, you and I didn’t change our political outlook, our content policies, or our legal ownership structure. Governments and companies are changing all around us in ways that interfere with how we interact with the people who matter most to us. Regardless of how you feel about these changes, why should everyday users be the ones who have to scramble to adapt?
The Fediverse is based on the simple belief that your social connections and your published content are yours. They belong to you. You should get to decide where to set up your home on the social web, based on your own priorities — technical, political, financial, romantic, whatever. And once you have that place on the social web, you can connect to anybody else, on any Fediverse platform, as easily as if they were on your own.
So when your friends are all trying a new Fediverse-enabled app from the app store, you can follow them from your own Fediverse home, see what they’re posting, like, comment, and share. You don’t have to scramble to install yet another application, go through the complicated signup flow, set up your profile, and alert everyone you know about yet another identity you have. You can stay put, keep all your current connections, but still stay connected to your restless friends and bleeding-edge influencers.
And if you get tired of the place you’ve set up your Fediverse home, you can move completely — taking all your social connections (and, soon, all your content) to the new platform you’ve chosen. You won’t have to make a series of announcements, like Lydic does, about all the different places your Internet presence is scattered. It’s handled automatically by the Fediverse platforms. Your followers, family and friends might not even notice the difference.
Social media is fun; we get it. And there’s nothing wrong with trying new apps. Being a pioneer on the cool new platform is invigorating. But if it’s not fun, and you’re feeling the whiplash of multiple platforms rising and falling weekly, please consider setting up your long-term home base on a Fediverse-enabled platform. You might be surprised how many platforms are already Fediverse-enabled, and more are coming online every day.
-
Welcome to the New Non-profit on the Fediverse
Mastodon today announced a new non-profit to manage the next steps for the project. From our perspective, this is a great sign of maturation in the social web software space. Best of luck to Eugen and team as they take this next step. We look forward to working with Mastodon towards a bigger, better Fediverse.
-
Content Policy on the Social Web
Thanks! I think there is still a lot to do.
-
Content Policy on the Social Web
You should definitely read the links! Those cover pretty well what the harmful changes are.
-
Content Policy on the Social Web
On Monday, Mark Zuckerberg, CEO of Meta, announced a new content policy for Meta on Threads. We are disappointed in these changes, which put vulnerable people on and off Meta platforms in harm’s way for harassment. Ideas matter, and history shows that online misinformation and harassment can lead to violence in the real world. There are good analyses of the details of the policy changes at EFF, The Verge, and Platformer.
Meta is one of many ActivityPub implementers and a supporter of the Social Web Foundation. We strongly encourage Meta’s executive and content teams to come back in line with best practices of a zero harm social media ecosystem. Reconsidering this policy change would preserve the crucial distinction between political differences of opinion and dehumanizing harassment. The SWF is available to discuss Meta’s content moderation policies and processes to make them more humane and responsible.
Distributed Moderation
What do these changes mean for the Fediverse? Through ActivityPub, Meta’s Threads network is connected to the social web, also called the Fediverse. This is a diverse network of independent social services using different codebases and different kinds of content. The network of 300M users of Threads can follow and be followed by people in tens of thousands of other communities. These services are operated by a variety of entities: corporations, universities, enterprise IT, cooperatives, non-profit organizations, and self-organized volunteers.
Theoretically, this distributed structure allows people to make choices about which platforms they want to use – based not only on technical features, but also on community composition and moderation policies. Users don’t need to give up on social connections they already have with friends and family; they can stay connected across services using ActivityPub. Different communities and services can have different content policies, but people in different communities can still stay connected.
Ideally, having an account on a Fediverse service gives people the best of both worlds: they can stay connected to users and content they like, and filter out content and users that they don’t. When unwanted content from one community lands in the feeds of people in other communities, the receiving users or their moderators can react under their own local policy: removing individual text or image posts; blocking individual users; or blocking the entire sending community.
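The local-policy options above (removing individual posts, blocking individual users, blocking an entire sending community) can be sketched as a simple decision function. This is a minimal illustration, not any real server's implementation; the policy fields and the example domains are assumptions made for the sketch.

```python
# A minimal sketch of receiving-side moderation on the Fediverse.
# The policy fields (blocked_servers, blocked_actors, banned_keywords)
# and all domains/handles below are hypothetical examples.

def moderate(post: dict, policy: dict) -> str:
    """Return the local action for an incoming federated post."""
    server = post["actor"].split("@")[-1]
    if server in policy["blocked_servers"]:
        return "reject"   # whole sending community is blocked
    if post["actor"] in policy["blocked_actors"]:
        return "reject"   # this individual account is blocked
    text = post.get("content", "").lower()
    if any(word in text for word in policy["banned_keywords"]):
        return "remove"   # remove just this post
    return "accept"

policy = {
    "blocked_servers": {"spam.example"},
    "blocked_actors": {"troll@social.example"},
    "banned_keywords": {"slur1", "slur2"},   # placeholder terms
}

print(moderate({"actor": "alice@social.example", "content": "Hello!"}, policy))
# -> accept
```

Real servers layer far more on top of this (report queues, human review, shared blocklists), but the shape of the decision, checking the sending server, then the sender, then the content, is the same.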
Practically, though, there are limitations to this flexibility. Filtering on the receiving side requires orders of magnitude more effort. If a single sending service delivers bad content to users on one hundred or one thousand receiving services, each moderator on the receiving end has to clean up the mess locally. Moderators get understandably frustrated with this kind of displacement of responsibility. A common response is to block servers that send bad content entirely.
In the case of Threads, though, there are complicating factors. Threads is much, much bigger than the typical Fediverse community, and it has many high-profile users in politics, media and technology. It’s also an easy onboarding service to the Fediverse for people who are used to Facebook or Instagram, meaning many of our friends, colleagues and family use it. Blocking the Threads service means blocking access for all users on the receiving service from all these important accounts.
Unfortunately, there’s not an easy answer for Fediverse moderators. We encourage trust and safety teams across the social web to use their best judgement and the tools available to keep users safe, connected, and informed, and also to minimize moderators’ stress and burnout. IFTAS Connect is a great community resource for connecting with other moderators to discuss these tradeoffs.
Improving Social Web Resilience
We see the challenge of a large service that has poor local content policy as a chance to strengthen the social and technical infrastructure of the Fediverse. None of these options will resolve current problems immediately, but we hope starting the research now will make the Fediverse more resilient in the future.
- Finer-grained filtering tools. As mentioned above, moderators on the Fediverse can automatically filter content by author or by originating service. Some platforms also let moderators filter by keywords – for example, blocking racist or homophobic slurs. More difficult forms of filtering, such as detecting unacceptable images or the subtle meaning of text, require more sophisticated algorithmic filtering that many Fediverse platforms do not support. Balancing the ease of use of this kind of filter with the desire from many communities to have final control by human moderators is a good area for future research.
- Collaborative moderation tools. Email filtering systems re-use signals received from other users, so that if a message is marked as spam by one user, or a few users, other users will never see the message. This can balance the desire for human moderation with a significantly lowered total effort for moderators. Shared server blocklists are somewhat common on the Fediverse, but deeper per-user and per-post collaborative filtering is not. Balancing, once again, the specific priorities of a given community with the advantage of collaborative filtering would also require further research.
- Fact-checking and community notes. One major part of the Meta announcement was the cancellation of the fact-checking program for posts on Meta, and its replacement with a community notes feature, which defers fact-checking to volunteers. Neither of these features (professional and volunteer fact-checking) is supported directly in ActivityPub. We think there’s a place for a variety of fact-checking services on the Fediverse, providing annotations on Fediverse content without requiring permissions from the author, the sending service, or even the receiving service. Building the protocol features and reference implementations, as well as encouraging the participation of fact-checking services, is a good next step in this area.
- Jurisdictional boundaries. Zuckerberg mentions in his update that Meta will be collaborating with the US government to resist demands for content policy changes by other governments. Regardless of the valence of these content policy demands, this question highlights an important feature of the Fediverse, namely, that federated services can operate within specific jurisdictions and conform with their regulations. Content that is conveyed across legal boundaries between services can be more clearly filtered or blocked to comply with local rules. We encourage national and regional governments to further investigate this structure for social networking and global connectivity.
- Data portability. Choosing a social media platform to use is an important freedom in the social web. The Fediverse supports limited data portability, such that users can move their followers and followed accounts to a new server almost seamlessly. However, this move leaves all posted content, like text and images, on the old server, as well as metadata such as likes and shares. The new LOLA protocol would allow a full move between servers. We want to see more work on implementing LOLA in Fediverse platforms.
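The follower-portability piece of the last item is typically signalled today with an ActivityPub `Move` activity: the old account announces the move, and the new account acknowledges the old one so followers’ servers can verify it before re-following automatically. Below is a sketch of that handshake; the actor URLs are made-up examples, and real servers add signatures and delivery on top.

```python
# Sketch of ActivityPub account migration via a "Move" activity.
# The actor URLs are hypothetical examples. On real servers, the new
# actor lists the old one in "alsoKnownAs", which receiving servers
# check before automatically re-following the new account.

old_actor = "https://old.example/users/alice"   # assumed example URL
new_actor = "https://new.example/users/alice"   # assumed example URL

move_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": old_actor,
    "object": old_actor,   # the account being moved
    "target": new_actor,   # where it is moving to
}

def move_is_verified(activity: dict, also_known_as: list) -> bool:
    """True if the new actor acknowledges the old one, so the
    receiving server can trust the move and re-follow."""
    return activity["type"] == "Move" and activity["object"] in also_known_as

print(move_is_verified(move_activity, [old_actor]))   # -> True
```

Note what this handshake does not carry: none of the account’s posts, images, likes, or shares move with it, which is exactly the gap that LOLA-style full portability aims to close.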
Ultimately, the safety and well-being of people around the world should not be in the hands of any single company. Moderation policies are a competitive advantage in an open social network. We continue to encourage the use of ActivityPub, and the distributed control that it brings.