That is just the tip of the iceberg with the moderation framework I have in mind.
Anyone can become a moderator by publishing their block / hide list.
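To make that concrete, here is a minimal sketch in TypeScript of what a published list could look like. Every field name here is hypothetical, not part of any existing platform:

```typescript
// A hypothetical shape for a published moderation list.
// Anyone's block/hide list becomes a subscribable feed.
type ModerationAction = "block" | "hide";

interface ModerationEntry {
  contentId: string;        // platform-specific content identifier
  action: ModerationAction;
  timestamp: number;        // when the action was taken (Unix ms)
  reason?: string;          // optional free-text rationale
}

interface ModerationList {
  moderatorId: string;      // the publishing user or team
  topic: string;            // e.g. a subreddit name
  entries: ModerationEntry[];
}

// "Publishing" is just making the list readable by others.
const myList: ModerationList = {
  moderatorId: "alice",
  topic: "r/gardening",
  entries: [
    { contentId: "post-123", action: "hide", timestamp: Date.now(), reason: "spam" },
  ],
};
console.log(JSON.stringify(myList, null, 2));
```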
The more people who subscribe to a moderator or a moderator team, the more “votes” they get toward becoming the default moderator profile for a topic (whatever a topic is on the given platform: a subreddit on Reddit, and so on).
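A sketch of how that default selection could work, assuming subscriber counts are the vote tally:

```typescript
interface ModeratorProfile {
  id: string;
  topic: string;
  subscriberCount: number;
}

// Pick the default moderator profile for a topic: the one with
// the most subscribers, i.e. the most implicit "votes".
function defaultProfileFor(
  topic: string,
  profiles: ModeratorProfile[],
): ModeratorProfile | undefined {
  return profiles
    .filter((p) => p.topic === topic)
    .reduce<ModeratorProfile | undefined>(
      (best, p) =>
        best === undefined || p.subscriberCount > best.subscriberCount ? p : best,
      undefined,
    );
}

const profiles: ModeratorProfile[] = [
  { id: "team-a", topic: "r/gardening", subscriberCount: 12_000 },
  { id: "team-b", topic: "r/gardening", subscriberCount: 4_500 },
];
console.log(defaultProfileFor("r/gardening", profiles)?.id); // "team-a"
```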
When you’re subscribed to a moderation team (or several), anything you block or hide is sent to the report queues of the teams you subscribe to. They can then review the content and decide whether to block or hide it for all their subscribers.
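The routing step could be as simple as a fan-out; something like this, where the queue and subscription shapes are my own assumptions:

```typescript
// Sketch: fan a user's block/hide action out to the report queues
// of every moderation team they subscribe to.
interface ReportEvent {
  contentId: string;
  action: "block" | "hide";
  reporterId: string;
}

const reportQueues = new Map<string, ReportEvent[]>(); // teamId -> queue
const subscriptions = new Map<string, string[]>();     // userId -> teamIds

function recordAction(userId: string, event: ReportEvent): void {
  for (const teamId of subscriptions.get(userId) ?? []) {
    const queue = reportQueues.get(teamId) ?? [];
    queue.push(event);
    reportQueues.set(teamId, queue);
  }
}

subscriptions.set("alice", ["team-a", "team-b"]);
recordAction("alice", { contentId: "post-123", action: "hide", reporterId: "alice" });
console.log(reportQueues.get("team-a")); // the event lands in both teams' queues
```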
When a large enough mod team blocks or hides someone’s content, that person is notified. They can then file an appeal. The appeal is akin to a trial: it is distributed among the most active users whose own block/hide behavior aligns with the moderation collective.
An appeal goes through multiple rounds of analysis by randomly selected users who participate in review. Each round is provided with the user’s context and all the data relevant to making a decision. Reviewers can leave decision comments, and the appealing user can read their feedback.
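Panel selection for each round could be a plain random draw from the pool of active, aligned reviewers. Here is one way to sketch it:

```typescript
// Sketch: draw a random panel of reviewers for each appeal round
// from the pool of active, aligned users.
function drawPanel(pool: string[], panelSize: number): string[] {
  const shuffled = [...pool];
  // Fisher-Yates shuffle, then take the first panelSize entries.
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  return shuffled.slice(0, panelSize);
}

const activeReviewers = ["ana", "ben", "cho", "dee", "eli", "fay"];
const rounds = 3;
for (let round = 1; round <= rounds; round++) {
  console.log(`round ${round}:`, drawPanel(activeReviewers, 3));
}
```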
All of this moderation has a “karma” associated with it. When people make decisions in line with the general populace, they get more justice karma. That creates a ranking.
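One simple version of the karma rule, where a reviewer earns a point whenever their vote matches the round’s majority; the scoring here is a made-up placeholder:

```typescript
// Sketch: award justice karma to reviewers whose vote matched
// the majority outcome of an appeal round. Ties go to "overturn".
type Vote = "uphold" | "overturn";

function awardKarma(votes: Map<string, Vote>, karma: Map<string, number>): void {
  let uphold = 0;
  for (const v of votes.values()) if (v === "uphold") uphold++;
  const majority: Vote = uphold * 2 > votes.size ? "uphold" : "overturn";
  for (const [reviewer, vote] of votes) {
    if (vote === majority) {
      karma.set(reviewer, (karma.get(reviewer) ?? 0) + 1);
    }
  }
}

const karma = new Map<string, number>();
const votes = new Map<string, Vote>([
  ["ana", "uphold"],
  ["ben", "uphold"],
  ["cho", "overturn"],
]);
awardKarma(votes, karma);
console.log([...karma.entries()]); // ana and ben gain karma; cho does not
```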
Those rankings can be used to build a tiered justice system, one that selects the most representative sample of how a topic wants justice applied. Higher-ranking moderators get selected for higher-tier decisions. If a lower-level appeal decision is appealed again, it gets added to their queue, and they can choose whether or not to take the appeal.
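Tier selection then becomes a sort over that karma ranking. A sketch, with invented numbers:

```typescript
// Sketch: pick the top-karma moderators as the panel for a
// higher-tier appeal. Panel size and scores are illustrative.
interface RankedModerator {
  id: string;
  justiceKarma: number;
}

function higherTierPanel(mods: RankedModerator[], size: number): RankedModerator[] {
  return [...mods]
    .sort((a, b) => b.justiceKarma - a.justiceKarma)
    .slice(0, size);
}

const mods: RankedModerator[] = [
  { id: "ana", justiceKarma: 340 },
  { id: "ben", justiceKarma: 120 },
  { id: "cho", justiceKarma: 505 },
];
console.log(higherTierPanel(mods, 2).map((m) => m.id)); // ["cho", "ana"]
```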
All decisions are public for the benefit of users and accountability of moderators.
When a user doesn’t like a moderator’s decision, they can unblock or unhide the content, and that counts as a vote against the moderator. This is where it gets interesting, because these votes form a graph of desired content with branching decision logic. You can follow that train of thought to some very fascinating results. Everyone will have a personally curated content tree.
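Recording those overrides could look something like this, with each override weakening the edge between a user and a moderator in the preference graph; the edge model is my own illustration:

```typescript
// Sketch: a user overriding a moderator's decision counts as a
// vote against that moderator, building a per-user preference graph.
interface PreferenceEdge {
  userId: string;
  moderatorId: string;
  weight: number; // positive = agreement, negative = overrides
}

const edges: PreferenceEdge[] = [];

function recordOverride(userId: string, moderatorId: string): void {
  const edge = edges.find((e) => e.userId === userId && e.moderatorId === moderatorId);
  if (edge) {
    edge.weight -= 1;
  } else {
    edges.push({ userId, moderatorId, weight: -1 });
  }
}

recordOverride("alice", "team-a"); // alice unhid something team-a hid
recordOverride("alice", "team-a");
console.log(edges); // [{ userId: "alice", moderatorId: "team-a", weight: -2 }]
```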
Some will have a “cute” internet, filled with adorable content. Some will have a “violent” internet, filled with war videos and martial arts. Some will have a “cozy” internet, filled with non-triggering safe content. And we will be able to share our curations and preferences so others can benefit.
There is much more, but the system would make moderation not just more equitable but also more scalable, transparent, and appreciated. We’d be able to measure moderators and respect them while honoring the freedom of individuals. Everyone would win.
I see a future where we respect the individual voices of everyone and make space for all to learn and grow. Where we are able to decide what we want to see and share without constant anxiety. Where everything is so fluid and decentralized that no one can be captured by money or influence, and when someone is, we have the tools to swiftly branch away with minimal impact. Passively democratic online mechanisms.