In part 1, we talked about HN/Lobsters-style sites and how their karma mechanics promoted good behavior and helped produce civility and discussion.
In part 2, we talked about how the same mechanisms could be gamed in ways that tended to corrupt and make those communities toxic.
In this final part, I’ll go over some strategies that I think can help mitigate these problems.
How to Win Big
Order posts randomly
A consistent theme from the previous part was fiddling with post order to get more upvotes. Even among several posts with essentially identical content, the first one will probably collect more upvotes than the others.
So, in order to remove the entire category of “Hey, let’s play with the ordering of posts and reply to things that otherwise aren’t related”, sites should just display subthreads in randomized order. This would mean:
- All child posts of a subthread at the same level should be displayed in random order.
- Randomization shouldn’t occur per-pageload, because that’d be slow.
- Randomization should occur per-user, so the same user can rapidly find things.
This does mean a bad post may occasionally land at the top of the page, but that isn’t guaranteed to happen. An option for omitting posts below a certain threshold (say, 0 karma) might also solve this, but that introduces secondary effects that might be bad.
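The three bullets above can be sketched in a few lines: shuffle each subthread with a seed derived from the viewing user, so the order is random across users but stable for any one user across pageloads. This is a minimal sketch, not any real site’s implementation; the function name and seeding scheme are assumptions.

```python
import hashlib
import random

def ordered_replies(reply_ids, user_id):
    """Return a subthread's replies in an order that is random across
    users but stable for any one user (hypothetical helper; the name
    and seeding scheme are illustrative assumptions)."""
    # Derive a stable per-user seed so the order doesn't change on
    # every pageload, satisfying the "per-user, not per-pageload" rule.
    seed = int.from_bytes(
        hashlib.sha256(str(user_id).encode()).digest()[:8], "big"
    )
    rng = random.Random(seed)
    shuffled = list(reply_ids)
    rng.shuffle(shuffled)
    return shuffled
```

Because the seed depends only on the user, re-rendering the page costs one hash plus a shuffle, and no ordering state has to be stored server-side.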
Throttle story submissions
To prevent spamming of the front page with low-effort, easy-upvote, click-bait stories, prevent users from submitting more than one or two stories over the course of some time interval. This also:
- Prevents a handful of users from dominating the front page through sheer quantity.
- Prevents users from being overloaded by the constant churn of submissions and allows proper discussion to develop around each one.
- Slows the effects of bad submissions (even if bad ones get in, they arrive at a slower rate, so the damage is less acute).
Aggressively and transparently moderate your community
HN already does this, though the transparency is lacking. Lobsters is completely open about all moderation actions, and has a public log of what has occurred.
Aggressive moderation is not necessarily harsh moderation, but instead refers to admins and sysops keeping a close eye on the community and forums and making sure that bad actors are dealt with quickly. This means:
- Identifying and dealing with abusive behavior.
- Proactively organizing tags and titles for consistency.
- Reminding users of community decorum as needed.
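Transparency of the kind the Lobsters log provides amounts to an append-only public record of every moderator action. A minimal sketch, assuming a hypothetical schema (these field and class names are illustrative, not Lobsters’ actual data model):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ModerationAction:
    """One entry in a public moderation log (fields are assumptions
    for illustration, loosely in the spirit of the Lobsters log)."""
    moderator: str
    action: str     # e.g. "removed comment", "retagged story"
    target: str     # story/comment/user identifier
    reason: str
    timestamp: float  # Unix time


class ModerationLog:
    """Append-only: entries are never edited or deleted, so the
    community can audit every action after the fact."""

    def __init__(self):
        self._entries = []

    def record(self, entry: ModerationAction):
        self._entries.append(entry)

    def public_view(self):
        # Everyone sees the same complete history.
        return list(self._entries)
```

The important property is not the data layout but the contract: every action is recorded with its reason, and the record is public and immutable.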
Moderation should never, though, exhibit the following behavior:
- Editing user posts without explicit permission.
- Banning users who hold dissenting opinions but remain civil.
- Removing stories or comments preemptively (censorship).
Note that there is a seeming conflict here: how can a moderator both protect the community against abuse and avoid censorship? That is why frequent and close (aggressive) moderation is so important: done correctly, it allows the community to build up a set of standards and hold to them, so that members learn to self-regulate to an extent.
Moderation ultimately depends on community cooperation and focus. As long as the community hasn’t normalized abusive or disruptive behavior or submission strategies into its culture (as discussed in the previous article), moderation can be done without devolving into strong-arming.
That’s it for this series. At this time, I don’t really feel like I’ve got better advice, and frankly I think I’m done writing about this topic for now. I’ll update this article as ideas occur to me later. Thanks for reading!