paidContent.org released an article today…
Let’s discuss it!
The social web has been around for more than a decade now, but even after all that time, no one has quite figured out how to fix online comments. Some bloggers have given up trying and don’t allow comments at all…
Discourse is first and foremost a discussion forum platform - not exactly the same thing as blog comments. It will probably require different features and plugins to work well as a blog-commenting platform, at least as most people think of blog comments today. One task at a time…
In addition to some other innovations, such as links that automatically expand within a comment (in the same way Twitter’s “expanded tweets” do), Atwood says he is trying to build a reputation system that will grant users new abilities based on the level of trust the platform has in them. Although he doesn’t provide a lot of detail, in a comment on a Hacker News discussion thread he suggests that it will be based on behavior such as flagging abusive posts.
Here’s where the lack of specificity leads to some early speculation about how “trust” mechanisms on Discourse will work. For example, the “flagging” reputation score is kept separate from other scores - it simply makes sense that being good at flagging content means you can be trusted more with flagging content. [As I pointed out already in this thread]
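Since the mechanics aren’t spelled out anywhere, here is a minimal, purely hypothetical sketch of what “keeping the flagging score separate” could look like: every name and number below (`FlagRecord`, the neutral 0.5 prior, the weight range) is my own invention for illustration, not Discourse’s actual design.

```python
from dataclasses import dataclass

@dataclass
class FlagRecord:
    """Tracks ONLY a user's flagging history - kept apart from
    any other reputation signal (posts, likes, account age)."""
    upheld: int = 0    # flags that moderators agreed with
    rejected: int = 0  # flags that moderators dismissed

    def accuracy(self) -> float:
        total = self.upheld + self.rejected
        # A user with no history starts at a neutral 0.5,
        # rather than being fully trusted or fully ignored.
        return self.upheld / total if total else 0.5

def flag_weight(record: FlagRecord) -> float:
    # How heavily this user's next flag counts is determined by
    # their flagging track record and nothing else.
    return 0.5 + record.accuracy()  # weight in [0.5, 1.5]
```

The point of the separation: a prolific poster with zero flagging history gets no extra say in what is hidden, while a quiet user whose flags are consistently upheld does.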
Measuring trust and rewarding good behavior is something online communities have been trying to do for years, with mixed success. Some believe that sites like Slashdot — which has a moderation platform that awards “karma points” for certain behavior and appoints moderators automatically — have a good solution to the usual problems of trolling and flame wars, while others argue that these systems are almost always fatally flawed. Metafilter (which charges users $5 to become members) has many fans, but it is also a relatively small community. Branch is another attempt to reinvent user forums and discussion as invitation-only hosted conversations.
Why no mention of StackOverflow/StackExchange, which are successful examples of trust scores used for unlocking features? How is StackOverflow “fatally flawed”? On the spam/troll end of the scale, there are many automatic and crowd-sourced reputation systems that filter out billions of pieces of junk mail, block spammers by IP address, and auto-hide postings every single day. We don’t see most of it, precisely because of how effective it usually is.
Atwood says he wants to use a badge system for rewards (something Huffington Post also uses), but Gawker founder Nick Denton said in an interview last year that a similar reward system his sites used was a “terrible mistake,” because it was easily gamed and encouraged the wrong kinds of behavior. Denton has since completely revamped Gawker’s commenting system in an attempt to make reader comments the centerpiece, as well as a potential business model.
Bad badging systems work badly. It’s true. Here’s a story of gamification that actually killed a site. There are many of these stories:
All I can say is that some reputation systems accomplish their goals, and some don’t. Most of the failed ones I’ve looked at were not well thought through, designed, or tested. As long as I’m helping Discourse, I pledge to work to provide incentive systems (badges, points, whatever) as options for forum operators - systems we have reason to believe will reinforce the desired behaviors: timely suppression of the worst content, and recognition of the best.
Veteran blogger Anil Dash pointed out in an insightful post in 2011 that one of the only ways to maintain and encourage a healthy conversation — regardless of what platform you use — is to be involved in those discussions yourself as much as possible (a point Bora Zivkovic of Scientific American also made recently). Unfortunately for publishers looking for a quick or inexpensive fix, that kind of engagement is almost impossible to automate.
I agree deeply with the engagement requirement - how could anyone disagree (except someone who thinks an online community is somehow a no-effort marketing channel)? Discourse’s tools don’t offer some magical solution to content moderation - we’re only trying to make the moderation task easier by offloading much of the work to the community, which by definition has more time-per-post than any moderation staff ever could.
Just as “this is spam” reports help mail providers filter junk mail, and StackOverflow uses upvotes to identify the best content (and content authors), Discourse uses reputation systems to improve online discourse.
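To make the two halves of that goal concrete - suppress the worst quickly, surface the best - here is a toy sketch. The function names and the threshold value are hypothetical illustrations of the general technique, not anything Discourse actually ships.

```python
def should_auto_hide(flag_weights, threshold=3.0):
    """Hide a post once enough trust-weighted flags accumulate,
    without waiting for a human moderator. Each element of
    flag_weights is one user's flag weight (a made-up scale here)."""
    return sum(flag_weights) >= threshold

def rank_by_votes(posts):
    """posts: list of (post_id, upvotes) tuples.
    Best content first, the way StackOverflow orders answers."""
    return sorted(posts, key=lambda p: p[1], reverse=True)
```

Note the asymmetry: hiding is a thresholded, cumulative decision (so one bad actor can’t hide a post alone), while recognition is a simple ordering that costs nothing to compute.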
Help us identify the behaviors that you, as a forum operator, want to encourage and discourage, so we can provide you with the tools to make that happen.