I used Discourse (via DiscourseHosting) for a rather unusual purpose - hosting a distributed evaluation of a global UN project. It worked pretty well, from both a technical and a professional point of view. I wrote a review: here. Has anyone else tried anything similar?
What is crowd-sourced evaluation?
An evaluation report in its draft version gets sent around a bunch of stakeholders who make comments, perhaps using a tracked-changes feature, and the evaluator has to respond to those comments, sometimes in a procedure which is repeated for two more rounds. Over half the inputs and comments might be defensive or off-topic, but there are always at least a few which bring vital new information and interpretation which I hadn't picked up during the evaluation process.
I always thought:
How come I have to do all the work of writing an almost complete report & then wait for this highly valuable expert input, which arrives almost too late? Couldn't I put these experts to work to write the thing for me in the first place?
So my idea for a crowd-sourced evaluation is, essentially, to create an online version of the report, initially consisting of just the evaluation questions as sections and subsections, together with some initial evidence and ideas for answering them. Then I send out invitations to contribute relevant evidence, opinion and interpretation to help answer any of the questions. The invitation goes out by email to a much broader range of stakeholders than is usually possible, perhaps in waves - for example, beginning and ending with invitations to the most important stakeholders.