I’m excited about your take on forum structure and the organization of online communities.
I’m considering using it in the near future for an educational community where children aged 6-17 would be the main users, perhaps a few thousand of them.
Naturally I worry about their online safety, and about what advantages Discourse offers over existing forums here.
For simplicity, consider an all-private Discourse forum: what security features are there to keep dangerous people away? For example, requiring approval or an invitation from X number of people before signing up, or the absence of private messages altogether.
I already know about the 4 levels of trust, but I don’t think that alone is enough to protect children.
We did add an “agreed to the terms and conditions” checkbox to new signups for a customer. You would definitely want that here – @neil how can they turn it on?
What @riking said. `tos_accept_required` is one thing you’ll want to turn on.
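If you’d rather script it than click through Admin → Settings, here is a minimal sketch using the admin API’s site-settings endpoint, assuming you have an admin API key; the forum URL is a placeholder.

```python
import requests

BASE = "https://forum.example.com"        # hypothetical forum URL
HEADERS = {
    "Api-Key": "YOUR_ADMIN_API_KEY",      # generated under Admin -> API
    "Api-Username": "system",
}

def set_site_setting(name: str, value: str) -> None:
    """Flip a Discourse site setting through the admin API."""
    resp = requests.put(
        f"{BASE}/admin/site_settings/{name}",
        headers=HEADERS,
        data={name: value},
    )
    resp.raise_for_status()

# Require the terms-of-service checkbox on signup.
set_site_setting("tos_accept_required", "true")
```

The same helper works for any other site setting mentioned in this thread.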
Having some parents as moderators, or at least trusted members of the forum, could help too (unless it would stifle conversation to know that parents are reading everything). Education about flagging and how to spot suspicious activity can go a long way.
This would be really good.
Allow just messages to and from Admins and Mods.
One less thing to worry about if your forum’s a little wild and free to sign up.
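For what it’s worth, a sketch of enforcing that through a site setting, reusing the admin-API pattern above. The setting name is an assumption: `min_trust_to_send_messages` existed in some Discourse versions but may differ on yours, so verify it in Admin → Settings first.

```python
import requests

BASE = "https://forum.example.com"   # hypothetical forum URL
HEADERS = {"Api-Key": "YOUR_ADMIN_API_KEY", "Api-Username": "system"}

# Assumed setting name: raising the minimum trust level to 4 means
# ordinary users cannot start personal messages, so staff conversations
# are effectively the only PMs on the forum. Verify on your install.
requests.put(
    f"{BASE}/admin/site_settings/min_trust_to_send_messages",
    headers=HEADERS,
    data={"min_trust_to_send_messages": "4"},
).raise_for_status()
```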
I’m also very uncomfortable about the messages being read by admins. The topic was closed; I wish it could be reopened.
Maybe a Boolean setting to salt and obfuscate messages?
As I understand it, mods, and certainly administrators, have the means to view private messages between other users. Coupled with some sort of homegrown reporting tool built on the API, it should at least be easy to monitor for suspicious activity, particularly if PMs are expected to be a rarity (see the sketch below).
Making it clear that you are doing this, or have the ability to do this, may very well be enough to make troublemakers reconsider.
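A minimal sketch of such a reporting tool, assuming an admin API key. The `/topics/private-messages/{username}.json` and `/t/{id}.json` endpoints come from the public Discourse API docs; the forum URL, usernames, and keyword list are hypothetical placeholders.

```python
import requests

BASE = "https://forum.example.com"    # hypothetical forum URL
HEADERS = {"Api-Key": "YOUR_ADMIN_API_KEY", "Api-Username": "system"}

# Hypothetical watch list -- in practice you would pull every username
# from the admin user list instead of hard-coding a few.
USERNAMES = ["student_one", "student_two"]
KEYWORDS = ["phone number", "meet up", "secret"]

def flag_suspicious_messages() -> None:
    for username in USERNAMES:
        # List the user's personal-message topics.
        inbox = requests.get(
            f"{BASE}/topics/private-messages/{username}.json",
            headers=HEADERS,
        )
        inbox.raise_for_status()
        topics = inbox.json().get("topic_list", {}).get("topics", [])

        for topic in topics:
            # Fetch the posts in each message thread.
            thread = requests.get(f"{BASE}/t/{topic['id']}.json", headers=HEADERS)
            thread.raise_for_status()
            for post in thread.json()["post_stream"]["posts"]:
                text = post.get("cooked", "").lower()
                hits = [kw for kw in KEYWORDS if kw in text]
                if hits:
                    print(f"Review PM {topic['id']} post {post['id']} "
                          f"by {post['username']}: matched {hits}")

if __name__ == "__main__":
    flag_suspicious_messages()
```

Run from a daily cron job, something like this plus the built-in flag queue would cover most of the passive monitoring described above.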