The UK government is changing the law on online safety, and I was curious what measures Discourse is taking to address this, or whether there is a suggested response for anyone hosting in the UK?
Our community is looking into this, but it's unclear to me whether the onus is on us or on Discourse; if it's on individual communities, then in theory every WhatsApp group chat or Discord server would need to do this too.
Thanks in advance for any help (I did check but couldn't find a recent thread about this).
From Ofcom’s website: “firms are now legally required to start taking action to tackle criminal activity on their platforms”
I think that Discourse already provides a way to tackle criminal activity via the 'It's Illegal' flag, which alerts staff to illegal activity on the site. Realistically, other than putting in measures like the illegal flag option, is there much else that can be done?
Yup, we're on top of it. We've been monitoring it for most of the year and are prepared.
It's on both of us. We provide the tools to comply with the OSA (because we have to comply here on Meta), but you are responsible for how you use them.
The key considerations are:

- We have this with the illegal content flag that @ondrej posted above, which should trigger you to use the existing tools to remove the content in a compliant way and do the appropriate reporting.
- As above, the illegal flag or other custom flag types are available for you to use. We are making a change so that users who are not logged in can also flag illegal content. Do we have an ETA for that @tobiaseigen?
- You will need to define your internal processes yourself, but the data is all logged, so you will be able to report on it when required (see the sketch below).
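On the record-keeping point: the flag queue is reachable over the same JSON API the admin UI uses, so you could export it periodically for your own records. Here is a minimal, unofficial sketch in Python; the `/review.json` path, the `status` parameter, and the response field names are assumptions based on what the admin UI requests, so verify them against your own instance first.

```python
# Unofficial sketch: export the review (flag) queue for your own OSA records.
# Assumptions: the /review.json endpoint, its "status" parameter, and the
# response field names mirror what the Discourse admin UI requests.
import requests

BASE_URL = "https://forum.example.com"      # hypothetical forum URL
HEADERS = {
    "Api-Key": "YOUR_ADMIN_API_KEY",        # generated under /admin/api/keys
    "Api-Username": "system",
}

def fetch_reviewables(status: str = "all") -> list[dict]:
    """Return the reviewable items (flagged posts, etc.) as a list of dicts."""
    resp = requests.get(
        f"{BASE_URL}/review.json",
        headers=HEADERS,
        params={"status": status},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("reviewables", [])

if __name__ == "__main__":
    for item in fetch_reviewables():
        print(item.get("id"), item.get("type"), item.get("created_at"))
```

Dumping that output somewhere dated (a file per month, say) would give you a trail of what was flagged and when, alongside the moderation actions Discourse already logs.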
I'm interested too, so if anyone finds out I'd like to know. Although I can't find a specific definition, it appears that smaller services are treated slightly differently from larger service providers: "we aren't requiring small services with limited functionality to take the same actions as the largest corporations." Gov.uk, para. 6
Can the forum software readily provide a list of mod actions over a year? Or perhaps mod actions filtered by, say, responding to flags? I wouldn't want to have to keep a separate record. (Sometimes I will delete a user because of a flag; that isn't an option when responding to a flag.)
No, and I'm afraid it's unlikely that we will provide those. As with GDPR, we provide the tools to comply, but you will need to seek your own legal advice.
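For anyone who wants to assemble that list themselves: the staff action log shown at `/admin/logs/staff_action_logs` is backed by a JSON endpoint, so a yearly export is scriptable. A minimal, unofficial sketch follows; the `.json` path, the `page` parameter, and the field names are assumptions taken from the admin UI's own requests, so check them against your instance.

```python
# Unofficial sketch: export the last year of staff/mod actions to CSV.
# Assumptions: /admin/logs/staff_action_logs.json exists (it backs the admin
# UI page), accepts a "page" parameter, and returns a "staff_action_logs"
# array with the field names used below. Verify against your instance.
import csv
from datetime import datetime, timedelta, timezone

import requests

BASE_URL = "https://forum.example.com"   # hypothetical forum URL
HEADERS = {"Api-Key": "YOUR_ADMIN_API_KEY", "Api-Username": "system"}
CUTOFF = datetime.now(timezone.utc) - timedelta(days=365)

def staff_actions(page: int) -> list[dict]:
    resp = requests.get(
        f"{BASE_URL}/admin/logs/staff_action_logs.json",
        headers=HEADERS,
        params={"page": page},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("staff_action_logs", [])

kept = []
page = 0
while page < 100:                        # safety cap on pagination
    rows = staff_actions(page)
    if not rows:
        break
    for row in rows:
        # Timestamps look like "2025-01-31T12:34:56.789Z"; make them parseable.
        created = datetime.fromisoformat(row["created_at"].replace("Z", "+00:00"))
        if created >= CUTOFF:
            kept.append(row)
    page += 1

with open("mod_actions_last_year.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["created_at", "action_name", "acting_user", "subject"])
    for row in kept:
        writer.writerow([row.get("created_at"), row.get("action_name"),
                         row.get("acting_username"), row.get("subject")])
```

Filtering to just flag-related actions would then be a matter of checking each row's `action_name` before writing it out.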
There's a discussion on HN here (concerning a specific case where a person running 300 forums has decided to close them all), which contains useful information and links to official docs.
AFAICT, 700,000 monthly active UK users is the threshold for a medium service, and 7 million is the threshold for a large one.
Note that, I think, some aspects of the law are not sensitive to the size of the service, while others are.
I think this is a case of the risk to forum owners being low-probability but high-cost. Individual judgement, perception of risk, and attitude to risk will all be in play.
Thanks Hawk. (I see the PDF link resolves to the latest version, despite looking like a link to a specific one.)
Here's a sensible (but not authoritative) description of what the new laws might mean for self-hosted, small-scale forums that don't specifically target children or offer porn. The main point, I think, is to understand the law and document your approach. From there:
Duties
As a small user-to-user service, the OSA requires you to: