What’s next for NSFW detection in Discourse AI

If you’ve seen our previous announcement, you’ll know we are shifting toward LLM-based solutions, as they provide a vastly superior experience. As a result, we will soon be decommissioning the NSFW module in favor of Discourse AI Post Classifier - Automation rules (see update below).

Why are we doing this?

For the same reasons…

What's new?

What happens to NSFW?

This announcement should be considered very early notice; until we are ready to decommission the module, you can continue to use NSFW. When we do decommission it, we will remove all of its code from the Discourse AI plugin and take the associated services off our servers. The deprecation will only happen after AI Triage is ready to handle images, which will be done soon.

:point_right:t5: Update: The NSFW module has now been officially removed from Discourse, including all related site settings and features. We now urge users to transition to Discourse AI - AI triage and follow the guides listed above.

Business and Enterprise customers will see the following under What's New in the admin settings on their sites, allowing them to enable Discourse hosted LLMs to power AI triage at no extra cost.


FYI about the following change


This part is now complete: you can configure and use a vision-powered LLM via AI triage for image detection.
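For illustration, a vision-capable triage rule of this kind is typically built around a short classification prompt. The sketch below is a hypothetical example only; the actual field names and options live in your site's Automation admin UI and may differ:

```
System prompt (illustrative):
  You are a content moderation assistant. Examine any images in
  the post and respond with exactly one word:
  NSFW  - if the content is explicit or unsafe for work
  SAFE  - otherwise

Search for text: NSFW
Action: hide the post and notify moderators
```

The key idea is that the LLM is constrained to a single-word answer, which the automation rule then matches against to decide whether to act on the post.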

As a reminder, we have NOT yet deprecated the NSFW module. This will still take a bit of time.

A new guide is available here


As a heads-up, we are now hiding the site settings for enabling/disabling Toxicity and NSFW. This is part of our ongoing effort to deprecate these features.

If you have these features turned on, they will still operate as normal. We have NOT fully deprecated them yet.

If you have them turned off and would like to turn them on, you will no longer be able to do so.


Hey folks, dropping this update here