Discourse Image Filter

This plugin uses the Google Cloud Vision API to restrict uploads of explicit images to the forum. It applies to all uploaded images, i.e. avatars, logos, etc.

It uses the upload error popup to display a message explaining which violations caused the image upload to be rejected.

The acceptable limits can be set via the site settings.

Installation

  1. Create a service account on Google Cloud (see "Creating and managing service accounts" in the Cloud IAM documentation).
    Google will trigger a download of a JSON file. Store its contents safely.

  2. Paste these lines at the bottom of the env section in your app.yml:

     GOOGLE_ACCOUNT_TYPE: 'service_account'
     GOOGLE_CLIENT_ID: 'client-id-from-json-file'
     GOOGLE_CLIENT_EMAIL: 'service-account-email-address'
     GOOGLE_PRIVATE_KEY: see instructions below
  • How to set up the GOOGLE_PRIVATE_KEY parameter:

    • Paste the key from the JSON file into a code editor.
    • Use find and replace to replace \n with \\n.
    • Copy the result and paste it into app.yml after GOOGLE_PRIVATE_KEY, wrapping it in single quotes ' '.
  3. Now follow the regular steps of installing the plugin.
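The newline-escaping step above can be sketched in Python. This is a hypothetical helper (escape_private_key is not part of the plugin), assuming the standard service-account JSON layout with a top-level private_key field:

```python
import json

def escape_private_key(json_path):
    # Load the service-account JSON that Google provided and pull out
    # the private key, which contains literal newline characters.
    with open(json_path) as f:
        key = json.load(f)["private_key"]
    # Replace each real newline with the two-character sequence \n,
    # so the key fits on one line in app.yml.
    return key.replace("\n", "\\n")
```

Wrap the returned string in single quotes when pasting it after GOOGLE_PRIVATE_KEY.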

Plugin Settings

if_adult_max_acceptable: Maximum acceptable level of the category adult
if_spoof_max_acceptable: Maximum acceptable level of the category spoof
if_medical_max_acceptable: Maximum acceptable level of the category medical
if_violence_max_acceptable: Maximum acceptable level of the category violence
if_racy_max_acceptable: Maximum acceptable level of the category racy
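The idea behind these limits can be sketched as follows. This is an illustrative sketch only, not the plugin's actual code: the Vision API reports each SafeSearch category as a likelihood level, and an upload would be blocked if any detected level exceeds the configured maximum for that category.

```python
# Likelihood levels as defined by the Vision API, in increasing order.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def violations(annotation, limits):
    """Return the categories whose detected likelihood exceeds the limit."""
    return [
        category
        for category, level in annotation.items()
        if LIKELIHOOD_ORDER.index(level) > LIKELIHOOD_ORDER.index(limits[category])
    ]
```

For example, with an adult limit of POSSIBLE, an image annotated adult=LIKELY would be rejected, and the violation list drives the message shown in the upload error popup.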

You can read about these criteria and the API itself in detail in "Detect explicit content (SafeSearch)" in the Cloud Vision API documentation.

:page_facing_up: Get the code

:raising_hand_woman: Request a feature

:bug: Report a bug

Development Notes

To use the plugin in a dev environment on macOS, add this line to your .bash_profile file:


Thanks @Terrapop for sponsoring the plugin.



I made this PR https://github.com/discourse/discourse/pull/10605 because I realized that the upload error popup wouldn't display if there was a Rails-side exception while uploading avatars/logos. It works well when uploading from the composer. I thought it would be a nice upstream change to incorporate.


Thanks for the ping, missed the Github notification.


Thanks a lot for the merge. It helps discourse and my plugin both.


Great, thanks for your help making this possible, @fzngagan; it was a pleasure working with you.

For all communities that have child-safe COPPA requirements, want to stay safe in terms of AdSense, or simply don't want gore or nude images anywhere, this plugin is a must-have.


This might be of interest to you.

Thanks Faizaan! I was thinking about making something more focused on language understanding and customisable. Have you had any experience with that?



@jahan_gagan might have something to say on NLP.


I had tried the Stack Overflow dataset for tag prediction; after that I tried the same code on a Discourse dataset, but due to the lower volume of data on Discourse, that code didn't work there.


What scopes do I need in my service account? I'm getting "Request had insufficient authentication scopes".

This can help? It runs completely on the client side:

text toxicity model

This plugin is about IMAGES, not text.

For text there is a Google Perspective plugin available based on the Jigsaw engine: