This plugin uses the Google Cloud Vision API to block explicit images from being uploaded to the forum. It checks every uploaded image, including avatars, logos, and other site assets.
When an image is rejected, the upload error popup displays a message explaining which violations caused the rejection.
The acceptable limits can be set via the site settings.
Copy the result and paste it into app.yml as the value of GOOGLE_PRIVATE_KEY, wrapping it in single quotes (' ').
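For illustration, the entry in app.yml might look like the sketch below. The exact layout of your container's `env` section may differ, and the key shown is a placeholder, not a real value:

```yaml
## app.yml (illustrative fragment, assuming the standard Discourse container config)
env:
  GOOGLE_PRIVATE_KEY: '-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n'
```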
Now follow the regular steps of installing the plugin.
Plugin Settings
- if_adult_max_acceptable: Maximum acceptable level of the category adult
- if_spoof_max_acceptable: Maximum acceptable level of the category spoof
- if_medical_max_acceptable: Maximum acceptable level of the category medical
- if_violence_max_acceptable: Maximum acceptable level of the category violence
- if_racy_max_acceptable: Maximum acceptable level of the category racy
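The Vision API's SafeSearch annotation reports each category as a likelihood level (VERY_UNLIKELY through VERY_LIKELY), so a threshold check amounts to comparing positions on that ordered scale. The sketch below is a hypothetical illustration of that comparison, not the plugin's actual code; the method and variable names are invented:

```ruby
# Ordered SafeSearch likelihood levels, as defined by the Cloud Vision API.
LIKELIHOODS = %w[UNKNOWN VERY_UNLIKELY UNLIKELY POSSIBLE LIKELY VERY_LIKELY].freeze

# Returns true if the detected level exceeds the maximum acceptable level
# (e.g. the value of if_adult_max_acceptable). Illustrative helper only.
def violation?(detected, max_acceptable)
  LIKELIHOODS.index(detected) > LIKELIHOODS.index(max_acceptable)
end

# Example: collect the categories that would trigger a rejection when the
# site setting for each category is POSSIBLE.
results = { "adult" => "LIKELY", "violence" => "UNLIKELY" }
violations = results.select { |_category, level| violation?(level, "POSSIBLE") }
# => {"adult"=>"LIKELY"}
```

The categories that survive the `select` would then be listed in the upload error popup.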
I made this PR https://github.com/discourse/discourse/pull/10605 because I realized that the upload error popup wouldn't display if there was a Rails-side exception while uploading avatars/logos. It works well when uploading from the composer. I thought it would be a nice change to incorporate upstream.
Great, thanks for your help making this possible, @fzngagan; it was a pleasure working with you.
For communities with child-safety (COPPA) requirements, communities that need to stay AdSense-compliant, or any community that simply doesn't want gore or nude images anywhere, this plugin is a must-have.
I had tried the Stack Overflow dataset for tag prediction; after that, I tried the same code on a Discourse dataset, but because Discourse has far less data, the code didn't work there.