This is a guide to running your own instances of the services that power the Discourse AI modules.
Introduction
If you want to use Discourse AI on your self-hosted instance, you may also need to run the companion services for the modules you want to enable.
Each module depends on one or more companion services, and those services use more CPU, GPU, and disk space than Discourse itself, so keep in mind that this setup is not recommended for people unfamiliar with Linux server administration and Docker.
Summarization / AI Helper / AI Bot
Embeddings
Toxicity
To run a copy of the classification service, use:
docker run -it --rm --name detoxify -e BIND_HOST=0.0.0.0 -p 6666:80 ghcr.io/discourse/detoxify:latest
NSFW
To run a copy of the classification service, use:
docker run -it --rm --name nsfw -e BIND_HOST=0.0.0.0 -p 6666:80 ghcr.io/discourse/nsfw-service:latest
Sentiment
To run a copy of the classification service, use:
docker run -it --rm --name sentiment -e BIND_HOST=0.0.0.0 -p 6666:80 ghcr.io/discourse/sentiment-service:latest
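Note that all three examples publish the same host port (6666), so if you run more than one classification service on the same machine, change the host side of the mapping for the others (for example -p 6667:80). Whichever container you start, you can confirm it is reachable before wiring it into Discourse. A minimal check from the Docker host, assuming the default port mapping above; the response body depends on the image, so consult the service's own documentation for its actual API:
curl -i http://localhost:6666/
A connection refused or timeout here points at a Docker networking or firewall problem rather than a Discourse misconfiguration.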
Running in production
When running on a live site, you may want to put these services behind a reverse proxy to add load balancing, TLS termination, health checks, rate limiting, and similar features.
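As a minimal sketch, assuming the Toxicity container and a reverse proxy run on the same host, you can start the service detached, with an automatic restart policy, and published only on the loopback interface so that outside traffic has to pass through the proxy (names and ports are placeholders to adapt):
docker run -d --restart unless-stopped --name detoxify -e BIND_HOST=0.0.0.0 -p 127.0.0.1:6666:80 ghcr.io/discourse/detoxify:latest
The reverse proxy (nginx, Caddy, HAProxy, etc.) then terminates TLS on a public hostname and forwards requests to 127.0.0.1:6666.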
Once the service is up and running, use the appropriate site setting to point the module at the domain where the service is hosted, then enable the module.