Self-Hosting Sentiment and Emotion for DiscourseAI

The Discourse AI plugin supports requesting emotion/sentiment classification of new posts, which is stored in the database and can be used in reports and on admin dashboards.

Running with HuggingFace TEI

HuggingFace provides an excellent container image, Text Embeddings Inference (TEI), that can get you running quickly.

For example:

mkdir -p /opt/tei-cache
docker run --rm --gpus all --shm-size 1g -p 8081:80 \
  -v /opt/tei-cache:/data \
  ghcr.io/huggingface/text-embeddings-inference:latest \
  --model-id cardiffnlp/twitter-roberta-base-sentiment-latest \
  --revision refs/pr/30

This should get you up and running with a local instance of ‘cardiffnlp/twitter-roberta-base-sentiment-latest’, an open model that can classify posts into positive/negative/neutral.

You can check that it’s working with:

curl http://localhost:8081/ \
    -X POST \
    -H 'Content-Type: application/json' \
    -d '{ "inputs": "I am happy" }'

Under normal operation, this returns an array of confidence scores, one for each label.
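For reference, a successful response looks something like the following (illustrative only — the exact labels and scores depend on the model you are running):

```json
[
  { "label": "positive", "score": 0.98 },
  { "label": "neutral",  "score": 0.015 },
  { "label": "negative", "score": 0.005 }
]
```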

Supported models

At the time of writing, the plugin works with open classification models such as ‘cardiffnlp/twitter-roberta-base-sentiment-latest’ for sentiment and ‘j-hartmann/emotion-english-distilroberta-base’ for emotion; check the plugin documentation for the current list.

Making it available for your Discourse instance

Most of the time, you will run this on a dedicated server to take advantage of GPU acceleration. When doing so, I recommend putting a reverse proxy in front of it, terminating TLS there, and securing the endpoint so that only your Discourse instance can connect to it.
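As a sketch, an nginx reverse proxy for this setup might look like the following (the hostname, certificate paths, and allowed IP are placeholders you must adapt):

```nginx
server {
    listen 443 ssl;
    server_name sentiment.example.com;  # placeholder hostname

    # placeholder certificate paths (e.g. from Let's Encrypt)
    ssl_certificate     /etc/letsencrypt/live/sentiment.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/sentiment.example.com/privkey.pem;

    # Only allow your Discourse instance to connect
    allow 203.0.113.10;  # placeholder: your Discourse server's IP
    deny all;

    location / {
        proxy_pass http://127.0.0.1:8081;  # the TEI container from above
        proxy_set_header Host $host;
    }
}
```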

Configuring Discourse AI

Discourse AI includes site settings to configure the inference server for open-source models. You should point it to your server using the ai_sentiment_model_configs setting.
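For example, an ai_sentiment_model_configs entry might look like this (the endpoint and key are placeholders; check the setting's description in your Discourse version for the exact fields):

```yaml
model_name: cardiffnlp/twitter-roberta-base-sentiment-latest
endpoint: https://sentiment.example.com  # your reverse-proxied TEI server
api_key: "123"  # if you secured the endpoint with a key
```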

After that, enable the classification by toggling ai_sentiment_enabled.