Discourse AI - Sentiment

Yes, the supported models are the ones listed in the OP.

We will eventually add support for classifying using LLMs, for people for whom cost isn't an issue.

Well, the whole feature is built around classifying posts with ML models, so yes, you need somewhere to run them.

And since Discourse can run on the very cheapest VPS out there, running ML models is indeed more expensive. If you want the feature in the cheapest way possible, it is doable to run it on a server with just a handful of CPU cores, as long as you have enough RAM to load the models.
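For a very rough sense of the RAM side of that sizing, here is a back-of-the-envelope sketch. It assumes a hypothetical BERT-class classification model of ~125M parameters stored as fp32 (4 bytes per parameter); the actual supported models and their sizes vary, and this covers only the weights, not activations or runtime overhead:

```python
# Rough RAM estimate for holding a classification model's weights in memory.
# Assumes fp32 weights (4 bytes/parameter); quantized models need less.
def model_ram_gb(num_params: int, bytes_per_param: int = 4) -> float:
    return num_params * bytes_per_param / 1024**3

# Hypothetical ~125M-parameter model, not any specific supported model.
print(round(model_ram_gb(125_000_000), 2))  # -> 0.47 (GB, weights alone)
```

So a small CPU-only box with a few GB of free RAM can comfortably hold a model of this class; budget extra headroom for the inference runtime itself.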