Please reach out to team@discourse.
It's a huge shame not to have “related topics” enabled on your site. This is 100% hosted by us (and already included in all plans), so there are no data, token, or pricing concerns.
Similarly, we self-host the sentiment models and would be happy to explore them with you. I remain a little skeptical about sentiment because I am still trying to understand what specific problems it solves.
We are also exploring running open models on our own hardware; Llama 3 is surprisingly capable. Longer term, we may be able to power features such as summaries with our own models, especially if we can get the 7B models to handle them reliably.
I hear you on budgets. Discourse AI already tracks token usage per person, and we intend to add quota support to the plugin.
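For illustration only, here is a minimal sketch of what per-user token accounting with a quota cap could look like; the names (`TokenLedger`, `record_usage`, `MONTHLY_QUOTA`) are hypothetical and not the plugin's actual API:

```python
# Hypothetical sketch of per-user token accounting with a quota cap.
# These names are illustrative, not the Discourse AI plugin's real API.
from collections import defaultdict

MONTHLY_QUOTA = 100_000  # example value: tokens per user per month

class TokenLedger:
    def __init__(self, quota=MONTHLY_QUOTA):
        self.quota = quota
        self.used = defaultdict(int)  # user_id -> tokens consumed this period

    def record_usage(self, user_id: int, tokens: int) -> None:
        """Add tokens consumed by a completed request to the user's tally."""
        self.used[user_id] += tokens

    def allow_request(self, user_id: int, estimated_tokens: int) -> bool:
        """Reject a request that would push the user over their quota."""
        return self.used[user_id] + estimated_tokens <= self.quota

ledger = TokenLedger()
ledger.record_usage(user_id=42, tokens=1_200)
print(ledger.allow_request(user_id=42, estimated_tokens=500))  # True while under quota
```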
This is very much on our roadmap and we are working on it right now: the ability to point at a URL and specify the “dialect” it speaks.
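As a rough sketch of the idea, assuming a “dialect” is just a label for the request schema an endpoint expects; the field names and `build_payload` helper below are illustrative, not the plugin's actual settings:

```python
# Hypothetical sketch: an endpoint declares its URL and dialect, and the
# dialect decides how the request payload is shaped. All names are examples.
def build_payload(dialect: str, model: str, prompt: str) -> dict:
    """Shape the request body according to the endpoint's declared dialect."""
    if dialect == "openai":
        return {"model": model, "messages": [{"role": "user", "content": prompt}]}
    if dialect == "ollama":
        return {"model": model, "prompt": prompt}
    raise ValueError(f"unsupported dialect: {dialect}")

endpoint = {
    "url": "https://llm.example.internal/v1/chat/completions",  # self-hosted URL
    "dialect": "openai",
    "model": "llama-3-8b-instruct",
}
payload = build_payload(endpoint["dialect"], endpoint["model"], "Summarize this topic")
```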