What's stopping you from trying Discourse AI?

I think most people are using a paid plan with one of the larger AI service providers (there’s a list of supported models here in the documentation).

Unfortunately, I'm not aware of any affordable options for self-hosters: anything GPU-based that I know of is in the price range you mentioned, and I suspect CPU-based inference will be too slow, even on more powerful machines.
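If you want to check that claim on your own hardware before ruling it out, here is a minimal sketch that times CPU-only generation with Hugging Face Transformers. The model name and token counts are just illustrative assumptions, not a recommendation; swap in whatever small open model you'd actually want to serve.

```python
# Rough CPU throughput check (illustrative; model choice is an assumption).
import time
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-0.5B-Instruct"  # hypothetical small model for the test
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)  # stays on CPU by default

prompt = "Summarize the benefits of forum moderation in three sentences."
inputs = tokenizer(prompt, return_tensors="pt")

start = time.time()
outputs = model.generate(**inputs, max_new_tokens=128)
elapsed = time.time() - start

generated = outputs.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{generated} tokens in {elapsed:.1f}s -> {generated / elapsed:.1f} tokens/s")
```

If the tokens-per-second figure is in the low single digits, features like summarization or AI search are going to feel unusably slow for forum users, which is the practical problem with CPU-only setups.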
