Setup request: AI summarization in Discourse and LLM integration

Hello Discourse Support Team,

I have a few questions regarding the use of AI summarization in Discourse, and I would appreciate your help in guiding me through the setup process:

  1. How to Initialize AI Summarization:

    • What steps do I need to follow to activate and configure AI summarization in Discourse?

    • Are there any plugins required to get started with AI summarization? If so, which ones, and where can I find them?

  2. Using Our Own LLM (from Dify):

    • Is it possible to integrate our custom LLM, created through the Dify application, for summarization tasks in Discourse? If so, how can we do that using an API?

    • Can you provide any API documentation or integration guides for connecting third-party LLMs to Discourse?

  3. Cost of AI Models in Discourse:

    • Available LLM Models: What are the different AI summarization models available in Discourse, and how do I choose between them?

    • Free vs Paid Models: Are any of the LLM models free to use, or is there a cost associated with all of them?

    • Usage Restrictions: Are there any limitations on usage for free models, such as daily request limits or token restrictions?

    • Cost Metrics: What is the pricing structure for each model, including any per-request or per-token costs? If there are multiple pricing tiers, can you provide a breakdown for each model?

  4. Additional Information:

    • Please let me know if there are any additional considerations or steps I need to take to fully utilize the AI summarization features in Discourse.

Thank you in advance for your help!

Hi @S.AAKASH_MUTHIAH, most of your queries are answered in the docs for the AI plugin here.



Yes, the AI plugin. It is included in core if you're up to date; just go to /admin/plugins to enable and configure it.
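If you prefer scripting the toggle rather than clicking through /admin/plugins, Discourse site settings can also be changed via the admin API. A minimal sketch, assuming the setting is named `discourse_ai_enabled` and that you hold an admin API key (the setting name, URL, and credentials here are assumptions to illustrate the pattern, not verified against your instance):

```python
import urllib.parse
import urllib.request


def build_enable_ai_request(base_url: str, api_key: str,
                            api_username: str) -> urllib.request.Request:
    """Build (but do not send) a PUT request that would toggle the
    assumed `discourse_ai_enabled` site setting via the admin API."""
    data = urllib.parse.urlencode({"discourse_ai_enabled": "true"}).encode()
    return urllib.request.Request(
        f"{base_url}/admin/site_settings/discourse_ai_enabled.json",
        data=data,
        method="PUT",
        headers={
            "Api-Key": api_key,            # placeholder admin API key
            "Api-Username": api_username,  # admin account the key belongs to
        },
    )


req = build_enable_ai_request("https://forum.example.com", "KEY", "system")
print(req.get_method(), req.full_url)
```

Sending the request is then a matter of `urllib.request.urlopen(req)`; building it separately keeps the sketch testable without touching a live forum.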



Most are paid, but you can take a look at OpenRouter, which provides some free models as well.


AFAIK this will be up to your provider, but CMIIW you can enforce limits in Discourse as well.



That will be up to your LLM provider; pricing differs between them.
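As a worked illustration of how per-token pricing usually composes: most providers bill input and output tokens separately, quoted per million tokens. The rates and token counts below are made-up numbers, not any provider's actual prices:

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate one request's cost given per-million-token prices."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m


# Hypothetical rates: $0.50 per 1M input tokens, $1.50 per 1M output tokens.
# Summarizing a long topic: ~20,000 tokens in, ~500 tokens out.
cost = estimate_cost(20_000, 500, 0.50, 1.50)
print(f"${cost:.6f}")  # prints "$0.010750"
```

Summarization is input-heavy (the whole topic goes in, a short summary comes out), so the input-token rate usually dominates the bill.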


See all the docs for the AI plugin here.

For the settings offered in the plugin, see this guide.


Hope this helps!

1 Like

Hi,
Thank you so much for your quick and helpful response!

I checked the plugin options on my Discourse instance, but I don’t see anything related to AI listed there. I’ve attached an image showing the available plugin options on my end — could you please take a look and let me know if I’m missing something?

Are you on the latest test-passed version?

1 Like

my current version

Try updating your forum and see if that works. Taking a look at your version, it looks like the AI plugin was not bundled in core yet.

1 Like

Thank you, it works now.

I have a question. I created a workflow in Dify, and now I want to use that workflow's API to integrate my own LLM. Could you let me know what format the input should be sent in, where to add the API details, and which option to select to configure our own model?

I found one thing: is this the page where I need to configure the settings?


If yes, which option should I select from this dropdown?

If you choose ‘Manual configuration’ in the LLMs page, and put your keys in there, does it work?

I haven’t used the AI plugin in a while, so my memory of it is a little hazy.
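For what it's worth, manual LLM configuration in Discourse generally assumes an OpenAI-compatible chat-completions endpoint, so the request body your endpoint receives would resemble the sketch below. The exact fields vary by Discourse version, the model name and prompt text here are made up, and Dify's own workflow API uses a different shape, so a small translation proxy between the two formats is one common approach; treat all of this as an illustrative assumption, not the verified wire format:

```python
import json

# Sketch of an OpenAI-style chat-completions request body -- the shape an
# "OpenAI-compatible" endpoint would receive. Field names follow the
# OpenAI API convention; the model name and contents are placeholders.
payload = {
    "model": "my-dify-workflow",  # whatever name you configured
    "messages": [
        {"role": "system", "content": "Summarize the following forum topic."},
        {"role": "user", "content": "<topic posts go here>"},
    ],
    "max_tokens": 500,
    "stream": False,
}

body = json.dumps(payload)
print(body[:60])
```

A translation proxy would accept this JSON, map `messages` onto the inputs your Dify workflow expects, and return the workflow's output wrapped in an OpenAI-style response object.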

I would like to understand the type of input that will be sent to the API endpoint from Discourse.