I have a few questions regarding the use of AI summarization in Discourse, and I would appreciate your help in guiding me through the setup process:
How to Initialize AI Summarization:
What steps do I need to follow to activate and configure AI summarization in Discourse?
Are there any plugins required to get started with AI summarization? If so, which ones, and where can I find them?
Using Our Own LLM (from Dify):
Is it possible to integrate our custom LLM, created through the Dify application, for summarization tasks in Discourse? If so, how can we do that using an API?
Can you provide any API documentation or integration guides for connecting third-party LLMs to Discourse?
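To make the integration question concrete, here is roughly how we call our Dify workflow today. This is only a minimal sketch: the endpoint path, the payload fields, and the `DIFY_API_KEY` / `summarize_with_dify` names are my own assumptions based on Dify's workflow API, not anything taken from Discourse.

```python
import os
import requests

# Minimal sketch of how we currently call our Dify workflow.
# The endpoint path and payload fields ("inputs", "response_mode", "user")
# are assumptions based on Dify's workflow API; adjust as needed.
DIFY_API_KEY = os.environ["DIFY_API_KEY"]       # hypothetical env var
DIFY_BASE_URL = "https://api.dify.ai/v1"        # or a self-hosted Dify URL

def summarize_with_dify(text: str) -> str:
    """Send topic text to our Dify workflow and return the summary it produces."""
    resp = requests.post(
        f"{DIFY_BASE_URL}/workflows/run",
        headers={"Authorization": f"Bearer {DIFY_API_KEY}"},
        json={
            "inputs": {"text": text},   # "text" is the input variable defined in our workflow
            "response_mode": "blocking",
            "user": "discourse-summarizer",
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["data"]["outputs"]["summary"]  # "summary" is our workflow's output variable
```

What I can't tell is whether Discourse can call an endpoint shaped like this directly, or whether it expects an OpenAI-compatible API that I would need to put in front of the workflow.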
Cost of AI Models in Discourse:
Available LLM Models: What are the different AI summarization models available in Discourse, and how do I choose between them?
Free vs Paid Models: Are any of the LLM models free to use, or is there a cost associated with all of them?
Usage Restrictions: Are there any limitations on usage for free models, such as daily request limits or token restrictions?
Cost Metrics: What is the pricing structure for each model, including any per-request or per-token costs? If there are multiple pricing tiers, can you provide a breakdown for each model?
Additional Information:
Please let me know if there are any additional considerations or steps I need to take to fully utilize the AI summarization features in Discourse.
Hi,
Thank you so much for your quick and helpful response!
I checked the plugin options on my Discourse instance, but I don’t see anything related to AI listed there. I’ve attached an image showing the available plugin options on my end — could you please take a look and let me know if I’m missing something?
I also have a question: I created a workflow on Dify and now I want to use that workflow's API to integrate my own LLM. Could you let me know what format the input should be sent in, where in Discourse I should add the API endpoint and key, and which option I need to select to configure our own model?
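For example, is the request body that Discourse sends to a custom LLM endpoint something like the OpenAI-style chat payload below? This is purely my guess at the format, not something taken from the docs, so please correct me if it looks different.

```python
# My guess at the kind of request body Discourse might send to a custom LLM
# endpoint (OpenAI-style chat completion). This is an assumption, not taken
# from the Discourse documentation; please correct me if the format differs.
example_request_body = {
    "model": "dify-workflow-summarizer",  # hypothetical model name
    "messages": [
        {"role": "system", "content": "Summarize the following forum topic."},
        {"role": "user", "content": "<topic posts concatenated here>"},
    ],
    "stream": False,
}
```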