Warning about `input must have less than 8192 tokens` with Discourse AI

If you self-host that same model, it can take inputs of up to 32k tokens. That is what we run on our hosting these days.

If that’s out of the question, then you need to configure the embeddings model to limit inputs to the maximum your provider allows. With that set, our AI Bot RAG will split uploaded files into chunks, and Related Topics / Search will take only the first 8192 tokens of each topic.
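The truncation behavior described above can be sketched roughly like this. This is a simplified illustration, not the plugin's actual code: real embeddings pipelines count tokens with the model's own tokenizer, while here a plain whitespace split stands in for it, and the function name `truncate_to_token_limit` is hypothetical.

```python
def truncate_to_token_limit(text: str, max_tokens: int = 8192) -> str:
    """Keep only the first max_tokens tokens of text.

    Whitespace splitting is a stand-in for a real tokenizer
    (e.g. the embeddings model's BPE tokenizer), used here only
    to illustrate the idea of a provider-side input limit.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text  # already within the limit, pass through unchanged
    return " ".join(tokens[:max_tokens])


# A long topic gets cut down before being sent for embedding:
long_topic = " ".join(f"word{i}" for i in range(10_000))
trimmed = truncate_to_token_limit(long_topic)
print(len(trimmed.split()))  # 8192
```

Chunking for RAG works the same way in spirit, except that instead of discarding the tail, the text is split into multiple limit-sized pieces and each piece is embedded separately.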