Discourse AI - Embeddings

Yes.

Yes, each model produces different vector representations.

It’s basically one call per topic, so very easy to calculate.

Topics longer than 8k tokens will be truncated to 8k tokens; topics shorter than that use their actual length.

Yes.

Both work at the topic level, so one per topic.
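
As a rough illustration of that per-topic calculation (the topic count, average length, and per-token price below are made-up placeholders, not Discourse defaults), a back-of-the-envelope estimate could look like this:

```ruby
# Rough embedding-cost estimate: one embedding call per topic,
# with each topic truncated to at most 8k tokens.
# All numbers here are hypothetical placeholders.

topic_count          = 10_000   # topics on the forum
avg_tokens_per_topic = 1_200    # average topic length in tokens
max_tokens           = 8_000    # truncation limit per call
price_per_1k_tokens  = 0.0001   # example provider price in USD

tokens_per_topic = [avg_tokens_per_topic, max_tokens].min
total_calls      = topic_count
total_tokens     = total_calls * tokens_per_topic
estimated_cost   = (total_tokens / 1_000.0) * price_per_1k_tokens

puts "#{total_calls} calls, #{total_tokens} tokens, ~$#{estimated_cost.round(2)}"
```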

It seems that this documentation topic has been out of date since this commit, and so has this other documentation topic.

Indeed. @Saif can you update here?

The OP has now been updated.

May I ask how to properly add the gems from the suggested PR without forking the plugin?

I’m trying the scale-to-zero feature on HuggingFace, and I just need to use the rake task to backfill embeddings.
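
(For reference, the backfill is usually kicked off from inside the app container. The task name below is an assumption based on how the plugin's embeddings tasks are commonly referenced; verify the exact name on your install before running it.)

```
cd /var/discourse
./launcher enter app
rake ai:embeddings:backfill   # task name assumed; verify with `rake -AT | grep ai:embeddings`
```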

Why do I get a 418 error code when I use the Discourse AI embeddings full search (DiscourseAi::Embeddings::EmbeddingsController#search) as JSON? Could you help me?