Hi everyone, I updated my plugin, but GPT-3.5 turbo still doesn't work.
Message (3 copies reported)
Error running LLM report! : DiscourseAi::Completions::Llm::UNKNOWN_MODEL : DiscourseAi::Completions::Llm::UNKNOWN_MODEL
Backtrace
/var/www/discourse/plugins/discourse-ai/lib/completions/dialects/dialect.rb:27:in `dialect_for'
/var/www/discourse/plugins/discourse-ai/lib/completions/llm.rb:64:in `proxy'
/var/www/discourse/plugins/discourse-ai/lib/automation/report_runner.rb:67:in `initialize'
/var/www/discourse/plugins/discourse-ai/lib/automation/report_runner.rb:33:in `new'
/var/www/discourse/plugins/discourse-ai/lib/automation/report_runner.rb:33:in `run!'
/var/www/discourse/plugins/discourse-ai/discourse_automation/llm_report.rb:75:in `block (2 levels) in <main>'
/var/www/discourse/plugins/discourse-automation/app/models/discourse_automation/automation.rb:135:in `trigger!'
/var/www/discourse/plugins/discourse-automation/app/jobs/regular/discourse_automation_trigger.rb:13:in `execute'
/var/www/discourse/app/jobs/base.rb:297:in `block (2 levels) in perform'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/rails_multisite-5.0.0/lib/rails_multisite/connection_management.rb:82:in `with_connection'
Roman (Roman Rizzi)
Hey @whitewaterdeu - I suspect your automation script may still be using the wrong model name (gpt-3-5-turbo). Could you update the model name in the script settings to gpt-3.5-turbo?
Thanks, I changed it to gpt-3.5-turbo and it worked, but now I'm getting another error.
It looks like I need to reduce the number of tokens being sent to OpenAI, but I don't know how to do that.
Message
DiscourseAi::Completions::Endpoints::OpenAi: status: 400 - body: {
"error": {
"message": "此模型的最大上下文长度为 4097 个 token。但是,您的消息导致了 5605 个 token。请减少消息的长度。",
"type": "invalid_request_error",
"param": "messages",
"code": "context_length_exceeded"
}
}
Backtrace
/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/base.rb:91:in `block (2 levels) in perform_completion!'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/net-http-0.4.1/lib/net/http.rb:2353:in `block in transport_request'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/net-http-0.4.1/lib/net/http/response.rb:320:in `reading_body'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/net-http-0.4.1/lib/net/http.rb:2352:in `transport_request'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/net-http-0.4.1/lib/net/http.rb:2306:in `request'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/rack-mini-profiler-3.3.0/lib/patches/net_patches.rb:19:in `block in request_with_mini_profiler'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/rack-mini-profiler-3.3.0/lib/mini_profiler/profiling_methods.rb:50:in `step'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/rack-mini-profiler-3.3.0/lib/patches/net_patches.rb:18:in `request_with_mini_profiler'
/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/base.rb:89:in `block in perform_completion!'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/net-http-0.4.1/lib/net/http.rb:1570:in `start'
Roman (Roman Rizzi)
That's not something you can do on your end. We should be automatically trimming content so it fits the context window, so there must be a bug in our code.
While we work on a fix, you can switch to gpt-4, which has twice the context window.
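The automatic trimming Roman describes could be sketched roughly like this. This is a minimal illustration, not the discourse-ai plugin's actual code: the method names are hypothetical, and the ~4-characters-per-token ratio is a crude heuristic rather than a real tokenizer.

```ruby
# Rough character-based token estimate. English text averages about
# 4 characters per token with OpenAI tokenizers; a real implementation
# would use an actual tokenizer for the target model.
def estimate_tokens(text)
  (text.length / 4.0).ceil
end

# Keep the most recent messages that fit the token budget, always
# preserving the first (system) message. Messages are hashes like
# { role: "user", content: "..." }.
def trim_to_budget(messages, max_tokens)
  system_msg, *rest = messages
  budget = max_tokens - estimate_tokens(system_msg[:content])
  kept = []
  rest.reverse_each do |msg|
    cost = estimate_tokens(msg[:content])
    break if cost > budget
    budget -= cost
    kept.unshift(msg)
  end
  [system_msg] + kept
end
```

With a 4097-token budget, an oversized middle message would be dropped while the system prompt and the most recent short message survive.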
sam (Sam Saffron)
GPT-3.5 is outdated and should no longer be used; most current models ship with very large context windows.