After enabling sentiment analysis using OpenAI, my error log is filled with
Job exception: Net::HTTPBadResponse
Then in Sidekiq:
Jobs::PostSentimentAnalysis
Jobs::HandledExceptionWrapper: Wrapped Net::HTTPBadResponse: Net::HTTPBadResponse
Looking at the post_ids in question, the issue seems to be caused by unusually long posts. Otherwise, the sentiment graphs generate without any problems.
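As a hypothetical way to confirm that correlation, something like this from the Rails console would list the raw length of each failing post (the ids below are placeholders for the ones taken from the error log):

```ruby
# Hypothetical Rails console check: are the failing post_ids unusually long posts?
failing_ids = [123, 456] # placeholder ids copied from the error log
Post.where(id: failing_ids).pluck(:id, :raw).each do |id, raw|
  puts "post #{id}: #{raw.length} characters"
end
```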
Jagster
(Jakke Lehtonen)
September 15, 2024, 7:26 AM
2
I didn’t know sentiment analysis could be used through OpenAI.
To add more information, I’ve got Number of tokens for the prompt set according to the recommendation (64,000, which is 50% of the 128K context window). But I’m not sure if this plays any role…
Jagster
(Jakke Lehtonen)
September 15, 2024, 3:27 PM
4
No, it doesn’t, because then you would get a different error. What you get is exactly what it sounds like: a wrong status code.
How did you set up sentiment analysis using GPT from OpenAI?
The error message isn’t exactly descriptive. I don’t even know the exact HTTP code.
The culprit is clear: it has something to do with post length. In that case the OpenAI API returns 400 Bad Request.
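For what it’s worth, a minimal Ruby sketch (stdlib only, assuming an OPENAI_API_KEY environment variable and a model name that may differ from yours) shows which status code the chat completions endpoint actually returns for an oversized prompt:

```ruby
# Sends a deliberately oversized prompt and prints the resulting status code.
require "net/http"
require "json"
require "uri"

uri = URI("https://api.openai.com/v1/chat/completions")
req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json",
                               "Authorization" => "Bearer #{ENV["OPENAI_API_KEY"]}")
req.body = {
  model: "gpt-4o-mini",                                     # assumed model name
  messages: [{ role: "user", content: "word " * 200_000 }]  # far beyond a 128K context window
}.to_json

res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
puts res.code                                    # "400" when the prompt exceeds the context window
puts JSON.parse(res.body).dig("error", "message")
```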
You just enable the ai sentiment enabled module, and it works as long as you have the LLM model parameters entered.
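A rough sketch of that toggle from the Rails console, assuming the discourse-ai plugin exposes it as the ai_sentiment_enabled site setting:

```ruby
# Assumed site setting name; LLM model parameters still need to be configured in the admin UI.
SiteSetting.ai_sentiment_enabled = true
```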
1 Like
I’ve experimented with setting it as low as 10,000 tokens, with no improvement. Net::HTTPBadResponse
is still filling my error log.
sam
(Sam Saffron)
September 15, 2024, 10:12 PM
7
Can you share a screenshot of your exact config, and your LLM config?
Falco
(Falco)
September 21, 2024, 2:52 AM
9
There is no sentiment analysis feature via OpenAI in Discourse AI. Please follow Discourse AI - Self-Hosted Guide if you want to run it locally.