Overgrow
(OG)
September 14, 2024, 11:00pm
1
After enabling sentiment analysis using OpenAI, my error log is filled with:
Job exception: Net::HTTPBadResponse
Then in Sidekiq:
Jobs::PostSentimentAnalysis
Jobs::HandledExceptionWrapper: Wrapped Net::HTTPBadResponse: Net::HTTPBadResponse
Looking at the post_ids in question, it seems the issue is likely caused by unusually long posts. Otherwise, the sentiment graphs generate without any problems.
Jagster
(Jakke Lehtonen)
September 15, 2024, 7:26am
2
I didn’t know that sentiment analysis could be used through OpenAI.
Overgrow
(OG)
September 15, 2024, 2:59pm
3
To add more information: I’ve got Number of tokens for the prompt set according to the recommendation (64,000, which is 50% of the 128K context window). But I’m not sure if this plays any role…
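For context, the recommendation works out roughly like this (a back-of-the-envelope sketch, not the plugin’s code; the ~4 characters-per-token figure is only a heuristic, and the post body is a placeholder):

```ruby
# Rough sketch of the recommended prompt budget and a crude length check.
CONTEXT_WINDOW = 128_000                # 128K-token model
prompt_budget  = CONTEXT_WINDOW / 2     # => 64_000, the recommended setting

def rough_token_count(text)
  (text.length / 4.0).ceil              # ~4 characters per token, approximate only
end

post_body = "…raw text of one of the offending posts…"
if rough_token_count(post_body) > prompt_budget
  puts "post likely exceeds the prompt budget"
else
  puts "post should fit"
end
```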
Jagster
(Jakke Lehtonen)
September 15, 2024, 3:27pm
4
No, it doesn’t, because then you would get a different error. What you get is exactly what it sounds like: a wrong status code.
How did you set up sentiment analysis using OpenAI’s GPT?
Overgrow
(OG)
September 15, 2024, 3:51pm
5
The error message isn’t exactly descriptive; I don’t even know the exact HTTP code.
The culprit seems clear, though: it has something to do with post length. In that case the OpenAI API returns 400 Bad Request.
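You can see that kind of rejection with a bare Net::HTTP call (a minimal sketch, not the plugin’s actual request code; the model name, API key variable and post body are placeholders):

```ruby
require "net/http"
require "json"
require "uri"

# Minimal sketch, not the plugin's request code; model and key are placeholders.
long_post = "…very long post body…"

uri = URI("https://api.openai.com/v1/chat/completions")
req = Net::HTTP::Post.new(uri)
req["Content-Type"]  = "application/json"
req["Authorization"] = "Bearer #{ENV['OPENAI_API_KEY']}"
req.body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: long_post }]
}.to_json

res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
puts res.code   # "400" when the request is rejected, e.g. when the input is too large
puts res.body   # OpenAI returns a JSON error body explaining the rejection
```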
You just enable the ai sentiment enabled setting, and it works as long as you have the LLM model parameters entered.
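Concretely, the toggle is just a site setting, so it can also be flipped from the Rails console (assuming the stock ai_sentiment_enabled setting name used by the Discourse AI plugin):

```ruby
# From a Rails console on the server — equivalent to ticking the admin checkbox.
# The LLM model parameters themselves still have to be filled in via the admin UI.
SiteSetting.ai_sentiment_enabled = true
```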
Overgrow
(OG)
September 15, 2024, 7:46pm
6
I’ve experimented with settings as low as 10,000 tokens, with no improvement. Net::HTTPBadResponse is still filling my error log.
sam
(Sam Saffron)
September 15, 2024, 10:12pm
7
Can you share a screenshot of your exact config and your LLM config?
Overgrow
(OG)
September 16, 2024, 11:07pm
8
Sure, here are the settings
Falco
(Falco)
September 21, 2024, 2:52am
9
There is no sentiment analysis feature via OpenAI in Discourse AI. Please follow Discourse AI - Self-Hosted Guide if you want to run it locally.