Posts too long for PostSentimentAnalysis

After enabling sentiment analysis using OpenAI, my error log is filled with

Job exception: Net::HTTPBadResponse

Then in Sidekiq:

Jobs::PostSentimentAnalysis
Jobs::HandledExceptionWrapper: Wrapped Net::HTTPBadResponse: Net::HTTPBadResponse

Looking at the post_ids in question, the issue seems to be caused by unusually long posts. Otherwise, the sentiment graphs generate without any problems.

I didn’t know sentiment analysis could be used through OpenAI :thinking:

To add more information, I’ve got Number of tokens for the prompt set according to the recommendation (64,000, which is 50% of the 128K context window). But I’m not sure whether this plays any role…

No it doesn’t, because then you would get a different error. What you get is exactly what it sounds like: a wrong status code.

How did you set up sentiment analysis using OpenAI’s GPT?

The error message isn’t exactly descriptive. I don’t even know the exact HTTP code.

The culprit is clear: it has something to do with post length. In that case the OpenAI API returns 400 Bad Request.
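
One way to confirm is to replay one of the failing posts against the API directly. This is only a rough sketch (plain Ruby with Net::HTTP; the API key, model name, and failing_post.txt are placeholders, and it is not how the plugin builds its request), but the status code and error body it prints should show exactly what OpenAI is objecting to:

```ruby
# Replay one failing post against the OpenAI chat completions endpoint
# to see the exact status code and error body. Placeholders: the API key
# comes from the environment, the model name and input file are made up.
require "net/http"
require "json"
require "uri"

uri = URI("https://api.openai.com/v1/chat/completions")
post_text = File.read("failing_post.txt") # hypothetical: raw text of one failing post_id

req = Net::HTTP::Post.new(uri)
req["Authorization"] = "Bearer #{ENV["OPENAI_API_KEY"]}"
req["Content-Type"] = "application/json"
req.body = {
  model: "gpt-4o-mini", # placeholder model name
  messages: [{ role: "user", content: post_text }]
}.to_json

res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
puts res.code # e.g. "400" when the input exceeds the model's context window
puts res.body # the error body explains why the request was rejected
```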

You just enable the ai sentiment enabled setting and it works, as long as you have the LLM model parameters entered.
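
If you prefer the Rails console, the same toggle can be flipped there. This assumes the admin setting “ai sentiment enabled” maps to the usual underscored name; check the actual setting name on your install:

```ruby
# Rails console sketch; assumes the setting is named ai_sentiment_enabled.
SiteSetting.ai_sentiment_enabled = true
```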


I’ve experimented with settings as low as 10,000 tokens, with no improvement. Net::HTTPBadResponse is still filling my error log.
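
For what it’s worth, this is the rough arithmetic I’m using to judge whether a failing post could even exceed that budget (the ~4 characters per token ratio is only a heuristic, and failing_post.txt is a placeholder for one of the failing posts):

```ruby
# Rough check: could a post plausibly exceed a given token budget?
# Assumes ~4 characters per token, which is only a heuristic for English text.
CHARS_PER_TOKEN = 4

def roughly_over_budget?(text, max_tokens)
  (text.length / CHARS_PER_TOKEN) > max_tokens
end

# With the 10,000 token setting, anything beyond ~40,000 characters is suspect.
puts roughly_over_budget?(File.read("failing_post.txt"), 10_000)
```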

Can you share a screenshot of your exact config, and your LLM config?

Sure, here are the settings