Sorry for the confusion – the initial post was part of a broader troubleshooting session that started with API key issues and evolved into this plugin integration problem. It’s the same overall issue: trying to set up xAI Grok in the Discourse AI plugin.
Regarding the logs: yes, I checked /admin/logs after attempting to save/test the LLM config. Here’s what I found: [insert relevant log entries here, e.g., “500 - undefined method ‘test_connection’ for nil:NilClass” or “Error: Connection refused to localhost:8000”; if none, say “No specific entries related to the AI plugin error, only general access logs”]. If there’s a better place to look (e.g., specific logs inside the Docker container), let me know!
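For reference, this is roughly how I pulled logs beyond /admin/logs. I’m assuming the standard /var/discourse Docker install with a container named `app`; paths may differ on other setups:

```shell
# From the host, on a standard /var/discourse install (container name "app")
cd /var/discourse

# Stream the container's stdout/stderr (NGINX, Rails boot messages, etc.)
./launcher logs app

# The Rails production log usually contains the full backtrace for a 500
tail -n 100 shared/standalone/log/rails/production.log
```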
For completeness: my Discourse version is [insert, e.g., 3.3.0.beta2 from /admin/upgrade], the AI plugin version is [insert, e.g., from /admin/plugins or the GitHub commit], and the OS is Ubuntu 22.04 running the standard Docker install.
Any ideas on why the “Internal Server Error” happens with this LiteLLM proxy setup? Thanks!
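One thing I wasn’t sure about: since Discourse runs in Docker, “localhost:8000” inside the container refers to the container itself, not the host, so a LiteLLM proxy running on the host wouldn’t be reachable at that address. A rough connectivity check from inside the container (the `172.17.0.1` address is the default Docker bridge gateway on many setups and may differ on yours; `/v1/models` is LiteLLM’s OpenAI-compatible model listing endpoint):

```shell
# Enter the running Discourse container
cd /var/discourse
./launcher enter app

# Inside the container:
# Likely "connection refused" if the proxy runs on the host, not in this container
curl -s http://localhost:8000/v1/models

# The default Docker bridge gateway often reaches services bound on the host
curl -s http://172.17.0.1:8000/v1/models
```

If the second curl works but the first doesn’t, pointing the plugin’s endpoint at the bridge gateway (or running the proxy in the same Docker network) might explain the error. Is that a plausible cause here?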