The Japanese song lyrics in the lower-right corner are not my annotations and should be ignored.
The screenshot shows that my model is indeed available, but it cannot be set as the default model.
Due to restrictions on new users, I will post the screenshot showing the model's availability later.
Any help would be appreciated, thank you!
Here are the relevant logs that may be useful; they cover the operations for testing the LLM's availability and setting the default model.
Started POST "/admin/plugins/discourse-ai/ai-llms/test.json" for 172.68.225.73 at 2026-03-20 02:11:07 +0000
Processing by DiscourseAi::Admin::AiLlmsController#test as JSON
Parameters: {"ai_llm"=>{"max_prompt_tokens"=>"32000", "max_output_tokens"=>"2000", "ai_secret_id"=>"2", "tokenizer"=>"DiscourseAi::Tokenizer::OpenAiTokenizer", "url"=>"https://api.chatanywhere.org", "display_name"=>"gpt-4.1", "name"=>"gpt-4.1", "provider"=>"open_ai", "vision_enabled"=>"false", "input_cost"=>"0.7", "output_cost"=>"2.8", "cached_input_cost"=>"0.7", "cache_write_cost"=>"0.7", "provider_params"=>{"organization"=>"", "disable_native_tools"=>"", "reasoning_effort"=>"none", "disable_streaming"=>"", "service_tier"=>"default"}, "allowed_attachment_types"=>["doc", "docx", "html", "markdown", "pdf", "rtf", "txt"]}}
Completed 200 OK in 252ms (Views: 0.2ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 0.7ms)
Started PUT "/admin/plugins/discourse-ai/ai-llms/3" for 172.68.225.73 at 2026-03-20 02:11:10 +0000
Processing by DiscourseAi::Admin::AiLlmsController#update as */*
Parameters: {"ai_llm"=>{"max_prompt_tokens"=>32000, "max_output_tokens"=>2000, "ai_secret_id"=>2, "tokenizer"=>"DiscourseAi::Tokenizer::OpenAiTokenizer", "url"=>"https://api.chatanywhere.org", "display_name"=>"gpt-4.1", "name"=>"gpt-4.1", "provider"=>"open_ai", "vision_enabled"=>false, "input_cost"=>0.7, "output_cost"=>2.8, "cached_input_cost"=>0.7, "cache_write_cost"=>0.7, "provider_params"=>{"organization"=>nil, "disable_native_tools"=>nil, "reasoning_effort"=>"none", "disable_streaming"=>nil, "service_tier"=>"default"}, "llm_quotas"=>[], "allowed_attachment_types"=>["doc", "docx", "html", "markdown", "pdf", "rtf", "txt"]}, "id"=>"3"}
Completed 200 OK in 61ms (Views: 14.4ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 0.7ms)
Started GET "/admin/plugins/discourse-ai/ai-llms.json" for 172.68.225.73 at 2026-03-20 02:11:12 +0000
Processing by DiscourseAi::Admin::AiLlmsController#index as JSON
Completed 200 OK in 47ms (Views: 0.8ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 2.2ms)
Started GET "/admin/config/site_settings.json?plugin=discourse-ai&category=discourse_ai" for 172.68.225.73 at 2026-03-20 02:11:12 +0000
Processing by Admin::Config::SiteSettingsController#index as JSON
Parameters: {"plugin"=>"discourse-ai", "category"=>"discourse_ai"}
Completed 200 OK in 103ms (Views: 0.2ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 4.9ms)
Started GET "/admin/config/site_settings.json?plugin=discourse-ai" for 172.68.225.73 at 2026-03-20 02:11:12 +0000
Processing by Admin::Config::SiteSettingsController#index as JSON
Parameters: {"plugin"=>"discourse-ai"}
Completed 200 OK in 98ms (Views: 0.1ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 0.0ms)
Started PUT "/admin/site_settings/ai_default_llm_model" for 172.68.225.73 at 2026-03-20 02:11:17 +0000
Processing by Admin::SiteSettingsController#update as */*
Parameters: {"ai_default_llm_model"=>"3", "id"=>"ai_default_llm_model"}
Completed 422 Unprocessable Entity in 265ms (Views: 0.1ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 0.0ms)
Confusingly, when I clicked "Change settings" again shortly afterwards, the error no longer appeared. I am not sure whether that means the update succeeded.
Here is the server log for that second attempt:
Started PUT "/admin/site_settings/ai_default_llm_model" for 172.68.225.73 at 2026-03-20 02:20:19 +0000
Processing by Admin::SiteSettingsController#update as */*
Parameters: {"ai_default_llm_model" => "3", "id" => "ai_default_llm_model"}
Completed 204 No Content in 2047ms (ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 0.5ms)
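The 422 Unprocessable Entity followed by a later 204 No Content suggests the second attempt may have persisted the value. One way to confirm is to read the setting back from the Rails console; this is only a sketch, assuming a standard Docker-based Discourse install under /var/discourse:

```shell
# Sketch: confirm whether ai_default_llm_model was actually persisted.
# Assumes a standard Docker-based Discourse install at /var/discourse.
cd /var/discourse
./launcher enter app      # enter the running app container
rails c                   # open the Rails console

# Inside the console, read the setting back; a non-empty value
# referencing LLM id 3 would indicate the update succeeded:
#   SiteSetting.ai_default_llm_model
```

If the console shows the expected value, the 204 did take effect and the earlier 422 was likely a transient validation failure on the first attempt.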