Custom LLM job exception: deepseek-coder-v2:latest does not support tools ("type": "api_error", "param": null, "code": null)

We have enabled the custom LLM, but we are facing an issue when trying the chatbot.

Log

Message (2 copies reported)

Job exception: {"error":{"message":"registry.ollama.ai/library/deepseek-coder-v2:latest does not support tools","type":"api_error","param":null,"code":null}}

Backtrace

/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/base.rb:173:in `block (2 levels) in perform_completion!'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http.rb:2433:in `block in transport_request'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http/response.rb:320:in `reading_body'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http.rb:2430:in `transport_request'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http.rb:2384:in `request'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/rack-mini-profiler-4.0.1/lib/patches/net_patches.rb:19:in `block in request_with_mini_profiler'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/rack-mini-profiler-4.0.1/lib/mini_profiler/profiling_methods.rb:51:in `step'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/rack-mini-profiler-4.0.1/lib/patches/net_patches.rb:18:in `request_with_mini_profiler'
/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/base.rb:168:in `block in perform_completion!'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http.rb:1632:in `start'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/net-http-0.6.0/lib/net/http.rb:1070:in `start'
/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/base.rb:139:in `perform_completion!'
/var/www/discourse/plugins/discourse-ai/lib/completions/endpoints/open_ai.rb:53:in `perform_completion!'
/var/www/discourse/plugins/discourse-ai/lib/completions/llm.rb:415:in `generate'
/var/www/discourse/plugins/discourse-ai/lib/personas/bot.rb:89:in `reply'
/var/www/discourse/plugins/discourse-ai/lib/ai_bot/playground.rb:494:in `reply_to'
/var/www/discourse/plugins/discourse-ai/app/jobs/regular/create_ai_reply.rb:18:in `execute'
/var/www/discourse/app/jobs/base.rb:318:in `block (2 levels) in perform'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/rails_multisite-7.0.0/lib/rails_multisite/connection_management/null_instance.rb:49:in `with_connection'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/rails_multisite-7.0.0/lib/rails_multisite/connection_management.rb:17:in `with_connection'
/var/www/discourse/app/jobs/base.rb:305:in `block in perform'
/var/www/discourse/app/jobs/base.rb:301:in `each'
/var/www/discourse/app/jobs/base.rb:301:in `perform'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:220:in `execute_job'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:185:in `block (4 levels) in process'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:180:in `traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:183:in `block in traverse'
/var/www/discourse/lib/sidekiq/discourse_event.rb:6:in `call'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:182:in `traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:183:in `block in traverse'
/var/www/discourse/lib/sidekiq/pausable.rb:131:in `call'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:182:in `traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:183:in `block in traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/job/interrupt_handler.rb:9:in `call'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:182:in `traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:183:in `block in traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/metrics/tracking.rb:26:in `track'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/metrics/tracking.rb:134:in `call'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:182:in `traverse'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/middleware/chain.rb:173:in `invoke'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:184:in `block (3 levels) in process'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:145:in `block (6 levels) in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/job_retry.rb:118:in `local'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:144:in `block (5 levels) in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/config.rb:39:in `block in <class:Config>'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:139:in `block (4 levels) in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:281:in `stats'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:134:in `block (3 levels) in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/job_logger.rb:15:in `call'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:133:in `block (2 levels) in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/job_retry.rb:85:in `global'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:132:in `block in dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/job_logger.rb:40:in `prepare'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:131:in `dispatch'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:183:in `block (2 levels) in process'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:182:in `handle_interrupt'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:182:in `block in process'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:181:in `handle_interrupt'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:181:in `process'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:86:in `process_one'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/processor.rb:76:in `run'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/component.rb:10:in `watchdog'
/var/www/discourse/vendor/bundle/ruby/3.3.0/gems/sidekiq-7.3.9/lib/sidekiq/component.rb:19:in `block in safe_thread'
curl -X POST http://localhost:11434/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "deepseek-coder-v2:latest",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "What is the latest version of Conan as per 2025"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 200
}'

{"id":"chatcmpl-940","object":"chat.completion","created":1757057197,"model":"deepseek-coder-v2:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":" As of my last update in early 2023, there isn't any specific information about \"Conan\" after 2025 since I don't have future prediction capabilities or real-time data. However, I can provide some general advice on how to find the latest version of Conan, which is a package manager for C/C++ dependency management:\n\n1. **Visit the Official Website:** The most reliable way to get information about the latest version of Conan is to visit the official website at https://conan.io/. There you can usually find the latest release notes and documentation related to the latest version.\n\n2. **Check GitHub Repository:** Conan's codebase is hosted on GitHub, so you can also check the repository (https://github.com/conan-io/conan) for releases or tags that correspond to different versions. The repository often includes release notes in the form of commits and pull requests.\n\n3"},"finish_reason":"length"}],"usage":{"prompt_tokens":30,"completion_tokens":200,"total_tokens":230}}

Looks like this is the key error:

deepseek-coder-v2:latest does not support tools

Looking at the list of models here: deepseek-coder · Ollama Search

it doesn’t seem like deepseek-coder-v2 supports tools. You’d either need to remove tools from your persona or switch to a model with tool calling, like deepseek-coder-v2-tool-calling.
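Recent Ollama releases report a per-model capabilities list (via `ollama show` or the `/api/show` endpoint), which you can use to check tool support before wiring a model into a tool-using persona. A minimal sketch; the two sample responses below are illustrative, not captured from a real server:

```python
# Sketch (not Discourse code): deciding from Ollama's /api/show response
# whether a model advertises tool calling. The sample dicts are illustrative.

def supports_tools(show_response: dict) -> bool:
    """Return True if the model's capabilities include tool calling."""
    return "tools" in show_response.get("capabilities", [])

coder = {"capabilities": ["completion"]}            # no tool support
tool_model = {"capabilities": ["completion", "tools"]}  # tool-capable

print(supports_tools(coder))       # → False
print(supports_tools(tool_model))  # → True
```

Given the error above, deepseek-coder-v2 would fall into the first case.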

When we execute the curl command with those inputs, it produces the output without any issues. We have not made any customizations to the persona or its settings.

@awesomerobot

Could you provide more details and some examples? This would help us gain a clearer understanding and foster better discussions.

It doesn’t look like the curl command includes tool calls, so that might be why it succeeds. In the persona settings, are there any tools listed here?
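For comparison, this is roughly what a tool-enabled persona adds to the request that the manual curl above does not: a `tools` array in the OpenAI-compatible chat payload. The `search` tool definition here is illustrative, not Discourse AI's exact schema:

```python
import json

# The manual curl sent only model + messages; a tool-using persona also
# attaches a "tools" array. Models without tool support reject that payload.
base = {
    "model": "deepseek-coder-v2:latest",
    "messages": [{"role": "user", "content": "What is the latest version of Conan?"}],
}

persona_request = dict(base)
persona_request["tools"] = [
    {
        "type": "function",
        "function": {
            "name": "search",  # illustrative tool name
            "description": "Search forum topics",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }
]

print(json.dumps(persona_request, indent=2))
```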

@awesomerobot

We haven’t included any custom Personas yet, so all the ones you see in the app are just the default options. Also, we’re not using any of them at the moment!


After taking another look at your earlier screenshot, I see that you’re using the forum helper persona in your message… that persona includes some built-in tools that you can’t remove, so it will not work with the model you’re using.

You should try one of the pre-configured personas that doesn’t include tools (“creative” is one option) or create a new persona that doesn’t include tools.

Could you please clarify if having a persona is essential for the effective functioning of LLMs?

Yes, our AI features use an LLM with a specified persona.

When you have more than one persona enabled and create a new message, you’ll see a dropdown menu that lets you choose the persona:

For testing purposes, if you enable the built-in Creative persona:

and then refresh the page and start a new message with it, I’d expect this to work with your LLM because the Creative persona does not include any tools.


I am seeing only these options; could you please help me with the settings?

You need to visit /admin/plugins/discourse-ai/ai-personas and either create a new persona or enable one without tools. If you want to try an existing persona, Creative would be a good one to test as it doesn’t include any tools.

So you’d click edit here:

and at the bottom of the persona settings, you’ll need to enable it, click “create user”, and save your changes

Once that’s done, the Creative persona should appear in the list (you may need to refresh the page first)

Do we also need to select the default model?

Also, please assist with the following settings:

AI helper proofreader persona 
AI helper title suggestions persona 
AI helper explain persona 
AI helper post illustrator persona 
AI helper smart dates persona 
AI helper translator persona 
AI helper markdown tables persona 
AI helper custom prompt persona 
AI helper image caption persona 
AI embeddings semantic search hyde persona 
AI summarization persona 
AI summary gists persona 
AI bot discover persona 
AI Discord search persona 
AI translation locale detector persona 
AI translation post raw translator persona 
AI translation topic title translator persona 
AI translation short text translator persona 
Inferred concepts generate persona 
Inferred concepts match persona 
Inferred concepts deduplicate persona 
AI embeddings generate for PMs 
AI bot public sharing allowed groups 

How do we feed our question to the LLM and persona to get better output?

We want to get answers based on the data in our forum, and from the model when the forum doesn’t have them.

@awesomerobot

Thank you so much for your input! I really appreciate it. I’m getting responses from the bot, but it seems like my forum content isn’t included. Do you have any suggestions for how we could get responses that draw on the forum’s content? Thanks again!


You’ll need to switch to an LLM that supports tool use. Our included “forum helper” persona, for example, uses “search” and “read” tools to find relevant content in the forum to incorporate into its replies.

If the model does not have native tool support, you can always use the XML alternative, which includes them in the prompt:
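The idea behind the XML option, sketched generically (the exact tag names Discourse AI uses may differ, so treat this as illustrative only): the tool definitions are written into the prompt as plain text, the model answers with an XML block, and the plugin extracts the call from the reply:

```python
import re

# Illustrative model reply for a prompt-embedded (XML) tool protocol.
# Tag names are hypothetical; real plugins define their own schema.
reply = """Let me look that up.
<invoke>
  <tool_name>search</tool_name>
  <parameters><query>conan latest version</query></parameters>
</invoke>"""

# Pull out which tool the model asked to call.
match = re.search(r"<tool_name>(.*?)</tool_name>", reply, re.S)
if match:
    print(match.group(1).strip())  # → search
```

This is why the XML alternative works with models that lack native tool support: no `tools` field ever reaches the API, only ordinary prompt text.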
