Getting Discourse AI working locally with Ollama

Has anyone here gotten this working locally with Ollama?


I'm also trying to get this working with Ollama. It seems the API requests the Discourse AI plugin makes aren't compatible with Ollama, and none of the settings I change affect the API request.

This is what works with Ollama:

curl http://192.168.1.2:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'

And the following seems to be the only kind of request the plugin tries, e.g.:

curl http://192.168.1.2:11434/ \
  -X POST \
  -H 'Content-Type: application/json' \
  -d '{"inputs":"<s>[INST] What is your favourite condiment? [/INST] Well, Im quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever Im cooking up in the kitchen!</s> [INST] Do you have mayonnaise recipes? [/INST]","parameters":{"max_new_tokens":500, "temperature":0.5, "top_p":0.9}}'

The AI chatbot supposedly supports Ollama, but I can't get a response from it either.

So yeah, if anyone has gotten this working with Ollama, please post the settings you used!


It seems I spoke too soon: if I leave the Discourse AI plugin settings untouched and follow the RAG instructions in this post, I get responses from Ollama in the AI chatbot!


It works fine with Ollama; several team members are using it.


Cool. Is there a guide somewhere we can follow on how to implement it?

I'm more interested in summarization and the other features than in the chatbot.

ETA: This configuration does not work:

(I've redacted my host's IP address here.)

What happens is that I get an Internal Server Error when I click "Run Test".

Under /logs I see:

NameError (undefined local variable or method `tokenizer' for an instance of DiscourseAi::Completions::Dialects::ChatGpt)
app/controllers/application_controller.rb:427:in `block in with_resolved_locale'
app/controllers/application_controller.rb:427:in `with_resolved_locale'
lib/middleware/omniauth_bypass_middleware.rb:35:in `call'
lib/middleware/csp_script_nonce_injector.rb:12:in `call'
lib/middleware/anonymous_cache.rb:409:in `call'
lib/middleware/csp_script_nonce_injector.rb:12:in `call'
config/initializers/008-rack-cors.rb:14:in `call'
config/initializers/100-quiet_logger.rb:20:in `call'
config/initializers/100-silence_logger.rb:29:in `call'
lib/middleware/enforce_hostname.rb:24:in `call'
lib/middleware/processing_request.rb:12:in `call'
lib/middleware/request_tracker.rb:385:in `call'

This happens regardless of which tokenizer is selected. I'm testing this on 3.5.0beta1-dev (c1ee4e120e).


Set the URL to http://localhost:11434/v1/chat/completions


The test still fails with the internal server error, and the same log message as before: a NameError saying that `tokenizer` is an undefined local variable.


Try saving the model unchanged, refreshing the page, then going back in via the "Edit LLM" button and running a new test.

Same result; I've been through that cycle a few times, but just tried it once more.

I did the installation the usual way: adding it to my app.yml file and running ./launcher rebuild app.

The lines I added were:

hooks:
  after_code:
    - exec:
        cd: $home/plugins
        cmd:
          - sudo -E -u discourse git clone https://github.com/discourse/discourse-ai.git

(I've removed the other plugins from the list but kept the YML scaffolding so you can see where this sits in the YML file.)


I'll restore my local Ollama setup tomorrow and report back here.


I did the following:

ollama serve
ollama run phi4

# Test

curl http://localhost:11434/v1/chat/completions \
              -H "Content-Type: application/json" \
              -d '{
              "model": "phi4",
              "messages": [
                  {
                      "role": "system",
                      "content": "You are a helpful assistant."
                  },
                  {
                      "role": "user",
                      "content": "How much is 1+1?"
                  }
              ]
          }' -s | jq .choices[].message.content

"The sum of 1 + 1 is 2.

In basic arithmetic, when you add one unit to another single unit, the total becomes two units. This operation is part of the fundamental principles of addition used in mathematics. Is there anything else I can help you with in math or any other topic?"

Tested with Helper


Thanks for testing this.

I notice that I’ve got an extra option (Disable streaming completions) compared to your config. Is it possible it’s the model selection that’s causing an issue for me? (I’ve got deepseek-r1:32b installed).

curl http://localhost:11434/v1/chat/completions \
              -H "Content-Type: application/json" \
              -d '{
              "model": "deepseek-r1:32b",
              "messages": [
                  {
                      "role": "system",
                      "content": "You are a helpful assistant."
                  },
                  {
                      "role": "user",
                      "content": "How much is 1+1?"
                  }
              ]
          }' -s | jq .choices[].message.content

"<think>\nAlright, I'm trying to figure out how much 1 plus 1 equals. Okay, so let's break this down step by step even though it seems simple.\n\nFirst, I know that addition is combining two numbers to get a total sum. So, when we say 1 + 1, we're essentially putting together one object and another object to see how many we have in total.\n\nLet me visualize it: imagine I have one apple in my left hand and another apple in my right hand. If I put them together, how many apples do I have? It should be two apples, right?\n\nAnother way to think about it is using a number line. Starting at the number 1, if I move forward by one more step, where do I land? That's right, on the number 2.\n\nI can also use my fingers to count. Hold up one finger on my left hand and another on my right. Now, count them all: one, two. So that gives me a total of two fingers, which means 1 + 1 = 2.\n\nMaybe I should check if there's another perspective or method. For instance, in binary, the number system used by computers, 1 + 1 equals 10 because it carries over after reaching 2 (which is represented as 10). But that's a different context. In standard base-10 arithmetic, which we use daily, 1 + 1 remains 2.\n\nLet me also consider the concept of sets. If I have a set containing one element and another set with one element, combining them would result in a set with two elements. That reinforces that 1 + 1 equals 2.\n\nI might wonder if there's any scenario where 1 + 1 doesn't equal 2. But within the realm of basic arithmetic and mathematics as we learn it, the answer is consistently two.\n\nTo make sure I'm not missing anything, maybe I should look into some mathematical proofs or principles that confirm this. 
For example, in Peano axioms, which form the foundation of arithmetic, the idea of adding 1 to 1 would follow the axiom that every natural number has a successor, and 2 is defined as the successor of 1.\n\nTherefore, putting all these thoughts together—visualizing objects, using a number line, counting fingers, considering sets, and even touching on mathematical axioms—it all points towards 1 + 1 equaling 2.\n</think>\n\nThe result of adding 1 and 1 together is 2. This can be seen in simple arithmetic and various ways of visualizing the problem.\n\n**Answer:**  \n$1 + 1 = \\boxed{2}$"

Similar result, though I get the <think> tags with this particular model - so maybe the model won’t be suitable for this use.
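If the model is otherwise usable, the `<think>` block could also be stripped before the output is displayed. A minimal sketch of that post-processing, assuming the reasoning block always appears as a single leading `<think>...</think>` pair (this is an illustration, not part of the Discourse AI plugin):

```python
import re

def strip_think(text: str) -> str:
    """Remove a <think>...</think> block emitted by reasoning models
    such as deepseek-r1, keeping only the final answer."""
    # Non-greedy match across newlines; trailing whitespace after the
    # closing tag is dropped as well.
    return re.sub(r"<think>.*?</think>\s*", "", text, flags=re.DOTALL)

raw = "<think>\nLet me reason this out...\n</think>\n\nThe answer is 2."
print(strip_think(raw))  # prints "The answer is 2."
```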

I’ve just rebuilt (ie, ./launcher rebuild app) my installation and made sure everything is current - I show “all up-to-date” in my Update page.

Selecting either deepseek-r1:32b or phi4 (which I installed and tested as well), I still am getting the internal server error response in the browser.

But in my logs, I’m seeing something new.

FinalDestination::SSRFDetector::DisallowedIpError (FinalDestination: all resolved IPs were disallowed)

lib/final_destination/ssrf_detector.rb:105:in `lookup_and_filter_ips'
lib/final_destination/http.rb:15:in `connect'
net-http (0.6.0) lib/net/http.rb:1642:in `do_start'
net-http (0.6.0) lib/net/http.rb:1631:in `start'
net-http (0.6.0) lib/net/http.rb:1070:in `start'
plugins/discourse-ai/lib/completions/endpoints/base.rb:105:in `perform_completion!'
plugins/discourse-ai/lib/completions/endpoints/open_ai.rb:44:in `perform_completion!'
plugins/discourse-ai/lib/completions/llm.rb:281:in `generate'
plugins/discourse-ai/lib/configuration/llm_validator.rb:36:in `run_test'
plugins/discourse-ai/app/controllers/discourse_ai/admin/ai_llms_controller.rb:128:in `test'
actionpack (7.2.2.1) lib/action_controller/metal/basic_implicit_render.rb:8:in `send_action'
actionpack (7.2.2.1) lib/abstract_controller/base.rb:226:in `process_action'
actionpack (7.2.2.1) lib/action_controller/metal/rendering.rb:193:in `process_action'
actionpack (7.2.2.1) lib/abstract_controller/callbacks.rb:261:in `block in process_action'
activesupport (7.2.2.1) lib/active_support/callbacks.rb:121:in `block in run_callbacks'
app/controllers/application_controller.rb:427:in `block in with_resolved_locale'
i18n (1.14.7) lib/i18n.rb:353:in `with_locale'
app/controllers/application_controller.rb:427:in `with_resolved_locale'
activesupport (7.2.2.1) lib/active_support/callbacks.rb:130:in `block in run_callbacks'
activesupport (7.2.2.1) lib/active_support/callbacks.rb:141:in `run_callbacks'
actionpack (7.2.2.1) lib/abstract_controller/callbacks.rb:260:in `process_action'
actionpack (7.2.2.1) lib/action_controller/metal/rescue.rb:27:in `process_action'
actionpack (7.2.2.1) lib/action_controller/metal/instrumentation.rb:77:in `block in process_action'
activesupport (7.2.2.1) lib/active_support/notifications.rb:210:in `block in instrument'
activesupport (7.2.2.1) lib/active_support/notifications/instrumenter.rb:58:in `instrument'
activesupport (7.2.2.1) lib/active_support/notifications.rb:210:in `instrument'
actionpack (7.2.2.1) lib/action_controller/metal/instrumentation.rb:76:in `process_action'
actionpack (7.2.2.1) lib/action_controller/metal/params_wrapper.rb:259:in `process_action'
activerecord (7.2.2.1) lib/active_record/railties/controller_runtime.rb:39:in `process_action'
actionpack (7.2.2.1) lib/abstract_controller/base.rb:163:in `process'
actionview (7.2.2.1) lib/action_view/rendering.rb:40:in `process'
rack-mini-profiler (3.3.1) lib/mini_profiler/profiling_methods.rb:115:in `block in profile_method' 
actionpack (7.2.2.1) lib/action_controller/metal.rb:252:in `dispatch'
actionpack (7.2.2.1) lib/action_controller/metal.rb:335:in `dispatch'
actionpack (7.2.2.1) lib/action_dispatch/routing/route_set.rb:67:in `dispatch'
actionpack (7.2.2.1) lib/action_dispatch/routing/route_set.rb:50:in `serve'
actionpack (7.2.2.1) lib/action_dispatch/routing/mapper.rb:32:in `block in <class:Constraints>'
actionpack (7.2.2.1) lib/action_dispatch/routing/mapper.rb:62:in `serve'
actionpack (7.2.2.1) lib/action_dispatch/journey/router.rb:53:in `block in serve'
actionpack (7.2.2.1) lib/action_dispatch/journey/router.rb:133:in `block in find_routes'
actionpack (7.2.2.1) lib/action_dispatch/journey/router.rb:126:in `each'
actionpack (7.2.2.1) lib/action_dispatch/journey/router.rb:126:in `find_routes'
actionpack (7.2.2.1) lib/action_dispatch/journey/router.rb:34:in `serve'
actionpack (7.2.2.1) lib/action_dispatch/routing/route_set.rb:896:in `call'
lib/middleware/omniauth_bypass_middleware.rb:35:in `call'
rack (2.2.11) lib/rack/tempfile_reaper.rb:15:in `call'
rack (2.2.11) lib/rack/conditional_get.rb:27:in `call'
rack (2.2.11) lib/rack/head.rb:12:in `call'
actionpack (7.2.2.1) lib/action_dispatch/http/permissions_policy.rb:38:in `call'
lib/content_security_policy/middleware.rb:12:in `call'
lib/middleware/anonymous_cache.rb:409:in `call'
lib/middleware/csp_script_nonce_injector.rb:12:in `call'
config/initializers/008-rack-cors.rb:14:in `call'
rack (2.2.11) lib/rack/session/abstract/id.rb:266:in `context'
rack (2.2.11) lib/rack/session/abstract/id.rb:260:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/cookies.rb:704:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/callbacks.rb:31:in `block in call'
activesupport (7.2.2.1) lib/active_support/callbacks.rb:101:in `run_callbacks'
actionpack (7.2.2.1) lib/action_dispatch/middleware/callbacks.rb:30:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/debug_exceptions.rb:31:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/show_exceptions.rb:32:in `call'
logster (2.20.1) lib/logster/middleware/reporter.rb:40:in `call'
railties (7.2.2.1) lib/rails/rack/logger.rb:41:in `call_app'
railties (7.2.2.1) lib/rails/rack/logger.rb:29:in `call'
config/initializers/100-quiet_logger.rb:20:in `call'
config/initializers/100-silence_logger.rb:29:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/request_id.rb:33:in `call'
lib/middleware/enforce_hostname.rb:24:in `call'
rack (2.2.11) lib/rack/method_override.rb:24:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/executor.rb:16:in `call'
rack (2.2.11) lib/rack/sendfile.rb:110:in `call'
rack-mini-profiler (3.3.1) lib/mini_profiler.rb:334:in `call'
lib/middleware/processing_request.rb:12:in `call'
message_bus (4.3.9) lib/message_bus/rack/middleware.rb:60:in `call'
lib/middleware/request_tracker.rb:385:in `call'
actionpack (7.2.2.1) lib/action_dispatch/middleware/remote_ip.rb:96:in `call'
railties (7.2.2.1) lib/rails/engine.rb:535:in `call'
railties (7.2.2.1) lib/rails/railtie.rb:226:in `public_send'
railties (7.2.2.1) lib/rails/railtie.rb:226:in `method_missing'
rack (2.2.11) lib/rack/urlmap.rb:74:in `block in call'
rack (2.2.11) lib/rack/urlmap.rb:58:in `each'
rack (2.2.11) lib/rack/urlmap.rb:58:in `call'
unicorn (6.1.0) lib/unicorn/http_server.rb:634:in `process_client'
unicorn (6.1.0) lib/unicorn/http_server.rb:739:in `worker_loop'
unicorn (6.1.0) lib/unicorn/http_server.rb:547:in `spawn_missing_workers'
unicorn (6.1.0) lib/unicorn/http_server.rb:143:in `start'
unicorn (6.1.0) bin/unicorn:128:in `<top (required)>'
vendor/bundle/ruby/3.3.0/bin/unicorn:25:in `load'
vendor/bundle/ruby/3.3.0/bin/unicorn:25:in `<main>'

This Discourse install is relatively fresh, and I've not configured any IP address blocking. I'm also the only user on it. I am running Ollama in a Docker container, but that shouldn't be an issue; I use open-webui (also in another Docker container) with the instance and that works fine. I am using the local IP address rather than "localhost", but "localhost" doesn't behave any differently.

Appreciate the time you're putting into helping figure out what's going on here.

ETA: I installed jq in the Discourse Docker container and ran the curl command (substituting my local IP address for localhost), and it did run from there - so there's connectivity between the containers, and the URL is correct.

That's our default SSRF protection kicking in. You can add exceptions via the DISCOURSE_ALLOWED_INTERNAL_HOSTS environment variable.

It's triggered whenever Discourse tries to reach an IP classified as internal.
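For reference, a sketch of how that environment variable could go into app.yml (the IP address is a placeholder for your Ollama host; a ./launcher rebuild app is needed afterwards):

```yaml
env:
  # Existing env entries stay as they are; this line exempts the
  # Ollama host's internal IP from the SSRF check (placeholder address).
  DISCOURSE_ALLOWED_INTERNAL_HOSTS: '192.168.1.2'
```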


Great, that worked - I did it via the site settings rather than the environment variable, but the test now passes. Thanks again for all the help!


3 posts were split to a new topic: How to Use Ollama With Embeddings

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.