Getting Discourse AI to work with Ollama locally

Cool. Is there a guide somewhere we can follow to implement it?

I’m more interested in the summarization and other functions than the chatbot.
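In case it helps anyone debugging the same setup: before wiring anything into Discourse, it's worth confirming that Ollama answers an OpenAI-style request at all. Here's a minimal Ruby sketch, assuming Ollama's default port 11434 and a locally pulled llama3 model (both assumptions; substitute your own host and model):

```ruby
# Sketch: verify the Ollama server answers an OpenAI-compatible chat
# completion request before pointing Discourse AI at it. The host, port,
# and model name below are assumptions -- adjust to your setup.
require "net/http"
require "json"
require "uri"

uri = URI("http://localhost:11434/v1/chat/completions") # Ollama's default port
req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
req.body = {
  model: "llama3", # any model you have pulled locally
  messages: [{ role: "user", content: "Say hello in one word." }],
}.to_json

res = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(req) }
puts res.code
puts JSON.parse(res.body).dig("choices", 0, "message", "content")
```

If that round-trips fine but Discourse still errors, the problem is on the plugin side rather than with the Ollama endpoint.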

ETA: This configuration doesn’t work:

(I redacted the IP address of my host here).

What happens is that I get an Internal Server Error when I click “Run Test”.

At /logs, I see:

```
NameError (undefined local variable or method `tokenizer' for an instance of DiscourseAi::Completions::Dialects::ChatGpt)
app/controllers/application_controller.rb:427:in `block in with_resolved_locale'
app/controllers/application_controller.rb:427:in `with_resolved_locale'
lib/middleware/omniauth_bypass_middleware.rb:35:in `call'
lib/content_security_policy/middleware.rb:12:in `call'
lib/middleware/anonymous_cache.rb:409:in `call'
lib/middleware/csp_script_nonce_injector.rb:12:in `call'
config/initializers/008-rack-cors.rb:14:in `call'
config/initializers/100-quiet_logger.rb:20:in `call'
config/initializers/100-silence_logger.rb:29:in `call'
lib/middleware/enforce_hostname.rb:24:in `call'
lib/middleware/processing_request.rb:12:in `call'
lib/middleware/request_tracker.rb:385:in `call'
```

This happens regardless of the tokenizer I select. I’m testing this on 3.5.0beta1-dev (c1ee4e120e).
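For what it's worth, the NameError itself just means the dialect class references a `tokenizer` method that is never defined on it in this build, so no tokenizer choice in the UI can fix it. Roughly this Ruby failure mode (an illustrative sketch, not the actual Discourse AI source):

```ruby
# Illustrative only -- not the real Discourse AI code. A NameError like the
# one above is raised when an instance method references `tokenizer` and
# neither a local variable nor a method with that name exists.
class ChatGptDialect
  def max_prompt_tokens
    tokenizer.size("some prompt") # `tokenizer` was never defined => NameError
  end
end

ChatGptDialect.new.max_prompt_tokens
# => NameError: undefined local variable or method `tokenizer' for an instance of ChatGptDialect
```

So this looks like a plugin bug rather than a misconfiguration on my end.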
