OK here’s a fun goodie:
Ollama support for (Completely AWESOME!) `llama3`:
This is for when the bot is run locally in dev, or in the cloud with an Ollama server … and in Basic mode (a quick sanity check follows the list):
- make sure model is `llama3`
- custom URL needs to be set to `http://localhost:11434`
If you have a big enough server, you could serve Ollama from the cloud instead.
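
If you go that route, the only change is the custom URL; the hostname here is purely illustrative:

```python
# Hypothetical remote deployment: point the custom URL at your cloud box
OLLAMA_URL = "http://ollama.example.com:11434/api/generate"
```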