Sorry guys, I couldn’t figure out from this article whether a locally installed LLM can be configured through the standard settings UI?
I think that as long as it exposes a supported API, it should be possible.
Is there a specific large language model (LLM) you’re planning to install (or have already installed) locally?
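To illustrate what "exposes a supported API" means in practice: local runners such as Ollama or llama.cpp's server typically speak the OpenAI-compatible chat completions format, which is what you'd point Discourse AI's custom LLM settings at. Below is a minimal sketch of that wire format; the endpoint URL and model name are assumptions for illustration, not values from this thread.

```python
import json

# Hypothetical local, OpenAI-compatible endpoint (e.g. Ollama's default port).
endpoint = "http://localhost:11434/v1/chat/completions"

# The JSON body such an endpoint expects; the model name is a placeholder.
payload = {
    "model": "llama3.3:70b",
    "messages": [
        {"role": "user", "content": "Summarize this topic."}
    ],
    "stream": False,
}

# In the Discourse admin UI you would enter the endpoint URL and model name;
# printing the body here just shows the request shape being assumed.
print(json.dumps(payload, indent=2))
```

If your local model serves this format, the standard settings UI should accept it as a custom LLM without any code changes.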
This topic may help.
Choosing the right one is another exercise in itself; I wasn’t certain even after reading your AI-related articles here at Meta.
I guess an open-source LLM selector tool from the Discourse team would be very helpful, since you know the internals and exactly what an LLM must be capable of to excel at the various tasks relevant to Discourse communities. The tool/wizard would ask questions, or let me check items on/off in a list of 20+ typical tasks I’d like the LLM to handle in my community, and then recommend a Top 3: uncompromising (heaviest, requires expensive hardware); balanced (requires a medium-priced dedicated server); and lightweight (for basic tasks in small-to-medium communities, can run on a $20-40 VPS).
I think maintaining the correct answer to that would be a full-time job.
If you are looking for cheap, the Gemini free tier is the way to go:
Gemini Flash 2.0 is a very capable model, and the free tier provides enough usage to handle quite a few things on your forum.
I’m looking for local hosting because I can’t send our content to any service outside our own servers.
As for cheap options, thanks for the tip!
Uncompromising
- DeepSeek V3 0324
- Qwen 3 235B A22
Balanced
- Qwen 3 32B / 30B A3B
- Llama 3.3 70B
- Qwen 2.5 70B
Lightweight
Maybe unsloth/gemma-3-4b-it-qat-GGUF · Hugging Face? It’s hard at this level, though; it’s much more economical to use a hosted API like OpenRouter.