Default LLM model is required prior to enabling "Chat"?

I hit this error when trying to set the LLM for the Locale_Detector persona to "None" so that it falls back to the default (which has already been configured).

This works for other personas, just not for this one.


Odd, I can’t repro this.

Are you using this persona for any other feature (check the ai-personas page)?


We can probably remove this whole check now that the default LLM global setting exists.
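For illustration only, here is a minimal sketch of the fallback behaviour being described: once a global default LLM setting exists, a per-persona "LLM is required" check only needs to fail when neither the persona nor the site default provides a model. This is a hypothetical Python sketch with invented names, not the plugin's actual code.

```python
# Hypothetical sketch of "persona LLM falls back to a global default LLM".
# Names (SiteSettings, Persona, resolve_llm) are invented for illustration.

from dataclasses import dataclass
from typing import Optional


@dataclass
class SiteSettings:
    # Stand-in for a global "default LLM" site setting.
    default_llm_model: Optional[str] = None


@dataclass
class Persona:
    name: str
    llm_model: Optional[str] = None  # None means "use the default"


def resolve_llm(persona: Persona, settings: SiteSettings) -> str:
    """Return the LLM a persona should use, falling back to the global default."""
    if persona.llm_model:
        return persona.llm_model
    if settings.default_llm_model:
        return settings.default_llm_model
    # Only fail when there is truly nothing to fall back to.
    raise ValueError(
        f"Persona '{persona.name}' has no LLM and no default LLM is configured"
    )


if __name__ == "__main__":
    settings = SiteSettings(default_llm_model="gpt-4o")
    detector = Persona(name="Locale Detector")  # LLM left unset ("None")
    print(resolve_llm(detector, settings))  # -> "gpt-4o"
```

With logic like this, setting a persona's LLM to "None" is valid as long as the default LLM site setting is filled in, which is why the existing check could likely be removed.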
