RBoy
December 14, 2025, 12:36pm
A minor bug on the LLM usage page: when you create a new LLM and run a test for it, the test shows up in the usage report under llm_validator, but since the LLM hasn't been given a name yet, the filter list shows the model name instead (no LLM name), in this case moonshotai.
However, after selecting it, the dropdown filter changes to 0 instead of moonshotai, and all the statistics show as 0.
Thanks for the report @RBoy, that will be fixed by:
main ← ux-preserve-features-models-dropdown-options-when-filtering
merged 08:13PM - 15 Dec 25 UTC
When selecting a model or feature filter on the LLM Usage page, the dropdown would lose its options and display incorrect values (like "0") because fetchData() was clearing the cached dropdown options on every request.
The filtered API response only contains models/features matching the filter, so rebuilding the dropdown options from filtered data caused other options to disappear.
Added a `clearCache` parameter to fetchData():
- Initial load and date range changes pass `clearCache: true` to refresh options
- Filter changes use the default `clearCache: false` to preserve all options
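The caching behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual Discourse source: the names `fetchData`, `cachedModelOptions`, and `fakeApi` are all hypothetical, and the real implementation lives in the LLM Usage page component.

```javascript
// Hypothetical sketch of the fix: dropdown options are cached and only
// rebuilt when clearCache is true, so a filtered response (which contains
// only the matching models) can't wipe out the other dropdown entries.

let cachedModelOptions = [];

// Stand-in for the usage API: a filtered request returns only matching models.
function fakeApi(filter) {
  const all = ["moonshotai", "gpt-4o", "claude-3"];
  return filter ? all.filter((m) => m === filter) : all;
}

function fetchData({ filter = null, clearCache = false } = {}) {
  const models = fakeApi(filter);
  // Rebuild dropdown options only on initial load or a date-range change;
  // a filter change keeps the full cached list.
  if (clearCache || cachedModelOptions.length === 0) {
    cachedModelOptions = models;
  }
  return { stats: models, options: cachedModelOptions };
}

// Initial load passes clearCache: true and populates all options.
fetchData({ clearCache: true });

// Selecting a filter uses the default clearCache: false, so the dropdown
// still lists every model even though the stats are filtered.
const result = fetchData({ filter: "moonshotai" });
```

With the old behavior (rebuilding options on every request), `result.options` would have shrunk to just the filtered model, which is why the dropdown ended up showing "0".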
Ref - https://meta.discourse.org/t/391273