I checked the source code. If I didn't make mistakes, the tokenizer is used for two things: counting tokens for statistics and price estimation, and truncating posts to the configured limit. So it shouldn't affect me much if the wrong one is used.
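To make that concrete, here's a rough Python sketch of those two uses (Discourse itself does this in Ruby; tiktoken, the encoding name, and the function names below are just stand-ins, not the actual implementation). The point is that a mismatched tokenizer only skews the count estimate and shifts the truncation point a bit; nothing breaks outright.

```python
# Minimal sketch of the two tokenizer roles described above.
# tiktoken and "cl100k_base" are illustrative stand-ins, not Discourse's code.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(text: str) -> int:
    """Token count used for statistics / price estimation."""
    return len(enc.encode(text))

def truncate_to_limit(text: str, limit: int) -> str:
    """Drop tokens beyond the configured limit before sending the post to the LLM."""
    tokens = enc.encode(text)
    return enc.decode(tokens[:limit])

post = "Example post body " * 200
print(count_tokens(post))                  # with the wrong tokenizer this is only an estimate
print(len(truncate_to_limit(post, 100)))   # truncation point may drift slightly, nothing more
```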