I checked the source code. Unless I missed something, the tokenizer is used for two things: counting tokens for statistics and price estimation, and truncating posts to the configured limit. So using the wrong tokenizer shouldn't affect me much.
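To illustrate (a minimal toy sketch, not Discourse's actual code): the two uses above are counting and truncation, and a mismatched tokenizer only shifts the counts and the cut-off point rather than breaking anything. The two "tokenizers" below are deliberately crude stand-ins.

```python
# Toy sketch of the two tokenizer uses: counting and truncating.
# Neither tokenizer here resembles a real LLM tokenizer; they only
# demonstrate how a mismatch changes the numbers, not the behavior.

def count_tokens(text, tokenize):
    """Token count, e.g. for statistics or price estimation."""
    return len(tokenize(text))

def truncate(text, tokenize, join, limit):
    """Keep at most `limit` tokens, then re-join into text."""
    return join(tokenize(text)[:limit])

# Two deliberately different tokenizations of the same post.
by_word = str.split                                            # coarse
by_chunk = lambda t: [t[i:i + 4] for i in range(0, len(t), 4)]  # finer

post = "the quick brown fox jumps over the lazy dog"
print(count_tokens(post, by_word))                    # 9
print(count_tokens(post, by_chunk))                   # 11
print(truncate(post, by_word, " ".join, 5))           # the quick brown fox jumps
```

With a "wrong" tokenizer the count is off by a modest factor and the truncation point moves slightly, which matches the point above: statistics and limits drift, but nothing else depends on which tokenizer ran.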