I checked the source code. Unless I'm mistaken, the tokenizer is used for two things: counting tokens for statistics and price estimation, and truncating posts to the configured limit. So using the wrong tokenizer shouldn't affect me much.
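To illustrate the two roles, here's a minimal sketch. The whitespace tokenizer below is a stand-in I made up for the example, not the plugin's actual tokenizer; the point is that both operations only affect estimates and trimming, never the model's output quality.

```python
# Toy stand-in tokenizer (whitespace split) -- the real plugin uses a
# model-specific tokenizer; this only demonstrates the two roles.
def tokenize(text: str) -> list[str]:
    return text.split()

def count_tokens(text: str) -> int:
    # Role 1: statistics / price estimation. A wrong tokenizer skews the
    # estimate but does not change what the model actually receives.
    return len(tokenize(text))

def truncate(text: str, limit: int) -> str:
    # Role 2: trim a post to the configured token limit. A wrong tokenizer
    # may cut slightly earlier or later than intended, nothing more.
    return " ".join(tokenize(text)[:limit])

post = "one two three four five"
print(count_tokens(post))   # 5
print(truncate(post, 3))    # one two three
```

With a mismatched tokenizer, the count might read, say, 7 instead of 5, and the cut point might shift by a few tokens, which matches the conclusion above that the impact is minor.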