I checked the source code. Unless I'm mistaken, the tokenizer is used for two things: counting tokens for statistics and price estimation, and truncating posts to the configured limit. So it shouldn't affect me much if the wrong one is used.
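To illustrate the point (a minimal sketch, not Discourse AI's actual code): with a toy whitespace "tokenizer" standing in for the real one, a mismatched tokenizer would only shift the reported counts and the truncation point slightly, not break anything.

```python
# Hypothetical stand-in tokenizer: splits on whitespace.
# The real tokenizer differs, but the two uses are the same.

def count_tokens(text: str) -> int:
    """Token count, as used for statistics and price estimation."""
    return len(text.split())

def truncate(text: str, limit: int) -> str:
    """Truncate a post to at most `limit` tokens."""
    return " ".join(text.split()[:limit])

post = "one two three four five"
print(count_tokens(post))   # 5
print(truncate(post, 3))    # one two three
```

A different tokenizer would return somewhat different counts for the same text, so cost estimates and truncation boundaries would drift a little, which matches why the mismatch isn't a big problem here.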
lilydjwg