There’s a growing complaint in my forum that some accounts aren’t real people but bots: non-human accounts that can interact with other bot accounts and with actual human members, carry on conversations in public topics, respond to PMs, like posts, and so on.
Can anyone speak to the validity of any of that?
Can bots complete a multi-step registration process - registering with an email address and password, selecting a username, and then verifying that email address outside of Discourse?
Or is the more reasonable answer a simpler one?
Non-English speakers using translation services that aren’t great, so their posts read as semi-robotic and unnatural?
Or concerted efforts by commercial groups that manage several accounts under one roof, either to publish link spam or to indirectly promote (or keep relevant) a keyword in a topic title, first post, or the replies, interacting with the topic periodically to keep it active?
The way I see things, it doesn’t much matter where these accounts come from. If they disrupt the community without providing value, they gotta be removed.
Now there is a bit of nuance in that human users can potentially become helpful members of a community, whereas automated users probably won’t. So if you see someone flailing around and think they have potential, you might want to hold off on kicking them out. Maybe make allowances for factors such as being a non-English speaker, or an immaturity that a little time might straighten out. But if there’s no sign of progress, it doesn’t matter whether the account was created by a machine or a human.
I suspect it’s possible to identify automated accounts by looking at account details. You might get some mileage out of adding domains to blocked email domains under Login settings. (Just, you know, make sure it’s a domain that won’t be used by legitimate users.) Blocking IPs (/admin/logs/screened_ip_addresses) might also help. Still, I find it’s better to let people show me who they are and then take action. Preemptively blocking stuff isn’t as helpful as you might imagine.
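If anyone wants to script those two admin screens rather than click through them, here’s a rough Python sketch against the Discourse admin API. The forum URL, API key, username, and the mailinator.com domain are placeholders, and I’d double-check the endpoints against your own instance before relying on it:

```python
# Rough sketch using the Discourse admin API (Api-Key / Api-Username headers).
# BASE_URL, the key, and the blocked domain below are placeholders.
import requests

BASE_URL = "https://forum.example.com"   # your forum address
HEADERS = {
    "Api-Key": "your-admin-api-key",     # generated under /admin/api/keys
    "Api-Username": "system",
}

# Add a throwaway domain to the "blocked email domains" site setting.
# Site settings take a pipe-delimited list, so fetch the current value
# first and append to it instead of overwriting existing entries.
settings = requests.get(f"{BASE_URL}/admin/site_settings.json", headers=HEADERS).json()
blocked = next(
    (s["value"] for s in settings["site_settings"] if s["setting"] == "blocked_email_domains"),
    "",
)
new_value = "|".join(filter(None, [blocked, "mailinator.com"]))
requests.put(
    f"{BASE_URL}/admin/site_settings/blocked_email_domains",
    headers=HEADERS,
    data={"blocked_email_domains": new_value},
)

# Review the IPs that have already been screened.
screened = requests.get(
    f"{BASE_URL}/admin/logs/screened_ip_addresses.json", headers=HEADERS
).json()
for entry in screened:
    print(entry["ip_address"], entry["action_name"])
```

That said, per the above, I’d still treat this as a cleanup tool after someone has shown their hand, not as a preemptive filter.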