I was approached by our Growth team (SEO and GEO, i.e. visibility in generative content: ChatGPT, Perplexity, etc.), who see the value and already see forum content being surfaced in GEO conversations.
They have a wide range of content they’re focusing on at the moment, and proposed ‘simulating’ topics within our forum in a way that, externally, appears no different from any other conversation.
For example, creating 5-6 accounts and posting questions from those accounts, which we can then reply to ourselves or encourage the community’s ‘regular faces’ to answer.
To me, at its core, this is no different to organic conversations and answers - except we are planting these conversations for the purpose of better SEO/GEO.
Is this something others are doing? Is there anything I should think about before we go forward with this effort?
I remember a few topics/posts about that, most of them from the pre-AI era:
I guess the main thing in your suggestion is that no bots will answer; they will just start topics, right?
I don’t know from a business perspective. Maybe it’s beneficial in the long run. Perhaps it can help a community grow.
My gut tells me that I don’t like the idea of answering a bot without knowing it’s a bot, but I don’t have any particular argument to offer (though some appear in the replies to the topics I linked).
I’m 100% sure I have answered at least one bot at some point on Reddit in the last few years, and I don’t like this idea. It reminds me of my dog, who once tried to interact with a lifeless terracotta pig at a flea market, thinking it was a living creature (that was both sad and funny).
Now, if a topic is created by a bot, and is labelled as such, and a legitimate, constructive discussion starts between participants about the proposed topic, why not?
I hope I understood your post well, and that I’m not off-topic.
Communities are built on trust. Breaking that trust risks immediately losing all your high-value contributors.
For me personally: If I found out that I spent hours answering some bot, I would never volunteer my time to a community again. It breaks the social contract under which a community operates.
This is key:
If you’re honest and transparent about it, then it’s probably a perfectly fine experiment to run.
Quoted for truth. If you’re pursuing growth at all costs, don’t care about your reputation, and intend to have no relationship with your users, that’s another case.
This is the opposite of organic conversations and answers.
Doing this without disclosure to the community is grossly disrespectful both to your users and to the spirit of the community you’re trying to create.
Do what your boss says, but frame what you’re doing properly and don’t try to put a fig leaf over it. This is not “organic.” This is faking content as an SEO play, full stop.
Interesting concept, to get the engagement going “artificially”. My view on this is that if it’s solely to boost SEO rankings, then hey, that’s a bit like cheating, right? But at the same time it could also spark a discussion (which I believe is also the point).
Like others have mentioned, I think stating that it is a bot is important. Some users may not care much for SEO either.
I would say this: use these conversations for discussion as well, not just SEO. In fact, I would say that thoughtful, enriching discussions are more important than SEO; it’s about the community.
If SEO improvement is a side effect, then that’s an added bonus. But my opinion would be not to make SEO the center; instead, center the community and enriching discussions.
Not sure if this is all completely relevant, please feel free to correct me.
I ran a little experiment with Discourse HelperBot and I think this was pretty cool.
If you use this, properly labeled, in public, it could actually spark some interesting discussion. Last year we went through a similar exercise internally at Discourse, where we played a game of “trip up the bot”: any time someone managed to make the bot answer with a lie, we’d fix the documentation.
And yes, I do think having something like this could help with SEO/GEO. Have the AI generate a ton of content, which you verify to be true. Have it be a part of the Google index, but put it in a default-muted category maybe?
It can certainly be helpful for other users who are using search phrases that don’t appear in the documentation for example.
Exactly, I wouldn’t want to undermine that kind of intention either. I personally wouldn’t mind at all a forum where an AI tries to start conversations (think “What’s everyone’s opinion on XYZ?”), but as others have said, it just has to be clear that it’s AI.
I definitely think there can be value in it, especially for communities that are just starting out and having trouble filling up their content, but I’m convinced quality above quantity is the way to go.
Appreciate all the thoughts and perspectives shared here
Just to clarify one detail that might help add context, and to make sure I explained it fully:
This wouldn’t involve bots, AI, or company-branded accounts. These would be real people from our team, using their personal accounts, asking real questions (which happens already from time to time) - the same way anyone might if they were facing an issue and turned to the community for help.
The only difference is that the questions would be intentionally seeded to cover topics we know are valuable (from an SEO/GEO perspective) but aren’t yet well represented in the community.
Not sure if this shapes the conversation, but I definitely hear all the points about trust and transparency, and I really appreciate the discussion.
Many, many communities have started that way. Usually people stop after the community takes off. But doing it for SEO reasons makes total sense and is wise community management.
Don’t overdo it, though. At this stage I’d probably limit it to 2-3 topics per week at most, probably less. That still nets you 100 topics in a year, which is a good win for SEO.