We just realised that our Discourse forum is not indexed by Google (we remember it was indexed about a year ago), and we’re trying to fix it right now. Which configuration settings do we need to make sure are set properly?
This is what I’ve done so far:
- I’ve made sure that “allow index in robots txt” is ticked
- I’ve added the following domains to “exclude rel nofollow domains”:
  - grakn.ai (our main site domain)
  - discuss.grakn.ai (our Discourse forum domain)
- I’ve made sure that “add rel nofollow to user content” is unticked
- I’ve added Googlebot to “whitelisted crawler user agents”
Am I missing any other configurations that I need to set?
Our Google Search Console shows that discuss.grakn.ai still could not be crawled because it is blocked by robots.txt - see screenshot below.
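To double-check the robots.txt claim outside of Search Console, I ran a quick check along these lines (a minimal sketch using Python’s standard library; the paths tested are just illustrative examples of pages we want indexed, not anything special):

```python
# Minimal sketch: inspect what robots.txt on our forum actually serves,
# and ask whether Googlebot would be allowed to fetch a few example URLs.
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

FORUM = "https://discuss.grakn.ai"

# Print the raw robots.txt so any Disallow rules are visible directly.
with urlopen(f"{FORUM}/robots.txt") as resp:
    print(resp.read().decode("utf-8"))

# Let the parser answer the same question Search Console is asking.
parser = RobotFileParser()
parser.set_url(f"{FORUM}/robots.txt")
parser.read()
for path in ("/", "/latest", "/categories"):
    print(path, "->", parser.can_fetch("Googlebot", f"{FORUM}{path}"))
```

If that prints `False` for the site root, the block really is coming from whatever robots.txt the forum is serving, rather than from a stale result cached by Google.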
I’m also unclear how we ended up in this state, @codinghorror. I’ve been the admin of the site for the past year and I haven’t changed anything related to the settings above. I do remember going a very long time without upgrading, then doing an upgrade shortly before this issue started occurring, but I don’t know whether that’s related.