My forum is hosted on a subdomain, and recently its Google results have stopped updating. I made some changes and keep checking, but it has been days and nothing is refreshed. I even changed the description, and that still hasn't been picked up after three days.
Any idea about this issue?
The robots.txt file is the default one:
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
Before posting here, I tried other things, such as adding my domain to Google Search Console, and I advertised the domain too, but the indexed content is still about a month old.
If you post your domain, people could have a look.
You might try the sitemap plugin. I've seen people make very convincing arguments both that it's absolutely necessary and that it's absolutely unnecessary. One client of mine claimed that it got their site (with 230K imported posts) indexed very quickly, after they had been waiting for some weeks.
I tried to fetch the home page using Google's crawl tool and found out that it's unreachable. However, the domain itself is working fine and is reachable in a browser.
Network unreachable: robots.txt unreachable
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.
This is really bad; it should be reachable. Did it give any error?
Yes… "network unreachable". Quite confusing when the domain is working fine.
If you use your browser to go to your site's /robots.txt, do you see it OK?
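You can also check it from the command line, which rules out browser caching. Here is a minimal sketch that fetches robots.txt over both schemes, roughly reproducing what a crawler does (forum.example.com is a placeholder for your real subdomain):

```python
# Check that robots.txt is reachable over both http and https,
# and report the status code or the failure reason for each.
import urllib.request

for scheme in ("http", "https"):
    url = f"{scheme}://forum.example.com/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status, resp.headers.get("Content-Type"))
    except Exception as exc:
        print(url, "-> FAILED:", exc)
```

If one scheme succeeds and the other hangs or errors, that mismatch is exactly the kind of thing that confuses Googlebot.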
It was actually a stupid error.
I was redirecting Discourse to HTTPS via the proxy; however, HTTP was also still answering on the top-level domain. Because of that confusion between HTTPS and HTTP, I think Google was unable to fetch the robots.txt file.
I have now forced HTTP to redirect to HTTPS completely, and everything looks fine.
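For reference, the redirect looks roughly like this. This is a minimal sketch assuming an nginx reverse proxy in front of Discourse; forum.example.com is a placeholder for the actual subdomain:

```nginx
# Answer plain-HTTP requests only to send them to HTTPS,
# so crawlers ever see a single canonical scheme.
server {
    listen 80;
    server_name forum.example.com;
    return 301 https://$host$request_uri;
}
```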
Thanks everyone for helping.