Google indexing issue (robots.txt)

Google informed me of this issue with my forum. Any input on fixing it?

Also, Discourse has no sitemap by default. I found a sitemap plugin, but my question on that is – does the sitemap update constantly, or would I need to re-run the plugin frequently in order to “update” the sitemap?

Referenced forum: https://voskcointalk.com

How many pages / URLs are affected by this issue?


7 pages are affected. The example URLs Google Search Console lists (with last-crawled dates) include:

https://voskcointalk.com/u/ruth (last crawled Apr 10, 2020)
https://voskcointalk.com/u/Cmiles7888 (last crawled Apr 10, 2020)
https://voskcointalk.com/search?q={search_term_string} (last crawled Apr 9, 2020)
https://voskcointalk.com/u/VoskCoin (last crawled Apr 9, 2020)

Yes, /u/ pages are excluded in robots.txt by default. Do you think you are missing something by not indexing user profiles?

If the answer is yes, you can override your robots.txt file at

https://voskcointalk.com/admin/customize/robots
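
For context, the generated rules that matter here look roughly like this (an illustrative excerpt assuming a stock install – the full file contains more entries and varies by Discourse version):

```
User-agent: *
Disallow: /u/
Disallow: /search
```

Removing the Disallow: /u/ line on that admin page would let Googlebot crawl profile pages again.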

Is there a reason they are excluded by default? What if someone wanted to find a specific user on a forum easily through Google, e.g. by searching “greer voskcointalk”? If his profile isn’t being indexed, that wouldn’t be possible; Google could only link to one of his threads, and the searcher would then have to navigate to the profile from there.

There’s no content on the crawler view for profile pages except the bio, and spammers love putting garbage in their bio. It’s better all around to block crawling to stop the garbage from being associated with the site.


Also, Discourse has no sitemap by default. I found a sitemap plugin, but my question on that is – does the sitemap update constantly, or would I need to re-run the plugin frequently in order to “update” the sitemap?

Are you familiar with this information? Thanks for the above reply.

It updates automatically.
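
If you want to confirm that on your own site, you can fetch the sitemap at two different times and compare the timestamps. A minimal check, assuming the plugin is installed on voskcointalk.com and serves its index at the usual /sitemap.xml path:

```
# Print the start of the sitemap index; re-running this later should show
# newer <lastmod> values if the plugin is regenerating the sitemap.
curl -s https://voskcointalk.com/sitemap.xml | head -n 20
```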