Managing Bingbot

Just putting this out there while we discuss this with Microsoft (the engineers who build Bing and the PM for Bing):

  • Microsoft never told me that this blocking will cause irreparable damage to site rankings

  • Microsoft suggested a site map as a workaround

  • Microsoft did not make an explicit recommendation to use Crawl-delay versus blocking

  • Microsoft said they want to fix the underlying issue

We are testing the site map theory.

My gut feeling on this is that Crawl-delay versus outright telling the bot not to crawl will have almost the same effect if this only goes on for a month or so. Long term, Crawl-delay is not a proper solution, because the backlog they have is too large and they cannot work through all the URLs they want to.
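For context, the two robots.txt approaches being compared look roughly like this (a sketch, not our actual robots.txt; the delay value is illustrative):

```text
# Option A: outright block — Bingbot may not fetch anything
User-agent: bingbot
Disallow: /

# Option B: throttle instead of block — Crawl-delay is a
# non-standard directive that Bing honors, interpreted as
# seconds to wait between requests
User-agent: bingbot
Crawl-delay: 10
```

With option B the bot still crawls, just slowly; with option A it stops fetching entirely, which is the behavior under discussion here.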

We are only telling Bing not to crawl; we say nothing on the pages themselves asking Bing or any other crawler not to index via meta tags.
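To be concrete about the distinction: a crawl block lives in robots.txt, while a de-indexing request is a per-page meta tag. This is what we are *not* doing:

```text
<!-- Hypothetical example, not in our pages: a per-page directive
     asking crawlers to drop the page from their index entirely -->
<meta name="robots" content="noindex">
```

Since we never emit that tag, already-indexed pages should remain in Bing's index even while crawling is blocked.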

So, basically what we have here is a bunch of people with a gut feeling that, for SEO reasons, blocking a crawler for N days via robots.txt while we work this out with Bing will damage sites in Bing forever. My gut feeling is that the old content will remain in Bing's index for at least a few months, if not more.

Now, if Microsoft tells me that what we are doing will cause irreparable damage going forward, that would carry more weight.

If anything, brand new sites deploying Discourse at the moment have a better chance of finding out that we messed with Bing and finding this topic. Crawl-delay would mask that.
