Massive traffic drop from Google searches after migrating from myBB

Almost 3 weeks ago we migrated an old forum from myBB to Discourse, bringing over all the content and creating proper 301 redirects for all the old threads, categories and topics. Since the move, our incoming traffic from search engines has dropped over 20%, so far inexplicably.
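In case it's useful context, here is a minimal sketch of the kind of spot-check one can run on the redirects; the URL pairs below are placeholders for illustration, not our real paths.

```python
# Spot-check that old myBB URLs answer with a 301 pointing at the new Discourse URLs.
# The URL pairs are placeholders; in practice they would come from the myBB database.
import requests

REDIRECT_SAMPLES = {
    "https://forum.example.com/showthread.php?tid=123": "https://forum.example.com/t/some-topic/123",
    "https://forum.example.com/forumdisplay.php?fid=7": "https://forum.example.com/c/some-category",
}

for old_url, expected_new_url in REDIRECT_SAMPLES.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == expected_new_url
    print(f"{'OK   ' if ok else 'CHECK'} {old_url} -> {response.status_code} {location}")
```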

Our SEO consultant has highlighted that serving different content to the crawler may be one of the reasons Google treats the new forum so poorly compared to the past. The version of a topic page that Google sees, for example, contains less content than what a browser sees (no reply, login or search controls and no suggested topics, among other things).

Has anybody noticed similar traffic drops after a migration from an ‘old-school’ HTML forum? Any suggestions on how to fix things on our forums?

1 Like

In our experience it takes quite some time for this to stabilize, months. I wouldn’t even begin thinking about this until 3 months in.

Here’s one graph shared by a customer who did a mass migration from Ning, for example.

1 Like

I was hoping you’d have good news :frowning: Did that mass migration from Ning include 301 redirects? This shouldn’t happen; it’s not normal for traffic to drop this dramatically when the content is migrated following SEO best practices. A drop in pageviews I expected (as there are far fewer pages in Discourse than in myBB), but a drop in sessions and users is not normal. Something’s off. Any other thoughts on what’s wrong?

1 Like

Upon diving into this issue further, I found that what Discourse is doing is called cloaking, and it’s considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected (from Cloaking - Search Console Help).

I’m quite disappointed to find this out. Any way to simply disable this cloak?

Except that, actually, the content is the same. Cloaking would be if this topic (Massive traffic drop from Google searches after migrating from myBB) showed your response as “Upon diving into this issue” to users and “Upon googling more info on this issue” to Googlebot (the latter, of course, being keyword stuffing).

What Discourse does falls under this paragraph from “Hidden Text and Links”:

However, not all hidden text is considered deceptive. For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site. Remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either and will benefit from the descriptive text as well.

https://support.google.com/webmasters/answer/66353

4 Likes

Note, we are not cloaking; if you disable JavaScript you will see the exact same text Google sees (which is also the same text that is on the page).
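If you want to check that yourself, here is a rough sketch: fetch the same topic once with a Googlebot user agent and once with a regular browser user agent, then compare the text each response contains. The topic URL is a placeholder and the text extraction is deliberately crude.

```python
# Compare the text served to a Googlebot user agent with the text served to a browser.
# The topic URL is a placeholder; swap in one of your own.
import re
import requests

TOPIC_URL = "https://forum.example.com/t/some-topic/123"
USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

def visible_text(html: str) -> str:
    # Crude tag stripping; good enough for an eyeball comparison.
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

texts = {
    name: visible_text(requests.get(TOPIC_URL, headers={"User-Agent": ua}, timeout=10).text)
    for name, ua in USER_AGENTS.items()
}
print("googlebot chars:", len(texts["googlebot"]))
print("browser chars:  ", len(texts["browser"]))
```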

What you could experiment with is the sitemap plugin (Discourse Sitemap Plugin), because now that a ton of content has moved, Google may be playing catch-up.

3 Likes

Almost the same content: things like the suggested topics and all the interactions are not visible. Not sure if that counts for Google, but anyway… why is this behavior even needed, since Googlebot has been able to execute JavaScript since 2015 (mentioned also on https://meta.discourse.org/t/how-googlebot-crawls-javascript-an-interesting-article/28954)?

Catch up with what? The content didn’t move; it’s all there where it was, and it’s 301 redirected. I was not expecting traffic to drop. If anything, I was expecting it to stay stable until Google realized that the pages were responsive, usable on mobile, etc., and then to see better search results. And the drop is not insignificant: we’re talking 20% fewer visits from search results, week over week, for the past 2 weeks.

I showed you the graph above, and that indicated one year. If you are unwilling to wait that long, perhaps just switch things back to the way they were and keep it that way, if you prefer safety.

3 Likes

Discourse has been around since well before 2015, AFAIK.

Plus, crawling with JavaScript enabled is way slower than crawling with it disabled, because it costs Google resources to render the HTML for the pages.

The content did move, in fact, you just said it is all 301 redirected.


I think it does make sense to make what we consider a “crawler” configurable (especially if new crawler types pop up). If you want to be a guinea pig and disable the crawler-specific behavior for Googlebot, feel free to.

https://github.com/discourse/discourse/commit/f6fdc1ebe81652be07e8c2c12b59812305de1ba5
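For anyone wondering what “configurable crawler detection” means in practice, here is an illustrative sketch of the idea only, not Discourse’s actual code: the request’s User-Agent is matched against an editable list, and only matches get the crawler-optimized HTML.

```python
# Illustrative sketch, not Discourse's implementation: decide whether a request
# should get the crawler-optimized HTML by matching its User-Agent against a
# configurable list of substrings.
CRAWLER_USER_AGENTS = ["googlebot", "bingbot", "baiduspider", "yandexbot"]

def is_crawler(user_agent: str, crawler_list=CRAWLER_USER_AGENTS) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in crawler_list)

# Removing "googlebot" from the list would be the "guinea pig" experiment above:
# Googlebot would then be served the regular JavaScript application instead.
print(is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
print(is_crawler("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"))                    # False
```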

I agree with @codinghorror here though, if you are not willing to work with us, probably best for you to migrate back to your previous software.

6 Likes

I’m more than willing to work with you folks, don’t get me wrong: I picked Discourse after a long selection process. I simply have a problem and I’m looking for solutions: traffic is dropping at an alarming rate and it’s not clear why. The main suggestion I got from our SEO consultants is that the ‘sort-of-cloaking’ (for lack of a better term) may be to blame, but I’m hoping that having this conversation with you can lead to solving the problem.

I found that, thank you. I installed it on a clone site, produced the maps, fixed the URL to the correct domain and submitted them to Google Webmaster Tools. I’ll let you know when they’re processed.

I’d be interested to run your commit but I’d have to do it on an experimental site, I can’t run that in production… I’ll put together a dev environment tomorrow and run some tests.

That’s a very alarming chart, and I’m concerned you take it so lightly. “Wait for a year” while a crucial KPI of my community goes down the drain is a disappointing option.

Going back to finding solutions to the problem: am I understanding it right that you’re saying the drop in search traffic is simply to be expected? That migrating a community to Discourse will take ~1 year before traffic goes back to pre-migration levels, even when one migrates keeping the same domain, with 301 redirects and no other screw-ups?

6 Likes

Not an expert on SEO here or anything, but are you sure that Google is treating your site as it was?

Even when you’re 301-ing everything, the pages come out different. It is obvious to Google that your content is no longer the same (even though the texts are the same). So it is not a simple 301: a 301 means “moved”, while this is “moved” and “changed”.

If I were Google, I’d be very cautious about a site with high ranking suddenly having all its links point to some other pages which are not at all similar to the old pages. If I were Google, I’d treat these as “new” pages and maybe start counting from zero or something.

9 Likes

For several years I participated in the development of search engines, so I am somewhat familiar with how search engines build their indexes and rank sites. Three months is really very little time. I want to say that the drop in traffic is a natural process. With the transition to the new software you completely change the site structure; whether there is a redirect or not doesn’t matter, any search engine will react to this. In Discourse there are various places for improvement from the point of view of compliance with search engine requirements. I wrote about this previously; for example, many pages have no title, which can negatively affect the ranking of those pages. But these are small details, and they can’t explain something this global (the drop in traffic). You need to wait at least 6 months to draw conclusions, and analyze the search engine indexes using the mechanisms provided, for example, by Google. Sorry for my English, but I think my idea is clear.

9 Likes

Both of the explanations above are clear to me: I knew that switching to Discourse would have an impact. But I thought it would be a positive one, or at worst a neutral one.

The old forum was made of pages that looked dated, were not mobile-friendly and had ugly URLs. With the switch to Discourse, Googlebot should now see beautiful, responsive, mobile-first pages… So Google notices a switch to a better URL structure, the same content and a prettier presentation, and penalizes the new pages?

Since Google doesn’t strictly penalize 301 redirects, I’m looking for other reasons that may be affecting SEO negatively.

OLD_URL has some accrued relevance score in Google. It has not changed in a long time, so Google is going to wait quite a while before it crawls it again. NEW_URL, of course, has identical content to OLD_URL but no relevance score.

We both know that OLD_URL has been changed to a 301 redirect leading to NEW_URL, but Google won’t know that until it re-crawls it, which it’s not going to do very often. So instead of seeing a moved page, Google thinks there’s a copied page. That won’t change until it re-crawls all of the old pages and sees the 301s.

Maybe you could make a sitemap with the old URLs and a recent updated date, so that Googlebot will recrawl all of the old URLs and learn about the 301 redirects?
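Something along these lines might work, assuming you can list the old thread URLs (the ones below are placeholders); the recent lastmod date is there to nudge Googlebot into re-crawling the old URLs and discovering the 301s sooner.

```python
# Build a sitemap of the OLD myBB URLs with a recent <lastmod>, so Googlebot
# re-crawls them and finds the 301 redirects.
# The URLs are placeholders; in practice they would come from the myBB database dump.
from datetime import date
from xml.etree import ElementTree as ET

OLD_URLS = [
    "https://forum.example.com/showthread.php?tid=123",
    "https://forum.example.com/showthread.php?tid=124",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for old_url in OLD_URLS:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = old_url
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("old-urls-sitemap.xml", encoding="utf-8", xml_declaration=True)
```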

5 Likes

Moving to any other software would have the same effect.

Ranking in search engines depends on a number of parameters.

Google had developed a level of trust in the old version of the site. It had determined the crawl frequency, calculated the weight of the pages, and so on… And now everything has changed. How does a search engine respond to that? At the very least, with mistrust. This reaction is normal.

The redirects only tell it that a page has changed its address. But to search engines these pages have changed! Whether they became better or worse depends on many factors; the point is that they became different.

The weight of the pages (PageRank) and everything else has changed…

3 Likes

Like we keep saying, you need to wait at least three months. At least. Maybe more. If you are looking for quick fixes, SWITCH BACK TO YOUR OLD SOFTWARE.

1 Like

I like this suggestion, I’ll do it.

Another area I’m investigating is the default nginx rate limits; maybe Googlebot is hitting them? Thanks for the help everybody, I’ve got plenty of ideas already.
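In case it helps anyone else, this is roughly the check I have in mind, assuming a standard combined-format nginx access log at the usual path (adjust both for your setup): count how often requests from Googlebot get a 429 or 503 back.

```python
# Rough check for whether Googlebot is being rate limited: count responses by status
# for Googlebot user agents in the web server access log.
# The log path and combined log format are assumptions; adjust for your setup.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"
LINE_RE = re.compile(r'"\w+ \S+ HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

statuses = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and "googlebot" in match.group("agent").lower():
            statuses[match.group("status")] += 1

print("Googlebot responses by status:", dict(statuses))
print("Rate-limited (429):", statuses.get("429", 0))
```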

@sam I have applied your patch, but to another forum that I am building out from scratch. I’m not comfortable applying it in production, though, as at the moment I don’t think that’s the root cause of the dropping traffic.

Google is not penalizing anything. It just needs to adjust. Adjusting takes time. Be patient :smiley:

3 Likes

OK. You go to a grocery store all the time. An old, mom-n-pop style store straight out of the ’60s. Very reputable. You trust it. You tell your friends that the goods there are of good quality and come from reputable sources.

Then they move to a new, shiny location. A brand-new store, with all new faces. On the door of the old store there’s a notice: “301 - We’ve moved!”

Inside the brand-new store, they sell exactly the same stuff. But you don’t recognize the store any more. Sure, it’s modern and great and everything, but do you TRUST it? Will you recommend it to all your friends?

Or will you observe for a while? Do you suspect something fishy is going on? Are these the same people? Or have they sold out to some money-grabbing new MNC?

In other words, will the new store’s PageRank be the same as the old store’s?

14 Likes

How would you describe it when many of the top-performing pages have lost their top-10 status on the Google results page and are now on page 2?

Google is indexing the site at a rate of 8-10k pages a day, up from the 2k/day average before the migration to Discourse. Googlebot is also getting pages a lot faster than before: from 800ms down to 70ms with Discourse. Everything seems to be going fine except for the massive traffic drop, and traffic keeps dropping.

Is it possible that the default rate-limiting protections are slowing down Googlebot?