Google May 4th Core Update impact on Discourse forums

Ok. I’m lacking knowledge about how it works exactly (but it seems there is still some loading to do initially, and it could be faster. Hence this whole topic). This discussion already kind of occurred on Oct 14, above. Jeff himself actually raised the idea:

Why not take the content and create a fully pre-generated static site, as fast as it can possibly be, and submit that to search engines instead of the actual forum? The idea would be that the content is what matters most here, not the experience, and to focus on the speed of delivery, since that seems to be what search engines care about.

Infinite scrolling hasn’t been blamed lately, but you would create static pre-generated pages without infinite scrolling. And you would design it like a rally car: every bit of weight that isn’t strictly necessary gets stripped out. No hamburger menu, no logo, no avatars for posters. You focus solely on the content and on speed.

It would be like a nice restaurant (=the Discourse forum) where you set up a drive-in with orders to go. Same great food (=the content) but without any of the experience. You order at a speaker on the way in (=your search on the search engine) and your food gets handed, prepackaged, through your window. The whole idea is that this is what is demanded: only the food (=content) and the speed of delivery matter. If people like the food, maybe they’ll come back and enjoy the whole experience inside.

Afterwards, it’s up to each owner (=admin) to make a choice: do you think a drive-in is bad for your brand and refuse to do it, or do you go that route to attract more people, knowing many may never come eat inside? (A lot already don’t, and it may get even worse; your restaurant may also seem far less nice presented that way.) But maybe that famous restaurant-recommendation website will send you more people (it remains to be seen whether it’s effective).

What would be needed is a plugin or module to generate these static pages as content is added to the forum (I guess it shouldn’t be overly complicated). You would just add a link here and there to your actual forum (set to “do not crawl” for search engines). It would be up to each admin to use this solution or not.
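To make the idea concrete, a minimal sketch of what such a static page generator might emit. This is purely hypothetical: the `topic`/`posts` field names are assumptions for illustration, not the actual Discourse data model or any real plugin API.

```javascript
// Hypothetical sketch: render one forum topic as a bare-bones static HTML
// page. No scripts, no avatars, no menus; just the title, the posts, and a
// single link back to the real (interactive) topic.
function renderStaticTopic(topic) {
  // Escape HTML special characters so post text can't break the markup.
  const escape = (s) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

  const posts = topic.posts
    .map((p) => `<article><p>${escape(p.text)}</p></article>`)
    .join("\n");

  return `<!doctype html>
<html lang="en">
<head><meta charset="utf-8"><title>${escape(topic.title)}</title></head>
<body>
<h1>${escape(topic.title)}</h1>
${posts}
<p><a href="${topic.url}">Join the discussion</a></p>
</body>
</html>`;
}
```

A real plugin would of course regenerate (or invalidate) these pages whenever a post is created or edited, but the rally-car principle is the same: nothing ships except the content itself.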

If what has been said above is correct in principle, and this problem may get worse in the future, this seems like an acceptable solution to me. Or maybe I didn’t understand correctly. (Note: all this would be READ ONLY, of course.)


We already serve plain, JavaScript-less HTML to crawlers :upside_down_face:

As I said above, the thing is that the new LCP score is captured from users’ browsers, not crawlers.


Ok, but what I fail to understand, then, is that nobody is doing better in this case, right? Why would it impact search results if Discourse sites do as well as others (“as well” or “as bad” :wink:)? Other sites are also opened on Android, aren’t they?


Android is slower than average for single-core performance, which affects heavyweight single-page applications like Discourse. We go into that in depth in The State of JavaScript on Android in 2015 is… poor

The top iPhone is 10x faster than the latest Pixel when rendering Discourse. Google doesn’t take iPhone renders into account for LCP, simply because they can’t: there is no real Chrome on iOS.


So there may indeed be an advantage on that front to generating a “small pages” site to submit to search engines instead. Wouldn’t it be worth it? (Maybe not.) Or do admins have to offer new high-end phones to their users? :wink: Is that the purpose of all those ads claiming you won the latest iPhone? :rofl:

Thanks for the explanations, Falco !


From quickly skimming the Chrome User Experience Report documentation, it looks like Google gets the information from spying on users (with permission), so you’d have to persuade a lot of them to use your poor man’s Discourse :slight_smile:
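That same field data is queryable by anyone through the public CrUX API, which is a handy way to see exactly what Google has recorded about your own origin. The endpoint and request shape below follow the published API; the origin is a placeholder and the helper name is just for illustration.

```javascript
// Sketch: build a request for the public Chrome UX Report API, which
// exposes the same field data Google collects from opted-in Chrome users.
// An API key is required on the real request (appended as ?key=...).
function buildCruxQuery(origin) {
  return {
    url: "https://chromeuxreport.googleapis.com/v1/records:queryRecord",
    body: JSON.stringify({
      origin,                              // e.g. "https://example.com"
      formFactor: "PHONE",                 // mobile field data only
      metrics: ["largest_contentful_paint"],
    }),
  };
}
```

POSTing that body returns a histogram of real-user LCP values plus the 75th percentile, so a forum admin can check whether their Android audience is actually dragging the score down.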


It seems like you are confusing Ember and Ember CLI. Ember is the framework we are already using (and have been for 8+ years). Ember CLI is the command line tooling we are migrating to instead of Rails’ asset pipeline. I mention this because some things you say (e.g. that pre-3 versions require re-writes) would not be true of Ember CLI but could be true of Ember.

Again, Ember CLI doesn’t do rendering. Ember does, and at times it has had bad performance issues. Note this isn’t anything specific to Ember: all the current frameworks have performance traps you have to be aware of. After years of working with Ember we identified two hot paths (head and topic view) that needed better performance and switched to a virtual-DOM-based approach.

We might not always need to do this, depending on how Glimmer/Ember Octane works out, but the code is quite stable and runs fast even on older mobile devices now.

Ember Octane was introduced in 3.15, and there have been two LTS releases of it since (3.16 and 3.21). We will be upgrading to it, but in stages. Fortunately the Ember team lets you opt in (even per file!) to which format you want to use.

Having said all that, there is a fair amount to criticise about Ember. Back when performance was a bigger problem for Discourse, there were a couple of promised releases that ended up hurting us rather than helping us. That was tough; we had to keep a very close eye on it for a long time to meet our needs.

Today, it also has a fraction of the popularity of newer frameworks like React. However, React did not exist 8 years ago! Our only choices were Angular, Ember and Knockout. If you think upgrading Ember is tough, you should see what Angular users went through from version 1 to 2 (not to mention the side quests with Dart!).

Upgrading Ember over the years has been a lot of work, but at least it’s an option! None of the other frameworks offered any kind of upgrade path like this.

As for re-writing in Vue/Next/React, I think people severely underestimate how much code we have that works just fine. It would be an unfathomable amount of work.


Yes, that is correct. When your user population has old devices, your site will typically be rated lower.


I’m considering it @justin and @awesomerobot but I wanted Robin to weigh in on the specifics of Ember CLI first.

At its core, there is a bit of a “What happens when an unstoppable force meets an immovable object?” paradox here… we are very intentionally a JavaScript app (or SPA), and that involves tradeoffs we decided to make in 2012/2013 based on our best guess of what the future would look like in 2022/2023. Although I am obviously biased, I’d say our prediction that “gee, mobile device browser performance will be indistinguishable from desktop browser performance” was right on target.

Heck, it went beyond the target, since the most recent Apple phones are faster than laptops and desktops. :astonished: That I did not see coming!


While we will continue to improve what we can on first load speed, and speed in general, I think our track record here is laudable. For one thing, we got so much publicity in 2015 that Google internally made a bunch of improvements to V8, Chrome, and Android to address weak Qualcomm SoC performance as measured in JavaScript.

Our Achilles’ heel has been… Qualcomm. Sadly, Qualcomm hasn’t done very well on the performance front to date: even the best-performing Android devices are only at roughly iPhone 7 level, and it takes a long time for older Android devices to cycle out of the market. The 855 and 865 were both decent performers at approximately iPhone 7 level:

I have to scroll down quite a way to find an Apple device as slow as the fastest Android device; the closest match is the iPhone X / iPhone 8 at ~910 in Geekbench. Unfortunately the 865, for reasons I don’t fully understand, underperforms a bit on the web side, so we’re still at iPhone 7 performance levels in Speedometer:

I do wish we lived in a world where Android devices shipped on a variety of SoCs from a variety of companies, all competing to build the fastest and most powerful chips… :crying_cat_face: On the upside, iPhone 7 performance is solid for Discourse, and I’m happy that eventually all Android devices, even the old ones, will be “at least as fast as an iPhone 7”. I also have my fingers crossed for the upcoming Snapdragon 875; there should be more details on that in the coming months. :crossed_fingers:

According to the Geekbench 5 results, we can see that the Xiaomi Mi 11 is powered by the 5nm Snapdragon 875 (as hinted at by none other than Xiaomi executive Lu Weibing). The upcoming Xiaomi Mi 11 has managed to score 1,102 points in the single-core benchmark and 4,113 points in the multi-core test.

If true, that would put it at the A12 level, and hopefully that will manifest on the web side as well!

Anyway, there’s a core architectural decision here at Discourse to be a JavaScript app… and we are fully committed to that path for the foreseeable future.



For those keeping an eye on your stats, here’s another date to jot down. It will be interesting to see what happens in the coming weeks in relation to Google’s most recent core update, December 2020.


You cannot forget about the recently announced Apple Silicon Macs! :grinning:

Out of curiosity, where did that publicity come from?

The A10 chip is still hanging on by a thread.

Just in case, I’m setting my expectations low. Apple has always been ahead.

Even with all that said and done, Android smartphones are still trying to catch up. It’s absolutely ridiculous: Apple already has the A14 chip, and they’re probably now working on the A15 for next year.


Thanks for sharing that. A few relevant items to this discussion to pull out from the article:

What to do if you are hit. Google has given advice on what to consider if you are negatively impacted by a core update in the past. There aren’t specific actions to take to recover, and in fact, a negative rankings impact may not signal anything is wrong with your pages. However, Google has offered a list of questions to consider if your site is hit by a core update. Google did say you can see a bit of a recovery between core updates but the biggest change you would see would be after another core update.

This is helpful, too.


In my view, when talking about SEO, which is optimizing search engine results compared against other sites, talking about user hardware is mostly irrelevant.


It’s actually quite simple.

Take the case of one person and their search results on their handheld device.

Whatever the speed of their device, or whatever its chipset, the relative experience will be similar across all web sites: faster websites will be faster and slower websites will be slower, regardless of the end user’s hardware. A rising tide raises all boats and an ebbing tide lowers them. In SEO, the end user device is “noise” as a signal compared to the serving application, which is what is actually optimized in “SEO”.

Hence, even if a mobile phone were the fastest phone in the entire universe, all websites would be fast (or slow) on it depending on the speed of the network and the design of the site. The focus of SEO is on optimizing the web application and its delivery, not the end user device: if a web app performs “amazingly great” on one chipset, so do all other websites of similar design. Client devices visit, in theory, all web sites, so they are “noise” in the signal-to-noise ratio of SEO.

From a web site’s SEO perspective, the user experience will be the same across the network for all users with the same class (performance characteristics) of mobile device. The only thing that gives one web site an SEO advantage over another is the performance of the site (and its network), not the end user devices.


Because end user devices will perform the same across all web sites, generally speaking. If a user’s mobile is slow because of its memory or chipset, it will be slow across the entire cyberverse of websites. In other words, the discussion about how end user devices affect SEO is moot: search engine optimization is a server-side operation, not a client-side one.

What matters is content, presentation and performance, and how Google’s AI scores these factors across the cyberverse. If, for example, everyone in the entire world upgraded to a quantum-computing mobile phone, the SEO would be the same, because all end users would have the same “end user device performance curves”. The optimization occurs at the provider (the web site). Likewise, if the entire cyberverse degraded to slow-chipset mobile phones, the search engine rankings would stay mostly the same, because the optimization that needs to happen, happens on the servers serving the web content.

Of course Discourse, as a JavaScript-driven SPA, will perform better after it loads if mobiles are faster. So will every other web site! Generally, it’s the network performance that matters, as well as the server performance, not the end user device as far as SEO goes. This is not my opinion; it is an engineering fact. My opinion of, or emotional connection to, JavaScript or EmberJS does not change how SEO works. What works for SEO is the content and the performance of the web application.

In closing, Google uses advanced AI, mostly artificial neural networks, to determine how it ranks and indexes web content. Search engine optimization is based on how Google’s AI ranks the site: its performance and its “appeal to Google’s AI”. How much we love JavaScript or Ruby or Python, or how much we admire the elegance and mechanics of any web application we provide to end users, is not relevant, unless our passion produces unique, well-presented content and performance as Google’s AI perceives them, not as we perceive them.

We do not rank our own web sites. Google’s AI does the ranking.

As Google has stated publicly:

“One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them,” Google wrote.

The company offered the following list of questions to consider when evaluating your content:

  • Does the content provide original information, reporting, research or analysis?
  • Does the content provide a substantial, complete or comprehensive description of the topic?
  • Does the content provide insightful analysis or interesting information that is beyond obvious?
  • If the content draws on other sources, does it avoid simply copying or rewriting those sources and instead provide substantial additional value and originality?
  • Does the headline and/or page title provide a descriptive, helpful summary of the content?
  • Does the headline and/or page title avoid being exaggerating or shocking in nature?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Would you expect to see this content in or referenced by a printed magazine, encyclopedia or book?

This is SEO and Google’s core business is creating algorithms so machines can score and classify web sites.


This is inconsistent with the record - we know that Google is collecting real-world page load times from Android devices and using that for page rankings.


Yes, but all Android devices (of the same type) perform basically the same across all web sites (of the same type). In other words, if you optimize a JavaScript-based web site that uses webpacker and bundler, you are competing, from an SEO perspective, with all other sites that are JavaScript SPAs using webpacker and bundler, etc.

I did not say Google is not collecting this information. I’m trying to explain that focusing on the client device is not going to solve the SPA SEO performance issue. A “rising tide raises all boats”, so a faster CPU that processes compressed JS well (fast, optimized, etc.) will perform well on all similar web sites.

In other words, the SEO is on the server side (as I explained at great length above), not on the client side.

This is well documented, BTW.

Never mind :). I prefer not to debate this here on meta. Thanks.

Google has been quite clear about what they consider important SEO signals, now and into 2021. They constantly redesign their AI based on events and situations in cyberspace.


From an SEO perspective you’re indeed competing with sites that are technically similar.

But from a business perspective you’re competing with other sites in your market, regardless of their technology. And that could make people consider switching to a technology that is perceived “better” from a SEO perspective. And that is the boat some people are in.


There’s a long-standing, totally unfounded myth that WordPress-powered websites and blogs are somehow better search-engine optimized than other websites. Talk to many website owners and some will tell you this. I’m not sure how this myth started, but it’s out there.

And it’s totally false. Most WordPress websites are practically invisible to the world. WordPress itself doesn’t ensure a viable marketing presence; it never did and never will. In any particular subject area, plenty of successful sites run something other than WordPress.

I am positive that great content, and in particular a high degree of civil discourse, will ensure a high level of quality traffic. That is, community members will continue to visit your website no matter the Google juice afforded, and they will bring in new community members much better than any search optimization will.

I guess the question is of focus: do we point to technology for solving issues of relevance or to the quality of the discussion by the community? I think the latter will always win.


This may seem logical, but according to the discussion above, it may not be the case.
If you use technologies, features, or functions that are considerably slower on some devices than websites NOT using them, you may get penalized. One could argue it’s not really about some devices being slower in general, but about devices being slower at running certain kinds of websites compared to others.

From what I understood here, it’s mainly about infinite scrolling, and it seems Android devices are particularly bad at it compared to running “paged” websites. So, in the end, it does seem to make some sense (I didn’t really get it at first either, but after some explanations, mainly from Falco as I remember, I think it now does make sense, considering browsers like Chrome send some information back to Google about the browsing experience).

Is that the case? Isn’t the “fight” at the CONTENT level, all technical aspects aside (except speed)? Which indeed results in some more technically advanced sites (with infinite scrolling) being at a disadvantage.

If the goal is a better placement in the results, the competition is on CONTENT and then, technically speaking, on SPEED. There isn’t any classification for “technically similar” sites, or is there? (For context, I’m mainly thinking about infinite scrolling here.)


This conversation is getting a bit circular. We’ll reopen if we have anything new to add.

See Introducing Discourse Splash - A visual preloader displayed while site assets load