Improve Discourse's pagespeed score?

(Jeff Widman) #1

When I run Google’s PageSpeed tool, it gives a number of suggestions:

Discourse is only scoring a 62 on mobile and an 84 on desktop.

Are any of their suggested fixes easy improvements? For example, they suggest better image compression.

I’m a n00b on the tech side, but if @sam or @eviltrout can look at this and suggest a few easy fixes I’m willing to try putting together a pull request to implement them. I’m not sure when I look at the list of issues which ones are easy to improve and which are just a byproduct of Discourse’s rich client-side architecture and therefore pretty much impossible to change.

Normally I wouldn’t pay much attention to a silly score–I know in previous pagespeed threads @sam and @codinghorror said there isn’t much that can be improved here–but an acquaintance who runs a large forum just sent me a Google Analytics screenshot showing a ~10% bump in traffic from Google after he spent some time trying to increase his PageSpeed score.

We’re both very aware that correlation does not equal causation, but he dug through the analytics and the lift seemed to happen across his entire site–it wasn’t caused by a couple of pages driving tons of traffic. The only code modifications he made during this period were related to improving pagespeed–stuff like compressing images, etc. Plus Google explicitly says they use loading speed as a ranking factor.

So any improvements here will not only help the user experience but also drive more traffic…

PageSpeed complaints about Discourse
Google Page Speed Insight
(Kane York) #2

Well, this is the primary violation:

Eliminate render-blocking JavaScript

Except for the part where Discourse uses JS to do all of the rendering.

(Jeff Widman) #3

Are any of the others things that can be changed?

Particularly on mobile there seem to be some other warnings that aren’t tied to JS, but I’m afraid I don’t fully understand what they’re asking for. As I said, if any seem fairly easy to fix, I’m more than happy to dig into it a bit and see if I can assemble a pull request…

For desktop, it mentions image compression, but all the images mentioned are user avatars… Should that compression be happening at Gravatar, the CDN, or the Discourse server?

(Jeff Atwood) #4

Hmm, interesting, let’s see. I will right-click some system-generated default avatars and do “save as”.

C:\Users\wumpus-home\Desktop>pngout avatar-large.png
 In:    1984 bytes               avatar-large.png /c3 /f0 /d8
Out:     941 bytes               avatar-large.png /c3 /f0 /d8, 144 colors
Chg:   -1043 bytes ( 47% of original)


C:\Users\wumpus-home\Desktop>pngout avatar-small.png
 In:    1381 bytes               avatar-small.png /c3 /f0 /d8
Out:     722 bytes               avatar-small.png /c3 /f0 /d8, 124 colors
Chg:    -659 bytes ( 52% of original)

So yeah, @sam, we could save nearly 50% of file size on system-generated default avatars if we pngout them.

(Sam Saffron) #5

That is very minor compared to the general complaint Google has here. Agreed it should be fixed, though – @zogstrip, can you add it to your list?

PageSpeed is basically complaining about JS blocking rendering. Thing is, with Discourse nothing renders until all the JS parses. So there is absolutely nothing we can do until Ember.js provides us with a clean mechanism for doing a pre-render server side.

(Régis Hanol) #6

That PNGOUT library is really good at optimizing PNGs :astonished:

…which needs that PR to be merged before it actually optimizes the images.

(ben_a_adams) #7

Not sure if something has changed with the Docker container version, but I’m now getting “compression not enabled” warnings on the asset JS files, stylesheet cache, locales, and categories.json. Is this something I should change on the server, or is it handled inside the Docker container?

39 / 100 Suggestions Summary:
Enable compression for the following resources to reduce their transfer size by 1.5MiB (76% reduction).

(ben_a_adams) #8

Hmm, might be me proxying through nginx; I’ll look a bit deeper.

Yes, it’s a proxy passthrough issue – resolution provided in the linked post.

Running other websites on the same machine as Discourse
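For anyone hitting the same warning: the usual culprit when an outer reverse proxy sits in front of the Discourse container is that the proxy doesn’t compress the responses it passes through. A minimal sketch of the fix (hypothetical server names and ports – adjust to your own setup, and see the linked post for the authoritative resolution):

```nginx
# Outer nginx reverse proxy in front of the Discourse container.
server {
    listen 80;
    server_name forum.example.com;

    # Compress text assets on the way out; without this, PageSpeed
    # reports "compression not enabled" for JS, CSS, and JSON.
    gzip on;
    gzip_proxied any;   # also compress responses to proxied requests
    gzip_types text/css application/javascript application/json image/svg+xml;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Note that `gzip_proxied any` matters here: by default nginx skips compressing responses to requests that arrive via a proxy.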
(Juan) #9

Very relevant issue/discussion considering the current internet landscape. Kudos @jeffwidman!

(Sigurður Guðbrandsson) #10

Wow, this is actually my specialty, website speed optimization.

For images - adding an API for image optimization would be a big step.

Render-blocking JavaScript - adding async to the script tags might improve the PS score, if you don’t rely on the JavaScript running in the imported order.

The CSS - if you can inline it in the header, you’ll increase your PS score.
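For illustration, the two suggestions above look roughly like this in markup (a sketch only – the file names are hypothetical, and Discourse’s asset pipeline generates its own tags):

```html
<head>
  <!-- Inline the critical above-the-fold CSS so first paint
       doesn't wait on a stylesheet request. -->
  <style>
    body { margin: 0; font-family: sans-serif; }
  </style>

  <!-- async: download in parallel, execute as soon as ready
       (execution order NOT guaranteed).
       defer: download in parallel, execute in document order
       after parsing finishes. -->
  <script async src="/assets/analytics.js"></script>
  <script defer src="/assets/application.js"></script>

  <!-- Load the full stylesheet without blocking render -->
  <link rel="stylesheet" href="/assets/full.css" media="print"
        onload="this.media='all'">
</head>
```

The async/defer distinction is why the caveat about import order matters: `async` scripts run whenever they finish downloading, so inter-script dependencies break unless you use `defer` instead.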

Other than that - Good job :smile:

(D. Cavenger) #11

You can change a lot of things to improve page speed, but you shouldn’t necessarily look strictly at the Google Page Speed test (although it is great for testing usability). The most important thing is actual site speed (as in seconds). There are better services out there which can help you benchmark this. Here are a few:

These services also give you a lot of information about the speed bottlenecks of your website and how you can fix them. Hope it helps!

(Christoph) #12

I suppose the pagespeed module for Nginx will not help either, right?

(Rafael dos Santos Silva) #13

The PageSpeed module is useful when you have an old site whose source code you can’t change. You add it on the reverse proxy and it fixes stuff for you – while breaking a lot of other stuff.

Makes no sense when you can actually fix your software.

(Christoph) #14

Another suggestion from optimization engines is to combine the many avatars using sprites:

I’m assuming that this has already been considered and dismissed, but I’m curious to learn why exactly. Are avatars changing too frequently, or are they needed in such unpredictable combinations that chances are you’d have to download a whole sprite sheet for every avatar you need? But even then you’d at least have them cached locally, saving dozens if not hundreds of HTTP requests…

(Matt Palmer) #15

Sprite-sheeting avatars isn’t practical. Just look at the set of URLs listed there – 9 different letters and 10 different colours for letter avatars, plus three user avatars (which could change at any time). There’s (at least) 37 characters, 216 colours, and the myriad of sizes that Discourse uses. How do you chop up the sprite sheet? If by characters, you’re forcing people to download about 216x the data they would for a single avatar (one cell per colour), with a 1-in-37 chance of getting a “sheet hit”. If segmented by colour, you’re looking at (at least) a 37x increase in data, for a 1-in-216 chance of getting a “sheet hit”.

To “bundle” all the avatars for a given topic into one sheet, that’s significant code complexity and poor cacheability (every time a new person posts, or someone changes their avatar, a new sheet needs to be generated and served to everyone).

And all this for what? A small improvement in the non-blocking render path for people stuck using an outdated version of HTTP. Can’t see the benefit, myself.

Frankly, any site that claims to provide page speed recommendations but doesn’t start with “enable HTTP/2”, and then base further recommendations on HTTP/2 being enabled, is living in the stone age, and can be dismissed as such.