Need "window.prerenderReady" implemented on my instance [PAID GIG]

I’m using Prerender to serve the JavaScript version of the site to crawlers (my instance serves crawlers the JS version via a hidden site setting)

It’s working well, but it seems Discourse might fall under this category from Prerender’s docs:

> some web pages use some custom loading flows or constant polling that may trick Prerender’s logic; thus, it fails to make a decision on the readiness of the page.

The first time Prerender accesses any Discourse URL, it times out (20 seconds, set by Prerender)

The page renders fine; it’s just that Prerender doesn’t know the page is fully loaded, so it “stays” trying to render the page until the 20 seconds are up, then serves the HTML version

If the crawler requests the page again, it will serve the page in 1 second (give or take) - as there’s an HTML version of the URL in the cache

…but that’s not practical, as there are thousands of URLs, and 20 seconds for each URL (the first time it’s accessed) will not work

So I will need the code below added right after the <head> tag (and the variable set to true when the page is done loading)

<script> window.prerenderReady = false; </script>

I’d like this to work site-wide - hopefully that makes the job easier
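For scoping: assuming the standard theme tools, the snippet can go site-wide without touching core by pasting it into a theme’s `</head>` customization (Admin → Customize → Themes → Edit CSS/HTML → `</head>`) - that admin path is my assumption of the usual route, not something from Prerender’s docs:

```html
<script>
  // Tell Prerender the page is not ready yet; something later must set it true.
  window.prerenderReady = false;
</script>
```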

Not sure what this entails but if I’m off please let me know - $300? $400?

Does anyone have any feedback on this?

Perhaps there is a core file I can edit in the meantime

Do you have the code you’re using to do that somewhere?


The code to serve the JS version?

It was the hidden site setting crawler_user_agents that you (@pfaffman) helped me enable/adjust

Edit: I removed “bots”, “crawlers” and “spiders” from the above list

How is Prerender involved? How would Discourse know when to include the <head> tag?

Oh, I think Prerender is referring to the existing <head> tag, no?

For us to add the <script> window.prerenderReady = false; </script> right under the existing <head> tag

Edit: I’m also not sure whether they need the code set within the head tags or after the closing head tag

How did you install prerender to make it serve the prerendered pages? Three methods are listed at How to Install Prerender® - Prerender. Did you use one of those?


Yes, I used the Cloudflare middleware.

So Cloudflare takes any requests from bots and sends them to Prerender
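For context, that Cloudflare-to-Prerender hop usually looks something like the following hypothetical Worker sketch. The `BOT_AGENTS` list, the token placeholder, and the wiring are illustrative assumptions - not the exact middleware Prerender ships:

```javascript
// Hypothetical sketch of Cloudflare middleware routing crawlers to Prerender.
// BOT_AGENTS is a tiny illustrative list; the real middleware uses a longer one.
const BOT_AGENTS = ["googlebot", "bingbot", "yandexbot", "duckduckbot", "baiduspider"];

// Pure check: does the User-Agent look like a known crawler?
function isCrawler(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_AGENTS.some((bot) => ua.includes(bot));
}

// Worker-style handler: crawlers get the Prerender cache, humans hit the origin.
async function handleRequest(request) {
  const ua = request.headers.get("user-agent");
  if (!isCrawler(ua)) {
    return fetch(request); // normal visitors pass straight through
  }
  // service.prerender.io is Prerender's documented endpoint; token is a placeholder.
  return fetch("https://service.prerender.io/" + request.url, {
    headers: { "X-Prerender-Token": "YOUR_TOKEN_HERE" },
  });
}
```

The useful part for this discussion is that the decision happens entirely at the edge - Discourse itself never knows a crawler was rerouted.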

So can you supply an API call to Prerender that will return the true/false value that you want in

 <script> window.prerenderReady = false; </script>

Aha, I see - I did some reading on the API, and this might be a little over my head (but hopefully this makes the task easier)

The budget’s a bit low for this kind of thing, basically. Modifying what gets served to crawlers can be trickier than it seems; various issues can arise.

Personally, I’m a little skeptical as to the wisdom of doing this in the first place, but I’m sure you have your reasons.

I think Jay means the Discourse client API. You could use that via a theme component to tell when Discourse is fully rendered.

You seem like you’re somewhat familiar with software development. I made a little introductory course on Discourse theme development last year, which includes a discussion of how to use the API in a theme. It’s free and open source. You can read through it starting here:

You’ll probably need to use a frontend event that’s triggered when the page is rendered. There are some examples of that once you get to the first JavaScript unit in the course.
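To make that concrete: a minimal sketch of the whole flag flip in a theme’s `</head>` section, assuming the classic `text/discourse-plugin` wrapper and `api.onPageChange` (which fires after a route has rendered). The exact event to hook is a judgment call and may need tuning for late-loading content:

```html
<script>
  window.prerenderReady = false;
</script>
<script type="text/discourse-plugin" version="0.8">
  // Hypothetical: flip the flag once the Ember app has rendered a page.
  api.onPageChange(() => {
    window.prerenderReady = true;
  });
</script>
```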


Thanks for the reply Angus

I’m not sure that’s what needs to be done. The crawlers are already getting an HTML version of my discourse instance.

Too early to tell, but I’m pretty optimistic. It’s just a lot of SEO cleanup - Google is crawling a new site entirely. I just can’t imagine Google ranking the non-JS crawler version of the site and giving it the same rank as the actual user experience

The first part I need done is to just get that code up there in the <head>

Then it’s to implement this part according to Prerender’s docs:

> then make sure you only set this variable true when your page is finished rendering, and it’s safe for Prerender to grab the content. This is possible in an async call that runs very late on your page. Prerender will then wait a small amount of time to ensure all the calls are finished and save your page.
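That “async call that runs very late” can be approximated with a small quiet-period latch: the flag only flips after some window of no network activity. This is a generic sketch under my own assumptions (the `makeReadyLatch` name and the 1-second default are invented, not from Prerender):

```javascript
// Hypothetical helper: flips target.prerenderReady to true only after
// `quietMs` milliseconds with no reported activity (e.g. XHR/fetch completions).
function makeReadyLatch(target, quietMs = 1000) {
  let timer = null;
  target.prerenderReady = false; // start "not ready"
  return function reportActivity() {
    if (timer) clearTimeout(timer); // activity: restart the quiet window
    timer = setTimeout(() => {
      target.prerenderReady = true; // quiet long enough - safe to snapshot
    }, quietMs);
  };
}

// In the browser you would call the returned function after each request, e.g.:
// const busy = makeReadyLatch(window);
// somePollingCall().then(busy);
```

The trade-off is the same one Prerender’s timeout makes - you trade a short fixed delay for certainty that polling has settled - just measured in one second instead of twenty.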

Will be going through the documentation you provided - thank you for that


I’m not sure what you’re seeing, but in our experience, the crawler view has ranked sites quite well. We’ve had customers report that their community outranked their main site.


It could just be the site has very valuable content, and the algorithm is ignoring the “bad parts”. Each case is different.

It just wouldn’t make sense for Google to rank the crawler version as if it were the JS version (in the general sense)

There is no menu, no suggested topics, no gutter links; user profile/badge pages are noindex, and a boatload of other features are just not available in the crawler version.

I’ll update with a new topic once the results are in. So far the positioning in the SERPs is very erratic

Edit: user profile/badge pages are noindex via headers