Hi guys
I really love Discourse. I am working on an SPA too, and have been following your development closely; when you make an architectural decision, we tend to roll with it as well.
You use RoR and EmberJS, we use ASP.NET MVC and AngularJS. Different technologies, but conceptually similar on both the client and the server.
We have noticed that you don’t seem to serve web crawlers (googlebot, bingbot, etc.) your JavaScript. You never followed Google’s hash fragment recommendations, which turned out to be good news, because that scheme was recently deprecated.
So we have been doing the same: we opted not to follow the hash fragment recommendations either, and when a web crawler arrives at our website, we serve it a server-generated version of the page. If a regular human arrives, they get the full SPA experience. Roughly along the lines of the sketch below.
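To make the approach concrete, here is a minimal sketch of what I mean on the ASP.NET MVC side. The crawler user-agent list, view names, and `LoadTopic` helper are all hypothetical and just for illustration, not our production code:

```csharp
using System;
using System.Linq;
using System.Web.Mvc;

public class TopicController : Controller
{
    // Hypothetical list of crawler user-agent substrings.
    private static readonly string[] CrawlerAgents =
        { "googlebot", "bingbot", "yandexbot", "baiduspider" };

    private static bool IsCrawler(string userAgent)
    {
        if (string.IsNullOrEmpty(userAgent)) return false;
        var ua = userAgent.ToLowerInvariant();
        return CrawlerAgents.Any(bot => ua.Contains(bot));
    }

    public ActionResult Show(int id)
    {
        var topic = LoadTopic(id); // hypothetical data access

        if (IsCrawler(Request.UserAgent))
        {
            // Crawlers get plain, fully server-rendered HTML.
            return View("ShowCrawler", topic);
        }

        // Humans get the SPA shell; AngularJS takes over on the client.
        return View("ShowSpa", topic);
    }

    private Topic LoadTopic(int id) { return new Topic(); }
}

public class Topic { }
```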
One thing I have noticed in particular is that the Google cache from before we stopped giving Google our JS is a complete mess. The cached page tries to hit our API (CORS says no!) and download our views (again, CORS).
Since we stopped serving crawlers the JS and switched to the server-side rendered version, the Google cache view of our pages looks much nicer.
However, Google seem pretty consistently adamant that they can handle JavaScript. I have been talking to people on other forums who suggest that we should just let Google loose on our SPA, and that the cache view isn’t important.
Any feedback would be really welcome. Are you planning at some point to stop serving server-side rendered versions of pages to web crawlers, or will you continue to do so?
Do you honestly believe that Google can handle JS as well as they say they can?
Apologies if this is off-topic, but hearing your opinions/views on this would be really great.
Thanks!