I really love Discourse. I'm working on an SPA too, and I have been following your development closely; when you make an architectural decision, we tend to roll with it as well.
You use RoR and EmberJS; we use ASP.NET MVC and AngularJS. Different technologies, but conceptually similar on both the client and the server.
So we have been doing the same: we opted not to follow the hash-fragment recommendation either. When a web crawler arrives at our website, we serve it a server-generated version of the page; a regular human gets the full SPA experience.
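For illustration, here is a minimal sketch (in TypeScript) of the kind of user-agent check that decides which version to serve. The token list and function names are hypothetical, and user-agent sniffing is only one possible detection signal; our actual check lives in ASP.NET middleware.

```typescript
// Illustrative, non-exhaustive list of crawler user-agent tokens.
const BOT_TOKENS: string[] = [
  "googlebot",
  "bingbot",
  "yandexbot",
  "duckduckbot",
  "baiduspider",
];

// Returns true if the user-agent string looks like a known crawler.
function isCrawler(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_TOKENS.some((token) => ua.includes(token));
}

// Pick which pipeline handles the request: the prerendered
// (server-generated) page for bots, the full SPA for humans.
function pickResponse(userAgent: string): "server-rendered" | "spa" {
  return isCrawler(userAgent) ? "server-rendered" : "spa";
}

console.log(
  pickResponse("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
); // server-rendered
console.log(pickResponse("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // spa
```

The nice part of this approach is that humans never pay the prerendering cost, and crawlers never have to execute any JS at all.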
One thing I have noticed in particular: Google's cached copies of our pages from before we stopped serving our JS to crawlers are a complete mess. The cached page tries to hit our API (CORS says no!) and download our views (again, blocked by CORS).
Since we stopped giving them the JS and switched to the server-side rendered version instead, the Google-cached versions of our pages look much nicer.
Any feedback would be really welcome. Are you planning at some point to stop serving server-side rendered versions of pages to web crawlers, or will you continue to do so?
Do you honestly believe that Google can handle JS as well as they say they can?
Apologies if this is off topic, but hearing your opinions/views on this would be really great.