If I may add…
Before I ever brought up sending the JS version to Google here, I was tinkering with it.
I tested sending the JS version to Google around the beginning of April or so, using the Google mobile tool. I remember it returning a result most of the time, even if the result looked broken.
I thought it might be this commit: I made the code edits, rebooted, and saw the same behavior.
Perhaps someone remembers a PR or commit from the past couple of months that may have altered browser and/or crawler detection?
Edit: Sorry for all the updates; the more info the better, amirite?
While I was trying prerender last month, Google ended up adding 2,000 URLs to the forum's coverage (mostly these URLs).
They were all served in 0.005 seconds; prerender had the URLs cached and ready for Googlebot to access, so it picked them all up quickly.
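For anyone following along, the fast responses make sense given how prerender-style middleware typically works: it sniffs the user agent, and if the request looks like a crawler and a snapshot is cached, it serves static HTML instead of the JS app. A minimal sketch of that idea (hypothetical names and regex, not my actual setup or prerender's internals):

```javascript
// Hypothetical sketch of UA-based crawler detection + snapshot cache.
// Not the real prerender middleware -- just the general shape of it.
const CRAWLER_UA = /googlebot|bingbot|baiduspider/i;

// url -> prerendered HTML snapshot, kept in memory
const cache = new Map();

function isCrawler(userAgent) {
  return CRAWLER_UA.test(userAgent || "");
}

// Returns a cached snapshot for crawlers, or null to fall
// through to the normal JS application.
function serveSnapshot(url, userAgent) {
  if (isCrawler(userAgent) && cache.has(url)) {
    // Served straight from memory -- why responses can be ~5 ms
    return cache.get(url);
  }
  return null;
}

// Example usage:
cache.set("/t/some-topic/123", "<html><body>prerendered topic</body></html>");
serveSnapshot("/t/some-topic/123", "Mozilla/5.0 (compatible; Googlebot/2.1)");
// -> returns the cached HTML string
```

Because a memory-cache hit costs essentially nothing, the bot can burn through thousands of URLs in one pass, which fits what I saw in the coverage report.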
Point is, perhaps the crawler got “very used” to the no-JS version and committed resources to fetching those 2k pages.
So now it's accessing the site in this manner until it figures things out (and realizes it needs to access with JS more). Just a theory.