Would the batch method be suitable for a large number of rebakes?
2851000 / 27182220 (10.5%)
This is our current progress after starting the normal rebake command yesterday; it ticks over about 1,000 posts every 3 seconds (at that rate, roughly 23 hours for the full pass). We are very close to the end of our import journey and testing, and I just wanted to make sure there wasn't a better way to rebake a large site before we settled on this slower method.
Can anyone explain how this in_batches version works? Presumably it does the rebake in batches, but the posts above state that by default it rebakes in batches of 100 every 15 minutes.
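For what it's worth, here is a minimal sketch of what a batched mark-and-throttle pass might look like, assuming a Rails `Post` model with a `baked_version` column (as in the Discourse schema). This is illustrative, not the actual rake task:

```ruby
# Hedged sketch, not the real Discourse task: mark posts for rebake in
# batches, pausing between batches so the database stays responsive.
Post.in_batches(of: 100) do |batch|
  batch.update_all(baked_version: nil) # "marking" = invalidating the cached HTML
  sleep 1                              # throttle; tune to taste
end
```

A scheduled background job can then pick up posts whose `baked_version` is stale and re-cook them a few at a time.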
I have a 2-million-post rebake job to do and am trying to figure out the best way to go about it. The job has no urgency, but I want to make sure that normal operation and administrative tasks (such as backups) are not impacted by a long-running job.
And now I just read this post: Rebaked all my posts, but what's it doing now? which tells me the rebake task isn't even rebaking the posts but just marking them for rebaking (how is this mark done?). The process is so slow that I'm really struggling to believe it takes this long just to mark a post for rebaking.
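My guess at what "marking" means, based on the `baked_version` column in the Discourse schema (the exact internals may differ):

```ruby
# Marking: just invalidate the cached HTML so a background job
# re-cooks the post later. This should be a cheap UPDATE.
post.update_columns(baked_version: nil)

# An actual rebake: re-cook the raw Markdown into HTML right away.
post.rebake!
```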
Be thankful it doesn't overwhelm your site. The whole point is to keep this process from consuming too many resources, so the site stays responsive while it runs.
Indeed, marking should be very quick. And rebake_post does seem to actually call the cooking code. Maybe some async tasks happen as part of this, or as a result of it?
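If so, a rebake could look roughly like the sketch below: cook synchronously, then enqueue follow-up work. The helper and job names here are assumptions for illustration, not Discourse's actual internals:

```ruby
# Hypothetical shape of a rebake that fans out async work; names are
# illustrative, not Discourse's real implementation.
def rebake_post(post)
  post.update!(cooked: cook_markdown(post.raw)) # synchronous re-cook (cook_markdown is hypothetical)
  Jobs.enqueue(:post_process, post_id: post.id) # async follow-up: images, links, etc. (job name assumed)
end
```

That would explain why the synchronous part finishes quickly while work keeps happening in the background afterwards.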
I wrote a program to scan all the imported posts to find which markups/smileys they contained. Then I wrote another program to bake the raw posts into HTML and update the database directly.
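In case it helps anyone, the scanning pass can be as simple as something like this (a sketch assuming `:smiley:`-style shortcodes in the raw text; not my exact code):

```ruby
# Rough sketch of the scanning pass: tally :smiley: shortcodes across
# all posts. The regex is an assumption about the markup involved.
counts = Hash.new(0)
Post.find_each do |post|
  post.raw.scan(/:[a-z0-9_+-]+:/) { |code| counts[code] += 1 }
end
pp counts.sort_by { |_, n| -n }.first(20) # twenty most common shortcodes
```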