The stats below are from Vanilla, imported into Discourse.
Did you use the bulk importer? If so, you’ll need to run `rake import:ensure_consistency`
to generate those stats.
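For reference, on a standard Docker-based install that task is usually run from inside the app container; a rough sketch (paths assume the default `/var/discourse` layout, so adjust if yours differs):

```shell
# Assumes the standard Docker install at /var/discourse.
cd /var/discourse
./launcher enter app                 # open a shell inside the app container
cd /var/www/discourse
su discourse -c 'bundle exec rake import:ensure_consistency'
```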
Yes, Justin. I used the bulk importer (Vanilla).
Will try that now. Thanks!
Any idea why, after a few minutes of running `rake import:ensure_consistency`, the SSH session becomes unresponsive?
I already set
`ServerAliveInterval 3600` just to make sure it won’t die.
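For anyone else hitting this, a keep-alive like that typically goes in the client’s `~/.ssh/config` (the host alias here is hypothetical):

```
Host my-discourse-server
    ServerAliveInterval 3600
    ServerAliveCountMax 3
```

Running the task inside `screen` or `tmux` is another common way to keep it alive if the SSH session drops.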
How long should `rake import:ensure_consistency` take for 20 million records?
A very long time. If it can process 10K per minute, you’re looking at 33 hours. It could easily be 10X that long.
EDIT: Beware: This assumes that I can do arithmetic with a calculator.
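For what it’s worth, the arithmetic checks out (the ~10K/minute throughput is Jay’s assumption, not a measured figure):

```shell
# 20,000,000 rows at an assumed ~10,000 rows per minute:
echo $((20000000 / 10000))       # minutes
echo $((20000000 / 10000 / 60))  # hours (integer part of ~33.3)
```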
Thanks Jay for clarifying!
It’s ok Jay - math is hard.
There’s not a progress bar, no. As Jay said, it will likely run for quite a long time. Give it 24 hours and check back.
You could, in another window, run something like `PostTiming.all.count`
in a rails console. If the count keeps growing, you’d know it’s making progress. Maybe. But I didn’t look at what it’s actually doing.
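A minimal sketch of that check, assuming a standard Docker install and run from inside the app container:

```shell
# Inside the app container; prints the current PostTiming row count.
cd /var/www/discourse
su discourse -c "bundle exec rails runner 'puts PostTiming.count'"
# Re-run a few minutes later; a growing number suggests the task is progressing.
```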
One last question: when I do an admin/backup and restore it, will those stats be included in the backup?
It’s on the database, so it will be included.
Also, you’ll want to look at /sidekiq and make sure that all of those jobs are done before moving to production.
Thanks, Jay. So when I restore the backup onto a new server, do I need to run `rake import:ensure_consistency` again, since it performs an “update” step after the inserts?
I don’t believe so. Once those tables are updated they’ll be in the database.
Great! Thanks, Jay, and thanks to the Discourse team for the swift response!
Jay, I have 1 more question.
Why does PostReply have no data? The task went through that step and proceeded to the next one;
it’s already at the “update user stats” stage.
If you’re importing I’m not 100% sure a PostReply
is generated. I believe that’s for direct replies to specific posts, like mine to yours.