Improving import and export support

Hmm - sorry, I didn’t quite follow the sentence - do you mean some kind of warning message on the /admin page, just like we do for ImageMagick, missing Facebook tokens, etc.?

Or something else?

No, any box with less than 2GB mem + swap combined.

I’m running one on a $10/mo DigitalOcean cloud server, which is 1GB of ram, but I also set up a 2GB swapfile.
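In case it helps anyone, here is roughly how I set that swapfile up. This is a sketch assuming a stock Ubuntu droplet; adjust the size and path to taste:

```bash
# Create and enable a 2GB swapfile (run as root).
fallocate -l 2G /swapfile
chmod 600 /swapfile    # swap can contain sensitive memory, so lock it down
mkswap /swapfile       # format the file as swap space
swapon /swapfile       # enable it immediately

# Make it persist across reboots.
echo '/swapfile none swap sw 0 0' >> /etc/fstab
```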

Ok, so I have it set up now on a higher-specced box, and I was able to run the backup =).

I downloaded the backup tar.gz file, and extracted it. Inside, I have:

  • dump.sql
  • meta.json
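
(The backup filename below is made up; the extraction itself is just a plain tarball.)

```bash
tar -xzvf discourse-backup.tar.gz   # -v lists dump.sql and meta.json as they come out
```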

Just to clarify - the meta.json file has a timestamp as the version number - does this mean I need to match the exact same git commit in order to do a successful import/export? Is this going to be the case all the way until 1.0?

And the dump.sql - that’s just a straight PostgreSQL dump, right? So there isn’t really a specific Discourse import/export or backup format per se, it’s just a dump of the database.

Hmm, my original purpose of testing this feature was to see how I might import another forum (www.lefora.com) into Discourse.

If I want to do that - do I need to generate a dump.sql file exactly like this, including all the DDL statements, in order for Discourse to import it? There’s no easier way/format?

I set up a DigitalOcean box with low memory and enough swap to muck around, and I ran into a problem where running backups or restores from the admin panel would fail with ‘Waiting for sidekiq to finish running jobs…’ and multiple lines of ‘Waiting for 1 jobs…’. I don’t think it has anything to do with memory; I tried resizing the DO droplet up to 2GB of memory and it didn’t help.

Turns out that even though the (default) queue was empty, sidekiq still had one email worker from my registration email that I needed to remove before the backup would run. I hadn’t set up email for this install, since I just wanted to play around and not use it for anything other than testing. After clicking ‘Clear workers list’ it runs fine, even with 512MB of memory (+ swap, of course).
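
For anyone else who hits this, here is a quick way to peek at what sidekiq is holding on to. This assumes a standard install under /var/www/discourse; the Sidekiq API calls are the stock ones, but double-check them against your version:

```bash
cd /var/www/discourse
RAILS_ENV=production bundle exec rails runner '
  require "sidekiq/api"
  puts "queued:  #{Sidekiq::Queue.new("default").size}"  # jobs waiting in the default queue
  puts "workers: #{Sidekiq::Workers.new.size}"           # busy workers that block the backup
'
```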

The meta.json file currently only holds the current database version (which is the timestamp when the last migration was created). This value is used during the restore to make sure you’re not importing a newer version of the database without migrating first.
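
In other words, it’s just a Rails migration timestamp, so you can compare it against the newest file in db/migrate of your checkout. The values here are illustrative:

```bash
cat meta.json             # e.g. {"version": 20140101010101}  (made-up value)
ls db/migrate | tail -1   # newest migration; its timestamp prefix must not be older than the backup's version
```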

Not exactly.

We do use the standard pg_dump command to generate the dump, but then we make a slight modification.

In order to limit the amount of downtime during the restoration, we make sure to restore the backup in the restore schema instead of the standard public schema.

This allows us to limit the downtime to the time it takes to switch the public schema to the backup schema, and the restore schema to the public schema :wink:
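
To sketch the idea, here is a hypothetical version of that flow in shell. The flags, database name, and schema names are illustrative, not our exact code:

```bash
# Backup: a standard dump of the live database.
pg_dump --no-owner discourse > dump.sql

# Restore: load the dump into a side schema, then swap names, so the site
# only ever points at a complete dataset and downtime is just two renames.
psql discourse -c 'ALTER SCHEMA public RENAME TO backup;'
psql discourse -c 'ALTER SCHEMA restore RENAME TO public;'
```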

You will need to write custom code for that. You may want to take a look at

for inspiration.

Unfortunately no. Welcome to a world of pain :rage1:

Hi,

Have there been any changes/updates on the Discourse import/migration front?

Or is improving import/export support still on the roadmap somewhere?

There was talk about this Discourse migration service releasing some of their code as open source; however, that doesn’t appear to have eventuated.

Is anybody aware of any up-to-date migration/import tools for Discourse?

There are some migration tools. @techapj, can you provide a howto topic for finding them and getting started? It can be a very basic howto for now, basically just the location of the code. Longer term we will produce a detailed howto for migrating a sample phpBB forum, but it may take a few months for us to get to that.

But in the meantime, the migration / conversion code is open source; you just have to know where in the tree to find it. Warning: it is evolving rapidly and is meant only for developers at this time.

Here is the howto for migration:

This is totally implemented and quite rocking.

Backup/Restore on Discourse is quite a lifesaver at times.
