Postgres dump and import into Docker

(dhyasama) #1


I did a pg_dump on an older version of Discourse (0.9.6) and need to import it into another instance, which is running with the docker setup from @sam.

I have the sql export in the docker shared volume but I’m not sure about the procedure for importing it within docker. A few outlined steps would be greatly appreciated.


(Régis Hanol) #2

You’ll need to ssh into the container with ./launcher ssh app; your dump will then be available in the /shared directory.

(dhyasama) #3

And then follow the move db procedure in the docs?

(Régis Hanol) #4

The second part, yes.

(Alex Hartley) #5


I’m still really struggling with this. I have discourse running on heroku and want to migrate the DB over to the docker setup that @sam has done.

I have a brand new forum now set up on digital ocean with docker, have done a pg_dump of the heroku database and copied it into the shared directory of the container.

Now I’m struggling to import the latest.dump file (I’ve been trying to use the pg_restore command but it won’t connect), so any help would be appreciated. I tried to follow the instructions in this post but have no real frame of reference, so am a bit lost.

Thanks in advance.

(Ben T) #6

Can you update your heroku instance? Then you could use the new backup feature, and restore from that.

(Alex Hartley) #7

What’s the new backup feature @trident? I used this repo to install on heroku and have modified it a bit. But it doesn’t look like it’s been updated in a couple of months.

I think my issue is more the process of restoring a DB when it comes to docker. I thought I’d just be able to pg_restore using something like:

pg_restore --verbose --clean --no-acl --no-owner -h localhost -U myuser -d mydb latest.dump

but am unsure of the DB connection credentials. But I may be going down totally the wrong path.
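For what it’s worth, in the standard Docker setup Postgres runs inside the container itself, so one approach is to restore over the local socket instead of TCP. A sketch, assuming the default discourse database and peer authentication for the postgres system user (both assumptions, check your own setup):

```shell
# Sketch, assuming the standard Discourse Docker container: Postgres
# listens on a local socket with peer authentication, so switching to
# the postgres system user avoids the password prompt entirely.
./launcher ssh app                        # enter the container from the host

sudo -u postgres pg_restore --verbose --clean --no-acl --no-owner \
    -d discourse /shared/latest.dump      # dump copied into the shared volume
```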

(Ben T) #8

Maybe try setting up a local install of discourse, pulling the data, and then generating the backup file. It may be easier than connecting to the docker instance over ssh and dumping the database in there.

The other path is to reconcile that repo with the current latest code. Then you’d just back up from the admin console.

(dhyasama) #9

Here are my notes from the point of getting the sql file into /shared. They are a bit rough and there is probably a better way, but this worked for me.

sudo -u postgres psql

ALTER USER discourse WITH SUPERUSER;

Drop the old database. You'll probably have to kill active connections first:

SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'discourse';

drop database discourse;

create database discourse;

ALTER DATABASE discourse OWNER TO discourse;
\c discourse

Import the data:

sudo su discourse

psql discourse < my-file.sql

cd /var/www/discourse

bundle install --without test --deployment
RUBY_GC_MALLOC_LIMIT=90000000 RAILS_ENV=production bundle exec rake db:migrate
RUBY_GC_MALLOC_LIMIT=90000000 RAILS_ENV=production bundle exec rake assets:precompile
RUBY_GC_MALLOC_LIMIT=90000000 RAILS_ENV=production bundle exec rake posts:rebake

I have always had to run db:migrate twice.
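The psql steps above can also be collected into a single file and run non-interactively, which makes them easier to repeat. A rough sketch, keeping the same names (discourse database/user, my-file.sql) from the notes:

```shell
# Sketch: the SQL steps above gathered into one file. Run it while
# connected to a *different* database (postgres), so DROP DATABASE can
# succeed; names are the defaults from the notes above.
cat > /tmp/reset_discourse.sql <<'SQL'
ALTER USER discourse WITH SUPERUSER;
SELECT pg_terminate_backend(pid) FROM pg_stat_activity
  WHERE datname = 'discourse';
DROP DATABASE discourse;
CREATE DATABASE discourse OWNER discourse;
SQL

# The actual run (shown, not executed here, since it needs a live cluster):
echo "sudo -u postgres psql -d postgres -f /tmp/reset_discourse.sql"
echo "sudo -u discourse psql discourse < my-file.sql"
```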

(Alex Hartley) #10

Thanks @dhyasama, I may try that if I can’t get the backup function to work.

I’ve managed to get the latest version running on my local machine. I pg_restored the DB and manually ran the migrations that were missing.

Discourse successfully made a backup from that data, but when I came to import the backup into the new digital ocean install, it started normally and then threw the error:

[2014-03-03 12:10:55] psql:/var/www/discourse/tmp/restores/default/2014-03-03-121055/dump.sql:34: ERROR: relation "api_keys" already exists

So made progress, but still not there yet.

(Alex Hartley) #11

Would it matter which version of postgres I was using to do this?

I upgraded my local machine to 9.3.3 but it still gives me the same error. The digital ocean machine is 9.2.4.

A fresh backup restores no problem. It’s only after the pg_restore that it seems to break. The forum on the local machine works fine though with the imported data.

I got to the point where I had copied the latest.dump file into the docker container and could pg_restore from there, but I was unsure of the database credentials when asked for them.

(Alex Hartley) #12

I think I’d been running discourse incorrectly on my local machine. I used a bitnami virtual machine image and it exports and imports fine. The issue I’m having now is that the DB dump from the bitnami machine gives me the error:

[2014-03-09 18:37:31] psql:/var/www/discourse/tmp/restores/default/2014-03-09-183730/dump.sql:10: ERROR: unrecognized configuration parameter "lock_timeout"

When trying to use it as a backup for the digital ocean machine.

This is, I think, because the postgres version on the digital ocean machine is older than the postgres version used on the bitnami virtual machine.

I’ve tried to upgrade postgres on the docker container but haven’t been successful yet.

UPDATE: Just deleting the lock_timeout line from dump.sql and recompressing seems to work fine. It imported into postgres 9.2 with no problems.
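That manual edit can be scripted. A sketch of the same fix — the file here is a tiny stand-in; on a real backup you’d run the sed line against the extracted dump.sql and then re-tar/gzip the archive:

```shell
# Stand-in dump.sql with the 9.3-only parameter that trips up 9.2:
cat > dump.sql <<'SQL'
SET statement_timeout = 0;
SET lock_timeout = 0;
SET client_encoding = 'UTF8';
SQL

# Delete the offending line; everything else is left untouched.
sed -i '/^SET lock_timeout/d' dump.sql

grep -c "lock_timeout" dump.sql || true   # prints 0: the line is gone
```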

(Leo Bergnéhr) #13

Followed this and got a really old discourse db successfully migrated to the latest bits in minutes. Thanks!

(Passante) #14

@dhyasama following your steps I get asked for a password:

[sudo] password for discourse:
I need to restore a DB into docker but I cannot; copying raw files does not work.