Download a backup with `wget`

For a seasoned sysadmin, the favored way to move large files is a tool like rsync or scp, but sometimes you don't have SSH access to the host from which you need to retrieve a large backup file.

Because backup files contain sensitive information, Discourse has security features that make it very difficult for an unauthorized person to retrieve a backup. You must get a download link via email and use that link while logged in as the user who requested it. If you want to download the data with a web browser, it's quite painless. If you want to move that backup to another server on the internet, however, most residential internet connections make that a painful proposition. On my home connection, a 2.3 GB file takes over 20 minutes to upload. The 12 GB file here would take on the order of two hours.
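As a quick back-of-the-envelope check, 2.3 GB in 20 minutes implies roughly 2 MB/s of upload bandwidth, and scaling that to 12 GB gives the two-hour estimate:

```shell
# ~2.3 GB in ~20 minutes is roughly 2 MB/s of upload bandwidth.
# At that rate, a 12 GB file (12 * 1024 MB) takes about:
echo "$(( 12 * 1024 / 2 / 60 )) minutes"
```

That works out to just over 100 minutes, i.e. close to two hours.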

Here is how to get a link that will allow you to pull a backup from a Discourse site via wget.

First, initiate the download as usual and then open Chrome’s downloads page (chrome://downloads/). Right-click on the URL and copy it.

Then, in a shell on the machine where you want the file, paste that URL into a wget request. Make sure you put quotes around the URL, since the ampersands in it will otherwise cause problems in the shell. You also need to preserve the backup file's original filename. Your request should look something like this:

wget "" -O discourse-2020-11-19-001538-v20201116132948.tar.gz

You can also omit the -O filename and rename the file (which will have the full URL as its filename) after you've downloaded it. The URL is time-limited, so initiate the download only when you're ready to pull it to your server. (Note that the flag is a capital -O; lowercase -o tells wget where to write its log instead.)
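If you do skip -O, the saved filename is the whole signed URL, query string and all. A rename might look like this; the query string shown is a made-up stand-in for the real signature parameters:

```shell
# Hypothetical saved name: wget kept the URL's query string in the filename.
saved='discourse-2020-11-19-001538-v20201116132948.tar.gz?X-Amz-Expires=900'
# Strip everything from the first '?' onward to recover the original name.
mv "$saved" "${saved%%\?*}"
```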

You can then move the file to /var/discourse/shared/standalone/backups/default and restore it from the web interface, or from the command line with:

cd /var/discourse
./launcher enter app
discourse enable_restore
discourse restore

The final command above, run without a filename, will print a list of the available backups; copy and paste the correct filename into another discourse restore command to start the restore.
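Putting the transfer step together, moving the downloaded archive into place before the restore can be sketched as follows; the paths assume a standard /var/discourse install, so adjust them if yours differs:

```shell
# Assumed filename and standard Discourse backup directory.
BACKUP=discourse-2020-11-19-001538-v20201116132948.tar.gz
DEST=/var/discourse/shared/standalone/backups/default

mv "$BACKUP" "$DEST/"
ls -lh "$DEST"    # confirm the archive landed before starting the restore
```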