Downloading a backup from an email link using `wget`

For a seasoned sysadmin, the favored tools for moving large files are rsync or scp, but sometimes you don't have SSH access to the host where you need to retrieve a large backup file.

Because backup files contain sensitive information, Discourse has security features that make it very difficult for an unauthorized person to retrieve a backup. You must get a download link via email and use that link while logged in as the user who requested it. Downloading the data with a web browser is painless; moving that backup to another server on the internet is not, because most residential internet service has limited upload bandwidth. On my home connection, a 2.3 GB file takes over 20 minutes to upload, so the 12 GB file here would take on the order of two hours.
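That two-hour figure is just the measured rate extrapolated. As a quick back-of-the-envelope check:

```shell
# Extrapolate: 2.3 GB uploaded in 20 minutes -> minutes needed for 12 GB.
awk 'BEGIN { rate = 2.3 / 20; printf "%.0f minutes\n", 12 / rate }'
# prints: 104 minutes
```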

Here is how to get a link that will allow you to pull a backup from a Discourse site via wget.

First, initiate the download as usual and then open Chrome’s downloads page (chrome://downloads/). Right-click on the URL and copy it.

Then, in a shell on the machine where you want the file, paste that URL into a `wget` request. Make sure to put quotes around the URL, as the unescaped `&`s will otherwise cause you problems. You also need to use `-O` to keep the backup file's original filename. Your request should look something like this:

wget --show-progress "https://bucket-name.s3.us-west-2.amazonaws.com/backups/xyz/multisitename/discourse-2020-11-19-001538-v20201116132948.tar.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAWWK5WHOFJ%2F20201119%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20201119T013442Z&X-Amz-Expires=300&X-Amz-SignedHeaders=host&X-Amz-Signature=1753b97a8aaf6953c89aa28628b8db" -O discourse-2020-11-19-001538-v20201116132948.tar.gz

You can also omit `-O` and its filename and rename the file (which will otherwise have the full URL, query string and all, as its filename) after you've downloaded it. The URL is time-limited (the `X-Amz-Expires=300` parameter in the example above means it expires five minutes after it was generated), so initiate the download only when you're ready to pull it to your server.
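If you'd rather not type that long filename by hand, you can recover it from the signed URL itself with shell parameter expansion. A minimal sketch, assuming the URL follows the pattern above, with the filename as the last path segment before the `?` (the URL here is a shortened placeholder):

```shell
# Paste your real signed URL here; this one is abbreviated for illustration.
url='https://bucket-name.s3.us-west-2.amazonaws.com/backups/xyz/discourse-2020-11-19-001538-v20201116132948.tar.gz?X-Amz-Expires=300&X-Amz-Signature=abc'
name="${url%%\?*}"   # drop the query string (everything from the first "?")
name="${name##*/}"   # drop the path (everything up to the last "/")
echo "$name"         # discourse-2020-11-19-001538-v20201116132948.tar.gz
```

Then `wget --show-progress -O "$name" "$url"` downloads straight to the right name.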

You can then move the file to `/var/discourse/shared/standalone/backups/default` and restore it from the web interface or with:

cd /var/discourse
./launcher enter app
discourse enable_restore
discourse restore

The final command above will print a list of the available backups and you can copy-paste the correct one to start the restore.


Sadly, this does not seem to work outside of sites using S3.

I believe that it will work.


I think this won’t work, as S3 authenticates identity through the X-Amz-Signature parameter in the URL, whereas downloading directly from Discourse requires cookie-based authentication. Merely copying the URL into wget does not suffice for authentication.
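For what it's worth, fetching directly from a Discourse server would mean carrying the browser's session cookie over to wget. A hypothetical sketch only: the `_t` cookie name and every value below are assumptions you'd copy from your own logged-in browser session, not anything documented for this purpose (the `echo` makes the sketch safe to run; drop it to actually execute):

```shell
# Placeholders only: copy the real cookie value from your browser's dev tools.
cookie='_t=paste-session-token-here'
url='https://forum.example.com/admin/backups/example.tar.gz'
# Build the request; wget would send the cookie so the server sees a logged-in user.
echo wget --header "Cookie: ${cookie}" -O example.tar.gz "$url"
```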


Hmm. Maybe that’s right. Maybe I’ve only ever done it via S3 (like probably from CDCK sites where I don’t have ssh access).