In the past when I clicked the Download
button on the backups screen, the browser downloaded the file. Now the tooltip reads “Send email with download link” and it emails me with a link to download it. Is that new behavior that I just missed or is there an issue with my install?
Yes, that is new behavior added to increase the security of the backups. I’m looking for the relevant topic now…
Edit: Here you go:
So. I have four Discourse forums. I like to download my backups daily, for each. In each case, I must navigate to the “backups” page, click on the download link, switch over to my e-mail program, wait for the message to arrive, and click on the link that’s provided? Every day?
I understand the need for security (and that’s one reason I don’t use SSO at all!), but this sucks.
Given this convoluted workflow, is there some way that I could easily bypass all of this and automatically download the most recent backup for each of my sites daily?
Have you considered having your backups automatically upload to S3? If you want to take backups every day, that's even simpler than logging in to download them.
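If you go that route, a rough sketch of flipping the relevant settings from the command line via the admin site-settings API might look like the following. The setting names (backup_location, s3_backup_bucket, backup_frequency, and so on) and the Api-Key header style vary between Discourse versions, so treat this as illustrative rather than copy-paste:

```bash
#!/bin/bash
# Rough sketch: point automatic backups at an S3 bucket via the admin
# site-settings API. Verify the exact setting names against your own
# /admin/site_settings page; older versions take api_key/api_username
# query parameters instead of these headers.
HOST="https://forum.mysite.com"
API_KEY="your-admin-api-key"
API_USER="system"

set_setting() {
  curl -s -X PUT "$HOST/admin/site_settings/$1" \
    -H "Api-Key: $API_KEY" \
    -H "Api-Username: $API_USER" \
    --data-urlencode "$1=$2"
}

set_setting backup_location s3
set_setting s3_backup_bucket my-discourse-backups
set_setting s3_region us-east-1
set_setting s3_access_key_id "AKIA..."
set_setting s3_secret_access_key "..."
set_setting backup_frequency 1    # back up (and upload) every day
```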
Thanks, @eviltrout, I may look into that. I’ve usually stayed away from Amazon’s services, and so had forgotten that this was a baked-in option.
(Logging in hasn’t been an issue, as I’m a moderator on each of the forums anyway and have browser tabs for them pinned open, so downloading the automatic backups has been dead easy.)
We are also going to be adding more cloud providers for automatic backups in the future via the Rails Girls Summer of Code project, if Amazon isn’t your thing.
The best kinds of backups are the ones you don’t have to do anything about
Yeah, and Dropbox Backups already works.
Is there a way to disable this feature? I used to download backups automatically to a server in my control by inspecting /admin/backups.json using an admin API token (of an account which doesn’t have an email address in the first place).
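For reference, the old flow described here looked roughly like the sketch below. The /admin/backups.json route comes from this post; the download route and the api_key/api_username parameter style are assumptions from that era and may not match newer versions:

```bash
#!/bin/bash
# Sketch of the old automated flow: list the backups via the admin API,
# then fetch the newest one. Requires jq, and assumes /admin/backups.json
# returns an array of backups with the newest entry first.
HOST="https://forum.mysite.com"
AUTH="api_key=your-admin-api-key&api_username=backup_bot"

LATEST=$(curl -s "$HOST/admin/backups.json?$AUTH" | jq -r '.[0].filename')

# Download that backup file (this is the step the email link now replaces)
curl -o "$LATEST" "$HOST/admin/backups/$LATEST?$AUTH"
```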
I’d be more interested in being able to set an encryption key for the backups. If someone manages to steal the API key for my backup bot then I could still rest relatively easy knowing the private key is only on the backup download server, which doesn’t face the internet apart from downloading the backups.
I understand the security implications, however, I’d consider storing (unencrypted) backups on someone else’s server to be more dangerous than the risk of someone obtaining the API key for my bot (which resides only on a system I have physical control of, and does not act as a server). Making Discourse push backups to the system would be less optimal for me as that would mean I had two internet-facing servers to worry about instead of one.
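In the meantime, one way to approximate that is to encrypt the archive yourself as soon as it reaches the download box, for example with GPG against a public key whose private half lives only on an offline machine. This is plain GPG rather than anything Discourse provides, and the filename is just the example used later in this topic:

```bash
# Encrypt a freshly downloaded backup with a public key; only the holder of
# the matching private key (kept offline) can read it. Then remove the plaintext.
BACKUP="mysite-2017-05-31-113000-v20170526125321.tar.gz"
gpg --encrypt --recipient backups@forum.mysite.com "$BACKUP" \
  && shred -u "$BACKUP"
```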
There’s no way to disable this feature, and because we want to be very secure by default I can’t see us allowing a user to do that any time soon. You could build a plugin that allows you to access the backups directly.
The encryption key is not a bad idea as a separate feature, but it wouldn’t mean we’d pull back this feature.
Sorry for the late reply.
Oh yeah, allowing a user to disable the feature kind of makes the feature meaningless I suppose, hadn’t thought of it that way.
Then I’ll consider making a plugin. Thanks for the quick response and suggestion!
We had an issue here because it made it much harder to download a backup from a (third-party) provider where we don’t have shell access and upload it to one of our own servers. Especially with a large backup, and with both servers in the USA (we’re in Europe), this process was costing us hours instead of minutes.
Somehow the backup link does not work with an api_key, so a simple wget will not work.
We made a small script that can be called from the command line. It needs three parameters:
- username
- password
- download link from the email, including the token
So something like
./download.sh myuser mypass https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584
Here is the script:
#!/bin/bash
# Usage: ./download.sh <username> <password> <download-url-with-token>
USERNAME=$1
PASSWORD=$2
URL=$3

# Extract the hostname from the download URL
HOSTNAME=$(echo "$URL" | awk -F/ '{print $3}')

# Fetch a CSRF token; Discourse requires one before it will accept a login.
# The awk/cut pipeline pulls the 88-character token out of the JSON response.
TOKEN=$(
  curl \
    --header "X-Requested-With: XMLHttpRequest" \
    -A "MSIE" \
    -b cookies.txt \
    -c cookies.txt \
    -v \
    "https://$HOSTNAME/session/csrf" | awk -F: '{print $2}' | cut -c2-89
)

echo "Using Token $TOKEN Username $USERNAME Password $PASSWORD"

# Log in, storing the authenticated session cookie in cookies.txt
curl \
  --header "X-Requested-With: XMLHttpRequest" \
  --header "X-CSRF-Token: $TOKEN" \
  --header "Origin: https://$HOSTNAME" \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -v \
  --data "login=$USERNAME&password=$PASSWORD" \
  "https://$HOSTNAME/session"

# Download the backup using the session cookie plus the token from the email
curl \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -O \
  "$URL"
One more suggestion: would it be an idea to exclude the email with the download link from being affected by the disable_email setting?
I think it is quite a bad idea.
Browser downloads tend to stall at some point, and the backup package is usually quite big. If we only get one chance to download before the link expires, the download often stops halfway.
Can you support resuming interrupted downloads? If not, you should think of a better way to help people successfully download the whole package through the browser.
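For what it's worth, if the download endpoint honours HTTP range requests (I have not verified that it does), curl can resume a broken download as long as the emailed token is still valid:

```bash
# Resume a partial download into the same local file (-C - lets curl work out
# the offset itself). Only works if the server supports Range requests and the
# token in the emailed link has not expired.
curl -C - -o mysite-2017-05-31-113000-v20170526125321.tar.gz \
  "https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584"
```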
I have the same issue. I have to disable this feature for personal reasons.
I understand that you want to be very secure by default. But can we have non-default options?
Or can we have an option to send the link to a different email address?
I strongly agree with having it enabled by default.
But there are many features that can be disabled or that are off by default; that doesn’t make them meaningless.
Whether the link comes by email or not isn’t the main issue here. We must remember that downloading backups was a solution to a real issue, which was, and still is, that backups are done too rarely.
I take a backup of my webstore’s database every 5 minutes. My CMS takes a backup every hour. With Discourse, the freshest fish I can get natively is once a day. Because I’m lost with Docker and anything other than MySQL/MariaDB, that means every day I’m taking the risk of losing some content if something goes bad inside that 24-hour window.
That has actually happened to me once.
With the current email-link method, when I was downloading backups from a region with a restricted network, I always got stuck at around 30%, even when using a VPN for a while.
I will run into the same issue again if it is still email-link-only at that point.
If you are self-hosted and want to download the backup, you can do it via ssh/scp. Another option is to have backups stored on S3. Downloading them via the web browser is not a reliable way to automate the process.
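For a standard Docker-based install, the ssh/scp route looks roughly like this; the path is the usual default backup directory, and the host, user, and destination are placeholders:

```bash
# Pull the backup archives off the server over SSH. The source path is the
# default location for a standard Docker install ("standalone" container).
scp "root@forum.mysite.com:/var/discourse/shared/standalone/backups/default/*.tar.gz" /local/backups/
```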
If you want that level of being able to go back in time, you should configure Postgres to allow rollbacks and/or have a live backup server.
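If you do go down that road, the usual building block is PostgreSQL WAL archiving, which keeps a continuous stream of changes you can replay for point-in-time recovery. A very rough sketch, assuming the standard Docker data directory and skipping the base-backup and restore side entirely:

```bash
# Sketch: enable WAL archiving for point-in-time recovery. The data directory
# path assumes a standard Docker install; Postgres must be restarted afterwards,
# and the exact wal_level value depends on your Postgres version.
mkdir -p /var/backups/wal
cat >> /var/discourse/shared/standalone/postgres_data/postgresql.conf <<'EOF'
wal_level = replica
archive_mode = on
archive_command = 'test ! -f /var/backups/wal/%f && cp %p /var/backups/wal/%f'
EOF
```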