Download backup - email link only

In the past when I clicked the Download button on the backups screen, the browser downloaded the file. Now the tooltip reads “Send email with download link” and it emails me with a link to download it. Is that new behavior that I just missed or is there an issue with my install?

3 Likes

Yes, that is new behavior added to increase the security of the backups. I’m looking for the relevant topic now…

Edit: Here you go:

5 Likes

So. I have four Discourse forums. I like to download my backups daily, for each. In each case, I must navigate to the “backups” page, click on the download link, switch over to my e-mail program, wait for the message to arrive, and click on the link that’s provided? Every day?

I understand the need for security (and that’s one reason I don’t use SSO at all!), but this sucks.

Given this convoluted workflow, is there some way that I could easily bypass all of this and automatically download the most recent backup for each of my sites daily?

Have you considered having your backups automatically uploaded to S3? If you want to take backups every day, that’s even simpler than logging in to download them.
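If it helps, this is roughly what the relevant settings look like in the env section of containers/app.yml on a standard Docker install. The exact setting names have varied between Discourse versions, and the bucket, region, and keys below are placeholders:

  DISCOURSE_BACKUP_LOCATION: s3
  DISCOURSE_S3_BACKUP_BUCKET: my-forum-backups
  DISCOURSE_S3_REGION: us-east-1
  DISCOURSE_S3_ACCESS_KEY_ID: AKIA-placeholder
  DISCOURSE_S3_SECRET_ACCESS_KEY: secret-placeholder

The same settings can also be entered through the admin settings UI instead of app.yml.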

1 Like

Thanks, @eviltrout, I may look into that. I’ve usually stayed away from Amazon’s services, and so had forgotten that this was a baked-in option.

(Logging in hasn’t been an issue, as I’m a moderator on each of the forums anyway and have browser tabs for them pinned open, so downloading the automatic backups has been dead easy.)

We are also going to be adding more cloud providers for automatic backups in the future via the Rails Girls Summer of Code project, if Amazon isn’t your thing.

The best kinds of backups are the ones you don’t have to do anything about :slight_smile:

9 Likes

Yeah, and Dropbox Backups already works.

5 Likes

Is there a way to disable this feature? I used to download backups automatically to a server in my control by inspecting /admin/backups.json using an admin API token (of an account which doesn’t have an email address in the first place).
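For reference, the kind of call I mean is roughly this; the hostname and account name are placeholders, and on older Discourse versions the key went in api_key/api_username query parameters rather than headers:

curl -s \
  -H "Api-Key: $ADMIN_API_KEY" \
  -H "Api-Username: backup_bot" \
  https://forum.example.com/admin/backups.json

That returns a JSON list of the backup filenames, which my bot then used to fetch the newest archive.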

I’d be more interested in being able to set an encryption key for the backups. If someone manages to steal the API key for my backup bot then I could still rest relatively easy knowing the private key is only on the backup download server, which doesn’t face the internet apart from downloading the backups.

I understand the security implications; however, I’d consider storing (unencrypted) backups on someone else’s server to be more dangerous than the risk of someone obtaining the API key for my bot (which resides only on a system I have physical control of, and does not act as a server). Making Discourse push backups to the system would be less optimal for me, as that would mean I had two internet-facing servers to worry about instead of one.
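To illustrate what I mean by an encryption key: something like public-key encryption of the archive, where the public key is configured on the Discourse side and the private key never leaves my download server. A rough sketch of the effect with gpg (the key ID and filename are made up):

# encrypt with the public key; only the holder of the private key can read it
gpg --recipient backups@example.com --encrypt mysite-2017-05-31.tar.gz

# later, on the machine that holds the private key
gpg --output mysite-2017-05-31.tar.gz --decrypt mysite-2017-05-31.tar.gz.gpg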

4 Likes

There’s no way to disable this feature, and because we want to be very secure by default I can’t see us allowing a user to do that any time soon. You could build a plugin that allows you to access the backups directly.

The encryption key is not a bad idea as a separate feature, but it wouldn’t mean we’d pull back this feature.

4 Likes

Sorry for the late reply.

Oh yeah, allowing a user to disable the feature kind of makes the feature meaningless, I suppose; I hadn’t thought of it that way.

Then I’ll consider making a plugin. Thanks for the quick response and suggestion!

1 Like

We had an issue here because it made it much harder to download a backup from a (third party) provider where we don’t have shell access, and upload it to one of our own servers. Especially when handling a large backup, with both servers in the USA (we’re in Europe), this process was costing us hours instead of minutes.

Somehow the backup link does not work with an api_key so a simple wget will not work :frowning:

We made a small script that can be called from the command line. It needs three parameters:

  • username
  • password
  • download link from the email, including the token

So something like

./download.sh myuser mypass https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584

Here is the script:

#!/bin/bash
# Usage: ./download.sh <username> <password> <download link from the email, including the token>
USERNAME=$1
PASSWORD=$2
URL=$3
HOSTNAME=$(echo "$URL" | awk -F/ '{print $3}')

# Fetch a CSRF token (required to log in) and start a session cookie jar;
# the awk/cut pipeline pulls the token out of the JSON response.
TOKEN=$(
curl \
  --header "X-Requested-With: XMLHttpRequest" \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -v \
  "https://$HOSTNAME/session/csrf" | awk -F: '{print $2}' | cut -c2-89
)

echo "Using Token $TOKEN Username $USERNAME Password $PASSWORD"

# Log in so the session cookie becomes authenticated
curl \
  --header "X-Requested-With: XMLHttpRequest" \
  --header "X-CSRF-Token: $TOKEN" \
  --header "Origin: https://$HOSTNAME" \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -v \
  --data "login=$USERNAME&password=$PASSWORD" \
  "https://$HOSTNAME/session"

# Download the backup using the authenticated session and the tokenized link
curl \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -O \
  "$URL"

One more suggestion: would it be an idea to exempt the email containing the download link from the disable_email setting?

7 Likes

I think this is quite a bad idea.

Browser downloads tend to stall or fail partway through. The backup archive is usually quite big, and if we only get one chance to download it before the link expires, the download often stops halfway.

Can you support resuming interrupted downloads? If not, you should think of a better way to help people successfully download the whole package with the browser.
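For what it’s worth, this is the kind of resume I mean; a rough sketch with curl that would only work if the download endpoint honoured HTTP Range requests and the emailed link were still valid (it reuses the example link from earlier in the thread):

# -C - tells curl to inspect the partially downloaded file and continue from where it stopped
curl -C - -O "https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584"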

I have the same problem. I need to disable this feature for personal reasons.

I understand that you want to be very secure by default, but can we have non-default options?
Or can we have an option to send the link to a different target email address?
I completely agree with it being enabled by default.

There are plenty of features that can be set to disabled or non-default values; presumably that doesn’t make them meaningless.

Whether or not the link goes by email is not the main issue here. We should remember that backup downloading was a solution to a real problem, and that problem was, and still is, backups being taken too rarely.

I back up my online shop’s database every 5 minutes. My CMS backs up every hour. With Discourse, the freshest backup I can get natively is once a day. Because I’m lost with Docker and anything that isn’t MySQL/MariaDB, that means I run the risk every day of losing content if something goes wrong within that 24-hour window.

That has actually happened to me once.

1 Like

With the current email-link method, when I was downloading backups in a region with a restricted network, I always got stuck at 30%, even when I used a VPN for a while.

I will have a similar problem in the future if it is still email-link only by then.

If you are self-hosted and want to download the backup, you can do it via ssh/scp. Another option is to have backups stored on S3. Downloading them via the web browser is not a reliable way to automate the process.

If you want to be able to go back in time at that granularity, you should configure postgres to allow rollbacks and/or run a live backup server.
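As a rough sketch of the ssh/scp route, assuming a standard Docker install where the container is named standalone (the host and destination path below are placeholders):

# copy all backup archives from the forum server to a local directory
scp 'admin@forum.example.com:/var/discourse/shared/standalone/backups/default/*.tar.gz' /local/backups/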

2 Likes