In the past when I clicked the Download button on the backups screen, the browser downloaded the file. Now the tooltip reads “Send email with download link” and it emails me with a link to download it. Is that new behavior that I just missed or is there an issue with my install?
Yes, that is new behavior added to increase the security of the backups. I’m looking for the relevant topic now…
Edit: Here you go:
So. I have four Discourse forums. I like to download my backups daily, for each. In each case, I must navigate to the “backups” page, click on the download link, switch over to my e-mail program, wait for the message to arrive, and click on the link that’s provided? Every day?
I understand the need for security (and that’s one reason I don’t use SSO at all!), but this sucks.
Given this convoluted workflow, is there some way that I could easily bypass all of this and automatically download the most recent backup for each of my sites daily?
Have you considered having your backups automatically upload to S3? If you want to take backups every day, that's even simpler than logging in to download them.
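For reference, on a standard Docker install this is controlled by the S3 backup site settings, which can also be pinned as environment variables in containers/app.yml. A minimal sketch, assuming a reasonably recent Discourse (the bucket name, region, and keys below are placeholders):

## containers/app.yml, under the env: section
DISCOURSE_BACKUP_LOCATION: s3
DISCOURSE_S3_BACKUP_BUCKET: my-discourse-backups
DISCOURSE_S3_REGION: us-east-1
DISCOURSE_S3_ACCESS_KEY_ID: <access key id>
DISCOURSE_S3_SECRET_ACCESS_KEY: <secret access key>

Combined with the automatic daily backups, that takes the browser (and the email link) out of the loop entirely.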
Thanks, @eviltrout, I may look into that. I’ve usually stayed away from Amazon’s services, and so had forgotten that this was a baked-in option.
(Logging in hasn’t been an issue, as I’m a moderator on each of the forums anyway, and have browser tabs for them pinned open, so downloading the automatic backups has been dead easy.)
We are also going to be adding more cloud providers for automatic backups in the future via the Rails Girls Summer of Code project, if Amazon isn’t your thing.
The best kinds of backups are the ones you don’t have to do anything about 
Yeah, and Dropbox Backups already works.
Is there a way to disable this feature? I used to download backups automatically to a server in my control by inspecting /admin/backups.json using an admin API token (of an account which doesn’t have an email address in the first place).
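For reference, the listing half of that approach looks roughly like the sketch below (the host and bot account are hypothetical, and header-based API auth is assumed; older installs passed api_key/api_username as query parameters instead). Note that, as discussed further down, the tokenized download link itself does not accept an API key.

# List the available backups as JSON using an admin API key.
curl -s \
  -H "Api-Key: <admin api key>" \
  -H "Api-Username: backup_bot" \
  https://forum.mysite.com/admin/backups.json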
I’d be more interested in being able to set an encryption key for the backups. If someone manages to steal the API key for my backup bot, I could still rest relatively easy knowing the private key is only on the backup download server, which doesn’t face the internet apart from downloading the backups.
I understand the security implications, however, I’d consider storing (unencrypted) backups on someone else’s server to be more dangerous than the risk of someone obtaining the API key for my bot (which resides only on a system I have physical control of, and does not act as a server). Making Discourse push backups to the system would be less optimal for me as that would mean I had two internet-facing servers to worry about instead of one.
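To illustrate the idea (standard gpg, not an existing Discourse feature): with asymmetric encryption, the side producing or fetching the backup only ever needs the public key, so a stolen API key exposes nothing readable while the private key stays on the non-internet-facing machine.

# Encrypt a backup with the recipient's public key; only the holder
# of the matching private key can decrypt it. The key id is a placeholder.
gpg --encrypt --recipient backups@example.com mysite-2017-05-31-113000.tar.gz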
There’s no way to disable this feature, and because we want to be very secure by default I can’t see us allowing a user to do that any time soon. You could build a plugin that allows you to access the backups directly.
The encryption key is not a bad idea as a separate feature, but it wouldn’t mean we’d pull back this feature.
Sorry for the late reply.
Oh yeah, allowing a user to disable the feature kind of makes the feature meaningless I suppose, hadn’t thought of it that way.
Then I’ll consider making a plugin. Thanks for the quick response and suggestion!
We had an issue here because it made it much harder to download a backup from a (third-party) provider where we don’t have shell access, and upload it to one of our own servers. Especially when handling a large backup where both servers are in the USA (we’re in Europe), this process was costing us hours instead of minutes.
Somehow the backup link does not work with an api_key, so a simple wget will not work.
We made a small script that can be called from the command line. It needs three parameters:
- username
- password
- download link from the email, including the token
So something like
./download.sh myuser mypass "https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584"
Here is the script:
#!/bin/bash
# Log in to Discourse and download a backup using the tokenized link
# from the notification email.
# Usage: ./download.sh <username> <password> <download-url-with-token>
USERNAME="$1"
PASSWORD="$2"
URL="$3"
# Extract the host name from the download URL.
HOSTNAME=$(echo "$URL" | awk -F/ '{print $3}')
# Fetch a CSRF token first; Discourse requires one before logging in.
# The awk/cut pipeline pulls the 88-character token out of the JSON reply.
TOKEN=$(
curl \
  --header "X-Requested-With: XMLHttpRequest" \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -v \
  "https://$HOSTNAME/session/csrf" | awk -F: '{print $2}' | cut -c2-89
)
# Debug output (note: this echoes the password).
echo "Using Token $TOKEN Username $USERNAME Password $PASSWORD"
# Log in; the session cookie is written to cookies.txt.
curl \
  --header "X-Requested-With: XMLHttpRequest" \
  --header "X-CSRF-Token: $TOKEN" \
  --header "Origin: https://$HOSTNAME" \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -v \
  --data "login=$USERNAME&password=$PASSWORD" \
  "https://$HOSTNAME/session"
# Finally, download the backup with the authenticated session cookie.
curl \
  -A "MSIE" \
  -b cookies.txt \
  -c cookies.txt \
  -O \
  "$URL"
One more suggestion: would it be an idea to exempt the email with the download link from the disable_email setting?
I think this is a rather bad idea.
Browser downloads often stall at some point. The backup archive is usually quite large, and if we only get one chance to download it before the link expires, the download often stops halfway and can’t be restarted.
Could you support resuming interrupted downloads? If not, please consider a better way to help people successfully download the whole archive through the browser.
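For what it’s worth, when the server honors HTTP Range requests, curl can resume a partial download while the tokenized link is still valid (reusing the example URL from the script above; whether the backup endpoint supports ranges is an assumption):

# Resume a partial download; -C - makes curl continue from the
# size of the existing local file instead of starting over.
curl -C - -O "https://forum.mysite.com/admin/backups/mysite-2017-05-31-113000-v20170526125321.tar.gz?token=d31de61bc07f7bc9060717bc57828584"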
I have the same problem. I need to disable this feature for personal reasons.
I understand that you want to be very secure by default. But can we have non-default options?
Or could we have an option to send the link to a different destination email address?
I completely agree with making this an option that’s on by default.
Many features can be configured as disabled or non-default; that doesn’t make them useless.
Email link or not, that isn’t the main problem here. We should remember that downloading a backup was a workaround for a real problem, which was, and still is, that backups are taken far too infrequently.
I back up my online store’s database every 5 minutes. My CMS backs itself up every hour. With Discourse, the freshest content I can get natively is once a day. That means that, since I’m lost with Docker and anything other than MySQL/MariaDB, every day I risk losing content if something goes wrong inside that 24-hour window.
In fact, that has already happened to me once.
With the current email-link method, when I was downloading backups in a region with a restricted network, I always got stuck at 30%, even when using a VPN for a while.
I’ll have a similar problem in the future if it’s still email-only by then.
If you self-host and want to download a backup, you can do it over ssh/scp (see the sketch below). Another option is to have backups stored on S3. Downloading them through the web browser is not a reliable way to automate the process.
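A minimal sketch of the scp route, assuming a standard Docker install where backups land in the shared volume (the host and path are assumptions and may differ on your setup):

# Copy the backup archives from the server to a local directory over ssh.
scp root@forum.mysite.com:/var/discourse/shared/standalone/backups/default/*.tar.gz /local/backups/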
If you want that level of being able to go back in time, you should configure Postgres for point-in-time recovery and/or run a live standby server.
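For reference, point-in-time recovery in Postgres is typically built on WAL archiving plus a periodic base backup. A minimal sketch of the postgresql.conf side (the archive directory is a placeholder, and on a standard Discourse install this file lives inside the container):

# postgresql.conf: continuously archive write-ahead-log segments so a
# base backup can be replayed up to any moment before a failure.
wal_level = replica
archive_mode = on
archive_command = 'test ! -f /mnt/wal_archive/%f && cp %p /mnt/wal_archive/%f'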