I am in a hotel right now, and they keep throttling and eventually blocking my download of my Discourse instance’s backup file. If the “download backup” button were a link, I could visit its URL and then click the “save to Google Drive” plugin button to save the file directly to my Drive. As it is, I have to save locally first, but I can’t get the entire file before being disconnected, because the hotel’s network policies don’t allow large downloads.
While I wouldn’t recommend using your API key in this way (with a 3rd-party service), it can be done.
Start from the backup filename displayed on screen…
discourse-example-com-2016-06-01-033102.tar.gz is the filename displayed in the backup section.
Append your API key and an admin username to that as query parameters; the result is the full download URL.
Personally, if I had to do this, I would hit “Regenerate” on the API key afterwards so it couldn’t be used again.
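For reference, a sketch of that URL assembly, assuming the old pattern of /admin/backups/&lt;filename&gt; with api_key and api_username query parameters (every value below is a placeholder, not a real key or site, and as noted later in the thread this stopped working in 1.8):

```shell
# Assumed (pre-1.8) backup download URL pattern; all values are placeholders.
FILENAME="discourse-example-com-2016-06-01-033102.tar.gz"
API_KEY="YOUR_API_KEY"
API_USERNAME="your_admin_username"
URL="https://discourse.example.com/admin/backups/${FILENAME}?api_key=${API_KEY}&api_username=${API_USERNAME}"
echo "$URL"
```

You could then paste that URL into a browser or a download manager, and regenerate the key once the file is safely downloaded.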
Should this URL convention still work? I’m getting nothing but a blank screen.
No. As of Discourse 1.8, downloading backups can only be done via an emailed link for extra security.
I have an issue similar to the OP’s. I’m off-grid on slow satellite internet with a relatively low data cap, and I was unable to get the download to complete or resume before the token expired. The URL also wouldn’t resolve in 3rd-party download apps (e.g., LoaderDroid on Android), which was unusual (and may or may not be related to Discourse).
After a couple of days of failing, I ended up asking a developer friend with better internet for help. But handing out admin credentials and relying on outside help isn’t exactly a best practice.
My issue was eventually solved via a cumbersome workaround, and since migrating servers isn’t a regular task done under these circumstances, it’s not a problem for me anymore. I’ll leave the question of whether this process warrants tweaks to y’all, but the API-key-and-username URL trick probably would have let me get this done more cleanly.
If you have access to the server then backups can be grabbed via scp/ssh, even transferred server-to-server without the need to download in between.
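A sketch of that server-to-server transfer, assuming root SSH access and the standard standalone paths (hostnames and filename are placeholders; the command is only echoed here rather than executed):

```shell
# Server-to-server copy of a backup. scp -3 routes the transfer through
# the local machine, so neither server needs SSH access to the other;
# drop -3 if they can reach each other directly.
SRC="root@old-server:/var/discourse/shared/standalone/backups/default/backup.tar.gz"
DST="root@new-server:/var/discourse/shared/standalone/backups/default/"
echo scp -3 "$SRC" "$DST"
```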
I am unable to download the backup. The download gets disrupted and cannot be resumed, and if I click the link again it doesn’t work. Is there any other way to download the backups?
You can access it through SFTP.
In FileZilla, create a new site:
Host: your server’s IP address
Protocol: SFTP
Login: root
Password: root’s password
You’ll find the backups in /var/discourse/shared/standalone/backups/default/.
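The same fetch can be scripted with command-line sftp instead of FileZilla; a sketch with a placeholder host and filename (the actual sftp call is commented out, since it needs real SSH access):

```shell
# Write an sftp batch file that fetches the backup, then run sftp
# non-interactively with it. Host and filename are placeholders.
cat > fetch-backup.sftp <<'EOF'
get /var/discourse/shared/standalone/backups/default/backup.tar.gz
EOF
# sftp -b fetch-backup.sftp root@203.0.113.10   # requires SSH access
cat fetch-backup.sftp
```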
I downloaded the backup using the scp command in a terminal on Linux:
scp SSH_USER@IP.ADD.RE.SS:/var/discourse/shared/standalone/backups/default/FORUM-NAME-2019-01-04-033217-v20180607095414.tar.gz /home/vaishak/forum-backups
I had to go for it because the SMTP configuration wasn’t working, so the emailed download link never arrived.
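If a transfer like that keeps getting interrupted, rsync over SSH can resume where scp starts over; a sketch with a placeholder host and filename (the command is only echoed here rather than executed):

```shell
# Resumable download of a backup over SSH. --partial keeps incomplete
# files, so re-running the same command continues where the connection
# dropped. Host and filename are placeholders.
REMOTE="root@203.0.113.10:/var/discourse/shared/standalone/backups/default/backup.tar.gz"
echo rsync --partial --progress -e ssh "$REMOTE" ./forum-backups/
```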