Discourse Backups to Dropbox (Deprecated)

(Rafael dos Santos Silva) #1

:warning: Deprecation Notice

This plugin is deprecated and should no longer be installed on new sites. Instead, use the plugins discourse-sync-base and discourse-sync-to-dropbox, described in the topic Synchronizer-base for any backup provider.
If you’re already using this plugin, please consider migrating to the new plugins.

Discourse Backups to Dropbox

Backups are useless unless they are off-site. This plugin syncs your automatic backups to Dropbox.


Install it like any other Discourse plugin.
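For reference, a normal install means cloning the plugin inside your container definition and rebuilding. A sketch of the usual `app.yml` hooks section (the repository URL is assumed from the plugin's name; verify it against the official install guide before using it):

```yaml
hooks:
  after_code:
    - exec:
        cd: $home/plugins
        cmd:
          - git clone https://github.com/discourse/docker_manager.git
          - git clone https://github.com/discourse/discourse-backups-to-dropbox.git
```

Then rebuild the container with `./launcher rebuild app` from `/var/discourse`.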

After Installation

You need to enable the plugin under Settings -> Plugins and fill in your Dropbox Generated Access Token, which you can get here:


  • Quick API Check on Settings to test Dropbox Token

  • Option to delete the local copy of a backup after sending it to Dropbox, and to keep the last N backups on Dropbox independently.
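The quick API check boils down to calling a Dropbox endpoint that needs nothing but a valid token. A minimal sketch in Ruby against Dropbox API v2's `/2/users/get_current_account` endpoint (the helper names here are illustrative, not the plugin's actual code):

```ruby
require "net/http"
require "uri"

# Build the request used to validate a Dropbox access token.
# /2/users/get_current_account takes no arguments and responds 401
# when the token is invalid or revoked.
def token_check_request(access_token)
  uri = URI("https://api.dropboxapi.com/2/users/get_current_account")
  req = Net::HTTP::Post.new(uri)
  req["Authorization"] = "Bearer #{access_token}"
  req
end

# Returns true when Dropbox accepts the token. (Performs a live request.)
def token_valid?(access_token)
  req = token_check_request(access_token)
  res = Net::HTTP.start(req.uri.host, req.uri.port, use_ssl: true) do |http|
    http.request(req)
  end
  res.is_a?(Net::HTTPSuccess)
end
```

A 200 response means the token works; a 401 means it is invalid or has been revoked.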
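The keep-last-N behaviour reduces to sorting the remote file names and dropping the oldest. A sketch of that selection logic (assuming Discourse's timestamped backup names, which sort chronologically as plain strings; again illustrative, not the plugin's actual code):

```ruby
# Given backup file names already on Dropbox, pick the ones to delete so
# that only the newest `keep` remain. Timestamped names such as
# "site-2017-02-01-033601.tar.gz" sort chronologically as strings.
def backups_to_delete(filenames, keep)
  filenames.sort.reverse.drop(keep)
end
```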

(Joshua Rosenfeld) #2

@Falco, this is great! How hard would it be to expand this to other cloud storage providers? We use Box where I work, so it would be great to get support for that. I know many businesses use Google Apps (or whatever the new name is), so Drive support would be nice too.

(James North) #3

Awesome thanks @Falco!

(Rafael dos Santos Silva) #4

I’m a Google fan, so I tried Google Drive first, but the Dropbox API is so nice that I could build this faster. So I went with that.

I think other services could be added; it’s only a matter of time.

(James North) #5

Hey @Falco,

Awesome work!

I’ve installed this on a few Discourse sites now and I’m eagerly waiting to see how the backup goes …

I wonder if in the future there could be a quick API check implemented?

Just a button that outputs ‘yep, API is valid’ or ‘nah, I can’t communicate with Dropbox, check it, fool’. Otherwise I guess I’m just waiting around to see whether it fails or succeeds when the time comes to run a backup.

Rock on!

(Rafael dos Santos Silva) #6

Hey @JamesNorth 24 hours passed, are your backups on Dropbox?

(ckshen) #7

The backups didn’t show up for me. Is it the app key or the app secret that goes into the plugin setting? I did the app key.

(ckshen) #8

Answering my own question: it’s neither. It’s the access token that goes into the plugin setting, which can be generated when the Dropbox app is created.

Thanks for the plugin!

(James North) #9

Thanks for checking in @Falco. It did fail - the info sent to the logs was very clear and indicated the wrong key.

I had the same problem as @ckshen. The language below the key input field says

Your Dropbox API key created on https://www.dropbox.com/developers/apps/create

Of course, a key is useless without the secret, but I probably followed the wording a bit too literally.

Perhaps it could say:

Your Dropbox Access Token generated at https://www.dropbox.com/developers/apps/create

Just to prevent fools like us returning :slight_smile:

Now I’ll just wait to see how the next backup goes. :thumbsup:

(James North) #10

Just ran a manual backup job now and it obviously connected to the app (a little notification popped up in my Dropbox notifications with the server’s domain), but it failed.

Maybe it’s just Dropbox’s fault? From previously building little apps that connect to it, I know it has connection problems every now and then.

Info says:

Job exception: Broken pipe

Backtrace says:

/usr/local/lib/ruby/2.3.0/openssl/buffering.rb:322:in `syswrite'
/usr/local/lib/ruby/2.3.0/openssl/buffering.rb:322:in `do_write'
/usr/local/lib/ruby/2.3.0/openssl/buffering.rb:340:in `write'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/timeout/null.rb:51:in `write'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request/writer.rb:99:in `write'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request/writer.rb:87:in `block in send_request'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request/writer.rb:86:in `each'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request/writer.rb:86:in `send_request'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request/writer.rb:42:in `stream'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/request.rb:110:in `stream'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/connection.rb:74:in `send_request'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/client.rb:63:in `perform'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/client.rb:41:in `request'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/http-2.0.3/lib/http/chainable.rb:26:in `post'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/dropbox-sdk-v2-0.0.3/lib/dropbox/client.rb:368:in `upload_request'
/var/www/discourse/plugins/discourse-backups-to-dropbox/gems/2.3.1/gems/dropbox-sdk-v2-0.0.3/lib/dropbox/client.rb:240:in `upload'
/var/www/discourse/plugins/discourse-backups-to-dropbox/lib/dropbox_synchronizer.rb:21:in `block in sync'
/var/www/discourse/plugins/discourse-backups-to-dropbox/lib/dropbox_synchronizer.rb:19:in `each'
/var/www/discourse/plugins/discourse-backups-to-dropbox/lib/dropbox_synchronizer.rb:19:in `sync'
/var/www/discourse/plugins/discourse-backups-to-dropbox/app/jobs/regular/sync_backups_to_dropbox.rb:4:in `execute'
/var/www/discourse/app/jobs/base.rb:154:in `block (2 levels) in perform'

(Rafael dos Santos Silva) #11
  1. I changed the wording to Dropbox Generated Access Token

  2. Try another manual backup to see if we get another Broken Pipe. What’s the size of your backup?

(James North) #13

Awesome! Reckon that’ll prevent a few folks coming in here.

Just about to run another one - it’s 2GB as there are a lot of attachments on the board. I thought this could be a problem, as Dropbox doesn’t have a super super great connection to things a lot of the time.

(James North) #14

Same problem.

It seems to happen very early in the process. I’m not sure there would even be time to send 20MB of data before it fails.

I run another Discourse forum that is about 100MB all told, so I’ll give that one a shot and see if the same issue occurs.

(James North) #15

This ~150MB backup worked fine @Falco.

I just ran the original 2GB forum’s backup for a fourth time and the same error occurs. I guess 2GB is fairly large (I have no frame of reference here), so I can only assume that’s what’s causing it to fail.

(James North) #16

Just confirming that a 2GB backup fails flawlessly every time.

Wonder whether it’s related to RAM/swap or whether it’s a connection issue? I have 2GB ram and 4GB swap and I have two very small/quiet forums on that box as the only things running on it.

(Rafael dos Santos Silva) #17

Just asking the obvious question: do you have enough free space on Dropbox?

(James North) #18

Yeah I have over 1TB free on my plan.

I had a bit of a read and you can upload a single file up to 20GB in size to Dropbox.

The failure happens before the system would have time to send 2GB from my server to Dropbox, I think. So it zips everything up, the backup succeeds locally and then within about 30 seconds the logs report the failures above.

(Erlend Sogge Heggen) #19

I think this is still a great feature request. Just checking for a response would be great, but even better if it were possible to upload a small test file. That lets you rule out a lot of possible setup mistakes.

(Rafael dos Santos Silva) #20

Just found the problem: I’m using an API endpoint that limits files to 150MB.
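For context: Dropbox’s plain `/2/files/upload` endpoint rejects requests over 150MB, and larger files have to go through upload sessions (`upload_session/start`, `upload_session/append_v2`, `upload_session/finish`), one chunk per request. A sketch of just the chunking arithmetic such a fix needs (the session calls themselves are omitted, and the 4MB chunk size is an illustrative choice, not the plugin’s):

```ruby
CHUNK_SIZE = 4 * 1024 * 1024 # illustrative; each chunk must stay under 150MB

# Split a file of `total_size` bytes into [offset, length] pairs, one per
# upload_session/append_v2 request.
def chunk_ranges(total_size, chunk_size = CHUNK_SIZE)
  ranges = []
  offset = 0
  while offset < total_size
    length = [chunk_size, total_size - offset].min
    ranges << [offset, length]
    offset += length
  end
  ranges
end
```

Each pair maps to one append call, and `upload_session/finish` commits the file once the final offset equals the file size.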

(James North) #21

Nice @Falco - thanks for digging deeper!