Configure automatic backups for Discourse


(Jeff Atwood) #1

So you’d like to automatically back up all your Discourse data every day?

Go to the /admin settings, Backups section, and set backup_frequency to 1.

A backup will now be taken every day.

Store backups on local server or Amazon S3

:warning: Warning

Storing backups and regular uploads in the same bucket is not recommended.

Instructions for Discourse v2.2.0.beta3 and newer

By default, backups are stored on the local server disk. In order to store backups on Amazon S3, you’ll need to create a unique, private S3 bucket and put its name in the s3_backup_bucket site setting.

:warning: The S3 bucket should only be used for backups. If you need to use a bucket that contains other files please make sure that you provide a prefix when you configure the s3_backup_bucket setting (example: my-awesome-bucket/backups) and make sure that files with that prefix are private.

You can follow most of the steps in Setting up file and image uploads to S3 if you need help in setting up a new bucket.
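For example, a dedicated private bucket can be created and locked down from the AWS CLI. This is only a sketch: the bucket name my-discourse-backups and the region us-east-1 are placeholders you should replace with your own values.

```shell
# Create a dedicated bucket for Discourse backups
# ("my-discourse-backups" and "us-east-1" are example values).
aws s3api create-bucket \
  --bucket my-discourse-backups \
  --region us-east-1

# Block all public access so the backups stay private.
aws s3api put-public-access-block \
  --bucket my-discourse-backups \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```

Note that regions other than us-east-1 also require a --create-bucket-configuration LocationConstraint argument on the create-bucket call.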

Next, set your S3 credentials under the Files section: s3_access_key_id, s3_secret_access_key, and s3_region.

Finally, you need to select “Amazon S3” as backup_location.
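If you run the standard Docker install, the same settings can alternatively be provided as environment variables in containers/app.yml. A sketch, with placeholder values for the bucket name, keys, and region:

```yaml
env:
  ## Store backups on S3 instead of the local disk
  DISCOURSE_BACKUP_LOCATION: s3
  DISCOURSE_S3_BACKUP_BUCKET: my-discourse-backups   # example bucket name
  DISCOURSE_S3_ACCESS_KEY_ID: AKIAXXXXXXXXXXXXXXXX   # placeholder
  DISCOURSE_S3_SECRET_ACCESS_KEY: your-secret-key    # placeholder
  DISCOURSE_S3_REGION: us-east-1                     # example region
```

Settings provided this way take precedence over (and hide) the corresponding site settings in the admin UI, and you’ll need to rebuild the container (./launcher rebuild app) for the changes to take effect.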

From now on, all backups will be uploaded to S3 and no longer stored locally. Local storage will only be used for temporary files during backups and restores.

Go to the Backups tab in the admin dashboard to browse the backups – you can download them any time to do a manual offsite backup.


Instructions for Discourse v2.2.0.beta3 and below

Up until v2.2.0.beta3, backups are always saved on the local server disk by default.

If you also want to automatically upload your backups to Amazon S3, check enable_s3_backups. You’ll need to create a unique, private S3 bucket and put its name in s3_backup_bucket to store your backups.

Next, set your S3 credentials under the Files section: s3_access_key_id, s3_secret_access_key, and s3_region.

Backups are always stored on local server disk. Go to the Backups tab in the admin dashboard to browse your local server backups – you can download them any time to do a manual offsite backup.

If you’ve enabled S3 backups, check your S3 bucket to find the uploaded backup files.

Note that you can also add a bucket lifecycle rule that automatically moves backups to Glacier to keep your S3 storage costs low.


(Dave McClure) #10

Does the user need to have the custom policy edited along the same lines as documented here?


(Régis Hanol) #11

Yes, s/he does! At the very least, s/he needs to be able to add and remove files in the backup bucket.
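A minimal IAM policy along those lines might look like the following sketch, where my-discourse-backups is a placeholder bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-discourse-backups"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::my-discourse-backups/*"
    }
  ]
}
```

Scoping the policy to the backup bucket only (rather than s3:*) keeps the credentials useless for anything else if they leak.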


(Oliver Mark) #23

You are able to archive backups from your S3 bucket to Glacier.
It is cheaper, but a restore takes more time.

This site will help you reduce backup costs:


(FOSS dev/hacker) #44

Setting this up can be rather confusing. Here’s a simple guide to help you out.

  • Log into your Discourse admin panel
  • Configure daily backups
  • Set maximum backups to 7
  • Log into your Amazon Web Services account
  • Go to the S3 dashboard
  • Open the bucket containing the backups
  • Click on the Properties tab
  • Activate versioning
  • Open the Lifecycle menu
  • Add a rule for the whole bucket
  • Set the current version to expire after 15 days
  • Set previous versions to:
      • archive to Glacier after 1 day
      • expire after 91 days
  • Save and logout
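The steps above can also be expressed as a bucket lifecycle configuration and applied from the AWS CLI instead of the console. A sketch, with a placeholder rule ID and bucket name:

```json
{
  "Rules": [
    {
      "ID": "discourse-backup-rotation",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 15 },
      "NoncurrentVersionTransitions": [
        { "NoncurrentDays": 1, "StorageClass": "GLACIER" }
      ],
      "NoncurrentVersionExpiration": { "NoncurrentDays": 91 }
    }
  ]
}
```

Saved as lifecycle.json, it can be applied with something like: aws s3api put-bucket-lifecycle-configuration --bucket my-discourse-backups --lifecycle-configuration file://lifecycle.json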

How it works

Versioning will keep the backups that are automatically deleted by Discourse. One day after being deleted, a backup will be moved to Glacier storage. After 91 days it will be deleted from Glacier.

Warning

Amazon charges you for items stored in Glacier for a full 90 days even if you delete them earlier. Make sure your Glacier lifecycle rule keeps your files for at least 90 days.


(Stephen) #87

Can we throw an error up somewhere in /admin if the same bucket is specified for both backup and uploads?


(Jeff Atwood) #88

Not a bad idea, up to @gerhard


(Jide Ogunsanya) #89

I can now see this is the cause of the 502 bad gateway error in the admin dashboard. I think a custom error message would be better…