Hello. I’m looking for advice on how best to set up a reliable backup system for one of the Discourse forums I administer. The backup file that gets generated is around 3GB. (This forum was populated by importing 20 years’ worth of mailing list emails with images, hence the size.)
We use DiscourseHosting.com as a host, which takes a daily backup of the database and files for disaster recovery purposes and stores it off-site. They only guarantee to store one backup at a time, not an archive of several older backups.
This is great as a first line of defense, but we feel it would be wise to have an additional backup system completely under our control, one that also archives a few backups at a time.
The question is, given the size of the backup, what’s the best way to do this? Specifically, how do we transfer such a large backup file reliably while keeping read-only downtime short?
I’ve been trying to get the built-in Amazon S3 backup to work, with no luck so far. I’ve also read about a Discourse Dropbox backup plugin but haven’t tried it yet.
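In the meantime, the fallback I’ve been considering is simply pulling the backup file off the server myself with a resumable transfer and rotating old copies locally. A rough sketch of what I mean (the hostname, paths, and retention count here are just placeholders, not our actual setup):

```shell
# Pull the Discourse backup directory to a local archive directory.
# --partial keeps partially transferred files so a dropped connection
# can resume instead of restarting a ~3GB download from scratch.
rsync -avz --partial --progress \
  user@forum.example.com:/var/discourse/shared/standalone/backups/default/ \
  ~/discourse-backups/

# Keep only the five most recent archives locally, deleting the rest.
# (ls -t sorts newest first; tail -n +6 lists everything after the 5th.)
ls -t ~/discourse-backups/*.tar.gz | tail -n +6 | xargs -r rm --
```

This wouldn’t solve the read-only-window question, but it would at least give us an off-host archive of several backups under our own control.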
Before I dive too deep into Amazon or Dropbox as a solution, I’m wondering what other ideas and experiences people have with huge backups like this. Thanks!