That setting seems to be enabled by default on my website. But regardless, are there any recommendations on how to back up S3 upload buckets efficiently?
This guide for saving backups to S3 and also archiving them to Glacier makes sense when the backup is a single zip file. But my understanding of Glacier pricing is that there are per-object and per-request charges, so costs will go up drastically for unzipped bucket backups with many small files.
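For a rough sense of why per-object costs matter, here is a back-of-the-envelope sketch. The overhead figures (roughly 32 KB of Glacier storage plus 8 KB of S3 Standard metadata per archived object) come from AWS's documentation on Glacier transitions, but check the current pricing pages; the file count is a made-up example:

```shell
# Back-of-the-envelope: per-object overhead when archiving to Glacier.
# Assumption: ~32 KB Glacier overhead + ~8 KB S3 Standard metadata per
# archived object (verify against current AWS pricing docs). Transition
# requests are also billed per 1,000 objects.
FILES=100000        # hypothetical number of unzipped upload files
OVERHEAD_KB=40      # ~32 KB + ~8 KB, per object

echo "Overhead for $FILES objects: $(( FILES * OVERHEAD_KB / 1024 )) MB"
echo "Overhead for one zip archive: $OVERHEAD_KB KB"
```

So archiving 100,000 small files individually carries gigabytes of pure overhead that a single zip avoids, independent of the actual data size.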
I’m an AWS newbie, so any advice is appreciated. Thanks!
Edit: alternatively, if there's no great simple answer, I could consider not using S3 for uploads.
That depends on so many factors. How much money do you want to throw at it? Which scenarios would you like backups for? Software bugs, an Amazon datacenter being hit by an asteroid, a rogue admin deleting files from S3, …
I'm afraid we can't help you with that here. You'll need to find a solution that fits your use case somewhere else; the search engine of your choice is a good starting point.
In our case, with a non-Discourse site, we use awscli to sync buckets (`aws s3 sync`) between different regions in different accounts. That way, even if one account were compromised and its bucket deleted, or an asteroid fell and destroyed an Amazon datacenter (hopefully not), we could recover from the other bucket. Since sync only copies new or changed files, the ongoing cost should not be high.
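A sketch of that setup, assuming the backup account's credentials live in an awscli profile with write access to the destination bucket; the bucket names, regions, and profile name are placeholders:

```shell
# Cross-region, cross-account bucket-to-bucket sync (placeholder names).
# "backup" is an awscli profile whose credentials can read the source
# and write the destination. Only new/changed objects are copied.
aws s3 sync s3://my-uploads-bucket s3://my-uploads-backup \
    --source-region us-east-1 \
    --region eu-west-1 \
    --profile backup \
    --dryrun   # preview the copy list first; drop once it looks right
```

Run it from cron (or a scheduled CI job) and the backup bucket trails the source by at most one interval.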
Well, there is still the case of both accounts being compromised within a short period and the buckets deleted, or Amazon shutting down AWS, both very, very unlikely. But if something like that happens, you might as well play the lottery with the numbers you think are wrong.