I’ve got a site that’s pushing a 20GB backup to Wasabi S3. It works. Most of the time.
But sometimes the upload to S3 fails and the local .tar.gz is kept. Eventually the disk fills and I'm left with an orphaned uncompressed .tar file (there wasn't enough space to finish compressing it) and, soon after, a broken site because the disk is full.
Before I punt on Wasabi, I'd like to see if there are any clues.
I've looked in production.errors and the Sidekiq and Unicorn logs, and I don't see "acku" anywhere, either on the day the backup failed or on a day it worked. Shouldn't there be a log entry somewhere?
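In case it helps anyone reproduce what I'm doing, here's roughly how I search. This is a sketch with an assumed log directory (`LOG_DIR` below is from a standard Docker-based Discourse install; adjust it to your setup). The substring "acku" is deliberate: it matches both "backup" and "Backup" without worrying about case, though `grep -i` makes that moot anyway.

```shell
#!/bin/sh
# LOG_DIR is an assumption -- a stock Docker install keeps Rails/Sidekiq/
# Unicorn logs here; override it if your layout differs.
LOG_DIR="${LOG_DIR:-/var/discourse/shared/standalone/log}"

# Recursive, case-insensitive search; "acku" matches "backup"/"Backup".
# Print a note instead of silence when nothing turns up.
grep -ri "acku" "$LOG_DIR" 2>/dev/null || echo "no backup-related lines found in $LOG_DIR"
```

Even so, nothing backup-related shows up on the failure days, which is what puzzles me.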