If a site backup fails for lack of disk space while the backup is being gzipped, the uncompressed .tar file is left behind, but Discourse can't see or use it. On one hand, a decent sysadmin would be alarmed enough at a failed backup to immediately solve the disk space problem and then gzip the backup by hand in a shell. On the other hand, someone who doesn't like getting their hands dirty in a shell is out of luck.
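For the shell-comfortable, the hand recovery is just freeing space and compressing the stranded tar. A minimal sketch, using a temporary directory and a placeholder filename as stand-ins for the real backups directory (on a standard Docker install that's typically under /var/discourse/shared/standalone/backups/, but adjust for your setup):

```shell
# Stand-in for the backups directory; substitute your real path.
backups=$(mktemp -d)

# Placeholder for the stranded, uncompressed backup left by the failed run.
tar -cf "$backups/site-backup.tar" -T /dev/null

# Once disk space is freed, compress it so Discourse can list it again.
# gzip replaces site-backup.tar with site-backup.tar.gz in place.
gzip "$backups/site-backup.tar"

ls "$backups"    # site-backup.tar.gz
```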
As an aside, 50GB would seem like a reasonable partition size for a site with a 13GB backup. But there are two copies of the current backup on disk while it's gzipping, and maximum backups doesn't delete an old backup until there are more than maximum backups of them, so a 50GB partition is only enough for maximum backups to hold a single backup. It took me quite a while to understand that math.
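Spelling out that math as a back-of-the-envelope model (my own, assuming each compressed backup is roughly the same size as the raw tar, which overstates the .gz somewhat): at the peak you hold maximum backups finished backups, plus the new uncompressed .tar, plus the new .tar.gz being written beside it.

```shell
backup_size=13    # GB per backup
max_backups=1

# Peak usage: max_backups finished backups still on disk, plus the new
# uncompressed .tar, plus the new .tar.gz being written alongside it.
peak=$(( (max_backups + 2) * backup_size ))
echo "${peak}GB"   # 39GB -- fits in 50GB; max_backups=2 would peak at 52GB
```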