Add an option to disable backup compression

Note to self:

I recently hit disk space issues again, with backups failing, so this post needs revisiting soon.

Note that in order to create a backup that results in a 12GB gzip file, I require ~36GB+ of space dedicated to backups, ~24GB+ of which must be free:

  • 12GB for the backup from the day before
  • 12GB for the new backup file
  • ~12GB+ for the tar archive to be compressed into the gzip file (original DB dump + original image files)

So as the backup size increases by 1GB, the space requirement actually increases by ~3GB.

This assumes you are only keeping a single previous backup, i.e. the Discourse "maximum backups" setting is set to 1.

The Discourse default is 5, so using defaults I would need ~84GB dedicated to backups to allow them to work.
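
That works out to roughly (maximum backups + 2) × backup size: one slot per retained backup, plus the new gzip file, plus the uncompressed tar staging area. A minimal shell sketch of the estimate, using the example values from above (the figures are assumptions, not read from Discourse):

    #!/bin/sh
    # Rough estimate of disk space needed for Discourse backups.
    backup_gb=12      # size of one compressed backup (example value)
    max_backups=5     # the Discourse "maximum backups" setting
    # retained backups + new gzip file + uncompressed tar staging area
    needed_gb=$(( backup_gb * (max_backups + 2) ))
    echo "Estimated space needed: ~${needed_gb}GB"   # ~84GB with these values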


What is the lion’s share of the backup? I assume uploaded images and so on? Wouldn’t it be easier to specify the backup is database-only, and thus make it a tiny fraction of the overall size?

(Yes, we’d still need some way to back up the images independently, but at least then the urgent need for 100GB+ of space would not be present.)

Yes, the lion’s share of the backup content is “uploads”.

However a backup is not complete without the “uploads”.

My personal target is to move the images / “uploads” to Amazon S3 to avoid this issue on this specific instance. However, there is still some testing to be done on a high topic / post count instance before I can trust the migration to S3; some issues are already highlighted in that thread (most notably avoiding a rebake of all posts).

I have other Discourse instances that would benefit from backups being created in a more streamlined way.


I have the same problem as this thread: I have many GB of images, and while I want to migrate them to S3, from what I have read the migration script still seems a bit buggy. So I still have the images locally, but am running out of disk space given the high headroom needed to allow a backup. Even if I could just delete the old backup before creating the new one, that would be OK for me; in fact I have been doing that manually.

Note that the backup system also seems to be failing me on the free disk space calculation: it will fill up the whole disk before giving up, and does not even delete the partial files. Then the whole machine gets unhappy. There should be a check that skips the backup if there is not enough disk space for it, taking into account the space needed for compression etc.
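
A minimal sketch of such a pre-flight check, assuming a rough rule of thumb of twice the size of the existing backups (the path and the 2x multiplier are assumptions for illustration, not Discourse's actual logic):

    #!/bin/sh
    # Skip the backup if there is not enough free disk space for it.
    BACKUP_DIR=/var/discourse/shared/standalone/backups/default
    used_kb=$(du -sk "$BACKUP_DIR" | awk '{print $1}')
    free_kb=$(df -Pk "$BACKUP_DIR" | awk 'NR==2 {print $4}')
    needed_kb=$(( used_kb * 2 ))   # staging tar + new gzip file
    if [ "$free_kb" -lt "$needed_kb" ]; then
        echo "Only ${free_kb}KB free, need ~${needed_kb}KB; skipping backup." >&2
        exit 1
    fi
    echo "Enough free space; proceeding with backup."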

Edit: I am going to run a cron job which will delete the (sole) local backup every day. That should solve my immediate problem, but I think it would also be nice to have an option to immediately delete any (local) backup that has already been successfully copied to S3.
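
For reference, a cron entry along these lines would do it (the path is an assumption based on a standard Docker install; note it removes every local backup archive, so only use it if everything is already safely on S3):

    # Hypothetical /etc/cron.d/discourse-backup-cleanup entry:
    # delete local Discourse backup archives at 04:00 every day.
    0 4 * * * root rm -f /var/discourse/shared/standalone/backups/default/*.tar.gz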

What gzip options are currently used for backup compression?

Unlike the request in this topic, I would like to save space with a more efficient compression method instead. I ran some quick and dirty tests on our SQL dump using different gzip compression levels as well as Brotli.

    2630702226  level1.sql.gz
    2276305530  level1.sql.br
    2216602536  level5.sql.gz
    2147212204  level9.sql.gz
    2036157791  level2.br
    1851831279  level4.br

As you can see, Brotli level 4 completely crushes gzip level 5 in terms of efficiency, while their compression times are of the same order of magnitude. And given how extremely fast Brotli level 1 is, its results are not bad either.

In any case, I think space savings of 10% or more are quite significant.
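
For anyone who wants to reproduce a comparison like this, a sketch along these lines works (dump.sql is a placeholder for your own database dump; `brotli -q` sets the quality level and `-o` the output file):

    # Compare gzip and brotli at several compression levels.
    for level in 1 5 9; do
        time gzip -$level -c dump.sql > level$level.sql.gz
    done
    for level in 1 2 4; do
        time brotli -q $level -o level$level.sql.br dump.sql
    done
    ls -l level*.sql.*    # compare the resulting file sizes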


Interesting. What were the actual compression times? :thinking:

Zopfli would be more interesting here, since it is compatible with standard gzip decompression tools. Brotli requires a different decompressor.


zopfli is extremely slow; I doubt we would want to use it in a scenario like giant backups. At least brotli makes some optimizations for speed.


Backup gzip compression level for uploads
Is there any way to disable gzip compression? Since my uploads are mostly already-compressed images, compressing them again is a waste of resources and time.
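
A quick way to confirm this on your own data is to compare a file's size before and after gzip (photo.jpg is a placeholder for one of your uploads; already-compressed formats like JPEG typically shrink by only a percent or two):

    # Compare an upload's original size with its gzipped size.
    orig=$(wc -c < photo.jpg)
    gzipped=$(gzip -9 -c photo.jpg | wc -c)
    echo "original: ${orig} bytes, gzipped: ${gzipped} bytes"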


From a related topic:
