What is the lion’s share of the backup? I assume uploaded images and so on? Wouldn’t it be easier to specify the backup is database-only, and thus make it a tiny fraction of the overall size?
(Yes, we’d still need some way to back up the images independently, but at least then the urgent need for 100GB+ of space would not be present.)
Yes, the lion’s share of the backup content is “uploads”.
However, a backup is not complete without them.
My personal goal is to move the images / “uploads” to Amazon S3 to avoid this issue for this particular instance. However, some testing still needs to be done on an instance with a high topic/post count before I can trust the migration to S3; a few issues have already been highlighted in that thread (most notably avoiding a rebake of all posts).
I have the same problem as described in this thread. I have many GB of images, and while I want to migrate them to S3, from what I have read the migration script still seems a bit buggy. So I still have the images locally, but I am running out of disk space given the high ceiling needed to allow a backup. Even being able to delete the old backup before creating the new one would be OK for me; in fact, I have been doing that manually.
Note that the backup system also seems to be failing me on the free-disk-space calculation: it will fill up the whole disk before giving up, and it doesn’t even delete the partial files. Then the whole machine becomes unhappy. There should be a check that skips the backup when there isn’t enough disk space for it, taking into account the space needed for compression, etc.
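A pre-flight check along those lines could be as simple as comparing free space against an estimate before starting. A minimal sketch, assuming a 2× headroom factor (my own guess, to leave room for the dump and the gzip output side by side) and standard Docker-install paths:

```shell
# has_headroom FREE_KB EST_KB — succeed only if free space is at least twice
# the estimated backup size. The 2x factor is an assumption, not Discourse's
# actual logic: it covers the uncompressed data plus the compressed archive
# coexisting during the backup.
has_headroom() {
  [ "$1" -ge $(($2 * 2)) ]
}

# Example wiring (paths assume a standard Docker install; adjust to taste):
#   EST_KB=$(du -sk /var/discourse/shared/standalone/uploads | awk '{print $1}')
#   FREE_KB=$(df -Pk /var/discourse/shared/standalone/backups | awk 'NR==2 {print $4}')
#   has_headroom "$FREE_KB" "$EST_KB" || { echo "low disk, skipping backup" >&2; exit 1; }
```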
Edit: I am going to run a cron job that deletes the (sole) local backup every day. That should solve my immediate problem, but I think it would also be nice to have an option to immediately delete any local backup that has already been successfully copied to S3.
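The cleanup I describe can be sketched as a tiny function invoked from cron. The one-day threshold and the backup path are assumptions about when the nightly S3 upload has finished, not anything Discourse guarantees:

```shell
# prune_old_backups DIR — delete *.tar.gz backup archives in DIR that are
# more than a day old, on the assumption that the nightly copy to S3 has
# long since completed by then.
prune_old_backups() {
  find "$1" -name '*.tar.gz' -mtime +1 -delete
}

# Example crontab entry (path is an assumption for a standard Docker install):
# 30 4 * * * find /var/discourse/shared/standalone/backups -name '*.tar.gz' -mtime +1 -delete
```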
What gzip options are currently used for compressing backups?
Unlike what is mentioned in the topic, I was looking to save space by using a more efficient compression method. I ran some quick and rough tests on our SQL dump with different gzip levels and also with brotli.
As you can see, Brotli level 4 clearly outperforms Gzip level 5 in compression ratio, while the compression time stays in the same range. The Brotli level 1 result wasn’t bad either, considering how blazingly fast it is.
In any case, I find savings of 10% or more quite worthwhile.
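For anyone who wants to reproduce this kind of comparison, here is a rough way to measure it. The flags are standard gzip/brotli CLI options; `dump.sql` is a placeholder for your own database dump, and brotli must be installed separately:

```shell
# compressed_size FILE CMD [ARGS...] — print how many bytes FILE shrinks to
# when piped through the given compressor, discarding the actual output.
compressed_size() {
  file=$1; shift
  "$@" < "$file" | wc -c
}

# Example usage (dump.sql is a placeholder for your own dump):
#   compressed_size dump.sql gzip -5
#   compressed_size dump.sql brotli -q 4 -c
```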
zopfli is dreadfully slow; I doubt we’d want to use it for something like a massive backup. Brotli, on the other hand, is at least partially optimized for speed.
Backup gzip compression level for uploads
Is it possible to disable gzip compression? Since my uploads are mostly already-compressed images, it is a waste of resources and time to try to compress them again.
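To illustrate what skipping compression would look like for the uploads portion, here is a sketch of packing the uploads into a plain tar archive with no gzip pass at all (paths are placeholders, and this is not how the backup code actually works):

```shell
# archive_uploads DIR OUT — pack DIR into a plain, uncompressed tar archive,
# on the theory that JPEG/PNG uploads barely shrink under gzip anyway.
archive_uploads() {
  tar -cf "$2" -C "$(dirname "$1")" "$(basename "$1")"
}

# Example usage (paths are placeholders):
#   archive_uploads /var/discourse/shared/standalone/uploads uploads.tar
```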