Our disk space disappeared - how to find who/where?


(fearlessfrog) #1

Our free disk space went down by about a gig overnight, or rather the backup image size jumped up.

We’re trying to figure out where it went, as we think a user may be uploading files to themselves via PM or something?

Where are good places to start looking in Discourse for this type of info, i.e. where the upload allocation went per user?


(Marcos P) #2

Try these commands on your host:

sudo apt-get autoclean

sudo apt-get autoremove

cd /var/discourse

and then:

sudo ./launcher cleanup

and press y to confirm the cleanup.


(Alan Tan) #3

Usually what I’ll do is run df -h to see which file system is taking up most of the disk space, and then narrow down the scope with du -h (with an optional --max-depth=1) to find the offending file or folder. Another tool that you might want to check out is ncdu.
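For anyone who prefers scripting it, the per-subdirectory totals that du --max-depth=1 reports can be sketched in plain Ruby (the directory names and sizes below are invented for the demo):

```ruby
# Sketch: total bytes under each immediate subdirectory, the same
# information `du --max-depth=1` gives you, so you can spot the
# offending folder. Symlinks and permission errors are not handled.
require 'find'
require 'tmpdir'

def bytes_under(path)
  total = 0
  Find.find(path) { |f| total += File.size(f) if File.file?(f) }
  total
end

# Demo on a throwaway tree: big/ holds 3 KB, small/ holds 1 KB.
Dir.mktmpdir do |root|
  Dir.mkdir(File.join(root, "big"))
  Dir.mkdir(File.join(root, "small"))
  File.write(File.join(root, "big", "a.bin"), "x" * 3072)
  File.write(File.join(root, "small", "b.bin"), "x" * 1024)

  Dir.children(root).sort.each do |child|
    puts format("%-6s %6d bytes", child, bytes_under(File.join(root, child)))
  end
end
```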


(fearlessfrog) #4

Thank you - yep, we do that and it certainly does stop the bumps. I’ll double-check for sure, but I also wanted to see if there was a quota/report per user, especially as the growth was within the Discourse backup.tar.gz.


(fearlessfrog) #5

So @sam, just to be super brief :slight_smile: perhaps the simplest thing that would work for us is that I just do a

./launcher enter app
rails c 

…and then just query the Upload (:filesize) and User models from the console? I think that would get what we want.

EDIT: Yep, I just did it that way, i.e. user.uploads.sum(:filesize) mapped over each user. We found someone with 9 GB of uploaded content in Discourse.
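For anyone wanting to replicate this, the aggregation that the console query performs can be sketched in plain Ruby. Sample rows stand in for Discourse's uploads table here; the real query runs through ActiveRecord in the rails console, with filesize in bytes:

```ruby
# Plain-Ruby sketch of the per-user totals that a console query like
# Upload.group(:user_id).sum(:filesize) computes. The user_id and
# filesize values below are made-up sample data.
uploads = [
  { user_id: 1, filesize: 595_934 },
  { user_id: 1, filesize: 595_934 },
  { user_id: 2, filesize: 322_155 },
]

totals = uploads
         .group_by { |u| u[:user_id] }
         .transform_values { |rows| rows.sum { |r| r[:filesize] } }
         .sort_by { |_id, bytes| -bytes }

totals.each { |id, bytes| puts "user #{id}: #{bytes} bytes" }
# → user 1: 1191868 bytes
#   user 2: 322155 bytes
```

Sorting descending puts the heaviest uploader first, which is how the 9 GB account stood out.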


(Mittineague) #6

9GB Whew!
Are your settings very high?

max image size kb
The maximum image upload size in kB. This must be configured in nginx (client_max_body_size) / apache or proxy as well.
max attachment size kb
The maximum attachment files upload size in kB. This must be configured in nginx (client_max_body_size) / apache or proxy as well.

Or is this a member doing a mess of smaller uploads?


(fearlessfrog) #7

^ That. Our max image size was left at 3072 kb. It looks like just over a year of constant screenshot uploading, but we are still figuring that out. It doesn’t look malicious, just really keen/dedicated, and we are a very screenshot-centric forum. One tutorial topic alone had over 100 images uploaded. Interestingly, the 2nd-place uploader has only 0.6 GB used, so we’ll talk to whoever is doing it and figure out something better.

I wonder if the trust_levels could have a file/image quota added? I haven’t searched through the feature wishlist to find out as yet. This user is a level_3.

EDIT: I added a feature idea, User Upload Reporting & Quota, in case others have the same need.

(fearlessfrog) #8

Ok, some new information the more we dig.

This looks like a Discourse bug somehow. The user with the 9 GB of files has the same 13 files that were uploaded as a set repeated 1728 times, i.e. 22,464 duplicate file uploads.
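For what it's worth, spotting exact duplicates like this comes down to grouping files by a content hash (Discourse itself stores a sha1 per upload for this purpose). A minimal sketch, with invented filenames and byte strings standing in for real file contents:

```ruby
# Sketch: group files by SHA-1 of their contents to find exact
# duplicates. The filenames and "contents" here are made up;
# in Discourse the uploads table's sha1 column plays this role.
require 'digest'

files = {
  "screen1.png"     => "PNGDATA-AAA",
  "screen1 (1).png" => "PNGDATA-AAA",
  "screen2.png"     => "PNGDATA-BBB",
}

groups = files.group_by { |_name, data| Digest::SHA1.hexdigest(data) }
dupes  = groups.select { |_sha, entries| entries.size > 1 }

dupes.each do |sha, entries|
  puts "#{entries.size}x #{sha[0, 8]}: #{entries.map(&:first).join(', ')}"
end
```

Running the same grouping over a user's uploads would surface a pattern like 13 distinct hashes each repeated 1728 times.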

I’ll raise a bug and hopefully we can remove these plus stop it happening again.

Bug is reported here. Will welcome any help/advice :slight_smile:


(Mittineague) #9

Hmmm, AFAIK the only multiple images are the optimized_images of uploaded avatars.

I think there is something different going on with your situation. I can’t imagine avatars weighing all that much.


(cpradio) #10

Unless he enabled animated avatars.


(fearlessfrog) #11

I recognize the repeated filenames, I think, and am pretty sure this was a bulk upload of screenshots, either in a topic post or in a personal message. The sizes of the files (all .pngs) are like ‘595934 bytes’ or ‘322155’, i.e. not files that look like they have been processed into thumbnails or avatars.

EDIT: We should move the discussion to the new bug topic I think, as I’m filling out the details as I find them there.


(Tobias Eigen) #12

On my site it turned out to be backups taking up all the space. I’ve now changed it to keep only 3 on the server, with the rest on S3. I also turned off “s3 disable cleanup”, which means I can decide myself which backups to keep there.