Would it be worth resizing uploaded images (to save space)?

I can see this as a very useful feature… not only a preset compression level in the ACP, but also a limit on maximum resolution. I agree with AstonJ that there's no need to keep full-resolution images in the archive… Maybe recompress files after some time, and only those whose size exceeds some kilobyte threshold.

Actually, some PNGs make sense, screengrabs for example… (it wouldn’t do any good to convert screengrabbed PNGs to JPG).

But I think we should focus on JPEGs where we can be very efficient and save space.
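Purely as a sketch of what such a recompression pass might look like (the uploads path, the 500 KB threshold and the quality cap of 80 below are placeholder assumptions, not Discourse defaults), jpegoptim can do this in batch:

# Recompress JPEGs larger than 500 KB down to quality 80 (lossy), in place;
# jpegoptim leaves files alone if they are already below the target quality.
find /var/discourse/shared/standalone/uploads -type f \
  \( -iname '*.jpg' -o -iname '*.jpeg' \) -size +500k \
  -exec jpegoptim --max=80 --strip-all {} +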

1 Like

Can I safely resize all images manually with a jpegoptim batch job, or should I only optimize them? Will I run into problems with Discourse later if I edit the uploads externally?

I don’t use S3 uploads; everything is stored locally.

Just for interest’s sake, how long has your Discourse site been live, and what size is your current backup file? I’ve just started up a new Discourse site which is going to involve lots of image uploads, so I’m wondering where you stand after X months. @codinghorror’s solution is indeed what I was thinking of doing: every month, run a cron job and compress all images older than X months using littleutils or any other image compression software. Littleutils handles PNGs quite well and is used as the backend for a number of WordPress image compression plugins.

My Discourse instance (a local community forum) has been up nine months and is now 2GB.

There’s some Data Explorer SQL that shows upload data volume by user. In my case, a handful of users have been responsible for the majority of uploads.

I’d really like to see a Discourse feature that recompressed and resized large image uploads over a certain age (say, one month).

6 Likes

Thanks for the reply. I guess 2GB over 9 months is not too bad; I’ve allocated 1TB to my site, so I should be good for some time to come :slight_smile: I’ll do some testing with a bash script that runs littleutils’ opt-jpg, opt-png and opt-gif; it shouldn’t be too difficult to have a cron job do this once a month, something along the lines of the sketch below. It would be better as a core feature, though.
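A minimal sketch of what that cron job could look like, assuming littleutils is installed and uploads live under the usual Docker shared directory (the path and the 90-day cutoff are assumptions to adjust):

#!/bin/bash
# Optimize uploaded images older than ~3 months in place using littleutils
UPLOADS=/var/discourse/shared/standalone/uploads

find "$UPLOADS" -type f -mtime +90 \( -iname '*.jpg' -o -iname '*.jpeg' \) -exec opt-jpg {} \;
find "$UPLOADS" -type f -mtime +90 -iname '*.png' -exec opt-png {} \;
find "$UPLOADS" -type f -mtime +90 -iname '*.gif' -exec opt-gif {} \;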

This looks like a promising method; I’m going to test it on a local dev server.

1 Like

Sorry about the delay in getting back to you.

We’ve been online a year now and the backup is 1GB. However, we are a programming forum with relatively few uploads. I imagine one of my other forums would have a significantly bigger database, given all the uploads members make there (those are not Discourse forums; their images are compressed and saved in the file system).

3 Likes

We’ve been on Discourse since the end of Nov. 2016. Our uploads grow by about 0.5MB/month. Our old platform had an image upload facility, but it was a manually vetted process, so there was a delay until the photos were available (and the photos weren’t inline in the forum posts; you had to post a link to the photo album). It’s an antique car forum, so including a photo showing some detail is a boon, and the users love it - I would guess that 10-15% of posts include a photo.

Reducing the max upload size (3072) isn’t an option, as the pics need enough detail that they show… detail.

Limiting users’ ability to upload (either by group membership or by MB/month) wouldn’t really work either. If a new user posts a question of the form “Is this right?”, the best way to do that is often to include a picture of the carb, or brake pipe, or whatever, and the easiest answer is for someone with the same car to reply with a picture of theirs. Often a post will start with a detail question and quickly garner five “here is how mine is” responses with photos.

We’re not using S3 for uploads yet, but it’s only a matter of time. As the users master uploading photos and start really using it, our backup window will get longer. At the moment our backup is 5GB: 1.5GB of uploads, and 1.8 million posts.

1 Like

Just an update: I found a nice way of optimizing images older than X days using GruntJS. I’m going to test the code this weekend and post my results here.

5 Likes

What was the outcome of this compression? I’d be interested in knowing.

Are you highlighting these images in the post listings at all, or do you have to visit the actual post to see which ones have photos?

I’m not sure I understand the question.

I’m wondering where you display these images.

They’re in the user’s posts.

A typical example: someone asks about a construction or assembly detail which isn’t shown in the shop manuals (a picture is worth a thousand words - and saves a great many hours in the shop): New old Door hinge issues - XK - Jag-lovers Forums

I’d love to know whether you had any success here.
Our install is nearly three years old now and the server is showing 27GB used (after running cleanup). There’s quite a lot of photo uploading, often straight from phones, which means lots of multi-MB image files building up.

I’d love to be able to shrink all images older than a year to something decent but sane, to claw back some precious space.

5 Likes

I believe @neil has some code we used to do this a few times, but it was focused on older versions of Discourse where inappropriately large .png images would sometimes get saved instead of smaller .jpg images.

3 Likes

The script is downsize_uploads.rb. Run it like so:

cd /var/www/discourse
RAILS_ENV=production bundle exec ruby script/downsize_uploads.rb

It will try to downsize images to be 1 megapixel or less, or you can pass in the max size you want.
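For example, to allow roughly 2 megapixels instead (my assumption here is that the maximum pixel count is passed as the first argument; check the script itself for the exact invocation it expects):

cd /var/www/discourse
# Assumed invocation: maximum number of pixels as the first argument (~2 megapixels here)
RAILS_ENV=production bundle exec ruby script/downsize_uploads.rb 2000000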

The script doesn’t scope it like that, but you can easily edit the script to do so. Add .where('created_at < ?', 1.year.ago) to the query on line 10.

8 Likes

@neil Could this be used for S3 image uploads as well?

1 Like

That script only works if you store the images locally. It would need to be updated to support images stored externally.

4 Likes

I tried this on my Discourse, but got the following error:

/var/www/discourse/vendor/bundle/ruby/2.6.0/gems/activerecord-6.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:50:in `rescue in postgresql_connection': FATAL: Peer authentication failed for user "discourse" (ActiveRecord::NoDatabaseError)

FTR, the script has now been updated to also handle uploads stored on S3.

5 Likes