Optimize images before uploading?

Is it possible to pre-resize an image BEFORE uploading it to the server?

So people can simply drag and drop large photos, with no need to resize them manually?

I did some searching, and it seems to be possible using HTML5.


I think the server already takes care of optimizing images.

The problem is the file size.

If the max image size is set to 2 MB and a user tries to upload a photo (often >4 MB), he will get a ‘Sorry, the image you are trying to upload is too big’ message and will be unable to upload that photo. And if he is using a cell phone, it can be even harder to resize that photo manually.

The server can only optimize images that have already been uploaded. What we need in this situation is to optimize the image BEFORE uploading.


This would be a great feature. To get iPhone images to work for me I had to crop them significantly.

1 Like

Checking my Nexus 5 as a reasonable current / modern data point, I have:

  • 667 jpg photos, 1.21 gb = 1.85 mb each on average
  • Largest 4.5 mb
  • Smallest 1.3 mb

If we bump the default max upload size from 2mb to 3mb that covers 624 (94%) of the photos on my device. That seems fine.

I think that’s a reasonable default for V1 and covers us OK … we’ll need to figure out some fancy resizing thing for post V1, since

  • phones will eventually have 10 zillion megapixel cameras, and every single image they save will be 5mb to 10mb+.
  • the worst-case “noisy image” photos will always be unusually large, e.g. my set of 5 / 667 images that are 4.0mb+ outliers.

On a “typical” Digital Ocean steady state install of Discourse, with the 1GB / 30GB minimum, I am seeing:

30.9% used of 29.40 GB = 9.08 GB used, 20.32 GB free

20.32 GB / 3 MB = 6,773 full size (3mb) images
20.32 GB / 2 MB = 10,160 medium size (2mb) images

So if you have say, 1000 members, and they upload on average 3 images per year per user (3,000 images/year), assuming an “average” image size of 2mb, that means you’d run out of disk space in about 3 years.

Clearly it depends heavily on

  • size of your community
  • what they are uploading
  • how often they are uploading

But there’s some back of the envelope calculations that seem reasonable to me.


Unfortunately, all of our community members are moaning about this. I don’t like storing their large images, but I’d definitely allow them to upload images as large as they want, up to 20 MB, and then have them resized by Discourse before storing.

1 Like

The default of 3 MB may need to be reconsidered; the decision was made some 7 months ago.

@codinghorror’s example phone, a Nexus 5 released in November 2013 with an 8 MP, 3264 x 2448 pixel camera, was perhaps already quite out of date at the time of the decision.

A current contract phone in the UK is the “Samsung Galaxy Alpha”, released September 2014 with a 12 MP, 4608 x 2592 pixel camera.

I don’t have any average image file sizes to offer, but the sample I was sent is 3.2 MB
(the user was unable to upload it).

In addition the 4th most popular camera phone (for uploading photos to flickr) is the 16 MP, 5312 x 2988 pixels Samsung Galaxy S5 released April 2014.

I took a sample of the first 15 images in a “Samsung S5” search on Flickr that had the following characteristics:

  • Taken with a Samsung S5 model
  • The “Original” file was downloadable
  • Was not a panorama image
  • Did not appear edited
  • Had original EXIF data
  • Was not a picture of the phone itself

The file sizes:

This was just a quick analysis - perhaps flawed - but a ~7MB average leaves me with a bad taste.

I submit that “optimize images before uploading” and the default upload file size limit might both have to be reviewed.

And that an average size might not be the right benchmark: we should remember to aim slightly higher with limits, to prevent user issues and support requests.


I agree that on a long enough timescale something has to be done. The key bit is that the mobile device needs to be able to send a smaller image to the server, ideally without blowing up memory in the process.

You could always brute force it:

  • load the full image as a bitmap in JavaScript
  • resize it in real time using JavaScript code (canvas action?)
  • send the resized image from memory

But that would blow up memory and CPU on the mobile device pretty badly, all in the name of getting a smaller image over the wire than the one that is on disk.

Another thing of note: I assume those are raw photos. Google Photos, which is pushed heavily as the free photo backup, typically forces a save at 2048px for unlimited free storage; not sure about iCloud. Are there any data sources with a higher sample size that might take that into account?

Or maybe go with an integrated off-site photo embed. So when a user clicks the attach button, they’re offered the option to upload their photo to a profile-linked Flickr/G+ or just imgur.

To be honest the simplest fix is probably to just accept up to 10mb – it’s easy enough to adjust in site settings. We should make sure our nginx template allows this on the back end, and then the client can enforce smaller values if needed.
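For reference, the knob on the nginx side is `client_max_body_size`; a sketch of what the template override might look like (the exact placement within Discourse’s nginx template is an assumption here):

```nginx
# Allow request bodies up to 10 MB at the proxy layer.
# Discourse's own "max image size" site setting still applies
# after this, so the client can enforce smaller values.
client_max_body_size 10m;
```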


I see little value in having these huge images; what we really need is client-side resizing.


I agree - but until we have the “optimize images before uploading” feature (this topic), sane defaults must be put in place.
The question of sane defaults was the point of my post: research I did after users complained about not being able to upload.

1 Like

I don’t see any reasonable client side image resizing options at the moment…

Loading a giant image into memory in an HTML canvas object, on a device with 1 GB of RAM that’s already running our large JS payload… maybe we should just wait until Apple gets its crap together and ships all mobile devices with 2 GB RAM?

(Not that Android Chrome memory management is really any better, but at least the devices tend to have 2GB+.)

The File API and canvas do the trick; we should be able to use them on Android/iPhone: http://stackoverflow.com/questions/10333971/html5-pre-resize-images-before-uploading

1 Like

Are we sure this won’t blow out device memory? A ~4-5 MB JPEG still expands into an enormous array of 32-bit values, one per pixel… iPhone 5 / 6 picture size is

3264 × 2448 × 32-bit = 255,688,704 bits = 32 megabytes of memory

Which isn’t much, but your whole device has 1,024 megabytes to work with, a lot of which is taken up by

  • the OS
  • the browser
  • our JS
  • our HTML / CSS / DOM


Probably (hopefully?) somewhat safe to assume that devices which capture such large images will have the memory necessary to do some processing on them…

Not true though: the iPhone 6 and 5 still have 1 GB RAM. The iPhone 4s has 512 MB RAM, though that device is on its way out.

Yes, this should be fine. We’re not asking the browser to load it all in memory as a bitmap, we’re asking it to do a PNG -> JPEG re-encoding.

Note: The process goes like this:

  1. Get the file from the user as a “Blob”
  2. Get a URL for the file with window.URL.createObjectURL(blob)
  3. Load that URL into an Image and draw it onto a smaller <canvas> (this is where the scaling happens)
  4. Get the scaled-down data back from the canvas via canvas.toDataURL("image/jpeg", 0.7)
  5. Upload that to the server

1 Like

I’m pretty sure this step loads the image in memory as a bitmap. Why wouldn’t it?