Setting up file and image uploads to S3

(Jeff Atwood) #67

It is 110% supported and safe.

(Scott Smith) #68

I switched to S3 several months ago and everything is working well. I would not recommend migrating the old images over, however; the migration script is buggy - it didn’t move many images and also broke a few. I don’t know which ones were moved, so I am keeping all the old images on my server, which means the overall disk usage (and Discourse backup size) hasn’t decreased at all. But at least it’s not growing rapidly like it was before the switch.

(Régis Hanol) #69

Then it should be fixed. :pencil:

(Robert Down) #71

I am getting Job exception: Access Denied in the Discourse logs when attempting to upload a file to S3. Some details:


/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/aws-sdk-core/plugins/s3_sse_cpk.rb:19:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/aws-sdk-core/plugins/s3_dualstack.rb:23:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/aws-sdk-core/plugins/s3_accelerate.rb:33:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/seahorse/client/plugins/response_target.rb:21:in `call'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/seahorse/client/request.rb:70:in `send_request'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-core-2.5.3/lib/seahorse/client/base.rb:207:in `block (2 levels) in define_operation_methods'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-resources-2.5.3/lib/aws-sdk-resources/services/s3/file_uploader.rb:42:in `block in put_object'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-resources-2.5.3/lib/aws-sdk-resources/services/s3/file_uploader.rb:52:in `open_file'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-resources-2.5.3/lib/aws-sdk-resources/services/s3/file_uploader.rb:41:in `put_object'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-resources-2.5.3/lib/aws-sdk-resources/services/s3/file_uploader.rb:34:in `upload'
/var/www/discourse/vendor/bundle/ruby/2.3.0/gems/aws-sdk-resources-2.5.3/lib/aws-sdk-resources/services/s3/object.rb:251:in `upload_file'
/var/www/discourse/lib/s3_helper.rb:30:in `upload'
/var/www/discourse/lib/file_store/s3_store.rb:41:in `store_file'
/var/www/discourse/lib/file_store/s3_store.rb:17:in `store_upload'
/var/www/discourse/app/models/upload.rb:205:in `block (2 levels) in create_for'
/var/www/discourse/app/models/upload.rb:204:in `open'
/var/www/discourse/app/models/upload.rb:204:in `block in create_for'
/var/www/discourse/lib/distributed_mutex.rb:21:in `synchronize'
/var/www/discourse/lib/distributed_mutex.rb:5:in `synchronize'
/var/www/discourse/app/models/upload.rb:105:in `create_for'
/var/www/discourse/app/models/user_avatar.rb:23:in `block in update_gravatar!'
/var/www/discourse/lib/distributed_mutex.rb:21:in `synchronize'
/var/www/discourse/lib/distributed_mutex.rb:5:in `synchronize'
/var/www/discourse/app/models/user_avatar.rb:13:in `update_gravatar!'
/var/www/discourse/app/jobs/regular/update_gravatar.rb:12:in `execute'
/var/www/discourse/app/jobs/base.rb:154:in `block (2 levels) in perform'
hostname	ip-172-31-48-109-app
process_id	22178
application_version	65081a91939f9ed85a74a2d0023c864444195854
current_db	default
current_hostname	[REDACTED]
job	Jobs::UpdateGravatar
problem_db	default
user_id	[12, 19, 10, 43]
avatar_id	[14, 21, 12, 45]
current_site_id	default
[HASH] REDACTED-BUCKET-NAME [23/Mar/2017:21:48:58 +0000] arn:aws:iam::REDACTED:user/discourse-uploads REDACTED REST.HEAD.BUCKET - "HEAD / HTTP/1.1" 200 - - - 26 25 "-" "aws-sdk-ruby2/2.5.3 ruby/2.3.3 x86_64-linux resources" -
[HASH] REDACTED-BUCKET-NAME [23/Mar/2017:21:48:58 +0000] arn:aws:iam::REDACTED:user/discourse-uploads REDACTED REST.PUT.OBJECT original/2X/1/1e03e179f46682a0049e5f80e0ab43cde9f3613e.png "PUT /original/2X/1/1e03e179f46682a0049e5f80e0ab43cde9f3613e.png HTTP/1.1" 403 AccessDenied 243 201977 5 - "-" "aws-sdk-ruby2/2.5.3 ruby/2.3.3 x86_64-linux resources" -

I successfully used AWS CLI to verify the credentials I am using are capable of uploading a file to the bucket.

Perhaps I’ve just been looking at it for too long - can anyone see if I’m overlooking something obvious?
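
The access logs above show the `HEAD` on the bucket succeeding (200) but the `PUT` of the object returning 403, which usually means the attached IAM policy grants bucket-level actions but not object-level ones. As a hedged sketch only (the bucket name here is a placeholder, not the redacted one from the logs), a policy covering both levels might look like this - note that object actions such as `s3:PutObject` need the `/*` object ARN, not the bare bucket ARN:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-discourse-uploads"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-discourse-uploads/*"
    }
  ]
}
```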

(Lutz Biermann) #72

I’ve set up S3 with this guide and it works fine for new uploads. Thanks for that!

Now the problem:
After successfully running rake uploads:migrate_to_s3,
I have a lot (several hundred) of images with the following URL:
Some pictures were displayed correctly, but most of them were not.

The original URL was and it was set to

I had to restore the database because many images were broken. I’m running the latest version, a default Docker setup, and an nginx reverse proxy.

Any ideas?

(Scott Smith) #73

You have to rebake all posts to fix this issue - it’s a bug in the migrate script. See e.g.

for a discussion of the issue.

In my migration, maybe half the old images made it over via uploads:migrate_to_s3 and a subsequent rebake, but half didn’t. So I have had to keep all the old images on the server, since I don’t know which ones made it over and which did not.
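
For reference, rebaking rewrites the cooked HTML of every post so image URLs point at the migrated locations. Assuming a standard `/var/discourse` Docker install (paths may differ on your setup), the usual sequence is:

```shell
# From the host, enter the running container, then rebake all posts.
cd /var/discourse
./launcher enter app
rake posts:rebake
```

On a large forum this can take a while, since every post is re-rendered.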

(Jide Ogunsanya) #74

Nice share. It works flawlessly. Note, though, that you will only get your keys after you create a user, and you must have created your policy before you can finish creating the “user”. :wink:

(Leah Kramer) #75

I’m having trouble getting ours to work. If you follow these steps and create an IAM user, are you supposed to check off the box “s3 use iam profile” in Discourse > Admin > Settings?

And if so, checking that box says NOTE: enabling will override “s3 access key id” and “s3 secret access key” settings. So does that mean you don’t enter the two keys? And how does it know which IAM user to use?

(Bryan Holst) #76

I was able to get S3 working briefly on my site, but as soon as I did, the media player stopped working correctly. Is this just the way it is when using S3?

(Ben Edwards) #77

And this is still the case in 2018? Sorry, just want to make sure.

(Sam Saffron) #78

Well, S3 support is FAR more polished today and is even used on meta. I can highly recommend it.

I would say it depends: if you expect lots of uploads, going with S3 + an S3 CDN is probably way easier to cope with, but costs will be a tiny bit higher.

(Jeremy M) #79

That is my question, as S3 doesn’t seem to be an easy setup on Discourse. I am having trouble getting it set up.

(Sam Saffron) #80

That is a bit unfair. Using IAM is very fiddly, and the typical hobbyist installer would just go with DigitalOcean, meaning that IAM is not even an option for them. I would recommend going without IAM first, and only after you get the standard way going, go for ultra ninja mode.

(Jeremy M) #81

My apologies - I wasn’t saying the problem was Discourse. It’s just a bit confusing as to what to put where and what’s needed. I don’t think it’s easy to set up, either with IAM or without - but that doesn’t point the blame at Discourse.

(Sam Saffron) #82

Oh, it is far more complicated than the built-in storage because multiple services are involved, and Amazon does a spectacular job of making the web UX as complicated as possible for mere humans.

What we need to focus on is improving the OP here with better screenshots and better notes.

(Michael) #83

Hey, I had some trouble with this. TL;DR: Access Denied because of a typo. In my case, the bucket was not created automatically; I just made one with the defaults and it worked.

Also, is there a link to setting up a CDN using CloudFront? Reading the comments, there may be more recent info.

(Caswal Parker) #84

Is there a way to manually migrate uploads to S3? The rake uploads:migrate_to_s3 task is, for some reason, incredibly slow. Over 8 hours it has moved 2 GB of about 36 GB.

The discourse instance is in Paris, uploading to an s3 bucket in London.

I am in New Zealand and can download my backup .tar.gz at 10 MiB/s, so it’s about a one-hour download.
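
One rough alternative (a sketch under assumptions: the path below is the standard Docker shared-volume layout, and the bucket name is a placeholder) would be to bulk-copy the files first with `aws s3 sync`, which transfers in parallel. This only moves the files; the database and cooked posts still reference the old URLs, so the rake migration and/or a rebake is still needed afterwards, and whether the migration task will recognize pre-synced files is an assumption, not something I’ve verified:

```shell
# Hypothetical bulk copy of the local uploads directory to the bucket.
aws s3 sync /var/discourse/shared/standalone/uploads/default \
    s3://my-discourse-uploads/ --region eu-west-2
```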

(Nur) #85

The last time I checked, I believe Google’s storage is cheaper, right?


Could use some help here :slight_smile:

I’m getting the error: Missing required field Principal

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [


When I try uploading a file, a popup says “Access Denied”. I don’t get it :frowning:
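
One likely cause of the “Missing required field Principal” error is pasting a policy into the S3 *bucket policy* editor: bucket policies require a `Principal` in every statement, whereas an IAM *user* policy (attached directly to the user) must not contain one. A minimal sketch of the bucket-policy shape - the account ID, user name, and bucket name below are made up for illustration:

```python
import json

# An S3 *bucket* policy names a Principal in every statement; an IAM
# *user* policy attached to the user must omit it entirely.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:user/discourse"},
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-discourse-uploads",
                "arn:aws:s3:::my-discourse-uploads/*",
            ],
        }
    ],
}

# Sanity checks: every statement carries a Principal, and the document
# round-trips through JSON cleanly.
assert all("Principal" in s for s in bucket_policy["Statement"])
assert json.loads(json.dumps(bucket_policy)) == bucket_policy
```

If you are attaching the policy to the IAM user instead, drop the `Principal` key and the error should not appear.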

edit: I have no idea how it’s working now… o.o While I had put figuring out Discourse on hold, I had added a second policy for another site, not Discourse. I saw a tutorial that mentioned something about two bucket names being listed a certain way, like


something like that (it’s more locked down that way, they said), and that it would then say Access Denied, so I removed one. Then again, it may just be that it took a while to kick in after I added the policy to the user?