Increasing max file size

Good morning,

I'm having a file upload issue that I just can't seem to figure out.

I am running the Discourse instance via Docker in a VM on Google Cloud.

I currently have file uploads and Discourse backups going to GCS through its S3-compatible API, and both are working properly after following the instructions in the Configure an S3 compatible object storage provider for uploads thread. I can see the uploads in the bucket, and when I look at the upload URLs they all point to the CDN, so they appear to be served correctly from the bucket.
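
For reference, the relevant env settings from that guide look roughly like this in my app.yml (the bucket names, keys, and CDN URL below are placeholders rather than my real values):

```
env:
  ## (other env settings omitted)
  DISCOURSE_USE_S3: true
  DISCOURSE_S3_REGION: us-east-1              # nominal value; GCS does not use AWS regions
  DISCOURSE_S3_ENDPOINT: https://storage.googleapis.com
  DISCOURSE_S3_ACCESS_KEY_ID: <GCS HMAC access key>
  DISCOURSE_S3_SECRET_ACCESS_KEY: <GCS HMAC secret>
  DISCOURSE_S3_BUCKET: example-uploads-bucket
  DISCOURSE_S3_BACKUP_BUCKET: example-uploads-bucket/backups
  DISCOURSE_BACKUP_LOCATION: s3
  DISCOURSE_S3_CDN_URL: https://cdn.example.com
```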

I then followed the instructions in the Change the maximum attachment/upload size thread, and I now have the following in app.yml under params:

```
params:
  db_default_text_search_config: "pg_catalog.english"

  ## Set db_shared_buffers to a max of 25% of the total memory.
  ## will be set automatically by bootstrap based on detected RAM, or you can override
  db_shared_buffers: "1024MB"

  ## can improve sorting performance, but adds memory usage per-connection
  #db_work_mem: "40MB"

  ## Which Git revision should this container use? (default: tests-passed)
  #version: tests-passed

  ## Maximum upload size (default: 10m)
  upload_size: 1000m
```

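After editing app.yml, that guide has you rebuild the container so nginx picks up the new limit (this assumes the standard /var/discourse install path):

```
cd /var/discourse
./launcher rebuild app
```
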
I then went to Settings > Files and raised the upload size settings there as well.

But when I try to upload a 12.5 MB PDF to a post, the upload fails with an error.

The other two PDFs, which are 6-7 MB, uploaded fine: they went to the S3 bucket as intended and are returned with CDN addresses. So I'm pretty stumped, and any help would be much appreciated. Thanks in advance.
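
In case it is useful for debugging, the limit nginx actually ended up with can be checked from inside the container (paths here assume the standard discourse_docker layout):

```
cd /var/discourse
./launcher enter app
grep client_max_body_size /etc/nginx/conf.d/discourse.conf
```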