Discourse & Cloudflare

I don’t know how anyone could think that the discussion is about anything other than the CDN. The OP here says

They didn’t say “the Cloudflare CDN service”, but I think it’s pretty clear that’s what they mean. I honestly don’t know how any of the other services have anything to do with whether Discourse works.

Let’s be clear here. Cloudflare isn’t a CDN. They call a feature of their service Cloudflare CDN, but at a fundamental level that’s not what it does.

The two elements which usually intersect with Discourse are their DNS (fine, good even) and their reverse proxy (the orange cloud toggle within their DNS product).

The reverse proxy can cache uploads (which is ok) but can also interfere with the javascript payload delivered to the browser (typically not ok, and the element we’re discussing here). As a reverse proxy it also increases latency for all communication between client and server, which directly impacts user experience.

Cloudflare tunnel is mentioned elsewhere on meta and is fine for the application where it’s highlighted.


Thank you @Stephen for providing the clarity I was pointing at.

Isn’t that opposed to the core benefit a CDN provides? I mean, making assets load faster for users all around the globe by serving files from nearby regions?

That’s the conventional interpretation, yes. Cloudflare does have a huge network and broad presence, but that doesn’t stop their proxy from slowing down communication to some degree.

There’s no real issue with turning on the orange cloud once Let’s Encrypt has issued a certificate, provided you disable their performance features and are OK with the increased latency. As we said above, it’s useful if you need to obscure your server IP or want to cache /uploads. It’s just not the magic bullet some purport it to be.


To my knowledge, Let’s Encrypt can always reach the webserver for the HTTP-01 challenge over HTTPS, even with the Cloudflare Universal SSL cert in front of it, so one wouldn’t have to wait to turn on the orange cloud until after the LE cert is issued.

I would recommend trying it; it’s a common support topic here.

Let’s Encrypt will fail if Cloudflare is enabled before the initial cert is issued.

Discourse-setup also doesn’t add the Cloudflare template; I typically recommend the two be handled together after the initial build completes.
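For anyone following along, the template in question is added by hand to the container definition. A minimal sketch, assuming a stock discourse_docker layout (containers/app.yml); check the template names in your own checkout before editing:

```yaml
## containers/app.yml (excerpt)
templates:
  - "templates/postgres.template.yml"
  - "templates/redis.template.yml"
  - "templates/web.template.yml"
  ## added so Discourse sees real client IPs instead of Cloudflare's proxy addresses:
  - "templates/cloudflare.template.yml"
```

Then rebuild with `./launcher rebuild app` once the initial Let’s Encrypt cert has been issued.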


I actually did: all my public web apps run through Cloudflare Tunnel, which requires the orange cloud to be enabled, and all of them were able to receive an initial LE cert without port 80 and HTTP being available at all.

Cloudflare Tunnel is a different product entirely. It’s not going to make things faster.

Yes, but that wasn’t what I was talking about with Stephen, and I never said anything about “making things faster”. I added an example of a circumstance in which LE certs can be issued even with the orange cloud on, because I have experienced that it works.

It would appear that you’re right that Cloudflare now has enough products with similar but very different names and purposes that helping anyone using Cloudflare is going to be very confusing. Discourse on a residential internet connection with Cloudflare Tunnel is a very specific and documented use case, which is very different from what 99% of the topics discussing Cloudflare are about. It doesn’t really belong in this topic.

Yes, the HTTP-01 challenge works in conjunction with Cloudflare in “orange cloud” mode. But it does not work over HTTPS; the HTTP-01 challenge only works over port 80, and:

Many people running Cloudflare set it to automatically redirect HTTP to HTTPS, which makes port 80 on the origin server unavailable and prevents HTTP-01 challenges from working.

So if you don’t enable those redirects, then it will work.

So strictly speaking this is untrue.
Let’s Encrypt will fail if Cloudflare is set to redirect traffic on port 80 before it reaches the origin server.


I agree. However, because IT security is more prominent than ever and people are beginning to work more with Zero Trust products, of which CF Tunnel is a part, we will, and should, see increasing use of this kind of technology. That’s why I brought it up.

I think you misunderstood how LE’s HTTP-01 challenge works.
It looks for the token that certbot (or another LE client) has placed, most of the time, in the .well-known subfolder of the webserver.
But it isn’t hardcoded to start the request on port 80, ignore any HTTP redirects, and fail outright if it can’t find the token.
The HTTP-01 challenge is able to follow HTTP redirects (301 and 302) and is therefore able to read the .well-known folder over port 443 and HTTPS.
And the reason it works for Cloudflare Universal SSL WITH redirect (and Cloudflare Tunnel) is that Cloudflare answers in place of the webserver on port 80 and redirects the request to 443, where LE can read the token and the CA can issue the cert.

High-Level diagram of the flow:

Certbot starts HTTP-01
→ POSTs cert request to CA and puts the token into .well-known
→ CA starts a GET for the token on the FQDN, port 80
→ CF redirects to port 443 and secures the request with its Universal SSL cert
→ Request is forwarded to the webserver itself (through CF Tunnel or directly)
→ CA is able to GET the token in .well-known, because port 443 can present the token the same way HTTP on port 80 would
→ CA POSTs the raw cert data and Certbot creates the files
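The redirect-following step in this flow can be sketched with two throwaway local HTTP servers: one standing in for the origin serving the token, one standing in for the edge that answers on the “port 80” side with a 301. This is only a toy illustration (plain HTTP on ephemeral ports, no TLS, no real ACME); it just shows that a plain GET client, like the HTTP-01 validator, still retrieves the token through the redirect.

```python
import http.server
import threading
import urllib.request

TOKEN = "example-token"  # stands in for the ACME key authorization
CHALLENGE_PATH = "/.well-known/acme-challenge/example-token"

class Origin(http.server.BaseHTTPRequestHandler):
    """Serves the token, standing in for the webserver behind Cloudflare."""
    def do_GET(self):
        if self.path == CHALLENGE_PATH:
            body = TOKEN.encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)
    def log_message(self, *args):  # keep the demo quiet
        pass

class Edge(http.server.BaseHTTPRequestHandler):
    """Answers on the 'port 80' side and redirects, like the orange cloud."""
    origin_port = None  # filled in below once the origin is listening
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location",
                         f"http://127.0.0.1:{self.origin_port}{self.path}")
        self.end_headers()
    def log_message(self, *args):
        pass

def serve(handler):
    # Port 0 lets the OS pick a free ephemeral port.
    srv = http.server.HTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv

origin = serve(Origin)
Edge.origin_port = origin.server_address[1]
edge = serve(Edge)

# urllib follows the 301 automatically, as the HTTP-01 validator does.
url = f"http://127.0.0.1:{edge.server_address[1]}{CHALLENGE_PATH}"
fetched = urllib.request.urlopen(url, timeout=5).read().decode()
print(fetched == TOKEN)  # True: the token is retrieved through the redirect
```

The real validator adds constraints the toy skips, most importantly that the very first request always goes to port 80, which is the point being argued over in this thread.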

I started a topic back in March looking for more recent details or guides on how to implement Cloudflare.

I’m still looking for one :slight_smile:

I use CloudFront as a CDN but would like to add the DDoS protection that Cloudflare brings to the table. We get hammered a lot :confused:

I think I understand it pretty well. You are right, it can be redirected to HTTPS, but whether that works depends on the Cloudflare settings and the webserver configuration, since initially there will be no valid certificate on the origin server.

Yes, they can be redirected to a different port, but HTTP-01 challenges must always start on port 80.

See Challenge Types - Let's Encrypt

The HTTP-01 challenge can only be done on port 80. Allowing clients to specify arbitrary ports would make the challenge less secure, and so it is not allowed by the ACME standard.


I agree, I just pointed out the inaccuracy in your claim that it will straight up never work.

The way you quoted my sentence here is quite evil, since it suggests the wrong context of the discussion and implies a different meaning. My full sentence was

and the important part of my sentence was the combination of “hardcoded to start the request on port 80” AND “ignore any HTTP redirects” AND “fail outright”, since you said

and this implies that the redirect alone is the reason the HTTP-01 challenge fails, which is not true.
Also, strictly speaking, a redirect does not make port 80 “unavailable”.

No evil meant or intended.

It makes port 80 of the origin server unavailable for all traffic that is directed at the hostname.

I don’t like the current tone of the conversation in this topic, so I’m unwatching it.

My opinion on Cloudflare in combination with Discourse can be summarized as “many people are apparently unable to configure it correctly so in general I would recommend against enabling it. If you want to use it for DDoS protection, then I would recommend enabling it with very specific settings only.”


That’s a clear statement, thanks.


This may be another stupid question, but because I’m serving only a Finnish audience I don’t see any reason to use Cloudflare, so I know it only by reputation.

But if its only benefit is stopping DDoS, and DDoS mostly means just too many requests made by

  • useless SEO crawlers
  • other bots made by script kiddies

then why not put Nginx in front of Discourse and stop known user agents there? Combined with Fail2ban, that would reduce load by something like 90% (sure, a stetson statistic, but a lot anyway).
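A minimal sketch of that idea as an nginx fragment in front of Discourse. The user-agent names are examples only, not a vetted blocklist, and the upstream socket path assumes a standard discourse_docker socket setup; adjust both to your own installation:

```nginx
# http-level: map known-bad user agents to a flag (example names only)
map $http_user_agent $blocked_agent {
    default                            0;
    ~*(AhrefsBot|SemrushBot|MJ12bot)   1;
}

server {
    listen 80;
    server_name forum.example.com;  # placeholder hostname

    location / {
        if ($blocked_agent) {
            return 403;  # refuse flagged crawlers before they reach Discourse
        }
        # assumed path for a socket-based discourse_docker install
        proxy_pass http://unix:/var/discourse/shared/standalone/nginx.http.sock;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Fail2ban would then watch the nginx access log and ban IPs that keep producing 403s.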