Discourse & Cloudflare

Yes, the HTTP-01 challenge works in conjunction with Cloudflare in “orange cloud” mode. But it does not work over HTTPS; the HTTP-01 challenge only works over port 80. And:

Many people running Cloudflare set it to automatically redirect HTTP to HTTPS. That makes port 80 on the origin server unavailable, which prevents HTTP-01 challenges from working.

So if you don’t enable those redirects, then it will work.

So strictly speaking this is untrue.
Let’s Encrypt will fail if Cloudflare is set to redirect traffic on port 80 before it reaches the origin server.


I agree. However, IT security is more prominent than ever, and people are starting to work with Zero Trust products, of which Cloudflare Tunnel is one, so we will (and should) see increasing use of this kind of technology. That’s why I brought it up.

I think you misunderstood how LE’s HTTP-01 challenge works.
It looks for the token that certbot, or another LE client, puts (most of the time) into the .well-known subfolder of the webserver.
But it isn’t hardcoded to start the request on port 80, ignore any HTTP redirects, and fail outright if it can’t find the token.
The HTTP-01 challenge is able to follow HTTP redirects (301 and 302) and is therefore able to read the .well-known folder over HTTPS on port 443.
And the reason it works for Cloudflare Universal SSL with redirects (and Cloudflare Tunnel) is that Cloudflare answers in place of the webserver on port 80 and redirects the request to 443, where LE can read the token and the CA can issue the cert.
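To make the token mechanics concrete: the file the client places under `/.well-known/acme-challenge/` contains the key authorization, which is the token joined with a hash of the ACME account key (RFC 8555 §8.1, RFC 7638). A minimal sketch, assuming the `account_jwk` dict passed in contains only the required JWK members:

```python
import base64
import hashlib
import json

def b64url(data: bytes) -> str:
    # base64url without padding, as ACME requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def key_authorization(token: str, account_jwk: dict) -> str:
    # RFC 7638 thumbprint: required JWK members only, keys in
    # lexicographic order, no whitespace, then SHA-256.
    canonical = json.dumps(account_jwk, sort_keys=True, separators=(",", ":"))
    thumbprint = b64url(hashlib.sha256(canonical.encode()).digest())
    # The file served from .well-known/acme-challenge/<token>
    # must contain exactly "<token>.<thumbprint>".
    return f"{token}.{thumbprint}"
```

The CA compares what it fetches from the webserver against the same value computed on its side; the port the response finally comes from does not change the content being checked.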

High-level diagram of the flow:

Certbot starts HTTP-01
→ POSTs the order to the CA, receives the challenge token, and puts it into .well-known
→ CA starts a GET for the token on the FQDN, port 80
→ CF redirects to port 443 and secures the request with its Universal SSL cert
→ Request is forwarded to the webserver itself (through CF Tunnel or directly)
→ CA is able to GET the token in .well-known, because port 443 can present it the same way plain HTTP on port 80 would
→ CA issues the cert, and Certbot downloads it and writes the files
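The redirect-following step above can be simulated locally with the standard library: one server plays the edge that answers the initial request and redirects, the other plays the origin serving the token. (In the real flow the redirect target is HTTPS on port 443; plain HTTP is used here only to keep the sketch self-contained, and the token value is a placeholder.)

```python
import http.server
import threading
import urllib.request

TOKEN = "demo-token.demo-thumbprint"  # placeholder key authorization

class TokenServer(http.server.BaseHTTPRequestHandler):
    # stands in for the origin; serves the token for any path
    def do_GET(self):
        body = TOKEN.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence request logging
        pass

class RedirectServer(http.server.BaseHTTPRequestHandler):
    # stands in for the edge answering the initial request and redirecting
    target_port = None
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location",
                         f"http://127.0.0.1:{self.target_port}{self.path}")
        self.end_headers()
    def log_message(self, *args):
        pass

origin = http.server.HTTPServer(("127.0.0.1", 0), TokenServer)
edge = http.server.HTTPServer(("127.0.0.1", 0), RedirectServer)
RedirectServer.target_port = origin.server_port
for srv in (origin, edge):
    threading.Thread(target=srv.serve_forever, daemon=True).start()

# The "CA" fetches from the edge; urllib follows the 301 automatically,
# just as Let's Encrypt's validator follows redirects.
url = f"http://127.0.0.1:{edge.server_port}/.well-known/acme-challenge/demo-token"
with urllib.request.urlopen(url) as resp:
    fetched = resp.read().decode()
print(fetched == TOKEN)  # True
origin.shutdown()
edge.shutdown()
```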

I started a topic back in March looking for more recent details or guides on how to implement Cloudflare.

I’m still looking for one :slight_smile:

I use Cloudfront as a CDN but would like to add the DDoS protection that Cloudflare brings to the table. We get hammered a lot :confused:

I think I understand it pretty well. You are right, it can be redirected to HTTPS, but whether that works depends on the Cloudflare settings and the webserver configuration, since initially there is no valid certificate on the origin server.

Yes, they can be redirected to a different port, but HTTP-01 challenges must always start on port 80.

See Challenge Types - Let's Encrypt

The HTTP-01 challenge can only be done on port 80. Allowing clients to specify arbitrary ports would make the challenge less secure, and so it is not allowed by the ACME standard.
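In other words, only the first request of the validation is pinned: it is always plain HTTP to the default port, and only subsequent redirects may move it elsewhere. A trivial sketch (domain and token are placeholders):

```python
# RFC 8555 §8.3: the validator's initial request is always
# http:// (port 80); the client cannot choose a different port.
domain = "forum.example.com"    # placeholder
token = "some-challenge-token"  # placeholder
validation_url = f"http://{domain}/.well-known/acme-challenge/{token}"
print(validation_url)
```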


I agree; I just pointed out the inaccuracy in your claim that it will straight up never work.


The way you quoted my sentence here is rather unfair, since it takes it out of context and implies a different meaning. My full sentence was

and the important part of my sentence was the combination of “hardcoded to start the request on port 80” AND “ignore any HTTP redirect” AND “fail outright”, since you said

and this implies that the redirect alone is the reason the HTTP-01 challenge fails, which is not true.
Also, strictly speaking, a redirect does not make port 80 “unavailable”.

No evil meant or intended.

It makes port 80 of the origin server unavailable for all traffic that is directed at the hostname.

I don’t like the current tone of the conversation in this topic, so I’m unwatching it.

My opinion on Cloudflare in combination with Discourse can be summarized as “many people are apparently unable to configure it correctly so in general I would recommend against enabling it. If you want to use it for DDoS protection, then I would recommend enabling it with very specific settings only.”


That’s a clear statement, thanks.


This may be another stupid question, but because I’m serving only a Finnish audience I don’t see any reason to use Cloudflare, so I know it only by reputation.

But if its only benefit is stopping DDoS, and DDoS mostly means just too many requests made by

  • useless SEO crawlers
  • other bots made by script kiddies

then why not put Nginx in front of Discourse and stop known user agents there? Combined with Fail2ban, that would reduce the load by something like 90 % (sure, a number pulled out of a hat, but a lot anyway).
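A sketch of what that could look like in Nginx, assuming a standard Discourse install exposing the container’s socket; the user-agent list and hostname are illustrative only, not a vetted blocklist:

```nginx
# http-level: classify requests by user agent (names are examples only)
map $http_user_agent $blocked_agent {
    default                              0;
    "~*(AhrefsBot|SemrushBot|MJ12bot)"   1;
}

server {
    listen 80;
    server_name forum.example.com;  # placeholder hostname

    location / {
        # Reject known crawlers before they reach Discourse;
        # Fail2ban can then ban repeat offenders based on the 403s in the log.
        if ($blocked_agent) {
            return 403;
        }
        # Typical socket path for a standard Discourse container install
        proxy_pass http://unix:/var/discourse/shared/standalone/nginx.http.sock;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```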

This discussion is very valuable. For a Chinese website administrator, Cloudflare determines whether Chinese users can exchange data with the outside world normally. I ran a test some time ago: if you don’t use the orange cloud and access servers in other countries from China, the network jitter is very serious. The awkward part is that running a forum in China is subject to strict censorship; even though I was running a non-political forum, I still suffered. We must assume that the website’s server is located outside of China. So, if I create a forum using Discourse, I have to consider whether it can use Cloudflare.