I have two machines on the same home network (with the same public IP), on which I am running
curl <WEB ADDRESS>
On both, that triggers Cloudflare's DDoS protection, as the DOM I get back contains this snippet:
<div class="attribution">
DDoS protection by <a rel="noopener noreferrer" href="https://www.cloudflare.com/5xx-error-landing/" target="_blank">Cloudflare</a>
<br />
<span class="ray_id">Ray ID: <code>66923678a8630cd9</code></span>
</div>
Now, on one of those machines, running Ubuntu 20.04 and curl 7.68.0, I can get around that by simply passing any made-up user-agent as a header:
curl -H 'User-Agent: Thisisfake' <WEB ADDRESS>
works just fine (no DDoS-protection snippet in the DOM I get back).
On the other machine, running Arch Linux and curl 7.77, this does not work. Neither does the following: loading the page in Firefox (which works fine), copying the request as a curl command (as suggested in this post), and then running that command in a shell.
Firefox reports that the valid request is
curl <WEB ADDRESS> -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:89.0) Gecko/20100101 Firefox/89.0' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8' -H 'Accept-Language: en-US,en;q=0.5' --compressed -H 'DNT: 1' -H 'Alt-Used: <WEBSITE>' -H 'Connection: keep-alive' -H 'Cookie: __cf_bm=<LONG STRING>' -H 'Upgrade-Insecure-Requests: 1'
but running it directly in a shell still produces the DDoS-protected DOM.
Question(s): Why the difference in behavior between the two machines, and how can I investigate further?
Additional info
On a whim, I tried a GET request to the same address with Perl's Mojolicious. That gets around the protection on all machines, no problem.
It reports that its request headers are
"accept-encoding" => ["gzip"],
"user-agent" => ["Mojolicious (Perl)"]
but setting these headers manually in curl does not fix the issue on the problematic machine.
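Concretely, replicating those two headers in curl looks something like this (a sketch, with `<WEB ADDRESS>` standing in for the real URL as above):

```shell
# Send only the headers Mojolicious reports: gzip encoding and its default UA.
# --compressed makes curl decompress the gzip response body automatically.
curl --compressed \
     -H 'Accept-Encoding: gzip' \
     -H 'User-Agent: Mojolicious (Perl)' \
     <WEB ADDRESS>
```

On the Ubuntu machine this returns the normal page; on the Arch machine it still returns the DDoS-protected DOM.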