Cloudflare Argo reduces network latency by 35% on average and connection errors by 27%. Traditional routing relies on static routing information, which can be slower and often sends traffic down congested paths. Slow loading times and connection timeouts make for a poor user experience. Cloudflare routes roughly 10% of all HTTP/HTTPS Internet traffic, which gives it real-time intelligence on the true speed of network paths. Argo's smart routing algorithm uses this information to send traffic across the fastest paths available while maintaining secure connections and eliminating excess latency. Argo operates across Cloudflare's 100+ network locations.
Above: Histogram of Time To First Byte (TTFB). The blue series shows TTFB before Argo; the orange series shows it after.
Although the vast majority of visitors to this blog are from the United States, almost a third are from Europe, with the UK, Germany, and France at the top. That was one reason (plain curiosity was the other) I decided to give Cloudflare Argo a test run. In short, if you have a significant chunk of international traffic, it's more than worth a try. Even if all of your web traffic is US-based, you can still see around a 20% response-time improvement, depending on your existing TTFB. With page loads for this blog hovering around the one-second mark, shaving 300 milliseconds off response times in Europe and Asia is welcome.
Above: Each circle represents a Cloudflare network location. The larger the circle, the more traffic is being served via that location.
TTFB measures the delay between sending a request to your server and receiving the first byte of the response. TTFB includes network transit time (which Argo's smart routing optimizes) and processing time on your server (which Argo does not affect; see the graph below of Nginx's response time for this blog).
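If you want to see these phases for yourself, curl can break a single request into its timing components; TTFB corresponds to curl's `time_starttransfer` variable. The URL below is a placeholder, so substitute your own site:

```shell
# Break one request into timing phases with curl.
# time_starttransfer is the TTFB: everything up to the first response byte.
curl -s -o /dev/null \
  -w 'DNS lookup:  %{time_namelookup}s\nTCP connect: %{time_connect}s\nTLS done:    %{time_appconnect}s\nTTFB:        %{time_starttransfer}s\nTotal:       %{time_total}s\n' \
  https://example.com/
```

Comparing TTFB from a few regions (e.g. via a VPS or a testing service in Europe/Asia) is the quickest way to see what smart routing is actually saving you.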
Cloudflare Argo and my previous setup
Above: Screen crop of Nginx Amplify graphs (this blog). The response time spike was due to my WordPress admin back-end activity.
Without Cloudflare Argo, I used Cloudflare's FPC (Full Page Caching) via page rules, which shaved 300+ ms off cached response times.
With Cloudflare Argo, I'm using Nginx w/ fastcgi_cache for FPC. This means all page requests, even those not cached at Cloudflare's edge, are fast.
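For reference, here is a minimal sketch of what full page caching at the origin can look like with Nginx's fastcgi_cache. The zone name, paths, socket, and TTLs are illustrative examples, not my exact configuration:

```nginx
# Illustrative fastcgi_cache setup (names, paths, and TTLs are examples).
# http {} context: define an on-disk cache zone.
fastcgi_cache_path /var/run/nginx-cache levels=1:2
                   keys_zone=WORDPRESS:100m inactive=60m;

server {
    # ...
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # adjust to your PHP-FPM socket

        fastcgi_cache WORDPRESS;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 301 302 60m;       # cache successful responses
        fastcgi_cache_use_stale error timeout updating;
        add_header X-FastCGI-Cache $upstream_cache_status;  # HIT/MISS/BYPASS for debugging
    }
}
```

The `X-FastCGI-Cache` header makes it easy to confirm hits and misses from the browser or curl while tuning.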
Above: My current Cloudflare page rules. FPC could still be enabled here, but I prefer the flexibility with it off.
For this blog, I set Cloudflare's edge servers to the maximum TTL (time to live/expire) of one month for static assets. Since FPC is no longer enabled, there's no need to clear Cloudflare's cache. This is more convenient than using Cloudflare FPC with a TTL of 4 hours (or whatever the previous setting was). Over the past year with full page caching enabled, I found Cloudflare's cache had to be cleared either in part (new posts, categories, tags, and the front page) or entirely (and/or via dev mode) during design changes or other site-wide updates. Since moving full page caching to Nginx and adding Cloudflare Argo, the blog now has better overall response times for BOTH cached and uncached requests.
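When you do need to clear Cloudflare's edge cache, it can be scripted against the Cloudflare API rather than done in the dashboard. A hedged sketch, where `ZONE_ID`, `API_TOKEN`, and the URLs are placeholders for your own values:

```shell
# Purge selected URLs from Cloudflare's edge cache via the API.
# ZONE_ID and API_TOKEN are placeholders for your own credentials.
ZONE_ID="your-zone-id"
API_TOKEN="your-api-token"

curl -s -X POST \
  "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/purge_cache" \
  -H "Authorization: Bearer ${API_TOKEN}" \
  -H "Content-Type: application/json" \
  --data '{"files":["https://example.com/","https://example.com/new-post/"]}'

# Or purge everything (e.g. during a redesign):
#   --data '{"purge_everything":true}'
```

With FPC handled by Nginx, a purge like this only touches statics, which is exactly why it rarely needs running.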
Above: Shows my monthly Cloudflare bandwidth usage. Argo is priced at $5/domain monthly, plus $0.10 per GB of transfer.
Above: 30 days of Unique visitors.
Cloudflare Argo – Conclusion
UPDATE: I've been using BunnyCDN (affiliate credit link) instead for the past 2 years. I've set up BunnyCDN to FPC this blog via its global PoP locations. See my list of best CDNs, which still includes Cloudflare.
I also measured improvements using Pingdom and other tools. Of course, when Cloudflare's edge servers are set to cache everything (FPC), cached results are a bit faster. However, I wanted to move away from Cloudflare's FPC page rules without taking the 300+ millisecond hit to response times. Argo has allowed this, along with a longer edge cache TTL of 1 month for static assets instead of 4 hours for FPC, resulting in a much higher cache hit ratio. All this with just three page rules, keeping costs down. Of course, you are free to combine Argo with cache-everything page rules if that suits your setup.
Considering that 300 ms amounts to as much as 30% of this blog's total page load time (when including response times/TTFB), Cloudflare Argo makes a noticeable difference! It's certainly not free; however, you do get the advertised improvements in response times, and in my case the gain was almost twice the advertised figure. In closing, if you are obsessed with page speed and measurable performance improvements, give Cloudflare Argo a spin!
Note: The thoughts expressed are my own opinions after using Cloudflare Argo for a few weeks. Your results may vary.