uses the Apache web server. It failed; here's why.

Update: the site's headers have since been updated.

OK, so I was curious and looked into the headers, and to my surprise it's powered by, err, Apache. However, as explained below, Apache was configured incorrectly.

Here are the site's main headers:

HTTP/1.1 200 OK
Server: Apache
Accept-Ranges: bytes
Content-Type: text/html
Access-Control-Allow-Origin: *
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 12213
Connection: keep-alive

…the only good news here is that gzip is being used.
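This is easy to verify programmatically. Here is a minimal sketch in Python, with the header block above pasted in as a string: it confirms that gzip is on, and (as discussed next) that none of the headers that would enable browser caching are present.

```python
# Minimal sketch: parse the raw response headers shown above and
# check for compression and browser-caching directives.
RAW_HEADERS = """\
Server: Apache
Accept-Ranges: bytes
Content-Type: text/html
Access-Control-Allow-Origin: *
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 12213
Connection: keep-alive"""

headers = {}
for line in RAW_HEADERS.splitlines():
    name, _, value = line.partition(":")
    headers[name.strip().lower()] = value.strip()

# gzip is enabled -- the one bright spot.
gzip_on = headers.get("content-encoding") == "gzip"

# None of the headers that let browsers cache a response are present.
caching_headers = {"cache-control", "expires", "etag", "last-modified"}
missing = sorted(caching_headers - headers.keys())

print("gzip enabled:", gzip_on)
print("missing caching headers:", missing)
```

Running this against the headers above reports gzip as enabled and all four caching headers as missing.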

The next problem, also indicated by the headers, is that no caching is set up for static files. For example, all of these files (and many others) are missing browser caching headers:

…this means each of those files has to be served by Apache every time a user visits a page, refreshes, and so on. A lack of caching becomes even more critical when there are programming issues and site errors: as users repeatedly refresh and revisit pages because of those errors, the load on the server multiplies. For example, with 1 million visitors, retrying a single page just 2 to 3 times would have produced 2 to 3 million requests for each static file! A cache TTL of even 1 hour could have lowered the load on Apache significantly.
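The fix would have been a few lines of Apache configuration. A sketch, assuming mod_expires and mod_headers are enabled and that the file extensions below match the site's static assets:

```apache
# Sketch: give static assets a 1-hour browser-cache TTL.
# Requires mod_expires and mod_headers to be loaded.
<FilesMatch "\.(js|css|png|jpe?g|gif|ico)$">
    ExpiresActive On
    ExpiresDefault "access plus 1 hour"
    Header set Cache-Control "public, max-age=3600"
</FilesMatch>
```

With this in place, a returning visitor's browser would reuse its local copy of each static file for an hour instead of re-requesting it from Apache.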

They are using Akamai, among others, for CDN delivery. This is fine, but not enough of the assets are served by the CDNs; most are served directly from Apache, and without caching headers.

If the basics are not covered (unnecessary load on Apache; no caching headers for static files such as JavaScript, CSS, and images), it gives you an idea of the level of expertise that was lacking.

Also read: benchmark of Nginx vs Apache.
