B rating for static content caching but no cache control headers sent?

On a Magento site, I get a B rating for caching static content. mod_expires and mod_headers are both enabled on the Apache server (2.4.7) running on Ubuntu 14.04. I am only using Magento’s default .htaccess, which has ExpiresActive On and a few directives for specific MIME types.

However, the server doesn’t seem to send any cache control headers. Here are the headers I see:

root@:~# curl -IL https://www.example.com
HTTP/1.1 200 OK
Date: Mon, 24 Jul 2017 17:03:49 GMT
Server: Apache/2.4.7
Set-Cookie: frontend=n52dab340mkmb3elfkiapqn575; expires=Mon, 24-Jul-2017 18:03:49 GMT; Max-Age=3600; path=/; domain=www.example.com; HttpOnly
Set-Cookie: frontend_cid=ay9aVph9owV2omIm; expires=Mon, 24-Jul-2017 18:03:49 GMT; Max-Age=3600; path=/; domain=www.example.com; secure; httponly
X-Frame-Options: SAMEORIGIN
Fpc-Cache-Id: FPC_REQUEST_4d0f85354d3c27e66509301bd35eb682
Vary: Accept-Encoding
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
Content-Type: text/html; charset=UTF-8

I am puzzled as to how WebPageTest grades my site B for static content caching when the server is not sending cache control headers of any kind.

Ah… Slight gotcha with Ubuntu…

mod_expires.load being set up - first step.

mod_expires.conf - second step.

No default mod_expires.conf exists. You must add one to…


Then run a2enmod expires again; this will create the correct mods-enabled links.

Then bounce Apache - service apache2 reload - then recheck.

Likely this will fix your setup.
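To spell out the steps above as a rough sketch (the expires.conf contents here are an assumption, not Magento's defaults; on Debian/Ubuntu the file in mods-available is named expires.conf, and a2enmod symlinks it into mods-enabled):

```shell
# Create a minimal mods-available/expires.conf (contents are illustrative)
cat > /etc/apache2/mods-available/expires.conf <<'EOF'
<IfModule mod_expires.c>
    ExpiresActive On
</IfModule>
EOF

a2enmod expires               # (re)creates the mods-enabled symlinks
apachectl -M | grep expires   # confirm expires_module is loaded
service apache2 reload        # pick up the new configuration
```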

Thank you, that was it! I did notice the expires.conf was missing, but it didn’t occur to me that I could or should add the file and re-enable the mod to get it working.

Thanks a ton, again. I spent literally days figuring out why Sucuri was not caching our site and we finally discontinued the service yesterday. This gives me hope that we can get Sucuri to cache our site, after all, because their support said they couldn’t cache until our server sent caching headers.

One question, though. The default value shows the following:

Cache-Control: max-age=1296000
Expires: Wed, 09 Aug 2017 02:13:12 GMT

This is 15 days and is contrary to the MIME-type-specific expires directives I put into expires.conf, which are much longer. Any idea where this max-age is coming from?
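For what it's worth, 1296000 seconds works out to exactly 15 days, which is what a directive like `ExpiresDefault "access plus 15 days"` would emit for types without a more specific rule (whether such a directive is present in your config is an assumption on my part):

```shell
# 1296000 seconds / 86400 seconds per day = 15 days exactly
echo $((1296000 / 86400))
```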

Just a quick note… you should never see an expiry header on index.php - unless you’re playing around with microcaching.

@GreenGecko, that went completely over my head, I must confess. Could you please explain a little bit more about what you meant?

The cache control header I quoted appears when I do the following:

curl -IL https://www.example.com

The HTML content of the page is dynamic, so it should not be cached (for example: same page logged in/out, random product selection, timestamp, code updates). It’s the static stuff - css, js, woff, etc. - that can be cached locally. A full page cacher built into the application will take care of this and reduce the server-side load, but it still won’t deliver the page with an expiry time in the future.
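A sketch of an expires.conf that follows this advice - far-future lifetimes for static assets only, with no rule at all for text/html, so dynamic pages stay uncached (the specific types and lifetimes here are illustrative assumptions, adjust to taste):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets only - deliberately no ExpiresByType for text/html
    ExpiresByType text/css               "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType image/png              "access plus 1 month"
    ExpiresByType image/jpeg             "access plus 1 month"
    ExpiresByType font/woff2             "access plus 1 year"
</IfModule>
```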

This is the core of why I really don’t like services like CloudFlare - they have to proxy back to the origin server to get a current copy of the HTML, so they will always lengthen the TTFB.

There is one method - microcaching - that attempts to improve performance by holding a copy of the page for a minute or less, in the hope that it isn’t wrong that often.
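Microcaching is most often shown with nginx rather than Apache; purely as an illustrative sketch (all names and paths here are assumptions, not from this thread), holding dynamic pages for one second means a burst of identical requests hits PHP only once per second:

```nginx
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=micro:10m max_size=100m;

server {
    location ~ \.php$ {
        fastcgi_cache       micro;
        fastcgi_cache_key   "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 1s;        # hold a copy for one second
        fastcgi_cache_use_stale updating;  # serve stale while refreshing
        fastcgi_pass unix:/run/php/php-fpm.sock;
        include fastcgi_params;
    }
}
```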

@GreenGecko, thanks for the explanation.

We are indeed using a full page cacher for our Magento site, so the index.php/home page being served is indeed a html file generated from that extension. And, I have the following directive in my expires.conf that relates to html files:

ExpiresByType text/html "access plus 15 days"

So, is the cache control header alright, in this case?

Yes, that needs to go - it will cause plenty of weirdness.

In real terms, assuming you’re compressing HTML (gzip via mod_deflate on Apache), the overhead of transferring it is rather small: you can optimistically assume 10x compression, so something like 100k of content becomes 10k (see why those minifiers are pointless?). As long as the latency is low, delivering an FPC-stored page really won’t take long at all!

You are spot on. I went ahead and removed the ExpiresByType directive for HTML. It was quite pointless to leave it enabled.

We had a similar problem.

In our case we had to enable php gzip:

zlib.output_compression = On
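For reference, that setting goes in php.ini (the path below is the stock location on Ubuntu 14.04 with mod_php; adjust for your setup), and you can check compression is actually active with curl:

```shell
# In /etc/php5/apache2/php.ini:
#   zlib.output_compression = On
# Then reload Apache and confirm the response is compressed:
curl -sIL -H 'Accept-Encoding: gzip' https://www.example.com | grep -i content-encoding
```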