Despite caching code, Cache Static Content gets F

Hi,

I’m a newbie here with rather limited knowledge of technical matters. I’ve read every post on this forum on Cache Static Content. My URL is: http://www.LifeStrategies.net created with RapidWeaver on a Snow Leopard Mac. My test results are at http://www.webpagetest.org/result/130718_ND_781d1410aa45babf356efbebebd19823/

I’m getting AAAAF now, having compressed my header image, but Cache Static Content still rates an F. See http://www.webpagetest.org/result/130718_85_132aa9ae6c04e74d6da1319e238d222c/

Over a year ago, I copied someone’s code and set my htaccess file to cache images etc. using:

# Set up caching on media files for 6 months. 2012/05/01

<FilesMatch "\.(ico|gif|jpg|jpeg|png|flv|pdf|swf|mov|mp3|wmv|ppt)$">
ExpiresDefault A14515200
Header append Cache-Control "public"
</FilesMatch>

# end caching code

So isn’t caching already enabled? Yet Cache Static Content says even my favicon .ico, which this rule should cover, is not cached. Is webpagetest.org ignoring the 6-month ExpiresDefault I’ve specified? Or have I made a mistake copying this somehow? Am I getting any caching or not?

It did seem to make a difference when I put this snippet in htaccess: my pages load within a couple of seconds. Should I instead specify each type separately, perhaps go to the one-year maximum, by using:

ExpiresByType image/x-icon A31536000
ExpiresByType image/jpeg A31536000
ExpiresByType image/png A31536000

And should I add css and javascript to it, e.g.:

ExpiresByType text/css A31536000
ExpiresByType text/javascript A31536000

I’m wondering if this could be the cause of the problem. Google’s help page (Leverage Browser Caching, on PageSpeed Insights at Google Developers) says:

[i]"Expires and Cache-Control: max-age. These specify the “freshness lifetime” of a resource, that is, the time period during which the browser can use the cached resource without checking to see if a new version is available from the web server. They are “strong caching headers” that apply unconditionally; that is, once they’re set and the resource is downloaded, the browser will not issue any GET requests for the resource until the expiry date or maximum age is reached.

“…Last-Modified is a “weak” caching header … It is important to specify one of Expires or Cache-Control max-age, AND one of Last-Modified or ETag, for all cacheable resources.”[/i]

So Developers.Google says I should BOTH use Expires AND specify a date as Last-Modified. But why are both needed? Should I specify both?

If so, where and how do I specify Last-Modified, or use the “CacheIgnoreNoLastMod On” directive? Can either of these be done in htaccess, and if so, exactly how and where?
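From what I’ve been able to work out (and I may well have this wrong): for plain files on disk, Apache apparently sends Last-Modified automatically, taken from the file’s modification time, so perhaps nothing extra is needed for that part. And CacheIgnoreNoLastMod seems to belong to mod_cache, Apache’s server-side cache, rather than to these browser-caching headers. If an extra validator is wanted, I gather a single htaccess line like this would add a file-based ETag as well, though I don’t know whether it’s actually required here:

# add an ETag built from the file's modification time and size (my guess, not yet in my htaccess)
FileETag MTime Size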

Thanks a million for all and any help…
Cris

It looks like it is mostly the js and css that are not being cached: http://www.webpagetest.org/result/130718_85_132aa9ae6c04e74d6da1319e238d222c/1/performance_optimization/#cache_static_content

So adding js and css would help (but make sure you REALLY want to do that, because once those files are cached with long lifetimes they effectively can’t be changed without renaming them). It’ll only help if caching itself works, though.

Do you know if your server has mod_expires installed? If not then the Expires directives won’t be doing anything useful.
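If you want the rules to fail gracefully when a module isn’t available, one option is to wrap them in IfModule blocks. A rough sketch reusing the values from your snippet (untested against your server):

# only apply the Expires rules if mod_expires is actually loaded
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(ico|gif|jpg|jpeg|png|flv|pdf|swf|mov|mp3|wmv|ppt)$">
ExpiresDefault A14515200
</FilesMatch>
</IfModule>
# the Cache-Control header needs mod_headers, so guard it the same way
<IfModule mod_headers.c>
<FilesMatch "\.(ico|gif|jpg|jpeg|png|flv|pdf|swf|mov|mp3|wmv|ppt)$">
Header append Cache-Control "public"
</FilesMatch>
</IfModule>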

Patrick, thanks so much for your reply.

I’ve put in a support request at www.javabeanhosting.com asking if mod_expires is installed. I’ll let you know as soon as I know.

Of the 22 complete failures to “Cache Static Content” listed, there are: 5 css, 3 js, 1 jsapi, 7 png, 1 ico, 1 jpg, and 4 with no apparent type. So it seems none of the images on the page are being cached, and that looks to be nearly all of them.

Cris
[hr]
Yes, says Greg at JavaBeanHosting, “mod_expires is enabled”. He goes on to say, “I think the FAILED message is for a different reason”.

Now 13 of the 22 complete failures listed in the report are on my site LifeStrategies.net, and half of those are images, yet it reports they’re not being cached…

As noted originally, Developers.Google says I should BOTH use Expires AND specify a date as Last-Modified. Do I actually need to use both, and if so, where and how?

Thanks for your continued help.
Cris

Greg at JavaBeanHosting suggested I add “ExpiresActive On” to my htaccess file. I’ve done this and my six LifeStrategies.net image files have disappeared from the failure list. HURRAH! Here’s the report: http://www.webpagetest.org/result/130720_X1_78Z/
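For anyone following along, the relevant part of my htaccess presumably now reads roughly like this (reconstructed from the snippets quoted above rather than copied verbatim):

ExpiresActive On
<FilesMatch "\.(ico|gif|jpg|jpeg|png|flv|pdf|swf|mov|mp3|wmv|ppt)$">
ExpiresDefault A14515200
Header append Cache-Control "public"
</FilesMatch>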

My score for Leverage browser caching of static assets which was at 33/100 with 22 complete failures is now 50/100 with just 14 complete failures. My Document Complete first view is 3.254 seconds, with a repeat view of 1.578 seconds.

But it’s still rated an F; it would be good to recognize the partial success by reclassifying such results as an E…

I’ll now try putting a two-day cache (A172800 = 172,800 seconds) on css and js and see how that goes. Is this correct?

ExpiresActive On
ExpiresByType text/css A172800
ExpiresByType text/javascript A172800
ExpiresByType text/x-javascript A172800
ExpiresByType application/javascript A172800
ExpiresByType application/x-javascript A172800

This worked! My score for Leverage browser caching of static assets is up to 58/100, with just 8 complete failures and 10/12 warnings now. My first view is down to 2.354 seconds, the repeat view down to 1.484 seconds. And these improved results are still rated F. See: http://www.webpagetest.org/result/130720_6R_8JP/

Can I cache the three Google pngs, along with the google_custom_search_watermark.gif, and if so, how? These change very rarely, if at all…

It also fails to cache my page itself - http://www.lifestrategies.net/ - which has no type listed. Can this be cached, and if so, how?
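I’m guessing something along these lines would do it, assuming the page is served as text/html, with a deliberately short lifetime since the page changes far more often than the images (the one-hour value is only an example, not something I’ve tested):

# cache the html page itself for one hour (3600 seconds) - my guess, untested
ExpiresByType text/html A3600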

Looking at my latest PageSpeed Optimization Check at: http://www.webpagetest.org/result/130722_RT_XM6/1/pagespeed/

it tells me that optimizing images would also make savings.

It seems that these four Google images, one gif and three pngs, also need compressing as well as caching. Is it possible to get this to happen? If so, how?

warm regards,
Cris

You’re not going to be able to do anything about 3rd-party content on your page (well, at least while keeping the content) unless you reach out to them. For most of the Google images I wouldn’t worry about it - those are very small differences and there’s a good chance they are already in users’ browser caches from other sites.

Looking at the waterfall, I have a few suggestions that will probably get you bigger gains:

1 - Instead of https://ajax.googleapis.com… you should use //ajax.googleapis.com for including jQuery. That will get rid of the SSL negotiation time.

2 - You’re loading 2 versions of jQuery (1.7.2 and 1.8). You should eliminate one of those.

The javascript in general could use a lot of love. There are a lot of blocking scripts and widgets that should be able to be moved or made async (like the Google search box, for example). You want the browser to be able to parse and render as much of your HTML as possible before it has to execute any script. You can leave most of the async widgets in place, but it would help your rendering a lot if you moved all of the blocking script work to the end of the page.

If you look at the filmstrip you can see that most of the page doesn’t render until near the end of loading: http://www.webpagetest.org/video/compare.php?tests=130723_1J_W5T-r:2-c:0 and that’s because of the blocking scripts. Fixing those would get your content rendering closer to 0.4s instead of 2s.

Thanks a million, Patrick. Those look like immensely valuable suggestions, I’ll try to find out why it’s been done like that and how to get them changed…

Yet given the gazillions of Google searches each and every day of the year, a little time (hours rather than days) spent compressing as well as caching these four Google images would result in a tiny improvement for each user, but a big improvement for the whole world wide web. Definitely very - extremely - worthwhile.