Broken test results

The main page is at: http://www.webpagetest.org/result/110302_0R_31YR/
First off, the top two rows are text and are gzipped; I have confirmed this. Second, seven days is plenty short enough for a cache lifetime. Third, there is no reason to use a CDN for so few assets. Fourth, you should only alert someone if there is NOT an ETag on cached results; I’ll assume that is just a bug. On the PageSpeed tab it again suggests compression, which is already done. It also suggests a Vary: Accept-Encoding header, which is sent. All together this means that, other than suggesting I use a CDN for no reason (which would actually slow everything down) and suggesting that I drop ETags (which would remove the ability to make sure that no cache is out of date), you just show broken results.

The score for compression is N/A (Not Applicable) because we don’t have visibility into SSL gzip headers (Page Speed isn’t aware of that, which is why it is recommending it - I’ll see about disabling the PageSpeed compression checks for SSL content if possible).

If 7 days is fine for you then just ignore the caching grade (though your ACTUAL caching of the images is broken because of the Vary: Accept-Encoding header on images, which prevents IE from caching them - look at the repeat view: http://www.webpagetest.org/result/110302_Z4_086a20beb6ba2ecbe41664ab5aa50aab/1/screen_shot/#aft ). We recommend at least 30 days of caching but the number is fairly arbitrary. Once you can cache for a long period you’ve already done the hard work, and the difference between 7 days and a year is completely artificial.

Depending on where your user base is (in relation to the server), a CDN could still help quite a bit in your particular situation because the round trips for SSL negotiation are driving your load times and the only way to reduce those is with a CDN (or to inline the CSS and PNGs, which might actually be a better option for you).

The ETag warning is because ETags are usually not necessary for a site and are easy to configure incorrectly. Unless you have content that needs something like a hash-based validation you are better off just using max-age, Expires, and Last-Modified and removing the ETag headers. It’s not really that big of a problem, though, which is why it’s not exposed on the main results page. Here is more information on the reasoning behind the ETag checks: Best Practices for Speeding Up Your Web Site - Yahoo Developer Network
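If you do decide to drop ETags on Apache, a minimal sketch (assuming mod_headers is enabled) would be:

```apache
# Stop Apache from generating ETags for static files
FileETag None

# Strip any ETag header a module may still add (requires mod_headers)
Header unset ETag
```

With Expires, Cache-Control, and Last-Modified still in place, validation falls back to If-Modified-Since, which works the same across load-balanced servers.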

I have “FileETag MTime Size” set, which gets around the inode issue should I ever use multiple servers, and still gives the smaller 304 Not Modified response. I’m actually saving room by doing that. If you’re going to warn about it at all, you should check whether it’s working, which it is.
Beyond that, here is the initial response for my one image:
Date: Wed, 02 Mar 2011 23:42:01 GMT
Server: Apache/2.2.16 (Debian)
Last-Modified: Wed, 02 Mar 2011 17:07:37 GMT
ETag: "663-49d82f31e4840"
Accept-Ranges: bytes
Content-Length: 1635
Cache-Control: max-age=604800
Expires: Wed, 09 Mar 2011 23:42:01 GMT
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: image/png

No Vary: Accept-Encoding header at all. I think your system is rather busted. I was hoping it would give more insight than PageSpeed alone, but it seems not. Can you give any reason for using 30 days instead of 7?

Yes, yours are configured fine and will not cause a problem, but they are also adding no value since you also have Expires, Last-Modified AND Cache-Control. The goal is to NEVER have 304 responses because those are just about as expensive as a regular 200 (particularly in the https case where you have to pay 4 round trips for a new connection). That’s also why 30 days is better than 7 - someone coming back to your site after weeks of not visiting will still be able to use the cached version of the resource and not have to do an If-Modified-Since check.
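As a sketch of that longer-lived caching on Apache (assuming mod_expires is enabled - `a2enmod expires` on Debian):

```apache
# Give static assets a 30-day freshness lifetime so returning
# visitors don't have to make an If-Modified-Since round trip.
# mod_expires emits both the Expires header and a matching
# Cache-Control: max-age value.
ExpiresActive On
ExpiresByType image/png "access plus 30 days"
ExpiresByType text/css  "access plus 30 days"
```

The exact types and lifetime are illustrative; the point is that once assets are safely cacheable, a longer max-age just saves the revalidation traffic.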

Here are the response headers for the validhtml image (validcss have the same): WebPageTest Test - WebPageTest Details

HTTP/1.1 200 OK
Date: Wed, 02 Mar 2011 18:37:05 GMT
Server: Apache/2.2.16 (Debian)
Last-Modified: Wed, 02 Mar 2011 17:07:37 GMT
ETag: "21f6b-663-49d82f31e4840"
Accept-Ranges: bytes
Content-Length: 1635
Cache-Control: max-age=604800
Expires: Wed, 09 Mar 2011 18:37:05 GMT
Vary: Accept-Encoding
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: image/png

Sorry you don’t find the tool useful. The real benefit comes not just from the recommendations and grades but from letting you see how your page loads under various network conditions and from giving you the information you need to further optimize your pages. Automated recommendations can only go so far and are really only useful for the really broken sites (of which there are a surprisingly large number).

In this case, your page barely has anything on it so there’s not really a lot to optimize. That said, for this SPECIFIC page you can figure out how to make it quite a bit faster just by looking at the waterfall:

1 - Does this page really need to be over https? Serving it over http would be at least twice as fast since there are so few requests and each one is essentially on a new connection.

2 - The “valid HTML” and “valid CSS” images aren’t being cached by IE. You can see them being requested in the repeat view test, and it’s because of the Vary: Accept-Encoding header being applied to images. Fix that and those images will not be requested on a repeat view.
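One common way that header ends up on images is a blanket mod_deflate or `Header append Vary` directive. A sketch of a fix, assuming Apache 2.2 with mod_deflate, is to compress only text types so images never pick up the Vary header:

```apache
# Compress (and therefore Vary) only text responses; binary types
# such as image/png are left alone, so no Vary: Accept-Encoding
# header is added to them.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
```

If the Vary header is instead being appended explicitly with mod_headers, wrap that directive in a `<FilesMatch>` limited to the compressible extensions.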

3 - The CSS is trivial and tiny. Put it directly in your HTML and avoid the additional request.

4 - Unless you are getting some value from the valid HTML and valid CSS images, just remove them - I seriously doubt your users care. If it’s just a test page and you wanted placeholder images then that’s fine. For the more modern browsers you can also use data URIs to embed the image data directly in the HTML and reduce the whole page to a single request.
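A sketch of those last two suggestions combined (the style rule is a stand-in, and the base64 payload is truncated here for illustration - a real page would carry the full encoded PNG):

```html
<!-- CSS inlined into the document, saving one request -->
<style>
  body { font-family: sans-serif; }
</style>

<!-- Image embedded as a data URI (payload elided), saving another -->
<img src="data:image/png;base64,iVBORw0KGg..." alt="Valid HTML">
```

Data URIs aren’t supported by IE7 and earlier, which is why this is framed as an option for the more modern browsers.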

Make those changes and the load time would drop from 1.2s to 0.2s

-Pat