How many "rules" are really necessary?

I usually look through the WebPagetest logs daily to get a feel for how the sites people are testing look from an optimization perspective, and I’d say well over 90% aren’t even doing the basics (I plan on pulling stats together to get actual figures, but that’s a good ballpark). Part of me wonders if people are getting overwhelmed and if some of the things that are checked even really matter in the grand scale of things. I think there’s a really big fall-off in impact once you get past the first few rules, so I figured I’d spark a little discussion on the topic.

Sort of a mix between YSlow and Pagetest rules, here are the ones that I think have the largest impact and are universal (aka “rules”):

Minimize HTTP requests: Be it by combining css/js, using image sprites or otherwise, this is absolutely critical to improving performance.

Enable Persistent Connections: Saving a round trip for every request can add up pretty fast and it’s scary how often it isn’t enabled.

Properly Compress your content: Gzip your text and make sure your images are properly compressed (smush.it or otherwise).

Make your static content cacheable: This only helps repeat visitors, but it’s also scary how often it isn’t done.
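
To make the persistent-connection, compression and caching rules a bit more concrete, here is a minimal sketch of the response headers involved, written as a toy Node.js handler in TypeScript. The stylesheet body and port are placeholders, and most sites would get the same effect from their web server configuration (Apache’s KeepAlive, mod_deflate and mod_expires, for example) rather than from application code.

```typescript
// Toy static-content handler illustrating the header side of the rules above.
// Purely a sketch: the stylesheet body and port are made up.
import * as http from "http";
import * as zlib from "zlib";

const cssBody = "body { margin: 0; }"; // stand-in for a combined stylesheet

const server = http.createServer((req, res) => {
  // Compress the text content before sending it.
  const gzipped = zlib.gzipSync(Buffer.from(cssBody));

  res.writeHead(200, {
    "Content-Type": "text/css",
    // Persistent connections: HTTP/1.1 keeps the socket open by default;
    // reusing it avoids a fresh TCP handshake for every request.
    "Connection": "keep-alive",
    "Content-Encoding": "gzip",
    "Content-Length": gzipped.length,
    // Cacheable static content: repeat visitors can reuse this for a year.
    "Cache-Control": "public, max-age=31536000",
  });
  res.end(gzipped);
});

server.listen(8080);
```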

I’m kind of on the fence about CDNs. They can hide a lot of sins (particularly around the number of requests), and they only apply to sites that are large enough to justify one but that don’t have their own geo-distributed infrastructure.

Everything else is kind of interesting to look at if you are trying to squeeze every last bit of performance out of your site, but the payoff drops significantly and there are usually a lot of caveats that keep those checks from being universally applicable. Once you reach that level of optimization I think you’re going to be hand-analyzing the waterfalls and the pages anyway to find your specific bottlenecks (like JavaScript selectors, etc.).

Thoughts?

-Pat

Pat,

your 4 rules are spot on and can be considered the basics. Cover these basics and your site will most likely be in good shape.

I have three suggestions for improving WebPagetest.

  1. Weight
    Currently, when looking at a WebPagetest report (tab: Performance Review), it’s not clear which problems are the most important.
    I think it would add value to assign a weight to each checkpoint: Cache Static = high, No ETags = low, …

  2. CDN
    For many websites, using a big, global CDN like Akamai does not make sense.
    I’ve done many tests on high-traffic sites that only serve people in The Netherlands, and WebPagetest shows a bunch of errors in the report for ‘Use a CDN’. It’s not correct to mark this as a problem.
    My suggestion: following YSlow, make it an option to deactivate the CDN check before running the test.

  3. JavaScript
    Nowadays, JavaScript contributes heavily to the slowness of web pages.
    You can have a web page that follows the four rules above and is still slow because of the JS on the page.
    Checking for JS in the page is a good start (as YSlow does), but there is more to it. Perhaps you could add a check for inline scripts, which have a strong blocking effect.
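To illustrate the blocking point with a sketch (the script URL is a placeholder): an inline script sitting in the page stops the HTML parser until it has finished running, while a script injected from code lets the browser keep parsing and rendering around it.

```typescript
// Sketch of loading a script without blocking HTML parsing.
// The URL is hypothetical; the inline alternative would be a <script> block
// in the page body, which halts parsing and rendering until it executes.
function loadScriptAsync(src: string): void {
  const s = document.createElement("script");
  s.src = src;
  s.async = true; // execution order does not matter for this script
  document.getElementsByTagName("head")[0].appendChild(s);
}

loadScriptAsync("http://example.com/widget.js");
```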

I hope this helps.

- Aaron

Thanks for the feedback…

They are theoretically sorted from most important to least (left to right), but I’m not convinced the current sort order is correct either (it puts persistent connections in the middle of the pack). I think I’m going to redo the checklist UI to make it a lot clearer, calling out the items above specifically in an “important” section and putting the others off to the side in an “other checks” bucket.

Yep. I haven’t decided how best to expose this. I think I still want to run all of the checks at test time and then decide what to show when you go to look at the results. The question is where to store the decision about what to display. Do I tie it to a user-based cookie (or login) so that all of the tests you look at show the same checks? That works well as a preference, but if you want to share a test result with someone, they will see different checks. Maybe I can store the preferences of the person who ran the test as the defaults for that test and override them if the viewer has different preferences.

Gets to be a bit of a hairball so I need to spend a little bit of time thinking how best to do it.

Yep. We’ve started to play with this lightly, but looking at it from the execution path. Right now I’m checking for inefficient jQuery selectors, which we have seen seriously bog down our sites. It’s only in the text form of the report currently, but once we have a good set of checks, a “JavaScript best practices” check in the checklist is probably warranted.
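
For a feel of what that check flags, here is a rough sketch (the id and class names are made up, and $ just stands in for jQuery as loaded on the page) comparing a selector that forces a broad right-to-left DOM walk on browsers without querySelectorAll against one anchored on an id:

```typescript
// Sketch of the jQuery selector patterns the check is meant to flag.
// Names are hypothetical; the declaration just stands in for jQuery's $.
declare const $: (selector: string) => { find(selector: string): unknown };

// Slow on browsers without querySelectorAll: the engine matches ".price"
// against a large part of the document before filtering by the rest.
const slow = $("#catalog table.products td .price");

// Faster: jump straight to the id, then search only inside that subtree.
const fast = $("#catalog").find(".price");
```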

A CDN is unnecessary if you only serve users in a well-connected region. For example, we find that we can serve users in Western Europe from Germany with no noticeable benefit from Akamai.

However, if you want to serve more than one continent from one location, a CDN can be very helpful (China counts as a separate continent due to its censorship walls). For example, some page loading times in Asia went down by a factor of 3 using Akamai instead of serving from Germany.

Hi Pat,
Great post. I am kind of a novice in this area, and it sounds like Google is going to put some emphasis on fast-loading sites. I didn’t understand how to improve all these parameters on my site. Can you do a small “how to” guide on how to improve each one?

I’m working on some docs right now. Haven’t gotten to the “how to fix” info yet but it should be ready in the next couple of weeks.

Thanks,

-Pat

Thanks, I am sure this will benefit the whole community.
Please let me know when you have something.
