Is there ANYTHING else I can optimize?

I think I’m reaching the end; I can’t think of much else I can do to speed up my site other than moving to a dedicated server (I’m currently on a VPS).

The following test was taken a few moments ago:

http://www.webpagetest.org/result/110310_7D_4NP2/

It suggests two more CSS and two more JS files could still be combined, but I’m leaving that as is.

Not sure what I can do with the ETags, or if I should.

Other than those two items, is there anything else I may be overlooking?

Kind regards,

Marvin

I’d say the biggest win you have remaining would be to move the javascript out of the way, but that might be more of a site overhaul than you are looking to do. It would probably be the change with the biggest remaining impact on the user experience.

As a sort of proxy I captured video of the current site and the site with the javascript blocked: http://www.webpagetest.org/video/compare.php?tests=110310_92_4P4D-l:Normal,110310_V9_4P4E-l:No+JS

It looks like there’s around a 300ms benefit to the user experience of seeing the page content by moving the javascript out of the critical path.

Pat, I’d very much like to save the 300ms, but removing the javascript is beyond my capabilities. :slight_smile:

P.S. Hmm, but it is now something that will be on my mind for a long time…

Pat, several months have passed since you posted the above. It has always been on my mind, and today I finally started working on removing the .js from my site. I’m removing it for guests only for now. The tests look good. I still have to remove a few .js files from inner pages, which should be completed shortly. And then I will have to test the site well to see if I’m overlooking some broken functionality. :slight_smile:

Start render has improved from 600-700 ms down to 400-500 ms.

Here is a link to a test result: WebPageTest Test - Running web page performance and optimization tests...

Wow, that’s really impressive.

Feeling adventurous?

It’s pretty extreme, but one more thing you can do is inline the css for a first-time visitor but still reference the external css for repeat visitors (so it can come from the cache). Looks like that might shave another 100ms off of the render time.

It’s a bit tricky but the logic basically works like this:

  • For page loads where cookie X is not set, inline the css and add a bit of javascript (Y). If cookie X IS set, refer to the external version like normal.
  • Javascript Y creates a hidden iFrame a few seconds after onload that loads a special page (let’s call it precache.php).
  • precache.php is an empty html document that loads any external resources you want in the browser cache (the css file in this case) and sets cookie X (so cookie X being set generally means the user’s browser has cached the external version of the css) - see the rough sketch after this list.
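
Here’s a rough sketch of how those pieces might fit together in PHP. The file names (precache.php, my.css) and the cookie name (css_cached) are just placeholders for illustration, not anything from your actual site. In the main page template, only emit “javascript Y” when cookie X is missing:

<?php
// Only first-time visitors (no cookie yet) get the inline css plus "javascript Y"
if (!isset($_COOKIE['css_cached'])) {
  echo '<script type="text/javascript">
    // A few seconds after onload, load precache.php in a hidden iframe.
    // (In a real page you would chain this onto any existing onload handler.)
    window.onload = function() {
      setTimeout(function() {
        var f = document.createElement("iframe");
        f.style.display = "none";
        f.src = "/precache.php";
        document.body.appendChild(f);
      }, 3000);
    };
  </script>';
}
?>

And precache.php itself is just an empty document that references the external stylesheet (so the browser caches it) and sets the cookie:

<?php
// precache.php: set cookie X for ~30 days, then serve an empty page
// whose only job is to pull /my.css into the browser cache.
setcookie('css_cached', '1', time() + 30 * 24 * 3600, '/');
?>
<html>
<head>
<link rel="stylesheet" href="/my.css" type="text/css">
</head>
<body>
</body>
</html>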

Yahoo used to (may still) do something like this for their home page to get really fast first visits without having to also inline all of the code for repeat visits.

Awesome work regardless. It’s easily one of the fastest pages I have seen come through here.

Pat, thanks for another idea! My only concern is that I’ve heard people say search engines, for the purposes of ranking sites, also look at the code-to-text ratio of pages, and if I add my .css to my base html page, I in effect add about 6,000 characters of code there. Do you think the code-to-text ratio may matter at all in regards to SEO (Search Engine Optimization)?

Here’s a slow-motion video I just did. It compares the load time of Google.com vs the load time of my hobby site: Video

Hmm, good question. I’ve heard similar rumors but have never seen anything concrete. It’s probably worth asking on the google webmaster forum to see if you can get some clarity on it.

The video is really impressive.

FWIW, I poked around a bit and it looks like the code-to-text ratio concern may have originally come from a limitation where the googlebot would only download 100k for each page. That limit has long since been eliminated/raised, and from everything I have read the amount of code doesn’t matter at all, as long as the content can still be extracted and has the same semantic markup (i.e., you’re not injecting the content from javascript).

Thanks, Pat, very much. That’s a very helpful piece of information. I have also done a lot of searching and reading on the topic last night and this morning, and I have decided I’m going to test how the site speed changes when I inline the external CSS. It may take me a while to do it, but I will post my results (hopefully within days).

If it helps any, the way I do it when I want to inline my css from php is a really simple toggle. I keep the css file external but just use something like:

// $inlineCSS is a simple flag that decides which version of the css to emit
if ($inlineCSS) {
  // Embed the stylesheet directly in the page (no extra request for first-time visitors)
  echo '<style type="text/css">';
  echo file_get_contents('my.css');
  echo '</style>';
} else {
  // Reference the external file so repeat visitors can get it from cache
  echo '<link rel="stylesheet" href="/my.css" type="text/css">';
}

Makes it really easy to turn on and off and gives you a good basis for doing the cookie-based logic.
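
Tying that toggle back to the cookie approach from earlier, $inlineCSS could just be driven by whether the precache cookie is present (the cookie name here is only a placeholder):

// Inline the css only for visitors who don't have the precache cookie yet
$inlineCSS = !isset($_COOKIE['css_cached']);

Then the existing if/else picks the inline or external version automatically.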

Pat, your suggestions, ideas, and help are very much appreciated. I have removed the external css file and placed it inline. I did a few comparison tests. I have mixed feelings about the results, and I will have to test more at a later time to decide how to proceed.

I have tested from two of your test locations: the nearby Dulles, VA, and the faraway Wellington, New Zealand.

In nearby Dulles, VA, the Start Render time has clearly improved, from about 480 ms down to about 310 ms! The overall load time has not changed much. The tests below show about a 50 ms improvement, but the load time does vary, and I would have to perform many more tests to conclude whether it’s an improvement or not.

In faraway Wellington, New Zealand, the Start Render before/after times are about the same (1.1 sec), and the Load Time actually seems a bit worse (by about 100 ms: 1.6 sec vs 1.7 sec).

For the time being I’m leaving this feature disabled, but I will play with it more at a later time.

Here are links to test results (10 tests per page):

Dulles, VA

Wellington, New Zealand

Well, you’re certainly well into the range where the improvements could be inside of the measurement variation - good place to be :slight_smile:

I can’t remember - are you on a dedicated server that you control?

If so, and you are not running a Linux kernel >= 2.6.39, you can also get a boost by upgrading the kernel. Linux 2.6.39 boosted the initial TCP congestion window from roughly 4KB (3 packets) to 10 packets (around 14KB) - the amount of data the server will send in the first round trip. For a site as fast as yours it can make a pretty big improvement.

Pat, the site is on a VPS, but I may ask the host about the settings.

I’ve done more tests with the CSS in the base file. I tested it in all of the NA and EU test locations (most of my traffic is from NA and EU).

In NA/EU the start render has improved, and the load times seem to be slightly better, too, so I will leave it like this for now.

The load time in New Zealand has worsened by 100 to 200 ms, but the Australian test center shows a tiny improvement.

I don’t think I will bother with serving external CSS to users with cookies at this time. It seems like a bit too much work to modify vBulletin to achieve that.

Pat, I was looking through my vBulletin’s Admin Control Panel, and I found a server information line that mentions the number 2.6.18.

I’m guessing that 2.6.18 is the kernel version number. Do you think that even though I’m on a VPS, if they moved me to a server running a Linux kernel >= 2.6.39, I would still benefit?

Wow, 2.6.18 is from 2006. I hope they are keeping up to date on security patches for your VPS.

It won’t be a HUGE gain in performance, but it will usually cut the content download time for the first resource on a new connection in half (if that resource is over 4KB). My guess would be that it’s good for ~100ms on the load time (and maybe 50ms on render), but since your page is so well optimized it would be a reasonably large win percentage-wise.

I’d love to gain another 50 to 100ms, but it will not be easy, if at all possible, with my current VPS host:

All of their VPS products run the 2.6.18 kernel, and the only way to get a different kernel with this host would be to go dedicated, which is something I’m not ready to pay for, since this is really just a hobby site.

The host is SERVINT, a reputable host with reliable service. I switched to them a few months ago, and I have not experienced a single minute of down time ever since. I wouldn’t mind switching to another hosting provider, but I suspect it wouldn’t be easy to find the same reliability and performance.

And honestly, it’s really not worth it for the 50-100ms on that one page. I’d recommend investing the effort into the rest of the site to make sure the experience is fast across the board.

If you haven’t already tried it, New Relic ( http://newrelic.com/ ) is great for giving you visibility into the end-user and back-end performance of the pages that users are visiting. I haven’t tried it against vBulletin, but the basic php and mysql support should let you drill in pretty quickly. I think they have a 30-day free trial, which should give you enough information to work with.

Pat, wouldn’t the newer kernel affect the entire site, not just the home page?

Thanks for the link, I’ll look into it.

Servint, WiredTree, Knownhost, and Liquidweb have all told me they run a 2.6.18 (or older) kernel. Apparently, Virtuozzo doesn’t support 2.6.39 at this time.

Since it’s a VPS, there shouldn’t be any reason why you wouldn’t be able to upgrade the kernel version yourself. VPSs are, by their nature, virtual dedicated machines. Assuming you have root access, you should be able to do something like this:
http://www.cyberciti.biz/faq/linux-kernel-upgrade-howto/

Just keep in mind that your hosting may not be able to offer you support if you run into issues.