Asynchronous JavaScript

In order to reduce page load times, I want to asynchronously load all the Facebook/Twitter rubbish that now seems obligatory on a webpage. I have tried to do that by putting it all in a PHP include placed near the bottom of every page. The code looks like this:
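It's essentially the standard asynchronous loader boilerplate that Facebook, Twitter and Google publish, gathered into one include. Simplified, the pattern looks something like the sketch below (the script URLs and element IDs here are the commonly published ones, not necessarily exactly what my include uses):

    (function () {
        // Inject a script tag that downloads without blocking the HTML parser.
        function loadAsync(src, id) {
            if (document.getElementById(id)) return; // don't load the same widget twice
            var js = document.createElement('script');
            js.id = id;
            js.async = true;
            js.src = src;
            var first = document.getElementsByTagName('script')[0];
            first.parentNode.insertBefore(js, first);
        }
        loadAsync('//connect.facebook.net/en_US/all.js#xfbml=1', 'facebook-jssdk');
        loadAsync('//platform.twitter.com/widgets.js', 'twitter-wjs');
        loadAsync('https://apis.google.com/js/plusone.js', 'gplus-js');
    }());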

Unfortunately it doesn’t work. :@

Without that code, my page loads in about 1.9 seconds. With the extra code it takes 5 seconds, and WebPageTest clearly shows that the various Google/Facebook/Twitter files are loading early in the page load process and consequently slowing the whole thing down.

Am I doing something wrong? Well, obviously I am, but what? Please be gentle with me: I'm a JavaScript virgin. :angel:

Any chance you have a link to a test result you’re willing to share?

For starters, async JavaScript still blocks onload in all modern browsers (IE < 9 being, I believe, the main case where it does not). Loading the scripts async just lets the browser fetch them in parallel without blocking the DOM parser, but depending on when you initiate the async load it could actually make load times longer (by starting the asynchronous load later). With async scripts you're actually better off kicking them off earlier and just letting them do their thing in parallel with the rest of the page loading activity.
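So if you want the async approach to actually help, start the loads near the top of the page. A rough sketch, assuming a small inline script in the head rather than a bottom-of-page include, so the downloads begin while the HTML is still being parsed:

    (function () {
        // Kick off the widget downloads early; async keeps them from blocking
        // the parser, and starting sooner means they finish sooner.
        var urls = [
            '//connect.facebook.net/en_US/all.js#xfbml=1',
            '//platform.twitter.com/widgets.js',
            'https://apis.google.com/js/plusone.js'
        ];
        for (var i = 0; i < urls.length; i++) {
            var js = document.createElement('script');
            js.async = true;
            js.src = urls[i];
            document.getElementsByTagName('head')[0].appendChild(js);
        }
    }());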

Depending on how complicated the page is, the browser's parser might actually be getting to the end of the HTML really quickly. Any images it encounters along the way are just scheduled to be loaded while the parser continues on with the DOM. If you use Chrome or Firefox as a test agent you will see a pink vertical bar marking the DOM Content Loaded event, which is basically the point in time where the parser finished building the DOM and reached the end of the HTML.
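If you want to see the two milestones for yourself, a quick-and-dirty sketch like this works (timings are rough because they are measured from when the inline script runs, not from navigation start):

    var start = Date.now();
    // DOMContentLoaded fires when the parser reaches the end of the HTML;
    // onload also waits for images and the async widget scripts to finish.
    document.addEventListener('DOMContentLoaded', function () {
        console.log('DOM ready after ' + (Date.now() - start) + ' ms');
    });
    window.addEventListener('load', function () {
        console.log('onload after ' + (Date.now() - start) + ' ms');
    });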

I did a little bit of a shuffle around and seem to have got the page load down to about 4.6 to 4.8 seconds. Here is a sample test result:
[url=http://www.webpagetest.org/result/120817_80_D2P/]http://www.webpagetest.org/result/120817_80_D2P/[/url]

It looks like all the DOM content is complete in 3 to 3.5 seconds, which still seems pretty poor.

On the other hand, I ran my own test by clearing cache, cookies and everything out of IE9 and then loading the page. It’s hard to put a figure on how fast it loaded because it was too quick to time accurately. Certainly it was entirely acceptable. It looked like a second or so. Maybe I’m being hyper-critical and should live with what I’ve got and move on to more important things.

FWIW, IE 9’s dev tools have a network trace view that should give you a waterfall as well as an indication of how long the page took to load.

How fast is your connection to the Internet and how far away is the server? A low-latency high-bandwidth connection could pretty easily account for the difference.

It does look like the user-visible time is largely dominated by the back-end (time to first byte) of about a second, followed by all of the social widgets, which visitors won't really care about, so the page should feel like it is loading quite fast.

If you turn on video capture you can see what the visual experience felt like from the perspective of the test agent.

My connection is only 2.6 Mbps (real, measured with a speed test), so I think most people will have a connection at least as good.

The network trace in IE9 was quite interesting, though not quite as easy to read as WebPageTest.

For the moment, I think I’ve convinced myself that the page load is good enough to move on to other things, like actually getting the site completed and operational. Once it’s up and running, I’ll have another look to see if I can gee it up any further.

Thanks for your help, Patrick.