Can anyone give a summary on this and save me the job of sniffing browser traffic? (Which I can’t do from my work PC, for the usual corporate network pain reasons.)
I guess I’m after 3 things:
Does this block downloads in any subtle way?
Is this FF only?
Does it force the browser to do a DNS lookup when it encounters the link (or only when its DNS cache is empty/stale)?
DNS lookups can be pretty frightful in some parts of the world, so I’m looking to see what we can do to alleviate them.
Nope, you’ve asked a good question, I just don’t know the answer, but please do post back what you find. You may also want to post it to the Yahoo Exceptional Performance group, which is more of a mailing list and has a bunch of people on it as well.
From everything I have read it should be completely asynchronous and start the lookups as soon as it parses them (so the higher up in the head, the better). It also sounds like Chrome supports it, so it’s not limited to Firefox. Unfortunately pagetest only works with IE, so it won’t be useful for testing it, but it could be a very interesting win for non-IE browsers (particularly for pages that reference a bunch of domains).
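For reference, the tags in question look something like this (the hostnames here are just placeholders); going by the reasoning above, they belong as early in the head as possible, before any blocking stylesheets or scripts:

<head>
  <!-- DNS prefetch hints first, so the lookups can start while the rest of the page parses -->
  <link rel="dns-prefetch" href="//static.example.com">
  <link rel="dns-prefetch" href="//cdn.example.com">
  <!-- blocking resources follow -->
  <link rel="stylesheet" href="/css/site.css">
  <script src="/js/site.js"></script>
</head>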
Wireshark would be best to be sure, but depending on the level of access you have on the box you could use something like HttpWatch (trial available) or, if you can’t install that, something like Google’s Page Speed plugin for Chrome (which will at least tell you how it performs in Chrome).
So I had a test page with some 20 discrete prefetches in the head and 1 image in the body.
IE does not prefetch, as Pat indicated
Chrome does prefetch and it does seem to do it asynchronously (as per DNS Prefetching)
Safari does not prefetch
However, I just could not seem to get FF 3.6 to do it (the network.dns.disablePrefetch setting was not set, i.e. I had default settings)
I tried disabling FF’s DNS cache by setting network.dnsCacheEntries = 0 (both prefs are shown in user.js form just after this list).
I tried restarting FF
I tried clearing the Windows DNS cache
…and all the combinations thereof.
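For completeness, a minimal user.js sketch of the two prefs involved, with the values described above (disablePrefetch left at its default of false, i.e. prefetching enabled):

// sketch of the about:config prefs used for the test, in user.js form
user_pref("network.dns.disablePrefetch", false); // default: prefetching stays enabled
user_pref("network.dnsCacheEntries", 0);         // turn off FF's internal DNS cache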
DNS prefetch is something I have been looking into, and as far as in-browser DNS prefetch tags go, it’s still a dream…
Meanwhile Google has reinvented the DNS server with one that incorporates DNS prefetch.
Normally, when the ISP’s DNS server goes back to the authoritative host to get the DNS record, it just caches it without inspecting the TTL on the record. For our site that TTL is 300 seconds, which is deliberately quite short. The next time a user requests the DNS record from their ISP’s DNS server, only then does it inspect the TTL, and if it’s expired it will go back to the authoritative host.
Google found that their Googlebot was spending more time looking up DNS records than it was spidering pages. So they made their DNS server inspect the TTL of the record and fetch it again before it expired, so the cache is kept hot.
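A minimal sketch of that keep-the-cache-hot idea, in JS just to show the shape of it (resolveRecord is a hypothetical stand-in for whatever actually re-fetches the record from the authoritative host):

// Sketch only: refresh the cached record shortly before its TTL expires,
// so a request never has to wait on a cold lookup.
function hotCache(resolveRecord, ttlSeconds) {
  var cache = {};
  function refresh() {
    cache.record = resolveRecord();                           // hypothetical resolver call
    setTimeout(refresh, Math.max(ttlSeconds - 5, 1) * 1000);  // re-fetch ~5s before expiry
  }
  refresh();
  return cache;
}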
If you give yourself a tight target like a “1 second homepage” then you may see 700ms wasted in DNS resolving. If that’s the case, give the Google DNS server a go: it’s available on 8.8.8.8 and 8.8.4.4, and it’s quite shocking the difference it can make.
I think you may be crossing some wires. The Google DNS service is something your users would use and configure for, not something you can do from the server side to improve your performance (unless you encourage all of your users to switch DNS providers).
I agree that they are doing a lot of really cool things in the space, it’s just not something site owners can directly leverage.
[attachment=182]I thought I’d have another look at this, now that dns-prefetch is official HTML5. I ran the following in FF (because I know how to disable FF’s DNS cache). (And yes, this is truly vile!)
My expectation was that the look-ahead preparser would recognise nodejs.org and www.w3.org as being in the HTML, but would not recognise www.w3schools.com as it is in the JS, my understanding being that look-ahead preparsers do not read JS.
To my surprise, the DNS lookup for www.w3.org occurs after the sleep. (VRTA graphs this beautifully - see attached.) Pic1 is with dns-prefetch tags commented out, Pic2 with them in.
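For reference, the dns-prefetch tags being toggled are the standard HTML5 form, e.g. for the JS-loaded host:

<link rel="dns-prefetch" href="//www.w3schools.com">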
The horrid code? Enjoy:
<img src='http://nodejs.org/images/logo.png'/>
<script type="text/javascript">
(function() {
var startTime = new Date().getTime();
while (new Date().getTime() < startTime + 1000);
}())
</script>
<img src='http://www.w3.org/2008/site/images/twitter-bird'/>
<script type="text/javascript">
(function() {
var d = function() {
var img = document.createElement("img");
img.src = "http://www.w3schools.com/images/w3schoolslogo.gif";
var s0 = document.getElementsByTagName('script')[0];
s0.parentNode.insertBefore(img, s0);
};
setTimeout(d, 1000);
}())
</script>
[hr]
Daftness. Nothing will happen during the sleep. D'oh! Nice to know that the prefetch directive works now, though.
How about this one? Same behaviour: no auto prefetch for the second image, from www.w3.org. Same behaviour in FF and Chrome. So I had naturally assumed that I only needed explicit prefetch for my JS-loaded resources…
I don’t know if this is an old post, but pagetest now works with Chrome and FF as well, right?
If so, I’d be interested in knowing how this affects performance. Any idea if a site using static0-static3 subdomains could benefit from this? I think so?
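If it does, I imagine the hints would just be one per static subdomain, something like this (example.com standing in for the actual domain):

<link rel="dns-prefetch" href="//static0.example.com">
<link rel="dns-prefetch" href="//static1.example.com">
<link rel="dns-prefetch" href="//static2.example.com">
<link rel="dns-prefetch" href="//static3.example.com">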