< 0.5s load time (showing off)



Well… it should be mentioned that:

  1. I have a very, very simple homepage
  2. The test was done on a FiOS connection
  3. The homepage is NOT an important landing page…

It could possibly get faster if the quant.js file loaded asynchronously or after the onload event.
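For reference, a deferred load like that can be sketched roughly as below. This is a hypothetical sketch, not the site's actual code; the quantserve URL is just the usual public tag location.

```html
<!-- Hypothetical sketch: inject quant.js only after the onload event
     so it never blocks the initial render. Note this overwrites any
     existing window.onload handler, so merge handlers in real use. -->
<script type="text/javascript">
window.onload = function () {
  var s = document.createElement('script');
  s.src = 'http://edge.quantserve.com/quant.js';
  s.async = true;
  document.getElementsByTagName('head')[0].appendChild(s);
};
</script>
```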

It would be neat if Patrick had some sort of competition sometime.

Could have the same page given to people that enter the contest. Would have to keep test results private until the end of the competition.

Now that would be fun :slight_smile:

I actually have a framework together for a competition but probably not the kind you’re looking for :slight_smile: We used it for some internal training and I’m trying to get it together for a public version. Essentially the way it works is like this:

  • Everyone gets the same starting page to optimize
  • Each user (or team) has a login to http://compete.webpagetest.org/ (results from the internal competition are there right now)
  • On the competition site you can upload a new version of the page and it will automatically kick off a performance test and update the leaderboard
  • For a given team it will keep archives of each iteration available as well as the performance of each package
  • The test destination has 10 sharded CDN paths available
  • At least as of right now it is set up to only allow static HTML and files (allowing PHP execution was too risky but I may see if I can find a good way to sandbox each site)
  • The rules were that the site had to maintain all of the functionality of the existing site (so there was some level of judging in addition to the raw performance numbers)

I thought it would be fun to do but eventually it might be a good way for someone to get their site optimized. They provide the page as well as a prize and teams compete to provide the fastest version of the site.

Too much going on right now though so I haven’t had time to polish it up for a public competition.

Thanks for the tip about quant.js… a few months ago when I checked, I believe they had the nasty document.write (unsure)… but that's not the case anymore… I will move it to after onload. I should also defer http://www.google.com/coop/cse/brand?form=cse-search-box&lang=en because all it does is change the styling of the Google CSE search box.

+1 for the competition… sounds fun… but there need to be some clear guidelines about the page being optimized and what should be on it… maybe give a sample page with many things on it, with everyone optimizing that same page… I'm sure lots of new ideas can come out of it.

Right, check line #1 in the rules - Everyone gets the EXACT same page to optimize :slight_smile:

A competition sure would drive some traffic to your website if it was properly advertised especially if there was a nice prize.

Prize-wise, a lot of coders might participate if it was something as simple as a random banner with their company info on it. There could also be a link to the competition on the banner.

I am not sure if dynamic PHP, CFM, ASP, etc. pages would be a fair testing basis though. The results would vary depending on what dynamic content loaded during the test.

Would be something neat for the future.

Aha… I didn't notice your reply before I replied… you are too fast :slight_smile:

It should be mentioned that:

  1. I have a somewhat complex homepage
  2. The test was done on a FiOS connection
  3. The homepage IS an important landing root page…


Hey There,

You could make a sprite image to combine your image resources.
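A minimal sketch of the sprite idea, with made-up file names, class names, and offsets:

```html
<!-- Hypothetical CSS sprite: several small icons combined into one
     icons.png; background-position selects each 16x16 slot.
     One HTTP request replaces several. All values are invented. -->
<style type="text/css">
.icon        { background: url(icons.png) no-repeat; width: 16px; height: 16px; display: inline-block; }
.icon-home   { background-position: 0 0; }       /* first slot */
.icon-search { background-position: -16px 0; }   /* second slot */
</style>
<span class="icon icon-home"></span>
<span class="icon icon-search"></span>
```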

If you served just a bare body document, created an iframe, and then on the body onload event loaded all the content into the iframe, the page would look the same but the 0.201 would be closer to 0.120, if that. My point is you should probably look at the entire document load time. That would defeat some black-hat loading attempts :slight_smile:
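The trick being described could look something like this sketch, shown only to illustrate how the Document Complete metric can be gamed; real-content.html is a hypothetical page holding the actual site.

```html
<!-- Near-empty main document: onload fires almost immediately, giving
     an artificially early Document Complete; the real page is then
     injected into an iframe afterwards. File name is hypothetical. -->
<body onload="
  var f = document.createElement('iframe');
  f.src = 'real-content.html';
  f.style.cssText = 'border:0;width:100%;height:500px';
  document.body.appendChild(f);">
</body>
```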

I would not consider it a complex page with only 14 resources. The page is also static. Any page worthy of a competition should have at least 100 resources, with room for basic techniques such as gzipping text resources, medium techniques such as creating CSS sprites, and more advanced techniques like image preloading.

I think as long as the page speed is between 1 and 2.3 seconds, the overall concept is pretty good. Past a point the difference is not noticeable with the naked eye.

Travis Walters

Iframes are not good… from a usability viewpoint.

Hmmm… after the Document Complete event, the page loads:

  • opensearch.xml (open ref)
  • favicon.ico (wiki ref)
  • one JavaScript file, 201005d.js (goog ref)
  • and 4 images: the same content as the 4 images already loaded, only at far better quality (no ref)

So, the question is - where do you see black hat? It is not even gray hat :slight_smile:

But it was. See?
Edit: more bad

Sorry. I strongly disagree.

Hey LaBoot,

I was not saying you are using black-hat methods at all. I was saying that there is a way anyone could make Document Complete come extremely early by having only 2 resources (1. the main document and 2. an iframe). For a competition, it would be better to look at the time it takes for the entire document to load.

It looks like you definitely cut down on the resources a bit. It is a bad comparison, though, because you are comparing FiOS to ADSL. The locations are at least the same for the comparison, so that's good.

It takes about a 0.4 second difference for me to notice a change in load time, unless you are viewing it in slow motion.

I was not discrediting your hard work by any means. I was just trying to give you a few ideas. Also a CDN would make it even faster. :slight_smile:

Travis Walters

For a competition there would be a human element to the judging so the iFrame technique wouldn’t work (other than to freak people out on the leaderboards :wink: )

I tried the CDN thing (2 companies), and in my case it is not faster. Things get worse (slower).

One more thing about…

Look at goog – .ico, .js, and 2 more files are downloaded after the Document Complete event.

Google is either evil :huh: or black hat :cool: or creative (speed wise) :idea:

Then you weren’t using the right CDN. CDNs are very basic in their concept, and unless you’re testing locally on your computer and the server is very close to you, a CDN should improve performance by reducing network latency, with a possible side benefit of increased download parallelization. I’d say if the majority of your visitors are local and your server is local, then a CDN won’t be of too much use. But if your site is national, or even worse international, then you can really benefit from a CDN.

One thing to be careful with is when testing a CDN make sure to do more than 1 run. The first hit for a given object may be a cache miss on the CDN and they have to go back to the original site to populate the cache. Future requests for the same object should be a lot faster.

How does this site http://www.webpagetest.org/result/100628_b616bbe4b363efb8e030e9d811fd1dc9/ get all A’s here while in YSlow it scores a 63?

I have the same problem. Maybe some help?


@DJMorrisInc That link does not seem to work.

@Jarichonas Do you have a test URL link?

Oops, sorry… I made the correction up above too, but here is the correct URL: http://www.webpagetest.org/result/100628_b616bbe4b363efb8e030e9d811fd1dc9/

Looking at the page in question, www.thaindian.com, I can see YSlow is giving you a lower grade for the following reasons:

  • This page has 8 external JavaScript scripts. Try combining them into one.
  • This page has 3 external stylesheets. Try combining them into one.
  • Grade F on Use a Content Delivery Network (CDN)
  • Grade F on Add Expires headers
  • Grade B on Reduce DNS lookups
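For the Expires-headers grade specifically, the usual fix on Apache looks something like this .htaccess fragment, assuming mod_expires is enabled. The lifetimes here are illustrative, not a recommendation from this thread.

```apacheconf
# Hypothetical mod_expires config: give static assets long cache
# lifetimes so repeat views skip those requests entirely.
ExpiresActive On
ExpiresByType image/png  "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css   "access plus 1 week"
ExpiresByType application/x-javascript "access plus 1 week"
```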

I am getting a score of 83 for your website by the way.

The only real discrepancy I see is the CDN score because the CDN detection algorithm is different between webpagetest and yslow.

Discrepancies in scores come from a different set of criteria and a different weight on each criterion.