I actually have a framework together for a competition, though probably not the kind you’re looking for. We used it for some internal training and I’m trying to get it ready for a public version. Essentially it works like this:
Everyone gets the same starting page to optimize
Each user (or team) has a login to http://compete.webpagetest.org/ (results from the internal competition are there right now)
On the competition site you can upload a new version of the page and it will automatically kick off a performance test and update the leaderboard
For a given team it keeps an archive of each iteration, as well as the performance results for each uploaded package
The test destination has 10 sharded CDN paths available (see the sketch after this list)
At least as of right now it is set up to only allow static HTML and files (allowing PHP execution was too risky, but I may see if I can find a good way to sandbox each site)
The rules were that the site had to maintain all of the functionality of the existing site (so there was some level of judging in addition to the raw performance numbers)
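For anyone wondering what the sharded CDN paths look like in practice, here is a minimal sketch of how a page might spread its static assets across shard hostnames. The hostnames, the shardUrl helper, and the element ID are made up for illustration; the actual competition setup may assign paths differently.

```javascript
// Minimal sketch of domain sharding (hostnames and count are hypothetical;
// the real competition setup may differ).
var SHARD_COUNT = 10;

// Pick a shard deterministically so the same asset always maps to the
// same hostname and stays cacheable.
function shardUrl(path) {
  var hash = 0;
  for (var i = 0; i < path.length; i++) {
    hash = (hash * 31 + path.charCodeAt(i)) % SHARD_COUNT;
  }
  return 'http://cdn' + (hash + 1) + '.example.com' + path;
}

// Example: rewrite an image tag to point at its shard.
document.getElementById('logo').src = shardUrl('/images/logo.png');
```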
I thought it would be fun to do, but eventually it might be a good way for someone to get their site optimized: they provide the page as well as a prize, and teams compete to produce the fastest version of the site.
Too much going on right now though so I haven’t had time to polish it up for a public competition.
Thanks for the tip about quant.js… a few months ago when I checked I believe they had the nasty document.write (not sure), but that’s not the case anymore, so I will move it to after onload. I should also defer the http://www.google.com/coop/cse/brand?form=cse-search-box&lang=en script, since all it does is change the styling of the Google CSE search box.
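For what it’s worth, here is a rough sketch of loading a script only after the onload event, which is the kind of deferral being described. The loadAfterOnload helper is a made-up name, and this is only safe if nothing on the page depends on those scripts before onload.

```javascript
// Rough sketch: inject a script only after window.onload has fired.
// loadAfterOnload is a made-up helper name, not part of any library.
function loadAfterOnload(src) {
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.src = src;
    s.async = true;
    document.body.appendChild(s);
  });
}

// Candidates mentioned above (only safe if nothing on the page depends
// on them before onload):
loadAfterOnload('http://www.google.com/coop/cse/brand?form=cse-search-box&lang=en');
// ...and whatever URL the page currently uses for quant.js.
```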
+1 for the competition… sounds fun… but there need to be some clear guidelines about the page being optimized and what should be on it. Maybe provide a sample page with many things on it and have everyone optimize that same page. I’m sure lots of new ideas could come out of it.
A competition sure would drive some traffic to your website if it was properly advertised, especially if there was a nice prize.
Prize-wise, a lot of coders might participate even if it was something as simple as a banner with their company info on it. The banner could also link back to the competition.
I am not sure dynamic PHP, CFM, ASP, etc. pages would be a fair testing basis, though. The results would vary depending on what dynamic content loaded during the test.
You could make a sprite image to combine your image resources.
If you served just a bare body document, created an iframe, and then on the body onload event loaded all the content into the iframe, the page would look the same but the 0.201 would be closer to 0.120, if that. My main point is that you should probably look at the entire document load time, which would head off that kind of black-hat loading attempt; a sketch of the trick is below.
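To make the concern concrete, here is a bare-bones sketch of the trick being described. The iframe styling and the real-page URL are placeholders; the point is that the outer document finishes almost immediately while the actual content only starts loading afterwards.

```javascript
// Sketch of the "empty shell" trick: the outer document has almost no
// resources, so its onload fires very early; the real page is then
// pulled into an iframe after the fact. URL is a placeholder.
window.addEventListener('load', function () {
  var frame = document.createElement('iframe');
  frame.style.border = '0';
  frame.style.width = '100%';
  frame.style.height = '100%';
  frame.src = '/real-page.html'; // the full page with all its resources
  document.body.appendChild(frame);
});
```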
I would not consider it a complex page with only 14 resources, and the page is also static. Any page worthy of a competition should have at least 100 resources, with room for basic techniques such as gzipping text resources, medium techniques such as creating CSS sprites, and more advanced techniques like image preloading.
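Since image preloading came up, this is the simplest form of it; the function name and image URLs are placeholders, not anything from the page being discussed.

```javascript
// Basic image preloading: fetch images ahead of time so they are in the
// browser cache by the time they are actually needed. URLs are placeholders.
function preloadImages(urls) {
  for (var i = 0; i < urls.length; i++) {
    var img = new Image();
    img.src = urls[i];
  }
}

preloadImages([
  '/images/gallery-1.jpg',
  '/images/gallery-2.jpg',
  '/images/rollover-hover.png'
]);
```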
I think as long as the page load time is between 1 and 2.3 seconds, the overall result is pretty good. It gets to a point where the difference is not noticeable to the naked eye.
I was not saying you are using black-hat methods at all. I was saying that there is a way anyone could make the document complete time come extremely early by having only 2 resources (1. the main document and 2. an iframe). For competition purposes, it would be better to look at the time it takes for the entire document to load.
It looks like you definitely cut down on the resources a bit. It is a bad comparison, though, because you are comparing FIOS to ADSL. The locations are at least the same, so that’s good.
At least for me, a change in load time needs to be about 0.4 seconds before I notice it, unless you are viewing it in slow motion.
I was not discrediting your hard work by any means. I was just trying to give you a few ideas. Also a CDN would make it even faster.
For a competition there would be a human element to the judging, so the iframe technique wouldn’t work (other than to freak people out on the leaderboards).
Then you weren’t using the right CDN. CDNs are very basic in concept, and unless you’re testing locally on your computer and the server is very close to you, a CDN should improve performance by reducing network latency, with a possible side benefit of increased download parallelization. I’d say if the majority of your visitors are local and your server is local, then a CDN won’t be of too much use. But if your site is national, or even worse international, then you can really benefit from a CDN.
One thing to be careful with when testing a CDN is to make sure to do more than one run. The first hit for a given object may be a cache miss on the CDN edge, which then has to go back to the origin site to populate its cache; future requests for the same object should be a lot faster.
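A quick way to see that cold-versus-warm effect, assuming a Node.js environment (version 18 or later, where fetch is built in), is to time the same object twice. The helper and the URL below are just for illustration.

```javascript
// Rough sketch (assumes Node 18+ with built-in fetch): request the same
// object twice and compare timings. The first request may be a CDN cache
// miss; the second should be served from the edge cache.
async function timeFetch(url) {
  var start = Date.now();
  var res = await fetch(url);
  await res.arrayBuffer(); // make sure the body is fully downloaded
  return Date.now() - start;
}

async function compare(url) {
  var first = await timeFetch(url);
  var second = await timeFetch(url);
  console.log('first run: ' + first + ' ms, second run: ' + second + ' ms');
}

compare('http://cdn.example.com/images/logo.png'); // placeholder URL
```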
Looking at the page in question, www.thaindian.com, I can see YSlow is giving you a lower grade for the following reasons:
This page has 8 external JavaScript scripts. Try combining them into one (see the sketch after this list).
This page has 3 external stylesheets. Try combining them into one.
Grade F on Use a Content Delivery Network (CDN)
Grade F on Add Expires headers
Grade B on Reduce DNS lookups
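To act on the first two items, a build step as simple as concatenating files goes a long way. Here is a rough Node.js sketch; the file names are placeholders, and in practice you would also minify and gzip the result. The same idea applies to the stylesheets.

```javascript
// Rough Node.js sketch of combining several scripts into one file before
// deployment. File names are placeholders.
var fs = require('fs');

var scripts = ['jquery.plugins.js', 'analytics.js', 'site.js']; // placeholders

var combined = scripts
  .map(function (name) { return fs.readFileSync(name, 'utf8'); })
  .join('\n;\n'); // the stray semicolon guards against missing ones

fs.writeFileSync('combined.js', combined);
console.log('Wrote combined.js from ' + scripts.length + ' files');
```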
I am getting a score of 83 for your website by the way.
The only real discrepancy I see is the CDN score, because the CDN detection algorithm differs between WebPagetest and YSlow.
Discrepancies in scores come from a different set of criteria and a different weight on each criterion.
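As a purely illustrative example of why weighting alone can shift a grade (these rule names and weights are made up, not YSlow’s or WebPagetest’s actual ones):

```javascript
// Made-up rules and weights, only to show how different weightings move
// the overall score even when the per-rule grades are identical.
var grades = { expiresHeaders: 50, combineScripts: 70, useCdn: 0 };

function overall(weights) {
  var total = 0, weightSum = 0;
  for (var rule in grades) {
    total += grades[rule] * weights[rule];
    weightSum += weights[rule];
  }
  return Math.round(total / weightSum);
}

console.log(overall({ expiresHeaders: 1, combineScripts: 1, useCdn: 1 })); // 40
console.log(overall({ expiresHeaders: 2, combineScripts: 1, useCdn: 3 })); // 28
```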