I have been using WPT for the past few years to test page performance on different websites. But, it was just the other day I realized this forum existed!
So, after trying to optimize W3TC on my WordPress site and slowly figuring out how to set up my MaxCDN account, I am hoping for some guidance from the pros here.
Also, I would love to learn more about the technical aspects of web performance. So, if you have any guides/articles/etc. that you could recommend, I would appreciate it!
My site: http://www.renttoownquest.com/
Test Result: http://www.webpagetest.org/result/131217_02_19JT/1/details/
Thanks for the advice. I will try to get these images “sprited” and also check out those books.
I have a programmer who set up the code on our site, and I'm not familiar with the processes or bottlenecks that are occurring.
Is there some type of program/software that will show me all of the programs/scripts/code that run when a particular page is requested?
There are profiling tools like Xdebug for PHP (http://xdebug.org/) as well as hosted solutions like New Relic (http://newrelic.com/).
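For what it's worth, turning on Xdebug's profiler is usually just a few php.ini lines. A rough sketch, assuming Xdebug 2.x-style settings; the extension path and output directory are only examples and will differ on your server:

```ini
; Load the extension and switch on the profiler (Xdebug 2.x settings)
zend_extension=/usr/lib/php5/xdebug.so
xdebug.profiler_enable=1
; Cachegrind files land here; open them with Webgrind or KCachegrind
xdebug.profiler_output_dir=/tmp/xdebug
```

Just remember to comment these back out when you're done, since profiling slows down every request.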
New Relic is a really good choice. They offer a free version as well.
You should be caching your database queries, and using memcache to serve static content.
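Query caching with Memcached can be as simple as a get-or-compute pattern. A rough sketch, not production code: `$db` is a placeholder PDO handle, and it assumes a local memcached daemon plus the PECL Memcached extension:

```php
<?php
// Hypothetical sketch: cache an expensive query result in Memcached.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'listings_page_1';
$rows = $cache->get($key);
if ($rows === false) {
    // Cache miss: run the real query, then keep the result for 5 minutes.
    $rows = $db->query('SELECT * FROM listings LIMIT 20')->fetchAll();
    $cache->set($key, $rows, 300);
}
```

The 5-minute TTL is just an example; for long-tail pages like listings you'd want to think about whether caching each page is even worth the memory.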
@Robzilla, I'd never heard of Xdebug; I'm gonna give it a try. Are you using it?
I’ve used it once or twice in the past, combined with Webgrind (front-end), if I recall correctly. Note that debuggers slow down code execution, so it’s best not to use it on a production server; if you must, then make sure you disable it afterwards.
Am I looking at a different waterfall? 0.5s isn’t that bad a TTFB for an untuned WP site at all!
Your main delay seems to be Edgecast, where the slowdowns could well be due to on-demand loading of images into the CDN.
I’d look at installing something like W3 Total Cache on the backend and seeing what happens before loading up New Relic. I know this site is a great supporter of that product, but to me it looks like its PHP plugin is based on Xdebug, which can really screw with your performance in its own right. So, at best I’d treat it as a means of snapshotting the server, then remove it PDQ.
Usually, as long as the DB has enough cache resources available, installing an opcode cache (eAccelerator seems to work better than APC for Apache-based sites; the converse for nginx) will probably bring you more performance improvement than any other single thing you can do. After that, it’s normally time to throw raw CPU power at it or fix the code, both of which tend to be more expensive.
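For reference, enabling APC once the extension is installed is usually just a couple of php.ini lines. The shared-memory size here is purely illustrative; tune it to your codebase:

```ini
; php.ini: enable the APC opcode cache
extension=apc.so
apc.enabled=1
; Shared memory for cached opcodes; bump this if your codebase is large
apc.shm_size=64M
```

An opcode cache is a no-code-change win: PHP skips re-parsing and re-compiling every script on every request.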
+1 to most of what was already discussed. Looks like the images from i.oodleimg.com are really slow to respond. Do you own those or is that a 3rd-party? If it’s a 3rd-party, see if you can lean on them. The listing images are very long-tail and probably won’t benefit from a CDN much and really just need to be able to be served really quickly.
As far as the back-end goes, 500ms is pretty quick but making the code faster will make it scale better as well as be faster for users. I expect that for the listing pages you will have the same issues that the image serving has where it’s a really long tail and most caches will be useless (data caches, not opcode which will be useful). Do you control the database for querying the listings or do you make a back-end call out to another service? My money is on that being the slowest piece.
I’ve used both Xdebug and New Relic on WebPagetest itself. I’m pretty sure New Relic isn’t built on top of Xdebug, but how it interfaces with PHP is probably pretty similar. It’s meant for running in production and is very lightweight (I regularly run 300-500 full PHP requests per second through the single WPT server without issues).