Time to First Byte: either an A or an F… why?

I can run this test several times in a row and get completely opposite results for my server's first-byte time. For example, here it is an F:
http://www.webpagetest.org/result/111222_31_2M414/
and here it is an A:
http://www.webpagetest.org/result/111222_AS_2M3EM/1/details/

Notice that these were run on the same day, and I did nothing to the server between the two tests. You can do this yourself and get back-to-back A or F grades for first byte. Is my server really this inconsistent, or could your site's measurement be off?

Take care,
Scott

Your server really is that inconsistent. The difference comes from the various layers of caching on your server (from the OS to the database to the application itself).

If you visit the site (or a page) for the first time, the server will likely have to go to disk for everything, particularly if it is on a shared server. The next time you visit the page, the database queries may be cached by the database, the files (code and data) may be sitting in the filesystem cache in RAM, and so on.
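A rough way to see this for yourself is to time the same request twice from the same client: the first hit lands on cold caches, the second on warm ones. Here's a minimal Python sketch (the host is a placeholder for your own server, and note this is a crude single-sample measurement, not what WebPagetest itself does):

```python
import time
import http.client

def measure_ttfb(host, path="/"):
    """Time from sending the request to receiving the first response byte."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.connect()                 # exclude TCP setup from the measurement
    start = time.monotonic()
    conn.request("GET", path)
    conn.getresponse().read(1)     # first byte of the body arrives here
    ttfb = time.monotonic() - start
    conn.close()
    return ttfb

host = "example.com"               # substitute your own server
cold = measure_ttfb(host)          # server-side caches likely cold
warm = measure_ttfb(host)          # OS/DB/app caches likely warm now
print(f"cold: {cold * 1000:.0f} ms, warm: {warm * 1000:.0f} ms")
```

If the second number is consistently much lower than the first, the variability is coming from your server's caches, not from the measurement.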

It’s unfortunate that the first-byte grade is not deterministic and can vary from run to run, but I thought first-byte time was important enough to highlight that it was worth the variability.

I think the solution would be for me to install solid-state drives, so there is nothing moving… just solid-state storage… I will look into this.

Thank you!

Oh… just another comment on this topic. It seems rather strange to me that I can’t get a B, C, or D… only an A or an F… nothing in between.

The gradations are in 100 ms increments, and there are only five grades, so 400 ms covers the whole scale from A to F. If your swings are more than 400 ms either way, you’ll basically jump between both ends of the spectrum.
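To make that concrete, here's a simplified sketch of the bucketing described above. The 200 ms target is hypothetical (WebPagetest derives its own target from the measured connection characteristics); the point is only that each 100 ms over the target drops one letter:

```python
def ttfb_grade(measured_ms, target_ms):
    """Map a measured first-byte time to a letter grade.

    Each full 100 ms over the target drops one letter, so anything
    400 ms or more over the target bottoms out at F.
    """
    over = max(0, measured_ms - target_ms)
    grades = ["A", "B", "C", "D", "F"]
    return grades[min(over // 100, 4)]

# With a hypothetical 200 ms target, a server that swings between
# ~250 ms (caches warm) and ~700 ms (caches cold) never lands on
# the middle grades:
print(ttfb_grade(250, 200))  # A
print(ttfb_grade(700, 200))  # F
```

A server whose cold and warm times differ by 450 ms would only produce a B, C, or D if a run happened to land in the narrow band in between, which is why the grades look binary in practice.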