The long times look to me like a measurement artifact in Chrome, because the span covers the 400ms of JS code running on DOM Content Loaded. Looking at the byte counts, it doesn't look like it actually downloaded the files.
Thanks for the prompt response. You might remember me from a while back; I was creating a website called green-watch.org, but the project fell through. So here I am (round 2), trying to build a new business around freelance programming. Last time I built a website for myself, I did not think twice about page speed until the site was practically finished. I am taking it more seriously this time around, and have been for client websites ever since.
I have moved most of the JavaScript to a body onload event.
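For reference, this is roughly the pattern I used (the function names are just placeholders for my own code, not real library calls):

```javascript
// Simplified sketch of the deferral; initWidgets/initAnalytics are
// placeholders for my actual functions.
function deferredInit() {
  initWidgets();   // non-critical UI behavior
  initAnalytics(); // tracking, social buttons, etc.
}

// Fires only after the page (including images) has finished loading,
// same as <body onload="deferredInit()">.
window.addEventListener('load', deferredInit);
```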
I have already used your tool to decrease my first page load from 1.9s to 0.6s and my second page load from 1.5s to 0.6s. However, I did take out a huge portion of the page, which will be loaded via AJAX here soon. The document complete time should not change, but I expect the fully loaded time to increase a bit.
I have three questions below, if you do not mind.
Question #1 - Does anybody know whether Google looks at the overall load time or only the document complete time in determining SERP results? I realize that both are important (document complete probably more so than overall load). If overall page load speed is taken into consideration, I might split my page up into sub-pages. With the additional AJAX content the page will be quite lengthy, so I was thinking about loading it via AJAX gradually as the user scrolls down. I do not know how that would affect the SERPs, though, since the overall load time would then depend on the user scrolling down. Anybody have any thoughts on this?
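To make the gradual-loading idea concrete, here is a rough sketch of what I have in mind (the /more-content endpoint and the element ID are made-up names, not my real ones):

```javascript
// Sketch of scroll-triggered loading; '/more-content' and
// 'content-container' are placeholder names.
var offset = 0;
var loading = false;

window.addEventListener('scroll', function () {
  // Trigger when the user is within 500px of the bottom of the page.
  var nearBottom = window.innerHeight + window.pageYOffset >=
                   document.body.offsetHeight - 500;
  if (!nearBottom || loading) return;

  loading = true;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/more-content?offset=' + offset, true);
  xhr.onload = function () {
    // Append the next chunk of markup to the page.
    document.getElementById('content-container')
            .insertAdjacentHTML('beforeend', xhr.responseText);
    offset += 1;
    loading = false;
  };
  xhr.send();
});
```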
Question #2 - On the first page load (non-cached), I am using some inline CSS to partially render the most important elements on the page. On the body onload event, I load the rest of the CSS. That CSS is then cached and served to the user in the HTML HEAD section on subsequent requests. This seems to have increased my page speed a lot; the original CSS file was quite bloated. I used a critical CSS generator (Critical Path CSS Generator, by Jonas Ohlsson) to create the inline CSS for the initial page load. However, the Google PageSpeed Insights tool now warns me to "Prioritize Visible Content" and says "only about 57% of the final above-the-fold content could be rendered with the full HTML response". Is there a better tool out there to extract the critical CSS? If not, want to make one for us hah =)
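For what it is worth, the onload CSS loading part is simple enough; roughly this, with full.css standing in for my real stylesheet:

```javascript
// Load the full stylesheet after onload so it cannot block the first
// render; '/css/full.css' stands in for my actual file.
window.addEventListener('load', function () {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/full.css';
  document.head.appendChild(link);
});
```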
Question #3 - I am going to be working on shrinking some images with lossless compression and then creating some sprites. I have not yet made the website dynamic, so I expect the time to first byte to increase by up to half a second. Most of what happens after the document complete event will be third-party plugins (like ShareThis, for social network capabilities). I am not really sure how to go about reducing overall load time because of this; it is harder to control how things load on external servers. Really, all I control is when they load. Any advice on this?
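About the only lever I have is injecting the third-party scripts myself after the load event, something like this (the script URL here is just illustrative of the pattern):

```javascript
// Inject third-party widgets after the load event so they cannot
// delay document complete; the URL is an example, not the real one.
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.async = true;
  s.src = 'https://ws.sharethis.com/button/buttons.js';
  document.body.appendChild(s);
});
```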
For the cached-pages question: if the client is sending a conditional request and the server is responding with a 304, that means the browser is still doing a validation check and the resource isn't truly cached (which is why the connection needs to be set up). As to why it takes so long, it looks like the CPU is pegged for some reason - I'd be more inclined to blame the EC2 instance, though it does seem to happen pretty consistently on that test.
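To spell out the distinction, a response that forces the validation round trip on every view looks roughly like this (header values illustrative):

```
HTTP/1.1 200 OK
ETag: "abc123"
Cache-Control: no-cache
```

while a far-future max-age lets the browser reuse the file without contacting the server at all:

```
HTTP/1.1 200 OK
Cache-Control: public, max-age=31536000
```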
Those really long TTFBs are certainly server processing, and getting visibility into that usually requires installing something like New Relic (or instrumenting the code yourself). Coming in from China will add some network RTT to the TTFB, but that is on the order of 250ms and is dwarfed by the server time.