Test Results not looking good. Need some help please.

Our 12-year-old company recently had a new website built and migrated. I ran the site through this test and the results look terrible. Could someone look over these results and let me know what you think? I am not knowledgeable enough to interpret the results.

I really need to get with the web company that built our site. Many of our customers report that while they navigate the site, it hangs during database searches and will not advance further. Can you see what might be the problem?

Thanks to all in advance.

Charlie

Website Test: http://www.webpagetest.org/result/131124_N7_S0F/

Looks like no one spent any time on optimizing the web server, and your developer didn’t think to optimize any of the images used.

The very first thing I would do is enable gzip compression of your HTML, CSS and JavaScript files. It looks like you’re on your own server with Apache and cPanel, so you can probably use these instructions: http://docs.cpanel.net/twiki/bin/view/AllDocumentation/CpanelDocs/OptimizeWebsite

Also enable keep-alive from within Apache, so that connections can be reused between HTTP requests, and set expiration headers for static resources (images, stylesheets and such) that don’t change often. Your developer ought to be able to assist you with this. I would also stress to them that your images aren’t optimized; image compression is a simple trick that can significantly reduce bytes transferred, without visible loss of quality. According to your WPT results, you can save up to 1.2 MB, which is quite worth it.
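For reference, here is a rough .htaccess sketch of the compression and expiry side (assuming mod_deflate and mod_expires are available; the MIME types and lifetimes are example values your developer should adjust, not a drop-in config):

<IfModule mod_deflate.c>
  # Compress text-based responses (HTML, CSS, JS)
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Let browsers cache static resources that rarely change
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/gif "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

Keep-alive itself can’t be turned on from .htaccess; it’s a KeepAlive / KeepAliveTimeout setting in the main Apache config, which is another reason to involve your developer or host.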

As for the slow database searches and long Time to First Byte (even on the initial 302 redirect to www.), that’s something on the backend, and we can only guess what the cause may be. Usually it’s a poorly set up web server, a poorly programmed website, or (worse) both. In short, I would agree that you need to talk to the company that built your site :slight_smile:

Ironically, your developer’s website has many similar performance issues: http://www.webpagetest.org/result/131125_28_QH1/

Rob - Thanks for the advice and feedback. I’ll pass this to the developers.

Was this site written from scratch? If so, then you’ve got the added complication of not knowing the code quality, and a lack of ready-made, platform-specific performance enhancements (e.g. WordPress’s W3 Total Cache, Magento’s Lesti::FPC).

First, it’s taking 2 seconds to redirect from carcoverworld.com to www.carcoverworld.com. This should be instantaneous, so fixing that will be an instant win.

To address the (now) 1 second TTFB…

You do have things that can be tuned: MySQL, Apache and PHP. MySQL needs to be fed plenty of memory in the relevant places: there’s a script called tuning_primer.sh (many thanks to Matthew Montgomery) which will provide you with a good starting point for that.
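To give an idea of the kind of settings that script reports on, here is a hedged my.cnf sketch; the variable names are real MySQL options, but the values are placeholders that depend entirely on your server’s RAM and workload, so size them to the script’s output:

[mysqld]
# Example values only - size these to the tuning script's recommendations
innodb_buffer_pool_size = 1G    # main InnoDB cache; often the biggest single win
key_buffer_size = 128M          # MyISAM index cache (only matters if you use MyISAM tables)
query_cache_type = 1
query_cache_size = 64M          # caches identical SELECTs; helps read-heavy catalog pages
tmp_table_size = 64M
max_heap_table_size = 64M       # keeps in-memory temp tables from spilling to disk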

Apache’s been dealt with to some extent (keep-alive, mod_deflate, etc.). I must admit to being a bit rusty with it, as I find nginx faster, lighter and just plain simpler to work with.

With PHP there are two ways you can improve performance: use an opcode cache (and feed it lots of memory), and run in FPM mode. The eAccelerator opcode cache seems to work best with Apache’s mod_php, and APC (the later version built via PECL) works best with PHP-FPM.
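As a rough illustration (not a drop-in config), enabling APC once it’s been built via PECL is mostly a matter of a few php.ini lines; the sizes here are example values only:

; Illustrative php.ini settings for APC
extension = apc.so   ; load the APC extension built via PECL
apc.enabled = 1
apc.shm_size = 128M  ; shared memory for cached opcodes - size to your codebase
apc.stat = 1         ; check file mtimes so edited scripts get re-cached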

You’ll possibly also gain from a Varnish server in front of your site, so it can deliver as much content as possible directly from its local storage. However, configuring it correctly is not a trivial task.

For the rest of the content, a CDN will do two things: first, it’ll deliver content faster than your (shared) 100 Mbit/s internet connection, and second, it’ll take load off the server’s network interface - 100 Mbit/s = 12.5 MB/s, which can get swamped fairly easily by a graphics-heavy site. In addition, the use of CSS sprites will reduce the number of files needed to build a page, and so also aid performance.

As a shortcut it may be worth playing around with Google’s mod_pagespeed plugin, as it has the ability to post-process your site into a more efficient form, including autogenerating sprites, resampling images and so on.
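If you do experiment with mod_pagespeed, a minimal Apache config to try it looks something like the sketch below; the filters listed are real mod_pagespeed filters, but which ones are safe to enable depends on the site, so test carefully before relying on them:

<IfModule pagespeed_module>
  ModPagespeed on
  # Enable a few optimization filters beyond the defaults
  ModPagespeedEnableFilters rewrite_images,sprite_images
  ModPagespeedEnableFilters combine_css,combine_javascript
</IfModule>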

Also, look at image compression, but don’t go wild about it at the moment.

http://www.webpagetest.org/result/131126_BE_5WT/ is probably more relevant to your site - better, but… no cigar (:

If you need a hand let me know… Here’s my site ( totally irrelevant but looks good! ) http://www.webpagetest.org/result/131126_7D_5XS/

Our developers have improved some of the site metrics, but it looks like we still have problems: the site intermittently hangs and won’t advance, for us and for our customers. This is our biggest concern. Thanks for your feedback.

New Test Results: http://www.webpagetest.org/result/131128_FB_V91/

If you compare your performance to the sites listed in your developer’s portfolio, you’ll find that they’re all about as slow as yours, and are managed using the same proprietary CMS, so that’s probably where the problem lies. And if they take care of your hosting, too, then you’re probably not really in a position to improve performance by yourself.

I browsed around the site (from Europe) and the lags weren’t too bad, except for the search box, which took a good 10 seconds. Particularly wasteful is the shortness of Apache’s keep-alive setting (5s currently). Unless you’re on a busy server, they should be able to crank that up to 60s without a problem, and this could make the browsing experience feel a bit snappier. It’s a simple fix for them.
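For reference, those keep-alive settings live in the main Apache config (httpd.conf or an included conf file) rather than .htaccess; something along these lines, with 60s being the suggested example rather than a universal best value:

# Re-use connections for multiple requests instead of opening a new one each time
KeepAlive On
MaxKeepAliveRequests 100
# Currently around 5s on this server; 60s keeps connections open while a visitor browses
KeepAliveTimeout 60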

It’s a little odd that the redirect from carcoverworld.com to www.carcoverworld.com is taking so long, since it’s an action that’s usually handled directly by the web server (Apache in this case, either through httpd.conf or a .htaccess file). That would suggest the web server, or the machine itself, is either under high load or poorly configured.

Either way, if it’s managed hosting, there’s not much you can do other than complain.

Your host, SoftLayer in Dallas, is fine. There are a lot worse, like GoDaddy.

I had a few sites hosted on SoftLayer a few years ago and they were OK - better than a VDS on GoDaddy.

The huge issues are the 2-second redirect and the interleaving of CSS, which causes a very long time to Start Render and First Paint. The JS could also be causing a rendering re-start.

All CSS must be at the top of the page, in the <head>, right after the <title>.

BIG ISSUE: Move the CSS out of the <body> and into the <head>!!

This is a likely reason for your slow Start Render time, because the <style> is not scoped. One of them is inside the <body>; this is wrong. Just put it in the <head>.

No JS should be loaded before all of the CSS. If a JS script loads CSS, then add a <link> for the same CSS file in the <head> instead.

There are various reasons for using redirects; none of them are legitimate.

If you are using open-source software and/or plug-ins, they may be hijacking your visitors.

What is happening is that somewhere early in the PHP code a script is running, and when that script is done, two seconds later, your home page is loaded via the redirect.

Somewhere in the PHP there will likely be a header statement like:

header('Location: http://www.carcoverworld.com/', true, 301);

or it may look like:

header('Location: ' . $_SERVER['REQUEST_URI'], true, 301);

If you want me to take a look and find out what is going on I’ll do that for you out of professional curiosity.

iSpeed - Thanks so much for the look over. Any additional info or review is appreciated and will be forwarded to our web development folks. Thanks.

Charlie

@iSpeed, please keep your contributions constructive without devolving to insults. The slow start render has nothing to do with the CSS location - the main issue is that there are so many separate css and js files that are being loaded individually (though they are all cacheable so it’s mostly a first visit hit).

Charlie, the redirect looks like it is being done by the application rather than the web server. You should be able to create a simple .htaccess rule that will do the redirect directly (bonus points for making it a cacheable 301 redirect) which would improve any cases where the users hit a redirect.
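For instance, a minimal mod_rewrite sketch for the bare-domain to www redirect might look like this (assuming mod_rewrite is enabled; the dev team should adapt and test it rather than paste it in blindly):

<IfModule mod_rewrite.c>
  RewriteEngine On
  # Send bare-domain requests to www with a permanent (cacheable) 301
  RewriteCond %{HTTP_HOST} ^carcoverworld\.com$ [NC]
  RewriteRule ^(.*)$ http://www.carcoverworld.com/$1 [R=301,L]
</IfModule>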

From the looks of things, the web server is doing a reasonably good job serving static content so you may have to look at adding caching to the application layer itself (depending on what platform it is built on there may be out-of-the-box solutions available).

The other thing that can get you a quick win would be to re-compress some of your images. The big banners in particular - http://www.webpagetest.org/result/131128_FB_V91/1/performance_optimization/#compress_images

Just compressing the banner images better will cut the page weight by close to 70% and the site will look exactly the same (likely a problem across the site).

If you do not believe me, maybe you will believe Google:
From Google’s Web Performance Best Practices
Browsers block on external CSS files before painting content to the screen. This incurs additional network latency and increases the time it takes to display content to the screen.

How does the page get rendered without the CSS?

When subsequent CSS links are encountered by the browser, the rendering has to re-start. Look at the waterfalls and compare the location of the last CSS with the Start Render time. Locating the CSS at the top of the <head> is a fact, not my opinion.

Think about it. First, the C in CSS is CASCADING, meaning child selectors inherit attributes from their parent selectors. When the parent or child is in a subsequent CSS file, will that not have an effect on page rendering?

Secondly, the order in which the style attributes are defined sets precedence: subsequent attributes override prior ones.

The browser must complete its DOM structures before it can render the page. When the browser encounters new CSS while parsing, it could instead examine whether and how this change affects the rendering already done, but that would add some very complex code and slow the process; it is then likely the browser determines that the best approach to deal with the new CSS is to start over.

Or they could simply start over, with no need for additional complex code, and then, as they have, publicize the rule about putting the CSS first so the browser does not have to re-start rendering.
[hr]

There will be NO second cached visit when the user clicks the Back button rather than wait for the first page load.

The first and foremost concern is the first page view, in order to get into the cache. When the user clicks the Back button rather than wait for the first page load, the cache never gets populated.

Furthermore, there will be fewer first loads, because Google carefully monitors the time between when you click on a search link and when you return to the results. Every time someone bounces from a search link, the page is lowered in the ranking for that search term. With lower ranking there will be fewer referrals from Google, and with no Google referral there will consequently be no need for a cached second view.

The Googlebot is only concerned with first page views. Google’s job is to refer good-quality links to its users, so it is in Google’s best interest to filter out pages that load slowly. Most Google users are visiting a site for the first time, and Google knows (as they have published) how many will abandon a site with a slow page load. Again, with no Google referral there will consequently be no first view, nor a need for a cached second view.

Google has also published that they use page load speed in their ranking metrics. They are even so concerned with this metric that they have gotten involved with a project that rates web page speed. Maybe you have heard of it? PageSpeed Insights. Slow pages are ranked lower and get fewer first views and no cached second views. Again, with no Google referral there will consequently be no first view, nor a need for a cached second view.
[hr]

Where is he going to redirect to? If there were a better page to load first, wouldn’t it be better to make that page the index page rather than use a redirect?

Would it not be much better to solve the problem rather than use a time consuming, albeit small, redirect?

Finding the PHP redirect (it is a PHP redirect as evidenced by the header) is a piece of cake using a quick and simple text search of the PHP files for the redirect header() command.
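For instance, something along these lines run from the web root will list every header() call in the PHP files so the redirect is easy to locate (adjust the pattern if the Location header is built dynamically):

grep -rn --include="*.php" "header(" .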

There may be a reason for the redirect. It is possible there is an authentication or initialization process that must run before an online app can run.

If it’s a clandestine hijacking, it would be wise to find the culprit. It could be a devious web developer.
[hr]

Caching of dynamic pages adds another level of complexity and another point of failure. Would it not be better to eliminate the need for the caching?

PHP is very capable of doing a better job of transmitting the HTML than a caching solution.

First of all, PHP can specify header parameters more dynamically than Apache. PHP can also override the default output buffering and send the <head> before the output buffer has enough bytes to trigger early transmission.

Would it not be better to address the underlying PHP problem? In many cases it is just a simple matter of adding a flush() command in the appropriate places.

For WordPress, especially free WP themes, it is a simple matter of correcting the PHP code. For example, WP themes typically switch context between HTML and PHP: throughout the code, WP uses <?php $variable ?> in HTML output mode, and each time this happens it requires a time-consuming context switch.

The correct way is to stay in PHP output mode, use heredoc string quoting, and insert a flush() wherever appropriate.

Most designers process their SQL Queries for dynamic content before sending any HTML.

I will transmit as much of the page as possible, usually at least the <head> and the page’s HTML header. I will halt the output, flush the buffer, process the SQL query, format the query results (storing the resulting HTML in the next output buffer), and flush this buffer before the server has completed transmitting the previous one.
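Here’s a minimal sketch of that pattern, assuming a hypothetical run_search_query() helper and page markup rather than the actual site’s code:

<?php
// Start an output buffer, emit the static top of the page via heredoc,
// and flush it so the browser can begin fetching the CSS while the query runs.
ob_start();
echo <<<HTML
<!DOCTYPE html>
<html>
<head>
  <title>Search results</title>
  <link rel="stylesheet" href="/css/site.css">
</head>
<body>
HTML;
ob_flush();
flush();

// Hypothetical helper that runs the slow SQL search.
$q = isset($_GET['q']) ? $_GET['q'] : '';
$results = run_search_query($q);

// Format the results into the next buffer, then flush again.
echo '<ul>';
foreach ($results as $row) {
    echo '<li>' . htmlspecialchars($row['name']) . '</li>';
}
echo '</ul></body></html>';
ob_flush();
flush();

The early flush lets the browser start downloading the CSS and other head resources while the server is still waiting on the database.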

Full disclosure - I work for Google on the Chrome Performance team (and spent a good part of the last 3 years on the make the web faster team).

My comment about CSS was specific to the issues with this page - not in general - yes, the browser needs to fetch and apply all of the css declared before an element before it can lay it out - it’s just not the longest pole right now for this page.

The redirect is a simple (and very common) bare domain → www redirect to avoid serving the same content from both domains. You could just test the www. domain in the first place and ignore it, but since we can also see that the bare-domain redirect is not cacheable and is painfully slow, it’s a quick fix to do the bare-domain redirect as a permanent redirect in the web server config (nothing nefarious or hijack-related going on here).

btw, it also looks like keep-alives are disabled - would be a big help to get that fixed. If it’s a VPS or dedicated server then you’ll need to change the server config. If it’s on a shared host then you can ping them but it’s possible you’ll need to change hosting providers.

Pmeenan -

I don’t take offense to iSpeed’s postings. For 2 1/2 years our company has struggled with 2 development companies and lots of $$$ to build out a reasonable solution. The first company flat out failed, and our present developer seems to be struggling.

I appreciate any and all feedback posted to my situation. No one’s feelings are getting hurt here.

Charlie

Understood - iSpeed has added a lot of technical depth to the discussions and is providing great feedback - I just want to make sure that we keep a generally constructive tone on the boards.

Sorry to hear that you’ve had so much trouble getting a development company that knows what it is doing to do the work. A lot of the issues are pretty basic ones that I would hope professional developers wouldn’t make, but clearly that’s not always the case (it happens quite a bit when it’s more of a design team that doesn’t actually do development).

Feel free to send the dev team over here as well - sometimes it helps to be able to communicate directly with the guys writing the code so they understand what’s going on.

Sorry, but that’s grossly inaccurate. By that logic, I could easily ruin the rankings of any competitor. While users bouncing back to the results pages may be one of hundreds of (small) relevance factors, it would be far more complex than you make it out to be.

I have to nip this in the bud because there are just too many false claims going around, especially on the relation between page speed and search rankings. Companies that offer performance-related services have a hand in this, and Google understandably makes no effort to dispel them: unlike the whole PageRank debacle, having webmasters obsess over page speed is actually good for the web.

The influence of page speed on search engine rankings is often overstated. Pages with poor load times should be downgraded, and I’m sure they are, but it’s all relative: just because a page on site A loads in half the time it takes to load a page on site B, by itself does not mean it should rank higher – unless perhaps both sites would otherwise be scored exactly the same for a given query, but that’d classify as an edge case. The point at which you get into trouble is when load times begin to really interfere with the user experience, and I’m not sure the site up for discussion here would qualify for that. Improving page load times may, however, result in more pages viewed, ‘likes’, backlinks, and those may somehow seep back into search rankings, so it’s worth focusing on from all perspectives.
[hr]
Another thing I noticed is that the Droid Sans font is loaded only for the breadcrumbs. That’s a waste.

Has your developer responded to any of this, Charlie? And is this a fully managed type of service? If not, you could consider a post in the Web Site Optimization Help Needed forum to get some professional help.

Yes, it is more complex than just a single user bouncing off the page.

My comments are not based on reading about ranking somewhere.

By following a search term’s bounce rate and the ranking for that search term in Google Webmaster Tools, it is very clear there is a close correlation between the two. Just the fact that Webmaster Tools shows this metric says a lot.

Webmaster Tools shows dynamically how the ranking of a search term changes from day to day along with the bounce rate. It does not take a rocket scientist to see there is a direct correlation.

I am sure Google is smart enough to detect when you try to ruin the ranking of a competitor.

Just by common-sense logic, if search users are bouncing from a site on a particular search term, one may come to the conclusion that there is little relevance between that site and the search term.

Yes, the weighting of bounce rate in ranking would be complex. Google has a huge database of metrics stored for every search user and a large staff working every day to create an algorithm that can assess the weight to be given to each bounce.

Bottom line, a slow loading page will beyond any shadow of a doubt contribute to lower Google Rankings, higher page abandonment, lower site visitor satisfaction, and lower conversion rates.

Err, where in WMT do you see bounce rates? Google does not have access to that data; once a visitor’s on my site, I can track him, but Google cannot; they can only see whether that visitor returns to the search results, and that’s data they’re not likely to share.

Perhaps you’re confusing Webmaster Tools with Google Analytics.

I don’t think it’s that clear-cut at all, but that’s a discussion that exceeds the bounds of this thread, and we’re already diverging.

No, it is WMT not Analytics.
You underestimate Google.

Really. Can you provide evidence of that? I know CTR, impressions and avg. position are provided, but no bounce rates.

I have not used WMT or Analytics in over a year. I deleted all my Google accounts due to Google’s privacy policies. At that time I also had Analytics on the sites I was monitoring in WMT. I am getting old and have noticed some decline in my memory in the past few years. I can visualize the WMT window in my mind today, with a column of numbers and a percentage. When I saw your message mentioning CTR I began to have doubts about the accuracy of my statement. I just now reviewed the data for 3 sites I was monitoring, ranking for 30 search terms. The dates on the collected data ran from 2004 to 10/30/2011. Over two years is a long time for my dilapidated mind. There were many sources for the data: I wrote my own “stat counter”, and used AWStats, WMT, Analytics, Web Position Gold, etc. I never paid any attention to the online SEO experts.

I have had many domains going back to February 1996, and I was always able to get my target terms to a #1 ranking on all major search engines. The term “caller id” I ranked #1 on every search engine since the AltaVista days, whenever that was - maybe 1998?

The reason I remember the bounce/rank correlation is that I wrote two bot apps to test a search term. Both were timed to randomly mimic the click-through times of a human; one would bounce, the other would click through to multiple pages. I figured I would need about 100 IP addresses to build 100 profiles with a history with Google. The weak link was trying to emulate the browser’s JavaScript in PHP, and the 100 IP addresses for the profiles would cost about $500/month. I’m sure Google knows all the IP addresses of the free proxy services, which are filtered from their ranking stats.

I created an algorithm to detect users with multiple accounts: I built a hash for all accounts from the user’s IP, user agent vs. JavaScript command profiling, screen size, and OS. I assume Google had to do something similar, especially when there are so many users sharing IP addresses. That’s why I said it would not be that easy to ruin a competitor’s ranking. If I had not seen a clear correlation between bounce and rank, I would not have spent the time to create the bots.