Speed Index and Bandwidth

We ran multiple tests on the same day, around the same time (within 30 minutes), and we see varying results with wide differences in Speed Index.
All of them use the DSL connection profile and ran from the Dulles, VA location.

We find that Speed Index varies a lot even though the content is the same, and we wonder whether bandwidth is a factor causing the
difference. We also checked the bandwidth graph at the end of the waterfall and see the bandwidth dip sharply at approximately 3.3 seconds
on the second test, the one that reports the higher Speed Index. This is what raises our question about how bandwidth affects Speed Index.

We would appreciate some insight into what is causing the differences in Speed Index. Thanks a lot.

Do you have links to the test results? The reported bandwidth is the effective bandwidth utilization of loading the page (not the available connection bandwidth), so a drop is usually the result of a different issue (usually a slow request).

Thanks, Patrick, for the quick reply.

Below are two tests where we tried to limit the bandwidth, but we still see different Speed Index results.

Our goal is to get a consistent Speed Index across multiple runs with the same content and configuration.

Thanks

I don’t think the traffic shaping liked the 0 for upstream bandwidth, and it looks like there was no traffic shaping at all (and there is NO WAY you want to test loading 3 MB at 5 Kbps).

It looks like you are pushing the CPU really hard, and the slower case was pegged at 100% (probably because traffic shaping was disabled, so it was loading as fast as it could). Slight variations between ads, etc., could easily cause the differences, given that the faster test was close to 100% as well.

If you want the traffic shaping to work, try with a non-zero upstream (and much faster downstream) profile.

If you want to eliminate the CPU impact, try the Dulles_Thinkpad location, which runs on physical Core i5 ThinkPad devices with GPUs. They are very high-end compared to what your users would see, but they should eliminate the 100% CPU problem.
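
For what it’s worth, here’s a minimal sketch in Python of what that could look like through the runtest.php API: a non-zero upstream, a much faster downstream, and the ThinkPad agents. The test URL, API key, bandwidth numbers, and the exact location/connectivity string are placeholder assumptions; list valid values via getLocations.php before relying on them.

```python
# Minimal sketch, assuming the standard runtest.php parameters for a custom
# traffic-shaping profile (bwDown/bwUp/latency/plr) -- verify against the
# WebPageTest docs before using.
import requests

WPT = "https://www.webpagetest.org/runtest.php"

params = {
    "url": "https://www.example.com/",  # placeholder: the page under test
    "k": "YOUR_API_KEY",                # placeholder API key
    # Location string format is an assumption; check getLocations.php
    "location": "Dulles_Thinkpad:Chrome.custom",
    "bwDown": 5000,  # downstream bandwidth in Kbps
    "bwUp": 1000,    # non-zero upstream so the shaper actually engages
    "latency": 28,   # first-hop round-trip latency in ms
    "plr": 0,        # packet loss rate, percent
    "runs": 5,
    "f": "json",     # ask for a JSON response
}

resp = requests.get(WPT, params=params)
resp.raise_for_status()
print(resp.json()["data"]["userUrl"])  # link to the queued test
```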

Thanks, Patrick, for the quick reply.

We tried the configuration settings suggested, but we still see a considerable difference in Speed Index between runs.

Below is a comparison of the two tests we measured:

https://www.webpagetest.org/video/compare.php?tests=160831_C3_Y4Q%2C160831_98_Y2Z&thumbSize=200&ival=100&end=visual

Could you please advise on what other factors or configuration should be modified to get a consistent Speed Index?

Thanks

Getting deeper into consistency for a dynamic site like that is going to be complicated, but a few more things come to mind:

  • Block third-party domains that may introduce variability (ad domains in particular).
  • Further restrict the testing so it all runs on the same physical device. Through the API this is the tester= parameter. You can set it as a hidden field when going through the UI by first navigating to: http://www.webpagetest.org/?tester=THINKPAD1 (1-9 are the IDs for the different ThinkPad devices). They all run cloned images on the same hardware, but there could still be slight variability (see the sketch after this list).
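
As a hedged sketch of that tester= pinning through the API (same placeholder URL and API key as before; THINKPAD1 follows the ID naming above but the exact agent names are an assumption worth verifying):

```python
# Sketch: pin repeat runs to a single physical agent with the tester=
# parameter so hardware variability drops out of the comparison.
import requests

params = {
    "url": "https://www.example.com/",  # placeholder: the page under test
    "k": "YOUR_API_KEY",                # placeholder API key
    "location": "Dulles_Thinkpad",      # the physical ThinkPad pool
    "tester": "THINKPAD1",              # pin every run to the same machine
    "runs": 9,
    "f": "json",
}

resp = requests.get("https://www.webpagetest.org/runtest.php", params=params)
resp.raise_for_status()
print("queued test:", resp.json()["data"]["testId"])
```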

Beyond that, you are still going to be subject to races and timing in how and when the browser decides to render, and to how the resources are being served at any given time.

One technique that I use frequently is to always do 5-9 runs and then use the fastest (successful) run as the representative one.
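
A sketch of that fastest-successful-run selection against the jsonResult.php endpoint might look like the following. The field names follow the public result JSON as I understand it, and treating result codes 0 and 99999 as "successful" is an assumption worth double-checking.

```python
# Fetch the JSON results of a finished multi-run test and take the fastest
# successful first view by Speed Index.
import requests

TEST_ID = "160831_C3_Y4Q"  # one of the test IDs from this thread

resp = requests.get(
    "https://www.webpagetest.org/jsonResult.php",
    params={"test": TEST_ID},
)
resp.raise_for_status()
runs = resp.json()["data"]["runs"]

# Keep only runs that completed; 0 and 99999 are assumed success codes.
successful = [
    run["firstView"]
    for run in runs.values()
    if run.get("firstView", {}).get("result") in (0, 99999)
]

fastest = min(successful, key=lambda fv: fv["SpeedIndex"])
print("fastest successful run Speed Index:", fastest["SpeedIndex"])
```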

Thanks, Patrick, for the insights and suggestions :slight_smile:

We will try out these options.

Hi Patrick, I stumbled upon this thread and was very puzzled by your last suggestion:

  • “One technique that I use frequently is to always do 5-9 runs and then use the fastest (successful) run as the representative one.”

Could you elaborate on why you suggest the fastest run, as opposed to the median or even the average?
As you yourself pointed out, we are “still going to be subject to races and timing in how and when the browser decides to render”, hence the fastest could be an outlier.

Thanks

When you get an outlier, it is usually from a “bad” situation: a server somewhere was slow to respond, etc. Picking the fastest run helps eliminate a lot of that.

It kind of depends on what you are trying to measure and optimize for, though.
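
To make the outlier point concrete, here’s a tiny illustration with made-up Speed Index numbers: one “bad” run inflates the mean badly, while the median and the fastest run both shrug it off, and the fastest excludes the bad situation entirely.

```python
# Hypothetical 5-run sample: one slow server response produced the 5400.
import statistics

speed_indexes = [2100, 2150, 2200, 5400, 2180]

print("fastest:", min(speed_indexes))                # 2100
print("median: ", statistics.median(speed_indexes))  # 2180
print("mean:   ", statistics.mean(speed_indexes))    # 2806
```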