Urgent help needed: high load times for CDN access.

Check any image access, for example request 25. Why is it taking 12 seconds to grab a 282KB file from the CDN (Akamai)?

URL: http://as1.wdpromedia.com/media/wdwhispanic/es-us/SpOf-Summer-Spanish.jpg
Loaded By: http://disneyworld.disney.go.com/es-us/:322
Host: as1.wdpromedia.com
Location: Brooklyn, NY
Error/Status Code: 200
Start Offset: 1.650 s
Time to First Byte: 227 ms
Content Download: 12096 ms
Bytes In (downloaded): 282.0 KB
Bytes Out (uploaded): 0.4 KB

Yes, that one image is 282KB but it looks like you have a whole lot of them downloading at the same time so you really have closer to 1.7MB of images all in contention for the bandwidth: http://www.webpagetest.org/result/120612_8G_XR6/1/breakdown/

It looks like you could save close to half of the image bytes by being less conservative with the compression: http://www.webpagetest.org/result/120612_8G_XR6/1/performance_optimization/#compress_images

Thanks Patrick for the info.

What image compression techniques are you recommending? Our creative dept says they are already compressed in their current form.

I’m suggesting that if they use Photoshop, they use “Save for Web” at a quality level closer to 50-60 instead of the 80-90 they must currently be using. Yes, the images are technically compressed, but barely. They should see how low they can turn the quality setting and still be happy that there are no visible artifacts.


The other issue I found is that CPU usage is pretty high during the image GETs. Since I am assuming the WebPagetest servers are shared by different users, the requests may be getting queued or blocked, which could also explain the delay. In the meantime I will request that the images be compressed further.

The WebPagetest test agents only run one test at a time and are not shared at test time. I can 100% guarantee you that the long times are because you’re trying to send close to 2MB of images to the user and the connection is a 1.5Mbps DSL line.

1.5Mbps is good for ~170KB/s at the theoretical max (which is usually only hit for really long file downloads, hardly ever for web pages because of things like slow start and the size of HTTP responses).

The theoretical limit for delivering the 2.2MB over that connection is around 12 seconds, so your 14-second load time is surprisingly close (particularly with a 1-second TTFB during which no data is transferred).
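The arithmetic behind those numbers can be sketched like this (my own back-of-the-envelope calculation, not WebPagetest output; the ~170KB/s figure above already discounts protocol overhead, while the 12-second figure is the no-overhead theoretical floor):

```javascript
// Raw throughput of a link in KB/s given its speed in Mbps.
// 1.5 Mbps = 1500 kilobits/s, and 8 bits per byte.
function maxKBps(mbps) {
  return (mbps * 1000) / 8; // 1.5 Mbps -> 187.5 KB/s
}

// Theoretical minimum seconds to transfer `kb` kilobytes over that link,
// ignoring slow start, TTFB, and HTTP overhead.
function minDownloadSeconds(kb, mbps) {
  return kb / maxKBps(mbps);
}

const pageWeightKB = 2.2 * 1024; // ~2.2 MB of images on the page
console.log(maxKBps(1.5).toFixed(1));                          // 187.5
console.log(minDownloadSeconds(pageWeightKB, 1.5).toFixed(1)); // ~12.0
```

Add the 1-second TTFB on top of that floor and the measured 14-second load is about as good as this page can do on a 1.5Mbps line without shrinking the payload.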

The page is completely bandwidth constrained and there are only 2 ways to make it faster:

1 - Reduce the TTFB. At most you have 1 second of leeway there, but that would benefit everybody regardless of their connection.

2 - Reduce the size of the page:

  • You can save 30-40% of the page weight just by compressing the photos better, without changing anything else on the site, and that would make the times 30-40% better at the speeds tested.

  • You could also keep the photos the way they are and change how the page loads. Instead of loading all of the photos for the carousel up front, load just the first one as part of the initial load and use the javascript carousel code to fetch the additional images as part of the carousel logic. The logic would have to account for only rotating the carousel once the next image has finished loading.

Optimally you’d do both.
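The second option could be sketched roughly like this (hypothetical names and structure, not the site's actual carousel code; the key point is that `advance` refuses to rotate onto a slide whose image isn't ready):

```javascript
// Minimal lazy-loading carousel sketch. Only the first image ships in the
// initial HTML; the rest are fetched after load, and rotation is gated on
// the next image having actually arrived.
function createCarousel(urls) {
  const loaded = urls.map((_, i) => i === 0); // slide 0 came with the page
  let current = 0;

  return {
    // Called when a deferred fetch finishes (e.g. from an img onload handler).
    markLoaded(i) { loaded[i] = true; },

    // Kick off the deferred fetches. In a browser, fetchFn would be roughly:
    //   (url, i) => { const img = new Image();
    //                 img.onload = () => carousel.markLoaded(i);
    //                 img.src = url; }
    preloadRemaining(fetchFn) {
      urls.forEach((url, i) => { if (!loaded[i]) fetchFn(url, i); });
    },

    // Rotate only if the next slide's image is ready; otherwise stay put.
    advance() {
      const next = (current + 1) % urls.length;
      if (loaded[next]) current = next;
      return current;
    },
  };
}
```

That way a slow connection sees the first slide quickly and the carousel simply waits on the remaining images instead of blocking the whole page on 1.7MB of photos.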

Thanks for the details, appreciate it. What do you think is the reason for the high CPU times?

Hard to say for sure without looking closer, but you can try testing with one of the Dynatrace browser configurations, which will profile the page and point out any javascript hot spots.