average connection speeds and impact on load times

I’m trying to figure out how to explain performance gains in layman’s terms (to clients).

I’m mostly a front-end/SEO practitioner but I have learned how to enable gzip in .htaccess and some other backend techniques (which is to say I’m not very experienced with networking and traffic).

I have been told that connection speeds listed as ‘Mbps’ are megabits per second, and therefore don’t translate directly to file sizes, which are in MB (megabytes).

How do I translate them and what are ‘normal’ connection speeds?

For instance, this site lists “Mobile 3G” as an option with a denoted Download speed of 1.6Mbps, Upload speed of 768Kbps, and a Round Trip Time of 300ms. I understand what those things are, but how much actual loading time should that save (everything else being equal) if I shave 200 kilobytes off of the JS?

Even harder to explain: let’s say the site loads 60 resources (60 HTTP requests) and I reduce that by 10, to 50 (all from the same domain; I understand how and why this is important). How much time would that generally save?

I don’t need anything absolute, just a guideline so when a non-technical person asks (a client or a non-dev co-worker) I can say that ‘the time I spent should, as a rule of thumb, speed up the site for a 3G user by 2.5 seconds and a typical broadband user by 1.3 seconds’ or whatever.

I just need an approximate conversion between some of these factors and the real world (aside from running before-and-after tests with tools like this).


1 byte = 8 bits, so multiply the file size by 8 to get bits, then divide by the connection speed. Though there’s also networking overhead, and that assumes perfect utilization. New connections need to warm up first, so it’s a best-case estimate.
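To make the arithmetic concrete, here’s a minimal sketch of that conversion (the function name and the use of 1 KB = 1000 bytes are my own choices; as noted above, this is a best-case number that ignores protocol overhead and TCP slow start):

```python
def transfer_seconds(size_kb: float, speed_mbps: float) -> float:
    """Best-case transfer time for a file over a link of the given speed.

    Converts kilobytes to bits (1 KB = 1000 bytes, 1 byte = 8 bits) and
    divides by the link rate in bits per second. Ignores networking
    overhead and connection warm-up, so real times will be longer.
    """
    bits = size_kb * 1000 * 8
    bits_per_second = speed_mbps * 1_000_000
    return bits / bits_per_second

# Shaving 200 KB of JS on the 1.6 Mbps "Mobile 3G" profile:
print(transfer_seconds(200, 1.6))  # 1.0 -> roughly one second saved, best case
```

So for the 200 KB question above, the back-of-envelope answer is about one second on that 3G profile, and proportionally less on faster connections.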

As for estimating the impact of changes, the reason you don’t see tools providing that already is that it’s hard. If the 60 HTTP requests were not on the critical path and were tiny resources, then removing some might not have any impact on the load time at all. If they were all JS files in the head, then the impact could be huge. The easiest way to ballpark it would be to use the blocking feature to block those requests and see what the impact is (though depending on the requests you block, it may end up blocking downstream requests).
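If you only want a rough upper-bound guess rather than a measurement, one crude model is to assume the removed requests were on the critical path and cost roughly one round trip per batch of parallel connections. This is my own simplification, not a reliable estimator (the paragraph above explains why blocking requests in a real test is far better), and the default of 6 parallel connections is just a common browser limit for HTTP/1.1:

```python
import math

def rtt_ballpark_seconds(requests_removed: int, rtt_ms: float,
                         parallel_connections: int = 6) -> float:
    """Crude upper-bound guess for time saved by removing HTTP requests.

    Assumes the requests were serialized on the critical path in batches
    of `parallel_connections`, each batch costing one round trip.
    Real impact varies wildly; measure with request blocking instead.
    """
    batches = math.ceil(requests_removed / parallel_connections)
    return batches * rtt_ms / 1000

# Removing 10 requests on a 300 ms RTT "Mobile 3G" profile:
print(rtt_ballpark_seconds(10, 300))  # 0.6 -> maybe half a second, at best
```

Treat the output as a talking-point ceiling (“up to roughly half a second on 3G”), not a prediction.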

Wow, this post is a blast from the past. I was testing a website and saw a forum post that intrigued me, and after reading it decided to create an account to reply, only to find out I had created one in 2014 (apparently before I had started using a password manager, so I had no record of it and had forgotten about it).

Anyway, thanks for the reply. Now that I have a better understanding, it makes perfect sense, yet it had never occurred to me to use the resource blocking feature that way. It’s one of the features I don’t think I have ever used before; I’ll give it a try.