Inconsistent WPT results

Hi

I recently did some profiling of the front-end delivery of one of our pages. It came in at around 1.8s for First Render in Chrome. I found some improvements to make, uploaded the changes and profiled them, big win, down to around 1.1s. Great.

A few days later I ran WPT against both pages again, and the original and the edited version are both rendering at around 1.2s. Nothing has changed. I understand it's the internet, that things vary, and that it's a small sample (5 runs each time). I don't see any single slow network request in the waterfall that looks responsible for the difference. For rendering results to vary this wildly is quite exasperating: you identify work to be done from test results, get resource from a dev team, and then further test results behave completely differently.
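In the meantime I'm trying to take some of the guesswork out of single comparisons by scripting a larger number of runs and only looking at medians. Rough sketch of what I mean below (Python against the public WPT REST API; the API key, target URL and the exact JSON field names are my assumptions about the result format, so treat it as a sketch rather than gospel):

    # Sketch: kick off a WPT test with more runs than the default 5 and pull
    # the median first-view Start Render / Speed Index, instead of eyeballing
    # individual runs. API key, target URL and the JSON field names
    # (median.firstView.render / SpeedIndex) are assumptions on my part --
    # check against your own account's output before trusting the numbers.
    import time
    import requests

    WPT = "https://www.webpagetest.org"
    API_KEY = "YOUR_API_KEY"          # placeholder
    TARGET = "https://example.com/"   # placeholder page under test

    def run_test(url, runs=9):
        # more runs -> a more stable median than the 5 I was using
        resp = requests.get(f"{WPT}/runtest.php", params={
            "url": url, "runs": runs, "fvonly": 1, "f": "json", "k": API_KEY,
        })
        resp.raise_for_status()
        return resp.json()["data"]["jsonUrl"]

    def wait_for_result(json_url, poll=15):
        # poll until the test reports complete (statusCode 200; 1xx = pending)
        while True:
            body = requests.get(json_url).json()
            if body.get("statusCode") == 200:
                return body["data"]
            time.sleep(poll)

    if __name__ == "__main__":
        result = wait_for_result(run_test(TARGET))
        median_fv = result["median"]["firstView"]
        print("median start render (ms):", median_fv.get("render"))
        print("median Speed Index:", median_fv.get("SpeedIndex"))

Even then, nine runs on a noisy route may well not separate 1.1s from 1.2s, but at least it's medians against medians rather than one filmstrip against another.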

I can see that one version makes a few fewer HTTP requests, down to GTM and other tags; I'm looking at that now.

0 - Before
6 - After

https://www.webpagetest.org/video/compare.php?tests=161109_HJ_1MWG%2C161109_WE_1FQH%2C161109_EX_1MVS%2C161109_K4_1FQD%2C161109_70_1K1A%2C161109_PS_1FQ8&thumbSize=150&ival=100&end=visual#

And now, the same page tested on Chrome a few days apart:

http://www.webpagetest.org/video/compare.php?tests=161123_W2_BW3K%2C161109_PS_1FQ8&thumbSize=200&ival=100&end=visual

This isn't the first time it has happened. It's also not uncommon for changes to make quite a difference in WPT, go into production with high hopes, and then show no significant change in RUM monitoring (mPulse).

How do other people cope with the continual cycle of optimism, disappointment, inconsistency, and ultimately dejection that comes with web performance work?

Cheers (it’s not so bad really)

Ben

PS I put these pages on S3/CloudFront to take the backend and our deployment process out of the picture.