When I go through each run individually for those two tests, I can see that the LCP element is consistently our cookie-consent modal. However, when I go to the compare view for these two tests and look at the filmstrip:
The LCP frame for the test titled “after FB SDK removal” is a frame at 2.1s that is not the cookie-consent modal. Is this just a bug in how the time is recorded and the filmstrip display logic, or is there something weird going on here?
Also, tangential question: does the compare view use the first run from the selected tests when showing the filmstrip comparison and other metrics on the page?
The LCP frame for the test titled “after FB SDK removal” is a frame at 2.1s that is not the cookie-consent modal. Is this just a bug in how the time is recorded and the filmstrip display logic, or is there something weird going on here?
This is due to the frame rate we record at. We record at 10fps on desktop and emulated mobile devices, and 60fps on real mobile devices. For real mobile devices, we’re able to use a camera to record everything, so there’s very minimal overhead. However, recording at 60fps on desktop and emulated mobile devices tends to produce a large observer effect: the accuracy of the test results becomes compromised.
Unfortunately this does mean that sometimes the actual LCP event will not quite line up with the screenshot.
If you want, you can use WebPageTest’s advanced settings (or pass the fps parameter via the API) to force recording at 60fps and get better visual granularity; just expect that the metrics might come out a bit slower.
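For API-driven tests, the fps override can go on the runtest call as a query parameter. A minimal sketch (the API key and target URL are placeholders; `fps` is the parameter mentioned above):

```python
from urllib.parse import urlencode

def build_runtest_url(target_url, api_key, fps=60):
    """Build a WebPageTest runtest URL that forces the capture frame rate."""
    params = {
        "url": target_url,  # page to test
        "k": api_key,       # your WPT API key (placeholder here)
        "f": "json",        # ask for a JSON response
        "fps": fps,         # force 60fps video capture
    }
    return "https://www.webpagetest.org/runtest.php?" + urlencode(params)

print(build_runtest_url("https://example.com", "YOUR_API_KEY"))
```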
Also, tangential question: does the compare view use the first run from the selected tests when showing the filmstrip comparison and other metrics on the page?
If we use ?fps=60 for WPT tests, is there anything in the WPT test result that indicates the test was run at a particular fps, like 60? e.g. by inspecting the JSON result?