PageSpeed Insights vs Web Developer Tools

Hi everyone,

Would someone be able to tell me the difference between PageSpeed Insights and the browser’s web developer tools?

I’ve tested the same site on both multiple times, and the two tools never line up in terms of LCP, TTI, and CLS. The web developer tools rate the site much better than PSI, even though both use Lighthouse to run the test.

I’m kind of confused about which numbers to target when the results differ, so I think I need to understand the differences between the reports a bit better.

Thanks for any help!

Hi,
To know the differences between the two, it’s good to understand the difference between Lab data and Field data.

Lab data is performance data collected in a controlled environment with predefined device and network settings.

Field data is performance data collected from real page loads your users are experiencing in the wild.
For PageSpeed Insights, the lab portion runs with test settings chosen by Google: the test runs from Google’s own servers (typically located in the US) and the connection is throttled to a simulated fast-3G speed. The report you get is a mix of both field data and lab data. PSI is great for understanding how real users are experiencing your website and how that feeds into your Core Web Vitals.
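If you want to see that split directly, the public PageSpeed Insights v5 API returns both in one response: `lighthouseResult` holds the lab run and `loadingExperience` holds the field data from the Chrome User Experience Report. A minimal sketch, assuming Node 18+ for the built-in fetch and a placeholder URL; the metric names are from the v5 API response as I understand it:

```ts
// Minimal sketch: query the public PageSpeed Insights v5 API and compare
// the lab (Lighthouse) numbers against the field (CrUX) numbers it returns.
// https://example.com is a placeholder; an API key is optional for light use.
const target = 'https://example.com';
const endpoint =
  'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
  `?url=${encodeURIComponent(target)}&strategy=mobile`;

async function main(): Promise<void> {
  const res = await fetch(endpoint);
  const data = await res.json();

  // Lab data: a single throttled Lighthouse run from Google's test machines.
  const labLcp = data.lighthouseResult?.audits?.['largest-contentful-paint']?.displayValue;
  const labCls = data.lighthouseResult?.audits?.['cumulative-layout-shift']?.displayValue;

  // Field data: 75th-percentile values from real Chrome users (CrUX),
  // only present if the origin gets enough traffic to be in the dataset.
  const fieldLcp = data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile;
  const fieldCls = data.loadingExperience?.metrics?.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile;

  console.log({ labLcp, labCls, fieldLcp, fieldCls });
}

main().catch(console.error);
```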

The web developer tools are based on your current environment, so the results depend on how fast your device and internet connection are. They will differ because you are probably on a fast home or office connection through your router, while PSI throttles to a 3G-style connection. You can change the DevTools and Lighthouse settings (device emulation, network and CPU throttling) to get the test as close as possible to a mobile device.
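If you’d rather script those settings than click through DevTools every time, Lighthouse can also be run from Node with explicit mobile emulation and throttling. A rough sketch, assuming the lighthouse and chrome-launcher npm packages; the throttling numbers are illustrative, not the exact values PSI uses:

```ts
// Rough sketch: run Lighthouse from Node with explicit mobile emulation and
// simulated throttling, so local "lab" runs depend less on your own device
// and connection. Throttling values below are illustrative only.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

const result = await lighthouse(
  'https://example.com', // placeholder URL
  { port: chrome.port, onlyCategories: ['performance'] },
  {
    extends: 'lighthouse:default',
    settings: {
      formFactor: 'mobile',
      screenEmulation: { mobile: true, width: 360, height: 640, deviceScaleFactor: 2, disabled: false },
      throttling: {
        rttMs: 150,               // simulated round-trip time
        throughputKbps: 1600,     // simulated download throughput
        cpuSlowdownMultiplier: 4, // emulate a slower mobile CPU
      },
    },
  },
);

console.log('LCP:', result?.lhr.audits['largest-contentful-paint'].displayValue);
console.log('CLS:', result?.lhr.audits['cumulative-layout-shift'].displayValue);

await chrome.kill();
```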

However, the web developer tools only produce lab data, so they will always differ from the field data shown in PageSpeed Insights.

To understand LCP and CLS, I would use PSI to see how the page currently performs and the web developer tools to find what is causing the increased times.
I hope this helps you.

Are your web developer tools showing a better score than PageSpeed Insights?

Continuing what @henryp25 said above: “lab tests” are often run against a local server or machine, so latency drops considerably. PageSpeed Insights could run from anywhere Google has servers, which may not be close to your origin, and the increased latency will delay LCP, potentially delay TTI (especially if you have external styles and scripts), and can therefore push out your CLS if slower-loading assets cause more layout shifts (again, especially with external stylesheets; preventable by inlining critical styles on the page).

Even if both tests are run against the same “production” instance, you might still be geographically closer to your origin server than the PageSpeed Insights servers are.

As an example, I live in the same city as the authors of GTMetrix. My Vancouver GTMetrix scores are better than those from any other testing tool online, mostly because TTFB on all requests is < 50ms due to geographic proximity.
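For what it’s worth, you can see the TTFB your own browser got for the current page straight from the Navigation Timing API. A small sketch (drop the type annotations if you want to paste it into the DevTools console):

```ts
// Quick check of the TTFB, DNS, and connection times your own browser saw
// for the current page load, using the Navigation Timing API.
const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
if (nav) {
  console.table({
    ttfb: nav.responseStart - nav.startTime,          // time to first byte
    dns: nav.domainLookupEnd - nav.domainLookupStart, // DNS lookup
    connect: nav.connectEnd - nav.connectStart,       // TCP + TLS handshake
  });
}
```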

If I run WebPageTest on my site from a more eastern US location, TTFB from the origin can jump to 500-600 ms, which is when having a CDN (Cloudflare) and proper caching headers helps.
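By “proper caching headers” I mean something along these lines on the origin. A minimal sketch with illustrative values, not a recommendation for any particular site:

```ts
// Illustrative sketch of caching headers on a small Node origin: long-lived,
// immutable caching for fingerprinted static assets, and a short CDN-friendly
// policy for HTML. The values are examples only.
import { createServer } from 'node:http';

createServer((req, res) => {
  if (req.url?.startsWith('/assets/')) {
    // Fingerprinted files (e.g. app.3f2a1c.js) can be cached "forever".
    res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
  } else {
    // HTML: let the CDN cache briefly and revalidate at the edge.
    res.setHeader('Cache-Control', 'public, max-age=0, s-maxage=300, stale-while-revalidate=60');
  }
  res.end('ok');
}).listen(8080);
```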

One more tip I heard John Mueller mention on SEO Office Hours: Chrome DevTools captures the entire user experience, including user interactions. So CLS will change if you have things like auto-refreshing ads, or ads injected into the content as you scroll, etc. Automated testing tools, on the other hand, generally have a fixed viewport that doesn’t change and will never interact with the page (no scroll events, for example), which may cause some scripts not to fire, resulting in slightly better stats in those cases.
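That’s also why collecting your own field data is useful: the web-vitals npm library reports LCP, CLS, and INP from real sessions, including shifts that only happen after a user scrolls or an ad refreshes. A sketch, assuming a placeholder /analytics endpoint on your side:

```ts
// Sketch of collecting field data from real sessions with the web-vitals
// library. Because this runs in actual user sessions, it picks up layout
// shifts caused by interaction (scrolling, injected or refreshing ads) that
// a fixed-viewport lab run never triggers. "/analytics" is a placeholder.
import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

function report(metric: Metric): void {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon survives the page being unloaded mid-report; fall back to fetch.
  if (!(navigator.sendBeacon && navigator.sendBeacon('/analytics', body))) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

onCLS(report);
onLCP(report);
onINP(report);
```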

I suppose that brings up one more thing too: your local viewport and the testing tool’s viewport may not be the same, which could shift what gets measured as the LCP element or change how many images are loaded via various deferred-loading techniques, etc.

I’d be curious to hear a more specific complaint from you, like which tool was giving you the better scores.