I’ve been reading up on the documentation and it says that the speed index is the time (in milliseconds) at which ‘most’ of the page becomes visible? Looking at the filmstrip above, you can see that the page remains 72% complete from 1.4s to 2.5s.
I’m having a little difficulty seeing where the 1758ms value comes from - why is it not ~1400, which is when the majority of the page first appears?
Actually, the documentation shouldn’t say that “the speed index is the time (in milliseconds) that ‘most’ of the page becomes visible” - it should say that the speed index is the average time at which pixels on the screen have been painted (or something to that effect).
The speed index isn’t a point-in-time measurement so you can’t go to that time in a filmstrip and expect to see anything. I know it’s hard to grok and that’s one of the unfortunate things about it.
Once you have the rendered percentages (assuming those look reasonable), the speed index is the sum, over each interval, of the unrendered percentage (the inverse of what you see in the filmstrip) multiplied by the interval duration. It isn’t 1400 because 28% of the page still takes some amount of time to render, which pushes the value out further.
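To make that concrete, here’s a minimal sketch of the calculation in Python. The sample timestamps and percentages below are hypothetical, loosely modeled on the filmstrip described above (0% until the page starts painting, 72% complete from 1.4s, 100% at 2.5s); the function simply accumulates unrendered-fraction × interval across the samples:

```python
def speed_index(samples):
    """Sum the unrendered fraction over each interval between samples.

    samples: list of (timestamp_ms, percent_visually_complete) tuples,
    sorted by timestamp.
    """
    total = 0.0
    for (t0, pct0), (t1, _) in zip(samples, samples[1:]):
        interval_ms = t1 - t0
        unrendered = 1.0 - pct0 / 100.0  # inverse of visual completeness
        total += unrendered * interval_ms
    return total

# Hypothetical filmstrip: 0% until 1400ms, then 72% until 2500ms, then 100%.
samples = [(0, 0), (500, 0), (1400, 72), (2500, 100)]
print(speed_index(samples))  # ~1708ms, not 1400ms
```

Note how the trailing 28% that stays unrendered from 1.4s to 2.5s contributes 0.28 × 1100ms ≈ 308ms, which is exactly why the final value lands past 1400 even though “most” of the page is visible by then.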
Ah, that explains some of the inconsistencies we’re seeing. I think I must’ve originally misunderstood what the speed index is representing - thanks a lot!
Is it possible to get access to the intermediate “unrendered %” values that are recorded at each interval in order to compute the final speed index? What I’d like to do is determine how long it takes for a page to get to 90% visually complete, for example.
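If those intermediate samples are available, the 90%-visually-complete time is just the first timestamp at or above the threshold. A small sketch, reusing the same hypothetical (timestamp_ms, percent_visually_complete) format as above:

```python
def time_to_visually_complete(samples, threshold_pct):
    """Return the first sample timestamp at or above threshold_pct, or None.

    samples: list of (timestamp_ms, percent_visually_complete) tuples,
    sorted by timestamp.
    """
    for t, pct in samples:
        if pct >= threshold_pct:
            return t
    return None

samples = [(0, 0), (500, 0), (1400, 72), (2500, 100)]
print(time_to_visually_complete(samples, 90))  # 2500
```

One caveat: the answer is only as granular as the capture interval, so a coarse filmstrip will round the threshold-crossing time up to the next recorded frame.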