I’ve been working on improving the speed of a retail site for a while now, and have made significant, objective gains in a variety of ways. These improvements have been backed up by most automated test results and by real-world experience. Even the speeds reported by WPT have improved a little; however, they are still much worse than anything I can recreate or corroborate externally.
The WPT result shows an average total load time of 8–9 seconds. The start-render time is nearly 3 seconds! That’d be terrible, if it were actually the case. But even on my phone the site loads faster than that. Other testers, such as Pingdom, also report much faster times.
1.53s? That is actually so fast it is suspect in the opposite direction. Though I’m not sure what their “Load Time” actually is, it may not be fully loaded — perhaps just visually complete and interactable? On the other hand, the waterfall does show all the resources loading within the 1.5s time frame, so maybe it is complete. Either way, it harshly conflicts with WPT: 1.53s is barely half of even the start-render time WPT reports. And first-byte time, which I’d think would be pretty consistent between testers, is just 150ms on Pingdom compared to 450ms on WPT.
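For what it’s worth, here’s the kind of quick first-byte check I’ve been using to sanity-check these numbers — a Python sketch that times the gap between sending a request and the response headers arriving. The local stand-in server is just there so the snippet runs anywhere; against the real site you’d point `measure_ttfb` at the actual host instead:

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def measure_ttfb(host, port, path="/"):
    """Time from sending the request until the response status line
    and headers have been read back (a rough proxy for TTFB)."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns once status line + headers arrive
    ttfb = time.perf_counter() - start
    resp.read()
    conn.close()
    return ttfb

# Tiny local stand-in server so the sketch is self-contained;
# swap in the real hostname/port to test the actual site.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_port)
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```

Run from a few different networks, this at least tells me whether 150ms or 450ms is closer to reality for my connection.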
In another thread here from 2014, found via Google, someone posed a similar question, and the explanation was that WPT can simulate slower connections for more realistic results (which I already knew). That person had been on a cable-equivalent line, which explained their difference; the answer was to use “Native” as the connection setting on a test. However, my test above was supposedly run on 20 Mbps-down fiber. Furthermore, I tried a few tests on Native and the results got worse. Here is one example:
What’s going on here? Almost 4 seconds for start-render, 14 seconds for full load? That’s absurd and contradicted by all other sources.
Is there some bug on our side that confuses WPT? Something with the CDN that gives it low priority? I’m about out of ideas.
And of course, the obligatory, “And what improvements would you recommend?” Though that is less important. Note that the images can’t be compressed more, despite what WPT thinks. We already have slight visible JPEG artifacting as-is, and as a visuals-focused retailer, image quality is critical. I’m not sure about that F on caching either; some of it is tracking tags that simply can’t be cached, but it also seems to flag most page content that caches for ~24 hours as a problem. That doesn’t seem right, right?
We’re on the Demandware platfor-er, sorry, ‘Salesforce Commerce Cloud’ platform, if that is of any relevance.
Any and all advice appreciated, thanks.