I’ve recently been experimenting with setting up scripts to automate tests on the hosted platform, and for one particular test I got significantly different results between the automated script and running the test manually at webpagetest.org. I don’t think it’s an issue with the script, as I’ve only seen this with this one test. Both tests use the same URL and the same location, first view only.
Were you running the automated script against WebPagetest or your own instance? Also, do you see it regularly or was it a one-time problem?
The API and the UI both use the exact same code path but it’s possible one of the options was set differently in the UI (assuming both were run against WebPagetest for the same location, browser and connectivity).
Both tests are using the public Dulles_IE7 instance, but I get the difference no matter which location I use.
If you compare the two results below, as far as I can tell the settings for each are identical; the only difference I can see is that the first was submitted using the API and the second using the web interface.
Thanks for following that up. I have replaced those ampersands now and it looks to be working well.
I spent some time yesterday modifying the script we use to allow the connection speed to be passed in as a variable, expecting that if I passed in the same profile as the standard Dulles_IE7 settings, the results would be the same.
Can you see any reason why the two tests at the bottom would return such different results? The first returns ~5 seconds, the second ~12 seconds. My expectation was that they would be practically identical.
Sorry, that was a mistake in the documentation - it’s bwDown and bwUp (instead of bwIn and bwOut). I just fixed the docs. Once it’s working, the results should be identical, because the standard profiles are just pre-configured settings - everything is effectively a custom profile.
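For anyone following along, here is a minimal sketch of how a submission to runtest.php with an explicit connectivity profile might look, using the corrected bwDown/bwUp parameter names. The API key, target URL, and the specific bandwidth/latency values are placeholders for illustration, not values taken from this thread — check your own location’s profile for the numbers to match.

```python
from urllib.parse import urlencode

# Base endpoint for submitting tests to the public WebPagetest instance.
BASE = "https://www.webpagetest.org/runtest.php"

def build_test_url(url, api_key, location="Dulles_IE7",
                   bw_down=1500, bw_up=384, latency=50, plr=0):
    """Build a runtest.php request URL with explicit connectivity settings.

    Bandwidth is in Kbps, latency in ms, packet loss in percent. The
    parameter names are bwDown and bwUp (not bwIn/bwOut, which was the
    documentation error discussed above).
    """
    params = {
        "url": url,
        "k": api_key,        # placeholder API key
        "location": location,
        "bwDown": bw_down,   # downstream bandwidth, Kbps
        "bwUp": bw_up,       # upstream bandwidth, Kbps
        "latency": latency,  # round-trip latency, ms
        "plr": plr,          # packet loss rate, %
        "fvonly": 1,         # first view only, matching the tests above
        "f": "json",
    }
    return BASE + "?" + urlencode(params)

print(build_test_url("http://example.com", "YOUR_API_KEY"))
```

Since the standard profiles are just pre-configured values, passing the same numbers explicitly should produce the same shaping as selecting the named profile in the UI.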