Is there a way to exclude traffic from WebPagetest in my GA reports? Thanks.
The WebPagetest traffic has PTST in the user agent string. You could follow these instructions to filter it out: http://blog.yottaa.com/2011/03/google-analytics-how-to-segment-and-filter-out-robot-traffic/
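If the GA filter route doesn't pan out, another option (just a sketch, not something WebPagetest provides) is to skip the GA snippet entirely when the PTST token is present in the user agent:

```javascript
// Returns true when the user agent looks like a WebPagetest run.
// WebPagetest appends "PTST <version>" to the browser's normal UA string.
function isWebPagetest(ua) {
  return /\bPTST\b/.test(ua);
}

// Only load Google Analytics for real visitors.
if (typeof navigator !== 'undefined' && !isWebPagetest(navigator.userAgent)) {
  // ... your usual ga.js / analytics snippet goes here ...
}
```

Note this adds a tiny client-side check before the tracking code, so test runs would simply never send a hit rather than being filtered after the fact.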
Thanks for the reply. I'm a little stumped with this one. I saw a huge jump in direct traffic today, and I assumed it was from me playing around with several CDNs on my site and using WebPagetest to compare load times.
I can't seem to get the filters to pick up PTST per the tutorial on the page you linked to. I can see the user agent string in my logs, so I know it's there. I also tried filtering out "Mozilla" and that only flagged about 1% of my traffic today.
This is the user agent that I’m seeing:
“Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; SLCC1; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; PTST 2.264)”
My best guess at this point is that a) Google isn't including it and I just happened to have a big jump in traffic today, b) for some reason GA isn't seeing this user agent string and consequently can't filter it out, or c) I just had a big jump in traffic for whatever reason.
Did a little more reading and it turns out that GA doesn't store the raw user agent field. It supposedly parses it into:

- Visitor Browser Program (e.g. "Internet Explorer")
- Visitor Browser Version (e.g. 6.0)
- Visitor Operating System Platform (e.g. Windows)
- Visitor Operating System Version (e.g. XP)
Any other ideas? I suppose I could go in and manually identify the IP addresses and create a custom filter. I used a variety of locations around the world, though, so I'm guessing there are quite a few IPs that would have to be filtered.
You can get the IPs from here: http://www.webpagetest.org/getTesters.php
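Since GA custom exclude filters accept regular expressions, once you have the tester IPs you could generate a single filter pattern instead of entering them one by one. A rough sketch (the addresses below are placeholders; substitute the real ones from getTesters.php):

```javascript
// Build a GA exclude-filter regex from a list of IP addresses.
// Dots are escaped and each address is anchored so "1.2.3.4"
// doesn't also match "11.2.3.45".
function buildIpFilterPattern(ips) {
  return ips
    .map(ip => '^' + ip.replace(/\./g, '\\.') + '$')
    .join('|');
}

// Example with placeholder addresses:
const pattern = buildIpFilterPattern(['203.0.113.10', '198.51.100.7']);
// pattern is "^203\.0\.113\.10$|^198\.51\.100\.7$"
```

You'd paste the resulting pattern into an "Exclude → traffic from the IP addresses → that match the regular expression" custom filter. Keep in mind GA filter fields have a length limit, so a long IP list may need to be split across several filters.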
Other than that, you can block ga.js using the WebPagetest blocking capability, but that will change the performance of the page (kind of defeating the purpose).