Under performance review at number 55. I have a JPEG file that is failing the gzip check. I wasn't aware gzip applied to image files; I thought it was for text, but this file always fails the gzip check.
Also, I am consistently getting a good time to first byte from California and Dulles, but my Google Webmaster Tools is showing quite a slow "Time spent downloading a page (in milliseconds)" of 588 with database and object caching on. I am using APC for DB caching and Memcache for object caching. HOWEVER… if I turn DB caching and object caching off, it goes down to a reasonable 278, which is almost 50% better. I can't make sense of that.
Is my test result throwing up any obvious problems?
The JPEG failing gzip is because of a bunch of metadata (EXIF data, probably) in the file. Rather than gzipping it you should strip out everything you can.
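For illustration, here is a minimal Python sketch of that kind of stripping, using Pillow; the file names and quality setting are placeholders. Note that this approach re-encodes the image (slightly lossy); a tool like jpegtran with -copy none can strip metadata losslessly instead.

```python
from PIL import Image

def strip_metadata(src: str, dst: str, quality: int = 85) -> None:
    """Re-encode a JPEG so no EXIF/ICC/comment data is carried over."""
    with Image.open(src) as img:
        # Build a fresh image and copy only the pixels, leaving all
        # ancillary metadata behind.
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst, "JPEG", quality=quality)

strip_metadata("photo.jpg", "photo-clean.jpg")
```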
As far as the TTFB goes, it looks like your times are highly variable (the repeat view was closer to 600 ms). If the first view is consistently fast and the repeat view isn't, you may have some slow back-end code that does session tracking. It's also possible that a bunch of the deeper pages don't get hit often enough to stay in cache (Google's data will be for the whole site, not just the front page).
I am confused by the gzip result because there is no metadata, EXIF or otherwise, in that image. None at all. It has been stripped bare.
How long should I keep pages in the cache? If I make a change I can always purge that particular page and then revisit it. What is a good amount of time for that?
How can I track down what is taking so long on the second run (session tracking?)? It is consistent, and if I can see what is causing it I can look at getting it fixed. Does the waterfall perhaps show what is going on there?
I deactivated Wordfence and the second-run time to first byte is now consistent, so it seems that if you use Wordfence it creates a session and slows down the time to first byte.
I'm still no closer to solving the gzip issue, so I think that one is for you to look at, Patrick, as it seems there is a bug in the system for that.
Ignore the gzip warning - it looks like it's only a few bytes. I'll see if there's a fix I can do on my side - it does look like that file can be slightly compressed by gzip, but not because of any metadata.
gzip is just a generic compression utility and doesn't care what bytes it is operating on. We usually don't gzip-encode image streams because there are no additional savings - most image formats already squeeze out as much as possible - but in some cases you can actually get a little more.
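A quick way to see this for yourself is a small Python sketch that compares a file's raw size against its gzip-compressed size (the file paths below are placeholders):

```python
import gzip
from pathlib import Path

def gzip_savings(path: str) -> None:
    """Compare a file's raw size against its gzip-compressed size."""
    raw = Path(path).read_bytes()
    packed = gzip.compress(raw, compresslevel=9)
    saved = len(raw) - len(packed)
    print(f"{path}: {len(raw)} -> {len(packed)} bytes "
          f"({saved} saved, {100 * saved / len(raw):.1f}%)")

gzip_savings("photo.jpg")    # usually near zero (or negative) for JPEG
gzip_savings("favicon.ico")  # uncompressed bitmap ICOs often shrink a lot
```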
gzip should be used for favicons that use the actual ICO format, because that is an uncompressed bitmap format, and for some font formats, which are also binary - so it applies to more than just text. It just happens that your JPEG fell into the small set of images that get some additional savings from gzipping, but I really don't recommend doing it (browsers will handle it fine; there's just no good reason to).
The target is derived from the socket connect time, which is used as an estimate of the round-trip time to the server, so it can vary a bit (or a lot if there are issues).
For static, do you mean "pages" or "content" (JS/CSS/images)? Content we usually recommend keeping for a year (some browsers don't do well with expirations longer than that), but that assumes the content really is static and versioned. "Pages" are a lot more complicated and depend on the site. Usually they aren't cached at all, or only for short periods on the server (one minute, for example) to reduce load.
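As a minimal sketch of that split, assuming a stock Python http.server setup (the port and file-extension list here are just illustrative assumptions, not anything from your site):

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

ONE_YEAR = 31536000  # seconds
ONE_MINUTE = 60

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        if self.path.endswith((".js", ".css", ".png", ".jpg", ".woff2")):
            # Long lifetime is only safe if the file names are versioned
            # (e.g. app.3f9c.js), so a change produces a new URL.
            self.send_header("Cache-Control", f"public, max-age={ONE_YEAR}")
        else:
            # Pages: cache briefly (or not at all) just to reduce load.
            self.send_header("Cache-Control", f"public, max-age={ONE_MINUTE}")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CachingHandler).serve_forever()
```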