Chunked..

I’ve got it into my head that whenever I have to investigate some CDN issue with particular resources playing us false, chunking is somewhere in the mix. Particularly with dropped connections.

Now I can’t see any reason for a TCP connection serving a chunked response to drop any more often than if it were carrying a response of known length. (Am I missing something?)

Had one today, and my colleagues found great amusement in my ranting. Fact is, we don’t stream responses, so we have no reason to chunk (I’d love to be corrected on that). I’d like to get the thing turned off, but my chums think I’m being perfectly irrational and chasing geese around a field.

Any of this ringing any bells with anyone on this forum?

Thanks

Neil

Anyone?

Looks like the CDN sometimes fails to send the zero-size last chunk. Ironically, on every occasion, the data has all been sent.
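For anyone not familiar with the framing, a chunked body is just hex size lines and data, terminated by a zero-size chunk. This throwaway CLI sketch prints the format to stdout, purely to show what is on the wire:

<?php
// Prints what a chunked body looks like on the wire: hex chunk size,
// CRLF, the bytes, CRLF, repeated, ending with the zero-size last chunk.
foreach (array('Hello, ', 'world!') as $piece) {
    printf("%x\r\n%s\r\n", strlen($piece), $piece);
}
echo "0\r\n\r\n"; // the terminal chunk our CDN appears to be losing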

If the connection is then closed, the user-agent believes it was dropped before all the data was received. This is the better scenario, as a retry can be initiated.

If the connection is persisted, the user-agent keeps waiting for the missing chunk until a timeout (eventually) occurs. A situation where keep-alives are definitely not a good idea.
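If anyone wants to check which case they’re hitting, this is the sort of rough probe I’ve been using. The host and path are made up, and Connection: close keeps the read loop from hanging when the last chunk never arrives:

<?php
// Fetch a resource over a raw socket and report whether a chunked body
// actually ends with the terminal "0\r\n\r\n" (trailers aside).
$host = 'cdn.example.com';    // hypothetical
$path = '/some/resource.css'; // hypothetical

$fp = fsockopen($host, 80, $errno, $errstr, 10);
if (!$fp) {
    die("connect failed: $errstr ($errno)\n");
}
fwrite($fp, "GET $path HTTP/1.1\r\n"
          . "Host: $host\r\n"
          . "Accept-Encoding: identity\r\n"
          . "Connection: close\r\n\r\n");

$raw = '';
while (!feof($fp)) {
    $raw .= fread($fp, 8192);
}
fclose($fp);

list($headers, $body) = explode("\r\n\r\n", $raw, 2);
if (stripos($headers, 'Transfer-Encoding: chunked') !== false) {
    echo substr($body, -5) === "0\r\n\r\n"
        ? "terminal chunk present\n"
        : "terminal chunk MISSING (truncated chunked response)\n";
}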

When I have been doing server-side perf testing, I have all too often hit some nasty flakiness with chunking. Am I really the only one?

(This is a major CDN provider, by the way.)

Thanks, chaps

I haven’t seen anything conclusively tracked back to a CDN having problems with chunked responses, but it honestly wouldn’t surprise me, since they mostly expect static files, all of which have a known size before sending (Dynamic Site Acceleration solutions aside).

I haven’t seen any browser problems with it either; my expectation is that any issues would be on the CDN’s side in a pull model.

Spot on, Pat: resources are pulled gzipped but then decompressed for user-agents wanting vanilla.

Seems (and it’s not just a CDN thing) that gzip and chunked just don’t always play nice. It’s not the chunking; it’s the chunking plus compression.

One of those things that, once one knows what to Google, one can find. And it’s intermittent, very intermittent, which is why the whispers on Google are faint and conflicting. Intermittent to the extent that some resources don’t seem affected at all, but others of the same MIME type are.
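If you want to see it for yourself, hammering the same resource with and without compression will eventually show it up. Something along these lines, with an invented URL and PHP’s cURL extension:

<?php
// Repro sketch: fetch the same resource repeatedly, once gzipped and
// once plain, and log sizes/errors. The failures are intermittent, so
// expect many clean runs between hits.
$url = 'http://cdn.example.com/styles.css'; // hypothetical

function fetch($url, $encoding) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: ' . $encoding));
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $body = curl_exec($ch);
    $err  = curl_error($ch);
    curl_close($ch);
    return array($body, $err);
}

for ($i = 0; $i < 50; $i++) {
    list($gz, $gzErr) = fetch($url, 'gzip');     // compressed, typically chunked
    list($id, $idErr) = fetch($url, 'identity'); // plain
    printf("run %2d  gzip: %-28s  identity: %s\n", $i,
        $gzErr ? "ERROR ($gzErr)" : strlen($gz) . ' bytes (still compressed)',
        $idErr ? "ERROR ($idErr)" : strlen($id) . ' bytes');
}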

Just one to keep in mind.

You haven’t told us a lot.

If you are serving static files like PNG, JPG, CSS, or JS, you should see no chunking, provided you have no special rules in your webserver. You might have automatic compression of CSS and JS set up; I don’t know.

Are you serving HTML through a CDN? Or maybe generating CSS or JS on the fly?

If you are using something like PHP to generate HTML, CSS, or JS, you should be looking at flush() and the ob_* functions (ob_flush() and friends); any output flushed before the script finishes forces the server to chunk, because the final size isn’t known yet.

If you are doing gzip compression in PHP, perhaps with something like ob_start('ob_gzhandler') or http://www.jpcache.com/, you should look into that as well.

Maybe you should also check that you are using the Vary header correctly (e.g. Vary: Accept-Encoding when you serve compressed and uncompressed variants).
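And for completeness, the pattern I would aim for in PHP is to buffer everything and compress it yourself, so you can state an exact Content-Length and the server never has a reason to chunk. A sketch, where render_page() is a made-up stand-in for whatever builds your output:

<?php
// Buffer the whole response, compress it in one go, and declare an
// exact Content-Length; with the size known up front there is nothing
// to chunk. render_page() is hypothetical.
ob_start();
echo render_page();
$out = ob_get_clean();

header('Vary: Accept-Encoding'); // keep gzip/plain variants apart in caches

if (isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
    $out = gzencode($out);       // needs the zlib extension
    header('Content-Encoding: gzip');
}

header('Content-Length: ' . strlen($out));
echo $out;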