gzip dynamic html homepage

Hello,
I am getting a compression ding for the “Cloud ERP Software for Manufacturers | QAD” HTML content-type home page.

http://www.webpagetest.org/result/131209_6E_WS5/1/performance_optimization/#compress_text

Should this dynamic HTML content-type home page be gzipped?

Thanks.

TRX

Yes :slight_smile: It doesn’t really matter whether the page is static or dynamically generated. The web server compresses the output on the fly, based on the content-type of the requested file, like text/html, text/plain, text/css, etc.

GTmetrix has a short but clear write-up: Enable compression | GTmetrix
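
You can verify it yourself by requesting the page with an Accept-Encoding: gzip header and checking whether the response comes back with Content-Encoding: gzip. Here's a minimal PHP sketch; the URL is just a placeholder for the page you want to test:

[php]
// Minimal sketch: ask for gzip and check whether the server answers with it.
// The URL is a placeholder; substitute the page you want to test.
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip'));
$response = curl_exec($ch);
curl_close($ch);
$headers = substr($response, 0, strpos($response, "\r\n\r\n"));
echo (stripos($headers, 'Content-Encoding: gzip') !== false)
    ? "gzip is enabled for this URL\n"
    : "gzip is NOT enabled for this URL\n";
[/php]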

For some reason while writing this post I was thinking of caching. :slight_smile:

What you said makes sense. I’ll take a look at the GTmetrix write-up.

Question, so if your gzip component failed for a page, would you say one would
just need to make the page (code) size smaller so the gzip does not fail?

Thanks!

Depending on their content / frequency of update, you may be able to cache dynamically generated pages too - plenty of news sites do this.

[quote]Question, so if your gzip component failed for a page, would you say one would
just need to make the page (code) size smaller so the gzip does not fail?[/quote]

What makes you think it might fail? On-the-fly gzip compression is basic stuff for most web servers, and it’s certainly not something that’s known to fail. It’ll work regardless of the size of your text file. The bigger the file (your HTML, CSS, XML or whatever), the more processing is required to compress it, but, modern CPUs being what they are, this isn’t really an issue anymore.
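
To put rough numbers on that, here’s a tiny PHP illustration; the figures will obviously vary with content and hardware:

[php]
// Rough illustration of gzip cost vs. savings on a chunk of repetitive HTML.
// Sizes and timings are machine- and content-dependent; this only shows the scale.
$html = str_repeat("<p>Some fairly typical, repetitive HTML markup.</p>\n", 2000);
$start = microtime(true);
$gz = gzencode($html, 6);                    // level 6 is a common default
$ms = (microtime(true) - $start) * 1000;
printf("original %d bytes, gzipped %d bytes (%.0f%% smaller) in %.2f ms\n",
    strlen($html), strlen($gz), 100 * (1 - strlen($gz) / strlen($html)), $ms);
[/php]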

Since you’re running on IIS, this might help:

I don’t mean to use the word “fail” literally, but more in the way webpagetest graded the gzipped URL page. Link below:

http://www.webpagetest.org/result/131209_6E_WS5/1/performance_optimization/#compress_text

It is saying that the page could have saved more KB, but is that due to the page size being too big or the gzip algorithm needing to be better configured?

[quote="trx, post:6, topic:8503"]It is saying that the page could have saved more KB, but is that due to the page size being too big or the gzip algorithm needing to be better configured?
[/quote]

It’s due to gzip compression not being configured at all for the text/html content-type. It is enabled for your JS and CSS files.

Enabling compression will significantly decrease the size of your HTML (and other text files) in transfer. A page will be compressed on-the-fly on the server, regardless of its size, then sent (in compressed form) to the browser, which finally decompresses it. Nothing about the pages (the code) needs to be changed; all you have to do is enable gzip compression in the web server configuration.

If CSS and JS are gzipped, the server is likely configured to gzip text/*.

It is a Windows Server and the page is created by ASP.NET.

Even on Apache, when a CGI script generates the page, it is not the same as a static page being served.
The programmer has to gzip it and set the headers.

For example, in PHP you gzip by using chunked output buffers, gzipping the buffers, and adding the header() calls yourself.
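
Roughly like this; a minimal sketch where render_page() is just a placeholder for whatever generates the HTML:

[php]
// Minimal sketch of gzipping script output by hand.
// render_page() is a hypothetical placeholder for the page-generating code.
ob_start();
header('Content-Type: text/html; charset=utf-8');
echo render_page();
$html = ob_get_clean();
$accepts = isset($_SERVER['HTTP_ACCEPT_ENCODING']) ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';
if (strpos($accepts, 'gzip') !== false) {
    header('Content-Encoding: gzip');
    header('Vary: Accept-Encoding');
    echo gzencode($html, 6);   // level 6 is a common speed/size trade-off
} else {
    echo $html;
}
[/php]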

I would never use an MS development tool like .NET, but I would assume that gzip is an ASP page property, something like this:

// Requires: using System.IO.Compression; using System.Web;
HttpContext context = HttpContext.Current;
context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
HttpContext.Current.Response.AppendHeader("Content-encoding", "gzip");
HttpContext.Current.Response.Cache.VaryByHeaders["Accept-encoding"] = true;

In PHP you just add one line of code and it takes care of everything: ob_start("ob_gzhandler");


That would include text/html, which currently isn’t being compressed.

It’s generally a better idea to have the web server take care of that. Whether your page is dynamically generated or a static .html file doesn’t really matter as long as it’s served as text/html. Your web server, if set up properly, will take care of everything.

Thanks for the feedback. Is it a good practice to gzip pages served from portal systems (e.g. IIS, Apache/Tomcat, Liferay, Vignette)?

I’m not familiar with Liferay and Vignette, but the others are web servers, so: yes.

[quote="robzilla, post:9, topic:8503"]It’s generally a better idea to have the web server take care of that. Whether your page is dynamically generated or a static .html file doesn’t really matter as long as it’s served as text/html. Your web server, if set up properly, will take care of everything.
[/quote]

[size=large]You are correct. Thank you.[/size]
Easier AND Better.

I was under the impression that the programmer had to do the gzip.
I did not know which was better so I tested.

The test results with no .htaccess and the following PHP:
[php]
ob_start("ob_gzhandler");
header('Content-Type: text/html; charset=utf-8');
header('Connection: Keep-Alive');
header('Keep-Alive: timeout=5, max=100');
[/php]

[url=http://www.webpagetest.org/result/131210_61_139W/]http://www.webpagetest.org/result/131210_61_139W/[/url]

Results after adding the .htaccess below and removing the ob_gzhandler, leaving just a plain output buffer:

ob_start();

[url=http://www.webpagetest.org/result/131210_7H_138Y/1/details/]http://www.webpagetest.org/result/131210_7H_138Y/1/details/[/url]

<ifModule mod_gzip.c>
mod_gzip_on Yes
mod_gzip_dechunk Yes
mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
mod_gzip_item_include handler ^cgi-script$
mod_gzip_item_include mime ^text/.*
mod_gzip_item_include mime ^application/x-javascript.*
mod_gzip_item_include mime ^application/javascript.*
mod_gzip_item_exclude mime ^image/.*
mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</ifModule>

The WPT results have too many variables to make a fair assessment.

Further testing showed that there is no performance difference between the two methods. For the next test, I ran a PHP script that retrieved the test pages alternately, 60 times each, by IP address (no DNS). Running the test on the same server as the test pages nearly eliminated the other variables of connect time and transmission. The two test pages are identical other than the gzhandler, so the HTML generation is identical. This pretty much leaves the gzip process as the only test variable, with everything else being nearly constant. The “Wait” time column in the test results represents the time to run the script and gzip the generated HTML.
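
Something along these lines; a rough sketch of that alternating-fetch loop, with a placeholder IP address and file names standing in for the two test pages:

[php]
// Rough sketch of the alternating-fetch benchmark described above.
// The IP address and file names are placeholders for the real test pages.
$pages = array(
    'ob_gzhandler' => 'http://192.0.2.10/test-gzhandler.php',
    '.htaccess'    => 'http://192.0.2.10/test-htaccess.php',
);
$totals = array('ob_gzhandler' => 0.0, '.htaccess' => 0.0);
for ($i = 0; $i < 60; $i++) {
    foreach ($pages as $name => $url) {
        $start = microtime(true);
        file_get_contents($url);
        $totals[$name] += microtime(true) - $start;
    }
}
foreach ($totals as $name => $seconds) {
    printf("%-12s %.1f ms average\n", $name, $seconds / 60 * 1000);
}
[/php]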

The results, as shown in the attached PDF, are conclusive: the two gzip methods perform identically, with negligible difference in total load and Wait times between the two tests.

The “better” is not performance-related, but it’s always better to have options, and better to know what the options are. After giving it consideration, I doubt I’m going to use the .htaccess method and will continue to use the PHP gzhandler. I sometimes use PHP to deliver very small (less than 200 bytes) and/or binary payloads (e.g. images, PDFs) where gzip would be an inappropriate waste of time.

My next test will be to see, when the PHP routine responds with a content type of image/.*, whether the image content type in the header takes precedence over the .php extension in the gzip rules.
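
Probably something as simple as the following sketch of that planned test, where sample.png is a hypothetical placeholder file:

[php]
// Sketch of the planned test: a .php script that returns an image content type,
// to see whether the ^image/.* exclusion wins over the .php extension rule.
// sample.png is a hypothetical placeholder file.
header('Content-Type: image/png');
header('Content-Length: ' . filesize('sample.png'));
readfile('sample.png');
[/php]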

I mostly meant “better” in the sense that the web server is better suited to perform this kind of task, because web servers are usually written in compiled languages (Apache and nginx are written in C), which are faster than interpreted languages like PHP. It’s not a difference you’ll notice on a single request, but it ought to be noticeable when you’re serving hundreds or even thousands of requests per second.

In that sense, it is performance-related, but mostly it’s just easier to keep that logic out of your own code.

(Not unlike configuring Apache via httpd.conf rather than using .htaccess files; more centralized and slightly better, performance-wise.)

Good stuff! Thanks.

I notice that our images are being gzipped by default.

http://www.webpagetest.org/performance_optimization.php?test=150402_AJ_CZ2&run=1&cached=0

Should I attempt to strip that out from the response header?

Thanks.