Leverage browser caching of static assets. 1 error.

Getting this error only on the root of our domain (so we have a score of 95% for this category).

FAILED - (No max-age or expires) - http://www.mydomain.com/

I am using NGINX. Everything else has a green checkmark (CSS files and images). It's just the root domain.

A server admin set it up in the NGINX config file, but all I can see related to NGINX is the following:

    proxy_cache_valid 200 2m;

    limit_conn_zone $binary_remote_addr zone=lz:10m;
    limit_conn lz 60;

    limit_req_zone  $binary_remote_addr  zone=eigth:4m   rate=7r/m;
    limit_req_zone  $binary_remote_addr  zone=fourth:4m   rate=15r/m;
    limit_req_zone  $binary_remote_addr  zone=half:4m   rate=30r/m;
    limit_req_zone  $binary_remote_addr  zone=one:4m   rate=1r/s;
    limit_req_zone  $binary_remote_addr  zone=two:4m   rate=2r/s;
    limit_req_zone  $binary_remote_addr  zone=three:4m   rate=3r/s;
    limit_req_zone  $binary_remote_addr  zone=four:4m   rate=4r/s;
    limit_req_zone  $binary_remote_addr  zone=five:4m   rate=5r/s;
    limit_req_zone  $binary_remote_addr  zone=six:4m   rate=6r/s;
    limit_req_zone  $binary_remote_addr  zone=seven:4m   rate=7r/s;
    limit_req_zone  $binary_remote_addr  zone=eight:4m   rate=8r/s;
    limit_req_zone  $binary_remote_addr  zone=nine:4m   rate=9r/s;
    limit_req_zone  $binary_remote_addr  zone=ten:4m   rate=10r/s;
    limit_req_zone  $binary_remote_addr  zone=twelve:4m   rate=12r/s;
    limit_req_zone  $binary_remote_addr  zone=fifteen:4m   rate=15r/s;

No idea what that is, but it's working for everything except the root of our domain.

Can anyone get us to 100%?
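Side note: apart from proxy_cache_valid, everything in that block is rate limiting, not caching, so it has no bearing on this warning. A limit_req_zone definition does nothing on its own; it only takes effect where a limit_req directive references the zone, roughly like this (the location path and burst value here are made up for illustration; the zone name "one" comes from your config):

    # Sketch: apply the "one" zone (1 request/second per client IP)
    # to a location; requests beyond the burst are rejected.
    location /some-path {
        limit_req zone=one burst=5 nodelay;
    }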

You probably don’t want your root domain cached at all. The only reason it is generating an error is that WebPagetest thinks it is static, since there are no explicit no-cache headers. Is the site PHP or some other CGI?

Actually the page IS static.

How do I set explicit no-cache headers in NGINX?

This red X is really bothering me :smiley:
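In case it helps others: explicitly non-cacheable in NGINX looks roughly like the sketch below. expires -1 sends an Expires header in the past, and add_header sets Cache-Control; both are standard directives:

    # Sketch: mark the root page as explicitly non-cacheable.
    location = / {
        expires -1;
        add_header Cache-Control "no-store, no-cache, must-revalidate";
    }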

You should be able to just define the regex for the URL structure you want to cache and add the expires directive:

    # Set image format types to expire in a very long time
    location ~* ^.+\.(jpg|jpeg|gif|png|ico)$ {
        expires max;
    }


Not sure what your URL structure looks like for the HTML, but that should give you the basics (maybe use something more conservative than max for the expires time).

Well, like I said, the problem I am getting is not for the images… it’s for the root-level domain "/".

That is why my score is 95/100. The only thing not being cached is the root domain main page itself. (images are being cached fine)

Yes, add an exact-match location block for the main page:

location = / {
    expires max;
}


Hi Again

I just found this in the /etc/nginx/vhosts config file for our domain name:

location ~* ^.+.(js|css|png|jpg|jpeg|gif|ico|htm|html)$ {
    FileETag off;
    expires 30d;
    proxy_cache mydomain.com-static;
}

I was wondering what needs to be changed or removed there. I think |htm|html can be removed, as I believe our host added it trying to fix this. The FileETag is another problem we’re having, since I don’t think that directive works either… but as for the issue in this ticket: am I missing a "/" somewhere?

Please advise. I really would like a solution to this.

OK, I fixed it… you’re right, the

location = / {
    expires max;
}

code worked… I just added it underneath my existing images location block.
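For anyone who hits the same warning later, the end state of this thread amounts to roughly the following sketch (the asset regex and 30d expiry are based on the vhost quoted above, with htm|html dropped as discussed; treat it as a starting point, not a drop-in config):

    # Long-lived caching for static assets (as in the existing vhost).
    location ~* ^.+\.(js|css|png|jpg|jpeg|gif|ico)$ {
        expires 30d;
    }

    # The actual fix: an exact-match block so the root page itself
    # gets Expires/Cache-Control headers.
    location = / {
        expires max;
    }

You can verify it with `curl -I http://www.mydomain.com/` and check that the response now includes Expires and Cache-Control: max-age headers.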