
I am trying to enable gzip compression for components of my website. The server runs Ubuntu 11.04 and Nginx 1.2.0.

In my Nginx configuration of the website, I have this:

gzip             on;
#gzip_min_length  1000;
gzip_http_version 1.1;
gzip_vary on;
gzip_comp_level 6;
gzip_proxied any;
gzip_types text/plain text/html text/css application/json application/javascript application/x-javascript text/javascript text/xml application/xml application/rss+xml application/atom+xml application/rdf+xml;
# was: gzip_buffers 16 8k;
gzip_buffers 128 4k; # my page size is 4k
gzip_disable "MSIE [1-6]\.(?!.*SV1)";

YSlow and Google PageSpeed both advise me to use gzip to reduce the amount of data sent over the network.

Now when I run curl -I against my JS file, I get:

curl -I http://www.albawaba.com/sites/default/files/js/js_367664096ca6baf65052749f685cac7b.js
HTTP/1.1 200 OK
Server: nginx/1.2.0
Date: Sun, 14 Apr 2013 13:15:43 GMT
Content-Type: application/x-javascript
Content-Length: 208463
Connection: keep-alive
Last-Modified: Sun, 14 Apr 2013 10:58:06 GMT
Vary: Accept-Encoding
Expires: Thu, 31 Dec 2037 23:55:55 GMT
Cache-Control: max-age=315360000
Pragma: public
Cache-Control: public
Accept-Ranges: bytes

Any idea what I have done wrong, or what I should do to get compressed content?

  • Could this be caused by this line: gzip_http_version 1.1;? What happens if you change it to 1.0?
    – Chuan Ma
    Commented Apr 15, 2013 at 15:41
  • Thanks Chuan for your suggestion (thumbs up :) ). Now Google PageSpeed Insights doesn't give the warning, although curl -I gives the same result.
    – Alaa
    Commented Apr 16, 2013 at 7:26
  • And Google is still reporting albawaba.com/countries_list as uncompressed as well.
    – Alaa
    Commented Apr 16, 2013 at 8:09
  • Did you ever end up figuring this one out?
    – Drew
    Commented Aug 11, 2013 at 10:20
  • @Drew, I just changed gzip_http_version 1.1; to gzip_http_version 1.0; and then it worked.
    – Alaa
    Commented Aug 11, 2013 at 15:27

9 Answers


As others have written, it's not enough to enable gzip compression in your server -- the client also needs to ask for it in its requests via the Accept-Encoding: gzip header (or a superset thereof). Modern browsers include this header automatically, but for curl you'll need to include one of the following in your command:

  • -H "Accept-Encoding: gzip" : You should see the Content-Encoding: gzip header in the response (might need to output headers with curl's -v flag), as well as some seemingly garbled output for the content, the actual gzip stream.
  • --compressed : You should still see Content-Encoding: gzip in the response headers, but curl knows to decompress the content before outputting it.
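For example, using the URL from the question (any URL your nginx serves works the same way):

curl -s -H "Accept-Encoding: gzip" -D - -o /dev/null http://www.albawaba.com/sites/default/files/js/js_367664096ca6baf65052749f685cac7b.js

curl --compressed -s -D - -o /dev/null http://www.albawaba.com/sites/default/files/js/js_367664096ca6baf65052749f685cac7b.js | grep -i '^content-encoding'

Both make a full GET and dump the response headers (-D -) while discarding the body; the second filters the output down to just the Content-Encoding header.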
  • Strange thing. My nginx on an AWS Ubuntu instance doesn't set the Content-Encoding: gzip header when I make a request from a browser. But when I do curl -H "Accept-Encoding: gzip" -I http://example.com, nginx sets the Content-Encoding: gzip header just fine. Why? And the content seems to be gzipped even though the header is not set. I checked with Google's PageSpeed and it doesn't complain about missing gzip.
    – Green
    Commented Dec 9, 2015 at 14:04
  • Hard to say. The various servers have different logic around this, e.g. some blacklist certain User-Agents or Content-Types from gzip encoding. In other cases the header might be there but the browser hides it from the user. So it really depends. :)
    – lot
    Commented Dec 10, 2015 at 18:33
  • I'm experiencing the same behavior as @Green: nginx reports gzip compression in the logs, and for curl requests I see the Content-Encoding: gzip header in responses, but not when the requests are made from a browser (in the Network tab of Chrome Developer Tools).
    – Marc
    Commented Jul 4, 2016 at 9:24

I can't find anything obviously wrong with your config; usually gzip on and gzip_types application/x-javascript would be enough to get you going. If everything is working right, you'll get a Content-Encoding: gzip header back in the response.

Please keep in mind: I get much more consistent results with Google Developer Tools (curl just doesn't behave the way a browser would).

In Chrome, right-click and choose "Inspect Element", then go to the "Network" tab (reload the page if you have to), click on a resource, and check the Headers tab. The output should look like this (notice that content-encoding is gzip):

Request URL:https://ssl.gstatic.com/gb/js/sem_a3becc1f55aef317b63a03a400446790.js
Request Method:GET
Status Code:200 OK (from cache)
Response Headers
age:199067
cache-control:public, max-age=691200
content-encoding:gzip
content-length:19132
content-type:text/javascript
date:Fri, 12 Apr 2013 06:32:58 GMT
expires:Sat, 20 Apr 2013 06:32:58 GMT
last-modified:Sat, 23 Mar 2013 01:48:21 GMT
server:sffe
status:200 OK
vary:Accept-Encoding
version:HTTP/1.1
x-content-type-options:nosniff
x-xss-protection:1; mode=block

Anyway, if you are sure your content is not getting gzipped, I normally get up and running pretty fast with the following:

## Compression
gzip              on;
gzip_buffers      16 8k;
gzip_comp_level   4;
gzip_http_version 1.0;
gzip_min_length   1280;
gzip_types        text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript image/x-icon image/bmp;
gzip_vary         on;

You could try this as a replacement for your config, and/or tweak the values one at a time to help you isolate the issue.

Remember to restart or reload nginx after changing the config.
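On Ubuntu the usual commands are (a sketch; adjust for your init system):

sudo nginx -t             # check the config for syntax errors first
sudo service nginx reload # pick up the new settings without dropping connections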

It may also be useful to check your logs and see if there's anything interesting there should you still be stuck.


I just changed gzip_http_version 1.1; to be gzip_http_version 1.0; and then it worked
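In config terms, the change was just this one line:

# was: gzip_http_version 1.1;
gzip_http_version 1.0;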

  • It should be mentioned here that this answer isn't a proper solution to the problem; it's more of a workaround. Setting gzip_http_version to 1.0 can bring its own problems; for more info see e.g. serverfault.com/questions/418693/…
    – MBI
    Commented Jun 28, 2017 at 12:21

I had to enable gzip in my /etc/nginx/nginx.conf configuration:

gzip on;
gzip_disable "msie6";

gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;

Please note that I had to add application/javascript to the standard gzip_types configuration.
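For context, a minimal sketch of where these directives live: inside the http block of /etc/nginx/nginx.conf, so they apply to every server (your layout may differ):

http {
    gzip on;
    gzip_disable "msie6";
    gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;

    include /etc/nginx/sites-enabled/*;
}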


Here is my nginx configuration and it works:

gzip                on;
gzip_min_length     1000;
gzip_buffers        4 8k;
gzip_http_version   1.0;
gzip_disable        "msie6";
gzip_types          text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;
gzip_vary           on;

I think the key points are gzip, gzip_disable and gzip_types.

  • "are gzip_disable, gzip_disable"? Is the double disable important?
    Commented Apr 26, 2018 at 20:40
  • Maybe he meant gzip on for one of those?
    Commented Nov 23, 2018 at 16:41

You need to run:

curl -I --compressed my_js_file

to make curl send an Accept-Encoding header for gzip - the server will only compress content if the client sends a header saying it will accept it.

NB you can write:

gzip_disable "msi6"

rather than using a regex to disable gzip in IE 5.5 and 6, and you needn't specify text/html as a type because it is always compressed as long as gzip is activated.

  • curl --compressed --head albawaba.com/sites/default/files/js/… gives: HTTP/1.1 200 OK Server: nginx/1.2.0 Date: Sun, 14 Apr 2013 16:59:34 GMT Content-Type: application/x-javascript
    – Alaa
    Commented Apr 14, 2013 at 17:00
  • I suspect that isn't the full output. It is the Content-Encoding header you need to check for, not Content-Type. If I use curl --compressed on a local nginx server I get "Content-Encoding: gzip", but no Content-Encoding header if I do not include the --compressed option.
    – junichiro
    Commented Apr 14, 2013 at 18:05
  • Also, maybe too obvious, but make sure you restart the web server after editing the config file.
    – junichiro
    Commented Apr 14, 2013 at 19:34
  • Sure, I restarted it :), but as you can see at developers.google.com/speed/pagespeed/…, Google's speed analyzer is telling me that my components are not gzipped either.
    – Alaa
    Commented Apr 14, 2013 at 20:03
  • I think chue x has the answer, though you still need to use --compressed if you want to use curl.
    – junichiro
    Commented Apr 14, 2013 at 22:55

I am just taking a guess here, but I think you may have to increase your gzip buffer size.

Here are the files that the browser pulls down from the domain. The number on the right is the file download size.

[Screenshot: files served from the domain, with download sizes on the right]

You may not be able to tell from the screenshot, but all of the text content files ARE gzipped, except for the js file you mention in your question. In the screenshot the js file is the one in green, with a size of about 200K. That is greater than the total space you had specified for your gzip buffers (16 × 8k = 128K).

The Gzip module docs do not really give a good indication as to what the gzip buffers are used for (whether the buffers are used for uncompressed or compressed data). However, the following post seems to indicate that the buffer size should be greater than the uncompressed file size: Large files with NGINX, GZip, and SSL
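If you want to test that guess, note that the directive takes a buffer count and a buffer size, and the total space is their product. A sketch (the numbers are illustrative):

gzip_buffers 64 8k; # 64 buffers of 8k = 512K, comfortably above the ~200K file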

  • Thanks Chue, I have raised gzip_buffers (as shown in the question above [edited]), and still the same problem. Please advise if you have any thoughts.
    – Alaa
    Commented Apr 15, 2013 at 7:55
  • By the way, I really appreciate your help, but even countries_list hasn't been gzipped, although it is smaller than 128K...
    – Alaa
    Commented Apr 15, 2013 at 8:21
  • @Alaa - You are correct about countries_list not being gzipped. I missed that. I have a test nginx setup where I put both files, and they are both gzipped. My gzip config only has two lines: gzip on and gzip_types .... Also, is it possible that the files are cached by a proxy in front of your nginx server? Can you try curl on the nginx machine itself (curl ... http://127.0.0.1/.../js_36..js)?
    – chue x
    Commented Apr 15, 2013 at 15:25
  • You should check whether the response contains the header Content-Encoding: gzip.
    – Flimm
    Commented Aug 1, 2017 at 9:54
  • The last link ("Large files with NGINX") doesn't seem to work.
    Commented Sep 3, 2018 at 23:44

Just like Alaa, I had to add gzip_http_version 1.0; (no version was previously specified) for it to work (I tried with Firefox 27.0.0).


I've experienced the same problem as Alaa; in my case it was caused by the antivirus software installed on my computer.

Proxy servers and anti-virus software can disable compression when files are downloaded to a client machine. So if you are viewing the site in a browser on a client machine that runs such anti-virus software, or that sits behind an intermediate proxy server (many proxies are transparent, and you may not even be aware of one intervening between your client and the web server), either one may be the cause of this issue.
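One way to rule out an intermediary (as chue x also suggested in the comments above) is to test from the server itself, so nothing can sit between client and server. A sketch, using the host and file from the question:

curl --compressed -s -D - -o /dev/null -H "Host: www.albawaba.com" http://127.0.0.1/sites/default/files/js/js_367664096ca6baf65052749f685cac7b.js | grep -i '^content-encoding'

If this shows Content-Encoding: gzip while the same request from your desktop does not, something in between is stripping the compression.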

Disabling the antivirus solved my problem with browsers, and you don't even need to set gzip_http_version to 1.0.

Hope that will help you.
