
In Unix/Linux, is there any maximum file size limit that a compression utility (gzip/compress) can handle? I remember that years ago the gzip man page mentioned it could compress files up to 4 GB. I actually need to compress files of around 512 GB regularly. I tested a few files with the compress utility and found that the MD5 hashes of the DB files before compression and after decompression are the same.

2 Answers


gzip nowadays can compress files larger than 4 GiB, and in fact it has no real limit of its own (you'll be limited by the underlying file system). The only limitation with files larger than 4 GiB is that gzip -l, in version 1.11 or older, won't report their size correctly; see Fastest way of working out uncompressed size of large GZIPPED file for an alternative. This has been fixed in gzip 1.12: gzip -l now decompresses the data to determine the real size of the original data instead of showing the stored size.
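For context, the reason older versions misreport is that the ISIZE field in the gzip trailer only stores the original length modulo 2^32. A minimal sketch of both approaches (the file name is just a placeholder):

    gzip -l bigdb.dump.gz            # pre-1.12: "uncompressed" column is the stored size mod 2^32
    gzip -dc bigdb.dump.gz | wc -c   # always the true size, but has to read the whole file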

There are many other compression tools that offer better compression and/or speed and that you might find more appropriate, such as XZ or 7-Zip.
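As an illustration of the kind of round trip described in the question, here is a rough sketch using xz (file names are placeholders; -T0 uses all available cores, -k keeps the input):

    md5sum bigdb.dump > bigdb.dump.md5   # record the checksum of the original
    xz -T0 -k bigdb.dump                 # writes bigdb.dump.xz, keeps the input
    xz -dc bigdb.dump.xz | md5sum        # should print the same hash as recorded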


Gzip is a concatenable stream compression format (see "advanced usage" in the man page), so if the algorithm hits a hard encoding limit (*) it can simply end the current stream and start a new one.

So there is no hard limit on data size in gzip itself.

(* I don't know enough about the underlying DEFLATE format to say whether it has such a limit.)
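A quick demonstration of the concatenation property (file names are arbitrary): two independently gzipped members, simply concatenated, decompress as one stream:

    printf 'hello ' | gzip > a.gz    # first gzip member
    printf 'world\n' | gzip > b.gz   # second gzip member
    cat a.gz b.gz | gunzip           # prints "hello world"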
