On Tue, Dec 19, 2000 at 05:38:29PM +0000, John McCormac mentioned:
> Is there some upper limit in Gzip that prevents it from uncompressing
> files bigger than 2G ? I've been trying to uncompress (gzip -d
> filename>outfile) a large file for the last day or two and it seems to
> barf when it gets to the 2G mark. As far as I know the -d switch
> decompresses to stdout and the problem seems to be there.
Nah, that's a Linux problem. On 32-bit Intel, the kernel doesn't support
files greater than 2GB. There are patches that will let you use big files,
though they impose a noticeable performance hit on other operations.
Installing the RedHat "Enterprise" kernel will enable this.
"When I say 'free', I mean 'free': free from bond, of chain or command:
to go where you will, even to Mordor, Saruman, if you desire."
  -- Gandalf, paraphrasing the choice between Free and Non-free software