If this has been discussed recently, could someone point me to a rough date so I can search the archives.
I'm having trouble with large files (i.e. > 2GB), and am trying to figure out the best way to sort it. I'm going to make a few assumptions, so correct me where I'm wrong:
Things I need in order to support large files
* I'm running kernel 2.4.6, which I believe supports large files
* I assume I need libs that can handle large files; I believe that means glibc >= 2.2
- How do I check what version of glibc I have?
* And finally, the application I'm trying to manipulate the large file with should itself be compiled with large-file support, e.g. if I want to "gzip" a file, then gzip should be compiled (with the right flags) against glibc >= 2.2??
What happens is, I have a piece of 3rd party software that runs a backup every night. It creates a single file, and now that file has grown to be larger than 2 GB, with the result that I get a file of size 2147483648 (the 2GB limit, I assume) when I know it should be bigger. This 3rd party software uses its own precompiled tools (i.e. tar and gzip), which may not have been compiled for use with large files. I'm just trying to find where the problem occurs, i.e. is it the kernel? the tool? the library?