Re: how do I extract a 2.6 gigabyte .tar.gz file ?
Darxus <darxus@Op.Net> writes:
| On 27 Oct 1998, Gary L. Hennigan wrote:
|
| > | > I felt like checking. Oops. When I reinstalled & tried to
| > | > restore it, I
| > | > found out that gzip can't seek to the end of the file (dies
| > | > around 2gb?).
| > |
| > | You can force gzip to handle it as a stream. Try something like:
| > |
| > | cat tarfile.tgz | gunzip -c | tar xvf -
| > |
| > | The "-c" tells gunzip to pipe it to stdout, and the "xvf -" tells tar
| > | to verbosely extract the file coming into stdin. If I understand your
| > | problem correctly, this should work.
| >
| > It's even simpler:
| >
| > gzip -d -c tarfile.tgz |tar xvf -
| >
| > or, if you're using GNU tar (and you are, under Debian):
| >
| > tar xzvf tarfile.tgz
|
| They are both lovely suggestions; unfortunately the problem is a bit more
| substantial. The first thing I tried was "tar -zxvf home.tgz", and a couple
| of the things I tried soon after that were cat and less. Neither of them
| read any of it -- less was the only thing that did anything useful, which
| was to say "Cannot seek to that file position", which made me think "hmm,
| did I hear something about stuff not being able to seek past 2gb??"
Yikes!
I missed your original post. That's what I get for replying to you via
someone else's reply and not reading the subject closely enough. Duh?
Where on earth did you store this file? I could've sworn ext2fs had
a 2GB/file limit. Certainly all the file utilities do. It comes down
to file offsets being 32-bit signed integers: nothing that goes
through libc is going to be able to read beyond 2^31-1 bytes (just
under 2GB) on an 80x86-based system, e.g. 386/486/Pentium/Pentium II.
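Spelled out, that signed 32-bit limit works out to:

```shell
# Largest file offset a signed 32-bit integer can represent
# (2GB minus one byte):
echo $(( 2 * 1024 * 1024 * 1024 - 1 ))
# prints: 2147483647
```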
| You might think that it would sit there chewing on the file for a bit
| before it got to some point beyond what it could deal with. Nope. Didn't
| even start -- failed to even open the file up.
Actually, gzip shouldn't need to read the whole file before it can
start uncompressing -- it's a stream format, and decompresses front
to back without seeking. (It does store a CRC and length at the very
end, but those are only checked once it's done.) My guess is the
open() itself is failing: with 32-bit signed offsets the tools won't
touch a file bigger than 2^31-1 bytes at all, which would explain
why it dies before reading anything.
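For what it's worth, you can see gzip's stream behaviour for
yourself -- it decompresses happily through a pipe, never seeking:

```shell
# Round-trip a string through gzip entirely in a pipeline;
# gzip -dc decompresses from stdin to stdout with no random access
echo "hello stream" | gzip -c | gzip -dc
# prints: hello stream
```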
| Any more ideas ? :)
Only thing I can think of is getting access to a 64-bit machine,
decompressing the file there, tarring the contents off to tape and
then restoring them on your machine. Or at least putting them into
sub-2GB chunks before taking them back.
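A sketch of the chunking idea, with hypothetical file names and a
tiny stand-in archive so the commands are concrete (you'd use
something like "split -b 1024m" for real 1GB pieces):

```shell
# Stand-in for the real 2.6GB archive (hypothetical names throughout)
mkdir -p src restore
echo "some data" > src/file.txt
tar czf home.tgz src

# On the big machine: split the archive into sub-2GB pieces
# (100-byte pieces here just so this demo actually splits;
# "-b 1024m" gives 1GB chunks in practice)
split -b 100 home.tgz home.tgz.part.

# Back on the 32-bit box: reassemble as a stream and extract --
# the pipe means no tool ever opens or seeks the full-size file
cat home.tgz.part.* | (cd restore && tar xzf -)
```

The glob expands the pieces in order (.aa, .ab, ...), so cat
reassembles them correctly.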
You might have some luck under Windows, though I have a deep
suspicion that you'll get the same results.
Sorry!
Gary