
Re: how do I extract a 2.6 gigabyte .tar.gz file ?



On 28 Oct 1998, Gary L. Hennigan wrote:

> That was an excellent idea; unfortunately, Darxus has already tried
> this and it didn't work for him. Perhaps gzip tries to read the whole
> file, and even though the file is truncated in your case, it'll do
> what it can. In Darxus' case that means it's trying to read past the
> 2GB limit, and that's a no-no under 80x86-based Linux systems.
> 
> However, now knowing that gzip will in fact decompress a file that's
> lost its tail, Darxus could try to write a little C program that calls
> truncate() to cut his file down to around 2GB (a little less might be
> a good idea) and see what he can do with it. 
> 
> Of course, I'd treat this idea as a last resort. I have NO idea whether
> Darxus can copy that file for a backup before trying the truncate()
> thing, and it'd be a Bad Thing (TM) if he truncated the existing file
> only to find out it wouldn't work. Plus, I don't know whether truncate()
> will even work on a file greater than 2GB.
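
For reference, the little C program described above would be something
like this -- just a sketch, with a made-up filename, and assuming the
kernel will even accept the call on a file that's already over 2GB,
which is exactly what's in doubt here:

#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>

int main(void)
{
    /* hypothetical filename -- substitute the real archive */
    const char *path = "huge.tar.gz";
    /* stay a little under 2^31 - 1, since off_t without
       large-file support is a signed 32-bit type */
    off_t len = 2140000000;

    if (truncate(path, len) == -1) {
        perror("truncate");
        return 1;
    }
    return 0;
}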

Unfortunately I cannot back it up.  I only have 4.3gb of fat32 space
total.  And my guess is that truncate() relies on the same call that
every other program is failing on, since it can only handle 2^31 bytes.
I'd love to see somebody who knows what they're doing patch the libs to
handle, say, 2^63-byte files...  Is that even doable?  Or would an
unsigned long int work?
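
(For what it's worth, checking what off_t actually is on a given box
would be something like the sketch below.  I gather an unsigned type
would only push the ceiling to about 4GB and would break calls like
lseek() that use -1 as an error return, so a 64-bit signed off_t,
good for 2^63 - 1 bytes, is presumably what a real fix would need.)

#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    /* on a plain 32-bit setup this prints 4; a large-file-capable
       libc/kernel would need it to be 8 */
    printf("sizeof(off_t) = %lu bytes\n",
           (unsigned long) sizeof(off_t));
    printf("largest length a signed 32-bit off_t can hold: 2147483647\n");
    return 0;
}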

It's sad: out of this 2.6gb file, I'm probably only interested in less
than a megabyte of it :)  -- dunno if those files are at the beginning or
the end.   
________________________________________________________________________
***PGP fingerprint = D5 EB F8 E7 64 55 CF 91  C2 4F E0 4D 18 B6 7C 27***
               darxus@op.net / http://www.op.net/~darxus 
                              Chaos reigns.


