Unidentified subject!
From: Michael Stone <mstone@itri.loyola.edu>
> Quoting Darxus (darxus@Op.Net):
> > Any chance I can split this thing into like, 2 pieces, and be able to
> > access half of it ?
>
> First half--no problem; second half--no go. Did you try
> <filename tar zxf -
Just tried it, and it didn't work.
BTW: I found the command split, and tried breaking a test .tar.gz in half to
see if I could still extract some of it. As you said, I was able to extract
the first half. Unfortunately, split isn't able to deal with a 2.6GB file
either.
--------------------------------------------------
From: Andreas Neukoetter <ti95neuk@de.ibm.com>
> Try patching your gzip/cat/less (whichever ones you have the source for).
Hey, I'm running Debian, I can get all the source :)
But would I need to patch gzip/tar/cat/less, or would it be libc I'd be
patching? I'm guessing libc... and how much of a bear is that to
recompile?
> The kernel definitely can handle files up to 4 gig.
> I think some "old" tools use "long int" for the filepos... which gets you
> into trouble when trying to use the high/sign bit
> (2 gig => 2^31, 4 gig => 2^32).
Is it safe to just use an unsigned long (is the +/- bit being used?)?
Wait... is there a signed integer type larger than a signed long?
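A quick sketch I threw together to convince myself of what Andreas is
describing (this is mine, not anything from the thread): an unsigned long
would only postpone the problem until 4 gig, and the larger signed type I
was trying to remember is long long, which as far as I know is what off_t
turns into when a program is built with large-file support. On a 32-bit box
the "long" version comes out negative:

  /* overflow.c -- my sketch: a 2.6 gig size no longer fits in a 32-bit
   * signed long, but it does fit in a long long. */
  #include <stdio.h>

  int main(void)
  {
      long long real_size = 2666693212LL;    /* actual size of home.tgz (from ls -l) */
      long      old_style = (long)real_size; /* what a tool using "long int" ends up with */

      printf("sizeof(long) = %u, sizeof(long long) = %u\n",
             (unsigned)sizeof(long), (unsigned)sizeof(long long));
      printf("as long long: %lld\n", real_size);
      printf("as long:      %ld\n", old_style); /* negative on a 32-bit long */
      return 0;
  }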
--------------------------------------------------
From: Keith Beattie <ksb@icds.org>
> Hang on a second here, if the file exceeds the size limit of the file
> system, how did it get there in the first place? If nothing in libc
> can grok it, what created it?
I created it on a FAT32 filesystem with a command similar to:
tar -zcvf /mnt/c/home.tgz /home
with the FAT32 filesystem mounted at /mnt/c.
> Something is amiss here. What I understand you to be saying is
> something like this: "I built a desk in my garage that I want to give
> it to a friend but I can't get it out of my garage because it is
> bigger than my garage!"
Yeah, something like that. Apparently tar/gzip don't have a problem
continuing to append to a file as it grows past 2GB, as long as it was
smaller than that when the file was originally opened (in this case, 0 bytes).
> Are you sure the size is truly over 2 Gigs?
ls -l
total 2699568
-rwxrwxr-x 1 root root 735435 Oct 10 19:14 etc.tgz
-rwxrwxr-x 1 root root 2666693212 Oct 10 21:10 home.tgz
-rwxrwxr-x 1 root root 96898501 Oct 10 19:43 old.tgz
-rwxrwxr-x 1 root root 20829 Oct 10 19:14 root.tgz
> How about "od" or writing your own C or perl "cat" program to see
> where they fail?
od home.tgz
0000000
I'm not familiar w/ od, but I'm guessing that output means it wasn't all
that successful...
I'm pretty confident I know where it's failing -- I have vague recollections
of the relevant code, it's just been a while since I've looked at it. I'm
guessing it fails where it originally tries to open the file.
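Here's the sort of bare-bones C "cat" Keith is suggesting, just to pin down
the failing call and the offset -- my own sketch, and I haven't tried it
against the big file yet. If the problem really is 32-bit offsets, I'd expect
it to die at open() or around the 2 gig mark unless it's compiled with
large-file support (e.g. -D_FILE_OFFSET_BITS=64 on a glibc that has it), but
that's a guess. It only counts bytes instead of writing them out; strerror()
on the failing call should say which limit we're hitting.

  /* bigcat.c -- read a file sequentially and report where (if anywhere) it fails. */
  #include <stdio.h>
  #include <string.h>
  #include <errno.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(int argc, char **argv)
  {
      char buf[65536];
      unsigned long long total = 0;   /* byte count kept in 64 bits on purpose */
      ssize_t n;
      int fd;

      if (argc != 2) {
          fprintf(stderr, "usage: %s file\n", argv[0]);
          return 1;
      }
      fd = open(argv[1], O_RDONLY);
      if (fd < 0) {
          fprintf(stderr, "open failed: %s\n", strerror(errno));
          return 1;
      }
      while ((n = read(fd, buf, sizeof buf)) > 0)
          total += (unsigned long long)n;
      if (n < 0)
          fprintf(stderr, "read failed at byte %llu: %s\n", total, strerror(errno));
      else
          printf("read %llu bytes without error\n", total);
      close(fd);
      return 0;
  }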
--------------------------------------------------
From: Hamish Moffatt <hamish@debian.org>
> Perhaps it's still ON the fat32 file system.
Yup, I had tried copying it back to my ext2 filesystem, and it ended up
with a file size of 0.
________________________________________________________________________
***PGP fingerprint = D5 EB F8 E7 64 55 CF 91 C2 4F E0 4D 18 B6 7C 27***
darxus@op.net / http://www.op.net/~darxus
Chaos reigns.