
Re: problems with tar for backup (maximum tar file size?)



On Thu, May 29, 2008 at 10:46:10PM -0400, Jimmy Wu wrote:
> I haven't been backing up any of my stuff, and yesterday I decided to
> start doing that
> I want to use tar with bz2, and I wrote this little script to
> hopefully automate this process (attached)
> The script works, but tar doesn't.  The logs show no errors until
> somewhere near the end, when it says
> tar: Error exit delayed from previous errors
> but no other errors.
> 
> I've been searching online, and the only thing I can think of that's
> wrong is the directory is too big.  From what I read, the way tar
> works, the tar archive can't be bigger than 8GB.  My home directory is
> about that much, maybe a little more.  The largest file I have is a 2+
> GB dvd iso.
> 
> So I was wondering: (1) Is it true that tar files can't be bigger than
> 8GB, and (2) If so, what should I use to backup directories bigger
> than 8GB?  I wanted to stick with tar because I can open those on
> other platforms.  If directory size isn't the problem, then what could
> be going on?

Read the tar info docs (the tar-doc package is in contrib or non-free, I
forget which, and is in info format, so you need info or pinfo to read
it).  The maximum size depends on the archive format you are creating,
but neither the GNU nor the POSIX format has this 8 GB limitation, so
that shouldn't be your problem.
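If you want to be sure which format you get, you can ask for the POSIX
(pax) format explicitly.  A small sketch, using throwaway paths under
mktemp rather than your real home directory:

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/data"
echo hello > "$tmp/data/file.txt"

# --format=posix selects the pax format, which does not have the old
# 8 GB member-size limit of the v7/ustar header.
tar --format=posix -cjf "$tmp/backup.tar.bz2" -C "$tmp" data

# List the archive to confirm it is readable.
listing=$(tar -tjf "$tmp/backup.tar.bz2")
rm -rf "$tmp"
```

GNU tar also accepts --format=gnu; check the info docs for the default
your version uses.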

I get that error when I tell tar to back up a dangling symlink (a
symlink that points to a non-existent file).  Check the directories you
are backing up for such links.  Even with the error, though, I get a
functional tarball at the end.
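GNU find can hunt those links down for you before the backup runs.  A
sketch against a temporary directory (the -xtype test is a GNU
extension):

```shell
set -e
tmp=$(mktemp -d)
# Create a symlink whose target does not exist.
ln -s "$tmp/no-such-target" "$tmp/dangling"

# -xtype l matches symlinks whose target is missing, i.e. dangling
# links; a healthy symlink is reported as its target's type instead.
dangling=$(find "$tmp" -xtype l)
rm -rf "$tmp"
```

Run the same find over your home directory and either delete the stale
links or exclude them from the tar command.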

What happens if you add the -v (verbose) option?  Do you see more
detail?
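Since your script logs the run, it helps to fold tar's error messages
into the same log as the -v listing, so you can see exactly which file
the errors refer to.  A sketch with placeholder paths:

```shell
set -e
tmp=$(mktemp -d)
echo data > "$tmp/src.txt"

# -v prints each member as it is archived; 2>&1 sends tar's error
# messages into the same log file as the listing.
tar -cvjf "$tmp/out.tar.bz2" -C "$tmp" src.txt > "$tmp/backup.log" 2>&1

log=$(cat "$tmp/backup.log")
rm -rf "$tmp"
```

In your script, the interesting lines will be the ones just before
"Error exit delayed from previous errors".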

Doug.

