Re: Copying and tarring large amounts of data
hi ya osamu...
gnu tar ( 1.13.19 ) does not have any problems transferring
4GB-sized files... from machine-A ( linux w/ ext2 ) to machine-B ( ext2 )
( check your version with tar --version )
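for example, you can pipe a whole tree straight to the other box
( just a rough sketch... assumes you have ssh access, and /data and
/backup are made-up paths ):

  tar -cf - /data | ssh machine-B 'tar -xf - -C /backup'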
i think some older gzip has a problem w/ > 2GB files though
if you're transferring vfat files... that's a different problem...
if copying vfat files... you've got to use find /vfat -print | tar .. -T -
to solve the problems with "tom's budget for $$$ next year" type filenames
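a minimal sketch of that pipeline ( assuming /vfat is where the vfat
partition is mounted and /backup/vfat.tar.gz is just an example target...
-print0 plus --null keeps spaces in the names from breaking things ):

  find /vfat -type f -print0 | tar --null -czf /backup/vfat.tar.gz -T -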
backing up winbloze to linux is not an issue... but linux to winbloze
is a bad idea due to permission problems
- retaining permissions/attributes is NOT a problem until it
  needs to be restored ... and if it needs to be restored...
  something else is wrong ??
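if you do need the linux permissions to survive a trip through a vfat
disk, one trick ( just a sketch... /mnt/vfat and /home are example paths )
is to keep the data inside a tar file, so the attributes live in the
archive instead of on the vfat filesystem:

  tar -czf /mnt/vfat/backup.tar.gz /home
  tar -xzpf /mnt/vfat/backup.tar.gz -C /    # restore, -p keeps permissions intact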
have fun linuxing
alvin
http://www.Linux-Backup.net ...
On Mon, 18 Feb 2002, Osamu Aoki wrote:
> On Mon, Feb 18, 2002 at 09:15:33PM -0500, Alan Shutko wrote:
> > Osamu Aoki <debian@aokiconsulting.com> writes:
> >
> > > Many utilities still have a 2GB file size limitation hidden somewhere,
> > > even though the kernel should be able to handle large files.
> > >
> > > So just do not listen to other posts suggesting "tar ..." or similar.
> >
> > The version of tar in woody doesn't have the 2GB limitation with a 2.4
> > kernel. I don't know about the version in stable.
>
> This is encouraging. But if a person is copying files from vfat to ext2,
> isn't it more reasonable to copy them 1-by-1? No permissions need to be
> preserved anyway. Some filename limitations (codepage) may even cause
> problems...
>