
Re: Backup thousands of files 1 at a time with growisofs?



Volker Kuhlmann wrote:

I would particularly like to have my music uncompressed, untarred so I
can just put the dvd into a machine and play them.

This would require each disk to carry its own complete, self-contained ISO
filesystem with a subset of the files. Each disk may end up slightly underfull.

http://www.serice.net/shunt/

Brilliant piece of software by the looks of it. Thanks for the URL!
Check whether it creates disks with a complete filesystem, or whether it
creates one huge filesystem with a piece of the filesystem on each disk.
In the latter case the individual disks wouldn't be usable on their own.

Alternatively, you could hack up some script which reads through your
music directory, sizing up as many files as fit on one disk, burn that
disk, and then proceed with the next file in the directory.
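Something along those lines, sketched in Perl (untested, just an illustration of the idea rather than anything from either of our scripts; the "Music" directory, the DVD capacity, and the per-file overhead figures are all guesses to adjust to taste):

#!/usr/bin/perl
# Rough sketch only: walk a music tree, fill one DVD's worth of files
# at a time in directory order, and write a path list per disk that
# can be handed to mkisofs/growisofs.
use strict;
use warnings;
use File::Find;

my $capacity = 4_400_000_000;   # usable bytes on a single-layer DVD, roughly (assumed)
my $overhead = 2048;            # assumed per-file ISO9660 overhead

my @files;
find(sub { push @files, [ $File::Find::name, -s $_ ] if -f }, 'Music');

my ($disk, $used, @batch) = (1, 0);
for my $f (sort { $a->[0] cmp $b->[0] } @files) {
    my $need = $f->[1] + $overhead;
    if (@batch && $used + $need > $capacity) {
        write_list($disk++, \@batch);   # this set becomes one disk
        $used  = 0;
        @batch = ();
    }
    push @batch, $f->[0];
    $used += $need;
}
write_list($disk, \@batch) if @batch;

sub write_list {
    my ($n, $list) = @_;
    open my $fh, '>', "disk$n.list" or die "disk$n.list: $!";
    print {$fh} "$_\n" for @$list;
    close $fh;
    # then, per disk, something like:
    #   mkisofs -r -J -path-list disk$n.list > disk$n.iso
    #   growisofs -Z /dev/dvd=disk$n.iso
}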

I have a generalized perl script which reads a file in size-name format and generates output file(s) listing the items that will fit on a single medium. The medium size and the per-item overhead are command line parameters. It isn't limited to individual files: I use "du -S" to generate backups that keep each directory on a single medium. It also tries to equalize the content across media, so you don't get a bunch of full media and one last one with virtually nothing on it. That's just my preference; it could be made optional (and may be already, I don't have the code in front of me).
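I don't have that script handy, but the balancing idea is roughly this (a sketch only, not the real code): figure out the minimum number of media from the total size, then place the largest items first, each onto whichever medium is currently emptiest. Sizes on stdin are assumed to be bytes (e.g. from "du -S -B1"), and the capacity and overhead defaults are guesses:

#!/usr/bin/perl
# Sketch of the balancing approach, not the actual script.
# Reads "size name" lines on stdin, sizes in bytes (e.g. du -S -B1).
# Largest items go first, each onto the currently emptiest medium,
# so all media end up roughly equally full.
use strict;
use warnings;
use POSIX qw(ceil);

my $capacity = shift @ARGV // 4_400_000_000;  # bytes per medium (assumed)
my $overhead = shift @ARGV // 2048;           # per-item overhead (assumed)

my @items;
while (my $line = <STDIN>) {
    my ($size, $name) = $line =~ /^\s*(\d+)\s+(.+)$/ or next;
    push @items, [ $size + $overhead, $name ];
}

my $total = 0;
$total += $_->[0] for @items;
my $ndisks = ceil($total / $capacity) || 1;

my @used = (0) x $ndisks;
my @sets = map { [] } 1 .. $ndisks;

for my $item (sort { $b->[0] <=> $a->[0] } @items) {
    # pick the emptiest medium; open a new one if the item won't fit
    my ($best) = sort { $used[$a] <=> $used[$b] } 0 .. $#used;
    if ($used[$best] + $item->[0] > $capacity) {
        push @used, 0;
        push @sets, [];
        $best = $#used;
    }
    $used[$best] += $item->[0];
    push @{ $sets[$best] }, $item->[1];
}

for my $i (0 .. $#sets) {
    printf "medium %d: %d bytes, %d items\n",
        $i + 1, $used[$i], scalar @{ $sets[$i] };
    open my $fh, '>', sprintf("media%02d.list", $i + 1) or die $!;
    print {$fh} "$_\n" for @{ $sets[$i] };
    close $fh;
}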

I originally wrote it in awk a few decades ago when I ran a UNIX BBS and backed up my 20MB hard drive to 400k floppies. The world changed, the problem didn't: backup media are still too small.

In either case you have no control over which file goes on which disk,
i.e. it'll be somewhat random.

Yes, that's a problem. And since this is essentially bin packing, the processing time needed to get a truly optimal layout can be huge when the data only just fit on N media (or just barely don't).

--
E. Robert Bogusta
 It seemed like a good idea at the time


