
Re: Issues burning BD disks from the command line - write failures



Volker Kuhlmann wrote:
On Thu 09 May 2013 19:01:12 NZST +1200, Thomas Schmitt wrote:

> This happens only with CDs which were written in write type TAO.
Ehh, I'm very sure I've seen it with DVDs too, and the read-ahead size
there was larger.

> Nevertheless, that is a _read_ problem. Dale has a problem with
> write errors.
Sure, but you asked him to test afterwards by reading back.

> The read-ahead bug has never been observed with
> DVD or BD, anyway.
I have to disagree for DVD, and can't speak for BD, not having tried it.

> In my experience, 128 KB is enough. Tradition is 300 KB, out of
> a wrong perception of the Linux bug and the MMC specs.
Actually it depends on how far the kernel reads ahead, so it might
vary. I got so sick of it that I set the value in my script to 2 MB to
be done with it. I know it's too big, but I don't care.
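For what it's worth, the usual workaround is to read back slightly less
than the full track, so read-ahead never touches the run-out area at the
end of a TAO session. A minimal sketch (the device name, margin, and
filenames are examples; a regular file stands in for /dev/sr0 here):

```shell
# Read back the payload minus a safety margin, so read-ahead never
# reaches the unreadable sectors at the end of the track.
# A regular file stands in for the disc; on real media you would use
# /dev/sr0 and take the size from isosize or your burn log.
dd if=/dev/zero of=fake_disc.img bs=2048 count=1024 2>/dev/null
margin=$((128 * 1024))             # 128 KiB, per the figure above
size=$(stat -c %s fake_disc.img)   # real disc: isosize /dev/sr0
count=$(( (size - margin) / 2048 ))
dd if=fake_disc.img of=readback.img bs=2048 count="$count" 2>/dev/null
echo "read $count of $((size / 2048)) sectors"
```

You can then checksum readback.img against the first $count sectors of
the original image; only the last margin's worth of data goes unverified.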

>> And what are the options for UDF (which is becoming increasingly
>> necessary)?
> mkudffs and cp.

> But for what, particularly?
Random-file-access backups. TBH I stopped burning because 4.2 GB isn't
of much use these days, but wouldn't mind burning some larger disks. I
used ext2 in the past - useless for reading from, but good enough for
dd'ing back to disk before reading. With larger sizes that becomes a
bit annoying.


Why "useless for reading from"? What problems do you have when
mounting read-only for use? I haven't done it in a while, but I don't
recall doing anything magic, and I certainly have used such media to
preserve odd filesystem things like hard links, ACLs, etc. I create an
empty file, make a filesystem on it, mount it, copy what I need,
umount it, and burn. I had to do a bunch of those two years ago.
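That loopback workflow can be sketched like this (filenames and sizes
are just examples, the UDF variant assumes udftools is installed, and
the mount/burn steps need root, so they are shown as comments):

```shell
# 1. Create an empty image file (a DVD would be ~4480M, not 64M).
truncate -s 64M backup.img
# 2. Make a filesystem on it: ext2 here, or mkudffs for UDF, e.g.
#      mkudffs --media-type=dvd backup.img
mke2fs -F -q backup.img
# 3. As root: mount -o loop backup.img /mnt
# 4.          cp -a /data/to/keep /mnt/
# 5.          umount /mnt
# 6. Burn it: growisofs -dvd-compat -Z /dev/sr0=backup.img
echo "image ready: $(stat -c %s backup.img) bytes"
```

The whole image then dd's back to disk in one go, which matches the
"good enough for dd'ing back" use above.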

--
E. Robert Bogusta
  It seemed like a good idea at the time

