on Sun, Dec 09, 2001 at 06:48:46AM +0800, csj (csj@mindgate.net) wrote:
> On Saturday 08 December 2001 17:28, Karsten M. Self wrote:
> > on Fri, Dec 07, 2001 at 10:27:48PM +0800, csj (csj@mindgate.net)
> > wrote:
> > > lav2wav +p /xb/base/input-20011207-1948.avi | mp2enc -o audio.mp2
> > > INFO: Norm set to PAL
> > > **ERROR: Error opening /xb/base/input-20011207-1948.avi: File too large
> > > **ERROR: EOF in WAV header
> > > **ERROR: failure reading WAV file
> > >
> > > The program comes from a package which claims to have large file
> > > support, which I enabled at compilation:
<...>
> > > My question: is it the OS or the program's fault? If it means
> > > anything, other programs (all encoders) return a more generic
> > > "Could not open" error message. What puzzles me is that I was able
> > > to create the file thru another program (xawtv's streamer).
<...>
> > Large file support is a complex situation, not something you can just
> > turn on and off. Essentially, the problem is that all parts of the
> > process chain have to support it.
> >
> > Not familiar with the tools you're using, but you might want to try
> > doing other operations on a large file or files to see what does or
> > doesn't work with it. Is your program invoking another application
> > which may not have LF support?
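One cheap way to test the chain, without waiting for another 4GB capture, is a sparse file: seek past the 2GB boundary and write a single byte. This is only a sketch (the filename sparse.avi is made up for illustration), and it assumes GNU dd on a kernel/filesystem combination that itself has large file support:

```shell
# Create a >2GB file that occupies almost no disk space:
# seek 2^31 bytes into the (new) file, then write one byte.
dd if=/dev/zero of=sparse.avi bs=1 count=1 seek=2147483648
ls -l sparse.avi   # apparent size: 2147483649 bytes

# Now feed sparse.avi to each tool in the chain; whichever
# one refuses to open it is the one built without LFS.
```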
>
> Let me see. I tried something like (actual bash session is now fuzzy):
>
> mkfifo movie.avi
> dd if=original-4GB-capture.avi of=movie.avi
> ffmpeg [option, options] movie.avi
>
> This lets ffmpeg encode half of the one-and-a-half-hour test movie.
> Several retries convinced me that it's not a random segfault. I even
> diff'ed the resulting files: same tropical fruit. Since 4GB * half =
> the magic 2GB limit, I suspect it's either an error in the video
> stream (say, an EOF inserted in the middle of the file by the capture
> program, streamer) or a problem with the OS.
>
> What may be of further interest: applying ffmpeg to the raw file
> (original-4GB-capture.avi) results in an error message to the effect
> that the file can't be found.
>
> Side question: are the GNU utilities afflicted by the 2GB limit? Can
> dd properly handle 2GB+ files?
Easy to test:
$ dd if=/dev/zero of=test bs=1024 count=4200000
Fails on my system:
[karsten@ego:karsten]$ dd if=/dev/zero of=bigfile bs=1024 count=4200000
dd: writing `bigfile': File too large
2097152+0 records in
2097151+0 records out
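Whether a given dd binary can cross that boundary depends on how it (and the C library) was compiled. A rough way to query the platform's large-file settings, assuming glibc's getconf is available (output varies by system):

```shell
# Compiler flags the platform recommends for large file support
# (may print nothing where off_t is already 64 bits wide):
getconf LFS_CFLAGS

# Maximum file size, in bits, for files on a given filesystem:
getconf FILESIZEBITS /
```

If FILESIZEBITS reports 32 for the filesystem in question, no amount of recompiling the tools will help.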
Peace.
--
Karsten M. Self <kmself@ix.netcom.com> http://kmself.home.netcom.com/
What part of "Gestalt" don't you understand? Home of the brave
http://gestalt-system.sourceforge.net/ Land of the free
Free Dmitry! Boycott Adobe! Repeal the DMCA! http://www.freesklyarov.org
Geek for Hire http://kmself.home.netcom.com/resume.html