Re: slurping a file
On 06:44 Fri 01 Jun, Hugo Vanwoerkom wrote:
> I want to download a file that consists of many pieces that are
> referred to by the main index.html.
> One way is to do it manually, but that takes days and is prone to errors.
> Is there a "slurp" application that walks through the file and downloads
> the pieces?
1. wget can grab a single file or a whole batch of files. See man wget for
options such as --input-file, which reads a list of URLs from a file. You may
have to preprocess the main index.html to build that list, or perhaps the
pieces are stored in a predictable layout on the server (all the mp3s in one
directory, for instance).
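A rough sketch of that route (the href pattern, URLs, and filenames below are
placeholders, so adjust them to the actual page; the tiny generated index.html
just stands in for the real one):

```shell
# Suppose you already saved the index page locally, e.g. with:
#   wget http://example.com/stuff/index.html
# (a tiny stand-in index.html so the pipeline below has input)
printf '<a href="one.mp3">1</a>\n<a href="two.mp3">2</a>\n' > index.html

# Pull the href targets out and collect them into a URL list.
# The grep/sed pattern is a rough first cut for simple HTML.
grep -o 'href="[^"]*"' index.html | sed 's/^href="//;s/"$//' > urls.txt
cat urls.txt        # one.mp3 and two.mp3

# Then hand the list back to wget, resolving relative URLs:
#   wget --base=http://example.com/stuff/ --input-file=urls.txt
# Or skip the preprocessing entirely and let wget walk the page:
#   wget --recursive --level=1 --no-parent http://example.com/stuff/index.html
```

The --no-parent flag keeps the recursive variant from wandering up the
directory tree and grabbing half the site.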
2. If you know Perl, the LWP module (and others like it) can be scripted to
do the same job very nicely.