
Re: website saver for linux?



On Mon, 01 Sep 2008 14:53:38 +0530
Sabarish Balagopal <sabarish.balagopal@gmail.com> wrote:

> Ron Johnson wrote:
> > On 08/31/08 11:02, Philip wrote:
> >> I'm looking for a tool which spiders a site, and downloads every
> >> page in the domain that it finds linked from a particular url and
> >> linked urls in the domain, creating a local site that can be
> >> manipulated offline as static html.
> >>
> >> Is there such a tool for linux (better still debian)?
> >
> > $ wget -m -k -L -np http://www.example.com
> >
> > I run this every week on a certain site that I want to archive the 
> > contents of.  The first time you run it, the whole site gets 
> > mirrored.  Each subsequent run, only new and modified pages are
> > fetched.
> >
> 
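If I remember the flags correctly: -m turns on mirroring (recursion plus
timestamping, which is why repeat runs only fetch new or changed pages),
-k rewrites the links in the downloaded pages so they work locally,
-L restricts the crawl to relative links, and -np stops wget from
climbing above the starting directory.  Something like this restricted
to the one host:

  $ wget -m -k -L -np http://www.example.com/
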
For interest's sake you might also want to look at HTTrack.
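Going from memory (so check the man page), a basic mirror with HTTrack
looks something like this, with -O giving the output directory and the
"+" pattern keeping the crawl on the one domain:

  $ httrack "http://www.example.com/" -O /home/user/mirror "+www.example.com/*" -v
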

Regards,

Daniel
-- 
And that's my crabbing done for the day.  Got it out of the way early, 
now I have the rest of the afternoon to sniff fragrant tea-roses or 
strangle cute bunnies or something.   -- Michael Devore
GnuPG Key Fingerprint 86 F5 81 A5 D4 2E 1F 1C      http://gnupg.org
The C Shore: http://www.wightman.ca/~cshore

Attachment: signature.asc
Description: PGP signature

