
Re: website saver for linux?

On Sun, Aug 31, 2008 at 12:02 PM, Philip <subs@christiantena.net> wrote:
> I'm looking for a tool which spiders a site and downloads every page
> in the domain that it finds linked from a particular URL (and from
> linked URLs in the domain), creating a local site that can be
> manipulated offline as static HTML.
> Is there such a tool for Linux (better still, Debian)?

It sounds like you want wget:

$ wget --mirror -k <base URL>

--mirror turns on recursive retrieval with infinite depth plus
timestamping, and -k (--convert-links) rewrites the links in the saved
pages so they work when browsed locally.
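If you also want the images and stylesheets each page needs, and want
saved pages to get an .html suffix so they open cleanly offline,
something like this should do it (the flags are standard wget options;
www.example.org stands in for the real site):

$ wget --mirror --convert-links --page-requisites \
       --adjust-extension --no-parent \
       http://www.example.org/

--page-requisites (-p) fetches the inline images, CSS, and scripts each
page references, --adjust-extension (-E, called --html-extension in
older wget releases) appends .html to saved pages that need it, and
--no-parent (-np) keeps the crawl from climbing above the starting URL.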

Michael A. Marsh
