
Re: website saver for linux?



On Sun, Aug 31, 2008 at 05:02:01PM +0100, Philip wrote:
> I'm looking for a tool which spiders a site, and downloads every page in
> the domain that it finds linked from a particular url and linked urls
> in the domain, creating a local site that can be manipulated offline as
> static html.
> 
> Is there such a tool for linux (better still debian)?
> 

wget should do the trick
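For the use case described above (mirror one domain starting from a given URL, and end up with browsable static HTML), something along these lines is a reasonable starting point; the exact flags you want may vary, and example.org is of course a placeholder:

```shell
# Mirror the site, rewrite links for offline browsing, and fetch
# the images/CSS/JS each page needs. Stays within the given domain.
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --no-parent \
     --domains example.org \
     http://example.org/
```

`--mirror` turns on recursion with unlimited depth and timestamping; `--convert-links` rewrites links in the downloaded pages so they work locally; `--adjust-extension` saves pages with a `.html` suffix where needed. You may also want `--wait` to be polite to the server.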

Philippe
