
Re: website saver for linux?



On 08/31/08 11:02, Philip wrote:
> I'm looking for a tool that spiders a site, downloading every page in
> the domain that it finds linked from a starting URL (and from URLs
> linked in turn within the domain), and creating a local copy that can
> be browsed offline as static HTML.
>
> Is there such a tool for Linux (better still, Debian)?

$ wget -m -k -L -np http://www.example.com
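
For reference, what those switches do:

  -m    mirror: recursive download with infinite depth, plus timestamping (-N)
  -k    convert links in the downloaded pages so they work locally
  -L    follow relative links only
  -np   never ascend to the parent directory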

I run this every week on a certain site whose contents I want to archive. The first time you run it, the whole site gets mirrored. On each subsequent run, only new and modified pages are fetched, since -m turns on timestamping (-N) and wget skips files that haven't changed on the server.
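
If you want it to run unattended, a minimal wrapper script along these lines should do (the URL and directory below are just examples, not from Philip's mail; adjust to taste):

  #!/bin/sh
  # Weekly mirror job. Paths and URL are placeholders.
  ARCHIVE_DIR="$HOME/mirrors"        # where the local copy lives
  URL="http://www.example.com"       # site to mirror

  mkdir -p "$ARCHIVE_DIR"
  cd "$ARCHIVE_DIR" || exit 1

  # -m mirror, -k convert links, -L relative links only, -np no parent
  wget -m -k -L -np "$URL" >> mirror.log 2>&1

On Debian you can make it executable and drop it (with no file extension in the name) into /etc/cron.weekly, or point a crontab entry at it.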

--
Ron Johnson, Jr.
Jefferson LA  USA

"Do not bite at the bait of pleasure till you know there is no
hook beneath it."  -- Thomas Jefferson
