
Re: website saver for linux?



Ron Johnson wrote:
On 08/31/08 11:02, Philip wrote:
I'm looking for a tool that spiders a site and downloads every page in
the domain that it finds linked from a particular URL (and from URLs
linked within the domain), creating a local copy of the site that can
be browsed offline as static HTML.

Is there such a tool for Linux (better still, Debian)?

$ wget -m -k -L -np http://www.example.com
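(For reference: -m turns on mirroring, i.e. recursion plus timestamping; -k converts the links so the local copy is browsable offline; -L follows relative links only; -np stops wget from ascending into the parent directory.)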

I run this every week on a site whose contents I want to archive. The first time you run it, the whole site gets mirrored; on each subsequent run, only new and modified pages are fetched.
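If you want to automate the weekly run, a crontab entry along these lines should do it; the schedule, target directory and log path below are only examples:

0 3 * * 1 wget -m -k -L -np -P /home/user/mirror http://www.example.com >> /home/user/mirror/wget.log 2>&1

(-P just sets the directory the mirror is written into.)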


wget also has a --spider option that just checks pages without downloading them, which is handy for seeing whether anything has changed. wget should have the complete solution you are looking for, I guess!
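For example, something like this (the URL is only a placeholder) crawls the site and reports what it finds without saving anything to disk:

$ wget --spider -r -np http://www.example.com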

cheers!
Sabarish

