
Re: Download a whole gopherhole using wget/curl?



On Thu, 28 Nov 2019 05:31:34 -0500,
Sean Conner <sean@conman.org> wrote:

> It was thus said that the Great kiwidevelops once stated:
> > Hi everyone,
> > 
> > I want to archive as many gopherholes as I can, just in case any of
> > them one day shut down or their server stops running and would like
> > to know how I can download a gopherhole recursively.   
> 
>   And as some others have pointed out, some gopherholes are rather
> large in size.
> 
> > Does anyone know how to properly back up a whole gopherhole? Thank
> > you!  
> 
>   Ask the site owner politely for a copy of the content?
> 
That's the approach I would personally pursue first. I'm aware that it's
not always easy to get in touch with the site owner these days: quite
often the gopherhole gives no indication of how to reach them, and even
for gopherholes run on their own domain name, whois data is nowadays
often of little help either.

On the other hand, I personally wouldn't mind being mirrored, provided
the load is kept low, similar to, say, the way Cameron handles
Veronica-2 indexing. Granted, it might take quite a while to fetch the
whole content, and as has been pointed out already, it might not even
work for dynamically generated pages. In my case, traffic shaping would
keep my internet uplink usable and should even keep the server
accessible to the rest of the world, though it probably wouldn't
withstand a real DDoS attack. I've never actually had a situation that
put the setup to the test, though.

But as said, if there is a chance of getting in touch with the site
owner, that is probably the easiest choice. I doubt any of the guys
here would refuse to burn a few DVDs with the content if asked nicely.
Keeping the copy up to date later on is even less of a hurdle and could
be handled with something like rsync, if that's what is intended.

If, however, you have no idea how to contact the site owner, my best
bet would be to take it slowly.
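
In case it helps, here is a rough sketch in Python of what "taking it
slowly" could look like: one connection at a time, a fixed pause
between requests, and only text (type 0) and menu (type 1) items on the
same host are followed. The host name, delay and output directory are
placeholders for illustration, not anything from a real gopherhole, and
a proper mirroring tool would of course need to handle more item types
and errors.

#!/usr/bin/env python3
# Rough sketch of a slow gopher mirror.  HOST, DELAY and OUTDIR are
# placeholders; only text (0) and menu (1) items on the same host are
# followed, and there is no real error handling.
import os
import socket
import time

HOST = "gopher.example.org"   # hypothetical gopherhole to mirror
PORT = 70
DELAY = 5                     # seconds between requests, to keep the load low
OUTDIR = "mirror"

def fetch(selector):
    """Send a single gopher request and return the raw response."""
    with socket.create_connection((HOST, PORT), timeout=30) as sock:
        sock.sendall(selector.encode("utf-8") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def save(selector, payload):
    """Store one response under OUTDIR; selectors are flattened to file names."""
    name = selector.strip("/").replace("/", "_") or "index"
    os.makedirs(OUTDIR, exist_ok=True)
    with open(os.path.join(OUTDIR, name), "wb") as f:
        f.write(payload)

def crawl(selector="", seen=None):
    """Fetch one selector, then follow menu entries, one request at a time."""
    seen = set() if seen is None else seen
    if selector in seen:
        return
    seen.add(selector)
    time.sleep(DELAY)         # the whole point: take it slowly
    payload = fetch(selector)
    save(selector, payload)
    # Treat the response as a menu; plain text files usually contain no
    # tab-separated lines and are simply skipped here.
    for line in payload.decode("utf-8", errors="replace").splitlines():
        if line in (".", "") or "\t" not in line:
            continue
        itemtype, fields = line[0], line[1:].split("\t")
        if len(fields) < 4 or fields[2] != HOST:   # stay on the same host
            continue
        if itemtype in ("0", "1"):                 # text files and submenus
            crawl(fields[1], seen)

if __name__ == "__main__":
    crawl()

Something like curl with gopher:// URLs would also do for single items,
but it doesn't recurse, which is why a small script along these lines
seems the simpler route for a whole hole.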

Best regards,
Florian
