
Re: Download a whole gopherhole using wget/curl?



On 11/27/2019 23:35, kiwidevelops wrote:
> Hi everyone,
> 
> I want to archive as many gopherholes as I can, just in case any of them
> one day shut down or their servers stop running, and I would like to know
> how I can download a gopherhole recursively. I know you can download with
> curl by using this command:
> 
> curl -O gopher://gopher.hole/1/gophermap
> 
> But this only downloads the gophermap, and if you run it without the
> "1/gophermap" it throws this error:
> 
> curl: Remote file name has no length!
> curl: try 'curl --help' for more information
> curl: (23) Failed writing received data to disk/application
> 
> Does anyone know how to properly back up a whole gopherhole? Thank you!
> 
> -Kiwi

Doing this may come off as rude to multiple site owners.
For example, RPoD (my gopherhole) is many gigabytes in size
and is run on a residential internet connection. Slamming
people's servers, even with the best of intentions, may
not be the best use of your time.
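That said, if an owner has given you the go-ahead, note that neither curl
nor wget can crawl gopher menus recursively on its own, so you would have
to script it. A minimal Python sketch of the idea (the helper names here
are my own, and it deliberately rate-limits and stays on one host; treat
it as a starting point, not a finished mirroring tool):

```python
import socket
import time

def gopher_fetch(host, selector, port=70, timeout=10):
    """Send one selector to a gopher server and return the raw response."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(selector.encode("utf-8") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
        return b"".join(chunks)

def parse_menu(text):
    """Parse gophermap lines into (type, display, selector, host, port)."""
    items = []
    for line in text.splitlines():
        if not line or line == ".":
            continue  # "." terminates a menu; skip blank lines
        itype, rest = line[0], line[1:]
        fields = rest.split("\t")
        if len(fields) >= 4:
            items.append((itype, fields[0], fields[1], fields[2], fields[3]))
    return items

def mirror(host, selector="", seen=None, delay=1.0):
    """Recursively walk menus (type 1) on a single host, politely."""
    seen = set() if seen is None else seen
    if selector in seen:
        return
    seen.add(selector)
    time.sleep(delay)  # be gentle: at most one request per second
    menu = gopher_fetch(host, selector).decode("utf-8", errors="replace")
    for itype, _display, sel, item_host, _port in parse_menu(menu):
        if item_host != host:
            continue  # do not wander off to other people's servers
        if itype == "1":
            mirror(host, sel, seen, delay)
        elif itype == "0":
            time.sleep(delay)
            data = gopher_fetch(host, sel)
            # write `data` to a local file path derived from `sel` here
```

The `seen` set guards against menu loops, and the delay is the bare
minimum courtesy; for a hole like mine you would still be pulling
gigabytes, so ask first.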

-- 
Nathaniel Leveck
gopher://1436.ninja
http://leveck.us

