
Download a whole gopherhole using wget/curl?



Hi everyone,

I want to archive as many gopherholes as I can, just in case any of them shut down one day or their servers stop running, and I would like to know how I can download a gopherhole recursively. I know you can download a single file with curl using this command:

curl -O gopher://gopher.hole/1/gophermap

But this only downloads the gophermap, and if you run it without the "1/gophermap" part it throws this error:

curl: Remote file name has no length!
curl: try 'curl --help' for more information
curl: (23) Failed writing received data to disk/application

Does anyone know how to properly back up a whole gopherhole? Thank you!
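For anyone curious what a recursive crawl would involve: this is a minimal sketch, not a finished mirroring tool. It assumes the RFC 1436 menu format (each gophermap line is a one-character item type, then display string, selector, host, and port separated by tabs, with the listing ended by a lone "."), it only follows submenus (type 1) and collects text files (type 0) on the same host, and gopher.hole is a made-up example hostname.

```python
import socket


def fetch(host, port, selector, timeout=10):
    """Fetch one gopher item: send the selector plus CRLF, read until EOF."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(selector.encode() + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)


def parse_menu(data):
    """Parse a gophermap into (type, display, selector, host, port) tuples."""
    items = []
    for line in data.decode("utf-8", errors="replace").splitlines():
        if not line or line == ".":  # "." terminates the listing
            continue
        itype, rest = line[0], line[1:]
        parts = rest.split("\t")
        if len(parts) >= 4:
            display, selector, host, port = parts[:4]
            try:
                items.append((itype, display, selector, host, int(port)))
            except ValueError:
                pass  # skip malformed port fields
    return items


def mirror(host, port=70, selector="", seen=None, depth=0, max_depth=5):
    """Recursively walk submenus (type 1) on the same host and collect
    the selectors of text files (type 0).  A real mirror would also
    fetch each file and write it to disk, and handle binary types."""
    if seen is None:
        seen = set()
    if (host, port, selector) in seen or depth > max_depth:
        return []
    seen.add((host, port, selector))
    found = []
    for itype, _display, sel, ihost, iport in parse_menu(fetch(host, port, selector)):
        if ihost != host or iport != port:
            continue  # stay inside this gopherhole
        if itype == "1":
            found += mirror(host, port, sel, seen, depth + 1, max_depth)
        elif itype == "0":
            found.append(sel)
    return found


# Hypothetical usage:
#   mirror("gopher.hole")  # returns selectors of all reachable text files
```

The loop detection (`seen`) and depth cap matter in practice, since gophermaps often link back to their own parent menus.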

-Kiwi
