
Re: Download a whole gopherhole using wget/curl?



I’m in the same boat as well; I also run several gopherholes over a residential Internet connection.

I would also find it a bit rude.

Kind Regards

James

On Thu, 28 Nov 2019 at 15:49, Mr. Leveck <leveck@leveck.us> wrote:
On 11/27/2019 23:35, kiwidevelops wrote:
> Hi everyone,
>
>
>
> I want to archive as many gopherholes as I can, just in case any of them one day shuts down or its server stops running, and I would like to know how I can download a gopherhole recursively. I know you can download with curl by using this command:
>
>
>
> curl -O gopher://gopher.hole/1/gophermap
>
>
>
> But this only downloads the gophermap, and if you run it without the "1/gophermap" part it throws this error:
>
>
>
> curl: Remote file name has no length!
>
> curl: try 'curl --help' for more information
>
> curl: (23) Failed writing received data to disk/application
>
>
>
> Does anyone know how to properly back up a whole gopherhole? Thank you!
>
>
>
> -Kiwi

Doing this may come off as rude to multiple site owners.
For example, RPoD (my gopherhole) is many gigabytes in size
and is run on a residential internet connection. Slamming
people's servers, even with the best of intentions, may
not be the best use of your time.

--
Nathaniel Leveck
gopher://1436.ninja
http://leveck.us
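As an aside, the recursion the original question asks about can be scripted: curl speaks gopher:// but has no recursive mode, so the usual approach is to fetch a menu, parse its RFC 1436 lines yourself, save type-0 items, and descend into type-1 items. A minimal sketch in Python follows; the host gopher.example and the helper names are placeholders for illustration, not anything from this thread.

```python
# A minimal sketch of recursively mirroring a gopherhole, assuming
# RFC 1436 gophermaps and curl for each individual fetch.
# "gopher.example" is a placeholder host, not a real server.

def parse_menu(menu_text):
    """Yield (item_type, selector, host, port) for each gophermap line.

    Menu lines look like: <type><display>\t<selector>\t<host>\t<port>
    A lone "." terminates the menu.
    """
    for line in menu_text.splitlines():
        if not line or line == ".":
            continue
        itype, rest = line[0], line[1:]
        fields = rest.split("\t")
        if len(fields) >= 3:
            selector, host = fields[1], fields[2]
            port = fields[3] if len(fields) > 3 else "70"
            yield itype, selector, host, port

def curl_command(itype, selector, host, port):
    """Build the curl invocation for one menu item."""
    return ["curl", "-s", f"gopher://{host}:{port}/{itype}{selector}"]

# Example gophermap: one submenu (type 1) and one text file (type 0).
sample = (
    "1Phlog\t/phlog\tgopher.example\t70\r\n"
    "0About\t/about.txt\tgopher.example\t70\r\n"
    "."
)
items = list(parse_menu(sample))
commands = [curl_command(*item) for item in items]
# A real crawler would run each type-0 command to save the file and
# recurse into each type-1 selector -- ideally with a delay between
# requests, out of courtesy to operators on residential links.
```

Per the concerns above, any real crawl should throttle itself (sleep between requests) and skip holes whose owners object.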

--

James Mills / prologic

E: prologic@shortcircuit.net.au
W: prologic.shortcircuit.net.au
