
Re: [gopher] gopher sessions for CGI's



We can manage this by using a "robots.txt" file to disallow the "session" URLs.

An even better solution is to provide an internal index that skips these URLs entirely.
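The robots.txt idea above could look something like this — a minimal sketch, assuming the server groups all session-carrying selectors under a /session/ prefix (the path layout is an assumption, not something specified in this thread):

```
# robots.txt served at the server root
# Well-behaved crawlers will skip session selectors, so they
# never get (and never republish) a session ID of their own.
User-agent: *
Disallow: /session/
```

This only helps against crawlers that honor robots.txt; it does not stop a human from pasting a session-bearing link into a search engine or another gopher page.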

2012/5/30 Kim Holviala <kim@holviala.com>
On May 25, 2012, at 16:52 , Chris Yealy wrote:

> After a little thought revisions of the gopher protocol, I thought of a
> method which would allowing gopher CGI's (or servers) to keep sessions for
> each client which connects, so I wrote a phlog post about it. You can find
> it at:
>
> gopher://sdf.org/1/users/octotep/phlog/05-24-12

How do you prevent search engines from "stealing" a session? What I mean is: when a search engine crawls your site, it gets, for example, session number "123". Every person who then comes to your site through that search engine also gets the same session ID, pretty much breaking the whole thing....

I've thought about using the URI for parameters, but it just doesn't work, and it looks ugly...

Gophernicus tracks users by IP address - given how few gopher users there are left, it actually works almost perfectly.
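The IP-keyed tracking described above could be sketched roughly as follows — a minimal illustration, not Gophernicus's actual implementation; the class name, timeout value, and dict-based store are all assumptions:

```python
import time

# Hypothetical sketch of per-IP session tracking. State is keyed on
# the client address, so no session ID ever appears in a selector
# and there is nothing for a crawler to leak.
SESSION_TIMEOUT = 600  # seconds of inactivity before a session expires (assumed value)

class SessionStore:
    def __init__(self, timeout=SESSION_TIMEOUT):
        self.timeout = timeout
        self.sessions = {}  # client IP -> (last_seen timestamp, state dict)

    def get(self, client_ip, now=None):
        """Return the session state for this IP, starting a fresh one
        if none exists or the old one has expired."""
        now = time.time() if now is None else now
        last_seen, state = self.sessions.get(client_ip, (None, None))
        if last_seen is None or now - last_seen > self.timeout:
            state = {}  # new client, or old session timed out
        self.sessions[client_ip] = (now, state)
        return state

store = SessionStore()
s1 = store.get("192.0.2.1", now=0)
s1["page"] = 3
s2 = store.get("192.0.2.1", now=100)   # same IP, within the timeout: same state
s3 = store.get("192.0.2.1", now=1000)  # after the timeout: fresh state
```

The obvious caveat is the same one that makes it work "almost" perfectly: clients behind a shared NAT or proxy all collapse into one session.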


- Kim




_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/gopher-project



--
Damien CAROL
gopher://dams.zapto.org/1/
