
Re: packaging question: what to do about this file...



Hello

I have a simple solution (or two actually).

1) Document that robots.txt should be copied to the proper place
   in the README.Debian file.
2) Tell the user to do that in a debconf box, or even ask where
   to install it.
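Option 2 might look roughly like this. This is only a sketch: the template name sugarplum/install-robots-txt and the wording are made up, not anything from the actual package.

```shell
# debian/templates (hypothetical):
#
#   Template: sugarplum/install-robots-txt
#   Type: boolean
#   Default: false
#   Description: Install a robots.txt into /var/www?
#    Legitimate web spiders honour robots.txt; installing one keeps
#    them out of the honeypot. Decline if you manage the file yourself.

# debian/config (hypothetical sketch):
#!/bin/sh
set -e
. /usr/share/debconf/confmodule

db_input medium sugarplum/install-robots-txt || true
db_go
```

The postinst would then read the answer back with db_get and act on it.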

I think it is a really bad idea to install it in /var/www. First of
all, that default location is probably only used in very basic
installations, and you also risk overwriting other things.

That is my opinion anyway. I vote for telling the user and not letting
the package do anything (unless you parse the Apache config and figure
out where all the virtual hosts are placed. :) )

Regards,

// Ola

On Fri, Feb 07, 2003 at 01:45:32AM -0500, sean finney wrote:
> heya mentors,
> 
> i'm packaging sugarplum, an email harvester honeypot basically.  in
> order to not trap legitimate web-spiders, i thought it'd be good to
> make the install of a robots.txt[1] in /var/www happen by default if
> possible, only i'm not sure i can/ought to really do that.
> 
> if i made it a conffile, it would be ok if there were already a file
> there (it would go through the whole diff/yes/no conffile handler),
> but if the package were purged the file would be removed regardless
> of whether or not the file was there originally and was used for other
> stuff (which it might be).  i'd feel bad about just deleting it like
> that.
> 
> i could just do nothing, but then i fear not enough people read
> /usr/share/doc and we'd have a bunch of debian users making life
> hell for the legitimate web crawlers out there.
> 
> the only other thing i could think of is: if there already is one,
> don't touch it; otherwise, cat a copy that exists in /usr/share
> onto it.  then when the package is purged, check if the file is
> the same as what is in the package, and if so, delete it, but otherwise
> leave it.  this seems like a hack destined to leave cruft behind though.
> 
> any suggestions or thoughts?  they'd be much appreciated.
> 
> 
> 	sean
> 
> [1] the robots.txt, for those unfamiliar with the
>     never-quite-an-rfc-standard, basically can say things like "spiders,
>     don't go here", which spammers frequently ignore but google won't.
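For what it's worth, the install-if-absent / remove-only-if-unchanged scheme sean describes could be sketched like this. The paths /usr/share/sugarplum/robots.txt and /var/www/robots.txt are assumptions about where the pristine copy and the live file would sit, not anything decided in the package:

```shell
#!/bin/sh
# Sketch of "install if absent, remove on purge only if unchanged".

install_if_absent() {
    # $1 = pristine copy shipped in the package, $2 = live target.
    # Never touch an existing file the admin may be using.
    [ -e "$2" ] || cp "$1" "$2"
}

remove_if_unchanged() {
    # $1 = pristine copy, $2 = live target.
    # Delete on purge only if the file is still byte-identical to
    # what the package installed; a locally edited file is left alone.
    if [ -e "$2" ] && cmp -s "$1" "$2"; then
        rm -f "$2"
    fi
}

# In postinst (configure), something like:
#   install_if_absent /usr/share/sugarplum/robots.txt /var/www/robots.txt
# In postrm (purge):
#   remove_if_unchanged /usr/share/sugarplum/robots.txt /var/www/robots.txt
```

The cmp check is what avoids the "delete it regardless" problem of a conffile, though as sean says, an edited file will be left behind as cruft after purge.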



-- 
 --------------------- Ola Lundqvist ---------------------------
/  opal@debian.org                     Annebergsslingan 37      \
|  opal@lysator.liu.se                 654 65 KARLSTAD          |
|  +46 (0)54-10 14 30                  +46 (0)70-332 1551       |
|  http://www.opal.dhs.org             UIN/icq: 4912500         |
\  gpg/f.p.: 7090 A92B 18FE 7994 0C36  4FE4 18A1 B1CF 0FE5 3DD9 /
 ---------------------------------------------------------------


