
packaging question: what to do about this file...



heya mentors,

i'm packaging sugarplum, which is basically an email-harvester
honeypot.  to avoid trapping legitimate web spiders, i thought it'd
be good to install a robots.txt[1] in /var/www by default if
possible, only i'm not sure i can (or ought to) really do that.

if i made it a conffile, it would be ok if there were already a file
there (it would go through the whole diff/yes/no conffile handler),
but if the package were purged the file would be removed regardless
of whether it was already there and being used for other stuff
(which it might be).  i'd feel bad about just deleting it like that.
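
(for what it's worth, the conffile route itself is mechanically
trivial; assuming the file actually ships in the package, it's just
a matter of listing it, e.g.:

    # debian/conffiles
    /var/www/robots.txt

so the problem is the purge behaviour, not the mechanics.)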

i could just do nothing, but then i fear not enough people would
read /usr/share/doc, and we'd end up with a bunch of debian users
making life hell for the legitimate web crawlers out there.

the only other thing i can think of: if there's already a robots.txt,
don't touch it, and otherwise install the copy shipped under
/usr/share.  then, when the package is purged, check whether the file
still matches what the package installed; if so, delete it, and
otherwise leave it (see the sketch below).  this still seems like a
hack destined to leave cruft behind, though.
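
to make that concrete, here's roughly what i had in mind (untested
sketch; the /usr/share/sugarplum path and the checksum-stash location
are just guesses at what the package would use).  one wrinkle: the
shipped reference copy is already gone by the time postrm runs at
purge, so postinst has to stash a checksum somewhere that survives
until then:

    # postinst (sketch): install our copy only if nothing is there
    # yet, and record a checksum so purge can recognise our file later
    #!/bin/sh
    set -e
    if [ "$1" = configure ] && [ ! -e /var/www/robots.txt ]; then
        cp /usr/share/sugarplum/robots.txt /var/www/robots.txt
        mkdir -p /var/lib/sugarplum
        md5sum < /var/www/robots.txt > /var/lib/sugarplum/robots.txt.md5
    fi

    # postrm (sketch): on purge, delete the file only if it still
    # matches the checksum recorded at install time, then clean up
    #!/bin/sh
    set -e
    if [ "$1" = purge ]; then
        if [ -e /var/www/robots.txt ] &&
           [ -e /var/lib/sugarplum/robots.txt.md5 ] &&
           [ "$(md5sum < /var/www/robots.txt)" = \
             "$(cat /var/lib/sugarplum/robots.txt.md5)" ]; then
            rm -f /var/www/robots.txt
        fi
        rm -rf /var/lib/sugarplum
    fi

(note this still reinstalls the file on upgrade if the admin
deliberately deleted it, which is part of why it feels like a hack.)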

any suggestions or thoughts?  they'd be much appreciated.


	sean

[1] robots.txt, for those unfamiliar with the
    never-quite-an-rfc standard, can basically say things like
    "spiders, don't go here"; spammers frequently ignore it, but
    google won't.
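
    e.g. a minimal one that just fences off the trap (the directory
    name here is made up):

        User-agent: *
        Disallow: /sugarplum/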
