
Re: Packaging dependencies for mailman3-hyperkitty



On Friday, 25 March 2016 at 13:02:55 +0800, Paul Wise wrote:
> On Thu, Mar 24, 2016 at 11:43 PM, Pierre-Elliott Bécue wrote:
> 
> > Packaging dependencies for mailman3-hyperkitty
> 
> Does HyperKitty depend on mailman3 or just enhance it by providing an
> archive web interface? If the latter, I would suggest calling it
> hyperkitty instead of mailman3-hyperkitty.
> 
> > robot-detection suffers the same illness, but it's tiny; it's possible to
> > integrate it into hyperkitty, or to make it optional.
> 
> Embedded code copies are against Debian policy, please package it
> separately or get upstream to switch to something else.
> 
> https://wiki.debian.org/EmbeddedCodeCopies
> 
> Something like that sounds like it isn't possible to keep usefully
> up-to-date in Debian stable though, since the landscape of robots on
> the web will be changing continually and many will be aiming to
> emulate browsers.
> 
> https://pypi.python.org/pypi/robot-detection
> 
> In addition, it seems to be woefully inadequate for that since the API
> doesn't appear to take into account IP address ranges.
> 
> It also depends on the robotstxt.org database, which would need to be
> packaged separately and is also no longer kept up to date at this
> time:
> 
> http://www.robotstxt.org/db.html
> 
> "This robots database is currently undergoing re-engineering. Due to
> popular demand we have restored the existing data, but
> addition/modification are disabled."
> 
> As the page says, there is a better database of user-agents available
> 
> http://www.botsvsbrowsers.com/
> http://www.botsvsbrowsers.com/category/1/index.html
> 
> Unfortunately this is incompatible with the data format used by
> robotstxt.org/robot-detection:
> 
> http://www.robotstxt.org/db/all.txt
> 
> As you can see from the botsvsbrowsers.com data, the User-Agent field
> is often bogus or contains vulnerability attack patterns; it is thus
> mostly useless and should probably just be ignored by all web apps at
> this point.
> 
> So I would suggest convincing upstream to remove whatever use of
> robot-detection is present in mailman3 or hyperkitty.

That's in progress; the only purpose of this detection is to disable the
dynamic JavaScript loading of threads for robots. We're thinking about
alternative solutions.
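For context, the fragile part is that this kind of detection boils down to
matching tokens in the User-Agent header. A minimal, hypothetical stand-in
for what robot-detection does (the function name and token list here are
illustrative, not the library's actual API) would look like:

```python
import re

# Hypothetical sketch of a User-Agent-based crawler check, matching a few
# well-known bot tokens. Crawlers are free to send any User-Agent they
# like, which is exactly why this approach cannot be kept reliably
# up to date in a stable release.
_BOT_PATTERN = re.compile(
    r"googlebot|bingbot|yandexbot|baiduspider|crawler|spider",
    re.IGNORECASE,
)

def looks_like_robot(user_agent):
    """Return True if the User-Agent string matches a known crawler token."""
    return bool(user_agent) and bool(_BOT_PATTERN.search(user_agent))
```

HyperKitty would then skip the JavaScript thread loading when such a check
matches; anything that lies about its User-Agent slips through, as Paul
points out.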

> > That leaves me with django-gravatar2, that seems useful, and is still
> > developed. I heard there is some kind of "canonical" way of packaging django
> > apps. As I'm not used to that, I'm here to ask advice.
> 
> I would suggest upstream switch from Gravatar (a centralised
> proprietary service) to Libravatar (a federated Free Software service
> that falls back on Gravatar):
> 
> https://www.libravatar.org/

I understand your point, and I'll think about it, but my goal is to have
upstream drop obsolete dependencies. django-libravatar seems to be the only
project that bundles Libravatar support, and it's unmaintained, whereas
django-gravatar2 is still maintained.
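Switching later should be cheap in any case: both services use the same URL
scheme (an MD5 hex digest of the trimmed, lower-cased email address), so the
only difference is the base URL. A minimal sketch, assuming the
Libravatar CDN endpoint as the alternative base:

```python
import hashlib

GRAVATAR_BASE = "https://www.gravatar.com/avatar/"
# Libravatar serves a Gravatar-compatible endpoint, so swapping the base
# URL is the only change needed (assumption: default CDN hostname).
LIBRAVATAR_BASE = "https://seccdn.libravatar.org/avatar/"

def avatar_url(email, size=80, base=GRAVATAR_BASE):
    """Build an avatar URL: MD5 of the normalized email, plus a size hint."""
    digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
    return "%s%s?s=%d" % (base, digest, size)
```

Since the hash is computed the same way for both, normalization matters:
`avatar_url("USER@example.com ")` and `avatar_url("user@example.com")` must
yield the same URL.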

So, for now, I'd rather get a first mailman3 suite into Debian, and then
think about how to make things better. :)

> Re canonical django packaging, you may be talking about this:
> 
> https://wiki.debian.org/DjangoPackagingDraft
> 
> There are also lots of python-django-* packages in Debian that you
> could look at.

Thanks!

-- 
PEB

