Re: "Avoiding the vendor perl fad diet"
On Tue, 31 Jan 2012 11:39:28 +0100, Damyan Ivanov <email@example.com> wrote:
> -=| Gabor Szabo, 31.01.2012 08:59:04 +0200 |=-
>> So I recently thought about how to solve this.
>> One solution could be to use separate local::lib directories for each
>> application. That would include some duplicate module installations,
>> but it would reduce the risk.
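(For context, the separate-local::lib setup described above amounts to something like the following; the path, app name, and the cpanm invocation are illustrative, not from the original mail:)

```shell
# Hypothetical private library directory for one application:
APP_LIB=/srv/app1/extlib

# Dependencies would be installed into it with something like:
#   cpanm --local-lib "$APP_LIB" --installdeps .

# At runtime, point perl at that application's directory only:
export PERL5LIB="$APP_LIB/lib/perl5"
echo "$PERL5LIB"
```

Each application gets its own tree, so upgrading a module for one app cannot break another; the cost is the duplicate installs Gabor mentions.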
> You seem to have re-invented the Ruby gems hell :) [...]
> Here's my recipe for deploying Perl applications:
> * ensure every dependency is packaged for Debian. If not, package it
>   (this was why I joined the Debian Perl group :)
> * develop with current versions from Debian/unstable
> * deploy the application as a Debian package, with proper dependencies
> Works very nicely even for applications that have only one instance in
> production, and is a killer for multi-instance deployments.
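Damyan's last step, concretely, means the application itself ships as a deb whose Depends line pulls everything in. A minimal sketch (package names and versions invented for illustration):

```
# debian/control excerpt (illustrative)
Package: myapp
Architecture: all
Depends: perl, libmojolicious-perl (>= 2.0), libdbi-perl
Description: Example in-house Perl application
```

apt then resolves and installs the whole dependency tree on every target machine.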
Ehm... not so fast :)
At Opera we have a lot of Perl "mission-critical" stuff, as they say :)
We use Debian stable, so the majority of our servers now run squeeze.
The problem we have is that some squeeze packages are old,
where by old I mean 2-3 years old.
So what we often need to do, for example for Mojolicious, is to
package our own deb, usually with dh-make-perl.
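For the simple cases that is essentially a one-liner (standard dh-make-perl usage; the module name is just an example):

```shell
$ dh-make-perl --build --cpan Mojolicious
```

This fetches the distribution from CPAN, debianises it, and builds the .deb, which can then be uploaded to an internal repository.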
Now, that would be great, except the average CPAN module is not like
Mojolicious, which famously has no non-core dependencies. If you
happen to need a Catalyst plugin, or anything that pulls in Moose,
or MooseX::WhatEver::Insanity, then you're kind of screwed.
If you enter this downward spiral, it's really easy to
stumble on a sub-dependency that will need a newer Module::Build,
or a newer Class::MOP something, that in turn needs a newer
Test::More, and you see where this is going...
You will rapidly need to package a whole lot of stuff.
In the end, the process became so painful (for us, I'm not saying
in general) that we rolled our own full solution, which consists of:
1) DebZilla, a simple HTML form where you can input either
a module name or a tar.gz file. DebZilla will apply some
heuristics and ultimately use dh-make-perl to produce a deb package
and upload it to an (internal) repository.
I think dh-make-perl actually covers a fair amount of this,
but there is some added magic to avoid failures here and there.
More details available on request :)
2) Arepa, available on CPAN (https://metacpan.org/module/Arepa)
Web app to cover the rest of the deb package workflow, with
approval, build queue workers to build for several distros and
platforms, and ultimately syncing to a production repository.
End result is that we have our own internal debian repository
with hundreds of packages, perl and non-perl. We use puppet
to automatically add this internal repository to our machines'
apt sources.
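The puppet part ultimately just drops an apt source entry on every box, along these lines (hostname and suite are made up for illustration):

```
# /etc/apt/sources.list.d/internal.list (illustrative)
deb http://debrepo.internal.example/debian squeeze main
```

After that, our own debs install and upgrade through plain apt, exactly like stock Debian packages.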
Now, some considerations of my own:
- This is *a lot* of work for a company, and it took us
a long time to reach this almost-everything-is-automated point.
This is somewhat compensated by the many different projects we run.
- Wouldn't it be better if many different (large?) companies shared
this amount of pain, creating something together?
- Aren't we substantially altering Debian's stability with all
this repackaging and upgrading of deps and sub-deps and sub-sub-deps?
Does anyone have a better idea?
Switching to unstable, maybe?