
Re: Dependence on specific versions



On Feb 10, 2011, at 10:40 PM, Jesús M. Navarro wrote:

> Hi, Stefane:
> 
> On Thursday 10 February 2011 21:29:34 Stefane Fermigier wrote:
>> On Feb 10, 2011, at 7:50 PM, Russ Allbery wrote:
>>> Stefane Fermigier <sf@nuxeo.com> writes:
>>>> Only by fixing version numbers of third-party libraries can you be sure
>>>> that the same build that works today will still work next week, if you
>>>> redo the build on the exact same version of the sources (and Maven, and
>>>> Java, of course), on any operating system.
>>>> 
>>>> Yes, we do upgrade third-party lib versions from time to time, but only
>>>> when there is a good reason to ("if it ain't broke, don't fix it").
>>>> 
>>>> BTW: I used to think like you 3-4 years ago when I discovered Maven, but
>>>> reality forced me to change my mind.
>>> 
>>> For those of us who have been doing this sort of thing for a while, this
>>> argument sounds very familiar.  I've heard this argument applied to C
>>> libraries, Perl modules, Python modules, and most recently Ruby modules.
>>> It always sounds persuasive when presented from a stability perspective.
>>> It has always proven to be completely wrong in the long run.
>> 
>> Please elaborate, unless you want me to believe you only based on your
>> reputation, which I won't since I don't know you.
> 
> I'll give my opinion here: both Russ and you are right.
> 
> Yes, you are right: in order to distribute a product you must be able to 
> reproduce it unambiguously from your sources.  This implies all your 
> dependencies must be hardcoded.
> 
> But in the case of Java (and Ruby, and Python), doing this during development 
> is nothing but a dirty hack to cover up the malpractices of the developers.

Which developers, upstream or downstream?

If I have to work with upstream libraries that ignore or break the rules you state below (which are, indeed, good software engineering rules in theory), what am I to do?

My (and my team's) experience has taught us that it's better to assume the worst.
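
To make the point about fixing versions concrete, here is a minimal sketch of what a pinned dependency looks like in a Maven POM (the group and artifact names are hypothetical, just for illustration):

  <dependencyManagement>
    <dependencies>
      <!-- Pin the third-party library to one exact, known-good version. -->
      <dependency>
        <groupId>org.example</groupId>      <!-- hypothetical group -->
        <artifactId>foo</artifactId>        <!-- hypothetical artifact -->
        <version>1.2.3</version>            <!-- exact version, no range -->
      </dependency>
    </dependencies>
  </dependencyManagement>

Every module then resolves exactly foo 1.2.3, and the build you redo next week pulls the same artifact as the one you did today.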

One of my friends, who is the maintainer of the distribute2 Python project and knows a thing or two about dependency management, recently sent me an article about a Python library whose versioning scheme consists of naming each version after one of the author's cats.

(This is, BTW, not much more stupid than the versioning scheme chosen by Debian, where you can't figure out whether a "squeeze" is a bigger or smaller version number than a "sid" or whatever.)

Even with more classical versioning, you still have to work around date-based releases, alpha-beta-gamma-rc releases, pre-releases, etc.

All of this can confuse Maven's version comparison function and give you a very wrong version if nobody is paying attention.
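
For instance, a floating requirement like the following (a sketch, with a hypothetical artifact) leaves the choice to Maven's ordering rules, so an alpha, an rc or a date-stamped qualifier published upstream can end up being picked as the "newest" acceptable version:

  <dependency>
    <groupId>org.example</groupId>          <!-- hypothetical group -->
    <artifactId>bar</artifactId>            <!-- hypothetical artifact -->
    <!-- Anything from 1.2 (inclusive) up to 2.0 (exclusive) is accepted;
         which version you actually get depends on what the repository
         contains and on how Maven happens to order qualifiers such as
         -rc1, -beta or a date stamp. -->
    <version>[1.2,2.0)</version>
  </dependency>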

> Your development environment should rely at most on minor versions of its 
> dependencies and let the extraversion float as the upstream developer sees fit, 
> because said upstream developer will have the acumen not to break backwards 
> compatibility between extraversions and, in fact, will use extraversions only 
> for bug fixing (and those you definitely want as soon as they are published).  
> Your developers, on the other hand, will need to defend their position if 
> your app happens to be using two different versions of the same 
> component and, no, saying "that's what X-tool pushed into my environment", 
> or "that's the uber-bleeding-edge version from the upstream developer's 
> nightly builds" is not a proper answer.
> 
> So, if your app depends on, say, foo_1.2.3, your dependency checking will look 
> for foo_1.2 or even "just" for foo_1.  This will allow your continuous 
> building environment to fail early if it happens that foo_1.2.(x+1) or foo_1.
> (x+1) are not backwards compatible, so you can analyze why and act 
> accordingly.  Of course, once the shiny new version of your app is ready, you 
> will freeze your build environment for that version to whatever the exact 
> versions of the dependent libs happen to be at that moment.

Actually, at least one thing is wrong in this model (which I used to think was the right one): my continuous integration system (Jenkins, BTW) is not alerted when a shiny new (but buggy) version of library so-and-so appears on the central Maven repository. So no build will be triggered, and the breakage won't be noticed.

But the next time I commit a change to my code base, the build will break, and I will do the logical thing, which is to assume that *I* broke the build, and spend an inordinate amount of time until I realize that the error doesn't come from my commit, but from an event that doesn't appear at all in the commit timeline.
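
One way to keep that kind of outside event from changing the build behind your back (a sketch, not necessarily what we do at Nuxeo, assuming the standard maven-enforcer-plugin and its requireReleaseDeps rule) is to make the build fail fast whenever a floating SNAPSHOT dependency sneaks in:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <executions>
      <execution>
        <id>no-snapshots</id>
        <goals>
          <goal>enforce</goal>
        </goals>
        <configuration>
          <rules>
            <!-- Fail the build if any dependency is a SNAPSHOT, so the only
                 way a dependency can change is an explicit POM commit. -->
            <requireReleaseDeps>
              <message>No SNAPSHOT dependencies allowed</message>
            </requireReleaseDeps>
          </rules>
        </configuration>
      </execution>
    </executions>
  </plugin>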

> Oh! and another hint: "if it ain't broke, don't fix it" is not such valuable 
> advice with regard to open source, if only because of the wider testing 
> base and the work it takes to jump over more than a few releases 
> (exponential, not linear, in the distance).

I didn't mean that you shouldn't, from time to time and in a very controlled manner, upgrade your dependencies to keep up with upstream developments. Just that you shouldn't leave it to chance.
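
When that time comes, one way to keep the upgrade deliberate rather than accidental (a sketch, assuming the Codehaus versions-maven-plugin) is to ask Maven for a report of available updates and then bump the pinned versions explicitly in the POM:

  <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>versions-maven-plugin</artifactId>
    <!-- "mvn versions:display-dependency-updates" lists newer releases of
         the pinned dependencies; you then decide, case by case, whether
         there is a good reason to upgrade and change the versions by hand. -->
  </plugin>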

  S.

-- 
Stefane Fermigier, Founder and Chairman, Nuxeo
Open Source, Java EE based, Enterprise Content Management (ECM)
http://www.nuxeo.com/ - +33 1 40 33 79 87 - http://twitter.com/sfermigier
Join the Nuxeo Group on LinkedIn: http://linkedin.com/groups?gid=43314
New Nuxeo release: http://nuxeo.com/dm54
"There's no such thing as can't. You always have a choice."

