
graph theory opportunities in the Debian release process



   Hello,

   I use Debian, and I wondered why everything here is so out of date.
Then I wondered why it takes so long for packages to migrate, even into the testing release.

So I read about the process that moves packages from unstable to testing and then into the stable release. I see a major cause of the slowdown in unsatisfied cross-package and core-package dependencies. Often a few core packages (glibc, libc6 etc.) block many others, and some dependency problems seem unbreakable.
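To give a concrete picture, here is a tiny Python sketch with made-up package names. It only shows how inverting the dependency graph and counting reverse dependencies singles out the core packages that block the most others; it is not how the real migration scripts work.

  # A minimal sketch (toy data, not the real Debian archive) of how counting
  # reverse dependencies exposes the core packages that hold everything back.
  from collections import defaultdict

  # Hypothetical excerpt of a dependency graph: package -> packages it depends on.
  depends = {
      "app-foo":  ["libc6", "libssl"],
      "app-bar":  ["libc6", "libxml2"],
      "app-baz":  ["libc6"],
      "libssl":   ["libc6"],
      "libxml2":  ["libc6"],
  }

  # Invert the graph: package -> packages that depend on it.
  rdepends = defaultdict(set)
  for pkg, deps in depends.items():
      for dep in deps:
          rdepends[dep].add(pkg)

  # A package stuck in unstable blocks every one of its reverse dependencies.
  for pkg, rdeps in sorted(rdepends.items(), key=lambda kv: -len(kv[1])):
      print(f"{pkg} blocks {len(rdeps)} package(s): {sorted(rdeps)}")

On this toy graph, libc6 blocks five packages while the other libraries block one each, which is exactly the pattern I mean.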

I feel that today's script-based mechanism is too simple at its roots and cannot handle complicated dependency problems; some problems may even be unsolvable for it. The problems persist, major releases are rare, and they contain very old software even at the time they appear, not to mention a year later, when there is still no next release in sight and everything in the system is very old. And the stable release is the only one I can install without fearing for basic functionality.

I'm sorry I'm not a math genius or a programmer who could help, but I can suggest something:

an engine based on graph theory should be built that searches for the best order in which packages should migrate, to reduce the locks caused by unbreakable cross-dependencies. It should find the core problem packages that should be forced forward to let many others pass, and find the shortest path to accomplish this. The engine should take into account some value of importance for each package, its obsolescence (how old the package in the last stable distribution is), several versions of each package (with their different dependencies), even future versions, and mix it all together. The output would be the set of packages that can migrate right now, plus the packages (namely libraries) that should be forced through to break the dependency problems. The goal is to minimise the number of dependency problems.
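For illustration only, here is a rough Python sketch of the kind of engine I mean. All package names, importance numbers and the scoring formula are invented, and the real testing-migration rules (versions, architectures, conflicts, ...) are far more involved; it only shows the idea of computing which packages can migrate together and which blocker would unlock the most value if forced forward.

  # A rough sketch of the suggested engine, under heavy simplifying assumptions:
  # the package names, importance values and scoring formula are made up.

  candidates = {
      # libc6 itself has an unsatisfiable dependency and is therefore stuck,
      # which in turn blocks everything that needs the new libc6.
      "libc6":   {"importance": 10, "age_days": 400, "depends": ["libgcc-new"]},
      "libssl":  {"importance": 8,  "age_days": 300, "depends": ["libc6"]},
      "app-foo": {"importance": 3,  "age_days": 200, "depends": ["libc6", "libssl"]},
      "app-bar": {"importance": 2,  "age_days": 150, "depends": ["libc6"]},
      "app-qux": {"importance": 1,  "age_days": 100, "depends": ["libxml2"]},
  }
  in_testing = {"libxml2"}          # dependencies already satisfied in testing

  def migratable(forced=frozenset()):
      """Packages whose dependencies are satisfied by testing, by the forced
      set, or by other packages migrating in the same batch (a fixed point)."""
      ok = set(candidates) - forced
      changed = True
      while changed:
          changed = False
          for pkg in list(ok):
              deps = candidates[pkg]["depends"]
              if not all(d in in_testing or d in forced or d in ok for d in deps):
                  ok.discard(pkg)
                  changed = True
      return ok

  def unlock_score(pkg):
      """Mix of importance and obsolescence unlocked by forcing `pkg` in."""
      freed = migratable(forced=frozenset({pkg})) | {pkg}
      return sum(candidates[p]["importance"] + candidates[p]["age_days"] / 100
                 for p in freed)

  # Packages that can migrate right now without forcing anything:
  print("migrate now:", sorted(migratable()))

  # Rank the stuck packages by how much forcing each one would unlock:
  stuck = set(candidates) - migratable()
  for pkg in sorted(stuck, key=unlock_score, reverse=True):
      print(f"force {pkg}: unlock score {unlock_score(pkg):.1f}")

On this toy input it reports that only app-qux can migrate right now, and that forcing libc6 forward would unlock by far the most value, which is the kind of answer I would like such an engine to give.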

         What do you think?

                  Sincerely, Peter


