
FTPMaster meeting minutes



Hello world,

as you probably read on debian-project[1] there was a meeting of the
FTPTeam in Fulda last weekend. Mark, Alexander and myself met from
Friday till Sunday to discuss various topics we had on the agenda -
and to discover multiple new restaurants all around my place. :)
And while I still miss out on Baklava (how can a Turkish restaurant
seriously run out of that?) I thought we shouldn't have you miss
something that smells, tastes and looks like minutes, so here they are.
I'm sorry for the length of it, but flipping between a simple "We
met, we did something, sod off" and this one, we somehow voted for a
slightly longer edition. Have fun. :)

[1] http://lists.debian.org/debian-project/2010/08/msg00314.html


 1. New FTPMaster
    As you may already have read on d-d-a[2], there is a good reason to
    send condolences over to Torsten, as we took his absence as the best
    opportunity to promote him from FTP Assistant to FTPMaster. After all
    he couldn't run away screaming as he wasn't attending.
  
    [2] http://lists.debian.org/debian-devel-announce/2010/09/msg00006.html
  
  
 2. Call for volunteers
    As usual when posting longish mails to d-d-a (yes, I know we are on
    -project) about our nice and beloved FTP Team, a call for volunteers
    is in order. And here it is! Ever felt compelled to do the hard
    groundwork? Ever wanted to help at a nicely central place inside
    Debian? Or just want to write some python code and still look for a
    good place to stick it in?
  
    Here we are. Come join the Nav^Wteam. Just sign over there to the
    right, or even easier, mail us. We won't bite you, that's for
    sure. At least not right away. :)
  
    The criteria are the same as always: you need to be a DD (except if
    you only want to code, though even then it helps to know the usual
    flow of a package) and you need to be able to deal with the
    existing team members.
  
    An occasional flame should also not disturb you: if you work in
    the NEW queue you stand between the people uploading and the
    packages entering the archive, and rejecting something is not
    always liked much. (But you also get positive replies and thanks,
    to keep your spirits up :) ).
  
    And - if you get headaches when reading legal texts - we all
    do. But it is needed, and things like NEW are mainly about
    that. The ftpteam is *the* one place that decides whether
    something is ok for Debian to distribute or not, and you will have
    to take this decision. (Yes, there is more, but this is the main
    point you check.)
  
    Obviously the other points I made in earlier mails, like [3], still
    apply too.
  
    [3] http://lists.debian.org/debian-devel-announce/2010/03/msg00003.html
  
   
 3. BYHAND
    Turns out our BYHAND handling had been broken for a few days. Seems
    like we never got that back alive after Essen.
    Shows how much BYHAND is used today, but with perfect timing we had
    someone place a BYHAND upload in the queue, so Mark took the
    opportunity to fix it up.
  
    We also decided we no longer want to have BYHAND. We want those
    things to be as automated as possible, and as such we invite the
    remaining few users of BYHAND (about 2 packages) to talk to us, so
    we can switch them over to the automated way other packages are
    using (for example d-i). This will help them as well as us, as it
    means their packages go through more quickly and don't have to
    wait for us to process them.
  
  
 4. Volatile archive
    A while ago the volatile team approached us and asked if we can take
    over their archive. We did not exactly take it over, but starting
    with squeeze, the volatile suites will be integrated into the normal
    ftp.debian.org mirrortree. This weekend we enabled squeeze-volatile
    on ftp-master and setup the needed scripts so that the volatile team
    can fill it with packages whenever needed.
    Please note that the general handling of volatile starting with
    squeeze is now different from the way volatile worked in the
    past. All packages now have to pass stable's proposed-updates
    queue before going into volatile. Stay tuned, the volatile team
    will send out more information about its handling later on; the
    exact policy of how the suite is run is with them, not us.
  
  
 5. Security archive
    This is one blocker on the way to a stable squeeze release, as
    this archive is not yet able to process the dpkg v3 formats.
    Having recently upgraded backports to the current dak codebase I
    now know what I have to do with the security archive (same old
    code there), and my current schedule means it should be done real
    soon now. There is one anomaly in the security archive, namely the
    script used to actually release DSAs, which needs lots of work
    (it's BAD, mmmmmmmmmmmmkay?!) to continue working, but otherwise
    it should be the same amount of work as getting backports working.
  
    Additionally we discussed ways we can merge security into the
    normal FTPMaster archive. Obviously this is not a simple step to
    do, because we must ensure that embargoed security issues stay
    embargoed until their normal release, and that requires a
    substantial change in our database layout. But we already have a
    good picture of which way to move and will definitely follow this
    as a long term goal. This will have the advantage that things like
    out of sync orig.tar files won't be able to occur, and migration
    of security releases to stable should be much smoother (it will
    turn out to be a simple suite copy, instead of a move between two
    independent archives).
  
  
 6. data.debian.org
    Right now this is slightly complicated to implement in a halfway sane
    way. The changes we need involve ensuring the location code is fully
    working, probably some database adjustments, and getting rid of some
    assumptions our code (may) make.
  
    This point and the merge idea from point 5 will probably best get
    addressed in a week long coding sprint of our team similar to the
    one we did last year in Essen. A very rough aim for that is
    currently "somewhere springtime 2011", but of course this will be
    after squeeze has released.

  
 7. changelog/metadata export
    We finished a script to export changelogs, README/NEWS.Debian and
    copyright files which are used by packages.debian.org code.
    Up to now pdo was extracting those itself, taking up lots of
    resources doing this, while occasionally going wrong.
    This is now offered for all FTPMaster run archives (security follows
    as soon as it is upgraded).
  
    We will make the export tree available publicly soon, so whoever
    wants to use the data can freely do so. DDs who have a service
    that wants to use them should contact either me or (better)
    mirrors@debian.org to arrange a mirror of the exported data onto
    the Debian machine their service runs on.
  
    Also, always keep in mind that we are happy to export more
    metadata. When you have a service that regularly needs to generate
    some data based on our archive - talk to us. Describe what you do,
    what for, what kind of data you use and how we can help you (what
    we should export). If it is possible for us to do, we will help
    you out. We don't bite; don't waste precious CPU cycles all over
    the place. :) (All the better if you volunteer to write the
    necessary code too, that almost guarantees that we don't
    bite. Python and dak, but talk to us, we can guide you to the
    right places in the code.)
  
  
 8. We changed our sending mail addresses. In the past we had at least
   
       "Debian Archive Maintenance <ftpmaster@ftp-master.debian.org>"
       "Debian FTP Masters <ftpmaster@ftp-master.debian.org>"
       "Archive Administrator <installer@ftp-master.debian.org>"
   
     as possible From headers from mails generated by dak. From now on
     this should be limited to
   
       "Debian FTP Masters <ftpmaster@ftp-master.debian.org>"
   
      so you may want to adjust your mailfilters accordingly.
      We will keep the installer@ address working for a week or two
      more, before we disallow any mail sent to it.

     This is also true for the backports archive (just replace the
     domain). Additionally backports mail handling is split into two
     areas. For technical matters you can reach the FTP Masters at
     ftpmaster@backports.debian.org, while policy and all other
     questions, unless they better fit on one of the mailinglists,
     should be directed to team@backports.debian.org.
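
     For those filtering with procmail, a recipe matching the single
     remaining From address could look like the sketch below; the
     target folder name is only an example, not a recommendation.

```
# Hypothetical procmail recipe: file mail from the remaining dak
# sender address into a maildir folder (folder name is an example).
:0:
* ^From:.*ftpmaster@ftp-master\.debian\.org
debian-ftpmaster/
```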
  

 9. dak rm
    Alexander helped our removal tool gain a new option. From now on
    we can close bugs associated with a package when doing a sourceful
    removal. Obviously this is not enabled by default, but an option
    we have to select whenever appropriate (not all removals mean all
    bugs can be closed, like when it's "just" a source rename), but
    this can greatly help the QA team.


10. bts categorize
    And while he was already at the code, Alex also fixed our bts
    categorize script and ported it to a different bts soap whatever
    python library. So the usertags on ftp.debian.org should look right
    again, finally.


11. debian-ports
    Following a short chat I had with Aurelien Jarno about
    debian-ports.org and its archive, we discussed if FTPMaster could
    offer help with running such an archive.
    In general we aren't unhappy with it, but we also do not want to
    end up the main force behind it. Together with the nature of the
    archive and the attached constraints (different architectures need
    to have the same source version - with different code/patches
    applied, for example), the handling in dak is neither
    straightforward nor entirely there yet. Most probably it will mean
    a set of mini archives, one per d-p architecture, and is as such
    attached directly to points 4 and 5. When the work for those is
    done, it will be relatively easy to provide the support d-p
    needs. Exact conditions for such work still need to be worked out,
    but basically something along the lines that FTPMaster does the
    technical side while someone else is actually responsible for
    it. So 2 or more DDs need to sign up for the work per arch; if
    they drop out and no one replaces them, it gets removed, etc.

    Realistically we will not be able to offer this before late 2011.


12. rmadison / dak ls
    While this initially was (and still is) a tool provided by the QA
    people, we really dislike the current way it is running. Having to
    shell out to run a dak command, then parse and send out its
    output, is really not a good way to do it (even though it works),
    and we want to provide a saner interface.

    As other services already provide SOAP interfaces to talk to
    (think of the BTS), this is the way we will follow too. We intend
    to publish clear API documentation, and think that not only the
    equivalent of "dak ls", but also other querying tools like
    control-suite and override should be accessible.
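
    To illustrate the direction (none of this is the real dak code or
    a real API - the function and the data are invented for the
    example), a structured reply would let clients render
    rmadison-style output themselves instead of parsing preformatted
    text:

```python
# Purely illustrative: render rmadison-style lines from structured
# (version, suite, architectures) tuples, as a client of a query API
# might do. Function name and data are invented for this sketch.

def format_dak_ls(package, rows):
    """Format one output line per (version, suite, arches) tuple."""
    lines = []
    for version, suite, arches in rows:
        lines.append("%s | %s | %s | %s" %
                     (package, version, suite, ", ".join(arches)))
    return "\n".join(lines)

# Toy data standing in for what an API reply might contain.
rows = [
    ("2.4-3", "lenny", ["source", "amd64", "i386"]),
    ("2.6-1", "sid", ["source", "amd64", "i386"]),
]
print(format_dak_ls("hello", rows))
```

    The point of the sketch: with structured data the formatting is a
    client concern, and no fragile text parsing is needed anywhere.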


13. dinstall replacement
    You might know that dinstall is "the thing ftpmaster lets run 4
    times a day getting us new things to upgrade to".
    What you might miss is that this is a nice 1280-line shell
    scriptset, which even includes a pretty complete state machine -
    all in simple bash.
    Unfortunately it misses one feature: a nice way of synchronizing
    some of the steps we do. While I actually know how to extend the
    scripts to do this, my fellow team members nearly threw me out of
    the window when I told them I want to extend the state machine,
    and instead asked me to rewrite it all boringly in python,
    muttering something about readability and sanity.

    While I do think my shell scripting is pretty sane and readable,
    for the sake of going to the next restaurant instead of arguing I
    gave in, and the rewrite is now on the todo list for "as soon as
    remotely possible, best two days ago", so I expect it to be done
    before the end of October.
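
    For readers curious what "a state machine with resumable steps"
    means here, a minimal sketch in python follows. The step names and
    the resume logic are invented for the example; dinstall's actual
    stages are far more numerous.

```python
# Illustrative only: a tiny ordered state machine that can resume
# after a named step, roughly the dinstall pattern. Step names below
# are made up for the example.

class StateMachine:
    def __init__(self):
        self.steps = []   # ordered list of (name, function)
        self.done = []    # names of steps completed in this run

    def add(self, name, func):
        self.steps.append((name, func))

    def run(self, resume_from=None):
        """Run all steps in order; if resume_from is given, skip
        everything up to and including that step."""
        skipping = resume_from is not None
        for name, func in self.steps:
            if skipping:
                if name == resume_from:
                    skipping = False
                continue
            func()
            self.done.append(name)

log = []
sm = StateMachine()
sm.add("accept", lambda: log.append("accept"))
sm.add("generate", lambda: log.append("generate"))
sm.add("mirror", lambda: log.append("mirror"))
sm.run(resume_from="accept")   # pretend "accept" already ran
```

    The synchronization the mail talks about would hook in between
    such steps; this sketch only shows the ordering/resume skeleton.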


14. 3.0 git/bzr/$VCS
    First: For the following writeup we do not care if the VCS in
    question is git, bzr, hg or whatever. Current preference seems to be
    git, for its speed and flexibility, but the general arguments apply
    to any distributed VCS that could be used here. So interchange git
    with whatever you would like to see.

    We realize that this is a hot topic and understand that many
    people would like to be able to *entirely* manage their packages
    using a distributed version control system, even as far as having
    a "git push" be the actual upload.

    We do understand the desire and reasons for wanting something like
    this but do not think the real usage of it - pushing entire
    repositories to ftp-master - is a goal we can currently support.

    The recent discussion, started a little before the DebConf Source
    Format BoF and then continued on debian-devel[4] pretty much listed
    the points the FTP Team has with this approach as Russ had discussed
    with us before. That is also the reason we did not participate any
    further - if there is nothing new to say there is no need to mail. :)

    So, to keep this already pretty long mail from exploding, our main
    trouble with such a package format is still the already mentioned
    work we have to do in the NEW queue. There needs to be a
    reasonable way to ensure Debian is actually allowed to distribute
    what we put on our mirror, and having huge git repositories with
    lots of commits and branches will make NEW reviews near impossible
    with the current set of tools. How are we supposed to ensure that
    there wasn't a number of files with headers like "THESE FILES ARE
    FOR INTERNAL USE OF $COMPANY ONLY, IF YOU DISTRIBUTE THEM WE WILL
    SUE YOU"?[5] When the file was removed using git rm a hundred
    revisions ago, with a commit message not even clearly describing
    it? We would still distribute it, people could just check out an
    old version and have it. -> Bad.

    Yes, shallow clones and allowing only something like one or two
    revisions in an upload seem to help, but then we don't really see
    the point in a new format. It would lose so much of what a full
    repository can give you that one can directly use a 1.0 with
    patches or another of the existing 3.0 formats.

    Besides this, we do invite people interested in a 3.0 $VCS format
    to work on the problems and are happy to participate, so we get to
    something we can all live with - or all see that it won't work
    out. But we will not be the drivers behind it, our plate is too
    full already.

[4] http://lists.debian.org/debian-devel/2010/08/msg00244.html
[5] Yes, we had such stuff in NEW, not only once, not just twice.


Below here follow a few more points which might be hard to understand
unless you are into dak / team internals anyway, but we list them
nonetheless, as we had them over here and/or put them on the todo list
for the future. Nothing user-visible follows, so you may skip the rest
if you want to.


15. control-suite sanity
    Right now there is no sane version checking done when we import
    new data into a suite using c-s. This means that in theory the
    release managers could put packages/versions from any suite into
    testing (say, an oldstable version into testing, an experimental
    version into testing), completely violating any version constraint
    the suites have when processing uploads.
    This is currently solved/worked around by having a very big
    (virtual) hammer flying above the release/volatile people's heads
    - should they ever make use of this capability and take versions
    from elsewhere than the allowed source suite, the import process
    would be stopped by us until we have all code changed to fully
    check it.

    Obviously this was never needed (britney being in the release
    team's hands since April 2008!), which is also the reason why it
    is still an open issue.

    But the acceptance of more archives and suites, with varying teams
    responsible for them, increases the priority of this task, so we
    had a discussion about how to properly design and code this. It
    seems having it implemented in the database is the way to go, but
    the exact details aren't written in stone yet. Mark agreed to take
    a hammer and chisel to do so.
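
    A minimal sketch of the intended check - the suite mapping, the
    toy data and the function name are all invented for illustration,
    and the real implementation will likely live in the database:

```python
# Illustrative only: before importing a (package, version) pair into a
# target suite, require that exactly that version exists in the
# suite's allowed source suite. Mapping and archive data are made up.

ALLOWED_SOURCE = {"testing": "unstable"}   # hypothetical mapping

ARCHIVE = {                                # suite -> {package: version}
    "unstable": {"hello": "2.6-1"},
    "oldstable": {"hello": "2.2-1"},
}

def import_allowed(target, package, version):
    """Allow the import only if the version comes from the allowed
    source suite of the target."""
    source = ALLOWED_SOURCE.get(target)
    if source is None:
        return False
    return ARCHIVE.get(source, {}).get(package) == version
```

    With this in place, importing the unstable version of a package
    into testing passes, while sneaking in an oldstable version does
    not - the "virtual hammer" becomes an actual constraint.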


16. .changes parsing
    We currently parse .changes files at multiple places.
    This is insane and actually also stupid, as changes files are only
    trusted at the initial parsing done by process-upload (called with
    --initial then). Everything later on, even repeated p-u calls,
    must only use the database to gather the needed information; we
    can't trust the .changes anymore to contain correct data (just
    think about removed byhand parts as an example). This needs
    changes in code, but should be a good task for a new volunteer as
    it doesn't seem to be hard to do, just needs time. :)
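
    As a stdlib-only sketch of the "parse once, then trust only the
    database" idea: .changes files are RFC822-style, so a single
    trusted parse can load the fields. dak itself uses python-debian's
    deb822 classes for this; the fields below are a toy subset, not a
    complete .changes file.

```python
# Sketch: one initial, trusted parse of a (truncated, made-up)
# .changes file using the stdlib email parser. Everything after this
# parse should read from the database instead of the file.
from email.parser import Parser

CHANGES = """\
Format: 1.8
Source: hello
Version: 2.6-1
Distribution: unstable
Architecture: source amd64
"""

def parse_changes_once(text):
    """The single initial parse; later processing must not reparse."""
    msg = Parser().parsestr(text)
    return {key: value.strip() for key, value in msg.items()}

fields = parse_changes_once(CHANGES)
```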
    

17. dak config
    Our text config file for dak is a nuisance that actually
    complicates code in unneeded ways, duplicates information and is
    basically hated. We already moved many parts of it into the
    database (for example almost all attributes of a suite), but
    should move nearly everything else over. Our goal is to end up
    with a config that only tells us where to find the database; the
    rest is taken from there. Compared to our Essen meeting last year
    we are much nearer to that goal nowadays.


18. Contents
    We are still working on this, mainly due to time constraints on the
    side of the main driver behind this. Should be ready soon.


19. g-p-s (aptconfig class)
    When we moved to the new ftp-master host franck.debian.org we had
    been astonished by its speed - and immediately set out on the task
    of making it go slow. We did not yet succeed with that, but we
    managed to have many things run in parallel that were very slow
    before.
    One of those things is the slow apt-ftparchive runs. While this
    ran serially over all suites and architectures in the past, easily
    taking up 95% of the time a "dinstall" run takes, we wrote a
    script to run it in parallel. Thus we now regenerate the
    Packages/Sources and Contents files 4 times an hour (yes, it's
    that fast that we don't notice these frequent runs, and using the
    caching feature there enables even faster runs during dinstall).
    We just do not like the script that allows us to do this very
    much; we need a real class to build up the apt config file that
    every apt-ftparchive run needs to be fed, it is too much of a hack
    currently. This is also work in progress, currently with Mark.
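
    A sketch of the kind of thing such an "aptconfig class" would
    emit - the paths, sections and suite names are placeholders, not
    the real ftp-master setup, and the function is invented for the
    example:

```python
# Illustrative only: generate a minimal apt-ftparchive configuration
# stanza for one suite. Real generation would cover per-architecture
# Packages/Sources/Contents settings, caching and more.

def apt_ftparchive_conf(archive_dir, suite, architectures):
    """Build a minimal apt-ftparchive config for one suite."""
    return "\n".join([
        'Dir { ArchiveDir "%s"; };' % archive_dir,
        'TreeDefault { Directory "pool/"; };',
        'Tree "dists/%s" {' % suite,
        '  Sections "main contrib non-free";',
        '  Architectures "%s";' % " ".join(architectures),
        '};',
    ])

conf = apt_ftparchive_conf("/srv/archive", "squeeze",
                           ["amd64", "i386", "source"])
print(conf)
```

    Generating one such config per suite/architecture pair is what
    makes handing the runs to parallel apt-ftparchive processes easy.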


20. highlight packages in NEW fixing rc bugs
    As it regularly happens that packages fixing release critical bugs
    need NEW processing, we plan to highlight these packages in the
    various output formats. This should make it easier for us to
    fast-track these packages and should also make it easier for the
    release team to review (and even veto) them during a freeze.


21. untag pending bugs if a package is rejected
    As the bugs fixed by packages in the NEW queue are currently
    (semi-)automatically tagged "pending", Jan Hauke Rahm suggested to
    untag them if a package gets rejected from NEW. There's already a
    patch floating around, which needs to be reviewed and merged.


-- 
bye, Joerg
<rra> I don't know that "useful" is the best overall descriptor of the
      things Ganneff picks to put in his signature.  :)
<liw> obviously there is too little useful things said so he has to
      quote silly things instead
 * Myon sees Ganneff adding yet another liw quote
