
An improved AM report method



(A kind-of followup to my earlier discussion question on activity-based
assessment of applicants)

Currently, the AM report consists of two parts: a public report basically
saying "P&P: yes", "T&S: yes", and then a private report to the FD/DAM with
complete mail logs covering the entire essay question session.  I can't
imagine that is particularly fun to deal with.

Instead, I propose that the AM report be extended and expanded to include
more detail about what was actually covered, and the manner in which the
applicant was assessed.  Obviously, this needs general approval both from
AMs (who will be writing the reports) and FD/DAM (who need to use the report
to make their approval decision).  The question of how much of this
information gets made public is also at issue here.

My idea is to break the main sections of the report (P&P, T&S) down further
into "core competencies".  For the procedures section, for instance, I've
gone through my template questions and roughly classified them by what I
think they're trying to test the applicant on.  I've come up with:

* Care and Feeding of the BTS

* Maintainer uploads

* NMUs

* General Packaging practices (config files; custom permissions)

* QA

* Security of the project (GPG, security fixes)

* A misc section (translations, mailing lists, developers' phone numbers)

If we can define each of these sections reasonably, then we effectively have
a curriculum which AMs can use to assess (and mentor) their applicants. 
Obviously, an AM can continue to use the existing questions, or make up
their own, but they may also decide to do things differently (as I want to
try, by having applicants *do* things rather than answer questions).  AMs
could also mix it up: assess some sections by essay questions, others by
practical tasks, and perhaps even by chats on IRC...

An example report could be:

Procedures
------------

* BTS

I asked Fred Q. Hacker some questions on the use of the BTS, and he answered
all of my questions satisfactorily.  A complete transcript of this section
is attached.

* Maintainer uploads

I asked Fred to repackage Xorg without the use of any build system helpers. 
He performed this task satisfactorily, and I've attached his packages.

* NMUs

I asked Fred to prepare an NMU of dpkg and send it to me.  It checked out
properly.  Package attached.

* General packaging

I sent Fred my standard testing package (which mishandles generated config
files and has some permissions problems) and asked him to fix it up.
Despite some initial trouble with the permissions overrides, he quickly
discovered his mistakes and solved them.  My original and his fixed packages
are attached.

* QA

I asked Fred to do a QA overview of 10 packages currently maintained by the
QA group, which I chose at random.  Those packages were X, Y, ... Z, Q.  As
a result of his analysis, he made comments on bugs #nnnn, #nnnn, #nnnn, and
#nnnn.  Package Y has been removed at his suggestion to the debian-qa list,
at http://lists.debian.org/debian-qa/2005/07/xxxxxxx.html.

* Security

I was lucky enough to find an unfixed security problem in one of Fred's
packages, and reported it as bug #nnnnn.  Fred responded quickly to the
problem, and carried out the actions associated with a security bug well.

I contacted several of the people whose keys Fred has signed, and asked them
for their opinion of Fred's key security checking.  Their reports were
positive.  I've attached their comments.

------------------------------------------------------------

(That report, BTW, took me about 20 minutes to type up.  I did have to make
up the information as I went, though, which might make a difference as to
the time required).

One thing that a more formalised curriculum (why do I keep using that word?)
might allow is expanding the applicant management pages on nm.debian.org into
the one-stop shop for all of your report construction needs -- boxes in which
you describe the steps you took to assess the applicant in each sub-area, and
spaces to attach supporting documentation and files.
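
For what it's worth, here's a sketch of what each of those boxes might store
(again in Python with made-up names -- nothing here is real nm.debian.org
code):

class SectionReport:
    """One box on the applicant's page: a sub-area of the curriculum."""
    def __init__(self, section, steps="", attachments=None, public=False):
        self.section = section              # e.g. "qa", "security"
        self.steps = steps                  # what the AM did in this area
        self.attachments = attachments or []  # supporting evidence (filenames)
        self.public = public                # OK to include in the public report?

# A half-finished report is just a partially filled list of these, which is
# exactly what FD wants to see if an AM has to hand an applicant back.
report_so_far = [
    SectionReport("bts", "Essay questions; all answered satisfactorily.",
                  ["bts-transcript.txt"], public=True),
    SectionReport("nmus", "Prepared an NMU of dpkg; it checked out properly.",
                  ["dpkg-nmu.diff"]),
]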

You don't need to warn me about the lack of offline-ability this produces. I
spend 5 hours a day on a train with no 'net access -- offline operation is
my modus operandi these days.  I think the benefits outweigh the
disadvantages -- for instance, if an AM drops off an applicant, FD has a
much better idea of where they were up to.

It makes the final AM report pretty much write itself, too -- the production
of the final report would be nothing more than "click the 'send report'
button", and a short public report goes to d-newmaint, the full report goes
to nm-ctte, FD, DAM, etc, and the AM is done.  The report can be structured
exactly how FD/DAM needs it, and everyone is happy.
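
In code terms, "click the 'send report' button" could boil down to something
like the following (a hand-waving sketch: the addresses are placeholders and
send_mail() is a stub for whatever mail transport already exists):

PUBLIC_LIST = "<d-newmaint list address>"          # placeholder
PRIVATE_DEST = "<nm-ctte, FD and DAM addresses>"   # placeholder

def send_mail(to, subject, body):
    # Stand-in for whatever mail transport nm.debian.org already has.
    print("To: %s\nSubject: %s\n\n%s" % (to, subject, body))

def send_report(applicant, sections):
    # 'sections' is a list of dicts shaped like the SectionReport sketch above:
    # {"section": ..., "steps": ..., "attachments": [...], "public": True/False}
    public, full = [], []
    for s in sections:
        full.append("* %s\n\n%s\n\nAttachments: %s\n"
                    % (s["section"], s["steps"], ", ".join(s["attachments"])))
        public.append("* %s: %s" % (s["section"],
                      s["steps"] if s["public"] else "assessed satisfactorily"))
    send_mail(PUBLIC_LIST, "AM report for %s" % applicant, "\n".join(public))
    send_mail(PRIVATE_DEST, "Full AM report for %s" % applicant, "\n".join(full))

send_report("Fred Q. Hacker",
            [{"section": "bts", "steps": "All BTS questions answered well.",
              "attachments": ["bts-transcript.txt"], "public": True}])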

Several questions naturally arise out of this sort of reporting, though:

* Should all of the information I wrote in the sample report above be in the public
	report?  I wouldn't have had a problem with my AM putting that info
	in a public list archive, myself, but I understand that other people
	may have legitimate concerns.  Please voice them.  Do we even
	actually need a public report?

* Would a report of the nature above give the FD/DAM enough info to be able
	to make a reasonable assessment of candidates in most cases?  

* Are AMs willing to take the extra time to write out a more detailed report
	of this type?

I'm willing to devote several chunks of train-time to classifying the
questions in the other areas, and maybe hacking on the nm.debian.org code to
support my ideas about report generation.

- Matt
