On Fri, Jan 25, 2008 at 08:05:01PM +0100, Stefano Zacchiroli wrote:
> If so we can just run a script to check whether the affected files are
> duplicates or not, get rid of all the duplicates ... and hope nothing
> else remains in the bucket :-)

Just for fun I've actually searched for duplicates. Attached is a list:
106 out of 386 "weird sorted" news dirs are duplicates; the remainder
are not. Here is how I've created the attached file:

  zack@master:/srv/packages.qa.debian.org/www$ cut -f 1 weird_sorted_news.txt \
    | bin/find_dup_sorted_news.sh \
    > dup_sorted_news.txt

So we can't just remove all the "weird" files. Still, the non-duplicates
can be renamed so that they sort properly (for example renaming them
from, say, "25.txt" to "TIMESTAMP.25.txt", where "TIMESTAMP.txt" would
have led to a clash).

What do you think of this solution? (I volunteer to write the batch
script which does this fix.)

Whether clashes can happen again remains an open question ... (see my
previous post).

Cheers.

-- 
Stefano Zacchiroli -*- PhD in Computer Science ............... now what?
zack@{upsilon.cc,cs.unibo.it,debian.org} -<%>- http://upsilon.cc/zack/
(15:56:48) Zack: e la demo dema ?    /\  All one has to do is hit the
(15:57:15) Bac:  no, la demo scema   \/  right keys at the right time
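
P.S. A minimal sketch of the rename step, just to make the proposal
concrete. This is not the actual batch script; it assumes the file's
mtime (epoch seconds) stands in for TIMESTAMP, and takes the list of
non-duplicate files as arguments — both of which are my assumptions:

```shell
#!/bin/sh
# Sketch only: prepend each file's mtime so plain lexicographic
# sorting of the news dir matches chronological order.
# Using mtime as the TIMESTAMP is an assumption, not the final scheme.
prefix_with_mtime() {
    for f in "$@"; do
        dir=$(dirname "$f")
        base=$(basename "$f")
        ts=$(stat -c %Y "$f")      # mtime as a stand-in for TIMESTAMP
        mv "$f" "$dir/$ts.$base"   # e.g. 25.txt -> <epoch>.25.txt
    done
}
```

The prefixed names still keep the original "25.txt" suffix, so nothing
is lost if we later need to map a file back to its old name.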