On Thu, Apr 05, 2018 at 10:43:04AM +0200, Philipp Kern wrote:
> So what would be needed to make at least a simple export of the data
> happen? I think the requirements I'd have are these:

that's a good question! :) maybe we can sit together with some ftp-team
and reproducible builds folks in Hamburg and finalize the design and
implement it?

> * Data is sufficiently fresh and optimally accessible before the mirror
>   pulse happens so that you can always fetch the corresponding buildinfo
>   for a newly pushed package.
> * Some way of actually deducing the path to the buildinfo file, either
>   through some sort of redirector or by naming the files in a consistent
>   fashion.
>
> Right now the second point does not work with the date-based farm that
> is used to archive the buildinfo files. It would work if we were to just
> apply the same splitting as in the regular pool. For the former just
> pushing the content through static.d.o should work and dak could push
> the content before pushing the mirrors?
>
> Intuitively I would not care about cryptographic authentication of the
> data. After all it can be verified by rebuilding if the package is
> reproducible.

agreed with all of these points, thanks!


-- 
cheers,
	Holger
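For reference, the pool-style splitting Philipp mentions could look roughly like this. This is only a sketch: the `buildinfo/` top-level directory and the function names are assumptions for illustration, not an agreed layout or dak's actual code; the prefix rule is the one the regular pool uses (sources named `lib*` go under `lib` plus their first letter, everything else under its first letter), and epochs are stripped from filenames as in the archive.

```python
# Sketch of pool-style path derivation for .buildinfo files.
# Assumption: a hypothetical "buildinfo/" tree mirroring the pool layout.

def pool_prefix(source: str) -> str:
    # The pool puts lib* sources under lib<first letter>, e.g. libfoo -> libf;
    # everything else goes under its first letter, e.g. hello -> h.
    return source[:4] if source.startswith("lib") else source[:1]

def buildinfo_path(component: str, source: str, version: str, arch: str) -> str:
    # Epochs (e.g. "1:1.1.0h-2") do not appear in archive filenames,
    # so drop anything up to and including the first colon.
    upstream = version.split(":", 1)[-1]
    return (f"buildinfo/{component}/{pool_prefix(source)}/{source}/"
            f"{source}_{upstream}_{arch}.buildinfo")

print(buildinfo_path("main", "hello", "2.10-1", "amd64"))
# buildinfo/main/h/hello/hello_2.10-1_amd64.buildinfo
```

With a fixed scheme like this, no redirector is needed: any client can compute the URL from the source name, version, and architecture alone, which is exactly what the date-based farm prevents today.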