In this thread I learned something that is probably obvious to most people, but that I had not fully internalized. So just to be explicit, here is what I learned.

* The 'testing' distribution tries to maintain a consistent contour across all architectures by keeping each package at the same version on all of them. There are specific exceptions, such as the kernel, but mostly this holds true.

* You must upload a binary package along with the source package. Source-only uploads are not allowed, in order to enforce the rule that packages must build successfully. The feeling is that people abuse source-only uploads by uploading broken, unbuildable packages. Forcing a binary upload proves the package is buildable at least somewhere.

* The build daemons for the other architectures will automatically build the new source package on their architecture. Having been built on that architecture, that binary package will have different dependencies: it will depend on whatever unstable shared libraries were installed on that build system at the time. Package dependencies may therefore not be uniform across all architectures.

* The uploaded binary package is all well and good for the specific architecture it was built on. But because of the need to keep an even contour across all architectures, it won't move to testing on any particular architecture until it can move to testing on all architectures, at which time the testing version on all architectures moves together.

It might be possible to upload binary packages for every architecture individually, but that sounds like a bad idea in general. It could lead to packages that are unbuildable anywhere except on the uploading maintainer's own systems.

Bob
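The "even contour" rule in the last bullet can be sketched as a toy model: a new version is a candidate to migrate from unstable to testing only once every release architecture has a successfully built binary at that version. This is purely illustrative (the function name and data layout are made up here), not the real testing-migration scripts:

```python
# Toy model of the migration rule: a new version moves to testing only
# when the build daemons on ALL architectures have produced it.
def can_migrate(new_version, built_versions):
    """built_versions maps architecture -> version currently built in unstable."""
    return all(v == new_version for v in built_versions.values())

builds = {"i386": "1.2-1", "sparc": "1.2-1", "m68k": "1.1-3"}
print(can_migrate("1.2-1", builds))   # m68k still lags, so nothing migrates anywhere

builds["m68k"] = "1.2-1"
print(can_migrate("1.2-1", builds))   # now the new version can move on all architectures at once
```

The point of the sketch is the all-or-nothing behavior: one lagging or failing architecture holds the version back everywhere, which is exactly why a package unbuildable outside the maintainer's own machine would be a problem.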