
Re: How much data load is acceptable in debian/ dir and upstream (Was: edtsurf_0.2009-7_amd64.changes REJECTED)



Hi Paul,

On Tue, Sep 15, 2020 at 10:00:45PM +0200, Paul Gevers wrote:
> On 14-09-2020 21:04, Andreas Tille wrote:
> > In the case of larger data sets it seems natural to provide the
> > data in a separate Architecture: all binary package, to avoid bloating
> > the machines of users who do not want it and to save bandwidth on our
> > mirror network.  New binary packages require processing through the
> > NEW queue, and my question here is about a set of rejection mails we
> > received (...).
> 
> I assume you realized, but just in case you didn't: the data doesn't
> need to go into any binary package for autopkgtests to find it. While
> running autopkgtests, the SOURCE is unpacked and available. (You
> mentioned other reasons why you want it, though.)

Yes, I am well aware of that fact.  However, in the current discussion
this would only "help" us in the sense that, without an extra binary
package, we would "avoid" the ftpmaster review of the source package.
My intention is not to avoid the review but to clarify the situation.
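
For concreteness, here is a minimal sketch of what Paul describes; the
test name and the test-data/ path are made up, but the mechanism is
simply that autopkgtest runs each test from the root of the unpacked
source tree, so a test can read the data directly:

  # debian/tests/control (hypothetical test name)
  Tests: data-check
  Depends: @

  # debian/tests/data-check (executable), started by autopkgtest from
  # the root of the unpacked source tree
  #!/usr/bin/python3
  import os, sys
  data = "test-data/sample.pdb"   # hypothetical path inside the source
  if not os.path.isfile(data):
      sys.exit("expected test data not found in the source tree")
  print("found", data, os.path.getsize(data), "bytes")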

If I understood ftpmaster correctly, the amount of data in the source
package is the problem.  It would be great to hear other developers'
opinions about how much data is acceptable for proper testing and where
to put it.  To the best of my knowledge this is not specified in our
documentation.
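
For reference, the split mentioned above is the usual debian/control
layout with a separate Architecture: all package carrying the data; the
package names here are only illustrative, not the actual edtsurf
packaging:

  Package: foo
  Architecture: any
  Depends: ${shlibs:Depends}, ${misc:Depends}
  Description: the tool itself, without the large data

  Package: foo-data
  Architecture: all
  Depends: ${misc:Depends}
  Description: example and test data for foo
   Keeping the data in a separate Architecture: all package keeps it off
   the machines of users who do not need it and saves mirror bandwidth.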

Kind regards

      Andreas.

-- 
http://fam-tille.de

