
Re: How much data load is acceptable in debian/ dir and upstream (Was: edtsurf_0.2009-7_amd64.changes REJECTED)



On 9/16/20 2:55 PM, Steven Robbins wrote:
> Since you're soliciting opinions, here's mine.  In the absence of a documented 
> consensus, ftpmaster should respect the packager's judgement rather than 
> reject on their own personal opinion.

Reviewing the packaging is also part of the FTP masters' job.

On 9/16/20 2:55 PM, Steven Robbins wrote:
> Thorsten's observation ("... is much too large") is completely
> arbitrary. Also, why does size matter?  If the files are necessary,
> they will show up  somewhere. Why do we care which tarball they are
> part of?

The above shows you haven't understood what the problem is.

I already replied to Andreas, but here's my own thinking; hopefully it
matches TA's as well.

With a separate source package holding the data, the data set will
typically be uploaded *once*; after that, we may at most see new
revisions of a tiny debian tarball. I also don't think such a package
will need many revisions anyway.
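
To illustrate (the "edtsurf-data" name is invented here, this is only
a sketch), the data-only source package needs little more than a
trivial debian/control:

    Source: edtsurf-data
    Section: science
    Priority: optional
    Build-Depends: debhelper-compat (= 13)
    # Maintainer and other required fields omitted for brevity

    Package: edtsurf-data
    Architecture: all
    Depends: ${misc:Depends}
    Description: reference data set for testing edtsurf
     Large data files used to test edtsurf, shipped in their own source
     package so that they only have to be uploaded once.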

On the other hand, the package which needs to be tested against this
dataset may need frequent upgrades to the latest upstream release.
Re-uploading a huge debian tarball each time is wasteful.
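
The program's package then just declares the data as a test
dependency, for instance in debian/tests/control for autopkgtest
(again only a sketch; the test name is invented):

    Tests: upstream-testsuite
    Depends: edtsurf, edtsurf-data
    Restrictions: allow-stderr

With that split, each new upstream release of edtsurf re-uploads only
a small debian tarball, while the big data tarball sits untouched in
the archive.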

Cheers,

Thomas Goirand (zigo)

