
Re: How should we handle greenbone-security-assistant?





On Fri, Dec 18, 2020 at 8:59 am, Raphael Hertzog <hertzog@debian.org> wrote:
On Thu, 17 Dec 2020, Pirate Praveen wrote:
 > - assurance that we use DFSG free code only
 >   => we can have tool to review licenses of what has been
 >   downloaded during build and embedded in the binary packages

Then there would not be any value for Debian in such a scenario, as people
can do such analysis on any distro/container.

It would make Debian irrelevant.

I don't think so. First, the tool is there to help the maintainer make
that assertion; it's unlikely to be 100% automated, and will likely point
out files to inspect manually, and so on.

And, as a user, even if the tool exists, I wouldn't want to run it
manually; I would continue to rely on Debian for the vetting process. I
don't want to have to do this on my own.


Even with that tool, if we have to change any modules, we will have to create forks of these modules, publish them, and also modify package.json to use these forks.

The gitlab package has had to do that many times with many rubygems, when upstream is not responsive to pull requests.

 > => we are doing bad now because many useful things are not packaged
 >   (due to the mismatch between our rules and those not-longer-so-new
 >   ecosystems) and when users have to manually install, the reliability
 >   goes down...

This I agree with, but it could be achieved by a mix of vendoring and
individual packages. We can vendor modules that are specific to a single
app and package more widely useful libraries as individual packages.

For this to work at scale, you need to work with the upstream ecosystem so
that it works out of the box... AFAIK, right now, adding the required node
modules to Build-Depends will not prevent those modules from being
downloaded by the upstream build system, and there's no simple flag you
can just add to enable that behaviour. Is that correct?

You will have to create a patch that removes the packaged node modules from package.json.

See this patch in gitlab:
https://salsa.debian.org/ruby-team/gitlab/-/blob/master/debian/patches/0740-use-packaged-modules.patch#L94
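For illustration only, here is a toy sketch of what such a patch achieves: dropping modules that are already packaged in Debian from package.json's dependencies, so npm/yarn won't try to fetch them. The module names and the packaged-module list below are hypothetical, not the actual gitlab list:

```javascript
// Sketch: strip Debian-packaged modules from a package.json object.
// The set of packaged modules here is made up; in practice it would
// reflect what is available via apt.
const packagedInDebian = new Set(['lodash', 'moment']);

function stripPackagedDeps(pkg) {
  const deps = { ...pkg.dependencies };
  for (const name of Object.keys(deps)) {
    if (packagedInDebian.has(name)) {
      delete deps[name]; // resolved from apt instead of the registry
    }
  }
  return { ...pkg, dependencies: deps };
}

const pkg = {
  name: 'example-app',
  dependencies: { lodash: '^4.17.0', moment: '^2.29.0', leftover: '1.0.0' },
};
const stripped = stripPackagedDeps(pkg);
console.log(Object.keys(stripped.dependencies)); // only 'leftover' remains
```

In the real patch this is done as a static diff against package.json rather than a script, but the effect is the same.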

Additionally, you will have to tell webpack/rollup to pick up the modules installed by apt:

https://salsa.debian.org/ruby-team/gitlab/-/blob/master/debian/patches/0740-use-packaged-modules.patch#L35

You may also be able to add a plugin to yarn to automate this. If there were a yarn plugin that checks for and prefers the apt-installed modules, it would work without having to patch the upstream build files. Ruby already does this with the rubygems-integration package: `bundle install --local` will pick up the apt-installed gems without any change to the upstream build tools.
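Such a hypothetical yarn plugin would essentially do what this toy resolver does: check the apt location first and fall back to the project's own node_modules. The `exists` predicate is injected so the sketch can be exercised without touching the real filesystem:

```javascript
// Toy resolver: prefer apt-installed modules over node_modules.
// `exists` is passed in so the sketch doesn't depend on what is
// actually installed on this machine.
function resolveModule(name, exists) {
  const candidates = [
    `/usr/share/nodejs/${name}`, // installed via apt
    `node_modules/${name}`,      // fetched by npm/yarn
  ];
  for (const dir of candidates) {
    if (exists(dir)) return dir;
  }
  return null;
}

// Pretend only 'd3' came from apt and 'leftover' from the registry.
const fakeFs = new Set(['/usr/share/nodejs/d3', 'node_modules/leftover']);
const exists = (p) => fakeFs.has(p);

console.log(resolveModule('d3', exists));       // /usr/share/nodejs/d3
console.log(resolveModule('leftover', exists)); // node_modules/leftover
```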


 > - possibility to rebuild from source
 >   => we could have some sort of proxy that would store everything
 >   downloaded and let us rebuild an identical package without net access
 >   even if the remote resources disappear

 Why would anyone need to use Debian in such a scenario?

I don't know about you, but the reasons to use Debian would not be changed
by the addition of this mechanism. I know that I use only free software,
that all the tools are easy to install, that some sane default
configuration has been provided by the maintainer, that further
instructions are in README.Debian, etc.

All the current trends make it easy for developers to ship code directly
to users. This encourages more isolation instead of collaboration between
projects. It also makes it easier to ship proprietary code, and leads to
duplicated security tracking, or the lack of it. Debian and other
distributions have provided an important buffer between developers and
users, since we do not always follow the priorities or choices of upstream
developers exactly.

This I agree with. And I believe it still holds true even if we accept to
vendor large amounts of stuff.

We need not be doing what is the buzz of the time. Free Software was not a
mainstream idea when we started.

I don't understand what you are trying to say here.

The mainstream idea seems to be isolating every project, without any coordination with other projects, including downstream distributions. The trend is to ship only one configuration (typically as a Docker container; some projects don't even support building from source).

With distributions like Debian, we care for the whole Free Software ecosystem. When we do transitions, we make sure every free software package using a library or tool is updated. None of these are mainstream views.

The mainstream view is to continue using a dependency version till eternity, and nothing really breaks once it works on the developer's machine. There is even a joke that Docker is a way to say: if it works on the developer's system, let's ship the developer's system to everyone.

Yes, it is a lot of work, but it makes the whole Free Software ecosystem better by keeping dependencies updated for all applications.

Distributions are also a way for users to effectively use the freedom to modify the software. I don't think having to depend entirely on developers is a win for users in the long term; their priorities are not necessarily best for users.


