
Re: Aborting all plans about deep learning frameworks.



It's true that this field is moving rapidly these days and it's hard
to keep up with upstream releases.  My interest in taking over the
Keras and Lasagne packages was mainly to help provide stable and
well-tested targets for these APIs, as well as to learn about
packaging.  Two things that surprised me shortly after I adopted them
were Theano going EOL and Keras being merged into TensorFlow upstream.
Meanwhile, you are right that the vast majority of users will always
install via pip in order to have the latest version.  However, EOL
also means that Theano will no longer be *changing*, and as long as
Keras still passes its tests with Theano as a backend I see no reason
to remove it.  In fact, the Keras Debian package can provide a nice
way to install and use Keras without a TensorFlow dependency, which
can be a great thing for developers who want to integrate neural
networks into their software in a stable manner.
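
For illustration, here is a minimal sketch of what that could look
like.  It assumes the Debian-packaged Keras and Theano are installed
(package names like python3-keras and python3-theano are my
assumption, not checked against a particular release) and uses the
standalone Keras 2.x convention of selecting the backend via the
KERAS_BACKEND environment variable:

    # Sketch: run packaged Keras on the Theano backend, no TensorFlow.
    # Assumes something like: apt-get install python3-keras python3-theano
    # (package names are an assumption on my side).
    import os
    os.environ.setdefault("KERAS_BACKEND", "theano")  # set before importing keras

    from keras.models import Sequential
    from keras.layers import Dense

    # A tiny feed-forward network, just to confirm the stack works end to end.
    model = Sequential([
        Dense(16, activation="relu", input_shape=(8,)),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="sgd", loss="binary_crossentropy")
    model.summary()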

In that sense I personally think that the goal of packaging DL
software in Debian should focus on stability.  While researchers will
most often use the latest and greatest releases and features, the
speed at which releases come also makes it hard to build software on
top of these engines, because a moving-target API is difficult to
depend on.  So I think there is real benefit in providing packages of
LTS versions, and eventually in packaging software which uses them, so
that it can be relied on as "available" via apt-get rather than being
a moving target.  When engines like TensorFlow get updates, the Debian
ecosystem can provide a great way to automatically check that
dependent software is not broken *before* updates get deployed to end
users, and I think this can be a huge benefit provided by this
community.

We will undoubtedly start to see more software shipping with built-in
ML-powered parts in the coming years, and it would be great to be able
to provide a stable, slower-moving platform on which these can be
developed and maintained.


Steve


On Sun, Nov 4, 2018 at 5:00 PM Mo Zhou <lumin@debian.org> wrote:
>
> Hi Science Team,
>
> Sorry for the depressing mail subject. Some of you may remember
> my past endeavor to introduce some deep learning software to
> Debian, including TensorFlow etc. But now I cannot bear them
> anymore.
>
> Indeed my research field is computer vision and deep learning,
> and indeed I'm still running PyTorch for experiments as I'm
> writing this email. But all these experiments are powered by tons
> of non-free software and data, including pre-trained neural networks
> and performance library blobs. In fact, if someone asks me how
> to install TensorFlow/PyTorch on Debian, my answer is definitely
> either "use Anaconda" or "use the upstream wheel", because those
> lack neither performance nor commercial support. In contrast, my
> motivation to package them was in fact to "have fun" and
> "learn something". Now I'm tired of packaging such rapidly
> developing stuff, and want to offload that burden from myself.
> I don't regret aborting all the plans.
>
> Anyway, my interest in Debian-related work is continuously changing.
> This email doesn't mean that I'm fading out from this team, but
> rather that I'm moving on to work on some more useful and valuable
> packages. This mail is a little bit sad, but I feel relieved by
> throwing these plans away.
>
> P.S. The deeper I dig into Debian's bugs, the more high-popcon
> buggy packages emerge... Obviously fixing important stuff is more
> valuable than introducing new toys.
>
> Happy hacking!
>

