
Re: Status GCE images and trademark (was Please let's not talk about "clouds")



That said, the bulk of Google Cloud just wants to ship completely 100%
open-source code here.  It shouldn't be terrible to get it right, and
to do so as a posh + proper citizen of jessie.  But I'm afraid I don't
see the path completely yet, especially on the "how do we build the
packages for these things once versionitis starts setting in" front.

-david

On Thu, May 16, 2013 at 12:42 PM, David McWherter <cache@google.com> wrote:
> I think keeping us in unstable and backports is totally reasonable for
> the time being, especially if that helps us with some of the
> binary-only packaging we may need to deal with.  Jimmy indicated there
> may be quality-control issues with the .debs produced by existing
> Google tools and by FPM that could make this harder, but those could
> be fixed in time too.
>
> I realize that from your user point of view, non-deb is
> non-accessible.  I want to solve that problem too, but the simple
> fact is that 90% of the Google Compute customer base doesn't have that
> problem.  The biggest issue for me, and I think for them, is that
> upgrading gcutil via plain apt-get doesn't work for them yet.  Fixing
> that would also solve your problem, I think :)
>
> I think packaging 'gcutil' as a Debian package that installs cleanly
> would be relatively easy, and it's on our plate (of course we've all
> been overloaded with other things getting ready for Google I/O).  I
> think a "binary" deb is most useful for users, although I'm actually
> not sure what differentiates a "binary" deb from a "source" deb if the
> tool is 100% Python.  Our gcutil would statically bundle a number of
> other packages it needs.  'gsutil', I think, is still a harder story:
> we're not 100% confident that the .deb we create works correctly on
> both Debian 6 and Debian 7 because of the crazy version-mismatch problems.
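>
> For what it's worth, the kind of thing I imagine for that "binary" deb
> is a single Architecture: all package -- something like the stanza
> below, which is entirely hypothetical (names and dependencies
> invented, not our actual packaging):
>
>   Package: gcutil
>   Architecture: all
>   Depends: ${python:Depends}, ${misc:Depends}
>   Description: command-line tool for Google Compute Engine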
>
> -david
>
> On Thu, May 16, 2013 at 12:08 PM, Yaroslav Halchenko
> <debian@onerussian.com> wrote:
>> Hi David and Jimmy,
>>
>> On Thu, 16 May 2013, Jimmy Kaplowitz wrote:
>>
>>> All of these things will need to be a multidirectional discussion with
>>> various participants having their areas of familiarity and expertise,
>>> and other areas where listening to other perspectives and needs will
>>> be most productive. My individual role is somewhat in the middle here,
>>> trying to help bridge gaps as well as doing some of the technical
>>> work. I'm very happy that my colleagues want to do the right thing for
>>> both Google's customers and Debian.
>>
>> "do the right thing" is indeed a worthwhile motivator ;)  I am all for
>> it too, so please take my comments/clarifications bellow in that perspective:
>>
>>> > From Google's perspective, there are a few issues we don't yet know
>>> > how to solve:
>>
>>> >   1) As our online services change (rapidly), it's important to get
>>> > our customers new versions of our tools quickly and easily.  It's
>>> > terrible for customers to get an N-month-old version of our tools in a
>>> > Debian-6 or Debian-7 repo, and try to use that.  Our current solution
>>> > to that problem is to build and push images every month or so.  We'd
>>> > *love* a mechanism wherein we can push new versions of our tools to
>>> > official Debian archives every month or two...
>>
>> None of the tools in question is in any official Debian stable release
>> (neither in 6, AKA oldstable/squeeze, nor in 7, AKA stable/wheezy),
>> and they can never be uploaded there -- that train is gone.
>>
>> So for the next 1-3 years, until the next stable Debian release comes
>> out, you should not worry that users of the official Debian
>> repositories would get some stale version.  Packages can be uploaded
>> to Debian unstable (or experimental) only, and nearly as often as you
>> like (and as often as your Debian "sponsor", e.g. Jimmy, can allocate
>> time for) -- even many times a day, and the official archive itself is
>> refreshed twice a day, I believe.  Such packages could even be kept
>> from entering testing, to which packages usually migrate after 10 days
>> (or sooner) if no grave unfixed bugs are present, in case you really
>> want to keep a short leash there (but I do not think it is necessary).
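>>
>> Just to illustrate the mechanics (version, name and email below are
>> purely hypothetical): it is the distribution field of the
>> debian/changelog stanza that routes an upload to experimental rather
>> than unstable, e.g.
>>
>>   gcutil (1.8.1-1) experimental; urgency=low
>>
>>     * New upstream release tracking the current GCE API.
>>
>>    -- Uploader Name <uploader@example.org>  Thu, 16 May 2013 12:00:00 -0400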
>>
>> So altogether I really do not see ANY problem with rapid development of
>> your tools and their availability in the Debian archives.  Moreover,
>> presence in the archives with an automatic way to update (instead of
>> "go to a website and download a new version") sounds like the only
>> sensible approach here.
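>>
>> For the end user that "automatic way" then boils down to the boring
>> usual (assuming the package would indeed be named gcutil):
>>
>>   sudo apt-get update && sudo apt-get install gcutil
>>
>> which pulls in whatever version currently sits in the archive.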
>>
>> The only way I see users getting stuck with older versions within the
>> upcoming 1-2 years would be if Ubuntu (or some other, smaller
>> derivative) picked up such a package from Debian and included it in
>> their release... but I guess we could deal with that one way (keep the
>> packages in experimental) or another (just talk to them and explain why
>> they should not be included in non-rolling releases).
>>
>>
>>> >   2) gsutil has an array of C++ and Python dependencies that are
>>> > available in Debian-6, but not at high-enough versions.
>>
>> Once again: we would not even be able to upload to official Debian 6
>> or Debian 7 -- those are released already, and nothing can be added to
>> them (besides to the backports repository).  And if the versioned
>> dependencies are "good enough" in Debian sid ATM -- there is no
>> problem.  If they are outdated -- please say so, and we will work to
>> fix that ;)
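>>
>> For concreteness, versioned dependencies are expressed right in
>> debian/control; the stanza below is only a guess at what gsutil would
>> need (package names and minimum versions invented for illustration):
>>
>>   Depends: ${python:Depends}, ${misc:Depends},
>>            python-boto (>= 2.0), python-httplib2 (>= 0.7)
>>
>> and apt will simply refuse to install the package where those minimum
>> versions are not available.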
>>
>>> > We have a binary-only Debian package that installs all
>>> > the stuff necessary here, but it's unclear to us whether this is "good
>>> > enough" for inclusion in official Debian repos.  Our .tar.gz version
>>> > takes these dependencies and includes them essentially "statically
>>> > linked" in the gsutil installation directory, which is a little
>>> > unconventional as well.
>>
>> yes -- it would not work for 'official' Debian binary packages --
>> but that is not necessary (as per above).
>>
>> As for backport builds for older releases, there are multiple ways to
>> resolve the conundrum, and in any case you might end up creating your
>> own APT repository to provide the binary packages built for the
>> different releases.  That is similar to what we do in NeuroDebian
>> (http://neuro.debian.net), where we provide backport builds of all the
>> packages we also upload to Debian proper, and we strive not to waste
>> effort maintaining two copies of the same package.  Quite often the
>> system-provided libraries are indeed outdated, but it would be
>> detrimental to the integrity of the underlying distribution if we
>> simply provided "fresh" builds of a newer library, replacing some older
>> but stable version.  So far we have adhered to two approaches:
>>
>> 1. E.g. the cmtk package & mxml -- as long as licenses permit, I do
>> not strip the needed 3rd-party libraries from the source distribution:
>> in
>> http://anonscm.debian.org/gitweb/?p=pkg-exppsy/cmtk.git;a=tree;hb=HEAD
>> the Utilities/mxml directory ships sources (not binaries), which are
>> built and linked against ONLY on systems lacking an up-to-date
>> version.  Within Debian (i.e. when uploading to sid) I make sure the
>> build uses the system-wide installed/maintained version, so those
>> bundled sources are not used.  This lets us build cmtk across a wide
>> range of Debian and Ubuntu releases from a single source package while
>> still complying with Debian policy:
>> http://neuro.debian.net/pkgs/cmtk.html .  So, pretty much, you could
>> stay with what you are doing and keep shipping those 3rd-party sources
>> inside (or ship them only in the backported packages); a minimal
>> debian/rules sketch of that pattern follows below, and the alternative
>> is approach #2 after it.
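>>
>> A minimal sketch of that "prefer the system library" pattern -- the
>> CMake flag name and the mere presence test are hypothetical, not
>> cmtk's actual build options -- could sit in debian/rules along these
>> lines:
>>
>>   # Use the system mxml when its -dev package is installed (a real
>>   # rules file would also compare versions); otherwise build against
>>   # the copy bundled under Utilities/mxml.  Recipe lines must be
>>   # tab-indented in the real file.
>>   HAVE_SYSTEM_MXML := $(shell dpkg-query -W -f='$${Version}' libmxml-dev 2>/dev/null)
>>   ifneq ($(HAVE_SYSTEM_MXML),)
>>     MXML_FLAG = -DCMTK_USE_SYSTEM_MXML=ON
>>   else
>>     MXML_FLAG = -DCMTK_USE_SYSTEM_MXML=OFF
>>   endif
>>
>>   %:
>>           dh $@
>>
>>   override_dh_auto_configure:
>>           dh_auto_configure -- $(MXML_FLAG)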
>>
>> 2. E.g. the psychtoolbox-3 package & glew 1.9 -- we have glew 1.9 only
>> in Debian experimental...  So I backported glew 1.9 and provided it
>> from NeuroDebian as versioned binary packages (i.e. libglew1.9 and
>> libglew1.9-dev).  libglew1.9-dev conflicts with the original
>> libglew-dev, but we do not need both present on the same system while
>> building psychtoolbox-3, and libglew1.9 coexists natively with
>> libglew1.7.  When 1.9 reaches the proper Debian/Ubuntu releases, those
>> versions will supersede my backported ones and everyone should stay
>> happy.  So -- I still build psychtoolbox-3 from the same sources for
>> Debian proper and the NeuroDebian backports:
>> http://neuro.debian.net/pkgs/octave-psychtoolbox-3.html
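>>
>> The debian/control stanzas for such a versioned backport look roughly
>> like this (abridged and from memory, not copied verbatim from my
>> packaging):
>>
>>   Package: libglew1.9
>>   Architecture: any
>>   Depends: ${shlibs:Depends}, ${misc:Depends}
>>   Description: OpenGL Extension Wrangler - runtime environment
>>
>>   Package: libglew1.9-dev
>>   Architecture: any
>>   Depends: libglew1.9 (= ${binary:Version}), ${misc:Depends}
>>   Conflicts: libglew-dev
>>   Description: OpenGL Extension Wrangler - development environment
>>
>> The runtime library carries the soversion in its name, so it installs
>> alongside libglew1.7; only the -dev package has to conflict.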
>>
>> ... so the point is -- if you end up needing backport builds -- it
>> should be possible while keeping the packaging acceptable for Debian.
>>
>>> >   3) Really, focusing on gsutil and gcutil is a little narrow-minded.
>>> >  What we at Google *really* want is a good understanding of the
>>> > process, and an ideal mechanism for cooking those artifacts that
>>> > integrates with Google build tools and with Debian's release tools.
>>> > Then we can standardize all our Google properties on this mechanism,
>>> > and make Debian and Google "tier one" partners for releasing software.
>>
>> Sounds like a worthy goal, but from my "user" point of view -- "I do
>> not care" ;)  You tell me that I need to use gcutil to use GCE -- but
>> it is not accessible to me on Debian (click/download/adjust PATH is
>> not "accessible" in my terms).  That is what I am trying to address
>> here.  And hopefully, by the time of the next stable Debian release,
>> things will be clear enough to aim for those holy mighty artifacts ;)
>>
>>> >   4) More deeply, beyond just those tools, there are several bugs
>>> > we've found in the Debian OS which lead to highly degraded network
>>> > performance, and to what looks like massive CPU starvation in cloud
>>> > environments.  We want to fix these problems in the Debian-6 and
>>> > Debian-7 images we provide, but the process for doing that is still
>>> > unclear to us.
>>
>> Let us know if you need any help/guidance -- I would first talk to the
>> corresponding Debian folks (e.g. the kernel team).  But this is a side
>> topic from items 1-2 (shipping the tools you currently ask users to
>> use).
>>
>>> > Really, our goal here is to do The Right Thing.  The problem is that
>>> > from our perspective, it's a discussion that is going to take some
>>> > time. But we're eager to have that discussion.
>>
>> Here you go -- we could discuss... or maybe we could simply start
>> cooking ;)  Do you see any particular (technical) problem with the
>> gcutil source distribution and its fitness for distribution in Debian
>> sid (i.e. always rolling, with uploads possible multiple times a day,
>> etc.)?
>>
>> --
>> Yaroslav O. Halchenko, Ph.D.
>> http://neuro.debian.net http://www.pymvpa.org http://www.fail2ban.org
>> Senior Research Associate,     Psychological and Brain Sciences Dept.
>> Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
>> Phone: +1 (603) 646-9834                       Fax: +1 (603) 646-1419
>> WWW:   http://www.linkedin.com/in/yarik

