Re: plan of deep learning team for next stable release?
On Thu, Nov 26, 2020 at 06:20:45PM +0000, Leonard Lausen wrote:
> Resending as the initial mail didn't pass list moderation (?).
> Mo Zhou <email@example.com> writes:
> > The embedded openblas is also something that remains to be removed. (I
> > still have no idea why libjulia-openblas64 does not work)
> Would the reason for bundling Julia be as simple as "avoid symbol name
> clashes"? According to , Julia bundles OpenBLAS with ILP64 interface
> symbols suffixed by "64_" as of 1.2.0+dfsg-1. The Debian openblas
> packages in contrast come in both the ILP64 (libopenblas64-0) and LP64
> (libopenblas-0) version, but both use the same symbol names.
The libopenblas64-* packages are supposed to fulfill the virtual package
<libblas64.so.3> without any symbol mangling, because our
update-alternatives mechanism already lets users switch between
implementations. Notably, src:openblas, src:blis, src:lapack and
src:intel-mkl are all registered as BLAS64 alternatives.
A BLAS64 build with mangled symbols is exactly what src:openblas
already provides, as bin:libjulia-openblas64.
What we have done for Debian is similar to what Fedora does when
compiling openblas, except that we additionally have the
update-alternatives mechanism.
> Thus, if Julia would link libopenblas64-0 instead of bundling a
> symbol-suffixed openblas and if a user would dlopen (import) a library
> depending on libopenblas-0 from the Julia process, the ILP64 symbols
> would incorrectly be used for the dlopened library instead of the LP64.
That's exactly why bin:libjulia-openblas64 is built. Besides, linking
julia against libopenblas64-0 is pointless, as it cannot provide the
suffixed symbols that Julia packages such as Arpack.jl expect.
> If that indeed is the issue, providing a libopenblas64_-0 packages with
> symbol name suffixes should solve the problem. This is the approach
> taken by Fedora .
This looks ugly and ambiguous, and may encourage people to mess
things up.
> Also note that this problem not only affects Julia, but also deep
> learning frameworks with large tensor support (ILP64) for BLAS.
To deal with large tensors on the CPU, intel-mkl is recommended in most
cases. Besides, intel-mkl does not mangle its ILP64 ABI the way Julia
upstream does for openblas, and we need to remain compatible with old
Fortran programs. In this case, the <libblas64.so.3> virtual package
makes more sense.
> I'd be happy to learn if there are other reasons for bundling openblas
> in Julia as well as your thoughts about providing a libopenblas64_-0
Please see the output of: apt show libjulia-openblas64
> Best regards
> : https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=905826
> : https://src.fedoraproject.org/rpms/openblas/blob/HEAD/f/openblas.spec