
Bypassing the 2/3/4GB virtual memory space on 32-bit ports

[ debian-arm is Cced: as armel and armhf might be impacted in the future]
[ debian-devel is Cced: as i386 might be impacted in the future]
[ debian-release is Cced: as the release team has to agree with the solution]

Hi all,

32-bit processes are able to address at maximum 4GB of memory (2^32),
and often less (2 or 3GB) due to architectural or kernel limitations.
There are many ways to support more than 4GB of memory on a system
(64-bit CPU and kernel, PAE, etc.), but in the end the limit per process
is unchanged.
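As a quick back-of-the-envelope check (plain shell arithmetic, nothing Debian-specific):

```shell
# 2^32 bytes is the hard ceiling for any 32-bit process:
echo $(( (1 << 32) / (1024 * 1024 * 1024) ))   # prints 4 (GiB)
# With a typical 3G/1G kernel/user split, userspace keeps only 3 GiB,
# and on mips/mipsel the usable virtual address space is 2 GiB.
```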

As Debian builds packages natively, this 4GB limit also applies to
the toolchain, and it is no longer uncommon to get a "virtual memory
exhausted" error when building big packages. Tricks have been used
to work around it, like disabling debugging symbols or tweaking the
GCC garbage collector (the ggc-min-expand parameter); firefox and
many scientific packages do this, for example. Leaf packages are
usually simply not built on the affected architectures.
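For the record, a sketch of what such a workaround typically looks like in a debian/rules-style build, using the dpkg-buildflags maintainer variables (the parameter values here are illustrative, not a recommendation):

```shell
# Make GCC's garbage collector run more eagerly, trading CPU time for
# a smaller peak working set (values are illustrative only):
export DEB_CFLAGS_MAINT_APPEND="--param ggc-min-expand=10 --param ggc-min-heapsize=32768"
# Disabling debugging symbols also cuts memory use during the build:
export DEB_CFLAGS_MAINT_STRIP="-g"
echo "$DEB_CFLAGS_MAINT_APPEND"
```

Both tricks only lower peak memory use; they do not lift the per-process address space ceiling.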

mips and mipsel are more affected by the issue as the virtual address
space is limited to 2GB. Therefore on those architectures, this issue
recently started to also affect core packages like ghc and rustc, and
the usual tricks no longer work. The case of ghc is interesting,
as the problem now also happens on non-official architectures like hppa
and x32. The *i386 architectures are not affected as they use the native
code generator. The armel and armhf architectures are not affected as
they use the LLVM code generator.

We are at a point where we should probably look for a real solution
instead of relying on tricks. Usually upstreams are not really
interested in fixing that issue [1]. The release team has made clear
that packages have to be built natively (NOT cross-built) [2]. Therefore
I currently see only two options:

1) Build a 64-bit compiler targeting the 32-bit corresponding
   architecture and install it in the 32-bit chroot with the other
   64-bit dependencies. This is still a kind of cross-compiler, but the
   rest of the build is unchanged and the testsuite can be run. I guess
   it *might* be something acceptable. release-team, could you please
   comment on that? In the past it would have been enough to "just" do
   that for GCC, but
   nowadays, it will also be needed for rustc, clang and many more. The
   clang case is interesting as it is already a cross-compiler
   supporting all the architectures, but it defaults to the native
   target. I wonder if we should make the "-target" option mandatory,
   just like we no longer call "gcc" but instead "$(triplet)-gcc".
   Alternatively, instead of creating new packages, we might just want
   to use the corresponding multiarch 64-bit package and use a wrapper
   to change the native target, i.e. passing -m32 to gcc or -target to
   clang.

2) Progressively drop 32-bit architectures when they are not able to
   build core packages natively anymore.
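To make the wrapper idea in option 1 a bit more concrete, here is a hedged sketch (the triplet and invocations are illustrative; echo stands in for exec so the sketch runs without the compilers installed):

```shell
# Hypothetical wrappers around 64-bit compilers installed in a 32-bit
# chroot. echo stands in for exec here; a real wrapper would exec.
i686_linux_gnu_clang() {
    # clang is already a cross-compiler; only its default target
    # needs to be overridden:
    echo clang -target i686-linux-gnu "$@"
}
i686_linux_gnu_gcc() {
    # for gcc, the multilib -m32 switch plays the same role:
    echo gcc -m32 "$@"
}
i686_linux_gnu_clang -O2 -c hello.c
i686_linux_gnu_gcc -O2 -c hello.c
```

The build system would then see an ordinary native-looking compiler, while the compiler process itself runs with a 64-bit address space.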

Any comments, ideas, or help here?


[1] https://github.com/rust-lang/rust/issues/56888
[2] https://lists.debian.org/debian-release/2019/08/msg00215.html 

Aurelien Jarno                          GPG: 4096R/1DDD8C9B
aurelien@aurel32.net                 http://www.aurel32.net
