Bug#165412: libc6: SIGRTMIN defined as -1 with libc 2.3.1, as 32 with 2.2.5
Package: libc6
Version: 2.3.1-1
Severity: important
(I consider this important as it entirely breaks all programs using
this signal. I wouldn't blame you for lowering the severity
though...)
Ok, with libc6 2.3.1, SIGRTMIN evaluates to -1. With libc6 2.2.5
it evaluated to 32.
SIGRTMIN is defined in glibc-2.3.1/sysdeps/unix/sysv/{arch}/bits/signum.h
as:
#define SIGRTMIN (__libc_current_sigrtmin ())
The function __libc_current_sigrtmin is defined in
glibc-2.3.1/sysdeps/generic/allocrtsig.c like this:
int
__libc_current_sigrtmin (void)
{
#ifdef __SIGRTMIN
  if (!initialized)
    init ();
#endif
  return current_rtmin;
}
__SIGRTMIN is also defined in the signum.h file specified above,
like this:
#define __SIGRTMIN 32
The init function is defined like this:
static void
init (void)
{
  if (!kernel_has_rtsig ())
    {
      current_rtmin = -1;
      current_rtmax = -1;
    }
  else
    {
      current_rtmin = __SIGRTMIN;
      current_rtmax = __SIGRTMAX;
    }
  initialized = 1;
}
And kernel_has_rtsig is defined in two places: sysdeps/generic/testrtsig.h
and sysdeps/unix/sysv/linux/testrtsig.h. In the former, it is defined
like this:
static int
kernel_has_rtsig (void)
{
  return 0;
}
And in the latter:
static int
kernel_has_rtsig (void)
{
#if __ASSUME_REALTIME_SIGNALS
  return 1;
#else
  struct utsname name;
  return uname (&name) == 0 && __strverscmp (name.release, "2.1.70") >= 0;
#endif
}
Maybe the problem is that when glibc-2.3.1/sysdeps/generic/allocrtsig.c
includes testrtsig.h, it does so like this:
#include "testrtsig.h"
and the wrong file (the generic one, which always returns 0) is picked
up. I don't know...
Oskar Liljeblad (oskar@osk.mine.nu)
-- System Information
Debian Release: testing/unstable
Kernel Version: Linux oskar 2.4.19 #3 Fri Oct 4 17:30:42 CEST 2002 i686 unknown unknown GNU/Linux
Versions of the packages libc6 depends on:
ii libdb1-compat 2.1.3-6 The Berkeley database routines [glibc 2.0/2.