Bug#414782: Unable to load "xf86ExecX86int10"
On Tue, Mar 13, 2007 at 21:59:12 +0100, Julien Cristau wrote:
> On Tue, Mar 13, 2007 at 13:50:32 -0700, C.Y.M wrote:
> > Package: xserver-xorg-core
> > Version: 1.1.1-20
> > I am having the following problem loading the int10 submodule with all
> > of my different nvidia drivers. The nvidia developers tell me this is
> > an issue with xserver-xorg-core.
> > This is what Xorg.0.log has in it. I do not see any errors about missing
> > symbols, but there must be something missing.
> Yes, this has already been reported. See bug#410879.
> Can you send your full log and config files to this bug?
> > (II) Loading sub module "int10"
> > (II) LoadModule: "int10"
> > (II) Reloading /usr/lib/xorg/modules/libint10.so
> > (WW) NVIDIA(1): Unable to load "xf86ExecX86int10".
> > (EE) NVIDIA(1): Unable to initialize the X Int10 module; the console may not
> > (EE) NVIDIA(1): be restored correctly on your TV.
So, I talked about this a bit with ajax today, and he told me they use
x86emu on all architectures in FC7, instead of using vm86 on i386. This
is probably not an option for etch at this point, but maybe we can
consider doing the same thing for lenny?
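For context, the int10 backend is chosen at build time in the xorg-server
tree rather than at runtime, so "doing the same thing" would mean changing
how the Debian package is built. A rough sketch of what that might look
like (the --with-int10 option name and its x86emu value are my assumption
about the upstream autoconf build system, not something stated in this bug):

```shell
# Hypothetical: build xorg-server with the x86emu int10 backend on i386,
# instead of the vm86 backend. The --with-int10 switch and its value are
# assumptions about the upstream configure script, not taken from this bug.
./configure --with-int10=x86emu
make
```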
In the meantime, should I just revert the int10 submodule patch, which
would break X on i386 with 64-bit kernels; leave things as they are, which
seems to cause problems at least on some dual-head configs with nvidia; or
actually make the change to x86emu on i386 right now?