Re: X dislikes int10 on amd64?
Helge Hafting wrote:
I have tried to run X on a secondary graphics card. It needs
int10 initialization, which works fine when using ia32. But
the same XF86Config-4 fails with pure64:
Here is the end of the log:
(II) RADEON(0): Using 8 bits per RGB (8 bit DAC)
(II) Loading sub module "int10"
(II) LoadModule: "int10"
(II) Reloading /usr/X11R6/lib/modules/linux/libint10.a
(II) RADEON(0): initializing int10
(**) RADEON(0): Option "InitPrimary" "on"
(II) Truncating PCI BIOS Length to 53248
*** If unresolved symbols were reported above, they might not
*** be the reason for the server aborting.
Fatal server error:
Caught signal 11. Server aborting
Are there any special tricks to get int10 working on amd64?
I have installed the ia32-libs package, so that 32-bit software works.
The xserver is the 64bit one though.
The problem turned out to be that X ships with two libint10.a files,
one in /usr/X11R6/lib/modules and one in /usr/X11R6/lib/modules/linux.
The latter is preferred as it offers better performance on x86, but
only the former actually works on amd64. So the solution is
to delete (or rename) the file /usr/X11R6/lib/modules/linux/libint10.a
so it won't get used.
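For example, something like the following (a sketch; it only renames the
file if it is actually present, so the original is kept around in case
you want to restore it):

```shell
# Rename the linux-specific int10 module so the X server falls back to
# the generic libint10.a one directory up, which works on amd64.
MOD=/usr/X11R6/lib/modules/linux/libint10.a
if [ -e "$MOD" ]; then
    mv "$MOD" "$MOD.disabled"
    echo "renamed $MOD"
else
    echo "$MOD not present; nothing to do"
fi
```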
int10 initialization works after that.