
Bug#803885: xserver-xorg-video-intel: glxgears segfaults for g33 chipset



On Mon, Nov  2, 2015 at 13:54:02 -0800, Alan W. Irwin wrote:

> Package: xserver-xorg-video-intel
> Version: 2:2.21.15-2+b2
> Severity: normal
> 
> Dear Maintainer,
> 
> For my Intel g33 video chipset, glxgears ran without issues on Debian
> Wheezy, but on Debian Jessie it segfaults on startup.  I also find
> that glxgears works fine on that same Debian Jessie box if the
> results are displayed on a remote X server (an X-terminal also running
> Debian Jessie; that box has an nvidia chipset, so in that case I am
> using the xserver-xorg-video-nouveau package to display the results
> of glxgears).
> 
> I also find similar results with the foobillardplus 3D game: a segfault
> on startup when displayed directly using the g33 chipset, but no issues if
> the game is displayed remotely on the X-terminal with the nvidia chipset.
> 
> My working hypothesis at this point to explain these startup segfaults
> with both glxgears and foobillardplus is that some regression (likely
> upstream) in 3D support for the Intel g33 chipset has been introduced
> between Debian Wheezy and Jessie.  If you don't have access to g33
> hardware to test this hypothesis yourself, please let me know if you
> would like me to run any further tests of the hypothesis.
> 
Please provide the full dmesg, X log and gdb backtrace.
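
For reference, a rough sketch of one way to capture all three in one go (the
paths and package names here are only the usual Jessie defaults, so adjust as
needed; you will need gdb and mesa-utils installed, plus debug symbols for
mesa and the intel driver if you want a readable backtrace):

    #!/usr/bin/env python3
    """Collect the requested diagnostics: dmesg output, the X server log,
    and a gdb backtrace of glxgears at the point of the segfault."""
    import shutil
    import subprocess

    # Kernel messages.
    with open("dmesg.txt", "w") as out:
        subprocess.call(["dmesg"], stdout=out)

    # The X server log; /var/log/Xorg.0.log is the usual default location.
    shutil.copy("/var/log/Xorg.0.log", "Xorg.0.log")

    # Run glxgears under gdb in batch mode; "bt full" prints the backtrace
    # once the program has crashed.
    with open("glxgears-backtrace.txt", "w") as out:
        subprocess.call(
            ["gdb", "--batch", "-ex", "run", "-ex", "bt full",
             "--args", "glxgears"],
            stdout=out, stderr=subprocess.STDOUT)

Then attach the three resulting files to the bug.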

Cheers,
Julien

