
Bug#226973: [Linux-fbdev-devel] Re: Bug#226973: xserver-xfree86: [glint] second card at wrong resolution



On Wed, 31 Mar 2004, Sven Luther wrote:
> On Tue, Mar 30, 2004 at 06:02:03PM -0500, Clint Adams wrote:
> > > I believe that the clock limit is different for 8bpp and 16 or 24 bpp; I don't
> > > remember exactly. At 8bpp, the max clock range is 230 MHz in the driver, I
> > > think. You looked at it, you have the hardware, you have just become the
> > > resident expert on this issue :).
> >
> > Ahh.  I tried increasing the 32bpp number to 150MHz, and was successfully
> > able to use a mode at 135MHz.  I am confused as to why the Pixmap setting
> > affects the ramdac frequency.  Is this a mistake or am I misunderstanding
> > something?  Would it be okay to set the maximum based on depth instead
> > of bitsPerPixel, or is bitsPerPixel wrong if it's set to 32 at a plain
> > 24-bit depth?
>
> I would have to look in the specs, but I believe the ramdac is only able
> to push fewer pixels when those are 16 or 32 bpp. This is the case for
> all 3Dlabs chips prior to the Permedia3. Look at the part in glint_driver.c
> where these values are set. I have no explanation either, since my
> involvement was with the Permedia3, so maybe ask Alan Hourihane about
> it.

If your graphics controller is limited by memory bandwidth, the maximum pixel
clock depends on the number of bits per pixel, since larger pixels consume more
memory bandwidth.
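This scaling can be sketched numerically. A hypothetical illustration follows; the 230 MB/s bandwidth figure is an assumption chosen to match the 230 MHz 8bpp limit mentioned earlier in the thread, not a value from the specs or the driver:

```python
def max_pixel_clock_hz(bits_per_pixel, bandwidth_bytes_per_s=230e6):
    """Max pixel clock when limited purely by memory bandwidth.

    Assumes every displayed pixel must be fetched from framebuffer
    memory, so the clock limit scales inversely with pixel size.
    """
    bytes_per_pixel = bits_per_pixel / 8
    return bandwidth_bytes_per_s / bytes_per_pixel

# Doubling bpp halves the achievable pixel clock under this model.
for bpp in (8, 16, 32):
    print(f"{bpp:2d} bpp -> {max_pixel_clock_hz(bpp) / 1e6:.1f} MHz")
```

Under this model, 230 MHz at 8bpp would drop to 115 MHz at 16bpp and 57.5 MHz at 32bpp, which is consistent with the depth-dependent clock limits discussed above; real hardware limits also involve the RAMDAC itself and scanout overhead, so the actual driver values will differ.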

Gr{oetje,eeting}s,

						Geert

--
Geert Uytterhoeven -- There's lots of Linux beyond ia32 -- geert@linux-m68k.org

In personal conversations with technical people, I call myself a hacker. But
when I'm talking to journalists I just say "programmer" or something like that.
							    -- Linus Torvalds
