Bug#226973: [Linux-fbdev-devel] Re: Bug#226973: xserver-xfree86: [glint] second card at wrong resolution
On Thu, Apr 01, 2004 at 01:41:22AM -0500, Branden Robinson wrote:
> On Wed, Mar 31, 2004 at 11:10:02AM -0500, Clint Adams wrote:
> > > If your graphics controller is limited by memory bandwidth, the maximum pixel
> > > clock depends on the number of bits per pixel, since larger pixels consume
> > > more memory bandwidth.
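(Quick illustration of that constraint, with made-up numbers: a card with
400 MB/s of usable memory bandwidth can scan out at most 400/3 ~= 133
Mpixel/s at packed 24bpp, but only 400/4 = 100 Mpixel/s at 32bpp, before any
drawing traffic is counted.)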
> >
> > I'm unclear on the differences between 24 bpp and 32 bpp and how this
> > relates to internal pixmap format.
>
> Some hardware uses a "packed-pixel format"; that is, 4 24-bit pixels are
> encoded in 3 32-bit words (what Intel calls "DWORDS", I think).
Yeah, DWORDS, because they are still living in the era of 16-bit hardware :)
> Other hardware tosses in a zero byte with every 32-bit "DWORD" transfer.
Well, the Permedia2 should support both formats, depending on the chosen
mode.
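For anyone following along, here is a minimal sketch of the two layouts in C
(pixel values and the little-endian byte order are made-up assumptions, just
for illustration): packed 24bpp puts 4 pixels in 3 32-bit words, while the
padded format carries a zero byte per pixel:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Four pixels, 24 significant bits each (0x00RRGGBB). */
    uint32_t px[4] = { 0x112233, 0x445566, 0x778899, 0xAABBCC };

    /* Packed 24bpp: consecutive 3-byte pixels, so 4 pixels fill
     * exactly 3 32-bit words (12 bytes) with no padding. */
    uint8_t packed[12];
    for (int i = 0; i < 4; i++) {
        packed[3*i + 0] = px[i] & 0xFF;         /* B */
        packed[3*i + 1] = (px[i] >> 8) & 0xFF;  /* G */
        packed[3*i + 2] = (px[i] >> 16) & 0xFF; /* R */
    }

    /* Padded 32bpp: one whole 32-bit word per pixel, top byte
     * carried as zero, so the same 4 pixels take 16 bytes. */
    uint32_t padded[4];
    for (int i = 0; i < 4; i++)
        padded[i] = px[i];  /* bits 24..31 stay zero */

    printf("packed: %zu bytes, padded: %zu bytes for 4 pixels\n",
           sizeof packed, sizeof padded);
    return 0;
}

The same 4-pixel span costs 12 bytes of scanout traffic packed versus 16
padded, which is exactly the 24-vs-32bpp bandwidth ratio mentioned above.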
Friendly,
Sven Luther