
Bug#226973: [Linux-fbdev-devel] Re: Bug#226973: xserver-xfree86: [glint] second card at wrong resolution



On Wed, Mar 31, 2004 at 11:10:02AM -0500, Clint Adams wrote:
> > If your graphics controller is limited by memory bandwidth, the maximum pixel
> > clock depends on the number of bits per pixels, since larger pixels mean more
> > memory bandwidth.
> 
> I'm unclear on the differences between 24 bpp and 32 bpp and how this
> relates to internal pixmap format.

Some hardware uses a "packed-pixel format"; that is, four 24-bit pixels
are packed into three 32-bit words (what Intel calls "DWORDs", I think).

Other hardware pads each pixel out to a full 32-bit DWORD, storing a
zero byte alongside the 24 bits of color on every transfer.

-- 
G. Branden Robinson                |     The last time the Republican Party
Debian GNU/Linux                   |     was on the right side of a social
branden@debian.org                 |     issue, Abe Lincoln was president.
http://people.debian.org/~branden/ |     -- Kirk Tofte
