
Bug#226973: [Linux-fbdev-devel] Re: Bug#226973: xserver-xfree86: [glint] second card at wrong resolution



On Thu, Apr 01, 2004 at 08:51:02AM +0200, Sven Luther wrote:
> On Thu, Apr 01, 2004 at 01:41:22AM -0500, Branden Robinson wrote:
> > On Wed, Mar 31, 2004 at 11:10:02AM -0500, Clint Adams wrote:
> > > > If your graphics controller is limited by memory bandwidth, the maximum pixel
> > > > clock depends on the number of bits per pixel, since larger pixels mean more
> > > > memory bandwidth.
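
As a rough worked example (mode numbers are made up for illustration, not
taken from this bug report): scanout bandwidth is just pixel rate times
bytes per pixel, so a packed 24 bpp framebuffer needs three quarters of
the bandwidth of a padded 32 bpp one for the same mode.

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative mode: 1280x1024 at 75 Hz, ignoring blanking. */
        double pixels_per_sec = 1280.0 * 1024.0 * 75.0;

        /* Bytes of scanout traffic per second at each depth. */
        printf("24 bpp packed: %.0f MB/s\n", pixels_per_sec * 3 / 1e6);
        printf("32 bpp padded: %.0f MB/s\n", pixels_per_sec * 4 / 1e6);
        return 0;
    }

This prints roughly 295 vs. 393 MB/s.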
> > > 
> > > I'm unclear on the differences between 24 bpp and 32 bpp and how this
> > > relates to internal pixmap format.
> > 
> > Some hardware uses a "packed-pixel format"; that is, four 24-bit pixels
> > are encoded in three 32-bit words (what Intel calls "DWORDs", I think).
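
A minimal sketch of that packing, assuming little-endian byte order and a
B, G, R byte layout (both assumptions, not taken from any Permedia
documentation): four 3-byte pixels laid end to end fill exactly three
DWORDs.

    #include <stdint.h>
    #include <string.h>

    /* Pack four 24-bit pixels (low 24 bits of each src word used)
     * into three 32-bit DWORDs with no padding bytes. */
    void pack4_pixels(const uint32_t src[4], uint32_t dst[3])
    {
        uint8_t bytes[12];
        for (int i = 0; i < 4; i++) {
            bytes[3 * i + 0] = src[i] & 0xff;         /* B */
            bytes[3 * i + 1] = (src[i] >> 8) & 0xff;  /* G */
            bytes[3 * i + 2] = (src[i] >> 16) & 0xff; /* R */
        }
        memcpy(dst, bytes, sizeof bytes); /* 12 bytes = 3 DWORDs */
    }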
> 
> Yeah, DWORDs, because they are still living in the era of 16-bit hardware :)
> 
> > Other hardware tosses in a zero byte with every 32-bit "DWORD" transfer.
> 
> Well, the permedia2 should support both formats, depending on the
> chosen mode.

Well, maybe this is what is going on.  Maybe Solaris is using a
packed-pixel format, and XFree86 is using a zero-padded format.
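
For contrast with the packing sketch above, here is the zero-padded
layout, again with an assumed XRGB-style ordering that is purely
illustrative: every pixel gets a whole DWORD, with the top byte written
as zero, so four pixels take four DWORDs instead of three.

    #include <stdint.h>

    /* Zero-padded 32 bpp: one pixel per DWORD, top byte unused. */
    uint32_t pad_pixel(uint8_t r, uint8_t g, uint8_t b)
    {
        return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
    }

If the two drivers program the chip for different layouts, the same
framebuffer contents would be scanned out at different widths, which
would fit the mixed-up-format theory.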

That is, assuming it's the RAMDAC's job to understand the pixel format,
and that some part isn't stuck in front of it to give it a standardized
format.  I wouldn't know.

-- 
G. Branden Robinson                |     That's the saving grace of humor:
Debian GNU/Linux                   |     if you fail, no one is laughing at
branden@debian.org                 |     you.
http://people.debian.org/~branden/ |     -- A. Whitney Brown


