
Re: X - fading white...



> sorry for the late reply, my uni. course is slowly getting the better of me
> :)
> 
> Tobias Ulbricht [up5a@stud.uni-karlsruhe.de] wrote:
> > 
> > well, to my best knowledge I did this, even copied literally the modelines
> > from the SuSE-X-config file to my debian. On Suse with X 4.0.3, as I said,
> > it returns properly, not on debian 4.1.0.
> > 
> curious......I still think it's a modeline issue though.  Laptop screens are
> very fussy about what they get.  Really they only have one refresh rate
> (60Hz), however I have heard you are meant to make the rate as low as possible.
> I'm unsure how low they mean though :)
 
I get "fading white" effects when I use SVGAlib apps (e.g. zgv) if I do not
have it set to the correct card type.  (In my case, VESA.  And I get away
with GIFs - though they look crappy - JPEGs always shoot it, though.)

So that would imply that the combination you need is the right card setting...
which under X is *considerably* more complicated than in SVGAlib.  For one
thing, not all ATI cards behave even remotely alike.  So the first thing
I'd ask is whether you had to take an entry that you think was probably close
enough... because maybe it's not, and you should try another in the family,
or see if support for your card is discussed as being improved in the CVS
tree over at XFree86.org.

The next thing is, if you just wildly guessed for HorizSync and VertRefresh
you may be pushing your card out of bounds.  For instance, my laptop's Hsync
goes up to about 38.5 kHz - kinda normal, good-quality 800x600.  (Yes, it's an
older system.)  But it goes -down- a lot further than 30 or 31.  In fact, by
its specs it actually Hsyncs all the way down to 15.  Wow.  As one side
effect it's very compatible with projectors; as another, I had to cook up
some interesting modelines to get the best effects from it.  So, see if you can
find out your -real- Hsync and Vrefresh...
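Once you have them, they go in the Monitor section of your XF86Config.  Just as a sketch - the ranges below are placeholders, substitute whatever your panel's spec sheet actually says:

```
Section "Monitor"
    Identifier  "Laptop LCD"
    # Example ranges only - use your panel's real limits.
    HorizSync   30.0 - 38.5    # kHz
    VertRefresh 50.0 - 75.0    # Hz
EndSection
```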

The dotclock is generated by your video card, but the tiny brain inside your
video monitor has to be able to handle it.  Some LCDs want a very specific
dotclock (e.g. 64.15, and it will let you get away with 64 or 65).  Some LCDs
are quite blasé about it and only really care that the dotclock is within
their accepted range - for those...
	WARNING WARNING if it doesn't work your monitor will squeal and
			hate you, and I recommend killing the system and taking
			your fsck bravely.
	DANGER, WILL ROBINSON.
...you can quite frankly goof with the hsync and vrefresh insanely, as long
as the modeline you're forcing it to accept has a happy clock value.

You CANNOT do that with CRTs.  They will do very, very bad things.
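The arithmetic tying all three numbers together is simple: the refresh rate is just the dotclock divided by the total (visible plus blanking) pixels per frame, and Hsync is the dotclock divided by the horizontal total.  A quick check using the standard VESA 1024x768 timings (65 MHz clock, 1344x806 totals):

```shell
# refresh = dotclock / (htotal * vtotal); hsync = dotclock / htotal
# Standard VESA 1024x768@60 timings: 65 MHz clock, totals 1344 x 806.
awk 'BEGIN {
  dc = 65.0e6; ht = 1344; vt = 806
  printf "hsync   %.2f kHz\n", dc / ht / 1000
  printf "refresh %.2f Hz\n",  dc / (ht * vt)
}'
```

So if a modeline's clock and totals put either result outside your monitor's real ranges, that modeline is the suspect.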

> > I don't know what you mean, there are no stretching features on my laptop,
> > at least not "hardwired" ones. All I can do is switch to external monitor
> > and back.
> > 
> when you are in a text console the screen resolution is "640x480" and so the
> text console should only occupy a small section of the screen and not all of
> it.  If you use screen stretching it can be made to take up the whole screen.
> However, some console fonts don't look too good as a result of this.  Again,
> trial and error.  However, if you say this key doesn't exist (which it should,
> really - what is the computer again?  If it's a Toshiba then it's likely to be
> a BIOS configuration option) this will make getting everything to work
> more interesting :)  The reason is sometimes X gets upset about the screen
> stretching and does show white fading stuff.

Actually, unless you use a framebuffer console, the resolution will be phrased
more as "80x25"... if you set LILO to "ask" you can generally see other modes
offered.  For instance, I'm fond of 80x28 or 80x30, because a little more text
fits on screen, but not so tight as to make the text tiny for me.
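For reference, the LILO knob in question is the vga= line in /etc/lilo.conf; something along these lines (exact mode numbers vary by card BIOS, so treat the commented ones as examples):

```
# /etc/lilo.conf (excerpt) - remember to re-run /sbin/lilo afterwards
vga = ask        # prompt for a text mode at boot and list what's offered
# vga = normal   # plain 80x25
# vga = ext      # 80x50
```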

Stretching is a BIOS feature, offered to help when using a projector as the
external display - some projectors chop the edges off the signal, and being
able to offer a black-bordered image is a win.  Newer projectors are better.
No big whoop if you don't have it.  X doesn't know or care about it.

I believe there is a tool which will check what your framebuffer settings
are;  and another which helps translate between framebuffer-style modelines
and X-style modelines.  So, if you use a framebuffer-console kernel, you might 
be able to use those two to get you some friendly modelines.
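The tool I have in mind is most likely fbset (it is packaged for Debian); `fbset -x` prints the current framebuffer timings in XFree86 Modeline form.  The unit conversion it performs is easy to check by hand - the framebuffer's pixclock is picoseconds per pixel, while X's dotclock is MHz:

```shell
# fbset -x   # prints the current fb mode as an XFree86 Modeline
# Underlying conversion: dotclock(MHz) = 1e6 / pixclock(ps).
# e.g. a common 1024x768 fb timing uses pixclock 15385 ps:
awk 'BEGIN { printf "%.1f MHz\n", 1.0e6 / 15385 }'
```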

> > Still hope for help. Where can I search for it?
> > 
> what for, help?  If you want, do mail me for 'advice'; however, it is really a
> case of trial and error with laptops.  I will also go through with you on
> accelerating X to be as fast as possible without it having stability issues.

Sometimes turning off acceleration entirely makes it more stable.

Sometimes using the SVGA server instead of the "card specific" one is much
more stable -- but it's worth warning you, it will want different modelines.

X's new ability to guess modelines is okay and you might want to comment out
all your modelines and see if it can guess something usable.   It usually
gravitates to VESA compliant modes.  Even if they're not perfect, it's much 
easier to use xvidtune (or its cousins) to tweak a working but out-of-aspect
modeline than it is to work one up from scratch.

X is terrible at guessing your *real* monitor specs though, so if you can
find those you may improve things greatly.  It might take multiple searches:
since you know your model, maybe you can find out who makes the monitor - and
a "technical specs" or "repair guide" for it that will give you the real
sync ranges.  If it gives them in "back porch", "video bandwidth", and such
terms, that's okay too, if somewhat harder to read (it might have values in
microseconds, shown as a symbol that looks like a u with a tail).  But you can
use those with the Video Timings HOWTO to get a seriously accurate modeline.
Finally, there are scripts (available as Debian packages, even) to calculate
modelines given some of these parameters.
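As a sanity check on whatever such a script produces: the required dotclock is just total pixels per frame times the refresh rate, and the totals typically run some 25-30% above the visible resolution to cover blanking.  For example, the standard VESA 800x600 mode uses totals of 1056x628:

```shell
# required dotclock = htotal * vtotal * refresh
# VESA 800x600: totals 1056 x 628, aiming for a nominal 60 Hz
awk 'BEGIN { printf "%.1f MHz\n", 1056 * 628 * 60 / 1.0e6 }'
```

That lands right next to the 40.0 MHz clock the VESA mode actually specifies, so a generated modeline whose clock is wildly different from this kind of estimate deserves a second look.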

Good luck :D

* Heather Stern * star@ many places...


