
Fun with OpenGL (Was: LSB bastards)



On Sun, Jul 01, 2001 at 05:05:54PM +0200, Marcelo E. Magallon wrote:
>  No, it's your own experience Joseph.  In your case it's a fact.
>  Radeons are more problematic than other cards.

I have half a dozen assorted cards and three different machines on which to
test them.  I can attest that it is most certainly not idiot-proof for any
of them.  As I noted, I've had mixed success with DRI.  The Radeon flies
like a lead weight and takes the machine down with it.  Other cards fare
better, with varying degrees of success.

NVidia works for lots of people, but if you have one of the cases where it
doesn't, you're screwed.

3dfx ... is 3dfx.

I don't have anything that uses Utah-GLX, so I can't guess at how well
that works or doesn't work.


>  With really recent versions of everything, difficulty is increasing,
>  yes, because everything seems to be desynced and all sides seem to be
>  saying "it's someone else's fault". 

Yeah, well, that kinda goes with the territory.  I don't generally count
that against them except in releases where it matters.


>  > If you have a DRI-supported card, you must compile not one, but two
>  > custom kernel modules for that card on your machine.
> 
>  I'm curious about which is the second one... agpgart?  If you've got a
>  recent enough kernel it just works.  You can just forget about 2.2
>  kernels, the DRM modules don't work there (and won't until someone is
>  interested enough to port them back).

Yes, and both agpgart and the DRM module need to work.  In too many cases
still, either or both will have problems due to unsupported (or buggily
supported) hardware, with non-trivial fiddling needed to get things
working -- if they can be gotten working at all.


>  Besides, what's on the XFree86
>  tree is not what's on Linus' tree (and neither is always in agreement
>  with what's on the DRI tree).  Someone should take a hint and finally
>  pack programs/Xserver/hw/xfree86/os-support/linux/drm/kernel and ship
>  that alongside whatever package provides the DRI drivers.  Pretending
>  that the version in the kernel will work with whatever the packaged
>  driver happens to be is just delusional.  It's much more stable now,
>  but the next release will probably change the requirement again.

Zeph provides kernel stuff with his CVS packages for this reason.  I just
shrug and compile it right out of CVS, no biggie there.


>  > (This if you can get the damned thing to work, which I have had very
>  > mixed success with..)
>  
>  like I said.  Before starting to fiddle with my own binaries, the only
>  thing I had to do to get the DRI working was XFree86 -configure and
>  compile the kernel module (and at that time the kernel drm module and
>  the xfree86 drm module were in agreement, so I didn't even need to fish
>  the drm out of the xfree86 sources).

Yes, but some cards Just Work and others don't.  Which ones Just Work
seems to vary by the phase of the moon or something as is typical of
ongoing development.  And if you're running a shiny new system with a
recent chipset (particularly if it's not an Intel chipset) it's touch and
go all the way.


>  > To add insult to injury, while one set of code works fine for GLX in
>  > general, Mesa with a Voodoo2 requires slightly different code.
> 
>  if this is the sort of voodoo2 supported by the mesa glide packages, I
>  wouldn't mind a patch (TESTED, PLEASE! I won't ever add a patch to that
>  package unless the submitter gives some proof that the patch is
>  *actually* tested), and I'll forward it.  If it's the other kind of
>  voodoo2, sorry, I can't help there.  (And I wouldn't mind something
>  more concrete than all this hand-waving argumentation -- I stopped
>  believing in these "trust me, it's way complicated" fairy tales a long
>  time ago)

For Voodoo2: do not let go of window focus.  If you write your code using
the vidmode extension (as a number of people seem to at first), check for
Voodoo2 and hardcode fullscreen.  If you don't, it gets frustrating fast.

Another issue for GLUT apps is that the window which becomes active under
GLUT is not the one that accepts input.  Not the foggiest idea why.

For Voodoo3, the above problems exist if you do not have DRI (ie, you're
still running around with XFree3 and glide2 for some reason, as a number
of people are...)  You have the added benefit that if you try to set up
for DGA input and do it wrong, glide will crash the machine for you.


I don't know that any of these are Mesa issues, except perhaps the GLUT
issue may be marginally connected.  As for the crash with DGA on the
Voodoo3, Zeph, Joseph Kain, and I all sat around trying to wrap our brains
around that one.  We made it happen in fewer cases, but it can happen
still and we've declared it a known issue.  With the death of 3dfx, I
expect it will be properly resolved when Satan takes up snowboarding.  It
shouldn't actually crash the box anymore if your DGA-using code does
everything the right way.


>  > Most recently written apps are, fortunately, as long as they were not
>  > written for GLUT.
> 
>  patches (or technical descriptions of the problem) are welcome.

They're not mesa problems else I'd have made some noise about them by now.


>  > I won't even go into NVidia, whose standard installation is likely to
>  > hose something if you're not damned careful.
> 
>  funny, I thought there were non-packages of this thing to avoid this
>  kind of problem.  I'm no NVIDIA fan, but I have to admit their drivers
>  tend to work flawlessly if you know what you are doing (and this
>  know-how happens to be expressable via dependencies, conflicts and one
>  script).

Of course, the non-package doesn't seem to get updated as often as it
probably ought to be.  It was several weeks from the time I got my GF2 to
the time a new package for 1.0-1251 was uploaded.  The package was not a
trivial rebuild, but I managed to build a local copy and help a couple
other people who relied on NVidia's flawless installer clean up the
aftermath on their systems.


>  > we're all through here...)  Reason why they don't is probably because
>  > GLX 1.3 is not universally adopted by all parties yet, and is likely
> 
>  glXGetProcAddressARB is not GLX-version specific.  Any version of the
>  GLX extension can implement it.  Mesa does.  The OpenGL SI's GLX does,
>  too.  The NVIDIA drivers, too.  Any half decent implementation should.
>  It's the only portable way of using extensions.

It's only guaranteed to be there as of GLX 1.3, meaning that if you have
anything less, you'll be dlsym'ing it.  If you have it and it's there,
great.  It massively simplifies code.


>  > to not be deemed so for at least a few months.  NVidia claims to have
>  > everything for 1.3 except that function in their headers, which is
>  > amusing since they do have the function.
>  
>  NVIDIA GL headers are a pathetic joke.

There is a reason I don't use them, you know..  ;)


>  > The Efnet #OpenGL FAQ explains how to properly test for the function
>  > at run-time.  You still have to play with dlopen, but the function is
>  > very clean and may only need to do so once.
> 
>  that's very very non portable.  Try it on an IRIX implementation.  It
>  blows up.  You have to dlopen libgl there (pay attention to case).

Actually, I was wrong, it's not there.  Thought it was.  What is there is
this:

	void *libhandle = dlopen(NULL, RTLD_LAZY);

If you link the binary against the lib, it doesn't matter what it's
called.  If you're dlopen'ing the lib on disk, you need to let the user
decide what to open.  Not doing so defeats the purpose of figuring it all
out at runtime.

Here's a snippet probably written by Jeff from QuakeForge:

        if (glProcAddress_present) {
                /* glXGetProcAddress was found earlier; just use it */
                return qfglXGetProcAddress ((const GLubyte *) name);
        } else {
                /* otherwise search the process's own linked-in symbols */
                if ((dlhand = dlopen (NULL, RTLD_LAZY))) {
                        void       *handle;

                        handle = dlsym (dlhand, name);
                        dlclose (dlhand);
                        return handle;
                }
                return NULL;
        }

Ugh, assignments in ifs..  Anyway, the first run through you try to get
glXGetProcAddress the hard way if you can (after checking for the
extension, naturally).  Afterward, you just use it instead of the dl mess,
which almost looks simple and efficient based on the above snippet.


>  > I don't recall reading they call for GLX.  Have I missed something?
> 
>  read the specification.  They probably don't know what GLX is :-) but
>  they call for it (implicitly).  At any rate they refer to the Linux
>  OpenGL ABI.

*sigh*  Now that sounds vague enough to put them on my list of people who
need a good thwacking.  I'll examine the full horror of it later and
coordinate with Zeph on the larts.  =p  I would wager they mean GL 1.2,
GLU, GLX 1.2, and GLUT 3.whatever.  I'd also gamble they don't say so.

-- 
Joseph Carter <knghtbrd@d2dc.net>                   Free software developer

<xinkeT> "Lord grant me the serenity to accept the things I cannot
         change, the courage to change the things I can, and the wisdom
         to hide the bodies of the people I had to kill because they
         pissed me off."
