
Faking quad buffers using VirtualGL for use with a 3D TV



Hi there.

If you or someone you know has a 3D TV, I bet you've wondered what it is about a
stereo-capable (quad-buffered) OpenGL graphics card that's different from a
"normal" or 2D graphics card.

The short answer is - there is no significant difference, and some 3D laptops
use the same graphics chip as 2D ones do - see

   http://www.zdnet.com/blog/computers/ces-2011-sony-debuts-vaio-f-series-3d-laptop-with-new-nvidia-geforce-gt-540m-graphics/4690

Some higher-end machines may use two 2D graphics chips/cards, one for each eye.

When you look at the actual 3D HDMI output format, it turns out it just puts
the two images one above the other - see

   http://hdguru.com/3d-hdtv-and-hdmi-explained/1336/
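
To make the idea concrete, here's a rough sketch of my own (not taken from the
article above) of how a single frame in that top-and-bottom layout could be
drawn on an ordinary double-buffered OpenGL context. The frame size, the
draw_scene() callback and the eye-separation parameter are all assumptions for
illustration, and which eye ends up in which half depends on the TV's settings.

   /* Sketch only: compose one top-and-bottom 3D frame on a normal
    * double-buffered context.  draw_scene() and half_eye_separation
    * are hypothetical and would be supplied by the application. */
   #include <GL/gl.h>

   void draw_top_and_bottom_frame(int width, int height,
                                  void (*draw_scene)(float eye_offset),
                                  float half_eye_separation)
   {
       /* One eye goes into the top half of the frame... */
       glViewport(0, height / 2, width, height / 2);
       draw_scene(-half_eye_separation);

       /* ...the other into the bottom half; the TV splits the frame
        * back into two full-height images, one per eye. */
       glViewport(0, 0, width, height / 2);
       draw_scene(+half_eye_separation);
   }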

YouTube allows users to post 3D videos which you can search for with
"yt3d:enable=true".

When YouTube detects this tag it adds a "3D" button that allows you to choose
several ways to view the video.

Here are some examples:

YouTube in 3D: http://www.youtube.com/watch?v=5ANcspdYh_U&feature=plcp
StereoQuake: http://www.youtube.com/watch?v=tXvirxRK-Ww

I thought this would be something that's possible - even straightforward - to do
with VirtualGL, but as you can see from this discussion

   Faking quad buffers for 3D TV
   http://sourceforge.net/projects/virtualgl/forums/forum/401860/topic/5335139

the author of VirtualGL seems somewhat inflexible in their approach, or else
they're missing the point I repeatedly tried to make.
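
For what it's worth, the idea can be sketched without touching VirtualGL at
all. The following is a toy LD_PRELOAD interposer - my own illustration, not
VirtualGL's code, and it ignores complications like glXGetProcAddress: when a
quad-buffer-aware game selects GL_BACK_LEFT or GL_BACK_RIGHT, redirect the
drawing into the top or bottom half of a single ordinary back buffer, which is
exactly the layout a 3D TV expects. The fixed frame size is an assumption, and
a real interposer would also have to rescale the application's own glViewport
calls.

   /* Toy sketch, not VirtualGL: fake quad-buffered stereo by mapping the
    * left/right back buffers onto the top/bottom halves of GL_BACK. */
   #define _GNU_SOURCE
   #include <GL/gl.h>
   #include <dlfcn.h>
   #include <stddef.h>

   static int fb_width = 1920, fb_height = 1080;   /* assumed output size */

   void glDrawBuffer(GLenum buf)
   {
       static void (*real_glDrawBuffer)(GLenum);
       if (!real_glDrawBuffer)
           real_glDrawBuffer =
               (void (*)(GLenum))dlsym(RTLD_NEXT, "glDrawBuffer");

       if (buf == GL_BACK_LEFT) {
           /* Left-eye pass lands in the top half of the back buffer. */
           real_glDrawBuffer(GL_BACK);
           glViewport(0, fb_height / 2, fb_width, fb_height / 2);
       } else if (buf == GL_BACK_RIGHT) {
           /* Right-eye pass lands in the bottom half. */
           real_glDrawBuffer(GL_BACK);
           glViewport(0, 0, fb_width, fb_height / 2);
       } else {
           real_glDrawBuffer(buf);   /* everything else passes through */
       }
   }

Built with something like "gcc -shared -fPIC fakestereo.c -o fakestereo.so -ldl"
and loaded via LD_PRELOAD, this would in principle let the game believe it has
quad buffers while the TV just sees a top-and-bottom frame.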

Maybe if Debian users show that they understand the concept, Debian can make a
patch that someday might be accepted upstream.

I just want to give Quake3 a go with a 3D TV - where's the harm?


Regards,
Philip Ashmore

