
Re: Two computers in one: two users each with their own accounts, monitor, and keyboard?



On Wed, 6 Jan 2010 19:15:28 -0600, "Boyd Stephen Smith Jr."
<bss@iguanasuicide.net> wrote:
> In <[🔎] 4B44B28B.10409@hardwarefreak.com>, Stan Hoeppner wrote:
>> To make OpenGL
>> really scream on single user 3D chips, they had to eliminate over the
>> network OpenGL completely, as keeping that capability would have totally
>> hosed the rendering pipeline performance for 3D chips.
> 
> That makes no sense.  OpenGL is an abstraction, like the X protocol
> itself.

It makes sense when one takes into account that over 99% of the GL
extensions developed over the past decade target an OpenGL server
optimized for a local GPU.  Each OpenGL server implementation must
target a rendering device, and the Linux OpenGL server has been written
with a local GPU device driver as the target.  This change in
architecture occurred in the early 2000s, IIRC.  I recall seeing
something on LKML or a similar list about the final abandonment of
remote rendering capability for OpenGL in Linux, because the two
requirements are mutually incompatible: local GPU vs. remote network
rendering.  Due to the latencies, if you optimize for one the other
sucks, and if you optimize for the middle, both end up unusable.  Thus,
at that point, the OpenGL server in Linux became GPU centric and no
network rendering was possible.  I don't have a link or hard copy.
Like I said, I read this 8-10 years ago.  IIRC, this change took place
back in the 2.0 kernel era.
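
FWIW, the direct vs indirect split is still visible from any GLX client
today.  Below is a minimal sketch in C (the file name and build line
are my own assumptions about a typical setup) that asks GLX for a
direct context and reports which kind it actually got; pointed at a
remote DISPLAY it should report indirect, i.e. GL serialized over the X
wire:

  /* direct_check.c -- ask GLX for a direct (local GPU) context and
   * report whether we actually got one.  With DISPLAY set to a remote
   * X server, GLX falls back to an indirect context and every GL call
   * is serialized into the GLX wire protocol.
   * Build (assumed): cc direct_check.c -o direct_check -lGL -lX11
   */
  #include <GL/glx.h>
  #include <stdio.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);        /* honors $DISPLAY */
      if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

      int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
      XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
      if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

      /* Fourth argument True = "please give me direct rendering";
       * GLX is free to hand back an indirect context anyway. */
      GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
      if (!ctx) { fprintf(stderr, "context creation failed\n"); return 1; }

      printf("rendering is %s\n",
             glXIsDirect(dpy, ctx) ? "direct (local GPU)"
                                   : "indirect (GLX over the X wire)");

      glXDestroyContext(dpy, ctx);
      XFree(vi);
      XCloseDisplay(dpy);
      return 0;
  }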

To develop a network-centric OpenGL application today, you would need a
hardware (GPU) agnostic OpenGL server component.  Good luck finding
one.  You'll likely need a time machine to take you back prior to the
late '90s.  You would also need an OpenGL client designed for use with
a hardware agnostic OpenGL server at a remote IP address.  Again, good
luck finding such a beast.  Oh, I guess you could write one of each,
and then write your network OpenGL application to take advantage of
them, regardless of their limited rendering capability.  Good luck
troubleshooting any capability and/or performance issues.
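
If you did want to audit what the OpenGL server behind a given display
actually offers before going down that road, GLX does let you
interrogate the server side and the client library separately.  Another
short sketch (same caveats: file name and build line are assumed):

  /* glx_sides.c -- print what the GLX server (the display's side,
   * possibly remote) advertises vs. the local client library.  The
   * overlap is roughly what an indirect-rendering app could rely on.
   * Build (assumed): cc glx_sides.c -o glx_sides -lGL -lX11
   */
  #include <GL/glx.h>
  #include <stdio.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
      int scr = DefaultScreen(dpy);

      printf("server vendor    : %s\n",
             glXQueryServerString(dpy, scr, GLX_VENDOR));
      printf("server version   : %s\n",
             glXQueryServerString(dpy, scr, GLX_VERSION));
      printf("server extensions: %s\n",
             glXQueryServerString(dpy, scr, GLX_EXTENSIONS));
      printf("client extensions: %s\n",
             glXGetClientString(dpy, GLX_EXTENSIONS));

      XCloseDisplay(dpy);
      return 0;
  }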

Due to the client/server model of OpenGL, what you describe is
possible, and in the '80s and '90s it was the norm.  But it hasn't been
for over a decade, thanks to ultra cheap 3D rendering chips.  The need
for the network model evaporated, and all the operating systems adapted
to this paradigm shift.

--
Stan

