
Re: Xorg performance and virtual desktops

Tshepang Lekhonkhobe wrote:
I just found some ugly behaviour of GNOME's virtual desktop (VD) and
Xorg: I left gnome-terminal running a rapidly-changing, verbose
listing of some process on VD1 and then moved on to VD2 to run top.
top tells me that Xorg is guzzling some serious CPU resources which
reminds me that the VD's are actually one wide display pretending to
be four (in my case). When I visit VD1 and hide that listing, say go
to another terminal tab, Xorg behaves again. Where should I file this

Hi there,

I don't think that the problem you are describing would be considered a bug in either Xorg or GNOME.

IIRC, X11 provides a standard framework for applications to draw graphical interfaces on a video device. Since X11 allows the viewable area to exceed the resolution of the monitor, the desktop can appear larger than the screen itself. In the past this was handled by letting the mouse 'scroll' the view up, down, left, or right as needed until the edge of the virtual area was reached. That isn't an ideal way to navigate, so pager applications were developed to select which region of the virtual area is visible; oddly, this was often split into four parts. X11 clients place and draw their graphical data as needed, without knowing whether they are in the viewable area. Applications that are hidden (minimized) stop drawing to the display, while applications that are not hidden continue to do so. As a result, open windows on a section of the display that is not currently selected will continue to send X11 messages to the server, and the server will process them accordingly.
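To see why the server stays busy, here is a toy model in plain Python (not real X11 code; the class and names are hypothetical): clients keep submitting draw requests whether or not their window is in the visible viewport, and the server processes every one of them.

```python
# Toy model of the behaviour described above: the server has no
# notion of "virtual desktop", so it processes every draw request,
# visible or not.

class ToyXServer:
    def __init__(self):
        self.visible_region = "VD1"   # quadrant the pager is showing
        self.requests_processed = 0   # proxy for CPU spent drawing

    def handle_draw(self, client_region):
        # The real server cannot tell an off-screen window apart,
        # so every request costs CPU.
        self.requests_processed += 1

server = ToyXServer()
for _ in range(100):
    server.handle_draw("VD1")    # terminal spewing output on VD1
server.visible_region = "VD2"    # user switches to VD2 to run top
for _ in range(100):
    server.handle_draw("VD1")    # ...but the terminal keeps drawing

print(server.requests_processed)  # all 200 requests were processed
```

This matches what top shows: the drawing work does not stop just because the window scrolled out of view.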

Hmmm... I think one way to solve this would be to extend the display management code in X11 to track which applications are in the visible display area. The pager applications (including GNOME's) would then need to tell the X11 server which applications were visible. The X11 server could then ignore draw requests from applications outside the current display area, although this would still waste some resources, since each application would keep sending data to the X11 server.
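Extending the toy model, a sketch of that proposal might look like this (hypothetical API, no such mechanism exists in real X11): the pager tells the server which region is visible, and the server drops off-screen draw requests instead of processing them.

```python
# Sketch of the first proposal: the server filters draw requests
# by region. The set_visible_region call stands in for a (purely
# hypothetical) pager-to-server notification.

class FilteringXServer:
    def __init__(self):
        self.visible_region = "VD1"
        self.requests_processed = 0

    def set_visible_region(self, region):
        # Called by the pager when the user switches desktops.
        self.visible_region = region

    def handle_draw(self, client_region):
        if client_region != self.visible_region:
            return                  # ignored; note the client still
                                    # paid the cost of sending it
        self.requests_processed += 1

server = FilteringXServer()
server.set_visible_region("VD2")    # user is now on VD2
for _ in range(100):
    server.handle_draw("VD1")       # terminal still drawing on VD1

print(server.requests_processed)    # 0 - off-screen drawing skipped
```

The residual waste mentioned above is visible here too: the clients still generate the requests, the server just discards them.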

Alternatively, the pager application could signal the client applications and request that they hide (minimize) as needed. When a given display area is selected, the pager would hide the currently visible client applications and then unhide the appropriate set. Unfortunately that can be problematic too, especially if there are hide/unhide animations and whatnot.
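A rough sketch of that alternative (again hypothetical, not any real pager protocol): the pager marks clients on the old desktop hidden, and hidden clients stop sending draw requests at the source.

```python
# Sketch of the second proposal: the pager toggles client hidden
# state on desktop switches, so off-screen clients send nothing.

class ToyClient:
    def __init__(self, region):
        self.region = region
        self.hidden = False

    def draw(self, server):
        if not self.hidden:          # hidden clients go quiet
            server.handle_draw()

class CountingServer:
    def __init__(self):
        self.requests_processed = 0

    def handle_draw(self):
        self.requests_processed += 1

def switch_desktop(clients, new_region):
    # The pager's job: hide everything off the new desktop,
    # unhide everything on it.
    for c in clients:
        c.hidden = (c.region != new_region)

clients = [ToyClient("VD1"), ToyClient("VD2")]
server = CountingServer()
switch_desktop(clients, "VD2")       # user moves to VD2
for c in clients:
    c.draw(server)

print(server.requests_processed)     # 1 - only the VD2 client drew
```

This saves the request traffic entirely, at the cost of the hide/unhide side effects (animations and so on) noted above.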

So in short, this is a coordination issue between the X11 server and the X11 client applications, and therefore not specifically a bug in either one. That probably explains why it hasn't been fixed yet.

Hope that helps,

Matthew McGuire
