
Re: gs shouldn't depend on svgalib



I've cc'd this to debian-devel mainly for the final comment about
security, but suggestions about the other issues are of course welcome
l-)

joost witteveen writes:

>Recently, in debian-user, you wrote about gs:
>
>> [......]
>> I agree with this.
>> 
>> -- 
>> Richard Kettlewell (Debian svgalib maintainer)
>
>Well, OK, I deleted a lot of stuff, but the main point is still there:
>you're the Debian svgalib maintainer.
>
>Now, in ghostscript, the svga code does something like this:
>
>gdevl256.c> /* Open the LINUX driver for graphics mode */
>gdevl256.c> int
>gdevl256.c> lvga256_open (gx_device * dev)
>gdevl256.c> {
>gdevl256.c>    int vgamode;
>gdevl256.c>    int width, height;
>gdevl256.c> 
>gdevl256.c>    vga_init();
>gdevl256.c>    vgamode = vga_getdefaultmode ();
>gdevl256.c>    if (vgamode == -1)
>gdevl256.c> 	  vgamode = G320x200x256;
>gdevl256.c>    vga_setmode (vgamode);
>gdevl256.c> 
>
>On my Cirrus Logic GD5430 video card, this results in a resolution
>that my monitor claims is 640x480, but my eyes assure me that it looks
>like 320x200 or some such.  Anyway, it doesn't look good.
>
>Could you tell me whether I'm the only person who has this problem
>(in which case I won't bother you with it), or whether other video
>cards may be affected too?  And if so, what I would need to change to
>increase the resolution where available.  Can I just change
>G320x200x256 to G800x600x256, or are there better ways to test for
>the highest video mode available?  (I'm asking instead of trying, as
>trying takes about 30 minutes of compiling on my Pentium 75.)

Try the program below, which should compile rather more quickly l-)
On my 5430 it switches into 320x200 mode and fills the top half of the
screen with colour, which is exactly what I would expect from the
code.  When you say that your monitor claims it is 640x480, what do
you mean - does it display it in some strangely distorted way?

Finding the highest available video mode is hard, as the graphics card
may have a different limit from the monitor.  I'd recommend defaulting
to a mode which works everywhere (640x480 is probably a good choice);
there is a sketch of one way to probe the card after the test program
below.

----------------------------------------
#include <vga.h>
#include <stdio.h>

/* Open the LINUX driver for graphics mode */
void
lvga256_open (void)
{
    int vgamode;

    vga_init();
    vgamode = vga_getdefaultmode ();
    if (vgamode == -1)
	vgamode = G320x200x256;
    vga_setmode (vgamode);
}

int
main(void)
{
    unsigned char *p;
    int n;

    lvga256_open();
    /* Fill the top half of the screen (100 lines of 320 bytes) with an
       incrementing colour pattern, wait for a keypress, then switch
       back to text mode. */
    n = 0;
    for (p = graph_mem + 320*100 - 1; p >= graph_mem; p--)
	*p = n++;
    getchar();
    vga_setmode(TEXT);
    return 0;
}
----------------------------------------

(Link this with -lvga and run it as root or setuid to root.)
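
If you do want to probe for a bigger mode, something like the sketch
below might do.  Note that vga_hasmode() only tells you what the card
and the driver will accept, not what the monitor can cope with, which
is why I'd still default to something conservative.  (Again, link with
-lvga and run as root or setuid to root; the list of candidate modes
is just an example.)

----------------------------------------
#include <vga.h>
#include <stdio.h>

/* Return the largest 256-colour mode that svgalib reports as
   available, falling back to 320x200. */
static int
best_mode(void)
{
    /* Highest resolution first; take the first one the library accepts. */
    static const int candidates[] = {
	G1024x768x256, G800x600x256, G640x480x256, G320x200x256
    };
    int i;

    for (i = 0; i < (int)(sizeof candidates / sizeof candidates[0]); i++)
	if (vga_hasmode(candidates[i]))
	    return candidates[i];
    return G320x200x256;
}

int
main(void)
{
    vga_init();
    printf("best mode number: %d\n", best_mode());
    return 0;
}
----------------------------------------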

>I'm asking you as I'm about to rewrite my ghostscript wrapper
>programme so that it will automatically call gs with
>"-sDEVICE=lvga256" if TERM=linux and no other -sDEVICE is set on the
>command line.

That's a good idea...
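
Something like the sketch below is what I have in mind - purely
illustrative, of course: the name `gs.real' for the real binary is
made up, and your wrapper presumably does rather more than this.

----------------------------------------
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int
main(int argc, char **argv)
{
    const char *term = getenv("TERM");
    char **args;
    int have_device = 0, i, j = 0;

    /* Has the caller already chosen a device? */
    for (i = 1; i < argc; i++)
	if (strncmp(argv[i], "-sDEVICE=", 9) == 0)
	    have_device = 1;

    /* argv[0], possibly one extra option, the original arguments, NULL. */
    args = malloc((argc + 2) * sizeof(*args));
    if (args == NULL)
	return 1;
    args[j++] = "gs";
    if (!have_device && term != NULL && strcmp(term, "linux") == 0)
	args[j++] = "-sDEVICE=lvga256";
    for (i = 1; i < argc; i++)
	args[j++] = argv[i];
    args[j] = NULL;

    execvp("gs.real", args);	/* made-up name for the real binary */
    perror("gs.real");
    return 1;
}
----------------------------------------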

>I'm sure I'm gonna get a LOT of bug reports if lots of people see
>this "320x200" look-alike video mode.

l-)

>(BTW, I'm also adding "-dSAFER" if I detect gs being setuid,
>for security reasons).

My initial reaction to that is `please don't', as svgalib-compatible
gs has to be setuid to work at all.  If gs is setuid then it should
surrender the root privileges after doing whatever actually needs them
(vga_init does this for you; if you write code that doesn't call it,
you must drop the privileges yourself, fairly early on).
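
For reference, a minimal sketch of the manual case - the privileged
setup itself is elided here, the point is only that the setuid() call
comes as early as possible:

----------------------------------------
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int
main(void)
{
    /* ... whatever genuinely needs root (I/O permissions etc.) ... */

    /* Give up the setuid-root privileges before doing anything else. */
    if (setuid(getuid()) != 0) {
	perror("setuid");
	exit(1);
    }

    /* From here on the program runs with the invoking user's uid. */
    return 0;
}
----------------------------------------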

What we really need is some kernel support for graphics cards...

-- 
                                      Richard Kettlewell <richard@elmail.co.uk>
`It's all just a bad dream.'             http://www.elmail.co.uk/staff/richard/
`Really?  Is it?'
`Hell no, it's real; I was just telling you what you wanted to hear.'

