On Tue, Mar 09, 2010 at 09:22:24AM +0200, Dotan Cohen wrote:
> > As you can see from the pictures, the pinout variations¹ allow
> > different subsets of the pins to be used. Typically the female end on
> > your graphics card will be DVI-I and support all options, while the
> > cable from the display will have just the subset of the pins it needs.
> >
> > For example, if you connect a digital monitor to a DVI-I port it *can't*
>
> Here I assume you meant DVI-D, not DVI-I?

I mean a DVI-D cable in a DVI-I socket (this will ensure the signal is
digital).  On my monitors and computers, the sockets are always DVI-I and
the cables are always DVI-D.  You can verify this on your monitor by
checking whether it has the four analogue pins, and likewise on the cable.
If it doesn't, then it can only physically receive digital input; problem
solved!

The only other cable choices are DVI-I, which can carry both signals and
should result in automatic selection of the correct output (which should
be digital), and DVI-A, which you don't need to care about.  If you have a
DVI-I cable, then it might select analogue for some bizarre reason.
Swapping it for a DVI-D cable would resolve that if you can't work out how
to do it in software.

Regards,
Roger

-- 
 .''`.  Roger Leigh
: :' :  Debian GNU/Linux             http://people.debian.org/~rleigh/
`. `'   Printing on GNU/Linux?       http://gutenprint.sourceforge.net/
  `-    GPG Public Key: 0x25BFB848   Please GPG sign your mail.
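[A sketch of the "do it in software" step on an X11 system, using xrandr.
The output name DVI-I-1 is an assumption; connector names vary by driver
(DVI-0, DVI-I-1, etc.), and forcing digital vs. analogue on a single DVI-I
connector is driver-specific, so check your driver's output properties.]

```shell
# List connected outputs and their modes; find your DVI connector's name
# (e.g. DVI-I-1 or DVI-0 -- this varies by driver).
xrandr --query

# Re-select the preferred mode on that output (substitute your own name).
xrandr --output DVI-I-1 --auto

# Some drivers expose per-output properties (shown by --prop) that control
# analogue load detection on DVI-I connectors; whether one exists, and its
# name, depends on the driver.
xrandr --query --prop
```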