What is the difference between DVI-I and DVI-D?
DVI-I and DVI-D are variants of the "Digital Visual Interface (DVI)" connector, a common video connector for monitors that supersedes the older 15-pin analog VGA connector used by CRT monitors. Both variants of DVI provide a higher-bandwidth digital signal, producing crisper images than VGA.
A DVI-D connector on a graphics card sends out a digital signal only. A DVI-I connector carries both digital and analog signals: it can drive digital displays (such as flat-panel LCD monitors) directly, and analog displays (such as older CRT monitors) through a DVI-to-VGA adapter.
The most notable difference between the two is that a DVI-D connector does not carry an analog signal and cannot be used with a DVI-to-VGA adapter; it lacks the extra pins of the DVI-I connector that carry the analog signal. Even if you attach a DVI-to-VGA adapter to a DVI-D connector, you will still only get a digital signal from it.
DVI-I connectors are fully compatible with DVI-D cables and provide the same quality for digital signals. They can also be used with simple, inexpensive DVI-I-to-VGA adapters.
DVI-D connectors are digital-only, so getting an analog signal from them requires an active digital-to-analog converter that produces an analog VGA signal. Such active converters often cost more than a video card with native VGA output.
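The adapter rules above can be sketched as a small decision helper. This is a toy illustration only (the function name and strings are hypothetical, not any real API); it simply encodes the logic that DVI-I works with a passive adapter while DVI-D needs an active converter.

```python
def vga_adapter_needed(connector: str) -> str:
    """Return what is needed to drive an analog VGA display from a
    given DVI connector variant ("DVI-I" or "DVI-D").

    Hypothetical helper, for illustration of the rules in the text.
    """
    connector = connector.upper()
    if connector == "DVI-I":
        # DVI-I carries analog pins, so a simple passive adapter suffices.
        return "passive DVI-to-VGA adapter"
    if connector == "DVI-D":
        # DVI-D is digital-only; an active digital-to-analog
        # converter is required to produce a VGA signal.
        return "active DVI-D-to-VGA converter"
    raise ValueError(f"unknown connector: {connector}")

print(vga_adapter_needed("DVI-I"))  # passive DVI-to-VGA adapter
print(vga_adapter_needed("DVI-D"))  # active DVI-D-to-VGA converter
```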
Note: some older TVs may accept a digital signal over a VGA connector, and ONLY in such cases can a DVI-D-to-VGA cable work. Do this at your own risk: such a non-standard connection can damage a monitor that does not support it.