The difference in image quality when connecting via VGA, DVI and HDMI


VGA, DVI and HDMI are video interfaces for transmitting a video signal from a source to an image output device. They differ in the method of transmission and signal processing, as well as in the type of port.

VGA was developed in 1987 and was designed to carry an analog signal to cathode-ray tube (CRT) monitors. Ten years later, LCD monitors took over the market. With VGA, the video card converted its digital signal into an analog one, which was then transmitted over the cable and fed directly to the CRT. With the advent of LCD monitors, the scheme became more complicated: now the signal had to be converted from digital to analog, carried to the LCD monitor, and converted back to digital there. It became obvious that the analog stage could be cut out of the chain, and in 1999 the DVI video interface appeared.
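To illustrate why that extra round trip matters, here is a toy model of the VGA-to-LCD chain. The converter numbers are made up for the sketch (only the 0..0.7 V RGB voltage range is a real VGA figure): even a small gain or offset mismatch between the video card's DAC and the monitor's ADC shifts every recovered pixel value slightly.

```python
def dac(pixel):
    """0..255 pixel -> 0..0.7 V, the analog voltage range VGA actually uses."""
    return pixel / 255 * 0.7

def adc(voltage, gain=1.02, offset=0.004):
    """Hypothetical imperfect LCD-side converter with slight gain/offset error."""
    raw = round((voltage * gain + offset) / 0.7 * 255)
    return min(255, max(0, raw))

for pixel in (0, 64, 128, 192, 255):
    print(pixel, "->", adc(dac(pixel)))
# Prints: 0 -> 1, 64 -> 67, 128 -> 132, 192 -> 197, 255 -> 255
# Every gray level comes back a little off, before noise even enters the picture.
```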

HDMI was designed in the early 2000s. It differed from DVI in having a more compact port and the ability to carry a digital audio signal (since around 2008, DVI implementations have also learned to pass sound through). The advantages of the new interface took their toll, and at the moment it is the most advanced. Its popularity led to the emergence of mini HDMI and micro HDMI, which differ only in the size of their ports.

How much better the DVI and HDMI image is than VGA

The main argument in favor of the digital interfaces is that an analog signal is exposed to external electromagnetic fields during transmission, which distorts it. There is some truth in this, but at home there is no interference serious enough to cause noticeable distortion, even over a fairly long cable. It is also said that DVI and HDMI transmit the signal as accurately as possible thanks to error correction, which VGA does not have. This is true, but the advantage only holds with a quality cable of modest length (up to 5 meters).
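To see why a digital link shrugs off moderate noise, here is a minimal sketch (purely illustrative, not any real interface's signaling scheme): the analog receiver reproduces whatever voltage arrives, noise included, while the digital receiver only has to decide 0 or 1 for each bit.

```python
import random

def transmit_analog(value, noise):
    """Send a 0..255 pixel as a voltage; noise shifts the value directly."""
    return min(255, max(0, round(value + random.gauss(0, noise))))

def transmit_digital(value, bit_noise):
    """Send the same pixel as 8 bits; the receiver re-decides each bit."""
    bits = [(value >> i) & 1 for i in range(8)]
    received = [1 if b + random.gauss(0, bit_noise) > 0.5 else 0 for b in bits]
    return sum(b << i for i, b in enumerate(received))

random.seed(1)
print("analog: ", [transmit_analog(200, 20) for _ in range(5)])
print("digital:", [transmit_digital(200, 0.08) for _ in range(5)])
# The analog values wander around 200; the digital ones stay exactly 200
# until the noise is strong enough to flip whole bits.
```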

Another argument in favor of the digital video interfaces is the absence of unnecessary signal conversions, from digital to analog and back again. It would seem that HDMI and DVI should beat VGA cleanly in this respect. In practice it sometimes turns out the other way, since there is no getting by without transformations anyway: digital signals are encoded, and must be decoded and processed before being shown on the screen. Separate modules of the output device are responsible for this, and their decoding algorithms are not always perfect. Over time, however, they have been improved and are now at a good level even in cheap monitors and TVs.

Cable quality is another stumbling block. An analog signal is less demanding of it, while a digital signal needs a good conductor. This is especially true at cable lengths over five meters: once bits start getting lost, error correction does not always cope, and the output image can end up far worse than over a VGA connection.
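This failure mode is abrupt rather than gradual. Extending the toy model above (again with made-up noise figures), the share of corrupted pixels stays near zero and then shoots up once the per-bit noise approaches the decision threshold, which is why a marginal long cable can look fine one moment and fall apart the next.

```python
import random

def send_pixel(value, bit_noise):
    """8-bit pixel over a noisy digital link; each bit decided by a threshold."""
    bits = [(value >> i) & 1 for i in range(8)]
    received = [1 if b + random.gauss(0, bit_noise) > 0.5 else 0 for b in bits]
    return sum(b << i for i, b in enumerate(received))

random.seed(2)
for bit_noise in (0.05, 0.15, 0.25, 0.35):  # rising noise on a long, cheap cable
    errors = sum(send_pixel(128, bit_noise) != 128 for _ in range(1000))
    print(f"noise {bit_noise}: {errors}/1000 pixels corrupted")
# The count stays near 0/1000, then jumps sharply once bit flips become likely.
```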

Summing up

Despite the fact that I have belittled the merits of DVI/HDMI, in some cases the image transmitted through them really will be better. But you will only notice it with a high-quality cable, a reliable connection between the ports, and a good output device, that is, a high-definition monitor or TV.

If the monitor gives a good image over VGA, do not expect the picture to sparkle with new colors when you connect it via a digital video interface. In my experience, I have seen a significant improvement only once, with AOC monitors. They worked disgustingly through VGA: the image was fuzzy and blurry. In that case, though, it was purely the manufacturer's fault.

Have you noticed a difference in the displayed image when switching from one video interface to another? Leave your comments.
