Video Output Interfaces By Joshua Blagden

The purpose of this article is to familiarize readers with the three main video output interfaces and explain a bit about each one. My personal favorite is DVI because it’s commonly found on non-CRT monitors, is clearer than VGA, and supports HDCP, which is necessary for watching HD movies.

There are three main video output interfaces: 

1. VGA

2. DVI

3. HDMI



VGA

VGA (Video Graphics Array) is just about the oldest video interface for computers, aside from coaxial. It’s an analog interface, which means it can’t support HDCP, a technology necessary for watching High-Definition movies from iTunes and Netflix. HDTVs aside, though, it’s the interface most people are familiar with. Even people whose computers and monitors have DVI will often still use VGA, largely because they’re either unaware of the difference or have no idea what DVI is. While it’s true that VGA essentially supports the same resolutions as DVI, it doesn’t do so in a single generation: there have been many extensions of VGA over the past 20 years or so, such as SVGA, WXGA, and QXGA, which makes it a bit of an irritation.



DVI

Unlike VGA, DVI (Digital Visual Interface) is a digital video interface, which allows it to support HDCP, which, as I mentioned in the previous section, is necessary for watching High-Definition movies purchased from the iTunes Store. There is also a noticeable difference in picture quality: DVI is considerably clearer, while VGA is a bit blurry by comparison. For my external monitor, I originally used VGA, but then I tried DVI and noticed a considerable difference. Well, that and the fact that with VGA I wouldn’t have been able to watch High-Definition content from iTunes, which would be a big irritation. Of course, something to be wary of is that, as much as I like Apple, their Mini DisplayPort to DVI adapter doesn’t always have a long lifespan; I had to replace mine after a year or two.



HDMI

HDMI (High-Definition Multimedia Interface) is the video interface most people are familiar with at this point in time, largely thanks to HDTVs. It carries both video and audio, which is often useful. HDMI 2.0 can support 4K at 60 Hz, while HDMI 1.4 can only support 4K at 30 Hz. HDMI is often found on PC (IBM-clone) laptops as well as the Retina MacBook Pros and the 2013 Mac Pro. And of course, HDMI supports HDCP. Devices like Blu-ray players and set-top boxes (e.g. the Apple TV) rely on HDMI, which makes it an integral part of any home theater.
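A quick back-of-the-envelope calculation shows why HDMI 1.4 tops out at 4K/30 Hz while HDMI 2.0 can manage 4K/60 Hz. This is a rough sketch that ignores blanking intervals and link-encoding overhead; the ~10.2 Gbps and ~18 Gbps figures are the nominal maximum link rates of HDMI 1.4 and HDMI 2.0 respectively:

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Rough uncompressed video bandwidth in Gbps.

    Simplified: ignores blanking intervals and encoding overhead.
    """
    return width * height * bits_per_pixel * refresh_hz / 1e9

# 4K (3840 x 2160) at 30 Hz vs. 60 Hz with 24-bit color:
print(round(video_bandwidth_gbps(3840, 2160, 30), 2))  # ~5.97 Gbps -> fits within HDMI 1.4's ~10.2 Gbps
print(round(video_bandwidth_gbps(3840, 2160, 60), 2))  # ~11.94 Gbps -> exceeds HDMI 1.4, needs HDMI 2.0 (~18 Gbps)
```

Doubling the refresh rate doubles the required bandwidth, which is why the jump from 30 Hz to 60 Hz at 4K needed a new version of the standard.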


Mini DisplayPort

I should also mention Mini DisplayPort, an interface you’ll frequently run into with Macs; every Mac made in the last several years is equipped with it. It’s nice to have, but it can also be a nuisance. It’s useful because, with the use of adapters, it can serve as an HDMI, VGA, DVI, or full-sized DisplayPort connector. The trouble with Mini DisplayPort is that it’s rarely used on monitors. Most monitors use VGA, DVI, or HDMI, but not Mini DisplayPort; if you want a monitor which supports Mini DisplayPort, you’ll end up spending upwards of $500. As a result, you’ll need an adapter to use most monitors. Despite its versatility, Mini DisplayPort is a bit of a frustration, since it requires an adapter to connect a display. It makes you ask yourself, “Why couldn’t Apple just use a regular display connector?” Fortunately, the Mac Mini, Mac Pro, and 15-inch Retina MacBook Pro all come with HDMI. However, for most of us, an adapter will have to suffice; otherwise, you’ll just have to buy a 15-inch MacBook Pro. By the way, the upside of the full-sized DisplayPort standard is that it supports very high resolutions, including 4K and 8K (7680 x 4320).



DisplayPort

DisplayPort is the full-sized cousin of Mini DisplayPort. It’s rarely used, but it’s useful for 4K displays because it can drive 4K at 60 Hz. DisplayPort was also used for 4K-class monitors before HDMI 1.4 existed. That’s not to say that HDMI can’t handle 4K at all: as I said towards the top of the page, HDMI 1.4 can handle 4K at 30 Hz and HDMI 2.0 can handle 4K at 60 Hz. However, as far as I know, no graphics card has HDMI 2.0 yet, so for now the best option for running a 4K display at 60 Hz is DisplayPort. This is a particularly big issue for gamers who want to be able to play games at 60 frames per second. Personally, I wouldn’t get into 4K right now, and I don’t mind playing games at 30 frames per second, but there are people who do want to be able to play games in 4K at 60 Hz.

© Joshua Blagden & Justin Barczak 2013-2015