Video Card Inputs and Outputs
One simple but important feature to be considered in selecting a video card is the type of inputs and outputs it has. The types of inputs and outputs will determine what type of monitor and other video peripherals (video cameras, editing consoles, etc.) you can attach to your homebuilt computer.
When I first wrote this site, SVGA was the standard for computer video. SVGA was an improvement on the VGA standard that IBM invented in 1987 and used the same DE-15 connector. All monitors had SVGA inputs, and all computer video cards had SVGA outputs. DVI was still new and only a few computers and monitors supported it. Even those that did still had VGA connectors in addition to DVI.
Nowadays, the VGA-style monitor connector is becoming just a memory. Newer, faster interfaces that are capable of higher resolutions and quality (and some of which can also carry audio and other data) have taken its place. Many video cards and monitors don't have VGA connectors at all anymore. That makes the kind of connectors a video card has an important factor in choosing a video card. Let's take a quick look at some of them.
Before I proceed, let me remind the reader that the complexity of a video signal is a function of the screen resolution, the frame rate or refresh rate (measured in Hertz, or Hz, which basically means cycles per second), the color depth, the complexity of the image, and the rate at which the image is changing. The more complex the signal, the more throughput it needs. That's why some of the descriptions include examples of multiple monitor configurations that can be supported. An interface may be able, for example, to drive a smaller-resolution monitor at a refresh rate of 60 Hz, but a larger-resolution monitor at only 30 Hz.
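A quick back-of-the-envelope calculation makes that relationship concrete. This sketch (mine, not from any standard) counts only raw pixel data; real interfaces also spend bandwidth on blanking intervals and link encoding, so actual requirements run noticeably higher:

```python
def raw_video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw (uncompressed) pixel data rate in gigabits per second.

    Ignores blanking intervals and link encoding overhead, so real
    cable bandwidth requirements run roughly 20-25% higher than this.
    """
    return width * height * refresh_hz * bits_per_pixel / 1e9

# A 1920 x 1080 monitor at 60 Hz needs about 3 Gbit/s of raw pixel data...
print(raw_video_bandwidth_gbps(1920, 1080, 60))   # ~2.99

# ...while the same monitor at 30 Hz needs exactly half that.
print(raw_video_bandwidth_gbps(1920, 1080, 30))   # ~1.49
```

Halving the refresh rate halves the throughput requirement, which is exactly why an interface near its limit may still manage a big monitor at 30 Hz when it can't at 60 Hz.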
Standard Video Card and Monitor Interfaces
The interfaces I'm going to talk about in this section apply to the most commonly-used output connections on video cards, and the input connections on monitors and other displays or devices that accept video inputs (such as projectors and digital video recorders). A bit further down I'll mention a few additional interfaces that are only of interest to a few people who use certain specialized kinds of equipment.
Although no longer the standard, video cards with SVGA outputs and monitors with SVGA inputs are still available at the time of this revision. SVGA is an analog standard that uses a DE-15 connector. Because it's analog, the length and quality of the cable and the frequency of the video card's digital-to-analog converter affect the signal quality at the monitor end.
The SVGA interface was designed to reliably drive a single monitor with a maximum resolution of 1024 x 768. That sounds like nothing until you realize that the previous VGA standard was designed around 640 x 480.
The (highly) theoretical maximum of SVGA, if you want to do all the math, is 2048 x 1536 at 85 Hz refresh; but that's pretty much a pipe dream in the real world. With a short, average-quality cable, VGA can easily support 1280 x 1024 with little or no noticeable quality loss. Anything above that starts getting iffy. Realistically speaking, with a quality video card and monitor, a short, high-quality cable, and some luck, you might be able to drive a 1920 x 1080 monitor at 30 Hz over VGA. But you're probably going to get at least occasional ghosting and quality loss.
VGA carries only video, not audio.
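If you do want to do a bit of the math, here's a rough sketch of why that theoretical maximum is a pipe dream. The active pixel rate (blanking excluded, so the real numbers are higher still) climbs more than fivefold between SVGA's design point and its theoretical ceiling, and on an analog link every one of those pixels is a voltage level the cable and converters have to reproduce faithfully:

```python
def active_pixel_rate_mpix(width, height, refresh_hz):
    """Active pixels per second, in millions -- blanking intervals excluded."""
    return width * height * refresh_hz / 1e6

# SVGA's design point: an easy, comfortable signal.
print(active_pixel_rate_mpix(1024, 768, 60))    # ~47 million pixels/second

# The theoretical maximum: more than five times the pixel rate,
# squeezed through the same analog cable and connector.
print(active_pixel_rate_mpix(2048, 1536, 85))   # ~267 million pixels/second
```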
DVI stands for Digital Visual Interface. It is the most confusing of the video interfaces. There are multiple types of connectors, and they are not interchangeable.
DVI connectors are divided into two main groups: single-link and dual-link. Dual-link can carry twice the data of single-link.
There also are three standards for DVI interfaces: DVI-A (carries analog data only), DVI-D (carries only digital data) and DVI-I (carries both digital and analog data).
There's also another standard known as M1-DA, which is also known as P&D, or simply M1, just to add more confusion to the mix. M1-DA is a single-link interface that can carry digital or analog data, as well as USB or FireWire data.
Some DVI interfaces are theoretically capable of carrying audio data, but I don't recall ever coming across a device that utilized that capability. They may be out there, though.
Single-link DVI is capable of supporting resolutions up to 1920 x 1200 at 60 Hz. Dual-link DVI can support resolutions up to 2560 x 1600 at 60 Hz. The digital signals of digital DVI interfaces are basically identical to the video portion of HDMI and can be converted using a simple adapter. Likewise, the analog signals of DVI interfaces that support analog can be converted to VGA using an adapter. There are also adapters to convert DVI connections to different kinds of DVI connections.
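Those limits fall out of DVI's maximum pixel clock: 165 MHz for single-link, and twice that for dual-link. Here's a rough sanity check (the total frame sizes below, which include the blanking intervals, are the standard reduced-blanking timing figures as I recall them; verify against the CVT timing tables if precision matters):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total (active + blanking) frame size."""
    return h_total * v_total * refresh_hz / 1e6

# Reduced-blanking totals for the two resolutions mentioned above:
wuxga = pixel_clock_mhz(2080, 1235, 60)   # 1920 x 1200 @ 60 Hz -> ~154 MHz
wqxga = pixel_clock_mhz(2720, 1646, 60)   # 2560 x 1600 @ 60 Hz -> ~269 MHz

print(wuxga <= 165)        # True: fits single-link DVI's 165 MHz limit
print(wqxga <= 2 * 165)    # True: needs dual-link, fits its 330 MHz limit
```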
I did mention that DVI is confusing, didn't I?
From a practical, nuts-and-bolts perspective, what all this means to computer builders is that it's not enough to know that a video card has a DVI output. Unless you want to go messing with adapters and dongles, it's important to know which kind of DVI output the video card has.
HDMI stands for High-Definition Multimedia Interface. It carries both audio and video data, including surround sound.
There are five different types of HDMI connectors of different sizes, but chances are you won't have to worry about four of them. Type A is the most common and is used on most desktop computers, laptop computers, monitors, and home video devices. Type B is larger than Type A and has never been used in a single consumer product. Type C is smaller than Type A but has the same pin assignments. It's used on some laptops and other portable devices where space is at a premium. Type D is sometimes called "Micro HDMI" and is used on some phones and tablets. It has different pin assignments than Type A, but adapters are available if you need them. Type E is used mainly in automotive applications.
In addition to the connector types, there are also multiple standards sets. The parts of the specs that matter most to PC builders are:
- The original standard supported resolutions of up to 1920 x 1200 at 60 Hz.
- The 1.3 standard bumped the maximum resolutions up to 1920 x 1080 at 120 Hz or 2560 x 1440 at 60 Hz and increased the color depth.
- The 1.4 standard increased the maximum resolutions to 4096 x 2160 at 24 Hz or 3840 x 2160 at 24, 25, or 30 Hz. It also added additional color profiles.
- The standard as of this revision of the site is 2.1, which enables resolutions as high as 8K at 120 Hz.
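To see how those resolution limits fall out of the link bandwidth, here's a rough calculation. The 10.2 and 18 Gbit/s link rates (for HDMI 1.3/1.4 and the intermediate 2.0 standard, which isn't listed above) and the 8b/10b encoding overhead are standard figures, but treat the sketch as approximate since it ignores blanking:

```python
def raw_video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s (blanking excluded)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective video bandwidth after 8b/10b encoding eats 20% of the link:
HDMI_1_4_GBPS = 10.2 * 8 / 10   # ~8.16 Gbit/s
HDMI_2_0_GBPS = 18.0 * 8 / 10   # ~14.4 Gbit/s

four_k_60 = raw_video_gbps(3840, 2160, 60)   # ~11.9 Gbit/s

print(four_k_60 > HDMI_1_4_GBPS)   # True: why 1.4 tops out at 4K @ 30 Hz
print(four_k_60 < HDMI_2_0_GBPS)   # True: why 2.0 can handle 4K @ 60 Hz
```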
Because of its huge throughput capabilities and near-universality in modern consumer audiovisual equipment, HDMI is an enormously popular interface for computer video cards. Nearly all current DVD and Blu-ray players, DVR's, cable television boxes, monitors, televisions, and streaming media players support HDMI. A high-quality video card with HDMI is a good choice for pretty much any computer, but especially for one that will be doing duty as a media hub or part of a home theater system.
HDMI Cables. HDMI cables are classified by both connector type and speed. Some people say that there's no difference between HDMI cables and that you can use an HDMI cable designed for slower speeds with components built for newer, faster speeds. In my experience, I've found that that's not always the case. I think it's more accurate to say that some older HDMI cables will work just fine with newer HDMI standards, and others won't. Newer cables are tested and certified to work at the higher speeds. Older cables aren't, but may still work.
Probably the best way to look at it is that cables that have been certified by their manufacturers for the newer standards and higher speeds should always work at those standards, while cables made for lower speeds might work at the newer standards and speeds. If they don't, the devices may negotiate a lower-quality connection (for example, a lower resolution or refresh rate), periodically blank out or display blocks or stripes, render audio but not video, or simply not work at all.
In a nutshell, if you have an older HDMI cable and you want to try it, go ahead. It may just work, and you can always replace it if it doesn't. But if you have to buy a new cable anyway, buy an HDMI cable rated for 4K UHD (or whatever the newer standard is if HDMI is upgraded again by the time you read this page). The price difference is trivial.
DisplayPort is a video and audio interface designed specifically for computers, but it's also compatible with HDMI and can feed monitors and other devices with HDMI inputs through simple adapters. It can carry audio, video, or both, as well as USB data.
The output of a video card with DisplayPort can drive HDMI monitors and devices using a simple DisplayPort-to-HDMI adapter or a DisplayPort-to-HDMI cable with the adapter built in. The computer I'm using right now uses an Amazon Basics DisplayPort-to-HDMI cable. It works just fine.
As of this revision of this site, DisplayPort is in version 1.4, which is capable of throughput up to 32.4 Gbit/s when using HBR3 (DisplayPort High Bit Rate Version 3) and DSC 1.2 (Display Stream Compression Version 1.2). What this means in practical terms is that DisplayPort 1.4 is capable of driving monitors and monitor combinations including (but not limited to) the following:
- One 7680 x 4320 (8K) monitor at 60 Hz
- One 4096 x 2160 (4K) monitor at 120 Hz
- Two 3840 x 2160 (4K) monitors at 120 Hz
- Three 2560 x 1600 monitors at 60 Hz, or one 2560 x 1600 monitor at 120 Hz
- Four 1920 x 1080 or 1920 x 1200 monitors at 60 Hz, or one 1920 x 1080 or 1920 x 1200 monitor at 240 Hz
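To see why DSC matters for that 8K figure, it's worth running the numbers. This sketch uses the standard 8b/10b encoding overhead and DSC's roughly 3:1 "visually lossless" compression ratio; it ignores blanking, so treat it as approximate:

```python
HBR3_RAW_GBPS = 32.4                           # 4 lanes x 8.1 Gbit/s each
HBR3_EFFECTIVE_GBPS = HBR3_RAW_GBPS * 8 / 10   # 8b/10b encoding -> 25.92

def raw_video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s (blanking excluded)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

eight_k_60 = raw_video_gbps(7680, 4320, 60)    # ~47.8 Gbit/s uncompressed

print(eight_k_60 > HBR3_EFFECTIVE_GBPS)        # True: won't fit uncompressed
print(eight_k_60 / 3 < HBR3_EFFECTIVE_GBPS)    # True: fits with ~3:1 DSC
```

In other words, 8K at 60 Hz needs nearly twice the effective HBR3 bandwidth uncompressed, which is exactly why the 8K entries in the spec depend on DSC.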
DisplayPort outputs can also be "split" to run multiple monitors from the same output using a DisplayPort hub, and DisplayPort monitors can be daisy-chained if they have DisplayPort outputs and support MST (Multi-Stream Transport). Overall, it's an excellent interface, and it's especially popular with gamers who don't need direct connectivity with HDMI consumer video devices (and even that connectivity only requires a simple adapter).
Specialized Video Card Inputs
As mentioned above, some high-end video cards also are designed to allow input from video sources. These cards are used for video production, editing, capture, and many other purposes that involve transferring images from external devices onto a computer.
Most of these cards can take video input through the same interfaces as those mentioned above. But some also accept inputs that are not commonly used in computing and are more in the realm of commercial video or television production, either past or present. Some of these include:
Composite NTSC, PAL, and SECAM. These are "old" television video standards used in various parts of the world (the United States uses NTSC). These connections combine the red, green, and blue video channels, sync pulses, and so forth into a "composite" video signal that is usually color-coded yellow and usually uses an RCA cable. All three of these composite video standards are fading into history now that HDTV (High-Definition Television) has become the norm.
RGB handles the video signal as separate red, green, and blue components. RGB is used primarily for video processing equipment, television projectors, and professional-quality analog video monitors and recorders. It typically uses RCA connectors.
S-Video offers higher definition than the NTSC, PAL, or SECAM composite standards, but less definition than HDTV. Many high-end video cards offer S-Video inputs and/or outputs.
YPbPr is the HDTV equivalent of an S-Video connector. It allows direct connection of a video card to High-Definition televisions and other HDTV devices.
RF (Radio Frequency) inputs are used on cards that accept input from standard broadcast or cable television signals. These cards have built-in TV tuners that allow the computer to be used as a television or to be connected to VCR's, certain security cameras, and other devices that use a modulated RF output.