Video Display Standards

Enhanced Graphics Adapter (EGA)

IBM's next standard after CGA was the Enhanced Graphics Adapter or EGA. This standard offered improved resolutions and more colors than CGA, although EGA's capabilities are still quite poor compared to modern devices. EGA allowed graphical output of up to 16 colors (chosen from a palette of 64) at a screen resolution of 640x350, or 80x25 text with 16 colors, all at a refresh rate of 60 Hz.
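To put those numbers in perspective, here is a small standalone C sketch that works through the arithmetic implied by the EGA specifications quoted above (the figures themselves are just those specs; nothing here is an actual EGA programming interface):

    #include <stdio.h>

    int main(void)
    {
        /* 16 simultaneous colors means 4 bits per pixel. */
        long width = 640, height = 350, bits_per_pixel = 4;
        long bytes = width * height * bits_per_pixel / 8;

        printf("EGA 640x350, 16-color frame buffer: %ld bytes (~%ld KB)\n",
               bytes, bytes / 1024);      /* 112,000 bytes, about 109 KB */

        /* A 64-color master palette needs 6 bits per entry:
           2 bits each for red, green and blue. */
        printf("Master palette size: %d colors\n", 1 << 6);
        return 0;
    }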

You will occasionally run into older systems that still use EGA; EGA-level graphics are the minimum requirement for Windows 3.x, so some very old systems still running Windows 3.0 may be EGA. There is of course no reason to stick with EGA now that it is obsolete, when VGA cards are so cheap and provide much more performance and software compatibility.

Video Graphics Array (VGA)

The replacement for EGA was IBM's last widely-accepted standard: the Video Graphics Array or VGA. VGA, supersets of VGA, and extensions of VGA today form the basis of virtually every video card used in PCs. Introduced in the IBM PS/2 model line, VGA was eventually cloned and copied by many other manufacturers, and when IBM fell from dominance in the market, VGA continued on and was extended and adapted in many different ways.

Most video cards today support resolutions and color modes far beyond those of true VGA, but they also support the original VGA modes for compatibility; most call themselves "VGA compatible" for this reason. Many people don't realize just how limited true VGA really is; it is pretty much obsolete itself by today's standards, and the vast majority of people running any variant of Windows use a resolution that exceeds the VGA standards. True VGA supports 16 colors at 640x480 resolution, or 256 colors at 320x200 resolution (and not 256 colors at 640x480, even though many people think it does). VGA colors are chosen from a palette of 262,144 colors (not 16.7 million) because VGA uses 6 bits for each of the red, green and blue components of a color (64 x 64 x 64 = 262,144), instead of the 8 bits per component that is the standard today.
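You can see those 6-bit components directly if you program the VGA's color registers. The following DOS-era fragment is only a sketch, and assumes a 16-bit compiler such as Turbo C (which supplies outportb in dos.h); ports 3C8h and 3C9h are the standard VGA DAC write-index and data registers, and set_palette_entry is just an illustrative name:

    #include <dos.h>

    /* Load one entry of the VGA DAC palette. Each component is only
       6 bits wide (0-63), which is exactly where the 64 x 64 x 64 =
       262,144 color palette comes from. */
    void set_palette_entry(unsigned char index, unsigned char r,
                           unsigned char g, unsigned char b)
    {
        outportb(0x3C8, index);      /* DAC write-index register */
        outportb(0x3C9, r & 0x3F);   /* red, 6 significant bits  */
        outportb(0x3C9, g & 0x3F);   /* green, 6 significant bits */
        outportb(0x3C9, b & 0x3F);   /* blue, 6 significant bits  */
    }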

VGA (and VGA compatibility) is significant in one other way as well: it uses output signals that are totally different from those used by older standards. Older standards sent digital signals to the monitor, while VGA and its successors send analog signals. This change was necessary to allow for more color precision. Older monitors that work with EGA and earlier cards use so-called "TTL" (transistor-transistor logic) signaling and will not work with VGA. Some monitors produced in the late 80s actually have a toggle switch to allow the selection of either digital or analog inputs.

Note that standard VGA does not include any hardware acceleration features: all the work of creating the displayed image is done by the system processor. All acceleration features are extensions beyond standard VGA.
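To make this concrete, here is a minimal DOS-era sketch (again assuming a 16-bit compiler such as Turbo C) that draws a rectangle in the 320x200, 256-color mode. Every single pixel is written by the system processor in a plain loop, because standard VGA has nothing to offload the work to:

    #include <conio.h>
    #include <dos.h>

    int main(void)
    {
        /* The VGA frame buffer in mode 13h lives at segment A000h. */
        unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
        union REGS r;
        int x, y;

        r.x.ax = 0x0013;           /* INT 10h: set mode 13h (320x200x256) */
        int86(0x10, &r, &r);

        /* Fill a rectangle byte by byte: the CPU does all the drawing,
           since plain VGA has no drawing engine to hand the job to. */
        for (y = 20; y < 70; y++)
            for (x = 20; x < 120; x++)
                vga[y * 320 + x] = 4;   /* palette entry 4: red */

        getch();                   /* wait for a key, then clean up */
        r.x.ax = 0x0003;           /* back to 80x25 text mode */
        int86(0x10, &r, &r);
        return 0;
    }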

Super VGA (SVGA) and Other Standards Beyond VGA

VGA was the last well-defined and universally accepted video standard. After IBM faded from leading the PC world, many companies came into the market and created new cards with higher resolutions and color depths than standard VGA (but almost always backward compatible with VGA).

Most video cards (and monitors, for that matter) today advertise themselves as Super VGA (SVGA). What does it really mean when a card says it is SVGA? Unfortunately, not much of anything. SVGA refers collectively to any and all of a host of resolutions, color modes and poorly-accepted pseudo-standards created to expand on the capabilities of VGA. Knowing that a card supports "Super VGA" therefore tells you almost nothing. In the current world of multiple video standards you have to find out specifically which resolutions, color depths and refresh rates each card supports. You must also make sure that the monitor you are using supports the modes your video card produces; here too, "Super VGA compatible" on the monitor doesn't help you.

To make matters more confusing, another term is sometimes used: Ultra VGA or UVGA. Like SVGA, this term also means essentially nothing. :^) Some people like to refer to VGA as 640x480 resolution, SVGA as 800x600, and UVGA as 1024x768. This is overly simplistic, however, and not something you can rely upon. The proliferation of video chipsets and standards has created the reliance on software drivers that PC users have come to know so well. While Microsoft Windows, for example, has a generic VGA driver that will work with almost every video card out there, using the higher-resolution capabilities of your video card requires a specific driver written for your card. (The VESA standards have changed this somewhat, but not entirely.)
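Those VESA standards, the VESA BIOS Extensions (VBE), at least give software a uniform way to ask the card what it can do. As a rough sketch of the idea, the DOS-era C fragment below (assuming a 16-bit compiler such as Turbo C, which provides int86x and the far-pointer macros in dos.h) calls VBE function 4F00h to check whether a VESA BIOS is present and reports its version:

    #include <dos.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        static unsigned char buf[512];  /* info block: 256 bytes for
                                           VBE 1.x, 512 for VBE 2.0+ */
        union REGS r;
        struct SREGS s;

        segread(&s);                        /* start from current segs  */
        r.x.ax = 0x4F00;                    /* VBE: get controller info */
        s.es   = FP_SEG((void far *)buf);   /* ES:DI -> our buffer      */
        r.x.di = FP_OFF((void far *)buf);
        int86x(0x10, &r, &r, &s);

        /* AX = 004Fh means the call is supported and succeeded. */
        if (r.x.ax == 0x004F && memcmp(buf, "VESA", 4) == 0)
            printf("VBE %d.%d BIOS detected\n", buf[5], buf[4]);
        else
            printf("No VESA BIOS extension found\n");
        return 0;
    }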

IBM did create several new video standards after VGA that expanded on its capabilities. Compared to VGA, these received very limited acceptance in the market, mainly because they were implemented on cards that used IBM's proprietary Micro Channel Architecture, which itself found little acceptance. You may hear these acronyms bandied about from time to time:
8514/A:
This standard was actually introduced at the same time as standard VGA, and provides higher resolution/color modes as well as limited hardware acceleration capabilities. By modern standards 8514/A is still rather primitive: it supports 1024x768 graphics in 256 colors, but only at 43.5 Hz (interlaced), or 640x480 at 60 Hz (non-interlaced).
XGA:
This acronym stands for Extended Graphics Array. XGA cards were used in later PS/2 models; they can do bus mastering on the MCA bus and use either 512 KB or 1 MB of VRAM. In the 1 MB configuration XGA supports 1024x768 graphics in 256 colors, or 640x480 in high color (16 bits per pixel); the memory arithmetic behind these figures is sketched at the end of this section.
XGA-2:
This standard improves on XGA by extending 1024x768 support to high color, and also supports higher refresh rates than XGA or 8514/A.

The closest thing to a true SVGA standard is the set of standards created by VESA.
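As a quick footnote to the modes above: the memory figures quoted for 8514/A and XGA follow directly from resolution times color depth, as this small standalone C sketch shows. The first two results (768 KB and 600 KB) fit in XGA's 1 MB configuration; 1024x768 in high color needs 1.5 MB, which is more than that configuration can hold:

    #include <stdio.h>

    int main(void)
    {
        /* The modes named in this section, as width x height x depth. */
        struct { const char *mode; long w, h, bpp; } modes[] = {
            { "8514/A or XGA, 1024x768, 256 colors", 1024, 768,  8 },
            { "XGA, 640x480, high color",             640, 480, 16 },
            { "XGA-2, 1024x768, high color",         1024, 768, 16 },
        };
        int i;

        for (i = 0; i < 3; i++) {
            long bytes = modes[i].w * modes[i].h * modes[i].bpp / 8;
            printf("%-40s %8ld bytes (%5ld KB)\n",
                   modes[i].mode, bytes, bytes / 1024);
        }
        return 0;
    }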
