Upgrading and Repairing PCs (17th Edition)
A video adapter provides the interface between your computer and your monitor and transmits the signals that appear as images on the display. Throughout the history of the PC, there has been a succession of standards for video display characteristics, representing a steady increase in screen resolution and color depth. The following list of standards can serve as an abbreviated history of PC video-display technology:
- MDA (Monochrome Display Adapter)
- HGC (Hercules Graphics Card)
- CGA (Color Graphics Adapter)
- EGA (Enhanced Graphics Adapter)
- VGA (Video Graphics Array)
- SVGA (Super VGA)
- XGA (eXtended Graphics Array)
- UVGA (Ultra VGA)
IBM pioneered most of these standards, but other manufacturers of compatible PCs adopted them as well. Today, IBM no longer sets standards for the PC business (it even sold its PC business to China's Lenovo in 2005), and many of these standards are obsolete. Those that aren't obsolete are seldom referred to by these names anymore. The sole exception is VGA, a term still used to refer to a baseline graphics display capability supported by virtually every video adapter on the market today. When you shop for a video adapter today, you are more likely to see specific references to the screen resolutions and color depths the device supports than a list of standards such as VGA, SVGA, XGA, and UVGA. However, reading about these standards gives you a good idea of how video-display technology developed over the years and prepares you for any close encounters you might have with legacy equipment from the dark ages. Today's VGA and later video adapters can also display most older color graphics software written for CGA, EGA, and most other obsolete graphics standards. This enables you to use older graphics software (such as games and educational programs) on your current system. Although not a concern for most users, some older programs wrote directly to hardware registers that are no longer found on current video cards.

Obsolete Display Adapters
Although many types of display systems were at one time considered to be industry standards, few of these are viable standards for today's hardware and software.

Note If you are interested in reading more about MDA, HGC, CGA, EGA, or MCGA display adapters, see Chapter 8 of Upgrading and Repairing PCs, 10th Anniversary Edition, included on the disc with this book.

The Video Graphics Array Standard
When IBM introduced the PS/2 systems on April 2, 1987, it also introduced the VGA display. On that day, in fact, IBM also introduced the lower-resolution MCGA and higher-resolution 8514 adapters. The MCGA and 8514 adapters did not become popular standards like the VGA did, and both were discontinued. All current display adapters that connect to the 15-pin VGA analog connector or the DVI analog/digital connector are based on the VGA standard. The VGA connector is shown in Figure 13.13; the pinouts are shown in Table 13.8.

Figure 13.13. VGA connector used for VGA, SVGA, and other VGA-based standards.
On the VGA cable connector that plugs in to your video adapter, pin 9 is often pinless. Pin 5 is used only for testing purposes, and pin 15 is rarely used; these are often pinless as well. To identify the type of monitor connected to the system, some manufacturers use the presence or absence of the monitor ID pins in various combinations.

Digital Versus Analog Signals
Unlike earlier video standards, which are digital, VGA is an analog system. Why have displays gone from digital to analog when most other electronic systems have gone digital? Compact disc players (digital) have replaced most turntables (analog), mini DV camcorders are replacing 8mm and VHS-based analog camcorders, and TiVo and UltimateTV digital video recorders are performing time-shifting in place of analog VCRs for many users. With a digital television set, you can watch several channels on a single screen by splitting the screen or placing a picture within another picture. Most personal computer displays introduced before the PS/2 were digital. This type of display generates different colors by firing the RGB electron beams in on-or-off mode, which allows for the display of up to eight colors (2^3). In the IBM displays and adapters, another signal doubles the number of color combinations from 8 to 16 by displaying each color at one of two intensity levels. This digital display is easy to manufacture and offers simplicity, with consistent color combinations from system to system. The real drawback of the older digital displays such as CGA and EGA is the limited number of possible colors. In the PS/2 systems, IBM went to an analog display circuit. Analog displays work like digital displays that use RGB electron beams to construct various colors, but each color in the analog display system can be displayed at varying levels of intensity: 64 levels, in the case of the VGA. This versatility provides 262,144 possible colors (64^3), of which 256 can be displayed simultaneously. For realistic computer graphics, color depth is often more important than high resolution because the human eye perceives a picture that has more colors as being more realistic. IBM moved to analog graphics to enhance the color capabilities of its systems.
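The color math above is simple enough to verify directly. Here is a minimal sketch (the function names are illustrative, not part of any real graphics API) that reproduces the figures quoted in the text:

```python
# Digital RGB with on/off guns yields 2^3 colors, an added intensity
# bit doubles that, and analog VGA with 64 levels per gun yields 64^3.

def digital_colors(guns: int = 3, intensity_bits: int = 0) -> int:
    """Each gun is on or off; each intensity bit doubles the palette."""
    return (2 ** guns) * (2 ** intensity_bits)

def analog_colors(levels_per_gun: int = 64, guns: int = 3) -> int:
    """Each gun can be driven at any of levels_per_gun intensities."""
    return levels_per_gun ** guns

print(digital_colors())                  # 8  (CGA-style RGB)
print(digital_colors(intensity_bits=1))  # 16 (RGB plus intensity)
print(analog_colors())                   # 262144 (VGA palette)
```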
VGA Features

PS/2 systems incorporated the primary display adapter circuitry onto the motherboard, and both IBM and third-party companies introduced separate VGA cards to enable other types of systems to enjoy the advantages of VGA. Although the IBM Micro Channel Architecture (MCA) computers, such as the PS/2 Model 50 and above, introduced VGA, it's impossible today to find a brand-new replacement for VGA that fits into the obsolete MCA-bus systems. However, a few surplus and used third-party cards might be available if you look hard enough. The VGA BIOS is the control software residing in the system ROM for controlling VGA circuits. With the BIOS, software can initiate commands and functions without having to manipulate the VGA directly. Programs become somewhat hardware independent and can call a consistent set of commands and functions built into the system's ROM-control software.
Other implementations of the VGA differ in their hardware but respond to the same BIOS calls and functions. New features are added as a superset of the existing functions, and VGA remains compatible with the graphics and text BIOS functions built into PC systems from the beginning. The VGA can run almost any software originally written for the CGA or EGA, unless it was written to directly access the hardware registers of these cards. A standard VGA card displays up to 256 colors onscreen, from a palette of 262,144 (256K) colors; when used in the 640x480 graphics or 720x400 text mode, 16 colors at a time can be displayed. Because the VGA outputs an analog signal, you must have a monitor that accepts an analog input. VGA displays originally came not only in color, but also in monochrome VGA models, which use color summing. With color summing, 64 gray shades are displayed instead of colors. The summing routine is initiated if the BIOS detects a monochrome display when the system boots. This routine uses an algorithm that takes the desired color and sums the contributions of all three color guns into a single intensity, producing the corresponding shade of gray (see the sketch following the note below). Users who preferred a monochrome display, therefore, could execute color-based applications.

Note For a listing of the VGA display modes supported by the original IBM VGA card (and thus all subsequent VGA-type cards), see "VGA Display Modes" in Chapter 15 of Upgrading and Repairing PCs, 12th Edition, available in electronic form on the disc supplied with this book.
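Here is a minimal sketch of such a summing routine. The 30/59/11 weights are the conventional luminance weights for RGB; the text does not give IBM's exact coefficients, so treat the numbers as an assumption for illustration rather than as the literal VGA BIOS routine:

```python
# Collapse a 6-bit-per-gun VGA color (each channel 0-63) to one of the
# 64 gray shades by weighting the three guns and summing.

def sum_to_gray(red: int, green: int, blue: int) -> int:
    """Return a single intensity (0-63) applied equally to all guns."""
    gray = round(0.30 * red + 0.59 * green + 0.11 * blue)
    return min(gray, 63)

# A saturated VGA red (63, 0, 0) becomes a fairly dark gray:
print(sum_to_gray(63, 0, 0))   # ~19 of 63
```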
Even the least-expensive video adapters on the market today can work with modes well beyond the VGA standard. VGA, at its 16-color, 640x480 graphics resolution, has come to be the baseline for PC graphical display configurations. VGA is accepted as the least common denominator for all Windows systems and must be supported by the video adapters in all systems running Windows. The installation programs of all Windows versions use these VGA settings as their default video configuration. In addition to VGA, virtually all adapters support a range of higher screen resolutions and color depths, depending on the capabilities of the hardware. If a Windows 9x/Me or Windows 2000/XP system must be started in Safe Mode because of a startup problem, the system defaults to VGA in the 640x480, 16-color mode. Windows 2000 and Windows XP also offer a VGA Mode startup that uses this mode (Windows XP uses 800x600 resolution) but doesn't slow down the rest of the computer the way Safe Mode (which replaces 32-bit drivers with BIOS services) does. IBM introduced higher-resolution versions of VGA called XGA and XGA-2 in the early 1990s, but most of the development of VGA standards has come from the third-party video card industry and its trade group, the Video Electronics Standards Association (VESA).

Note If you are interested in reading more about the XGA and XGA-2 display adapters, see "XGA and XGA-2" in Chapter 8 of Upgrading and Repairing PCs, 10th Anniversary Edition, included on the disc with this book.
Super VGA
When IBM's XGA and 8514/A video cards were introduced, competing manufacturers chose not to attempt to clone these incremental improvements on their VGA products. Instead, they began producing lower-cost adapters that offered even higher resolutions. These video cards fall into a category loosely known as Super VGA (SVGA). SVGA provides capabilities that surpass those offered by the VGA adapter. Unlike the display adapters discussed so far, SVGA refers not to an adapter that meets a particular specification, but to a group of adapters that have different capabilities. For example, one card might offer several resolutions (such as 800x600 and 1024x768) that are greater than those achieved with a regular VGA, whereas another card might offer the same or even greater resolutions but also provide more color choices at each resolution. These cards have different capabilities; nonetheless, both are classified as SVGA. The SVGA cards look much like their VGA counterparts. They have the same connectors, but because the technical specifications from different SVGA vendors vary tremendously, it is impossible to provide a definitive technical overview in this book.

VESA SVGA Standards
The Video Electronics Standards Association includes members from various companies associated with PC and computer video products. In October 1989, VESA recognized that programming applications to support the many SVGA cards on the market was virtually impossible and proposed a standard for a uniform programmer's interface for SVGA cards; it is known as the VESA BIOS extension (VBE). VBE support might be provided through a memory-resident driver (used by older cards) or through additional code added to the VGA BIOS chip itself (the more common solution). The benefit of the VESA BIOS extension is that a programmer needs to worry about only one routine or driver to support SVGA. Various cards from various manufacturers are accessible through the common VESA interface. Today, VBE support is a concern primarily for real-mode DOS applications, usually older games, and for non-Microsoft operating systems that need to access higher resolutions and color depths. VBE supports resolutions up to 1280x1024 and color depths up to 24-bit (16.8 million colors), depending on the mode selected and the memory on the video card. VESA compliance is of virtually no consequence to Windows versions 95 and up. These operating systems use custom video drivers for their graphics cards. Note For a listing of VESA BIOS modes by resolution, color depth, and scan frequency, see "VESA SVGA Standards" in the Technical Reference portion of the disc accompanying this book.
Video Adapter Types
A monitor requires a source of input. The signals that run to your monitor come from a video adapter inside or plugged in to your computer. The three ways computer systems connect to either CRT or LCD panels are as follows:
- A video adapter on a separate expansion card plugged in to a PCI, AGP, or PCI Express slot
- Video circuitry integrated into the motherboard chipset
- A discrete video chip mounted directly on the motherboard
Typically, desktop computers that use microATX, FlexATX, microBTX, PicoBTX, or Mini-ITX motherboards feature integrated video using chipsets made by Intel, VIA Technologies, SiS, or other vendors. Some microATX or microBTX motherboards might also have a provision for a PCI Express x16 or AGP video card. For more information, see the next section, "Integrated Video/Motherboard Chipsets."
The term video adapter applies to either integrated or separate video circuitry. The term graphics adapter is essentially interchangeable with video adapter because all video options developed since the original IBM monochrome display adapter (MDA) can display graphics as well as text.

Integrated Video/Motherboard Chipsets
Although built-in video has been a staple of low-cost computing for a number of years, until the late 1990s most motherboard-based video simply moved the standard video components discussed earlier in this chapter to the motherboard. Many low-cost systems, especially those using the semiproprietary LPX motherboard form factor, have incorporated standard VGA-type video circuits on the motherboard. The performance and features of the built-in video differed only slightly from add-on cards using the same or similar chipsets, and in most cases the built-in video could be replaced by adding a video card. Some motherboard-based video also had provisions for memory upgrades.
However, in recent years the move toward increasing integration on the motherboard has led to the development of chipsets that include 3D accelerated video and audio support as part of the chipset design. In effect, the motherboard chipset takes the place of most of the video card components listed earlier and uses a portion of main system memory as video memory. The use of main system memory for video memory is often referred to as unified memory architecture (UMA), and although this memory-sharing method was also used by some built-in video that used its own chipset, it has become much more common with the rise of integrated motherboard chipsets. The pioneer of integrated chipsets containing video (and audio) features was Cyrix (now owned by VIA Technologies). While Cyrix was owned by National Semiconductor, it developed a two-chip product called MediaGX. MediaGX incorporated the functions of a CPU, memory controller, sound, and video and made very low-cost computers possible (although with performance significantly slower than that of Pentium-class systems with similar clock speeds). National Semiconductor retained the MediaGX after it sold Cyrix to VIA Technologies. National Semiconductor went on to develop improved versions of the MediaGX, called the Geode GX1 and Geode GX2, for use in thin clients (terminals that run Windows and have a high-resolution display), interactive set-top boxes, and other embedded devices. National Semiconductor sold its information appliance business, including the Geode family, to Advanced Micro Devices (AMD) in August 2003. AMD currently offers a variety of Geode-based solutions. Intel became the next major player to develop integrated chipsets, with its 810 chipset for the Pentium III and Celeron processors. The 810 (codenamed Whitney before its official release) heralded the beginning of widespread industry support for this design. Intel later followed the release of the 810 series (810 and 810E) with the 815 series for the Pentium III and Celeron, most of which also feature integrated video. Currently, Intel offers integrated video for the Pentium 4 and Celeron 4 processors in the 845, 865, and 91x chipset families; the 9xx chipsets also support the dual-core Pentium D and Pentium Extreme Edition processors. Table 13.9 compares the major video features of Intel's 8xx and 9xx series chipsets, which include integrated video. Note that chipsets listed together share the same video features but differ in other ways such as memory support, I/O features, and so forth.
[1] Systems with 32MB of RAM, PV 5.x, or greater graphics drivers. [2] Systems with 64MB of RAM, PV 5.x, or greater graphics drivers. [3] Systems with 128MB or more of RAM, PV 5.x, or greater graphics drivers. [4] Systems with up to 128MB of RAM. [5] Systems with more than 128MB of RAM. [6] Systems with up to 255MB of RAM. [7] Systems with 256MB or more of RAM. [8] This family of chipsets was code-named Grantsdale before its release. [9] Amount of memory actually used varies by task and onboard system memory. See Intel's website for details.
Besides Intel, other major vendors of integrated chipsets include NVIDIA, VIA, ATI, and Acer Labs. Table 13.10 compares the video features of major integrated chipsets from these vendors that support current AMD and Intel processors.
Although a serious 3D gamer will not be satisfied with the performance of most integrated chipsets (ATI's Radeon Xpress 200 IGPs being notable exceptions), business, home/office, and casual gamers will find that integrated chipset-based video on Pentium 4, Athlon XP, or Athlon 64 platforms is satisfactory in performance and provides cost savings compared with separate video cards. If you decide to buy a motherboard with an integrated chipset, I recommend that you select one that also supports an AGP 8x or PCI Express x16 video expansion slot. This enables you to add a faster video card in the future if you decide you need it.

Video Adapter Components
All video display adapter cards contain certain basic components, such as the following:
- Video BIOS
- Video processor (GPU)
- Video memory
- Digital-to-analog converter (RAMDAC)
- Bus connector
- Video driver
Figure 13.14 indicates the locations of many of these components on a typical PCI Express x16 video card. Note that the acronym GPU refers to the graphics processing unit.

Figure 13.14. A typical example of a high-end video card based on the NVIDIA GeForce 7800 GTX, a GPU optimized for gaming and dual-display support.
Virtually all video adapters on the market today use chipsets that include 3D acceleration features. The following sections examine these components and features in greater detail.

The Video BIOS
Video adapters include a BIOS that is similar in construction but completely separate from the main system BIOS. (Other devices in your system, such as SCSI adapters, might also include their own BIOS.) If you turn on your monitor first and look quickly, you might see an identification banner for your adapter's video BIOS at the very beginning of the system startup process. Similar to the system BIOS, the video adapter's BIOS takes the form of a ROM (read-only memory) chip containing basic instructions that provide an interface between the video adapter hardware and the software running on your system. The software that makes calls to the video BIOS can be a stand-alone application, an operating system, or the main system BIOS. The programming in the BIOS chip enables your system to display information on the monitor during the system POST and boot sequences, before any other software drivers have been loaded from disk.
The video BIOS also can be upgraded, just like a system BIOS, in one of two ways. The BIOS uses a rewritable chip called an EEPROM (electrically erasable programmable read-only memory) that you can upgrade with a utility the adapter manufacturer provides. On very old cards, you might be able to completely replace the chip with a new one, again if supplied by the manufacturer and if the manufacturer did not hard solder the BIOS to the printed circuit board. Most recent video cards use a surface-mounted BIOS chip rather than a socketed chip. A BIOS you can upgrade using software is referred to as a flash BIOS, and most current-model video cards that offer BIOS upgrades use this method. However, most vendors now prefer to update drivers rather than perform BIOS updates to fix card problems. Video BIOS upgrades (sometimes referred to as firmware upgrades) are sometimes necessary in order to use an existing adapter with a new operating system, or when the manufacturer encounters a significant bug in the original programming. Occasionally, a BIOS upgrade is necessary because of a major revision to the video card chipset's video drivers. As a general rule, the video BIOS is a component that falls into the "if it ain't broke, don't fix it" category. Try not to let yourself be tempted to upgrade just because you've discovered that a new BIOS revision is available. Check the documentation for the upgrade, and unless you are experiencing a problem that the upgrade addresses, leave it alone.
The Video Processor
The video processor (also known as the video chipset, video graphics processor, or GPU) is the heart of any video adapter and essentially defines the card's functions and performance levels. Two video adapters built using the same chipset often have many of the same capabilities and deliver comparable performance. Also, the software drivers that operating systems and applications use to address the video adapter hardware are written primarily with the chipset in mind. You often can use a driver intended for an adapter with a particular chipset on any other adapter using the same chipset. Of course, cards built using the same chipset can differ in the amount and type of memory installed, so performance can vary. Since the first VGA cards were developed, several main types of processors have been used in video adapters; these technologies are compared in Table 13.11.
Identifying the Video and System Chipsets
Before you purchase a system or a video card, you should find out which chipset the video card or video circuit uses or, for systems with integrated chipset video, which integrated chipset the system uses. This allows you to have the following:
- Realistic expectations for performance and features, based on third-party reviews of the chipset or video card
- A second source for drivers (the chipset maker as well as the card vendor)
- A way to gauge how well the product will be supported after the sale
Because video card performance and features are critical to enjoyment and productivity, find out as much as you can before you buy the system or video card by using the chipset or video card manufacturer's website and third-party reviews. Poorly written or buggy drivers can cause several types of problems, so be sure to check periodically for video driver updates and install any that become available. With video cards, support after the sale can be important, so you should check the manufacturer's website to see whether it offers updated drivers and whether the product seems to be well supported. The Vendor List on the disc has information on most of the popular video chipset manufacturers, including how to contact them. You should note that NVIDIA (the leading video chipset vendor) makes only chipsets, whereas ATI (the #2 video vendor) makes branded video cards and supplies chipsets to vendors. This means a wide variety of video cards use the same chipset; it also means that you are likely to find variations in card performance, software bundles, warranties, and other features between cards using the same chipset.

Video RAM
Most video adapters rely on their own onboard memory to store video images while processing them. Although the AGP specification supports the use of system memory for 3D textures, this feature is seldom used now that video cards routinely ship with 256MB or more of onboard memory, and the industry has moved from AGP to PCI Express x16 video cards. Many low-cost systems with onboard video use the unified memory architecture (UMA) feature to share the main system memory. In any case, the memory on the video card or borrowed from the system performs the same tasks. The amount of memory on the adapter or used by integrated video determines the maximum screen resolution and color depth the device can support. You often can select how much memory you want on a particular video adapter; for example, 128MB, 256MB, and 512MB are common choices today. Although adding more memory is not guaranteed to speed up your video adapter, it can increase the speed if it enables a wider bus (for example, from 128 bits wide to 256 bits wide) or provides nondisplay memory as a cache for commonly displayed objects. It also enables the card to generate more colors and higher resolutions and, for AGP cards, allows 3D textures to be stored and processed on the card, rather than in slower main memory. Many types of memory have been used with video adapters. These memory types are summarized in Table 13.12.
[1] VRAM and WRAM are dual-ported memory types that can read from one port and write data through the other port. This improves performance by reducing wait times for accessing the video RAM compared to FPM DRAM and EDO DRAM.

Note To learn more about memory types used on older video cards (FPM DRAM, VRAM, WRAM, EDO DRAM, and MDRAM), see Chapter 15 of Upgrading and Repairing PCs, 12th Edition, available in electronic form on the disc packaged with this book.
SGRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, and GDDR3 SDRAM, which are derived from popular motherboard memory technologies, have replaced VRAM, WRAM, and MDRAM as high-speed video RAM solutions. Their high speeds and low production costs have enabled even inexpensive video cards to have 64MB or more of high-speed RAM onboard.

SDRAM
Synchronous DRAM (SDRAM) is the same type of RAM used on many current systems based on processors such as the Pentium III, Pentium 4, Athlon, and Duron. The SDRAMs found on video cards are usually surface-mounted individual chips; on a few early models, a small module containing SDRAMs might be plugged in to a proprietary connector. This memory is designed to work with bus speeds up to 200MHz and provides performance just slightly slower than SGRAM. SDRAM is used primarily in older AGP 2x/4x low-end video cards and chipsets such as NVIDIA's GeForce2 MX and ATI's Radeon VE.

SGRAM
Synchronous Graphics RAM (SGRAM) was designed to be a high-end solution for very fast video adapter designs. SGRAM is similar to SDRAM in its capability to be synchronized to high-speed buses up to 200MHz, but it differs from SDRAM by including circuitry to perform block writes to increase the speed of graphics fill or 3D Z-buffer operations. Although SGRAM is faster than SDRAM, most video card makers have dropped SGRAM in favor of even faster DDR SDRAM in their newest products.

DDR SDRAM
Double Data Rate SDRAM (also called DDR SDRAM) is the most common video RAM technology on recent video cards. It is designed to transfer data at speeds twice that of conventional SDRAM by transferring data on both the rising and falling parts of the processing clock cycle. Today's mid-range and low-end video cards based on chipsets such as NVIDIA's GeForce FX and ATI's Radeon 9xxx and X300-X600 series use DDR SDRAM for video memory.

DDR2 SDRAM
The second generation of DDR SDRAM fetches 4 bits of data per cycle, instead of 2 as with DDR SDRAM. This doubles performance at the same clock speed. The first video chipset to support DDR2 was NVIDIA's GeForce FX, which became the top of NVIDIA's line of GPUs in late 2002.

GDDR3 SDRAM
GDDR3 SDRAM, which began appearing on NVIDIA's high-end graphics cards in early 2004 and is used by ATI's X800 and higher series of AGP and PCI Express cards, is based on DDR2 memory, but with two major differences:
- GDDR3 uses single-ended, unidirectional read and write data strobes rather than DDR2's differential, bidirectional strobes, which simplifies timing at very high clock rates.
- GDDR3 uses an interface technique known as pseudo-open drain, which is based on voltage rather than current, making the memory easier to drive at high speeds.
Video RAM Speed
Video RAM speed is typically measured in MHz, and video card makers often match different memory speeds with different versions of the same basic GPU, as with NVIDIA's GeForce 6800 (300MHz DDR) and GeForce 6800 Ultra (550MHz DDR) memory. Faster memory and faster GPUs produce better gaming performance, but at a higher cost. However, if you are primarily concerned about business or productivity application performance, you can save money by using a video card with a slower GPU and slower memory. Unless you dig deeply into the technical details of a particular 3D graphics card, determining whether a particular card uses SDRAM, DDR SDRAM, DDR2, SGRAM, or GDDR3 can be difficult. Because none of today's 3D accelerators feature upgradeable memory, I recommend that you look at the performance of a given card and choose the card with the performance, features, and price that's right for you.

RAM Calculations
The amount of memory a video adapter needs to display a particular resolution and color depth is based on a mathematical equation. A location must be present in the adapter's memory array for every pixel on the screen, and the resolution determines the number of total pixels. For example, a screen resolution of 1024x768 requires a total of 786,432 pixels. If you were to display that resolution with only two colors, you would need only 1 bit of memory space to represent each pixel. If the bit has a value of 0, the dot is black, and if its value is 1, the dot is white. If you use 24 bits of memory space to control each pixel, you can display more than 16.7 million colors because 16,777,216 combinations are possible with a 24-digit binary number (2^24 = 16,777,216). If you multiply the number of pixels necessary for the screen resolution by the number of bits required to represent each pixel, you have the amount of memory the adapter needs to display that resolution. Here is how the calculation works:

1024 x 768 = 786,432 pixels x 24 bits per pixel = 18,874,368 bits = 2,359,296 bytes = 2.25MB
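The same arithmetic is easy to wrap in a function. This is a minimal sketch (the function name is illustrative) that reproduces the figure above and can be reused for the other entries in Table 13.13:

```python
# Bytes of video memory needed for one 2D frame at a given
# resolution and color depth: pixels x bits-per-pixel / 8.

def frame_buffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    return width * height * bits_per_pixel // 8

needed = frame_buffer_bytes(1024, 768, 24)
print(needed, "bytes =", needed / (1024 * 1024), "MB")
# 2359296 bytes = 2.25 MB
```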
As you can see, displaying 24-bit color (16,777,216 colors) at 1024x768 resolution requires exactly 2.25MB of RAM on the video adapter. Because most adapters support memory amounts of only 256KB, 512KB, 1MB, 2MB, or 4MB, you would need to use a video adapter with at least 4MB of RAM onboard to run your system using that resolution and color depth. To use the higher-resolution modes and greater numbers of colors common today, you would need much more memory on your video adapter than the 256KB found on the original IBM VGA. Table 13.13 shows the memory requirements for some of the most common screen resolutions and color depths used for 2D graphics operations, such as photo editing, presentation graphics, desktop publishing, and web page design.
As you can see from Table 13.13, any current graphics card or integrated graphics solution has more than enough memory to provide true color (24-bit or 32-bit) rendition at any popular resolution. Current video cards and integrated graphics solutions provide much more memory than the amounts listed in Table 13.13 because of the additional memory needs of 3D operation. 3D video cards require more memory for a given resolution and color depth because the video memory must be used for three buffers: the front buffer, back buffer, and Z-buffer. The amount of video memory required for a particular operation varies according to the settings used for the color depth and Z-buffer. Triple buffering allocates more memory for 3D textures than double buffering but can slow down performance of some games. The buffering mode used by a given 3D video card usually can be adjusted through its properties sheet. Table 13.14 lists the memory requirements for 3D cards in selected modes.
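To see where the larger 3D figures come from, the 2D calculation can be extended to cover the three buffers just described. A sketch, assuming double buffering (one front and one back buffer at the color depth) plus a Z-buffer at its own depth; actual cards can allocate differently depending on settings:

```python
# Memory for a 3D frame: front buffer + back buffer(s) at the color
# depth, plus a Z-buffer at its own depth.

def memory_3d_bytes(width: int, height: int,
                    color_bits: int, z_bits: int,
                    back_buffers: int = 1) -> int:
    pixels = width * height
    color_bytes = pixels * color_bits // 8 * (1 + back_buffers)
    z_bytes = pixels * z_bits // 8
    return color_bytes + z_bytes

# 1024x768 at 32-bit color with a 32-bit Z-buffer, double buffered:
print(memory_3d_bytes(1024, 768, 32, 32) / (1024 * 1024), "MB")  # 9.0 MB
```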
Note Although 3D adapters typically operate in a 32-bit mode (refer to Table 13.13), this does not necessarily mean they can produce more than the 16,777,216 colors of a 24-bit true-color display. Many video processors and video memory buses are optimized to move data in 32-bit words, and they actually display 24-bit color while operating in a 32-bit mode, instead of the 4,294,967,296 colors you would expect from a true 32-bit color depth. Although current integrated graphics solutions feature 3D support, the performance they offer is limited by several factors, including
- The use of relatively slow main system memory, shared with the processor, in place of dedicated video memory
- Graphics cores that typically run at lower clock speeds than those of discrete video cards
- 3D feature sets that lag behind those of current discrete GPUs
For these reasons, you are likely to be disappointed (and lose a lot of games!) if you play 3D games using integrated graphics. To enjoy 3D games and give yourself a fighting chance of winning, opt for a mid-range to high-end 3D video card based on a current ATI or NVIDIA chipset with 256MB of RAM or more. If your budget permits, you might also consider using a dual-card solution from ATI or NVIDIA that allows you to use two high-end video cards to double your graphics processing performance.
Note If your system uses integrated graphics and you have less than 256MB of RAM, you might be able to increase your available graphics memory by upgrading system memory (system memory is used by the integrated chipset). Most recent Intel chipsets with integrated graphics automatically detect additional system memory and adjust the size of graphics memory automatically. Refer to Table 13.9 for details.
Video Bus Width
Another issue with respect to the memory on the video adapter is the width of the bus connecting the graphics chipset and memory on the adapter. The chipset is usually a single large chip on the card that contains virtually all the adapter's functions. It is wired directly to the memory on the adapter through a local bus on the card. Most of the high-end adapters use an internal memory bus that is 256 or 512 bits wide. This jargon can be confusing because video adapters that take the form of separate expansion cards also plug in to the main system bus, which has its own speed rating. When you read about a 128-bit or 256-bit video adapter, you must understand that this refers to the local video bus and that the bus connecting the adapter to the system is actually the PCI, AGP, or PCI-Express bus on the system's motherboard.
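The practical effect of the local bus width is on peak memory bandwidth, which is roughly the effective memory clock multiplied by the bus width. A sketch with illustrative numbers (not the specifications of any particular card):

```python
# Peak video memory bandwidth in GB/s: clock x transfers-per-clock x
# bus width in bytes. transfers_per_clock=2 models DDR-style memory.

def memory_bandwidth_gbps(clock_mhz: float, bus_bits: int,
                          transfers_per_clock: int = 2) -> float:
    bytes_per_sec = clock_mhz * 1e6 * transfers_per_clock * (bus_bits // 8)
    return bytes_per_sec / 1e9

print(memory_bandwidth_gbps(275, 128))  # ~8.8 GB/s on a 128-bit DDR bus
print(memory_bandwidth_gbps(275, 256))  # ~17.6 GB/s: same RAM, 256-bit bus
```

This is why doubling the bus width can matter as much as raising the memory clock.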
The Digital-to-Analog Converter
The digital-to-analog converter on a video adapter (commonly called a RAMDAC) does exactly what its name describes. The RAMDAC is responsible for converting the digital images your computer generates into analog signals the monitor can display. The speed of the RAMDAC is measured in MHz; the faster the conversion process, the higher the adapter's vertical refresh rate. The speeds of the RAMDACs used in today's high-performance video adapters range from 300MHz to 500MHz. Most of today's video card chipsets include the RAMDAC function inside the 3D accelerator chip, but some dual-display-capable video cards use a separate RAMDAC chip to allow the second display to work at different refresh rates than the primary display. Systems that use integrated graphics include the RAMDAC function in the North Bridge or GMCH chip portion of the motherboard chipset. The benefit of increasing the RAMDAC speed is higher vertical refresh rates, which allows higher resolutions with flicker-free refresh rates (72Hz-85Hz or above). Typically, cards with RAMDAC speeds of 300MHz or above display flicker-free (75Hz or above) at all resolutions up to 1920x1200. Of course, as discussed earlier in this chapter, you must ensure that any resolution you want to use is supported by both your monitor and video card.
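The relationship between RAMDAC speed, resolution, and refresh rate can be sketched as follows. The pixel clock must cover every pixel of every frame plus the horizontal and vertical blanking intervals; the ~1.35 overhead factor below is a rough rule of thumb, not a value from this chapter:

```python
# Approximate RAMDAC (pixel clock) speed needed for a given mode.

def required_ramdac_mhz(width: int, height: int, refresh_hz: int,
                        blanking_overhead: float = 1.35) -> float:
    return width * height * refresh_hz * blanking_overhead / 1e6

print(required_ramdac_mhz(1920, 1200, 85))
# ~264 MHz: comfortably within a 300MHz RAMDAC, matching the rule of
# thumb above that 300MHz parts are flicker-free up to 1920x1200.
```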
The Bus

You've learned in this chapter that certain video adapters were designed for use with certain system buses. Earlier bus standards, such as the IBM MCA, ISA, EISA, and VL-Bus, have all been used for VGA and other video standards. Because of their slow performance, all are now obsolete; current video cards use the PCI, AGP, or PCI Express bus standard. In current and forthcoming systems, PCI Express x16 is the standard video card slot design, replacing the long-time standard AGP 8x. Some systems can support both types of video cards, enabling you to move to PCI Express x16 at your own pace. PCI video cards are limited in quantity and performance and are sold primarily as upgrades for systems with integrated video that lack AGP or PCI Express slots.
AGP Video Cards
The Accelerated Graphics Port (AGP), an Intel-designed dedicated video bus introduced in 1997, delivers a maximum bandwidth up to 16 times larger than that of a comparable PCI bus. AGP has been the mainstream high-speed video standard for several years but is now being replaced by the more versatile and faster PCI Express x16 standard. The AGP slot is essentially an enhancement to the existing PCI bus; however, it's intended for use only with video adapters and provides them with high-speed access to the main system memory array. This enables the adapter to process certain 3D video elements, such as texture maps, directly from system memory rather than having to copy the data to the adapter memory before the processing can begin. This saves time and eliminates the need to upgrade the video adapter memory to better support 3D functions. Although AGP version 3.0 provides for two AGP slots, this feature has never been implemented in practice. Systems with AGP have only one AGP slot. Note Although the earliest AGP cards had relatively small amounts of onboard RAM, most recent and all current implementations of card-based AGP use large amounts of on-card memory and use a memory aperture (a dedicated memory address space above the area used for physical memory) to transfer data more quickly to and from the video card's own memory. Integrated chipsets featuring built-in AGP do use system memory for all operations, including texture maps. Ironically, the memory aperture used by AGP cards can actually cause out-of-memory errors with Windows 9x and Windows Me on systems with more than 512MB of RAM. See Microsoft Knowledge Base document #253912 for details at http://support.microsoft.com.
Although AGP was introduced during the same time period in which Windows NT 4.0 and Windows 95 were the current versions of Windows, those versions of Windows do not support AGP's Direct Memory Execute (DIME) feature. DIME uses main memory instead of the video adapter's memory for certain tasks to lessen the traffic to and from the adapter. Windows 98/Me and Windows 2000/XP support this feature. However, with the large amounts of memory found on current AGP video cards, this feature is seldom implemented. Four speeds of AGP are available: 1X, 2X, 4X, and 8X (see Table 13.15 for details). Most current AGP video cards support AGP 8x and can fall back to AGP 4x or 2x on systems that don't support AGP 8x.
[1] Varies with card implementation. [2] Uses 0.8V internal signaling. Because of the bandwidth AGP 3.0 requires, systems featuring this version of AGP also support DDR333 or faster memory, which is significantly faster than DDR266 (also known as PC2100 memory). AGP 3.0 was announced in 2000, but support for the standard required the development of motherboard chipsets that were not introduced until mid-2002. Almost all current motherboard chipsets with AGP support feature AGP 8x support; however, differences in GPU design, memory bus design, and core and memory clock speed mean (as always) that AGP 8x cards with faster and wider memory and faster GPU speeds provide faster performance than AGP 8x cards with slower and narrower components. Although some systems with AGP 4x or 8x slots use a universal slot design that can handle 3.3V or 1.5V AGP cards, others do not. If a card designed for 3.3V (2x mode) is plugged in to a motherboard that supports only 1.5V (4x mode) signaling, the motherboard will be damaged.
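For reference, the bandwidth figures behind these AGP speed grades follow directly from the base AGP clock. A sketch of the arithmetic (approximate figures; see Table 13.15 for the official numbers):

```python
# AGP bandwidth: a 66MHz clock on a 32-bit path (~266MBps at 1X),
# with each speed grade multiplying the transfers per clock.

def agp_bandwidth_mbps(multiplier: int) -> float:
    base_clock_hz = 66_666_666   # 66MHz AGP clock
    bytes_per_transfer = 4       # 32-bit data path
    return base_clock_hz * bytes_per_transfer * multiplier / 1e6

for speed in (1, 2, 4, 8):
    print(f"AGP {speed}X: ~{agp_bandwidth_mbps(speed):.0f}MBps")
# AGP 1X: ~267MBps ... AGP 8X: ~2133MBps
```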
Caution Be sure to check AGP compatibility before you insert an older (AGP 1x/2x) card into a recent or current system. Even if you can physically insert the card, a mismatch between the card's required voltage and the AGP slot's voltage output can damage the motherboard. Check the motherboard manual for the card types and voltage levels supported by the AGP slot. Some AGP cards can use either 3.3V or 1.5V voltage levels by adjusting an onboard jumper. These cards typically use an AGP connector that is notched for use with either AGP 2x or AGP 4x slots, as pictured in Chapter 4, "Motherboards and Buses." Be sure to set these cards to use 1.5V before using them in motherboards that support only 1.5V signaling, such as motherboards based on the Intel 845 or 850 chipsets.

PCI Express Video Cards
PCI Express, which has largely replaced AGP in new systems, began to show up in high-performance systems in mid-2004 and has filtered down to almost all systems that use discrete video cards or have integrated video that can be upgraded. Despite the name, PCI Express uses a high-speed bidirectional serial data transfer method, and PCI Express channels (also known as lanes) can be combined to create wider and faster expansion slots; each lane provides a 250MBps data rate in each direction (see the sketch after the table notes below). Unlike PCI, PCI Express slots do not compete with each other for bandwidth. PCI Express graphics cards use 16 lanes (x16) to enable speeds of 4GBps in each direction; when PCI Express is used for other types of cards, fewer lanes are used. PCI, AGP, and x16 PCI Express have some important differences, as Table 13.16 shows.
[1] At 33MHz bus speed and 32 bits. [2] Most current systems support AGP 4X/8X only. [3] In each direction; multiply by 2 for bidirectional throughput. [4] Typical PCI Express implementations include one x16 slot for video and two or more x1 slots for other add-on cards, as well as legacy PCI slots. Systems that support NVIDIA's SLI or ATI's CrossFire dual PCI Express video card technologies have two PCI Express video slots running at x8 or x16 speed.
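The lane arithmetic described before Table 13.16 is straightforward to check. A minimal sketch using the first-generation figure of 250MBps per lane per direction:

```python
# PCI Express bandwidth: slots simply aggregate lanes, each carrying
# 250MBps in each direction in the first-generation specification.

def pcie_bandwidth_gbps(lanes: int, per_lane_mbps: int = 250,
                        bidirectional: bool = False) -> float:
    total = lanes * per_lane_mbps * (2 if bidirectional else 1)
    return total / 1000

print(pcie_bandwidth_gbps(1))                       # 0.25 GBps (x1 slot)
print(pcie_bandwidth_gbps(16))                      # 4.0 GBps (x16 video slot)
print(pcie_bandwidth_gbps(16, bidirectional=True))  # 8.0 GBps both directions
```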
The Video Driver

The software driver is an essential, and often problematic, element of a video display subsystem. The driver enables your software to communicate with the video adapter. You can have a video adapter with the fastest processor and the most efficient memory on the market but still have poor video performance because of a badly written driver. Video drivers generally are designed to support the processor on the video adapter. All video adapters come equipped with drivers the card manufacturer supplies, but often you can use a driver the chipset maker created as well. Sometimes you might find that one of the two provides better performance than the other or resolves a particular problem you are experiencing. Most manufacturers of video adapters and chipsets maintain websites from which you can obtain the latest drivers; drivers for chipset-integrated video are supplied by the system board or system vendor. A driver from the chipset manufacturer can be a useful alternative, but you should always try the adapter manufacturer's driver first. Before purchasing a video adapter, you should check out the manufacturer's site and see whether you can determine how up to date the available drivers are. At one time, frequent driver revisions were thought to indicate problems with the hardware, but the greater complexity of today's systems means that driver revisions are a necessity. Even if you are installing a brand-new model of a video adapter, be sure to check for updated drivers on the manufacturer's website for best results.

Note Although most devices work best with the newest drivers, video cards can be a notable exception. Both NVIDIA and ATI now use unified driver designs, creating a single driver installation that can be used across a wide range of graphics chips. However, older driver versions sometimes work better with older chipsets than the newest drivers do. If you find that system performance or stability, especially in 3D gaming, drops when you upgrade to the newest driver for your 3D graphics card, revert to the older driver.
The video driver also provides the interface you can use to configure the display your adapter produces. On a Windows 9x/Me/2000/XP system, the Display Control Panel identifies the monitor and video adapter installed on your system and enables you to select the color depth and screen resolution you prefer. The driver controls the options that are available for these settings, so you can't choose parameters the hardware doesn't support. For example, the controls would not allow you to select a 1024x768 resolution with 24-bit color if the adapter had only 1MB of memory. When you click the Advanced button on the Settings page, you see the Properties dialog box for your particular video display adapter. The contents of this dialog box can vary, depending on the driver and the capabilities of the hardware. Typically, on the General page of this dialog box, you can select the size of the fonts (large or small) to use with the resolution you've chosen. Windows 98/Me/2000 (but not Windows XP) also add a control to activate a convenient feature. The Show Settings Icon on Task Bar check box activates a tray icon that enables you to quickly and easily change resolutions and color depths without having to open the Control Panel. This feature is often called QuickRes. The Adapter page displays detailed information about your adapter and the drivers installed on the system, and it enables you to set the refresh rate for your display; with Windows XP, you can use the List All Modes button to view and choose the resolution, color depth, and refresh rate with a single click. The Monitor page lets you display and change the monitor's properties and switch monitor drivers if necessary. In Windows XP, you can also select the refresh rate on this screen. If your adapter includes a graphics accelerator, the Performance page contains a Hardware Acceleration slider you can use to control the degree of graphic display assistance provided by your adapter hardware. In Windows XP, the Performance page is referred to as the Troubleshoot page. Setting the Hardware Acceleration slider to the Full position activates all the adapter's hardware acceleration features. The necessary adjustments for various problems can be seen in Table 13.17 for Windows XP and in Table 13.18 for other versions of Windows.
[*] Whenever you select any setting other than full acceleration, also disable write combining (a method for speeding up screen display) to improve stability. Reenable write combining after you install updated drivers and retry.
If you're not certain which setting is best for your situation, use this procedure:
- Move the slider one notch to the left to address mouse display problems by disabling the hardware's cursor support in the display driver. This is the equivalent of adding the SWCursor=1 directive to the [Display] section of the System.ini file in Windows 9x/Me.
- If you are having problems with 2D graphics in Windows XP only, but 3D applications work correctly, move the slider to the second notch from the right to disable cursor drawing and acceleration.
- Moving the slider another notch (to the third notch from the right in Windows XP or the second notch from the right in earlier versions) prevents the adapter from performing certain bit-block transfers; it disables 3D functions of DirectX in Windows XP. With some drivers, this setting also disables memory-mapped I/O. This is the equivalent of adding the Mmio=0 directive to the [Display] section of System.ini and the SafeMode=1 directive to the [Windows] section of Win.ini (along with the SWCursor directive mentioned previously) in Windows 9x/Me.
- Moving the slider to the None setting (the far left) adds the SafeMode=2 directive to the [Windows] section of the Win.ini file in Windows 9x/Me. This disables all hardware acceleration support on all versions of Windows and forces the operating system to use only the device-independent bitmap (DIB) engine to display images, rather than bit-block transfers. Use this setting when you experience frequent screen lockups or receive invalid page fault error messages.

Note If you need to disable any of the video hardware features listed earlier, this often indicates a buggy video or mouse driver. If you download and install updated video and mouse drivers, you should be able to revert to full acceleration. You should also download an updated version of DirectX for your version of Windows.
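For reference, here is what those Windows 9x/Me equivalents look like in the configuration files. This is a sketch assembling only the directives named above; real System.ini and Win.ini files contain many other entries:

```ini
; System.ini -- [Display] section (Windows 9x/Me)
[Display]
SWCursor=1    ; software cursor: the one-notch-left setting
Mmio=0        ; disable memory-mapped I/O (with some drivers)

; Win.ini -- [Windows] section
[Windows]
SafeMode=1    ; disable certain bit-block transfers
; SafeMode=2  ; None setting: DIB engine only, all acceleration off
```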
In most cases, another tab called Color Management is also available. You can select a color profile for your monitor to enable more accurate color matching for use with graphics programs and printers. Video cards with advanced 3D acceleration features often have additional properties; these are discussed later in this chapter.

Multiple Monitors
Macintosh systems pioneered multiple-monitor support long before Windows, but starting with Windows 98, all current Windows versions also offer the ability to use multiple monitors on a single system. Windows 98/Me support up to 9 monitors (and video adapters), each of which can provide a different view of the desktop. Windows 2000 and Windows XP support up to 10 monitors and video adapters. When you configure a Windows 98/Me or Windows 2000/XP system to use multiple monitors, the operating system creates a virtual desktop, that is, a display that exists in video memory and can be larger than the image actually displayed on a single monitor. You use the multiple monitors to display various portions of the virtual desktop, enabling you to place the windows for different applications on separate monitors and move them around at will. Unless you use multiple-head video cards, each monitor you connect to the system requires its own video adapter. So, unless you have nine bus slots free, the prospects of seeing a nine-screen Windows display are slim, for now. However, even two monitors can be a boon to computing productivity. For example, you can leave an email client or web browser maximized on one monitor and use the other monitor for additional programs. On a multimonitor Windows system, one display is always considered to be the primary display. The primary display can use any PCI or AGP VGA video adapter that uses a Windows minidriver with a linear frame buffer and a packed (nonplanar) format, meaning that most of the brand-name adapters sold today are eligible. Additional monitors are called secondaries and are much more limited in their hardware support. To install support for multiple monitors, be sure you have only one adapter installed first; then reboot the system, and install each additional adapter one at a time. Table 13.19 lists the multiple-monitor support articles available for Windows 98 and newer versions on the Microsoft support website (http://support.microsoft.com).
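You can inspect the virtual desktop described above programmatically. A sketch for Windows systems, assuming Python with ctypes access to user32; the SM_* index values are standard Win32 GetSystemMetrics constants:

```python
# Read the monitor count and the bounds of the Windows virtual desktop.

import ctypes

user32 = ctypes.windll.user32
SM_XVIRTUALSCREEN = 76   # left edge of the virtual desktop
SM_YVIRTUALSCREEN = 77   # top edge
SM_CXVIRTUALSCREEN = 78  # width spanning all monitors
SM_CYVIRTUALSCREEN = 79  # height spanning all monitors
SM_CMONITORS = 80        # number of display monitors

print("Monitors:", user32.GetSystemMetrics(SM_CMONITORS))
print("Virtual desktop:",
      user32.GetSystemMetrics(SM_CXVIRTUALSCREEN), "x",
      user32.GetSystemMetrics(SM_CYVIRTUALSCREEN),
      "at", (user32.GetSystemMetrics(SM_XVIRTUALSCREEN),
             user32.GetSystemMetrics(SM_YVIRTUALSCREEN)))
```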
It's important that the computer correctly identifies which of the video adapters is the primary one. This is a function of the system BIOS, and if the BIOS on your computer does not let you select which device should be the primary VGA display, it decides based on the order of the PCI slots in the machine. You should, therefore, install the primary adapter in the highest-priority PCI slot. In some cases, an AGP adapter might be considered secondary to a PCI adapter. Depending on the BIOS used by your system, you might need to check in various places for the option to select the primary VGA display. For example, the AMI BIOS used by the MSI KT4 Ultra motherboard for Socket A processors lists this option, which it calls Primary Graphics Adapter, in the PCI/PnP menu. In contrast, the Intel/AMI BIOS used by the Intel D865PERL motherboard lists this option, which it calls Primary Video Adapter, in the Video Configuration menu.
After the hardware is in place, you can configure the display for each monitor from the Display control panel's Settings page. The primary display is always fixed in the upper-left corner of the virtual desktop, but you can move the secondary displays to view any area of the desktop you like. You can also set the screen resolution and color depth for each display individually. Windows XP also supports DualView, an enhancement to Windows 2000's multiple-monitor support. DualView supports the increasing number of dual-head video cards as well as notebook computers connected to external displays. With systems supporting DualView, the first video port is automatically assigned to the primary monitor. On a notebook computer, the primary display is the built-in LCD panel. Note Many recent notebook computers that have integrated graphics do not support DualView; however, most recent notebook computers that use a discrete graphics chip do support DualView. To determine whether your notebook computer supports DualView, open its Display properties sheet and click the Settings tab. If two monitor icons are visible, your computer supports DualView. You can activate secondary monitor support after you attach a monitor to the external VGA port. Even if your BIOS enables you to specify the primary video card and you use video cards that are listed as compatible, determining exactly which display cards will work successfully in a multimonitor configuration can be difficult. Consequently, you should check with your video card or chipset maker for the latest information on Windows 2000 or Windows XP and multiple-monitor support issues. Because new chipsets, updated drivers, and combinations of display adapters are a continuous issue for multiple-monitor support when separate video cards are used, I recommend the following online resources:
Multiple-monitor support can be enabled through either of the following:
- Two or more video cards, each driving one or more displays
- A single video card that supports two or more monitors (a multiple-head or dual-head card)
A card that supports multiple monitors (also called a multiple-head or dual-head card) saves slots inside your system and eliminates the headaches of driver and BIOS updates or system reconfiguration sometimes necessary when using two or more video cards for multiple-monitor capability. Most recent video cards with multiple-monitor support feature a 15-pin analog VGA connector for CRTs, a DVI-I digital/analog connector for digital LCD panels, and a TV-out connector for S-video or composite output to TVs and VCRs. Thus, you can connect any of the following to these cards:
- A CRT (or analog LCD panel) on the VGA port plus a digital LCD panel on the DVI-I port
- Two analog displays, using a DVI-to-VGA adapter on the DVI-I port
- A TV or VCR on the TV-out port in addition to either type of monitor
If the card has two DVI-I ports, in addition to using any of the preceding options, you can use two digital LCD panels. The major video chipsets that support multiple CRT and LCD displays are listed in Table 13.20.
[1] ATI sells video cards using these chipsets under the ATI brand and also supplies chipsets to third-party vendors. [2] Matrox is the only vendor using its chipsets; this table lists Matrox card models. [3] Features a separate accelerator chip for each display, enabling the independent selection of the refresh rate and the resolution under Windows 2000. [4] Features a separate RAMDAC chip for each display, enabling the independent selection of the refresh rate and the resolution under Windows 2000. [5] NVIDIA does not manufacture video cards; it sells chipsets only. [6] Upgradeable to three-monitor support. [7] Some video cards based on this chipset might not support multiple monitors. AIW = All-in-Wonder; includes TV tuner and other AV features. As Table 13.20 notes, some video cards that use a chipset capable of multiple-monitor support might not provide the additional DVI or VGA connector necessary to enable that support. Be sure to check the specifications for a particular graphics card to ensure that it supports the display combinations you need. Table 13.20 does not include video chipsets that support TV-out or video in-video out (VIVO) but do not support a second CRT or LCD display. Caution Some vendors whose cards provide a single VGA or DVI-I port (DVI-I ports can be converted to VGA with an adapter) and a TV-out port refer to such cards as "supporting multiple monitors." Table 13.20 lists only chipsets or cards that support two or more CRT or LCD displays.
Adding Multiple-Monitor Support to Laptops and Desktops with Integrated Graphics
Matrox's DualHead2Go external multiple-display device enables you to split the existing single VGA signal coming from compatible laptops or desktops with integrated graphics into two separate displays that can host their own applications, just as if your system had a dual-head-compatible graphics card installed. The DualHead2Go device and driver software transform the video signal coming from your computer's integrated video into an ultra-wide-screen signal by using Extended Display Identification Data (EDID) and split it into two parts that can be transmitted to different screens. Each display connected to the DualHead2Go device can display a separate program, or you can stretch a single program across both displays. See Figure 13.15 for examples.

Figure 13.15. Using the Matrox DualHead2Go device to host separate applications (top) or to stretch a single application across two windows (bottom). Photos courtesy Matrox Graphics.
To determine whether your system is compatible with DualHead2Go, download and run the Matrox DualHead2Go System Compatibility Tool from http://www.matrox.com/graphics/offhome/dh2go/try.cfm. See the compatibility list at http://www.matrox.com/graphics/offhome/support/dh2go/compatibility.cfm.