Geek 101: A Graphics Card Primer

We all know that the GPU in modern computers is essentially responsible for everything you see on your screen. Both Windows Vista and 7 can use the GPU to render the desktop, taking advantage of 3D acceleration features to provide smooth window movement, transparency, and other visual effects. Naturally, the 3D graphics in games, educational titles, and such are all rendered by the GPU.

Modern GPUs even have special video processing units that decode, scale, and de-interlace the most popular video formats, improving quality and reducing CPU load and power consumption. GPUs are even starting to be used as highly parallel processors to do certain very math-heavy tasks much faster than the CPU, though this technology is in its infancy.

But beyond that, it gets a little confusing. There are dozens of brand names and model numbers out there, and a whole alphabet soup of buzzwords and acronyms that seem specifically designed to confuse the average customer. Let's discuss some common graphics-related terms and what you should look for when shopping for a graphics card (or choosing which graphics card should be in your next computer or notebook).

Nvidia, ATI, and Intel

Today, there are three major players in the graphics market. There's Nvidia, a company that focuses almost entirely on graphics products. A few years ago the CPU and chipset maker AMD bought Canadian graphics developer and Nvidia competitor ATI. You'll still see the ATI brand quite often; AMD kept it around for their graphics division. Finally there's Intel, which currently only makes integrated graphics products built into the motherboard chipsets for their processors.

Soon, Intel will start shipping processors with graphics integrated right into the CPU. There are other graphics companies out there, but they either focus on devices like cell phones or have such a tiny piece of the market that they're not worth bringing up.

Which one should you use? This is a point of much contention among graphics fans and gamers. To be honest, Nvidia and ATI/AMD both make excellent products and have drivers that are, on the whole and over time, roughly comparable in terms of stability. If you want a discrete graphics card, you should pick whichever one is best at the price you want.

Intel's integrated graphics is what you get when you don't make a choice, basically. Though it has improved greatly over the years, it is still slower than the integrated graphics options from Nvidia and ATI, and far slower than discrete graphics solutions.

DirectX

DirectX is an API (Application Programming Interface--a set of conventions and abstractions that let programmers control a piece of hardware like a GPU). DirectX actually contains lots of pieces to deal with things like audio and such, but the part that deals with 3D graphics is called Direct3D.

On Windows, DirectX is by far the most common way that games make use of the GPU, but because it comes from Microsoft and makes use of the Windows driver stack, it's only on Windows. Windows Vista and 7 support DirectX 10.1 as the latest version, and DirectX 11 is coming to both Windows 7 and Vista very soon. With it come a few exciting new features. We'll get to those in a minute.

OpenGL

If you're not on Windows, odds are that programmers are accessing 3D hardware through an API called OpenGL.

This standard graphics API is controlled by a collaborative entity called the Khronos Group, which has members from lots of big software and hardware makers. OpenGL is available and used on Windows (in fact, the newest version of Photoshop uses it for GPU acceleration), but it isn't as common as Direct3D. These days, all modern GPUs (discrete and integrated) provide both OpenGL and DirectX drivers.

OpenCL

Remember when I mentioned that GPUs can be used for general computing (like video format conversions, heavy scientific calculations, and such)? Well, OpenCL is a standardized way of doing this. An OpenCL program can run on and be accelerated by the GPU, regardless of who the GPU manufacturer is. It's a brand new standard, appearing in both Apple's new Snow Leopard OS and Windows (XP, Vista, and 7).
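To make the idea concrete, here's a minimal sketch in plain C (my own illustration, not real OpenCL code; the function names are hypothetical) of the data-parallel model OpenCL uses: you write a small "kernel" that handles exactly one element of the problem, and the runtime runs that kernel across the whole data set, with the GPU executing many of those invocations in parallel.

```c
#include <stddef.h>

/* Illustrative sketch only: OpenCL expresses work as a "kernel" that runs
   once per element of a problem domain. Real OpenCL compiles kernel source
   at runtime and dispatches it to the GPU; here we mimic the model on the
   CPU just to show the shape of it. */

/* The "kernel": each invocation handles one index, independently of the
   others. That independence is what lets the GPU run them in parallel. */
static void scale_kernel(size_t global_id, const float *in, float *out,
                         float factor)
{
    out[global_id] = in[global_id] * factor;
}

/* Stand-in for enqueueing the kernel over an N-item range. On a GPU,
   these iterations would execute in parallel across many cores. */
static void enqueue_range(size_t count, const float *in, float *out,
                          float factor)
{
    for (size_t id = 0; id < count; id++)
        scale_kernel(id, in, out, factor);
}
```

The point is that the programmer describes the per-element work, not the loop; the runtime decides how to spread it across the hardware.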

Neither Nvidia nor ATI has real, final, public OpenCL drivers yet. This is a technology that is in its infancy, but it should grow rapidly. Robust OpenCL support and good performance will probably be a real selling point in the next year or two.


Drivers, Drivers, Drivers

No matter what graphics processor you have, you need the latest drivers. Nvidia, ATI, and Intel all offer driver downloads on their Web sites. If you have a notebook, you may need to get yours from your notebook manufacturer's Web site instead.

DirectX 11

Microsoft's marketing department is doing its best to brand DirectX 11 as a Windows 7 thing, but the truth is that it's coming to Vista as well. This new version of the API brings with it several new features. It's too much to go into here, but the short list is:

  • Better use of multi-core CPUs
  • Tessellation - This is the fancy word for breaking up an object made of a small number of triangles (and thus blocky-looking) into a very large number of triangles, which can then be manipulated to make the object look smoother or more detailed.
  • DirectCompute - (aka "Compute Shaders") Like OpenCL, this is a standardized way to make any GPU with DirectX 11 drivers do general computational work.
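To see what the tessellation bullet above means in practice, here's a small illustrative C sketch (my own example, not DirectX 11 code): one subdivision step splits a triangle at its edge midpoints into four smaller triangles, so each pass quadruples the triangle count, and the new vertices can then be displaced to add detail.

```c
#include <stddef.h>

typedef struct { float x, y, z; } Vec3;

static Vec3 midpoint(Vec3 a, Vec3 b)
{
    Vec3 m = { (a.x + b.x) * 0.5f,
               (a.y + b.y) * 0.5f,
               (a.z + b.z) * 0.5f };
    return m;
}

/* One level of subdivision: split a triangle at its three edge midpoints,
   producing 4 smaller triangles (writes 4 * 3 vertices into out). */
static void subdivide(const Vec3 t[3], Vec3 out[12])
{
    Vec3 m01 = midpoint(t[0], t[1]);
    Vec3 m12 = midpoint(t[1], t[2]);
    Vec3 m20 = midpoint(t[2], t[0]);
    Vec3 result[12] = {
        t[0], m01, m20,   /* three corner triangles */
        t[1], m12, m01,
        t[2], m20, m12,
        m01, m12, m20     /* center triangle */
    };
    for (int i = 0; i < 12; i++)
        out[i] = result[i];
}

/* After n levels of subdivision, each input triangle becomes 4^n
   triangles -- which is why tessellation can turn a blocky model
   into a smooth one so quickly. */
static unsigned long triangles_after(unsigned long start, unsigned levels)
{
    while (levels--)
        start *= 4;
    return start;
}
```

A 100-triangle model run through three subdivision passes ends up with 6,400 triangles, which shows why it's so much cheaper to store the blocky version and let the GPU do the smoothing.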

CUDA and ATI Stream

For the past several years, both Nvidia and ATI have been working on using the GPU for general computing tasks, but it's hard to launch a new software industry when each company has its own proprietary means of programming its graphics products. Nvidia's is called CUDA; ATI's is called ATI Stream. CUDA is the more popular of the two, but it's still mostly stuck in the "big iron" high-performance computing and academic fields, with only a handful of real consumer apps.

New programming models, such as using the GPU for general computing tasks, tend to take off when standards emerge, so the real action will probably be in OpenCL and DirectX 11 Compute Shaders. Don't let CUDA or ATI Stream influence your buying decisions too much.

Future Hardware: Nvidia, ATI, and Intel's Larrabee

Both ATI and Nvidia are getting their new DirectX 11-class graphics products ready to roll. ATI appears to be a few months ahead of Nvidia on its rollout. If the rumors are to be believed, the company should have a top-to-bottom lineup in the next month or two. Nvidia may only have high-end chips at first, and then only at the end of the year or possibly early next year.

Unfortunately, we can't tell you which one is the better buy because we don't really know about their price, performance, power utilization, or any of that other stuff. But if you don't desperately need a new graphics card right now, you might want to wait a few months and see how this new generation of products looks.

Meanwhile, Intel is preparing a novel product code-named Larrabee. This will be a GPU that first appears in a high-end discrete graphics card rather than the typical integrated graphics we see from Intel. It doesn't follow the traditional graphics chip architecture; instead, it's a chip full of many very compact x86 cores (similar in spirit to the Atom chip for netbooks) that have very wide vector processing units and a specialized instruction set.

This makes the chip very flexible, and it should be great for GPU compute type applications, but will it be a fast graphics chip? Nobody knows. What we do know is that Intel is a year ahead of everyone else on chip manufacturing technology and should never be underestimated.

SLI and Crossfire

These are terms for Nvidia (SLI) and ATI (Crossfire) technologies that use more than one GPU at a time for higher performance. Should you get it? Generally speaking, this is one of those "if you have to ask, the answer is no" sorts of technologies. You can expect a second GPU to add maybe 50-80% performance over the first, and returns diminish quickly from there: the third GPU only gets you maybe 30% more, and the fourth (yes, you can do a four-GPU system!) barely improves things over the third at all.
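The diminishing returns are easier to see as arithmetic. Here's a tiny illustrative C sketch (the per-GPU gain figures are hypothetical midpoints of the rough ranges quoted above, not benchmark data) that adds up the incremental contribution of each card:

```c
/* Illustrative only: rough multi-GPU scaling using hypothetical midpoints
   of the ranges quoted in the article (second GPU adds ~65%, third ~30%,
   fourth almost nothing). Returns performance relative to a single GPU. */
static float multi_gpu_scale(int gpus)
{
    /* Incremental gain contributed by each additional GPU. */
    const float gain[4] = { 1.00f, 0.65f, 0.30f, 0.05f };
    float total = 0.0f;
    for (int i = 0; i < gpus && i < 4; i++)
        total += gain[i];
    return total;
}
```

Under these assumed numbers, two GPUs get you roughly 1.65x a single card, three get about 1.95x, and four only about 2.0x -- half the hardware in a quad setup is doing almost nothing for you.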

Enthusiast gamers with very big, high-resolution monitors are the target market for multi-GPU solutions. If this is you, you might want to consider SLI or Crossfire. You'll need a motherboard with two graphics slots that supports SLI/Crossfire, but these are not uncommon. Odds are, most of you reading a "Geek 101" article probably aren't the target market for this.


Discrete or Integrated?

Okay, it's time to make a buying decision. Do you go with a discrete graphics card (in either a desktop or notebook) or integrated graphics? If you want to play games, even just a bit, you'll have a far better experience with discrete graphics.

If all you want to do is browse the web and do some light word processing or email, integrated is probably enough. Intel's integrated graphics isn't as good as Nvidia's or ATI's, and if you care about the quality of the video (watching DVDs or downloaded video on your PC), you want an Nvidia or ATI graphics chip. If battery life is your top concern, avoid discrete graphics and go with integrated.

How much should I spend?

As a rule of thumb, you should probably not spend less than $100 or so on a graphics card. Cards in the $99-149 range offer a lot of bang for the buck and can run almost all modern games very well. Once you start spending less than that, the performance drops rapidly and you'll just need to upgrade sooner.

If you or someone who uses the computer is a more serious gamer, look for cards in the $179-229 price range. These offer great performance without breaking the bank. You really don't need to spend more than that if you're reading this article. Those high-end graphics cards are for the sort of graphics and game fans that don't need a "Geek 101" type article.

How much memory do I need?

You'll see a lot of cheap graphics cards with 1GB of memory on them. This is mostly a waste of money. In the $100 range, there isn't much benefit to having more than 512MB of memory. A faster GPU chip on the card is worth more than a bigger amount of memory. Once you get to the $149-and-up range, you want a card with 1GB of RAM. If it's integrated graphics, it'll use your main system memory and you don't need to worry about it (this memory sharing is one of the reasons integrated graphics are so slow).

My Recommended Picks

Low-cost option: Radeon HD 4850 or GeForce GTS 250
Enthusiasts: Radeon HD 4890 or GeForce GTX 275
Expensive: GeForce GTX 285 or Radeon HD 4870 X2

Follow Jason Cross on Twitter or visit his blog.
