Geek 101: Making Sense of Anti-Aliasing
If you’ve played a PC game in the past five years, you’ve probably stumbled across an anti-aliasing toggle while mucking about with your graphics settings. Switching it on can make everything on your screen look smoother, but why does it also make your games run slower? And what's the difference between "2x Multisampling" and "4x Supersampling," and which is the best choice for your machine?
We did a bit of lab testing to work our way through the jargon, so you can turn these settings to your advantage.
What is Anti-Aliasing, and Why Should You Care?
Have you ever noticed the edges of supposedly smooth objects in your favorite game looking jagged or blurry? The issue is generally identifiable by a "stairstep" pattern on objects in a digital scene, and it happens because sharp contrast between dark and light pixels can make the edges of an object onscreen appear jagged (thus the dreaded term "jaggies"). Anti-aliasing is simply a term for the complex algorithms your graphics card employs to make the pixels along the edges of an object appear smoother by blending their colors together.
Of course, from a distance of 12-18 inches the human eye typically can't pick out individual pixels on an image with pixel density greater than 300 pixels-per-inch (PPI). For reference, a 24-inch LCD monitor running at a 1920-by-1200 resolution displays roughly 94 pixels-per-inch. Until you own a monitor that can tackle 300 pixels-per-inch, you'll have to rely on your graphics card to tastefully blur the image.
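That 94-PPI figure is just the monitor's diagonal resolution in pixels divided by its diagonal size in inches. A quick sketch of the arithmetic in Python (the `ppi` helper is our own, written for this example):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: the screen's diagonal pixel count divided by
    its diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 24-inch, 1920-by-1200 monitor from the article:
print(round(ppi(1920, 1200, 24)))  # -> 94
```

By the same math, even a 24-inch 4K panel (3840 by 2160) only reaches about 184 PPI, still well short of the 300-PPI threshold.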
In order for a modern graphics card to eliminate jagged edges, it first has to know where the edges are in any given image. To do that, your graphics card does something called full screen sampling: The action on screen is calculated at a much higher resolution before actually being displayed--essentially "faking" a higher resolution. Color data from every pixel is then gathered and averaged before being condensed into the smoother, prettier final image that is actually displayed.
For the purposes of this article we'll tackle the two most common modes of anti-aliasing available in modern games: supersampling and multisampling. For our tests, we used Crytek's Crysis 2, a graphical tour de force. Our testbed consisted of an Intel Core i7-990X processor, 6GB of RAM, and the Nvidia GeForce GTX 580 graphics card.
With supersampling, your graphics card blends the colors of adjacent pixels by pre-rendering the entire image at a much larger size for each frame (as described above). As an example, let’s say you were playing a game at a 1280-by-800 resolution and checked off "4x supersampling" in your graphics options. Click on the images below to see full-size versions of them. Pay special attention to the beams--see those stairstep patterns? Those are the dreaded "jaggies."
To smooth out the sharp edges, your graphics card performs display calculations for an image at 2560-by-1600 pixel resolution (twice the width and height of 1280-by-800, for four samples per final pixel), sampling (collecting) the colors and blending them to create a smoother image at your 1280-by-800 resolution. Put simply, your graphics card is collecting the color data of four pixels and combining them into one, millions of times a second, reducing the "jaggy" effect that appears when you have sharp color contrast between pixels.
If that sounds like a lot of work for every frame of animation, it is: Supersampling is a resource-intensive, brute-force approach to anti-aliasing, and if your hardware isn't up to snuff it can seriously bog down your PC's performance.
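The averaging step at the heart of supersampling can be sketched in a few lines. This is a minimal grayscale box filter in Python, not how a GPU actually implements it--the `downsample` name, the tiny 4-by-4 "image," and the 0-255 grayscale values are all illustrative assumptions:

```python
def downsample(image, factor):
    """Box-filter downsample: average each factor-by-factor block of
    high-resolution pixels into one output pixel (grayscale, 0-255)."""
    out_h = len(image) // factor
    out_w = len(image[0]) // factor
    result = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            # Gather every high-res sample that falls inside this output pixel.
            block = [image[y * factor + dy][x * factor + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) // len(block))
        result.append(row)
    return result

# A hard black/white diagonal edge, rendered at 2x per axis
# (four samples per final pixel, as in the article's 4x example):
hi_res = [
    [0,   0,   0,   0],
    [255, 0,   0,   0],
    [255, 255, 0,   0],
    [255, 255, 255, 0],
]
print(downsample(hi_res, 2))  # -> [[63, 0], [255, 63]]
```

Notice how the pixels straddling the edge come out as intermediate grays (63) rather than pure black or white--that blending is what softens the stairstep pattern.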
Multisampling doesn't look quite as impressive as supersampling, but it's faster and far more efficient. It's a newer, more computationally efficient form of supersampling anti-aliasing that leaves expensive work like lighting, shading, and texturing out of the extra sampling passes, so only an image's color data is sampled and blended. That means less work for your graphics card, and more frames per second for you.
Even better, these new multisampling algorithms account for the depth of 3D objects in virtual space and exclusively target areas of high contrast to ensure that only the edges of 3D objects are being sampled, meaning objects still appear smoother and sharper with much less effort.
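For illustration only, here's a toy sketch of that edge-targeting idea in Python: blend neighboring values only where the contrast between them is high, and leave flat areas alone. Real multisampling operates on coverage samples inside the GPU rather than post-processing pixel values like this, and the `smooth_edges` name and threshold are invented for the example:

```python
def smooth_edges(scanline, threshold=128):
    """Blend only where adjacent pixels differ sharply (a high-contrast
    edge); low-contrast interior pixels are left untouched."""
    out = list(scanline)
    for i in range(1, len(scanline)):
        if abs(scanline[i] - scanline[i - 1]) > threshold:
            # High contrast: nudge both pixels toward their average.
            blended = (scanline[i] + scanline[i - 1]) // 2
            out[i - 1] = (out[i - 1] + blended) // 2
            out[i] = (out[i] + blended) // 2
    return out

# A flat patch of sky followed by a hard edge: only the edge gets work.
print(smooth_edges([10, 12, 11, 255, 255]))  # -> [10, 12, 72, 194, 255]
```

The flat pixels pass through unchanged, which is exactly why this family of techniques costs so much less than brute-force supersampling: most of the screen needs no extra work at all.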
So which anti-aliasing method is right for you? Everyone has a different setup, but if your games are running well and you want to kick the graphical output up a notch, it's a good idea to first ensure your in-game resolution matches your monitor's native resolution. You can consult your monitor's packaging or manufacturer to find it, but Windows will also recommend the native resolution if you check your PC's display properties.
Modern LCD displays have a fixed pixel grid and can't adapt to arbitrary resolutions the way old CRT monitors could, so you'll get the sharpest picture while playing games at native resolution.
If you still have GPU cycles to spare on a midrange card like the Nvidia GeForce GTX 560 Ti or the AMD Radeon HD 6850, start with multisampling anti-aliasing and play with the sampling size until you find a nice balance between picture and performance. If you've shelled out for a high-performance GPU like the AMD Radeon HD 6990, go ahead and crank supersampling to the max. You've earned it.