One of the fun things about being a hardware analyst is that every now and then, research and comparison between products turns up something both interesting and unexpected. In this case, we were testing and comparing image quality between ATI's 5970 and the GeForce GTX 480 as part of a system review when we stumbled across an officially confirmed antialiasing bug in NVIDIA's 197.41 drivers. Unlike some "bugs" that artificially inflate benchmark performance or lower detail levels, this one improves image quality by performing supersampling AA (SSAA) across the entire image.
Antialiasing is a process that removes jagged edges (jaggies) from displayed content. Supersampling, the type of AA we're primarily discussing here, works by rendering the entire image internally at a much higher resolution, then downsampling: the samples that fall within each displayed pixel are averaged together. The advantage of supersampling is that the entire image is antialiased, rather than just the edges or intersection points. As a result, it's arguably the best of the antialiasing methods, provided you don't mind the blurring it tends to create. The disadvantage of standard supersampling (other than the blur) is that it scales with the grace of Rosie O'Donnell running 30 flights of stairs. 4xSSAA when gaming at 1920x1200 instructs the video card to internally render four pixels for every single pixel of displayed data. Texture data and lighting calculations are performed on each of those pixels at the higher resolution, and the output is then downsampled to 1920x1200. On an NVIDIA card, using nHancer to render in 4x4 mode requires 16x the computational power needed to render the original image.
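To make the downsampling step concrete, here's a minimal sketch (in Python/NumPy, with function names of our own invention) of what happens after the card shades the oversized internal frame: blocks of samples are box-filtered, i.e. averaged, into single displayed pixels. A linear factor of 2 per axis corresponds to 4xSSAA; a factor of 4 corresponds to the 16x-cost 4x4 mode mentioned above.

```python
import numpy as np

def downsample_ssaa(hi_res, factor=2):
    """Average factor x factor blocks of supersampled data into display pixels.

    hi_res: float array of shape (H*factor, W*factor, 3), the internal render.
    factor: linear scale per axis (2 -> 4xSSAA, 4 -> the 4x4/16x mode).
    """
    h, w, c = hi_res.shape
    H, W = h // factor, w // factor
    # Regroup the frame into (H, factor, W, factor, c) blocks, then box-filter.
    blocks = hi_res.reshape(H, factor, W, factor, c)
    return blocks.mean(axis=(1, 3))

# 4xSSAA at 1920x1200: the card shades a 3840x2400 frame, then averages it down.
frame = np.random.rand(2400, 3840, 3)  # stand-in for the shaded internal render
displayed = downsample_ssaa(frame)
assert displayed.shape == (1200, 1920, 3)
```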
Multisampling streamlines this process by processing texture data and lighting effects only once per group of pixels. Both ATI and NVIDIA have proprietary algorithms that further evaluate what data needs to be processed, and both companies have introduced their own techniques for improving visual quality or lowering the performance hit when enabling AA, including ATI's Edge Detect filters and NVIDIA's Coverage Sampling AA (CSAA).
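As a rough way to see why multisampling is so much cheaper, consider a deliberately simplified cost model (our own assumption, not vendor data; real drivers are far more nuanced): supersampling runs texture and lighting work for every sample, while multisampling shades roughly once per displayed pixel and only multiplies the cheaper coverage and depth tests.

```python
# Simplified shading-cost model (our own assumption, not vendor data):
# SSAA shades every sample; MSAA shades about once per displayed pixel.
def shading_invocations(width, height, samples, supersampled):
    return width * height * (samples if supersampled else 1)

w, h = 1920, 1200
print(shading_invocations(w, h, 4, supersampled=True))   # 4xSSAA: 9,216,000
print(shading_invocations(w, h, 4, supersampled=False))  # 4xMSAA: 2,304,000
```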
The Bug: Transparency AA = FSAA
The bug we're going to be discussing affects NVIDIA's Transparency AA (TSAA) settings. Normally, changing this option in the NVIDIA Control Panel adjusts the amount of antialiasing performed on transparent/translucent objects, such as chain-link fences or telephone wires. The images below are from our original discussion of transparency AA in Half-Life 2 and demonstrate how Transparency AA functions under normal circumstances.
BattleForge Image Quality Comparison
BattleForge is a free-to-play EA online card game that's vaguely reminiscent of Magic: The Gathering, but with better graphics and no need to carry around a shoebox of cards. The game supports DirectX 9, 10.1, and 11; our screenshots were taken with the game in DX11 mode and with the optional high-quality texture pack installed. BattleForge also has a built-in benchmark, accessible from the "Graphics" subsection, which makes it handy for performance comparisons.
BattleForge isn't the only game we tested; we also observed fullscreen supersampling in Bad Company 2, DiRT 2, and Aliens vs. Predator. All of these are DirectX 11 titles. When we forced BattleForge into DX9 mode to measure the supersampling performance impact there, we discovered that supersampling was no longer being misapplied by the Transparency AA settings in the NV Control Panel.
Image Quality (Cont): Enter ATI, Performance Impact
We're not talking much about ATI's comparative image quality, but we did grab screenshots of the Red Team's multisampling (DX11) and supersampling (DX9) in the same area. The top two images are ATI's multisampling and supersampling; NVIDIA's 4xSSAA is below.
Comparing the ATI and NVIDIA supersampling methods, we see that while the NVIDIA output is blurrier, it's also more effective at eliminating jagged edges. Based on what we know about NVIDIA's supersampling as forced by nHancer and ATI's historical approach, the difference in image results stems from the use of two different pixel sampling patterns.
NVIDIA uses an ordered grid approach, which samples every pixel in an image the same way from the same points. This is less effective on 45-degree angles or on lines that are nearly (but not quite) perfectly horizontal or vertical. If you want to see an example of the effect, click the left thumbnail above. At certain angles, the human eye perceives the lines as more or less aliased. In years past, ATI relied on what it calls sparse grid supersampling, claiming that this method was better at handling angled lines. In theory, the ATI images are applying an 8x sparse grid compared to NVIDIA's 4x4 ordered grid, but since we're comparing a buggy driver against a game that's been forced to run in DX9 mode for comparison shots, we can't say with 100 percent certainty that the proper modes have been applied.
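To picture the difference, here's a small sketch of the two layouts. The ordered grid is the regular lattice applied identically to every pixel; the rotated offsets below are a textbook "sparse"/rotated-grid pattern, used purely for illustration, since ATI's actual sample positions aren't public. The key property is that no two rotated samples share a row or column, so near-horizontal and near-vertical edges cross more distinct coverage levels.

```python
def ordered_grid(n):
    """n x n ordered grid: every pixel samples the same regular lattice."""
    offsets = [(i + 0.5) / n for i in range(n)]
    return [(x, y) for y in offsets for x in offsets]

def rotated_grid_4x():
    """A textbook rotated-grid 4x pattern (illustrative only; ATI's real
    sparse-grid positions are not publicly documented)."""
    return [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

print(ordered_grid(2))    # shares rows/columns: [(0.25, 0.25), (0.75, 0.25), ...]
print(rotated_grid_4x())  # no two samples share an x or y coordinate
```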
We've focused on 4xSSAA because it's the sweet spot between visual quality and performance; 2x supersampling is available but doesn't improve image quality much, while 8xSSAA is technically available but produces a very blurred image. In addition, the performance penalty from 8x drives even a pair of GTX 480s in SLI to their knees in DX11.
Switching from 8x multisampling to 4x supersampling cuts performance by more than half, and 8x supersampling cuts the GTX 480s' framerate in SLI mode by almost 75 percent. This isn't a veiled shot at the GTX 480's performance; remember that for every pixel displayed, up to eight are being rendered, textured, and lit. In this case, lowering the detail level or the baseline resolution is the most effective way to solve the problem. Back when display resolutions were much smaller, certain enthusiasts eschewed supersampling-based FSAA in favor of playing at higher resolutions (1600x1200, for example, rather than 800x600 with 4xAA). Nowadays, that's not practical: any baseline resolution above 1280x1024 results in the card internally sampling at resolutions well above 2560x1600.
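The arithmetic behind that last claim is easy to check. With 4xSSAA (a 2x linear factor per axis), even 1280x1024 balloons to 2560x2048 internally, which already carries more pixels than 2560x1600; anything larger overshoots it by a wide margin. A trivial sketch:

```python
# Internal render size under SSAA: plain arithmetic, not vendor data.
# A 2x linear factor per axis corresponds to 4xSSAA.
def internal_resolution(width, height, linear_factor=2):
    return width * linear_factor, height * linear_factor

for base in [(800, 600), (1280, 1024), (1920, 1200)]:
    print(base, "->", internal_resolution(*base))
# (800, 600)   -> (1600, 1200)
# (1280, 1024) -> (2560, 2048)  # more pixels than 2560x1600 already
# (1920, 1200) -> (3840, 2400)
```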
Testbed: Intel Core i7-980X @ 4.2GHz, 6GB DDR3, 2x GTX 480s in SLI
The nicest thing about this bug is that it makes supersampling available in DX11 (and most likely DX10). It's easy to see why ATI's supersampling only kicks in when playing DirectX 9 games (the performance hit is immense), but we still think end-users should have the final say on whether or not supersampling is enabled in DX10/11 titles. True, there's no guarantee it would be universally effective, since certain game engines are limited to certain types of AA, but that's a limitation gaming enthusiasts are already used to dealing with.