We've been hearing details about NVIDIA's on-chip programmable video processor ever since the NV40 originally launched about eight months ago. Back then, NVIDIA stated that the NV40's on-chip video processor incorporated a high-quality adaptive de-interlacing engine, along with better filtering and scaling techniques, video de-blocking logic, and an integrated TV encoder. The NV40's video processor was also billed as a complete HDTV solution, with the ability to output HDTV streams in all of the popular formats (480p, 720p, and 1080i) and support for CGMS copy protection. It also had integrated PVR (Personal Video Recorder) functionality, with hardware audio / video synchronization, hardware MPEG-1 / 2 / 4 encoding and decoding, and WMV9 decode acceleration.
It has since become known that the original NV40 chip used on AGP GeForce 6800 class cards didn't accelerate WMV9 content at quite the level NVIDIA would have liked, but newer derivatives of the chip, with an updated video engine, are much better performers in this respect. The GeForce 6600 GT, for example, is equipped with the updated on-chip programmable video processor, which gets a new name today. At 9:00AM EST, NVIDIA's PureVideo technology, not to be confused with Matrox's brand of the same name, was officially unveiled. The combination of a GeForce 6 Series graphics card and new decoder software, which should be available on NVIDIA's site shortly, finally makes PureVideo a reality. Video on the PC now comes a step closer to, and perhaps even surpasses, the quality available from most consumer electronics devices. The key features of NVIDIA's PureVideo technology include:
- High-definition MPEG-2 Hardware Acceleration. A dedicated 16-way vector processor enables smooth playback of HD video with minimal CPU usage.
- WMV High-definition Hardware Acceleration. Programmable support of the newest format enabled by Microsoft Windows Media Player and Windows XP Media Center Edition 2005 for fluid playback of WMV and WMV HD content.
- High-quality Real-time Video Recording. An advanced motion-estimation engine makes it possible to record in real-time without a loss in quality.
- Spatial / Temporal Adaptive De-Interlacing. Experience interlaced content from satellite, cable, and DVD feeds in full detail and without jagged edges or artifacts.
- 3:2 Pull-down Correction and "Bad Edit" Correction. Restore video to its original 24fps film format to experience a crystal clear picture, frame after frame.
- Flicker-free Multi-Stream Scaling. High-quality 4-tap by 5-tap scaling maintains image detail, even when scaling a small video to a large area of the screen.
- Display Gamma Correction. Automatic format detection adjusts the color quality of video playback so that it is not too dark, overly bright or washed out, regardless of the display.
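To make the 3:2 pull-down correction feature above more concrete, here is a toy sketch of how the 3:2 cadence maps 24fps film onto 60 fields per second, and how inverse telecine recovers the original frames. This is purely our own illustration of the cadence; the function and frame names are hypothetical and have nothing to do with NVIDIA's actual implementation.

```python
# Toy model of 3:2 pulldown ("telecine") and its inverse. Frames are just
# labels here; a real de-interlacer works on actual field data.

def telecine_32(frames):
    """Map 24fps film frames to 60i fields using the 3:2 cadence.

    Alternating frames contribute 3 fields and 2 fields, so four film
    frames become ten fields (24fps x 2.5 = 60 fields/s).
    """
    fields = []
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2          # the 3-2-3-2 cadence
        for _ in range(count):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Recover the unique film frames by collapsing repeated fields."""
    frames = []
    for frame, _parity in fields:
        if not frames or frames[-1] != frame:
            frames.append(frame)
    return frames

film = ["F0", "F1", "F2", "F3"]            # four 24fps film frames
fields = telecine_32(film)
assert len(fields) == 10                   # 4 frames -> 10 fields
assert inverse_telecine(fields) == film    # original cadence restored
```

The "Bad Edit" correction mentioned above handles the harder real-world case: when video has been cut mid-cadence, the repeated-field pattern breaks, and the detector has to re-lock onto the cadence instead of blindly dropping fields.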
To give you some examples of how PureVideo affects video playback on the PC, we installed a GeForce 6600 GT into one of our test systems and compared its video playback quality to an ATI Radeon X700. ATI has long been regarded as having the best video engine available on the PC; their popular All-In-Wonder line of cards is a testament to this fact. But NVIDIA isn't the type of company to sit back and let a competitor dominate a particular market segment unchallenged, which is where PureVideo comes in...
Motherboard - DFI LANPARTY 925X-T2
Video Cards - ATI Radeon X700 / GeForce 6600 GT
Memory - 1024MB Kingston HyperX PC5400
Audio - Integrated Intel Azalia Hi-Def Audio
Hard Drive - Western Digital "Raptor" 36GB - 10,000RPM - SATA
Operating System - Windows XP Professional SP2 (Fully Patched)
Chipset Drivers - Intel INF v188.8.131.528
Video Drivers - ATI Catalyst v4.12 / NVIDIA Forceware v67.01
Other Software - ATI MMC v9.03
HOW WE CONFIGURED THE TEST SYSTEM: We tested the ATI Radeon X700 and GeForce 6600 GT on a DFI LANPARTY 925X-T2 Intel i925X chipset-based motherboard, powered by an Intel
To see the benefits of NVIDIA's PureVideo technology, we occasionally had to alter one of the settings available in the new Decoder Properties menu. In the "Inverse Telecine" tests you'll see on the next page, we had to change the De-interlace Mode from "Automatic" to "Smart", because some content can "fool" the automatic algorithm when its bad-edit detection is too aggressive. Most high-end projectors and plasma televisions from companies like Samsung and Marantz have the ability to turn off their inverse telecine algorithms for the same reason, to ensure accurate results.

For the ATI-powered card, we left all drivers and software at their default settings, with one exception: we had to manually edit a registry key to enable WMV9 acceleration on the Radeon X700, because Microsoft has yet to release an update to Windows Media Player 10 that enables this functionality by default. We should also note that we used an updated set of DLL files for a portion of the tests presented here to demonstrate the benefits of WMV9 acceleration; with the updated DLL files, CPU utilization dropped dramatically while playing back some WMV HD content.
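To give a rough sense of the per-pixel decision a spatial / temporal adaptive de-interlacer makes, here is a simplified toy model. It works on small lists of pixel values rather than real video, and the threshold logic is our own illustrative assumption, not NVIDIA's actual algorithm: where the picture is static it "weaves" in the old field (preserving full vertical detail), and where it detects motion it falls back to vertical interpolation ("bob") to avoid combing artifacts.

```python
# Toy spatial/temporal adaptive de-interlacer. All names and the
# threshold heuristic are illustrative assumptions, not a real decoder.

def deinterlace_frame(curr_field, prev_opposite_field, threshold=10):
    """Fill in the missing scanlines of one interlaced field.

    curr_field: the scanlines this field actually carries
    prev_opposite_field: the opposite-parity lines from the previous frame
    """
    out = []
    for y, line in enumerate(curr_field):
        out.append(line)                       # keep captured lines as-is
        if y + 1 < len(curr_field):            # reconstruct the line between
            below = curr_field[y + 1]
            old = prev_opposite_field[y]
            new_line = []
            for x, old_px in enumerate(old):
                spatial = (line[x] + below[x]) // 2
                # Static area: weave the old pixel (temporal, full detail).
                # Moving area: vertical interpolation (spatial, no combing).
                if abs(old_px - spatial) < threshold:
                    new_line.append(old_px)
                else:
                    new_line.append(spatial)
            out.append(new_line)
    return out

# Static scene: the old field's pixels survive intact.
static = deinterlace_frame([[100, 100], [100, 100]], [[102, 102]])
assert static == [[100, 100], [102, 102], [100, 100]]

# Moving scene: the stale field is rejected and interpolated instead.
moving = deinterlace_frame([[100, 100], [100, 100]], [[200, 200]])
assert moving[1] == [100, 100]
```

The "Automatic" versus "Smart" distinction in NVIDIA's Decoder Properties menu sits a level above this: it governs when the decoder trusts its cadence and bad-edit detection versus deferring to a fixed de-interlacing strategy.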