NVIDIA GeForce 8800 GT - G92 Takes Flight - HotHardware


In the GPU business, as in any other semiconductor segment, die size, cost, and performance-per-watt are the name of the game.  Performance-per-watt matters on multiple levels from a design and manufacturing perspective.  Power-efficient products are certainly important to the consumer, but power-efficient designs also generally mean more clock speed headroom and higher yields, which, at the end of the day, drop right to the bottom line.  For very complex, highly scalable processor architectures like CPUs and GPUs, bleeding-edge manufacturing processes are critical to delivering a competitive product with reasonable cost targets, power consumption, and profit margins.

Today's NVIDIA GeForce 8800 GT launch is an example of what it takes to bring a highly complex processor architecture to fruition in today's competitive arena.  The new GeForce 8800 GT is an entirely new GPU core; well, sort of.  The NVIDIA G92 GPU under the hood of the new GeForce 8800 GT is essentially a die-shrink and cut-back of NVIDIA's G80 architecture, with a few enhancements and optimizations.  With the migration to TSMC's 65nm process technology, the new GeForce 8800 GT targets solid mid-range performance, lower power consumption and heat, and a competitive price.  As we said, complex GPUs need top-notch manufacturing processes to be competitive in this game.  Let's have a look at the way NVIDIA thinks it was meant to be played.

 

NVIDIA GeForce 8800 GT
Features & Specifications

Fabrication:  65nm
Number of Transistors:  754 Million
Core Clock (includes dispatch, texture units, and ROPs):  600MHz
Shader Clock (Stream Processors):  1.5GHz
Stream Processors:  112
Memory Clock:  900MHz (1800MHz DDR)
Memory Interface:  256-bit
Memory Bandwidth:  57.6 GB/s
Frame Buffer Size:  512MB
ROPs:  16
Texture Fill Rate (billions of bilinear-filtered texels/s):  33.6
HDCP Support:  Yes
HDMI Support:  Yes
Connectors:  2 x Dual-Link DVI-I, 7-Pin TV Out
RAMDACs:  400MHz
Bus Technology:  PCI Express 2.0
Max Board Power:  110 Watts

NVIDIA unified architecture:

Fully unified shader core dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations, delivering up to 2x the gaming performance of prior generation GPUs.

Full Microsoft DirectX 10 Support:
World's first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects.

NVIDIA SLI Technology:
Delivers up to 2x the performance of a single graphics card configuration for unequaled gaming experiences by allowing two cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI dramatically scales performance on today's hottest games.

NVIDIA Lumenex Engine:
Delivers stunning image quality and floating point accuracy at ultra-fast frame rates.
16x Anti-aliasing: Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges.

128-bit floating point High Dynamic-Range (HDR):
Twice the precision of prior generations for incredibly realistic lighting effects - now with support for anti-aliasing.

NVIDIA Quantum Effects Technology:
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU - all while freeing the CPU to run the game engine and AI.

NVIDIA nView Multi-Display Technology:
Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

Dual 400MHz RAMDACs:
Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates - up to 2048x1536@85Hz.

Dual Dual-link DVI Support:
Able to drive the industry's largest and highest resolution flat-panel displays up to 2560x1600.


NVIDIA PureVideo HD Technology:
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.

Discrete, Programmable Video Processor:
NVIDIA PureVideo HD is a discrete programmable processing core in NVIDIA GPUs that provides superb picture quality and ultra-smooth movies with low CPU utilization and power.

Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies.

HDCP Capable:
Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays.

Spatial-Temporal De-Interlacing:
Sharpens HD and standard definition interlaced content on progressive displays, delivering a crisp, clear picture that rivals high-end home-theater systems.

High-Quality Scaling:
Enlarges lower resolution movies and videos to HDTV resolutions, up to 1080i, while maintaining a clear, clean image. Also provides downscaling of videos, including high-definition, while preserving image detail.

Inverse Telecine (3:2 & 2:2 Pulldown Correction):
Recovers original film images from films-converted-to-video (DVDs, 1080i HD content), providing more accurate movie playback and superior picture quality.

Bad Edit Correction:
When videos are edited after they have been converted from 24 to 25 or 30 frames, the edits can disrupt the normal 3:2 or 2:2 pulldown cadences. PureVideo HD uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural looking video.
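Purely as an illustration (this is not NVIDIA's actual algorithm), the 3:2 pulldown cadence described above, and its inversion, can be sketched in a few lines of Python: four 24fps film frames are expanded into ten video fields, and inverse telecine recovers the originals by collapsing the repeated fields.

```python
def telecine_32(frames):
    """Expand film frames into fields with a 3:2 cadence:
    frame 1 held for 3 fields, frame 2 for 2, and so on."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

def inverse_telecine(fields):
    """Recover the original frames by collapsing runs of repeated
    fields (a simplification: assumes adjacent film frames differ)."""
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames

# Four film frames become ten fields (five 30fps interlaced frames).
fields = telecine_32(["A", "B", "C", "D"])
print(len(fields))                 # 10
print(inverse_telecine(fields))    # ['A', 'B', 'C', 'D']
```

A "bad edit" in this model is a cut made in field-space that breaks the 3-2-3-2 run lengths, which is why the de-interlacer has to re-detect the cadence rather than assume it.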

Video Color Correction:
NVIDIA's Color Correction Controls, such as Brightness, Contrast and Gamma Correction let you compensate for the different color characteristics of various RGB monitors and TVs ensuring movies are not too dark, overly bright, or washed out regardless of the video format or display type.

Integrated SD and HD TV Output:
Provides world-class TV-out functionality via Composite, S-Video, Component, or DVI connections. Supports resolutions up to 1080p depending on connection type and TV capability.

Noise Reduction:
Improves movie image quality by removing unwanted artifacts.

Edge Enhancement:
Sharpens movie images by providing higher contrast around lines and objects.

A quick take of the above specifications shows this GPU has a few more transistors than other members of the G80 family, weighing in at 754 million versus 681 million.  This is because NVIDIA has enhanced a few key functional blocks within the GPU engine, which we will get into in detail shortly.  In addition, the 65nm fab process has obviously been kind to the GPU, affording it the ability to scale to a robust 600MHz core clock and a 1500MHz shader clock.  Comparatively, a GeForce 8800 GTX GPU runs at 575MHz/1350MHz core and shader clocks.  Let's dig a bit deeper into the 8800 GT architecture, next. 
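As a quick sanity check, the throughput figures in the spec table follow directly from the clocks and bus width.  The 56-texture-unit count used below is our assumption about the G92 configuration (it is not listed in the table above):

```python
def memory_bandwidth_gbps(mem_clock_mhz, bus_width_bits, ddr_multiplier=2):
    """Peak memory bandwidth in GB/s: effective clock x bus width in bytes."""
    return mem_clock_mhz * 1e6 * ddr_multiplier * (bus_width_bits / 8) / 1e9

def texel_fill_rate_gtps(core_clock_mhz, texture_units):
    """Peak bilinear texel fill rate in GTexels/s: core clock x texture units."""
    return core_clock_mhz * 1e6 * texture_units / 1e9

print(memory_bandwidth_gbps(900, 256))  # 57.6, matching the spec table
print(texel_fill_rate_gtps(600, 56))    # 33.6, matching the spec table
```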

Great write-up. Shame about the temps and noise; however, Zalman will hopefully come to the rescue on that one. Going to have to wait for AMD's reply before I decide whether to invest, though.

I haven't run 3DMark06 lately, so maybe the new drivers are a factor, but the overall score that the GT puts up is almost exactly the same as my Ultra... am I missing something???

I couldn't help but say WTF to myself when I saw the score!!!

Remember that the CPU is factored into the 3DMark06 score, so if your CPU is slower, that will affect it. As will drivers, of course. If you haven't run 3DMark06 in a while, there has been a lot of improvement in NVIDIA's drivers that you're probably not accounting for, either.

It must be the drivers, because everything else is equal.

I'll try later and see/post what I get.

 

I was sort of surprised the card sounded a bit whiny to me, and some of the other sites didn't notice. I, as did other sites, ran my tests on an open-air bench, but it was pretty warm in the lab one day during testing and I heard the fan spin up a bit. Now, imagine putting the card in a hot case (don't imagine; I'll do it soon and let you know). I think things could get a little louder still, but not to the point where it's obtrusive.

I overlooked the fact that a quad-core CPU was used in testing. My score with the GFX @ stock is 12,465. While I find the 8800s to be very good cards, I have every intention of upgrading to NVIDIA's next flagship so I can run Crysis faster. Quite frankly, I am disappointed with the framerate my card puts out in Crysis at high settings, but I thankfully anticipated such circumstances and put $$$ aside for the next card.

How's it run Crysis?

 

Jump ship on my GTX and SLI two of them bad boys... Big Smile 

Meister, the NVIDIA 9800 series is supposed to be announced around the middle of November, so it might be worth a bit of a wait on that one.

I have read the review that HotHardware did on the 8800 GT and was wondering how you benchmarked Crysis. Did you use the Crysis Benchmark Tool?
