Date: Apr 27, 2006
Author: Matt Beauvais

With most of the fuss revolving around video cards such as Nvidia's 7900 series and ATI's X1900 series these days, it's easy to overlook entry-level offerings like the GeForce 7300. That is, unless you don't have a ton of coin to throw down for a high-end card. As great as gaming at high resolutions with anti-aliasing and anisotropic filtering on a 20" LCD is, it's simply not an affordable option for most of us. Of course, you could always settle for on-board video and save some money, right? Well, if you plan on playing any games or watching HD videos without dropping frames, probably not. So what do you do? What can you do if you don't have enough cash for a graphics powerhouse, but can't stand the limitations of most integrated video solutions?

Nvidia aims to help out the multitude of people stuck in that exact situation. With their previous GPU architecture, Nvidia introduced TurboCache technology, at first with their GeForce 6200 card. TurboCache allocates system memory to the video card by way of the PCI Express interface, and while it's certainly not as desirable as having copious amounts of high-speed on-board RAM, it is a useful way of adding more effective total frame buffer memory to an entry-level card, while also keeping prices down.

Today we're going to look at MSI's GeForce NX7300GS TD256E which features TurboCache technology. While the NX7300GS TD256E may not sport the same number of pixel shader processors or vertex shaders as a 7900 series product, it does have essentially the same feature set, including support for Shader Model 3.0 and PureVideo. Take a look...

MSI GeForce NX7300GS
Exploring GeForce 7 Series Technology

NVIDIA CineFX 4.0 Shading Architecture

  • Vertex Shaders
  • Support for Microsoft DirectX 9.0 Vertex Shader 3.0
  • Displacement mapping
  • Geometry instancing
  • Infinite length vertex programs

  • Pixel Shaders
  • Support for DirectX 9.0 Pixel Shader 3.0
  • Full pixel branching support
  • Support for Multiple Render Targets (MRTs)
  • Infinite length pixel programs

  • Next-Generation Texture Engine
  • Accelerated texture access
  • Up to 16 textures per rendering pass
  • Support for 16-bit floating point format and 32-bit floating point format
  • Support for non-power of two textures
  • Support for sRGB texture format for gamma textures
  • DirectX and S3TC texture compression

  • Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes

API Support

  • Complete DirectX support, including the latest version of Microsoft DirectX 9.0 Shader Model 3.0
  • Full OpenGL support, including OpenGL 2.0

64-Bit Texture Filtering and Blending

  • Full floating point support throughout entire pipeline
  • Floating point filtering improves the quality of images in motion
  • Floating point texturing drives new levels of clarity and image detail
  • Floating point frame buffer blending gives detail to special effects like motion blur and explosions

NVIDIA Intellisample 4.0 Technology

  • Advanced 16x anisotropic filtering (with up to 128 Taps)
  • Blistering- fast antialiasing and compression performance
  • Gamma-adjusted rotated-grid antialiasing removes jagged edges for incredible image quality
  • Transparent multisampling and transparent supersampling modes boost antialiasing quality to new levels
  • Support for normal map compression
  • Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
  • Fast z-clear

NVIDIA UltraShadow II Technology

  • Designed to enhance the performance of shadow-intensive games

NVIDIA Digital Vibrance Control (DVC) 3.0 Technology

  • DVC color controls
  • DVC image sharpening controls

NVIDIA PureVideo Technology

  • Adaptable programmable video processor
  • High-definition MPEG-2 and WMV9 hardware acceleration
  • Spatial-temporal de-interlacing
  • Inverse 2:2 and 3:2 pull-down (Inverse Telecine)
  • 4-tap horizontal, 5-tap vertical scaling
  • Overlay color temperature correction
  • Microsoft Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
  • Integrated HDTV output

Composited Desktop Hardware Engine

  • Video post-processing
  • Real-time desktop compositing
  • Accelerated antialiased text rendering
  • Pixel shader-driven special effects and animation

Advanced Display Functionality

  • Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz
  • Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
  • Full NVIDIA nView multi-display technology capability

Operating Systems

  • Windows XP/XP 64/ME/2000
  • Linux
  • Macintosh OS X

Taking a Closer Look


Taking a Closer Look
Itty bitty graphics card 

The NX7300GS TD256E is a small video card by today's standards. It's about 8 inches long and only half-height, so there shouldn't be any problems fitting it into any system, which is great if you're planning on throwing the NX7300GS TD256E into an HTPC setup. The aluminum cooler looks to be a bit much for this card; however, MSI officially supports a limited level of overclocking, so consider it extra insurance against any kind of heat-related damage.

The card sports 256MB of on-board Hynix memory, with two of the chips visible on the back, and the other two hidden by the cooler on the front. With a single VGA port and a single DVI port, the MSI NX7300GS has dual-monitor support and also features an S-Video port which can be hooked up to a TV or similar display.



The NX7300GS TD256E's stock clock speeds are 550MHz for the Core and 810MHz (DDR) for the memory. The clock speeds may seem high considering this is an entry-level card, but you have to keep in mind that it features only a 64-bit memory interface, 4 pixel pipelines, and 3 vertex shaders. With these specs, the NX7300GS TD256E is far more powerful than any integrated video solution, but more expensive mid-range and high-end cards will of course offer better performance.
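To put that 64-bit memory interface in perspective, peak local memory bandwidth can be estimated from the bus width and the effective data rate. This is just a back-of-the-envelope sketch using the figures above (treating the quoted 810MHz DDR figure as the effective transfer rate):

```python
# Rough peak local memory bandwidth estimate for the NX7300GS TD256E.
# bandwidth (bytes/s) = effective transfer rate * bus width in bytes
effective_rate_hz = 810_000_000   # 810MHz DDR-effective memory clock
bus_width_bits = 64               # 64-bit memory interface
bus_width_bytes = bus_width_bits // 8

bandwidth_gb_s = effective_rate_hz * bus_width_bytes / 1e9
print(f"Peak local bandwidth: {bandwidth_gb_s:.2f} GB/s")  # ~6.48 GB/s
```

That roughly 6.5GB/s of local bandwidth is a fraction of what a 128-bit or 256-bit card offers at similar clocks, which is why the narrow bus, not the clock speeds, is the card's main constraint.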

Software and Accessories
What do you get for $80?

The NX7300GS TD256E's bundle was decent, but not on the level of some of MSI's other video cards. In the box we found a quick start guide and a small user's manual to help users get things going. For accessories, MSI has provided a common but useful DVI to DB15 adaptor and an S-Video cable. The packaged software is your basic drivers & utilities CD, plus a racing game called "Juiced". Juiced is no Need For Speed, but it will give you a decent idea of what the NX7300GS TD256E is capable of. If we were reviewing a high-end graphics card, we'd expect a little more, but for around $80, you can't expect a plethora of up-to-date software to be thrown in.


Special Features & Test System


A couple of extra features available on the MSI NX7300GS are Nvidia's TurboCache Technology and MSI's Dynamic Overclocking Technology, or D.O.T.

Special Features
TurboCache & D.O.T.(Dynamic Overclocking Technology)

TurboCache: TurboCache uses a portion of your computer's system memory as video memory, which has its pros and cons. The good news is that more frame buffer memory is made available to the GPU. The bad news is that it takes up a portion of your system memory and bandwidth. Nvidia recommends at least 512MB of RAM to benefit from TurboCache in general; since the MSI NX7300GS TD256E already has 256MB of on-board memory, though, having 256MB or 512MB of system memory won't increase your effective video memory. However, if you're sporting 1GB or more, the card will allocate 256MB of that memory to give you a total of 512MB of effective video memory. TurboCache dynamically uses system memory depending on the amount of RAM installed.
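The allocation rule described above can be sketched as a small function. To be clear, this is only an illustration of the behavior as described for this particular card, not Nvidia's actual driver logic:

```python
def effective_video_memory(onboard_mb: int, system_ram_mb: int) -> int:
    """Approximate TurboCache behavior for a card with 256MB on board:
    with 1GB or more of system RAM, the driver borrows up to 256MB more,
    for 512MB of total effective frame buffer; below that, no benefit."""
    if system_ram_mb >= 1024:
        borrowed = 256          # up to 256MB of system RAM is allocated
    else:
        borrowed = 0            # below 1GB, nothing extra is borrowed
    return onboard_mb + borrowed

print(effective_video_memory(256, 512))    # 256 -- no benefit below 1GB
print(effective_video_memory(256, 1024))   # 512 -- full TurboCache benefit
```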

D.O.T.: MSI's Dynamic Overclocking Technology is essentially a warranty-covered overclock. MSI claims it delivers "2% ~ 10% / 6-Scale Dynamic Overclocking Performance". The key benefit of D.O.T. is that users don't have to manually change the core and memory speeds of their card; MSI's proprietary software handles that automatically. While any kind of warranty-covered overclock is nice, there is usually still some performance to be found by bumping up clock speeds manually using software like RivaTuner or Coolbits.


HOW WE CONFIGURED THE TEST SYSTEMS: We tested our cards on a Foxconn WinFast 6150K8MA motherboard, powered by an AMD Athlon 64 3000+ (Winchester core) processor and 1GB of low-latency Mushkin Redline RAM. The first thing we did when configuring the test system was enter the BIOS and load the "High Performance Defaults." The hard drive was then formatted, and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest chipset drivers available, installed all of the other necessary drivers for the rest of our components, and removed Windows Messenger from the system. Auto-Updating and System Restore were disabled, the hard drive was defragmented, and a 768MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test System
A mid to low-end gaming rig

Hardware Used:
Processor - AMD Athlon 64 3000+ (1.8GHz, Winchester core)
Motherboard - Foxconn WinFast 6150K8MA (Nvidia nForce 4 chipset)
Video Cards - MSI GeForce NX7300GS TD256E
              XFX GeForce 6600 DDR2 256MB
              On-board video (GeForce 6150)
Memory - 1GB Mushkin Redline PC3200 RAM
Audio - Integrated on-board
Hard Drive - Western Digital "Caviar" 80GB - ATA 100

Relevant Software:
Operating System - Windows XP Professional SP2
Chipset Drivers - nForce Drivers v6.82
DirectX - DirectX 9.0c
Video Drivers - NVIDIA Forceware v84.21

Benchmarks Used:
Synthetic (DX) - 3DMark05 v1.2.0
Synthetic (DX) - 3DMark06 v1.0.2
DirectX - Half Life 2*
DirectX - Far Cry*
OpenGL - Quake 4*

* - Custom Test (HH Exclusive demo)

Benchmarking: 3DMark05

Performance Comparisons with 3DMark05 v1.2.0
Starting out synthetic

3DMark05 is a synthetic benchmark that requires a DirectX 9.0 compliant video card, with support for Pixel Shaders 2.0 or higher, to render all of the various modules that comprise the suite. To generate its final "score", 3DMark05 runs three different simulated game tests and uses each test's framerate in the final score tabulation. Fillrate, memory bandwidth, and compute performance all have a measurable impact on performance in this benchmark. We ran 3DMark05's default test (1024 x 768) on all of the cards and configurations we tested, and have the overall results posted for you below.

We begin our round of testing with Futuremark's 3DMark05. While 3DMark06 has been out for a while now, millions of people have already benchmarked their cards using 3DMark05, so we thought we'd give you some scores for the sake of comparison. With a score of 1839, the MSI NX7300GS falls right between the more expensive XFX GeForce 6600 DDR2 and the 6150 IGP, besting the integrated solution by a significant amount.

Benchmarking: 3DMark06



Performance Comparisons with 3DMark06 v1.0.2
Details: http://www.futuremark.com/products/3dmark06/

Futuremark recently launched a brand-new version of their popular benchmark, 3DMark06. The new version of the benchmark is updated in a number of ways, and now includes not only Shader Model 2.0 tests, but Shader Model 3.0 and HDR tests as well. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, SM 2.0 and HDR / SM3.0 tests are weighted and the CPU score is factored into the final tally as well.


The MSI NX7300GS earned a total score of 816 in Futuremark's latest benchmark, with 315 points in the Shader Model 2.0 test and 281 points in the HDR/Shader Model 3.0 test. Note that we didn't receive a score for our on-board video test, however, as the chipset doesn't meet the hardware requirements of this benchmark. This is an important point to consider, because many on-board solutions simply won't be able to run some applications that the NX7300GS can.

Benchmarking: Quake 4 v1.2

We'll be testing the games in this article using "High Quality" settings at resolutions of 800x600 and 1024x768, and we'll also be throwing in some scores using 2x anti-aliasing and 4x anisotropic filtering. For the budget-type card we're reviewing today, we feel these settings strike a good balance between performance and image quality.

Performance Comparisons with Quake 4
Details: http://www.quake4game.com/

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar. Like Doom 3, Quake 4 is an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 800 x 600 and 1024x768 without anti-aliasing enabled and then again with 2X AA and 4X aniso enabled simultaneously.


Our first test is Quake 4 running at 800x600. The MSI NX7300GS starts a trend here, falling between the Foxconn motherboard's on-board video (the Nvidia GeForce 6150) and our XFX GeForce 6600 DDR2. With the visual quality of the Doom 3 engine, you'll need a decent card to fully enjoy a game like Quake 4, as 39.7FPS is just the average framerate; during a good fight, framerates can drop into the low 20s.

The game becomes a little hard to play at 1024x768. It still looks decent with Medium Quality video settings, though, so either sticking to the lower resolution or reducing the image quality settings should offer a playable experience.

Benchmarking: Half-Life 2

Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Half Life 2
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time. So, when Valve announced Half-Life 2 was close to completion in mid-2003, gamers the world over sat in eager anticipation. Unfortunately, due to a compromised internal network, the theft of a portion of the game's source code, and a tumultuous relationship with the game's distributor, Vivendi Universal, we all had to wait until November '04 to get our hands on this classic. We benchmarked Half-Life 2 with a long, custom-recorded timedemo in the "Canals" map that takes us through both outdoor and indoor environments. This test was run using "High-Quality" settings at resolutions of 800 x 600 and 1024x768 without any anti-aliasing or anisotropic filtering and with 2X anti-aliasing and 4X anisotropic filtering enabled concurrently.


After benchmarking a plethora of cards with Valve's Source engine, we've been fairly impressed with how well it runs on lower-end hardware. At 800x600, you'll get a very playable 78.38FPS even when using 2xAA and 4xAF. The demo scores are a little misleading, though, as things such as sound processing, physics, and enemy AI aren't taking up any resources. When actually playing the game you'll notice some slowdowns every now and then, but it still runs well.

When upping the resolution to 1024x768, playing Half-Life 2 with AA or AF becomes a bit harder. An average framerate of 54.77FPS is still quite playable, but it gets pretty sluggish in some parts. Without AA or AF, the game runs much better while still looking fairly good.

Benchmarking: Far Cry v1.33

Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

If you've been on top of the gaming scene for some time, you probably know that FarCry was one of the most visually impressive games to be released in 2004. Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's game-play is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D gaming on the PC. We benchmarked the graphics cards in this article using the standard Regulator demo, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then again with 2X AA and 4X aniso enabled concurrently.


Performance scores from Far Cry have always been somewhere between those of Quake 4 and Half-Life 2. Fortunately with MSI's NX7300GS, this game is much more playable than Quake 4. With High Quality settings and no AA or AF, you'll get a high enough framerate to enjoy the game at 800x600.

Upping the resolution here really hurts the NX7300GS's performance. At the end of our game testing, we'd have to say you'll probably be sticking to settings like 800x600 with no AA/AF for most newer games. With the exception of Half-Life 2, the games we tested ran a bit slow, but were still playable for the most part. We're not done yet, though. It's time to see how much extra performance we can squeeze out of MSI's NX7300GS TD256E by overclocking its core and memory...

Overclocking Results & Temp Readings


Overclocking Results
Overclocking on a budget

For overclocking, we used the Coolbits registry hack to enable the overclocking features in the Forceware control panel. We let the software automatically detect the highest possible clock speeds for the card, which took us from the default 550MHz core and 810MHz memory to 600MHz core and 917MHz memory. To test for stability, we then loaded up ATITool and ran its stress test, which also scans for artifacts. After a clean 30-minute run, we decided to see how much further we could push the card. Unfortunately, the core wouldn't budge past 600MHz, but we were able to squeeze a little more from the memory, finally landing on 960MHz for a maximum overclock of 600MHz/960MHz. We then re-ran the same Far Cry demo we used earlier, with and without AA and AF applied.
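For reference, the headroom we found works out to a modest core bump and a larger memory bump. A quick sketch of the percentage gains over stock:

```python
def oc_gain_pct(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage increase from the stock clock to the overclocked clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core_gain = oc_gain_pct(550, 600)   # 550MHz -> 600MHz core
mem_gain = oc_gain_pct(810, 960)    # 810MHz -> 960MHz memory
print(f"Core: +{core_gain:.1f}%, Memory: +{mem_gain:.1f}%")
# Core: +9.1%, Memory: +18.5%
```

Given the card's narrow 64-bit bus, it's the memory overclock that's doing most of the work here.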

We got some decent results from our overclocking test. Without using any AA or AF we saw a gain of about 4FPS, and even when using AA and AF, we saw a similar 4FPS gain in performance. While it's not much, it does help smooth out a game that runs at under 60FPS.

Temperature Readings
Hot Hardware?

Throughout our testing, we took a few temperature readings using the card's built-in temp sensor. Without any overclock, our card idled at around 46°C-47°C. After a half hour of ATITool's artifact detector running, the core temperature peaked at 63°C. We also took readings with the card overclocked: a small increase of 1°C brought the idle temp to 47°C, and the card only reached a max of 65°C after a half hour with ATITool. It seems the relatively small cooler on this card does its job well, and it's fairly quiet too.

Final Thoughts and Conclusion

Even though Nvidia's GeForce 6150 chipset is arguably the most capable integrated graphics solution currently available, there is a lot of performance to be gained by using a dedicated graphics card, as our testing has shown. Most developers design their games to run on mid-range to high-end graphics hardware. With titles like F.E.A.R. and Bethesda's The Elder Scrolls IV: Oblivion, for example, you'll certainly need a powerful graphics card to really enjoy the game, so for those titles a GeForce 7300GS is probably not the best choice. However, less taxing games such as Far Cry and Half-Life 2 ran quite well on the 7300 GS. And the 7300 GS was able to run portions of the 3DMark06 test suite that the IGP could not.

The MSI NX7300GS TD256E is a good entry-level video card. While its performance obviously isn't going to break any records, it is a huge upgrade from any integrated video solution. It wasn't as fast as the GeForce 6600 DDR2 we used for reference, but then again, it wasn't designed to be. The MSI NX7300GS TD256E is available for around $75-$80 on-line, whereas the 6600 DDR2 usually sells for approximately $100 or more. That may not sound like a big gap, but the 6600 is priced roughly 25% higher, which matters a great deal to shoppers in this segment of the market.
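That premium is easy to quantify from the street prices quoted above (roughly $80 for the NX7300GS TD256E versus $100 for the 6600 DDR2):

```python
nx7300gs_price = 80    # approximate street price of the MSI NX7300GS TD256E
gf6600_price = 100     # approximate street price of the GeForce 6600 DDR2

# Premium is measured relative to the cheaper card's price.
premium_pct = (gf6600_price - nx7300gs_price) / nx7300gs_price * 100
print(f"The 6600 DDR2 costs about {premium_pct:.0f}% more")
# The 6600 DDR2 costs about 25% more
```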

The MSI NX7300GS TD256E's bundle left a little to be desired, but considering its price we won't be overly critical. The bottom line is this: users on a limited budget who are looking for an inexpensive upgrade from integrated video will be well served by the MSI NX7300GS TD256E. All things considered, we're giving the inexpensive MSI NX7300GS TD256E a 7.5 on our Heat Meter.

  • Fairly Cheap
  • PureVideo
  • TurboCache & D.O.T. Technology
  • Decent framerates in some games
  • Light Bundle




Content Property of HotHardware.com