
GPU Tech: NVIDIA Talks Fermi, Unveils Nexus

Other features of Fermi include support for C++ (current-generation CUDA products only support C) and, of course, the already oft-repeated fact that this core is some three billion transistors in size. NVIDIA has publicly tried to downplay the significance of this, claiming that analysts have always expressed concerns over the size of the company's chips, but there's no arguing that three billion transistors is a lot. Generally speaking, the more transistors in a product, the greater the chance something will go wrong when fabbing it; NVIDIA is taking something of a risk in building Fermi on a monolithic core instead of aiming for a mid-range, mid-size core and dual-GPU configurations à la AMD.
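As a rough illustration of what C++ support in CUDA device code implies, the sketch below shows a templated class used inside a kernel. This is not NVIDIA sample code; the names (Scaler, scale_kernel) and the workload are hypothetical.

```cuda
// Hypothetical sketch of C++ constructs in device code; identifiers are
// illustrative, not taken from NVIDIA's materials.
#include <cstdio>
#include <cuda_runtime.h>

// A small templated class usable inside a kernel -- the kind of C++ feature
// a C-only CUDA toolchain would not accept in device code.
template <typename T>
struct Scaler {
    T factor;
    __device__ T apply(T x) const { return x * factor; }
};

template <typename T>
__global__ void scale_kernel(T* data, Scaler<T> s, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = s.apply(data[i]);
}

int main()
{
    const int n = 1 << 20;
    float* d_buf = NULL;
    cudaMalloc((void**)&d_buf, n * sizeof(float));
    cudaMemset(d_buf, 0, n * sizeof(float));

    // Instantiate the template for float and launch; the same source works
    // unchanged for double, int, and so on.
    Scaler<float> s = { 2.0f };
    scale_kernel<<<(n + 255) / 256, 256>>>(d_buf, s, n);
    cudaDeviceSynchronize();

    cudaFree(d_buf);
    return 0;
}
```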
 

Fermi's block-level diagram. The increased amount of configurable shared memory/L1 cache per SM and the 768KB of unified L2 are obvious improvements over GT200, but NVIDIA has made changes to boost core execution efficiency across the board.
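The configurable L1/shared-memory split is something the CUDA runtime exposes per kernel. The sketch below is illustrative only (the stencil kernel is a made-up example), showing how a program might ask for the larger L1 configuration when a kernel uses no shared memory.

```cuda
// Illustrative sketch of steering Fermi's configurable per-SM memory toward
// L1 cache for a kernel that uses no __shared__ memory; the kernel (stencil)
// is a made-up example, not from the article.
#include <cuda_runtime.h>

__global__ void stencil(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i > 0 && i < n - 1)
        out[i] = 0.25f * in[i - 1] + 0.5f * in[i] + 0.25f * in[i + 1];
}

int main()
{
    const int n = 1 << 20;
    float *d_in = NULL, *d_out = NULL;
    cudaMalloc((void**)&d_in,  n * sizeof(float));
    cudaMalloc((void**)&d_out, n * sizeof(float));

    // Prefer the larger L1 configuration for this kernel; on parts without a
    // configurable cache the preference is simply ignored.
    cudaFuncSetCacheConfig(stencil, cudaFuncCachePreferL1);

    stencil<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
    cudaDeviceSynchronize();

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```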

Dig into NVIDIA's whitepapers on Fermi, and you may end up thinking that the company designed a compute engine that happens to be capable of handling graphics rather than the other way around. Many of Fermi's changes should translate across GPU computation and gaming; there's no inherent reason why both sides can't benefit from certain improvements. Certain features, like support for 64-bit addressing, however, are rather obviously aimed at the scientific computing market rather than the needs of the game industry.
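To make the 64-bit addressing point concrete, here is a hedged sketch of the kind of single allocation that only a 64-bit address space can satisfy; the size and the d_field name are illustrative, not from NVIDIA's documentation.

```cuda
// Illustrative sketch: a single allocation larger than 4GB is only expressible
// with 64-bit sizes and pointers -- the scientific-computing case noted above.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    const size_t bytes = 6ULL << 30;  // 6 GB, beyond any 32-bit address space
    double* d_field = NULL;           // d_field is a made-up name for a large data set

    cudaError_t err = cudaMalloc((void**)&d_field, bytes);
    if (err != cudaSuccess) {
        // Boards with less memory, or 32-bit builds, will land here.
        printf("allocation failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("allocated %zu bytes\n", bytes);
    cudaFree(d_field);
    return 0;
}
```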

For the moment, NVIDIA is talking about Fermi strictly as a scientific computing part; non-Tesla versions will come, of course, but they aren't the company's focus today. As for when those announcements will become reality, that's anyone's guess. Jen-Hsun refused to comment on when we might see Fermi cores ship beyond pointing to a Q4 2009/Q1 2010 timeframe. Fermi's evolution is a demonstration of how divergent AMD's and NVIDIA's roadmaps have become. While AMD is staying focused on the consumer and workstation space, NVIDIA is adamant in its belief that scientific computing and major data set crunching (as well as consumer app acceleration) are the waves of the future. On paper, Fermi appears to be a strong competitor, but if it takes NVIDIA nine more months to push GeForce cards out the door, it could find itself matched against an even newer series of Radeon cards, rather than the 5800 products currently on the market.

When we discussed NVIDIA's Tegra platform, we noted that the company's lack of a CPU design would undoubtedly impact its own Tegra product development. With Fermi, NVIDIA has built an architecture with features similar to what you might expect to find in a massively parallel processor. In order to help developers take full advantage of the processor, NVIDIA has developed its own heterogeneous programming environment.
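As a loose illustration only (not NVIDIA sample code), the sketch below shows the shape of a mixed CPU/GPU program, with each side handling part of the work, which is the kind of application a heterogeneous environment like Nexus is meant to let developers debug and profile as a whole.

```cuda
// Loose illustration of a heterogeneous workload: the GPU squares one array
// while the CPU sums another, and the host combines the results.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void square(float* data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= data[i];
}

int main()
{
    const int n = 1 << 16;
    float* h_gpu_work = new float[n];
    float* h_cpu_work = new float[n];
    for (int i = 0; i < n; ++i) { h_gpu_work[i] = 1.0f; h_cpu_work[i] = 2.0f; }

    // GPU half of the work.
    float* d_buf = NULL;
    cudaMalloc((void**)&d_buf, n * sizeof(float));
    cudaMemcpy(d_buf, h_gpu_work, n * sizeof(float), cudaMemcpyHostToDevice);
    square<<<(n + 255) / 256, 256>>>(d_buf, n);

    // CPU half of the work overlaps with the asynchronous kernel launch.
    double cpu_sum = 0.0;
    for (int i = 0; i < n; ++i) cpu_sum += h_cpu_work[i];

    // Collect GPU results (this copy synchronizes) and combine on the host.
    cudaMemcpy(h_gpu_work, d_buf, n * sizeof(float), cudaMemcpyDeviceToHost);
    double gpu_sum = 0.0;
    for (int i = 0; i < n; ++i) gpu_sum += h_gpu_work[i];
    printf("combined: %f\n", cpu_sum + gpu_sum);

    cudaFree(d_buf);
    delete[] h_gpu_work;
    delete[] h_cpu_work;
    return 0;
}
```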


If only they just had a chance in Hades of obtaining a license to the x86 instruction set.

That's where this is heading, after all... a fully nVidia system to compete with Larrabee and AMD/ATI.


It's also why I think they are starting to look at disabling PhysX on systems using non-NVIDIA hardware.

Though Fermi seems to be specifically a shot across the bow of IBM's Cell and AMD's Opteron processors.


Take a gander at this:

Nvidia fakes Fermi boards at GTC: http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc

If that's to be believed, it indicates extreme desperation by nVidia to detract from ATI's recent thunder.


Yup, they just want to make people forget about the 5800 series by saying "Nvidia is still here!  We'll have a product to compete... in a few months!"

Nvidia still hasn't dropped prices on the 285 to compete with ATi.  They are trying to ignore the superior competition, and I can't help but feel it works.  People buy GTX 275's instead of 4890s all the time even though the latter is cheaper.  The brand name is strong.  I think SLI scales better, but for single card use I see no need to buy Nvidia because they cost more.



This is a very valid point, Crisis Causer. I still don't really get Nvidia's mentality. I have long been an ATI user, since my first All-in-Wonder when it was initially released, on the first PC I ever built. I have looked at Nvidia's hardware but have only ever bought one of their cards. That has largely been because of price versus performance and the current software platforms (read: games) at the time. I have never understood such a great difference in a GPU's price versus its performance level. I am not saying Nvidia has not had better-performing parts; it's just that for the minimal gain, the extra 200 bucks for the name generally makes no sense to me. This is still the same, and with this round I think Nvidia is just putting this out as a "hello, we're here" gesture.

I do, however, find their different focus very interesting, and I think in the long run it will be beneficial, as a CPU and a GPU operate on different parts of the spectrum. The combination in general leads to a much stronger overall hardware platform, so in the long run this will be beneficial to us, the users! When operations are truly divided between the two halves of a CPU/GPU pair for the best efficiency, with each handling the operations it can complete most efficiently, it effectively doubles the hardware's capability end to end. Or at least seemingly so!


ATI's thunder is due to the price of their products.

NVIDIA still makes a few faster GPUs, but they're not proportionately priced compared to ATI's offerings.

ATI has a better understanding of the financial pain we're all feeling these days and they're capitalizing on it with prices that are grounded in reality.


Oh, I like both equally. :)

I generally buy nVidia right now. But, unless nVidia introduces a game-changer, I'm very likely to switch when the open ATI drivers for Linux mature.


As long as their drivers stay better than the horrible-quality ones they put out in the '90s and early '00s, ATI seems to be doing well to me :P


The concern is that Nvidia appears to be replicating the DEC Alpha experience with Fermi.

DEC too created a wonderful design, which prefigured the evolution of processors over the next 20 years, but DEC lacked the resources to profit from it.

Unless Nvidia finds a substantial new profit source soon, Fermi, and Nvidia with it, may suffer a similar fate, particularly given the current business environment.

The decision to curtail chipset development suggests that money is already insufficient to support all of Nvidia's programs.

The imperative for Nvidia is therefore to get fresh support, ideally from some entity such as Livermore, which would underpin additional equity investment.
