GPU Tech: NVIDIA Talks Fermi, Unveils Nexus

This post has 9 Replies | 2 Followers

Top 10 Contributor
Posts 26,087
Points 1,183,275
Joined: Sep 2007
Forums Administrator
News Posted: Fri, Oct 2 2009 12:22 PM
If you've followed the early announcements concerning Fermi, NVIDIA's next-generation GPU architecture, you should already be aware that the new GPU core is both an evolution of the existing GT200 architecture and a significant new design in its own right. NVIDIA made it clear early on that they weren't going to talk about GeForce products at the conference this year; instead, they have discussed Fermi as a Tesla successor and a future high-end engine primed to drive the GPGPU industry.
 

So that's 16 times 32...carry the four...
 

While it carries many of the same features as the GT200 series, Fermi is distinctly its own animal. NVIDIA's Fermi whitepaper describes the new architecture as follows: "G80 was our initial vision of what a unified graphics and computing parallel processor should look like. GT200 extended the performance and functionality of G80. With Fermi, we have taken all we have learned from the two prior processors and all the applications that were written for them, and employed a completely new approach to design to create the world’s first computational GPU."
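For anyone who hasn't dabbled in GPGPU, here is a minimal CUDA sketch of the sort of data-parallel work a compute-oriented part like Tesla or Fermi is built to chew through. It is purely illustrative; the kernel name, sizes, and launch configuration are my own, not anything taken from the article or the whitepaper.

// Illustrative only: a classic SAXPY kernel, the kind of embarrassingly
// parallel arithmetic a "computational GPU" runs across thousands of threads.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host-side data
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side copies
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover every element
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);   // expect 5.0 (3*1 + 2)
    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}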

GPU Tech: NVIDIA Talks Fermi, Unveils Nexus
  • | Post Points: 65
Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Fri, Oct 2 2009 7:17 PM

If only they had a chance in Hades of obtaining a license to the x86 instruction set.

That's where this is heading, after all... a fully nVidia system to compete with Larrabee and AMD/ATI.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

  • | Post Points: 20
Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA

It's also why I think they're starting to look at disabling PhysX on systems that use non-NVIDIA hardware.

Though Fermi seems to be specifically a shot across the bow of IBM's Cell and AMD's Opteron processors.

  • | Post Points: 5
Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Sat, Oct 3 2009 8:44 AM

Take a gander at this:

Nvidia fakes Fermi boards at GTC: http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc

If that's to be believed, it indicates extreme desperation on nVidia's part to distract from ATI's recent thunder.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

  • | Post Points: 20
Top 10 Contributor
Posts 8,622
Points 103,875
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
Membership Administrator
Moderator
realneil replied on Sat, Oct 3 2009 11:27 AM

ATI's thunder is due to the price of their products.

NVIDIA still makes a few faster GPUs, but they're not priced proportionately compared to ATI's offerings.

ATI has a better understanding of the financial pain we're all feeling these days and they're capitalizing on it with prices that are grounded in reality.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

  • | Post Points: 20
Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Sat, Oct 3 2009 12:23 PM

Oh, I like both equally. :)

I generally buy nVidia right now. But, unless nVidia introduces a game-changer, I'm very likely to switch when the open ATI drivers for Linux mature.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

  • | Post Points: 20
Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA
Soupstyle replied on Sat, Oct 3 2009 12:57 PM

As long as their drivers stay better than the horrible ones they put out in the '90s and early '00s, ATI seems to be doing well to me :P

  • | Post Points: 5
Top 75 Contributor
Posts 1,677
Points 24,005
Joined: Aug 2002
Location: Pittsburgh, Pennsylvania

Yup, they just want to make people forget about the 5800 series by saying "Nvidia is still here!  We'll have a product to compete... in a few months!"

Nvidia still hasn't dropped prices on the 285 to compete with ATi. They are trying to ignore the superior competition, and I can't help but feel it's working. People buy GTX 275s instead of 4890s all the time, even though the latter is cheaper. The brand name is strong. I think SLI scales better, but for single-card use I see no need to buy Nvidia when their cards cost more.

Hello

  • | Post Points: 20
Top 10 Contributor
Posts 4,830
Points 45,790
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Wed, Oct 7 2009 2:17 PM


This is a very valid point, Crisis Causer. I still don't really get Nvidia's mentality. I have long been an ATI user, ever since my first All-in-Wonder when it was initially released, in the first PC I ever built. I have looked at Nvidia's hardware but have only ever bought one of their cards, largely because of price versus performance and the software platforms (read: games) of the time. I have never understood such a great difference in a GPU's price versus its performance level. I am not saying Nvidia hasn't had better-performing parts; it's just that for the minimal gain, the extra 200 bucks for the name makes no sense to me. That hasn't changed, and this round I think Nvidia is just putting this out as a "hello, we're here" gesture.

I do, however, find their different focus very interesting, and I think it will be beneficial in the long run, since a CPU and a GPU operate on different parts of the spectrum. Combining the two makes for a much stronger overall hardware platform, which ultimately benefits us, the users. When work is genuinely split across a CPU/GPU pair, with each operation routed to whichever unit will complete it most efficiently, the hardware's end-to-end capability effectively doubles. Or at least seemingly so!
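A rough sketch of that idea, assuming plain CUDA on a CPU+GPU box: the function names and the crossover threshold below are made up for illustration, not anything NVIDIA or AMD actually ships. The point is just that each operation gets routed to whichever unit should finish it most efficiently.

#include <cstdio>
#include <cuda_runtime.h>

// Throughput side: thousands of lightweight GPU threads.
__global__ void scale_gpu(int n, float a, float *x) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= a;
}

// Latency side: one fast CPU core, no transfer overhead.
static void scale_cpu(int n, float a, float *x) {
    for (int i = 0; i < n; ++i) x[i] *= a;
}

// Hypothetical dispatcher: small jobs aren't worth the PCIe round trip,
// big data-parallel jobs are. The threshold is an arbitrary placeholder.
static void scale(int n, float a, float *x, int gpu_threshold) {
    if (n < gpu_threshold) {
        scale_cpu(n, a, x);
    } else {
        float *dx;
        size_t bytes = n * sizeof(float);
        cudaMalloc((void **)&dx, bytes);
        cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
        scale_gpu<<<(n + 255) / 256, 256>>>(n, a, dx);
        cudaMemcpy(x, dx, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dx);
    }
}

int main() {
    float small_batch[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    scale(8, 2.0f, small_batch, 4096);   // tiny job: stays on the CPU
    printf("%f\n", small_batch[0]);      // prints 2.000000
    return 0;
}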

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: Oct 2009
etudiant replied on Thu, Oct 8 2009 10:05 PM

The concern is that Nvidia appears to be replicating the DEC Alpha experience with Fermi.

DEC too created a wonderful design, which prefigured the evolution of processors over the next 20 years, but DEC lacked the resources to profit from it.

Unless Nvidia finds a substantial new profit source soon, both Fermi and the company may suffer a similar fate, particularly given the current business environment.

The decision to curtail chipset development suggests that money is already insufficient to support all of Nvidia's programs.

The imperative for Nvidia is therefore to get fresh support, ideally from some entity such as Livermore, which would underpin additional equity investment.
