NVIDIA GF100 Architecture and Feature Preview

This post has 45 Replies | 5 Followers

Top 10 Contributor
Posts 26,345
Points 1,191,215
Joined: Sep 2007
ForumsAdministrator
News Posted: Mon, Jan 18 2010 12:14 AM

Back in late September of last year, NVIDIA disclosed some information regarding its next-generation GPU architecture, codenamed "Fermi". At the time, actual product names and detailed specifications were not disclosed, nor was performance in 3D games, but high-level information about the architecture, its strong focus on compute performance, and its broader compatibility with computational applications was discussed.

We covered much of the early information regarding Fermi in this article. Just to recap some of the more pertinent details found there: the GPU codenamed Fermi will feature over 3 billion transistors and will be produced on TSMC's 40nm process. If you remember, AMD's RV870, which is used in the ATI Radeon HD 5870, comprises roughly 2.15 billion transistors and is also manufactured at 40nm. Fermi will be outfitted with more than double the number of cores of the current GT200, 512 in total. It will also offer 8x the peak double-precision compute performance of its predecessor, and Fermi will be the first GPU architecture to support ECC. ECC support will allow Fermi to compensate for soft error rate (SER) issues and also potentially allow it to scale to higher densities, mitigating the issue in larger designs. The GPU will also be able to execute C++ code.
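To put those numbers in context, the double-precision workloads NVIDIA is chasing look less like pixel shading and more like classic numerical kernels. Here is a minimal CUDA sketch of a double-precision DAXPY (y = a*x + y), written purely for illustration rather than taken from NVIDIA sample code, of the sort of FP64 math Fermi's claimed 8x improvement targets:

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// y = a*x + y in double precision, the bread and butter of HPC workloads.
__global__ void daxpy(int n, double a, const double *x, double *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    std::vector<double> hx(n, 1.0), hy(n, 2.0);
    double *dx, *dy;
    cudaMalloc(&dx, n * sizeof(double));
    cudaMalloc(&dy, n * sizeof(double));
    cudaMemcpy(dx, hx.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    daxpy<<<(n + 255) / 256, 256>>>(n, 2.0, dx, dy);

    cudaMemcpy(hy.data(), dy, n * sizeof(double), cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expect 4.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```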

During the GPU Technology Conference that took place in San Jose, NVIDIA's CEO Jen-Hsun Huang showed off the first Fermi-based, Tesla-branded prototype boards and spoke at length about the compute performance of the architecture. Game performance wasn't a focus of Huang's speech, however, which led some to speculate that NVIDIA was forgetting about gamers with this generation of GPUs. That obviously is not the case; Fermi is going to be a powerful gaming GPU as well. The simple fact of the matter is that NVIDIA is late with its next-gen GPU architecture, and the company chose a different venue--the Consumer Electronics Show--to discuss Fermi's gaming-oriented features...


Not Ranked
Posts 11
Points 250
Joined: Jan 2010

From listening to people theoretically more knowledgeable about hardware than I am (which really would not be that difficult, to be honest; I'm more into practical information than technical information, e.g. this card goes in that slot), I've heard that the boost in double-precision floating-point performance is something that goes pretty much unused in games (if games can use it at all), so it's nothing that will help frame rates or gaming performance. Yet it's built into the architecture, so it can't simply be cut from, say, the GeForce line of Fermi cards (if they continue that line) to make them cheaper.

Basically, it sounds like the cards will carry a large piece of silicon, paid for by the consumer, that won't actually be used by games at all. Something that just raises costs with no benefit for the average gamer who buys one. Could anyone shed light on this?
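For what it's worth, whether a given card exposes double precision at all is something you can check programmatically. Here's a minimal sketch against the CUDA runtime API (my own illustration, not NVIDIA documentation); double precision arrived with compute capability 1.3, and Fermi-class parts report 2.x:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        // Double precision requires compute capability 1.3 or higher
        // (GT200 and up); Fermi-class parts report 2.x.
        bool hasDP = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        printf("GPU %d: %s (compute %d.%d), double precision: %s\n",
               d, prop.name, prop.major, prop.minor, hasDP ? "yes" : "no");
    }
    return 0;
}
```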

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

Finally, some delicious news about Fermi!

Thanks Marco, been waiting for this!

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

A lot of technical information, but it's also showcasing some of the things the Nvidia DirectX 11-enabled cards will be able to do. The free-flowing hair and water look incredible.

With the higher anti-aliasing modes, ray tracing, and tessellation, Nvidia is showing how much more powerful Fermi is than the GT200 series. And I think we're talking multiples, at least 2-3 times the performance in certain areas.

Hard numbers will have to bear that out, but it's safe to say Nvidia has something very powerful up their sleeve.

Top 10 Contributor
Posts 8,683
Points 104,320
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Mon, Jan 18 2010 9:17 AM

It's good news. Now I wait for the new cards to be introduced, get old, and get less expensive. (they will be frightfully expensive, I think) And then, using the tried and true 'Trickle-Down' effect, I'll get one.

Time is on my side. Yes, it is...

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA
Soupstyle replied on Mon, Jan 18 2010 12:31 PM

The biggest price drop will occur if/when ATI releases a response to the GF100.

That being said, the 3-card Supersonic Sled demo is awesome. Wish I could run it on my computer! (no PhysX card here ;_; )

:P

Top 500 Contributor
Posts 123
Points 1,695
Joined: Jan 2010
Location: Mountainy Virginia
Bighorse replied on Mon, Jan 18 2010 1:41 PM

I really wish Nvidia was giving us more juicy info, but hey, this stuff sounds powerful. Fermi is gonna be amazing solo; then when you hit SLI and tri-SLI it's gonna be heart-stopping. Developers had better make some sweet stuff to push these new cards to their limits so we can all drool at the beautiful graphics these things can push.

Also, on a side note, I bet it will play Crysis. LOL.

Top 10 Contributor
Posts 6,374
Points 80,315
Joined: Nov 2004
Location: United States, Arizona
Moderator

Very cool... if it comes out to be what they claim, then I might trade in my two 275s for one of them...

"Never trust a computer you can't throw out a window."

2700K

Z77 GIGABYTE G1.SNIPER

GIGABYTE GTX670

G.Skill Ripjaws X 16gb PC2133

Antec P280

Corsair H100

Asus Blu-ray burner

Seasonic X650 PSU

Patriot Pyro 128gb SSD

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

Yep, this seems like the prelude to a launch.

The good news is that at least we know they have a Fermi-based video card on hand.

 

Here are some demo vids: (videos embedded in the original post)

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Mon, Jan 18 2010 6:45 PM

We will see when it gets here. From what I've been reading, this one is going to have a different focus all the way around. I hope DX11 gets picked up, unlike DX10 and 10.1, because it looks like it has a lot of visual advantages, especially in the realism department.

 

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Mon, Jan 18 2010 6:50 PM

When you look through the full article and the slides they show, check out the one of the hair. I studied that one pretty closely, and it looks remarkably similar to the real thing. With something like hair, the strands are minuscule to the point of blending together, yet in the Nvidia demo you could see thousands of separate hairs in the image. So the level of detail and construction this card is capable of looks to be awesome. However, we will have to see how that affects rendering speed and the like before a final verdict.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

The hair and the water pics both look amazing.

Though I thought it was showing off DirectX 11 tessellation, rather than a feature of the Nvidia cards.

The Supersonic Sled demo would be unique to Nvidia because it employs PhysX.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Tue, Jan 19 2010 12:22 AM

That's true, but when I first looked at that picture I wondered why they were showing a blond wig. Then I looked at it closer, scrolled down, and read the details, and thought, wow, that almost looks like real hair. And remember, I'm viewing that image on a webpage through my current GPU, which is nowhere near the same GPU, so I'm seeing it at far less than its full detail to begin with.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

I see what you mean. Man, I would love to see a video demonstration of the free-flowing hair and the water. With the speed news is coming out about Fermi, I think we'll have a demo pretty soon!

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Tue, Jan 19 2010 3:03 PM

Well, what I am saying is that the picture you can see is awesome, and the real thing rendered on your PC would be far better still. So this thing should blow away everything on the market, I would imagine, but it also changes the general functionality of a GPU as well. The focus and delivery mechanisms, as well as the software platform, are in many ways totally different, or at least the focus is. I am pretty confident the reason the 5970 is two tweaked 5850 GPUs is because ATI is working on something new as well. I also think that in realistic imagery we are on a cusp. Look at Avatar: it is animation done almost completely by computers, with real actors at the same time. It is a meshing of technologies, and I can see it on your PC in a relatively short amount of time. The 5870 started it, this Nvidia hardware expands it, and ATI will expand it again, just like normal. The impact on digital imagery, and its availability to the normal person, is going to change though.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

I was able to find some video demos on YouTube, and I posted them a couple of posts above. They're worth checking out! My current graphics card would melt if I tried running any of the demos on it, lol.

The Streaming Multiprocessors on the GF100 have taken a giant leap forward:

  • 32 CUDA cores (4x the 8 per SM on GT200)
  • 16 or 48KB of shared memory (up to 3x GT200's fixed 16KB)
  • 16 or 48KB of L1 cache (there was no L1 on GT200; see the sketch below)
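For the CUDA-inclined, that configurable shared-memory/L1 split is exposed per kernel through the runtime API. A minimal sketch (the stencil kernel is a made-up example for illustration, not NVIDIA code) of a shared-memory-heavy kernel requesting the 48KB-shared/16KB-L1 configuration:

```cuda
#include <cuda_runtime.h>

// Hypothetical 1D stencil kernel that leans heavily on shared memory
// (assumes a block size of 256 threads).
__global__ void stencilKernel(const float *in, float *out, int n)
{
    __shared__ float tile[256 + 2];   // block's data plus a one-element halo
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    int t = threadIdx.x + 1;
    if (i < n) tile[t] = in[i];
    if (threadIdx.x == 0)              tile[0]     = (i > 0)     ? in[i - 1] : 0.0f;
    if (threadIdx.x == blockDim.x - 1) tile[t + 1] = (i + 1 < n) ? in[i + 1] : 0.0f;
    __syncthreads();
    if (i < n) out[i] = (tile[t - 1] + tile[t] + tile[t + 1]) / 3.0f;
}

int main()
{
    // On Fermi, request the 48KB-shared / 16KB-L1 split for this kernel;
    // cudaFuncCachePreferL1 would request the opposite arrangement.
    cudaFuncSetCacheConfig(stencilKernel, cudaFuncCachePreferShared);
    // ... allocate buffers and launch stencilKernel<<<blocks, 256>>> as usual ...
    return 0;
}
```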

We're seeing some major increases in hardware power and we're also seeing real improvements in geometric processing (tessellation and displacement mapping). Rob over at Techgage mentions it in his review:

"While pixel shaders have had an increasing focus from GPU generation to the next, there's been almost no love to the triangle generator. Compared to the GeForce FX (2003), the shading horsepower has increased by 150x, while the geometric processing has increased by only 3x."

 

You're right that the new Nvidia cards will surpass ATI's offerings, and Nvidia has not tried to hide that fact. Look at this graph they released of tessellation performance (red is ATI; the graph was embedded in the original post):

The 5870's max FPS barely touches the min of the GF100.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Tue, Jan 19 2010 4:28 PM

Yeah, those video demos are awesome, especially the hair one and the water. I still think the detail in the hair is amazing; when the wind blows, it looks real.

 

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA
Soupstyle replied on Tue, Jan 19 2010 5:34 PM

You are right, that hair demo is pretty impressive too.

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

I wonder how much of that is going to translate into actual gaming. Remember, that hair demo close-up was pulling in 25 FPS. While we may not see hair that detailed in games anytime soon, anything one-tenth as good would look awesome.

Top 75 Contributor
Posts 1,248
Points 11,580
Joined: Jan 2005
Location: Florida

All this talk is worthless until the thing actually comes out. Plus, it is going to be really expensive, since there is such a large chunk of its hardware that is worthless for gaming and cannot be cut from the core.

The biggest thing, though, is that ATI could possibly release their next-gen card a few months after this thing hits the market. NVidia is way behind in the production/research cycle, and we might be seeing releases with the leader swinging back and forth every three or four months.

Smooth Creations LANShark "Blue Flame" + ASUS G73JH-A2 + ASUS EeePC S101H

"I frag therefore I am!"

Top 50 Contributor
Posts 2,361
Points 48,680
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Tue, Jan 19 2010 10:38 PM

Keep in mind, that slide is provided by NVIDIA, and it's only a specific 60-second snapshot of a certain part of that benchmark.

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

Top 10 Contributor
Posts 6,181
Points 90,135
Joined: Aug 2003
Location: United States, Virginia
Moderator

Have you guys seen hardware yet?

Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA
Soupstyle replied on Tue, Jan 19 2010 11:53 PM

They had them on display at CES, but I doubt they let anyone open the cards up or test them yet.

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

ATI releasing their next-gen tech so soon? Their current high-end cards are already in such high demand that they're selling above MSRP.

I do agree with you, though: all this is mostly conjecture until we see it backed up by in-game performance.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Wed, Jan 20 2010 3:02 PM

I totally agree with Infinity. Nvidia has gotten considerably behind in a market that moves as fast as GPUs do. I am almost positive that before this product is available in quantity, ATI will, if not release a next-gen card from their stock line, at least have widespread media coverage about one. And even if this outperforms ATI's current stock at the top of the market, their next gen may be more impressive than this is, component- and capability-wise. I think Nvidia has some catch-up work to do.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Wed, Jan 20 2010 8:30 PM

gibbersome:
Though I thought it was showing off DirectX 11 tessellation, rather than a feature of the Nvidia cards.

Tessellation is a feature of the card, not DirectX. The DirectX 11 standard adds APIs to control it, and specifies that only hardware that supports it can be called "DirectX 11 compatible". OpenGL exposes the same tessellation hardware (via extensions) on the same cards, even on Windows XP (and Linux).

 

Tessellation controlled by OpenGL: (video embedded in the original post)

I agree about the hair and water vids - really nice looking stuff, though the frame rates on the hair seem a bit troubling considering there's nothing else being rendered in the demo.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

Ahh, thanks. But that hardware isn't unique to Nvidia, unlike PhysX, which runs on CUDA-enabled GeForce GPUs. The hair and water tessellation is possible with any DirectX 11 compatible card. I guess that's the point I was making.

About the hair vids, I was thinking the 25 FPS was very low as well, but this kind of detail won't be present in games, not yet anyway. Even something 10% as good would be a huge improvement in realism in games.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Thu, Jan 21 2010 12:45 AM

Nice post, 3vi1, I really like the OpenGL video. And on your comment to gibbersome about gaming: yeah, that kind of detail would be flat-out sick in a game. That's also another reason I was pointing out the hair thing. Can you imagine playing or making a character as detailed as that in a multiplayer game? I'm sure we'll see it within the next year or two. The next one I am waiting for is SWTOR (Star Wars: The Old Republic), which has finally been given a release date of spring 2011. If anyone likes MMO-type games, check it out: http://www.swtor.com/

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Thu, Jan 21 2010 12:48 AM

Oh, and gibbersome, the reason they were running over MSRP was a plant delay while upgrading their chip-making equipment. It is supposed to be over now, and the cards are supposed to be widely available at normal prices by the end of the month.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,248
Points 11,580
Joined: Jan 2005
Location: Florida

The point was that NVidia is so far behind in releasing their next product that the next-gen ATI part could be released within a few months of it. If this NVidia part bombs compared to current ATI stuff, or even just barely beats it, then ATI can hold off for a little while, since their parts are cheaper to build and can undercut NVidia.

If it does what this article says and beats ATI by a huge chunk (50%+), then you can be sure that the next-gen ATI part will be coming out ASAP. With this really long delay, I expect that ASAP to be ~3 months.

Smooth Creations LANShark "Blue Flame" + ASUS G73JH-A2 + ASUS EeePC S101H

"I frag therefore I am!"

Top 200 Contributor
Posts 457
Points 5,945
Joined: Oct 2009
Location: Norfolk, VA, USA
Soupstyle replied on Thu, Jan 21 2010 8:33 PM

I am waiting for SWTOR too. I am guessing it gets pushed to the summer or early fall of 2011. I think the end product will be worth the wait, though.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Fri, Jan 22 2010 12:33 AM

As for the video card thing, Nvidia has put this off too long, I think. As far as I remember, it has actually been over a year since they released anything that was not re-branded (shrunken maybe, but the same architecture except for memory changes) rather than truly 100% new. Before that, it was about every 8-9 months for both companies, and each release would be at least a major update. They have released the 295, but it's just a dual-GPU card; the hardware is not really new (tweaked, maybe). Either way, ATI has probably known considerably more than any of us about what Nvidia intended to do, most likely for over a year, so they should be prepared. So I would think we will see something major, not that the 5800-and-up line isn't impressive against current hardware. But if you had been in a major back-and-forth with a competitor, and for some reason they were slowed down, and you knew the basic timeline for the slowdown as well as what they intended to put out next, what would you do?

 

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Fri, Jan 22 2010 12:39 AM

As for SWTOR, I imagine it will be late spring or early summer (I say June or the first two weeks of July). As I posted on the forum, I appreciate their strategy with the release, since anything else released in the MMO world, even WOW, usually takes some time to get straightened out, with all the promised features active and running well. I'm not really a WOW player; I tried it for my free month and ended up with about three level-20 characters. I started in the second week of EQ1, have been in every beta from the original release up through Dragons (about 10 straight), and was a GM on 3 servers. The graphics are just too low-quality for me, not to mention I had already played for a very, very long time.

For some reason they refer to players like me as hardcore, at least in MMOs. I also played the Vanguard beta, and for quite some time afterward; graphically it is better than anything, including Conan (which I also beta'd; I have beta tested almost every major successful MMO release since EQ1, even WOW, though it would never work at first, and neither would AOC for that matter; DND Online was just a joke; the Warhammer beta was kind of cool, but a couple of weeks after release I was bored; Heroes was cool, though I did not really like Villains much. But enough listing of betas; trust me, I have probably beta'd at least 30 MMOs all together, some of them not major releases). Either way, VG has pretty much had its death warrant signed for now, as they have not gone past the initial release other than some added quests for higher levels (but not enough), the stability work that was done after 3 months, basically no new abilities, and a +5 level cap. It's a joke. (Believe it or not, VG on release was very demanding graphically and basically would not play right with anything under a 1900-series ATI card or a high-end Nvidia card, kind of like Crysis; you also needed as much RAM as you could get, a high-end CPU, and a fast hard drive just for general play at low settings.)

Either way, enough of my wisdom on MMOs, except to state that I believe, and hope, that SWTOR will actually be complete when it ships, which is very rare, and stable on release. Then they can worry about expansions on the normal one-to-two-year time span; knowing BioWare, I think it will be closer to yearly, which is more in line with something like WOW, where the company (Blizzard; oh yeah, I beta'd Diablo and Diablo 2 as well, my first betas) actually wants their game to be a success and pushes it.

The thing with Sony Online makes no sense to me, really, as a successful game, regularly updated content- and technology-wise, is nothing but a bank in itself. Your customer base buys the original, usually within a week to a month of release, pays you monthly access fees, and buys every upgrade you put out for it. Money-wise these games are the most successful because of the fat loot they make after release, which is non-stop and recurring.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

I know many people are eagerly waiting for SWTOR. Also, if you're an Elder Scrolls fan, it's been leaked that they're developing an MMO as well. After the success of WOW, MMORPGs really seem to be all the rage with large game developers. Personally, I'm more of a single-player RPG guy. I like my games to have a definite beginning and end. Baldur's Gate was my first, and I've been hooked ever since.

Back to Nvidia: yeah, I agree they may have waited too long. If ATI's supply problems resolve later this month, many more 5xxx series cards will be sold in the upcoming weeks, especially with a moderate price drop from the increased supply. Nvidia recognizes this, and that's why you saw the push to release the lower-to-mid-range GT300 series mobile cards. They stood to lose out on the mobile market and may have even tried to undercut ATI.

Again, if Fermi is the game changer that Nvidia has been hyping it up to be, then none of this will matter in the end. Right now, though, I get the feeling that Nvidia bit off more than it could chew.

Top 50 Contributor
Posts 2,361
Points 48,680
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Fri, Jan 22 2010 10:46 AM

Yes, we saw hardware, but they were pre-production prototype parts. No word yet on what the final specs will be.

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Sat, Jan 23 2010 2:24 AM

That is what I was thinking as well, Marco. We keep seeing things about them, even some supposed benchmarks versus a 285, but I am betting those were run internally at Nvidia, and I believe they're about as real as a unicorn. I want to see something that's actually going to be released (a price range would be nice to know too, but not mandatory), and someone like you or Dave testing it, before I believe anything.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

Nvidia is getting sued by Rambus over its graphics chip technology. Nvidia is just one of many companies that Rambus is pursuing with the goal of gaining lucrative settlements and royalties; they already got $900 million from Samsung over memory chip technology.

Not Ranked
Posts 1
Points 20
Joined: Jan 2010
Quinid replied on Sat, Jan 23 2010 9:52 PM

@Anakhoresis

I agree with everything you said about gaming.

As for double-precision floating point, this feature could be a big deal for GPU computation projects that use CUDA, like GPUID. It could save projects wasted time double-checking calculations, as the GPU may catch a lot of the miscalculations itself.
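To make that concrete, here's a small illustration of the kind of drift single precision picks up over long accumulations (plain host code that compiles with nvcc or any C++ compiler; the numbers are my own example, not from any particular project):

```cuda
#include <cstdio>

int main()
{
    const int n = 10000000;   // ten million additions
    float  fsum = 0.0f;
    double dsum = 0.0;
    for (int i = 0; i < n; ++i) {
        fsum += 0.1f;   // single precision: rounding error accumulates
        dsum += 0.1;    // double precision: drift is orders of magnitude smaller
    }
    // The exact answer is 1,000,000; the float total lands visibly off,
    // while the double total agrees to many digits.
    printf("float:  %f\n", fsum);
    printf("double: %f\n", dsum);
    return 0;
}
```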

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009
gibbersome replied on Sat, Jan 23 2010 11:50 PM

ATI also has double-precision floating-point support in their 5800 series lineup.

Interestingly, the 5700 series doesn't, even though the earlier 4770 had it. I'm guessing this was a cost-cutting measure for ATI.

 

@Quinid Thanks for answering.

Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Sun, Jan 24 2010 2:12 AM

Yeah, otherwise it makes no sense to leave it out, really. Look at the 5970: it's two OC'd 5850 cores, not two 5870 cores. So there's little stuff they do to save money. The funny part is that you can get two 5850s for the same price as a 5970 or cheaper, and they outperform it in CrossFire.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM