NVIDIA Tegra 4 Details Leak; Promises An Impressive Punch

This post has 8 Replies | 2 Followers

News Posted: Tue, Dec 18 2012 3:40 PM
When dual core phones first hit the smartphone market several years ago, Tegra 2 was perfectly positioned across the smartphone and tablet market. Tegra 3 has done extremely well in tablets this past year; Nvidia has won high-profile designs with everyone from Google to Microsoft. In the past few months, however, it's become clear that Tegra 3's quad-core, Cortex-A9 CPUs and older GPU technology wouldn't be able to compete with the latest designs from Apple or Qualcomm.

A leaked slide from a Chinese site has shed light on what Team Green plans to answer with, and the next-generation SoC is pretty darn sexy. Here's what Wayne (aka Tegra 4) purportedly looks like:



The new Tegra 4 packs 72 GPU cores, compared to 12 cores in Tegra 3. Conveniently, NV claims T4's performance is 6x that of Tegra 3. In this case, comparing cores against cores is a really bad idea. Tegra 2 and Tegra 3 are based on a GeForce 7-derived architecture, with specialized hardware for pixel shader and vertex shader programs. The 12 cores inside Tegra 3 are genuine cores -- they are capable of executing programs -- but they are far more limited than the unified CUDA architecture Nvidia debuted in 2006, to say nothing of the company's latest CUDA-capable hardware.
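To make the mismatch concrete, here's a quick back-of-the-envelope sketch (core counts and the 6x claim are the figures quoted above; the point is that the naive ratio proves nothing by itself):

```python
# Naive core-count math using the figures quoted above.
tegra3_cores = 12   # fixed-function-era cores (GeForce 7-derived)
tegra4_cores = 72   # next-gen cores, per the leaked slide

core_ratio = tegra4_cores / tegra3_cores
print(core_ratio)  # 6.0

# The ratio happens to match Nvidia's claimed 6x figure, but since the
# two core designs are architecturally different, the arithmetic says
# nothing about real-world performance on its own.
```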

Nvidia's G80, and all of its desktop/mobile parts since 2006, are far more programmable and flexible. So why didn't the company bake that tech into Tegra from the beginning?

Power and die size. Nvidia's second-generation GeForce 7, built on 90nm, had a die size of 196mm². The first-generation GeForce 8800 GTX, also built on 90nm, weighed in at 484mm². Since we're talking about total area, that makes G80 nearly 2.5x the size of its GeForce 7 counterpart. Adding programmable architectures with flexible resource allocation isn't the cheapest way to increase performance, but in the long run, it's the smartest.
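As a sanity check on those numbers, here's the area math spelled out (die sizes as quoted above):

```python
# Die areas from the paragraph above; both parts were fabbed on 90nm.
geforce7_area_mm2 = 196   # second-generation GeForce 7
g80_area_mm2 = 484        # GeForce 8800 GTX

ratio = g80_area_mm2 / geforce7_area_mm2
print(round(ratio, 2))  # 2.47 -- roughly 2.5x the area, the price of programmability
```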

But figuring out how to build a programmable GPU that's faster than a fixed-function model while fitting in your pocket? That took a while. The timing makes more sense when you consider that NV started working on Tegra before Fermi had even taped out. The GeForce 7 was a thoroughly known quantity in 2008; the low-power future of CUDA was rather murkier.

What all this means is that the 72 cores inside Tegra 4 aren't going to be exactly comparable to Tegra 3's, and I don't expect a 6x linear increase -- but they should pack quite a punch. The CPU side is similarly aggressive; a quad-core Cortex-A15 should be able to match anything at the high end of the spectrum.

If the leak is true, T4 should hit the ground running in 2013. Apple, Qualcomm, Samsung, and Intel all have their own plans for the segment, but Nvidia should be well equipped to fight them.
acarzt replied on Tue, Dec 18 2012 5:47 PM

This article is somewhat confusing... CUDA is mentioned in great detail, but it's never made clear whether the cores in the new Tegra 4 are CUDA-capable.... Sooooo are they? Or is that coming in the next gen?

OSunday replied on Tue, Dec 18 2012 6:40 PM

Even if it's not a literal 6x increase, a literal 3x increase would still mean HUGE gains.

I've heard that the Tegra 4 is also able to power resolutions up to 2560x1440, meaning it should be able to handle anything maxed out at 1080p with ease.
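For what it's worth, the pixel math backs that up (resolutions taken from the comment above):

```python
# Pixel counts for the two resolutions mentioned above.
qhd_pixels = 2560 * 1440    # 3,686,400 pixels
fhd_pixels = 1920 * 1080    # 2,073,600 pixels (1080p)

ratio = qhd_pixels / fhd_pixels
print(ratio)  # ~1.78 -- 2560x1440 pushes about 78% more pixels than 1080p
```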

The Tegra 4 is gonna do some impressive things.

realneil replied on Tue, Dec 18 2012 7:54 PM

NVIDIA has got their game on,..........

Love their video cards,............

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

Dorkstar replied on Tue, Dec 18 2012 7:59 PM

Is that sarcasm I'm tasting?

realneil replied on Tue, Dec 18 2012 8:35 PM

No, I really do like the video cards that they're putting out these days.

I don't use a new phone; mine is 8 years old now but still works, so I can't speak to the quality of the tablets and phones they're a part of.

I hear a lot of good though,...........


OSunday replied on Wed, Dec 19 2012 12:27 AM

How do you feel they compare to AMD?
Every time I try to justify buying an Nvidia card, my research reveals I can get similar or better performance for less money, and that always turns me away from the brand.

Dorkstar replied on Wed, Dec 19 2012 9:18 AM

OSunday:

How do you feel they compare to AMD?
Every time I try to justify buying an Nvidia card, my research reveals I can get similar or better performance for less money, and that always turns me away from the brand.

Every time AMD starts to convince me to purchase their products, Nvidia does something better, whether it's more advanced physics or better overall quality (not just benchmarks), though I do admit their prices are typically higher. If you look at Far Cry 3 benchmarked, AMD performs better on paper, but a lot of reviews say the AMD cards stutter a lot when actually playing. Then again, that's the only game I've seen where AMD doesn't outperform or match Nvidia when comparing some of the higher-end cards. I'm still keeping a close eye on AMD; I've never bought their cards, but this year they've really started to impress me.

Shadizzle replied on Wed, Dec 19 2012 9:57 AM

This will be a pretty sweet little SoC if they do release it with the specs on paper. I too am a little confused about whether the cores are CUDA-capable or not, though. It looks like that's what they are going for, and if so, it will definitely be needed for 2560x1440. Definitely looking forward to this; the Tegra 3 was extremely potent when it was released. Can't wait to see how this sucker performs once it's out there in the real world.

@ AMD comments/cards

Every time AMD comes out with something, it almost always looks great on paper, but the real-world performance is what varies. I've had both mid- and high-end AMD/Nvidia cards in the past, and performance specs aside, I always felt the Nvidia cards were made nicer. Their heatsinks almost always covered both the GPU and the GDDR, rather than just a copper/aluminum spreader over the GPU with a plastic shroud to help "direct" air over the GDDR (more so in their mid-range line). When gaming on the 3D side, I would experience stutter or tearing more often with AMD, and found that their cards really don't seem to hold up over time as well either. I've had to RMA the majority of my AMD cards in the past, whereas I found myself just reselling the Nvidia cards (may not be a big deal to some, but when you drop hundreds on a card, any return on it is welcome). On the professional side, FirePro vs. Quadro, there is no competition. The Quadro cards just smash through everything out there if you are utilizing the hardware properly; really, they leave AMD in the dust on the professional front. I'm sure AMD has a couple of cards out there that prove their worth; sadly, it has not been within the software I use.

AMD does have their place here, and like them or not, we definitely need them. It's the same idea as why we need AMD to survive in the CPU world: competition keeps things moving forward, keeps prices in check, and, more importantly, keeps your competitors honest (to a point, obviously). I will say, though, of all the series I've had the opportunity to test, the AMD 6800 series was probably my favorite: seemingly well built, and it has outlived the 78 series cards I've had already.

On a final note, I can REALLY see the competition turning in AMD's favor if HSA computing catches on!
