Intel's Sandy Bridge, AVX Extensions On Track For Q4 2010

This post has 17 Replies | 1 Follower

Top 10 Contributor
Posts 26,218
Points 1,187,270
Joined: Sep 2007
ForumsAdministrator
News Posted: Thu, Jul 1 2010 4:34 PM
For those of you who can't wait to get your hands on the latest and greatest, news today should have you making out a Christmas list. Intel has confirmed that its next-generation processor (codenamed Sandy Bridge, the successor to Nehalem) is on track for volume shipment in 2010, which means we'll probably see at least a few SKUs launching before the end of the year.

Sandy Bridge is a 'tock' in Intel's tick-tock model, meaning it's a new architecture delivered on an existing process technology. Many of the processor's characteristics will be familiar, such as the 256K of L2 cache per core and the 8MB L3, but the chip has a few surprises up its sleeve. The GPU core will be integrated directly into the CPU die, unlike the two-chip 32nm / 45nm package Clarkdale uses, and the processor will support what Intel calls AVX (Advanced Vector Extensions). Intel reportedly projects that a Sandy Bridge CPU running x87 FPU code will be capable of up to 2GB/s of double-precision throughput per core. If the same workloads are rewritten to support AVX, theoretical maximum performance is no less than 8GB/s per core (double precision), a fourfold increase. AVX doesn't just add new instructions; Intel claims it also allows the CPU to execute older instructions more quickly.


It'll take a few years for Intel's new SIMD to be adopted across the market, but the degree of difference between AVX and non-SIMD code makes it virtually certain that companies will leap to take advantage of the new technology. AMD's own support for the new standard (hopefully) won't be far behind. While Sunnyvale originally backed its own SIMD, SSE5, it's since committed to supporting AVX on the new line of Bulldozer processors. SSE5 theoretically still exists—AMD claims to have developed it in close communication with software vendors—but we don't expect to hear much about it, even after it's available.

If AVX is as potent as Intel claims, it could be the most important SIMD introduction since SSE2. For those of you who don't remember, the Pentium 4's launch performance, particularly its FPU performance, was quite weak compared to other products from both Intel and AMD. Back then, SSE itself was just creeping into the market; software support for the Pentium 4's SSE2 was almost nonexistent. As time passed, software vendors introduced new products that utilized SSE2, and the P4's comparative ranking began to change. Where a 2GHz P4 had often been outperformed by a 1.4GHz Athlon with no SSE2 support at all, it became capable of dominating its rivals.*

If AVX delivers the gains Intel projects, PC performance could rise significantly as vendors adopt the new extensions. Long-term, AVX could even have an impact on mobile and ultra-mobile devices. The fewer clock cycles it takes to perform a task, the more quickly a CPU can return to its power-saving idle mode. Intel has sunk a great deal of work into tackling power consumption by building ever more frugal processors, but there's definitely something to be said for attacking the problem from the other direction. We wouldn't expect AVX to debut on Atom anytime soon, but given time, it definitely could.

* There was always a certain argument that the P4 wouldn't have needed SSE2 so much if its performance in x87 code wasn't so weak. We acknowledge this, but 10 years later, it seems beside the point. ;)
  • | Post Points: 230
Not Ranked
Posts 33
Points 355
Joined: Jan 2010
until240 replied on Thu, Jul 1 2010 11:00 PM

Will Sandy Bridge still work on X58 boards?

  • | Post Points: 5
Top 500 Contributor
Posts 119
Points 1,405
Joined: Mar 2010
Location: San Francisco, CA
dlim783 replied on Thu, Jul 1 2010 11:10 PM

I wonder if Sandy Bridge is going to work with X58 motherboards? What I'm concerned about is which LGA socket it will use.

  • | Post Points: 5
Top 500 Contributor
Posts 119
Points 1,405
Joined: Mar 2010
Location: San Francisco, CA
dlim783 replied on Thu, Jul 1 2010 11:15 PM

I think it's going to be released around October, November, or December. I also think that Sandy Bridge should support the LGA 1366 socket. I don't know. I'm going to wait 'till it comes out.

  • | Post Points: 5
Top 500 Contributor
Posts 194
Points 1,430
Joined: Mar 2009
Location: Bengalooru (Bangalore), India
Xylem replied on Thu, Jul 1 2010 11:54 PM

Sandy Bridge employs the LGA 1155 Socket H2 (not 1156) or LGA 2011 Socket R (22nm die shrink). Hence the current line of MBs would be rendered useless once again, just like during the Core i launch. We would end up building the system from the ground up!! Sad

Even if the other components remain the same, the MB, CPU & GPU are the main pieces!! Come on Intel, can you not come up with a new socket type every half year?? We Intel fans have to wait a while for you to stabilize a socket before getting one. I just built a system with LGA 1156 & an 875K proc, which Intel claimed is its Core 2010 line, and before the end of 2010 Sandy Bridge is getting launched!! WTF?? Angry

Check here to learn more about Sandy Bridge.

Travel Rig |Acer Aspire One - ZG5 Mods so far |Toshiba MK4009GAL 40GB.4 - port USB HUB with BT, Touchscreen & GPS. 1GB RAM upgrade. | 

Main Rig |Core 2 Quad Q6600(3.2GHz), 2x2GB OCZ Reaper 800 MHz RAM, 19" AOC 913FW, 2 x 500GB WD HDD & 2 x 1TB WD Green, nVIDIA 8600GT 1GB

Suzuki Swift GTi | 2008 Version | Lots of mods in pipeline.

  • | Post Points: 20
Top 500 Contributor
Posts 119
Points 1,405
Joined: Mar 2010
Location: San Francisco, CA
dlim783 replied on Fri, Jul 2 2010 12:02 AM

I doubt it's going to be expensive to purchase a brand new bridge, "Sandy Bridge." It better run faster and reduce power consumption.

  • | Post Points: 5
Not Ranked
Posts 33
Points 355
Joined: Jan 2010
until240 replied on Fri, Jul 2 2010 12:09 AM

How exactly does having the CPU and GPU on the same die work? People will still need a separate GPU, right? So what benefits does the integrated GPU offer?

  • | Post Points: 5
Top 10 Contributor
Posts 4,832
Points 45,800
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Fri, Jul 2 2010 12:41 AM

To make it simple, until240: a CPU and a GPU do their tasks in different ways, and until very recently they manipulated data completely differently and separately from each other. Although they can each handle many of the same tasks, they approach them in almost opposite ways, so certain things were done primarily (or exclusively) by one or the other. With an integrated GPU you may still use a separate graphics card, but each part can take on the data it handles best, lightening the load on both sides, so the combined task gets done that much quicker.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
  • | Post Points: 5
Top 50 Contributor
Posts 3,105
Points 38,225
Joined: Aug 2003
Location: Texas
acarzt replied on Fri, Jul 2 2010 2:07 AM

x87? Don't you mean x86?

  • | Post Points: 5
Top 500 Contributor
Posts 119
Points 1,405
Joined: Mar 2010
Location: San Francisco, CA
dlim783 replied on Fri, Jul 2 2010 3:01 AM

I mean damn! I'll have to sell my old motherboard and buy a new one that supports the LGA 2011 socket! At least it's going to run tremendously fast! Can't wait for it.

  • | Post Points: 5
Top 75 Contributor
Posts 1,809
Points 18,105
Joined: May 2009
Location: Waikiki

So instead of spending 1K on the 980X now, we should wait for stinky britches?

Intel Core i7-875K Quad
Asetek 510LC 120MM
4GB Kingston Hyper-X DDR-3
ASUS P7P55D-E Pro
CyberPower 800 PSU
Kingston 64GB SSD 
2 Hitachi 1-TB HDD'S
FirePro V8800
8X Blu-Ray DVD±R/±RW
HPw2207 22" LCD
Cintiq 21UX
CoolerMaster 690II Advance
Win 7 Pro 64 bit
Special thanks to HotHardware.com!
  • | Post Points: 5
Top 100 Contributor
Posts 1,073
Points 11,630
Joined: Jul 2009
Joel H replied on Fri, Jul 2 2010 10:05 AM

Alright, a few things here:

1) Until240: Rapid1 is mostly right. The one thing he doesn't mention is die size. Once upon a time -- as in, the last twenty years or so -- a GPU was far too big to reasonably package with a CPU. Only a handful of companies ever tried to do it (Cyrix being one), and the resulting products were very poorly received. If you wanted a video display that could handle color at 640x480 or higher, you needed either a separate GPU (at first) or a GPU integrated into the motherboard.

2) Acarzt: No, I mean x87. See here: http://en.wikipedia.org/wiki/X87

In modern parlance, the term "x87" refers to floating point code that does NOT support SSE, SSE2, or any of the other various SIMD (Single Instruction Multiple Data) instruction sets. The Athlon's x87 floating point unit was much more powerful than the Pentium 4's, and the P4's, in turn, was better than Prescott's.

The result of all this was that if you did a P4 vs. Athlon comparison in 2001 using tests released in 1999-2000, the Athlon would win. If you used the handful of updated tests--the tests Intel certainly wanted everyone to use--the P4 won. There was a period of several years when it could be argued that the choice of benchmark made all the difference in the world. That's always true to some extent, but in the 2001-2003 time period it was particularly pronounced. By late 2003, the P4 was clocked high enough that it tended to compete very well with Athlon chips even in raw x87 FPU code--keep in mind that the P4 was, by this time, running at 3.2GHz with dual-channel DDR400. The Athlon, meanwhile, was nearly a full 1GHz behind and had much less memory bandwidth to work with. The P4's cache latencies were also lower, and its L2 bandwidth higher, than the Athlon's; Intel prioritized a 2-cycle L1 cache with the P4, and the CPU used a 256-bit pipe to L2 cache vs. the Athlon's 64-bit pipe.

Then Opteron came out and rewrote the book. We hope you've enjoyed the history lesson. ;)

  • | Post Points: 5
Top 100 Contributor
Posts 1,073
Points 11,630
Joined: Jul 2009
Joel H replied on Fri, Jul 2 2010 10:14 AM

One other thing:

Sandy Bridge motherboards will NOT be compatible with current LGA1366 / LGA1156 chips. This works both ways, meaning:

1) You won't be able to drop a current Core i3/i5/i7 CPU into a Sandy Bridge board.

2) You won't be able to put a Sandy Bridge processor into a current motherboard.

  • | Post Points: 5
Not Ranked
Posts 33
Points 355
Joined: Jan 2010
until240 replied on Fri, Jul 2 2010 10:32 AM

Cool. Thanks for the info guys.

  • | Post Points: 5
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Fri, Jul 2 2010 5:02 PM

Actually, Socket 2011 will be used for both 22nm and 32nm chips, but it'll be an enthusiast/workstation socket with PCIe 3.0.

  • | Post Points: 5
Not Ranked
Posts 77
Points 865
Joined: May 2009
Location: Waterloo, Ontario
Blazer replied on Fri, Jul 2 2010 7:12 PM

This means that the system I put together today will be obsolete in a couple of months? That's almost as bad as buying a new car.

Huh?

i7 920 @ 3.8GHz, 12GB G.Skill PC3-12800 (2GB x 6) CL 6-8-6-20-2N, 4 x WD Black 640GB HDD in RAID 10 (storage), 2 OCZ Vertex 2 60GB in RAID 0 (boot), 2 Sapphire HD6850, Gigabyte X58A-UD5 MB, Silverstone SST-STF85F PSU, cooled by Corsair H50, and lots of fans in an Antec P160W case.

  • | Post Points: 20
Top 75 Contributor
Posts 1,809
Points 18,105
Joined: May 2009
Location: Waikiki
Good analogy! 
 
It would be nice if we were buying something like a '69 SS Camaro or a '54 Vette!

Instead we get foreign Yugos and are left to wonder!
Intel Core i7-875K Quad
Asetek 510LC 120MM
4GB Kingston Hyper-X DDR-3
ASUS P7P55D-E Pro
CyberPower 800 PSU
Kingston 64GB SSD 
2 Hitachi 1-TB HDD'S
FirePro V8800
8X Blu-Ray DVD±R/±RW
HPw2207 22" LCD
Cintiq 21UX
CoolerMaster 690II Advance
Win 7 Pro 64 bit
Special thanks to HotHardware.com!
  • | Post Points: 5
Top 100 Contributor
Posts 1,073
Points 11,630
Joined: Jul 2009
Joel H replied on Mon, Jul 5 2010 7:52 PM

Blazer,

Obsolete by whose standards? If you mean "not top of the line," then that's true. On the other hand, a dual-core Socket 939 system from four years ago still handles everything just fine--and you can game on it, too.

Hardware is only obsolete when it stops being able to do what you want of it in a reasonable timeframe.

  • | Post Points: 5
Page 1 of 1 (18 items) | RSS