Intel Previews 32nm Itanium "Poulson" Processor

This week, at ISSCC (International Solid-State Circuits Conference), Intel unveiled its next-generation Itanium processor, codenamed Poulson. This new eight-core processor is easily the most significant update Intel has ever made to Itanium, and it could upset the current balance of power at the high end of the server/mainframe market. It may also be the Itanium that fully redeems the brand name and sheds the last vestiges of negativity that have dogged the chip since it launched ten years ago.


Here's the sneak peek

To discuss why, we'll have to flip through some history.

From Merced to Tukwila

Intel began work on what would become Itanium back in 1994 in a joint venture with HP. The two companies chose to pursue a design philosophy they termed EPIC (Explicitly Parallel Instruction Computing). As an EPIC processor, Itanium took a very different approach from the Pentium Pro and the other out-of-order execution processors that followed it.

Instead of using dedicated CPU hardware to re-arrange and optimally schedule instructions for execution (known as out-of-order execution, or OoOE), Itanium relies on the compiler to schedule instructions at compile time. This allowed the designers of Merced (the first-generation Itanium) to devote more die space to execution hardware, thus boosting theoretical performance. The weak link in the chain was the compiler itself: if it failed to detect and exploit instruction-level parallelism, only a fraction of the CPU's execution units were in use at any given time.

Merced, the First Generation Itanium Processor
 
Intel initially promised that Itanium processors would debut in 1999; the first chips didn't actually hit the market until 2001. Things got worse from there: the two-year delay gave Itanium's competitors time to launch faster versions of their own chips, Itanium's much-touted 32-bit hardware compatibility proved slow, and it quickly became apparent that the compilers of the day could not deliver the degree of optimization Itanium required. Supporting applications, meanwhile, were few and far between.

This was understandable given the chicken-and-egg dilemma of introducing a brand-new architecture, but it gave critics ammunition all the same. Most damaging of all was the way Itanium had initially been marketed. When Intel announced it was targeting a 1999 launch window, respected analysts were soon predicting that the chip's combination of 32-bit compatibility and advanced 64-bit execution would sweep the length and breadth of the x86 industry. The chip's initial weaknesses were significant, but its marketing was worse.


We're not kidding when we say the chip was badly marketed. The predictions above were made by IDC. Original image courtesy of Wikipedia

Over the last ten years, Intel has refreshed and updated the Itanium core multiple times. The last significant refresh, Tukwila, was built on a 65nm process with up to four cores and 24MB of L3 cache. With Poulson, Intel is leapfrogging 45nm entirely and moving Itanium to its cutting-edge, 32nm process.

Comments

32nm = WOW! I wonder how small they will get 10 years from now? Anyone wanna guess?

10nm if not smaller... but I wouldn't be surprised if enterprise-wide quantum-light processors hit the market or are just about to hit the market.... Unless 2012 doomsday, or a post-apocalyptic zombie world, or a Google-Skynet-Watson-controlled Earth where the humans are batteries that power the robots after we blacken the sky, etc.

20 something nm is next. I think 22?

They will probably switch to fiber and the manufacturing process will get bigger again, and then start shrinking from there all over again.

And then a switch over to Quantum computers.

Computers are going to start changing drastically. We will all have to relearn a lot of things :-P

As long as I can play Spider Solitaire,..........

As long as I can play Spider Solitaire,..........

and Free Cell! I'm with ya on this one, haha.

Hey coolice, that "google-skynet-watson" scenario might mess up Apple's plan to take it over (the world).

Wow, those forecasters must have been drunk or otherwise inebriated when they wrote that forecast; it was off by a country mile +10!

Intel is actually building a 14nm fab facility, supposed to be done in 2013:

http://www.electronicsweekly.com/Articles/2011/02/19/50530/intel-to-build-14nm-fab.htm

I recently went back and read a thread on CPU predictions from tech forum members in the year 2000. An astonishingly large number of people thought we'd seriously be using things like quantum or light-based computers.

The fact is, scientists in 2010 are still thrilled when someone demonstrates a way to use 5-10 atoms to hold a bit of data. At the same time, Intel's current version of Light Peak (the supposed optical interface) uses copper wire.

 

I'm not knocking LP, but if even Intel hasn't figured out how to deploy the optical interface in a cost-effective manner, we're likely 20 years or more away from the sorts of things most techies like to predict are coming "just down the road."  Furthermore, I don't think we'll see a real push into such areas until conventional manufacturing methods simply run out of room.
