Spectre of a GPU: Intel's Larrabee Still In Development

When Intel announced its plans to develop a discrete graphics card capable of scaling from the consumer market to high-end GPGPU calculations, it was met with a mixture of scorn, disbelief, interest, and curiosity. Unlike the GPUs at SIGGRAPH in 2008 (or any of the current ones, for that matter), Larrabee was built around a set of in-order x86 cores connected by a high-bandwidth bus. In theory, Larrabee would be more flexible than any GPU from ATI or NVIDIA; Intel predicted its new GPU would begin an industry transition from rasterization to real-time raytracing (RTRT).


Larrabee's original GPU core. A bit of CPU here, a dash of GPU there...

Larrabee parts were supposed to ship in 2010, but last December Intel announced it was canceling its plans for a consumer Larrabee launch and would instead keep the units and development platform in house. This was interpreted in some corners as proof that the project had been killed. New rumors from sources inside Intel suggest that Larrabee isn't as dead as some had thought and that Santa Clara is quietly working on refining its initial product. The unnamed source told TechEye that while ATI and NVIDIA had certainly caught and surpassed Larrabee's initial performance targets, "there is a whole product plan and roadmap still in place." Intel doesn't appear concerned about the lost time, either. "Intel picked some really big hitters and a lot of those people are still hovering around waiting for Larrabee to come online again."

Itanic or Incredible?

This is tough to call. On the one hand, as TechEye notes, it's never a good idea to bet against Intel. Santa Clara has the funds, technology, and brains to attack a currently unprofitable market segment for a decade or more; it could easily polish and refine Larrabee through several silicon revisions or process technologies before launching the card. NVIDIA has already demonstrated the tremendous scaling power and raw performance of GPGPU solutions in scientific computing; you can bet that Intel doesn't want to see its server processors sidelined into the secondary role of powering the "real" compute centers. Given the long-term growth potential of the market, Intel would be crazy not to stay involved.

On the other hand, there's an armada of questions about Intel's driver support, hardware performance, the need for (and benefits of) RTRT compared to rasterization, and whether or not Intel understands the needs of the market it would be entering. Santa Clara may have hired a team of geniuses, but both ATI and NVIDIA have decades of experience (NVIDIA admittedly has more) in a segment where Intel would be the newcomer.

We fully expect to hear about Larrabee again, but Intel's ability to move the market towards x86 graphics cards and RTRT is still very much in question.

I don't know, it was shuttered once and it may well be again. Too many question marks surrounding Larrabee for ATI or Nvidia to be concerned...or is that Intel's plan?

Also, about rasterization vs raytracing, I was reading an interesting piece about it >> (http://www.alastra.com/~peter/io/raytracing-vs-rasterization.html). Why wouldn't an Nvidia/ATI GPU be able to handle raytracing better than Larrabee?

I don't know on this one. I think Intel learned that it is very difficult to build a GPU. They are the undisputed winner in the CPU arena, but GPUs are a whole other animal. I stand by my statement: if Intel wants to be taken seriously, they are going to have to buy Nvidia. Some say this will never happen; I don't know, but I will stick my neck out and say that for them to be a formidable challenger in the GPU space, it will need to happen. Suddenly AMD does not look so bad after their 5.4 billion dollar acquisition. This will be one to watch to see how they counter Bulldozer.

Remember that one of the reasons why AMD first acquired ATI in '06 was to appeal to Dell and snatch business away from Intel. It was to gain integrated graphics solutions that would compete with Intel's.

I remember there being some semblance of a partnership between Nvidia and Intel back then. We've since realized that Intel doesn't play nice with anyone. Or rather, Paul S. Otellini doesn't play nice with anyone.

Yeah gibbersome, I remember reading about the acquisition at the time and not fully understanding the reasoning, but today it's clear AMD knew the market was headed for a CPU/GPU marriage. I wonder what will happen to Nvidia if they do not get bought out and the courts uphold the ban keeping them out of Intel's chipset business; I do not think VIA can support them.

Wow, that is a good link, gibbersome, and it sheds some light on the differences. Raytracing sounds much more detail oriented. It would seem to add more realism to computer graphics, which makes me wonder why the graphics card makers do not use it. He speaks of how a modern graphics card can do it, so why not?

That sounds about right, la_guy, on both the AMD/ATI thing and your VIA statement. However, I think Nvidia will be fine (remember, I said I think), as they are doing a lot of other stuff now. Sooner or later they will get a current video card released too, I imagine. Although if you remember back to AMD's early hot streak, an Nvidia chipset was the only thing you wanted in a motherboard. So I am sure it has hurt them to some degree.

To me it boils down to this statement: "Santa Clara has the funds, technology, and brains to attack a currently unprofitable market segment for a decade or more." Furthermore, Intel will fiercely defend its reign over the processor market. Processing power is being offloaded to the GPU, and unless Intel competes directly, it risks becoming irrelevant in this market.

The ideal solution would be for Intel to make a play for NVidia but considering the bad blood between them it is unlikely that this would be a friendly merger. That leaves a more likely hostile takeover. Unfortunately, the culture clash that would surely follow makes this approach very unappealing for Intel.

Then there's the issue of fair competition. I suspect regulators would look long and hard at the prospect of a union between Intel and NVidia. I'm not so sure they'd allow it.

So that leaves us with the in house solution and from where I'm sitting, it makes good sense.

I think AMD is currently in a better position to merge CPUs and GPUs. Intel has never made good graphics as far as their integrated solutions are concerned. If Intel could acquire NVidia, they would be in a much better position to make Larrabee work, but I doubt that will ever happen, and even if they tried, such a merger might not be allowed.

Intel will probably steal more of other companies' talent rather than buy NVIDIA. They can throw almost unlimited money at their problems, but that doesn't mean they like to. Swiping a few brains is cheaper than buying whole companies anyway.

They want into the high-end graphics market, so expect to see them there in a few years. Also expect that they'll have something good to go to market with. Most of their technology is considered really good, if not the best.

Gibbersome, bbdl:

RTRT is still too slow for modern gaming; Intel's entire bet was that they could change that with Larrabee. Also, Gib, modern ATI/NV GPUs can handle ray tracing but are built for rasterization. Intel was essentially claiming it could deliver "good enough" rasterization performance and stunning RTRT performance in a one-two combo that would push the market towards RTRT.
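
To make that structural difference concrete, here's a toy Python sketch (my own illustration, not anything from Larrabee or an actual driver): the rasterizer loops over the pixel grid and asks which pixels a single triangle covers, while the ray tracer loops over pixels and asks what each ray hits. The grid size, the shapes, and the orthographic "camera" are all arbitrary simplifications.

# Toy comparison of the two pixel-filling strategies. Everything here
# (grid size, shapes, orthographic rays) is an arbitrary simplification.

WIDTH = HEIGHT = 16

def rasterize_triangle(v0, v1, v2):
    """Rasterization: walk the pixel grid and use edge functions to find coverage."""
    def edge(a, b, p):
        # Signed-area test: which side of edge a->b does point p fall on?
        return (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0])
    grid = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            # Accept either winding order so vertex ordering doesn't matter.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                grid[y][x] = "#"
    return grid

def raytrace_sphere(center, radius):
    """Ray tracing: fire one ray per pixel and intersect it with the scene."""
    grid = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # Orthographic rays fire straight ahead, so the sphere hit test
            # collapses to a point-in-circle check at the pixel center.
            dx, dy = x + 0.5 - center[0], y + 0.5 - center[1]
            if dx * dx + dy * dy <= radius * radius:
                grid[y][x] = "#"
    return grid

if __name__ == "__main__":
    for row in rasterize_triangle((1, 1), (14, 3), (6, 14)):
        print("".join(row))
    print()
    for row in raytrace_sphere((8.0, 8.0), 5.0):
        print("".join(row))

The point isn't the output, it's the loop structure: real rasterizers map that inner loop onto fixed-function hardware very efficiently, while a real ray tracer replaces the point-in-circle check with a branch-heavy search through an acceleration structure, which is exactly the kind of workload Intel argued its general-purpose x86 cores could handle well.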

I can tell you the GPU in my laptop is Intel, and across all platforms...it rocks!

It can handle DCC with relative ease, and gaming as well. If they can step up to meet and exceed the power of some of the ATIs and Nvidias, I believe they could really find their way to producing a very professional GPU. All they would have to do is listen to the people who use workstation and gaming GPUs, and then find a way to make those GPUs take full advantage of being married to their respective CPUs. It could prove to be very valuable to their profit margin.

As far as DCC rendering goes, I know having matching components makes a big difference. It becomes very troublesome if you render with an AMD CPU and an Nvidia GPU, or vice versa.

I have even heard rumors that they are developing cards that do nothing but merge the capabilities of both so that HDRI renders can process all components as one.

Anima,

Are you on crack?

There was supposed to be more in that post, it got cut off. You're the first person I've ever seen who identified Intel GPUs as "rocking" or believed they handled gaming well. Like, ever.

Joel H:

There was supposed to be more in that post, it got cut off. You're the first person I've ever seen who identified Intel GPUs as "rocking" or believed they handled gaming well. Like, ever.

LOL!! I thought he was being sarcastic at first. But if the Intel GPU meets his needs fine, no need for him to upgrade it. But I must disagree that any Intel GPU has ever been "rocking".
