Spectre of a GPU: Intel's Larrabee Still In Development

When Intel announced its plans to develop a discrete graphics card capable of scaling from the consumer market to high-end GPGPU calculations, it was met with a mixture of scorn, disbelief, interest, and curiosity. Unlike the GPUs at SIGGRAPH in 2008 (or any of the current ones, for that matter), Larrabee was a series of in-order x86 cores connected by a high-bandwidth bus. In theory, Larrabee would be more flexible than any GPU from ATI or NVIDIA; Intel predicted its new GPU would begin an industry transition from rasterization to real-time raytracing (RTRT).


Larrabee's original GPU core. A bit of CPU here, a dash of GPU there...

Larrabee parts were supposed to ship in 2010, but last December Intel announced it was canceling its plans for a consumer Larrabee launch and would instead keep the units and development platform in house. This was interpreted in some corners as proof that the project had been killed. New rumors from sources inside Intel suggest that Larrabee isn't as dead as some have thought, and that Santa Clara is quietly working on refining its initial product. The unnamed source told TechEye that while ATI and NVIDIA had certainly caught and surpassed Larrabee's initial performance targets, "there is a whole product plan and roadmap still in place." Intel doesn't appear concerned about the lost time, either. "Intel picked some really big hitters and a lot of those people are still hovering around waiting for Larrabee to come online again."

Itanic or Incredible?

This is tough to call. On the one hand, as TechEye notes, it's never a good idea to bet against Intel. Santa Clara has the funds, technology, and brains to attack a currently unprofitable market segment for a decade or more; it could easily polish and refine Larrabee through several silicon revisions or process technologies before launching the card. NVIDIA has already demonstrated the tremendous scaling power and raw performance of GPGPU solutions in scientific computing; you can bet that Intel doesn't want to see its server processors sidelined into the secondary role of powering the "real" compute centers. Given the long-term growth potential of the market, Intel would be crazy not to stay involved.

On the other hand, there's an armada of questions about Intel's driver support, hardware performance, the need for (and benefits of) RTRT compared to rasterization, and whether or not Intel understands the needs of the market it would be entering. Santa Clara may have hired a team of geniuses, but both ATI and NVIDIA have decades of combined experience in a segment where Intel would be the newcomer.

We fully expect to hear about Larrabee again, but Intel's ability to move the market towards x86 graphics cards and RTRT is still very much in question.