ATI designed the original Wii's 'Hollywood' GPU and is handling the next-generation Wii U's graphics processor as well. AMD made its own announcement in the wake of the Wii U's debut yesterday, claiming that the custom Radeon
HD GPU will enable "high-definition graphics support; rich multimedia acceleration and playback; and multiple display support."
We suspect the new chip's 'multiple display support' refers to the new controller's 6.2" display rather than any multi-screen co-op gaming experiences. It makes sense for Nintendo to go this route--the less work that happens on the controller, the longer its battery life will be.
The move to a "Radeon HD" GPU sheds some light on what we can expect from the console's GPU. Hollywood was reportedly based on the Flipper graphics processor used by the Nintendo GameCube. Both chips are fixed-function products (the R600-derived chip at the heart of the Xbox 360, by contrast, is programmable).
Calling the Wii U's GPU a customized Radeon HD part strongly implies that the next-generation chip will be significantly more powerful than the Wii's current solution. How this impacts game development and overall performance remains to be seen. Nintendo could opt to keep the new chip's clock low in order to guarantee silent running and low power consumption. Historically, the company has chosen to maintain profitable margins and small systems rather than push existing technology to its maximum potential.
In other news, investors weren't pleased
with Nintendo's Wii U unveiling; the stock closed down more than five percent today. Shuji Hosoi, an analyst with Daiwa Securities, said that the decline was due to uncertainty over how Nintendo would compete with the tablet/smartphone gaming market. "People are puzzled whether this will really sell. It would be extremely difficult because the competition is so intense," he said, referring to products such as the iPad from Apple Inc. and other rivals. "People have already changed."
If this sounds like deja vu, it's because we heard the same concerns voiced five years ago. Investors then were worried that Nintendo was betting the farm on a gimmicky motion-controller concept that would alienate core gamers who wanted more than cute sprites and retreads of classic titles.
Gaming on smartphones and tablets may be red-hot for the moment, but there's no hard proof that these markets are siphoning game purchases away from consoles and handhelds. The most popular mobile games are titles that can be played for a few minutes at a time; there's no evidence that a customer who buys Angry Birds is therefore less likely to purchase the next Final Fantasy.
If nothing else, the Wii U has gotten people talking about gameplay and design, just as the Wii did years ago. There's no reason to think the new tablet controllers are more gimmicky than the motion controllers of the past five years; Nintendo has ridden that gimmick all the way to the bank.