Silverthorne: Vista On 600 Milliwatts?

The battle for ultra-low-power, x86-capable CPUs is reaching amazing new heights, or perhaps 'lows' is the more appropriate term.  Just imagine a CPU that fits in a mobile device, can run Vista, and won't burn your hands or drain the battery in about 15 seconds.

The first thought is that we must be talking about something a bit on the slow side, right?  Perhaps, but by slow we're talking about performance in the league of the original Centrino platform.  That's certainly not going to break any desktop records, but for ultra-portable devices, that kind of performance was only a fantasy until Intel announced Silverthorne.

If you've been wondering just how Intel plans to put all that performance into such an efficient package, here's the short answer:

“The 25-mm^2 chip uses a host of power management techniques, including the ability to switch in or out of a new C6 deep-sleep state in just 100 microseconds. Intel redesigned its register files and cache circuits for lower active and standby power, created new I/O power planes and enabled the chip's 533-Mtransfer/second front-side bus with an optional energy-efficient CMOS mode.”
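That 100-microsecond C6 transition matters because it lets the chip spend most of its time asleep without the user noticing. A back-of-the-envelope sketch of the duty-cycle math (all power numbers below are illustrative assumptions, not Intel's published figures):

```python
# Illustrative duty-cycle math: why a fast (100 µs) C6 transition helps.
# The power values are assumptions for illustration, not Intel specs.

active_w = 2.0         # assumed power while the CPU is active, watts
c6_w = 0.1             # assumed power in the C6 deep-sleep state, watts
active_fraction = 0.2  # chip awake 20% of the time, asleep the rest

avg_w = active_fraction * active_w + (1 - active_fraction) * c6_w
print(f"Average power: {avg_w:.2f} W")  # → Average power: 0.48 W
```

The faster the chip can drop into and out of C6, the smaller `active_fraction` can be for the same responsiveness, which drags the average draw down toward the deep-sleep figure.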

The Silverthorne still isn't going to be powering smartphones, but it's a major step in that direction.  If Intel continues to develop this technology, we might see x86-compatible smartphones in only a few years.
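To put the headline 600 mW in perspective, here's a rough battery-life estimate. The battery capacity and rest-of-system draw below are assumed values for illustration; only the 600 mW CPU figure comes from the article:

```python
# Back-of-the-envelope runtime estimate for a ~600 mW CPU.
# battery_wh and rest_of_system_w are assumptions, not measured specs.

battery_wh = 10.0       # assumed small-device battery capacity, watt-hours
cpu_w = 0.6             # the headline 600 mW Silverthorne figure
rest_of_system_w = 2.4  # assumed screen, memory, radios, etc.

total_w = cpu_w + rest_of_system_w
hours = battery_wh / total_w
print(f"Estimated runtime: {hours:.1f} hours")  # → Estimated runtime: 3.3 hours
```

Even with these rough numbers, the CPU stops being the dominant drain; the display and radios become the limiting factor.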

The future does indeed look bright when speaking in terms of power saving. I just wish GPUs were being just as effective. They have gotten better, but nothing compared to Intel's Silverthorne. These power-management-based architectures they're developing are truly great. Also looking forward to the improvement over the next 5 years or so, mostly due to the rise in laptop usage; I'd love to be able to sit and watch several movies before having the juice run out ^_^. Definitely heading down the right path.

Video cards have a long way to go before being energy efficient, seeing as they all have millions and millions of transistors.

Very true, but considering the claims about the upcoming R700 and G100 taking in less power while offering better performance than the previous generation, it's definitely a needed trait. I'm looking forward to multicore GPUs; I'm curious how that will develop.

I don't know how you see the R700 or G100/GT200 as taking less power when they could both be power-hungry monster cards. As far as I know the GT200, which is nVidia's next high-end card after the 9800 GX2, is gonna be a 250W video card, which is more than the hot R600.

Werty: There is a flaw in what you're saying. The GT200 is the 9800 GX2, which is two G92s...NOT a new architecture. They are two 8800 GTS (G92) GPUs on a dual-PCB solution. Obviously dual-GPU cards take more power; look at the 3870 X2: at full load it draws about the same as (maybe a little less than) two 3870s, but at idle it takes a little more power than a single 3870.

As for G100: G100 is the next new architecture. G80 is obviously the architecture that sports the 8800 GTX/Ultra/GTS and the lower-end 8400/8500/8600 lines. G92 is basically G80 shrunk down to a 65nm process, so it's essentially the same, but power drops due to the smaller process node. The GT200 is still G92, not G100. G100 will be the next NEW architecture out of nVidia, and it's said to offer better performance with a lower power draw than an Ultra; rumors state a 15-25% decrease in power consumption. It will be made on a 55nm process.

So power draw is on the verge of dropping; no worries though, you just made a mistake regarding GT200 being a new architecture. A lot of people are confused by this G80/G92 fiasco, but G100 is the next architecture, and GT200 is a codename for the 9800 GX2, which is based on G92 (which is basically G80 shrunk down).
