Qualcomm Plans To Launch Extra-Snappy Dual-Core Snapdragon By Q4

This post has 10 Replies | 1 Follower

Top 10 Contributor
Posts 26,110
Points 1,183,840
Joined: Sep 2007
ForumsAdministrator
News Posted: Thu, Aug 19 2010 4:19 PM
When Apple launched the iPad this past April, Qualcomm's Snapdragon QSD8250 was a popular point of comparison. The results, if you were a Qualcomm fan, were a bit disappointing. Clock-for-clock, the 1GHz A4 generally whomped all over the 1GHz Snapdragon. To be fair, this could also be the result of unoptimized software or strict power-conservation settings, but neither factor prevented Qualcomm from looking like the red-headed stepchild.


The previous generation QSD8650. The 8672 promises an additional core and extra rawr.

By Q4 the tables may have turned. In June, Qualcomm shipped its first 1.2GHz dual-core Snapdragon, and it expects to launch a 1.5GHz dual-core QSD8672 built on 45nm technology by the fourth quarter. The new chip purportedly supports 1080p playback (although Qualcomm's own webpage lists 1440x900 as the maximum resolution), supports HSPA+ networks, and offers increased 3D performance. Qualcomm lists up to 80M triangles/sec and 500M 3D pixels/sec as theoretical outputs for the QSD8672, up from 22M triangles/sec and 133M 3D pixels/sec on the QSD8x50.
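Taking Qualcomm's quoted figures at face value, the generational jump works out to roughly 3.6x in triangle throughput and 3.8x in pixel fill. A quick back-of-the-envelope check (these are theoretical peak rates, not benchmarks):

```python
# Back-of-the-envelope check of Qualcomm's quoted peak 3D rates.
# All figures are theoretical maximums from the article, not benchmarks.
OLD = {"triangles/sec": 22e6, "3D pixels/sec": 133e6}   # QSD8x50
NEW = {"triangles/sec": 80e6, "3D pixels/sec": 500e6}   # QSD8672

speedups = {metric: NEW[metric] / OLD[metric] for metric in OLD}
for metric, ratio in speedups.items():
    print(f"{metric}: {ratio:.1f}x faster")
```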

It'll presumably be 2011 before we see devices built around the new processors, but this new flavor of Snapdragon might be capable of challenging the A4's performance while simultaneously delivering a performance-per-watt ratio that continues to make life difficult for the next generation of Atom processors Intel badly wants to see in mobile devices. 
Top 75 Contributor
Posts 1,248
Points 11,580
Joined: Jan 2005
Location: Florida

This is what my phone will be rocking next year. Gotta love the once-a-year updates you get as a Sprint Premier customer.

Smooth Creations LANShark "Blue Flame" + ASUS G73JH-A2 + ASUS EeePC S101H

"I frag therefore I am!"

Top 50 Contributor
Posts 3,102
Points 38,250
Joined: Aug 2003
Location: Texas
acarzt replied on Fri, Aug 20 2010 11:34 AM

Man, these cell phone CPUs seem to be getting faster and faster at an alarming rate. Keep it up!!

Pretty soon I'll be playing Crysis on my cell phone lol

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Fri, Aug 20 2010 1:20 PM

Maybe not Crysis, but Carmack's Rage demo on the iPhone 4 was impressive.

Top 10 Contributor
Posts 4,830
Points 45,790
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Fri, Aug 20 2010 8:48 PM

Yup, it all gets smaller, faster, and less power-hungry, like NAND memory. Imagine what one of these could do running on some of those new Intel 25nm tri-channel memory architectures.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Sat, Aug 21 2010 12:14 PM

Rapid1,

You're talking apples and oranges. As Leonard Nimoy has observed, "Physics is a ***." Die shrinks and advanced power management technologies continue to redefine what's possible on mobile devices, but these processors, including Atom, are explicitly not designed in ways that would allow them to take advantage of such memory bandwidth.

The key to understanding CPU development in this area is remembering that these products are built to an entirely different set of characteristics. Size, heat, and power consumption are tightly constrained--far more so than in even a netbook--while battery capacity is limited by the physical realities of battery technology. That's why you see so much emphasis on die shrinks and power management--despite the difficulty of achieving these improvements, it's still much easier than improving battery life in an economical fashion.

Even if we skip ahead ten years, there are certain desktop/laptop capabilities that will never show up in conventional smartphones. No matter how much tech advances in other respects, battery life will continue to be an issue.

Top 10 Contributor
Posts 4,830
Points 45,790
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Sat, Aug 21 2010 9:17 PM

Your statements are true in one respect, Joel, but it depends more on the timeline. I say this because I'd be willing to bet that five years from now a smartphone, tablet, netbook, etc. will do what a desktop does today. I mean that in terms of processing power; in graphical power I imagine the smartphone alone won't get there. Of course, display technology could also change enough that it would be comparable with a GPU today. The big question is where something like the desktop will be in that same time frame.

Roughly a year before the iPhone was brought onto the market (I remember the Motorola RAZR had just come out and was the big thing then), I predicted to a friend of mine at work that in a few years we would carry a device like a small computer the way we carry a cell phone. Well, the iPhone came out, and now there are comparable devices everywhere with the same capabilities, if not better. So what will we have five years from now?

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
Top 10 Contributor
Posts 4,830
Points 45,790
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Sat, Aug 21 2010 9:23 PM

Oh, and if the desktop CPU, which has been locked at about 3GHz (give or take half a gig) for years now, stays the same while cell phone chips keep moving up as they have been lately, who knows. I know that several manufacturers are working above 1GHz in smartphone processors now; I've heard several have devices at 1.5GHz, and a few are at 2GHz. The Tegra 1 and 2 mobile GPUs, while not matching the capabilities of a desktop or laptop GPU, are generally ahead of Intel or ATI onboard graphics. So who knows what we'll have in our pockets in five years.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
Top 10 Contributor
Posts 4,830
Points 45,790
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Sat, Aug 21 2010 9:37 PM

Just as an addition to what I was saying!

"New DaVinci™ DM37x video processors from Texas Instruments combine 1 GHz ARM® Cortex™-A8® and 800MHz C64x+™ DSP, enabling 720p HD video for media-rich applications

Hardware offers 50 percent increase in ARM performance, 40 percent increase in DSP performance, double the graphics performance with 40 percent less power consumption over OMAP3530 device"

and

"Earlier this week, LG Electronics confirmed plans to use a dual-core processor from NVIDIA in its line of Optimus smartphones. This is a big win for NVIDIA, which has been working hard to expand beyond graphics chips for PCs and enter the smartphone market with its Tegra chip."

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
Top 500 Contributor
Posts 152
Points 1,870
Joined: Jan 2006
Location: Canada, Ontario
imAcpufan replied on Tue, Aug 24 2010 12:49 PM

It's cool to hear Qualcomm plans to bring a faster proc to phones. I've got the 1GHz Snapdragon in my HTC Desire and I find it fast enough for most tasks. However, Samsung's Hummingbird processor really does handle graphics better - I've seen Quake 3 demos running on it, which I find pretty crazy. I'd like to see how the dual-core Snapdragons will compare.

AMD Phenom II 955 BE @ 3.2Ghz - Gigabyte MA790X-UD4P - 4GB Corsair DDR2-1066

BFG GTX 260 - Creative Audigy 2 ZS - CM Cosmos S Case

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Aug 24 2010 10:59 PM

Rapid1,

Your various predictions aren't going to happen. Smartphone microprocessors look the way they do because designers purposefully sacrifice performance to keep die size small and power consumption low. Consider what we've observed in the last 10 years: the general consensus is that a dual-core Atom performs similarly to a Pentium-M Dothan at around 800MHz. The Atom, of course, uses much, much less power. That might seem to strengthen your argument...until you consider that the power consumption of your average smartphone processor is around 0.1W.

Ten years from now, we might see CPUs that perform like today's high-end systems with 1/10th-1/20th the power consumption, but we're almost certainly *not* going to see smartphones drawing 1/100th the power of modern chips in five years. The reason smartphones won't start adopting high-end, performance-enhancing, power-hungry designs is that battery efficiency isn't improving and human hands aren't getting any bigger.
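The power argument here can be sanity-checked with rough arithmetic (the 100W desktop figure below is an illustrative assumption, not from the thread; 0.1W is the smartphone figure cited above). Even the optimistic 1/20th scenario leaves a desktop-class chip drawing 50x a phone's budget:

```python
# Rough power-budget arithmetic. The 100W desktop TDP is an assumed
# ballpark figure; 0.1W is the smartphone figure cited in the post.
desktop_watts = 100.0
smartphone_watts = 0.1

best_case = desktop_watts / 20          # the optimistic 1/20th-power scenario
gap = best_case / smartphone_watts      # still far above a phone's budget
print(f"1/20th of {desktop_watts:.0f}W is {best_case:.0f}W, "
      f"about {gap:.0f}x the {smartphone_watts}W smartphone budget.")
```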

Smartphones are going to get better. I have no doubt that in five years we'll see phones that make today's look like garbage. I'm still absolutely certain that my Q6600 will eat your future cell phone for breakfast.
