Apple A8X iPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch

When Apple debuted its A8 SoC, it proved to be a modest tweak of the original A7. Despite packing double the transistors and an improved GPU, the heart of the A8 is the same dual-core Apple "Cyclone" CPU, tweaked to run at higher clock speeds and paired with stronger overall GPU performance. Given this, many expected that the Apple A8X would be cut from similar cloth -- a higher clock speed, perhaps, and a larger GPU, but not much more than that.

It appears those projections were wrong. The new A8X chip inside Apple's new iPad Air 2 is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB) and 2GB of external DDR3. It also uses an internal metal heatspreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance.

According to reviews, the new A8X is a significant powerhouse across multiple types of workloads; PhoneArena claims it's the top-performing mobile device on Geekbench by a wide margin.


Graph, credit: PhoneArena

Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4fps to 31fps. The onscreen test favors the Nvidia device thanks to its lower-resolution screen, and the Shield also takes 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659. Whether this reflects an early driver state or simply means that 3DMark is well suited to the particular capabilities of Nvidia's Tegra K1 is unclear. The latter would make sense -- 3DMark's developer, Futuremark, is primarily a PC benchmark company, and it may simply have more expertise in optimizing code for Nvidia GPUs as opposed to Apple's PowerVR solution.
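To put those margins in perspective, a quick back-of-envelope calculation using the scores quoted above (the figures are from the article; the helper function is just for illustration) shows how lopsided the two results are:

```python
# Relative performance deltas from the benchmark figures quoted above.
# Scores are as reported in the article; the helper is illustrative.

def pct_lead(winner, loser):
    """Percent by which the winning score leads the losing one."""
    return (winner - loser) / loser * 100

# GFXBench Manhattan offscreen (frames per second): iPad Air 2 vs. Shield
manhattan_lead = pct_lead(32.4, 31.0)

# 3DMark Ice Storm Unlimited (points): Shield vs. iPad Air 2
icestorm_lead = pct_lead(30970, 21659)

print(f"iPad Air 2 leads in Manhattan by {manhattan_lead:.1f}%")   # ~4.5%
print(f"Shield leads in Ice Storm by {icestorm_lead:.1f}%")        # ~43.0%
```

A roughly 4.5% edge in one test against a roughly 43% deficit in the other is exactly the kind of split that suggests software, not hardware, is the differentiator.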



Early word on the Metal API, which the A8X supports, is inconclusive -- it's impossible to separate the performance improvements that come from better GPU hardware and higher clock speeds from those specifically granted by the more robust API.

Will Apple Build An A8X MacBook?

Whenever Apple releases a new processor, someone inevitably floats the idea that this will be the chip that moves Apple's entire lineup to ARM. The answer, for the record, is and continues to be "No." But let's look at why. Performance has nothing to do with it -- I have no doubt that Apple could build a much more powerful version of the A8X in a 15-20W power envelope, compared with the 4-8W typical of a tablet -- possibly enough to close the gap between the A8X and the Haswell chips inside the MacBook Air. If performance can be equalized, why not go ARM?

The answer: Because performance is just one small factor out of many.

Apple has made major architectural shifts several times in its history. It went from 68k Motorola processors to PowerPC, and it went from the G5 to Intel-based x86 chips. In both cases, it waited to make the jump until it was absolutely clear that the new architecture offered huge performance improvements in the same power envelope and that its current solution was unlikely to ever bridge the gap.

Waiting to move until the timing was right blunted the (sizeable) amount of criticism from irate users who were unhappy at being told they'd have to upgrade systems before they might wish to. It also helped Apple sell the new systems. Shifting to an x86 Mac didn't just buy you equivalent performance, it bought you 1.5x - 3x the performance you typically got from previous Mac hardware. This was particularly true in mobile, where the company's single-core G4 processors were stuck well below 2GHz.

Could Apple build an ARM-based MacBook? Undoubtedly. Could that MacBook offer performance equal to that of an x86 chip? Quite possibly. But for Apple, "just as good" isn't good enough. Should the gap between ARM and x86 begin to open in the other direction, with ARM chips offering 25-40% superior performance per watt in traditional testing -- then we'll see Intel start to sweat.

Via:  PhoneArena