New Analysis Casts Doubt On Intel's Smartphone Performance vs. ARM

A few weeks ago, the analyst firm ABI Research published a report claiming that Intel's new Clover Trail+ platform (essentially a dual-core Medfield) for smartphones was significantly faster and more power efficient than anything ARM's various partners were shipping. If you follow the smartphone market, that was a very surprising claim. Medfield was a decent midrange platform when it launched in 2012, but Intel made it clear that its goal for Medfield was to compete with other platforms in its segment -- not to seize the performance crown outright.

The dual-core, quad-thread Clover Trail+ improves on Medfield's performance, but it's still based on 2008-era microprocessor technology. Qualcomm, Samsung, and Apple, meanwhile, have not stood still. Last year, Cortex-A9 phones were still at the top of the market; now there are multiple devices using A15-class parts. Claiming that the Intel chip is head-and-shoulders above everything else was therefore extremely odd.

Further investigation by other analysts has blown serious holes in the ABI Research report. Not only does it rely on a single, highly questionable benchmark (AnTuTu), but the x86 version of that benchmark runs different code than the ARM builds. Furthermore, the recently released version 3.3 of the test inflates scores on Intel hardware far more than on any of the other platforms.
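
To see why "running different code" matters, consider a minimal, hypothetical C sketch -- it is not AnTuTu's actual source, just an illustration of a common pitfall. If a timed loop's result is never observed, one compiler may legally delete the loop entirely while another executes every iteration, so the resulting "score" reflects the toolchain rather than the silicon.

    /* Hypothetical sketch of how a benchmark score can end up measuring
     * the compiler rather than the CPU. Not real benchmark code. */
    #include <stdio.h>
    #include <time.h>

    #define ITERS (200 * 1000 * 1000L)

    int main(void) {
        volatile long sink = 0;   /* volatile accesses cannot be optimized away */
        long discard = 0;         /* never read after the loop */
        clock_t t0, t1;

        /* Loop A: the result is never read afterwards. An aggressive
         * optimizer may legally delete the entire loop, so one toolchain
         * can report a near-zero time while another does all the work. */
        t0 = clock();
        for (long i = 0; i < ITERS; i++)
            discard += i;
        t1 = clock();
        printf("loop A (result discarded): %.3f s\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC);

        /* Loop B: writing through a volatile sink keeps the work alive,
         * so the timing stays comparable across compilers. */
        t0 = clock();
        for (long i = 0; i < ITERS; i++)
            sink += i;
        t1 = clock();
        printf("loop B (result kept):      %.3f s\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC);

        return 0;
    }

Build the same file with two different optimizing compilers and loop A can report wildly different times even on identical hardware; loop B, whose result is kept live, stays comparable.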

This sort of cherry-picking and suspicious optimization makes ABI Research look foolish and raises questions about whether Intel has returned to its old tricks. As analyst Jim McGregor writes, "The only clear conclusions that should be drawn are that the leading ARM-based processors still have a performance lead over the latest Intel processor."

Unexpected Consequences: SoCs Make Benchmarking More Difficult

One thing we want to highlight, however, is that benchmarking really has gotten more difficult in this brave new era of SoC products. Smartphones and tablets are essentially closed systems. Even laptops are more flexible -- a strict apples-to-apples comparison between notebooks could use identical external displays and the same amount of RAM across multiple systems. There are also only two CPU vendors; an Intel quad-core is an Intel quad-core, regardless of whether it's in a Dell or an HP system.

SoCs are different. Every phone has a different chip, and every chip is configured slightly differently, with various I/O blocks, offload processors, screen technologies, manufacturing nodes, and leakage characteristics. The situation is further complicated by the fact that mobile usage models are still evolving, as are the underlying technologies and capabilities of the devices themselves.

But even with those caveats in place, the ABI Research report is bad science. Single-source performance comparisons almost inevitably are. Even an extremely comprehensive suite of tests like SiSoft Sandra is a starting point for evaluating performance, not a single end-game metric. The ABI Research team undoubtedly knew that -- which raises questions about their objectivity and intent when writing the report.
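
As a concrete illustration of what "a starting point, not a single end-game metric" implies in practice, here is a short, hypothetical C sketch that combines several normalized benchmark scores with a geometric mean so that no single workload dominates the verdict. The test names and numbers are invented for the example and are not tied to any real benchmark's output.

    /* Hypothetical aggregation of several normalized benchmark scores
     * using a geometric mean; the tests and numbers are invented. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* Each score is normalized against a reference device = 1.00 */
        const char  *tests[]  = { "web", "crypto", "physics", "image" };
        const double scores[] = { 1.20,  0.85,     1.05,      0.95 };
        const int n = sizeof(scores) / sizeof(scores[0]);
        double log_sum = 0.0;

        for (int i = 0; i < n; i++) {
            printf("%-8s %.2f\n", tests[i], scores[i]);
            /* Summing logs and exponentiating at the end is the standard
             * way to compute a geometric mean without overflow. */
            log_sum += log(scores[i]);
        }
        printf("geomean  %.2f\n", exp(log_sum / n));
        return 0;
    }

A device that wins one test by a huge margin but trails everywhere else ends up with an unremarkable geometric mean -- which is exactly the kind of signal a single-benchmark comparison throws away.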

In the interest of full disclosure, ARM expressed dissatisfaction with the ABI Research report when we met with the company in late June, and the author of the EETimes article, Jim McGregor, was also at that meeting. In this case, however, the problems with the report are self-evident. ARM's own benchmark data comparing the Cortex-A15 and Cortex-A9 to Atom has to be taken with a substantial grain of salt, but that doesn't change the fact that ABI Research is single-sourcing its conclusions from a benchmark version that boosts x86 performance far more than ARM's.

Clover Trail+ is a good chip, and we expect Bay Trail, when it launches later this year, to be a far better mobile solution. These kinds of shenanigans help no one and serve only to confuse the issue.