The Pixel 2 and Pixel 2 XL are both equipped with a custom-designed co-processor that enables advanced image processing and machine learning capabilities. Called "Pixel Visual Core," the chip is one of the selling points of Google's newest phones, but it is not yet enabled. Google originally planned to flip the switch in its Android 8.1 Oreo Developer Preview 1 (DP1), and while that did not happen, developers will be able to try it out with the next preview release.
"If your app uses the camera APIs and you have a Pixel 2 device, you'll be able to try an early version of Pixel Visual Core starting in Developer Preview 2, planned for November 2017. Testing on Developer Preview 1 is not yet supported," Google stated on its Android Developers page.
Photos taken on a Pixel 2 with a third-party app. The picture on the right uses HDR+ on Pixel Visual Core. (Source: Google)
The chip is built into every Pixel 2 and Pixel 2 XL handset. When turned on, it will enable more applications to use the Pixel 2's camera for taking HDR+ quality photos, making it possible to get better pictures of scenes with a large range of brightness levels, from dimly lit landscapes to sunny skies. Users will also be able to take multiple pictures in sequence, with intelligent HDR+ processing occurring in the background.
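The core idea behind burst photography is that combining several short exposures of the same scene reduces noise without blowing out highlights. A minimal sketch of that merge step, assuming the frames are already aligned (the real HDR+ pipeline aligns and merges frames in a far more sophisticated way than the plain averaging shown here):

```python
def merge_burst(frames):
    """Average a burst of already-aligned grayscale frames pixel-by-pixel.

    Averaging N noisy exposures reduces random noise, which is why a
    burst of short exposures can preserve bright skies while still
    recovering shadow detail.
    """
    num_frames = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [
        [sum(f[y][x] for f in frames) / num_frames for x in range(width)]
        for y in range(height)
    ]

# Three noisy 2x2 "frames" of the same scene:
burst = [
    [[100, 198], [52, 250]],
    [[104, 202], [48, 254]],
    [[102, 200], [50, 252]],
]
print(merge_burst(burst))  # -> [[102.0, 200.0], [50.0, 252.0]]
```

Averaging three frames here smooths the per-frame noise toward the true pixel values; the dedicated chip exists so this kind of per-pixel work can happen on every shot without draining the battery.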
"To expand the reach of HDR+, handle the most challenging imaging and ML applications, and deliver lower-latency and even more power-efficient HDR+ processing, we’ve created Pixel Visual Core," Google stated in a separate blog post. Google added that the Pixel Visual Core is its first custom-designed co-processor for consumer products.
The backbone of the Pixel Visual Core is the Google-designed Image Processing Unit (IPU). This is a fully programmable, domain-specific processor that was built from the ground up to deliver maximum performance at low power, Google says. It features eight custom cores, each with 512 arithmetic logic units (ALUs), and delivers raw performance of more than 3 trillion operations per second.
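The quoted figures are easy to sanity-check: eight cores with 512 ALUs each gives 4,096 ALUs, so a clock in the high hundreds of megahertz yields trillions of operations per second. The clock frequency below is an assumption for illustration, not a published spec:

```python
# Back-of-envelope check of the quoted IPU figures.
cores = 8
alus_per_core = 512
total_alus = cores * alus_per_core           # 4,096 ALUs in total

assumed_clock_hz = 800e6                     # hypothetical ~800 MHz clock
ops_per_second = total_alus * assumed_clock_hz

print(f"{total_alus} ALUs x {assumed_clock_hz / 1e6:.0f} MHz "
      f"= {ops_per_second / 1e12:.2f} trillion ops/sec")
# -> 4096 ALUs x 800 MHz = 3.28 trillion ops/sec
```

At one operation per ALU per cycle, an 800 MHz clock already clears the "more than 3 trillion operations per second" figure Google cites.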
"Using Pixel Visual Core, HDR+ can run 5x faster and at less than one-tenth the energy of running on the application processor (AP)," Google claims.
The chip is tightly coupled with software that controls many more details of the hardware than with a typical processor. By handing over more control to the software, Google says it makes the functionality of the chip simpler and more efficient. And to make programming it easier for developers, the IPU leverages domain-specific languages—Halide for image processing and TensorFlow for machine learning. A custom compiler then optimizes the code for the underlying hardware.
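Halide's central idea is to describe *what* to compute separately from *how* to execute it, so a compiler (or programmer) can retarget the execution strategy without touching the algorithm. A toy sketch of that separation in Python (hypothetical code, not the Halide API):

```python
def algorithm(image, x, y):
    # The "algorithm": a pure per-pixel expression (brighten and clamp),
    # with no loop order or execution strategy implied.
    return min(255, image[y][x] * 2)

def schedule_row_major(image, fn):
    # One possible "schedule": evaluate the expression row by row.
    # A compiler targeting the IPU could instead tile, vectorize,
    # or parallelize the same expression across its ALUs.
    height, width = len(image), len(image[0])
    return [[fn(image, x, y) for x in range(width)] for y in range(height)]

img = [[10, 200], [64, 128]]
print(schedule_row_major(img, algorithm))  # -> [[20, 255], [128, 255]]
```

Keeping the per-pixel expression pure is what lets the custom compiler choose an execution plan suited to the hardware, which is the point of pairing a domain-specific language with a domain-specific processor.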