These 5 Game-Changing Snapdragon 8 Gen 2 Camera Tricks Are Coming To Android Flagships
At Qualcomm’s recent annual tech summit in Maui, the company introduced the Snapdragon 8 Gen 2, a chip that will soon power the mightiest Android smartphones. Obviously, this flagship mobile platform delivers a wide array of performance and efficiency improvements, plus cool new features like ray tracing in mobile games. But it’s the processor’s new camera tricks that are most likely to matter to consumers looking for a next-generation Android handset.
For those interested in the technical details, the Snapdragon 8 Gen 2’s Spectra ISP (Image Signal Processor) supports 18-bit processing and now includes direct access to the AI engine in the chip’s Hexagon processor, making it what Qualcomm calls a Cognitive ISP. This AI-accelerated camera tech enables impressive real-time image processing features, which – if enabled by phone manufacturers – could make a big difference to your images and the camera experience as a whole.
Keep It Level: Auto Horizon Leveling
The first demo I saw at Qualcomm’s summit pitched a Snapdragon 8 Gen 2 reference prototype against an iPhone 14 Pro. Both handsets were mounted next to each other on a tripod head that could be tilted side to side. Each phone was facing the beach and running its camera app while casting its display to a TV via HDMI. I was invited to move the tripod head and observe the viewfinder on both handsets (mirrored on the TVs above).
While the iPhone 14 Pro behaved like you’d expect – the horizon would tilt in the viewfinder as I moved the tripod head (and both phones) side-to-side – the Snapdragon 8 Gen 2 reference device kept the horizon perfectly level. And this was happening in real time at the ISP level. I could wiggle my fingers in front of the lens and see them move in the viewfinder, and even capture photos and videos, all while the horizon was rock steady.
Now this isn’t completely new. GoPro and other action / 360-degree cameras have offered this feature for a while now, but this is the first time I’ve seen horizon leveling done in real time – in hardware – on a smartphone. I often edit my photos to level the horizon, so I welcome having this feature as an option. Perhaps you will too? It’s going to be handy when shooting video in certain situations, like on a moving boat for example.
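To get a feel for what the ISP is doing, here’s a toy Python sketch of the counter-rotation at the heart of horizon leveling: rotate the frame by the opposite of the measured tilt angle. The real hardware does this per pixel, per frame (and crops the edges); this just shows the core 2-D math for a single point, with made-up numbers:

```python
import math

def level_point(x, y, tilt_deg):
    """Counter-rotate a pixel coordinate by the measured device tilt.
    Rotating the frame by -tilt_deg cancels the phone's tilt, so the
    horizon stays flat in the viewfinder."""
    t = math.radians(-tilt_deg)  # rotate opposite to the device tilt
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# Tilting the phone 10 degrees means counter-rotating the frame 10 degrees.
x, y = level_point(100.0, 0.0, 10.0)
```

A hardware implementation would also need gyroscope input for the tilt angle and a slight zoom-in to hide the rotated frame’s empty corners, which is why leveled output is typically a touch tighter than the raw sensor view.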
Fun With Portraits: Bokeh Engine Improvements
Portrait mode has been available on handsets for years now. This feature typically keeps faces in focus and blurs the background, with varying degrees of success. It tries to simulate the natural bokeh (background blur) and shallow depth-of-field of the larger aperture lenses used with interchangeable lens cameras. Often the effect uses AI to separate faces or objects from the background, or a second lens / depth sensor to create a depth map of the scene.
The more accurate this depth map – and the more layered it is – the better the effect, with objects located further away getting progressively more blurred. Some phones let you adjust the amount of background blur and even show this in the viewfinder in real time, while Nokia and others allow you to change the shape of the bokeh (the shape of the filter used to blur the background) to a heart or star instead of the usual circle.
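To illustrate how a layered depth map drives progressively stronger blur, here’s a toy Python sketch that blurs a one-dimensional row of pixels with a radius derived from each pixel’s depth. The linear falloff and box-average blur are made up for illustration; a real bokeh engine uses far more sophisticated kernels (and that’s before you get to heart-shaped highlights):

```python
def bokeh_1d(pixels, depths, focus_depth, strength=1.0):
    """Toy portrait-mode blur on a 1-D row of pixels.
    Each pixel is averaged over a window whose radius grows with its
    distance from the focus plane, so far-away pixels get blurrier."""
    out = []
    for i, d in enumerate(depths):
        r = int(strength * abs(d - focus_depth))  # blur radius from depth
        lo, hi = max(0, i - r), min(len(pixels), i + r + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Pixels at the focus plane pass through unchanged; pixels two depth
# units behind it are averaged over a 5-pixel window.
sharp = bokeh_1d([0, 0, 100, 0, 0], [1, 1, 1, 1, 1], focus_depth=1)
```

The `strength` parameter here plays the role of the adjustable blur intensity the article mentions, and swapping the box window for a shaped kernel is what changes the look of the bokeh itself.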
While it's a standard feature these days, portrait mode is usually implemented in software. The effect is especially demanding and power-hungry when running in real time in the viewfinder, or when enabled while capturing video (what Apple calls Cinematic Mode). That’s why, with last year’s Snapdragon 8 Gen 1 chip, Qualcomm introduced a hardware bokeh engine that implements portrait mode in real time at the ISP level.
The new Snapdragon 8 Gen 2 mobile platform cranks things up a notch with an updated bokeh engine that lets you adjust the amount of background blur and the shape of the bokeh in hardware. As you can see in the pictures above, a demo using Qualcomm’s Snapdragon 8 Gen 2 reference prototype let me change both the intensity and type of background blur – even when using the front-facing camera in low light. Fun!
Make It Pop: Semantic Segmentation
Automatic scene detection is nothing new. Smartphones and even standalone cameras have offered this feature for years now, with more sophisticated versions using AI to identify scene content – from people to food to sunsets – and adjust overall color, exposure, and sharpness accordingly. More recent handsets add something called semantic segmentation, which breaks an image into multiple areas (think layers in Photoshop).
This AI-based feature can detect people, animals, the sky, plants, buildings, vehicles, the ground (and more), and optimize color, exposure, and sharpness for each area, making images pop. So far, semantic segmentation has mostly been implemented in software, but the Snapdragon 8 Gen 2 can analyze scenes in hardware at the ISP level – with up to 8 separate layers being processed in real time – which opens up new possibilities.
For example, you can now see the benefits of semantic segmentation in the viewfinder in real time, or while recording video. This tech also enables more than just color, exposure, and sharpness adjustments for each layer. Qualcomm’s algorithm can identify and remove blemishes from skin and even reflections in glasses, as you can see above in the Snapdragon 8 Gen 2 reference device demo I captured during the summit.
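As a rough illustration of per-layer processing, here’s a toy Python sketch that applies a different brightness gain to each segmentation layer using a per-pixel label map. The layer IDs and gain values are entirely hypothetical; a real ISP would adjust color and sharpness per layer too, across up to eight layers:

```python
def apply_layer_gains(pixels, labels, gains):
    """Toy semantic-segmentation tune-up: each pixel carries a layer
    label (e.g. 0 = sky, 1 = skin, 2 = foliage -- hypothetical IDs),
    and each layer gets its own brightness gain, clamped to 0-255."""
    return [min(255, round(p * gains[l])) for p, l in zip(pixels, labels)]

# Deepen the sky a little, brighten skin, leave foliage alone.
out = apply_layer_gains([200, 100, 80], [0, 1, 2],
                        {0: 0.9, 1: 1.2, 2: 1.0})
```

The interesting part in hardware is that the label map itself is produced by the AI engine in real time, so these per-layer tweaks show up live in the viewfinder rather than in a post-processing pass.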
Zoom With Ease: 200MP Sensor Support
Earlier this year, Samsung’s Isocell HP1 200MP sensor made its debut on the Motorola Edge 30 Ultra, Xiaomi 12T Pro, and Infinix Zero Ultra. But you can expect to see more Android phones with 200MP cameras in 2023, including Samsung’s own Galaxy S23 Ultra. Alongside the many Snapdragon 8 Gen 2 image processing updates, Qualcomm announced broader support for 200MP sensors, including Samsung’s new Isocell HP3.
What this means for you and me is higher quality digital zoom – especially at 2x and 4x – by using the center 50MP and 12MP areas of the massive 200MP sensor to deliver an almost lossless zoom experience. Think of this as an evolution of the 12MP “optical-quality” 2x zoom found on this year’s iPhone 14 Pro and Google Pixel 7 series, but with an additional 4x step enabled by the large number of pixels.
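The zoom arithmetic is simple enough to sketch: cropping to the sensor’s center for an N-x digital zoom keeps 1/N² of the pixels, which is how a 200MP sensor yields a 50MP crop at 2x and roughly a 12.5MP crop at 4x:

```python
def center_crop_mp(sensor_mp, zoom):
    """Megapixels left after cropping the sensor's center for a given
    digital zoom factor: the usable area shrinks with the square of
    the zoom (2x keeps 1/4 of the pixels, 4x keeps 1/16)."""
    return sensor_mp / (zoom ** 2)
```

So `center_crop_mp(200, 4)` lands just above the 12MP output typically used for a finished photo, which is why the 4x step can still feel close to lossless.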
Video On Steroids: Quad-Exposure Digital Overlap HDR
At the summit, Qualcomm also revealed that the Snapdragon 8 Gen 2 supports a new feature available on Sony’s 50MP IMX989 and IMX800 sensors – which are currently used in the Xiaomi 12S Ultra and Honor 70 handsets, respectively. This new tech, called quad-exposure digital overlap HDR (High Dynamic Range), simultaneously captures four different exposure levels for each video frame, which are then blended together to reduce noise and improve dynamic range.
With this new feature, you can expect to capture higher quality videos without having to resort to using special HDR modes. Qualcomm gave me a demo, and I noticed more balanced highlights and shadows, plus sharper details across the board. Hopefully, we’ll see Sony’s quad-exposure digital overlap HDR-capable sensors paired with the Snapdragon 8 Gen 2 on several new handsets in the coming months, and we’ll be able to test this further.
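As a rough illustration of the blending idea, here’s a toy Python sketch that merges several exposure readings of a single pixel into one radiance estimate, weighting well-exposed mid-tone samples most heavily. A real digital overlap HDR pipeline blends full frames in the ISP, and this weighting scheme is made up purely for illustration:

```python
def merge_exposures(samples, exposure_times):
    """Toy HDR merge for one pixel: each 0-255 sample is normalized by
    its exposure time to estimate scene radiance, then averaged with a
    weight that favors mid-range readings over clipped ones."""
    total, weight_sum = 0.0, 0.0
    for value, t in zip(samples, exposure_times):
        w = 1.0 - abs(value - 127.5) / 127.5  # trust mid-tones most
        total += w * (value / t)
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

# Two exposures of the same scene: the longer one reads twice as bright,
# but both imply the same underlying radiance.
radiance = merge_exposures([100, 200], [1.0, 2.0])
```

The payoff of capturing the exposures simultaneously (rather than sequentially, as older staggered-HDR schemes do) is that moving subjects line up across all four readings, avoiding the ghosting that plagues multi-frame HDR video.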