Android Contextual Suggestions: How Google AI Predicts Your Next Move

Google has rolled out an Android feature called Contextual Suggestions that quietly learns your routines and offers the right action at the right moment: think playlists when you reach the gym or casting prompts before the big game. To some Android power users, this might sound all too familiar.

Gradually rolling out on the stable channel to recent Pixel devices (as initially spotted by our friends at 9to5Google), Contextual Suggestions lives in Settings under All services and is enabled by default. Google says it builds predictions from device activity and location inside an encrypted, on-device space, so apps only receive suggestions, not raw data. Early reports confirm the rollout on Pixel 10-series phones and show Google positioning the capability as a system-level nudge that surfaces helpful actions.

For long-time Android tinkerers, the idea is surprisingly familiar: instead of manually wiring triggers and actions (a process not unlike refining an AI prompt), Contextual Suggestions uses AI to infer those links for you. That’s the same user desire IFTTT, Tasker, and MacroDroid have answered for years: give me “when this happens, do that,” though they implement it in different ways. IFTTT chains cloud services through ready-made applets, while MacroDroid runs complex, device-level macros on Android with sensors, constraints, and multiple actions tied to local events. Google’s feature blends those strengths: it’s local, drawing on device sensors and habits, but it surfaces ready-made actions the way cloud applets do, without users authoring rules.
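To make the contrast concrete, here is a minimal, purely hypothetical sketch of the explicit trigger→action model that Tasker- and MacroDroid-style tools expose, where the user authors every rule by hand. None of these names correspond to a real Android or automation-app API; the triggers and actions are illustrative only.

```java
import java.util.List;
import java.util.stream.Collectors;

public class RuleEngine {
    // One user-authored rule: "when this trigger fires, do that action".
    record Rule(String trigger, String action) {}

    private final List<Rule> rules;

    RuleEngine(List<Rule> rules) {
        this.rules = rules;
    }

    // Return every action whose trigger matches the incoming event.
    List<String> onEvent(String event) {
        return rules.stream()
                .filter(r -> r.trigger().equals(event))
                .map(Rule::action)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // The user explicitly wires each trigger to an action up front.
        RuleEngine engine = new RuleEngine(List.of(
                new Rule("arrived_at_gym", "open_workout_playlist"),
                new Rule("game_starts_soon", "prompt_cast_to_tv")));

        System.out.println(engine.onEvent("arrived_at_gym"));
        // prints [open_workout_playlist]
    }
}
```

Contextual Suggestions effectively inverts this: rather than the user writing the `Rule` list, a model infers likely trigger→action pairs from observed habits and offers them as suggestions to accept or dismiss.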

Credit: 9to5Google

From a practical standpoint, there are differences as well. MacroDroid and Tasker give power users explicit control (you define triggers, sequence actions, add delays, and debug flows), which remains critical for bespoke automations and privacy-sensitive workflows. Contextual Suggestions, by contrast, trades granular control for convenience: you get predictive suggestions crafted by a model that watches your patterns, then approve or ignore them. That reduces friction for mainstream users who want automation without the overhead of building logic, but it also means ceding the discovery and initial configuration of automations to Google’s model.

So far, the feature has surfaced primarily on newer Pixels and may require Android 14 or later for some capabilities, and Google hasn’t explicitly confirmed a broad, cross-OEM timeline. Because the learning happens in an encrypted on-device space, Google frames this as privacy-conscious, but the model still relies on device activity and location signals to generate its predictions, so that’s something to weigh.

Main image: Contextual Suggestions shares some similarities with the Pixel 10-exclusive Magic Cue

Aaron Leong

Tech enthusiast, YouTuber, engineer, rock climber, family guy. 'Nuff said.