Apple Is Reportedly Engineering An LLM For iPhone As Its Next Big Thing
As part of this effort, the company has been working on its own large language model (LLM). According to Gurman, "all indications suggest that it will be entirely on-device." There are several advantages to this design decision. The first is faster response times: because the model doesn't have to reach out to the cloud for information as often, some of the more awkward pauses that come with cloud-based AI are eliminated. Running locally also means Apple retains more control over privacy, which is a key selling point of the iPhone.

However, since this is Apple's first real jump into AI, users should keep their expectations in check. Gurman says that Apple is still behind larger players such as OpenAI and Google, and that it's unlikely Apple's first outing will be better than what those companies provide. Apple may also end up leaning on some of these providers to ensure a competent experience for its users.
Expect Apple to market these new features differently, though. Instead of focusing on generative AI itself, the company is expected to tout the ways AI can help with the everyday tasks users are already doing on their iPhones. All indications point to the 2024 Worldwide Developers Conference in June as the kick-off event for AI on the iPhone.
With few ways left to innovate on the physical design of phones, it makes sense that Apple wants to integrate AI into its products. However, given how Siri has been handled, there is reason to doubt how well these efforts will go. Hopefully iPhone users won't be disappointed again.