
The new functionality will leverage Apple’s own foundation models to interpret user prompts and manage on-device data, while employing Google’s Gemini model, running on Apple’s private cloud infrastructure, to retrieve and distil web-based information. This division of labour is intended to preserve user privacy while enabling richer, more informed responses. Apple and Google have struck a formal agreement to begin testing the Gemini model for this purpose.
Apple’s leadership has increasingly recognised the urgency of catching up in AI. Chief executive Tim Cook has said the company is prepared to increase its investment in both acquisitions and infrastructure to close the gap with competitors. While Apple has acquired several small AI firms this year, the Siri upgrade signals a shift toward leveraging partnerships where necessary to support its ecosystem.
Sources say the Siri overhaul combines three principal modules: a planner that uses Apple’s internal models for context-aware interpretation of requests, a search function whose web queries are summarised by Gemini, and a summariser that synthesises the retrieved information into coherent responses. Apple aims to roll out the AI-enhanced Siri with iOS 26.4, possibly in spring 2026, following the expected unveiling of the iPhone 17 later this year.
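The reported planner/search/summariser split can be caricatured as a simple three-stage pipeline. Everything below (the function names, the routing heuristic, the stubbed search results) is an illustrative assumption for readers, not Apple’s actual implementation.

```python
# Hypothetical sketch of the reported three-module Siri architecture:
# planner -> search -> summariser. All names and logic are illustrative.

def plan(prompt: str) -> dict:
    """Planner: interpret the user prompt and decide whether web data is
    needed (reported to run on Apple's internal foundation models)."""
    needs_web = "latest" in prompt or "news" in prompt  # toy heuristic
    return {"intent": prompt, "needs_web": needs_web}

def search(intent: str) -> list[str]:
    """Search: fetch and condense web results (reported to be handled by
    Gemini on Apple's private cloud). Stubbed with canned snippets here."""
    return [f"snippet about {intent!r} from source {i}" for i in range(2)]

def summarise(intent: str, snippets: list[str]) -> str:
    """Summariser: synthesise everything into one coherent response."""
    if not snippets:
        return f"Answer to {intent!r} from on-device knowledge."
    return f"Answer to {intent!r} drawing on {len(snippets)} web snippets."

def siri_pipeline(prompt: str) -> str:
    step = plan(prompt)
    snippets = search(step["intent"]) if step["needs_web"] else []
    return summarise(step["intent"], snippets)

print(siri_pipeline("latest football scores"))
print(siri_pipeline("set a timer"))
```

The design point the reporting emphasises is the boundary between the stages: only the search stage touches the external model, so on-device data never has to leave the planner and summariser.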
Apple’s latest move marks a departure from reliance on external services such as ChatGPT, which it previously incorporated into Apple Intelligence. This shift strengthens its control over user data and advances its privacy-first vision, even as it embraces partnerships for scalable AI components.
This development arrives amid mounting pressure on Apple’s AI talent. Dozens of researchers, including key figures in the Foundation Models team, have exited for competitors, raising concerns about Apple’s ability to build in-house AI momentum. The Siri revamp provides a pragmatic way to show progress while rebuilding capacity and credibility.