Apple is reportedly preparing a substantial shift in how Siri works, moving the long-running digital assistant closer to a full chatbot experience. According to a new report, the change will arrive in stages, beginning this fall and extending into next year, marking one of the most significant revisions to Siri since its introduction more than a decade ago.
The effort reflects a broader reassessment of Apple’s approach to artificial intelligence. Rather than positioning Siri as a simple voice-controlled helper, Apple appears to be rebuilding the assistant as a conversational system capable of handling more complex requests. The transition is expected to begin with an update later this year that quietly replaces Siri’s underlying AI models, even if the outward interface initially remains familiar.
This interim update is expected to rely on version 10 of the Apple Foundation Models, which is reportedly based on Google Gemini technology. That system is said to support web searches, better contextual awareness, and a deeper understanding of what is currently on a user's screen. Apple has previously promised features such as app-level actions, screen reading, and personal context awareness, and this update is expected to finally deliver much of that functionality.
The more visible change is expected with the next major operating system cycle. With the launch of iOS 27 and corresponding updates across Apple's platforms, Siri is expected to adopt a chatbot-style interface. Instead of issuing brief commands and receiving single responses, users would be able to hold extended conversations, ask follow-up questions, generate text or images, and analyze files in a way that more closely resembles tools like ChatGPT or Gemini.
Unlike standalone chatbots, Siri would draw its advantage from its position inside Apple's ecosystem. The assistant is expected to recognize open apps and windows, allowing it to suggest actions or carry out tasks based on what the user is currently doing. It would also continue handling system-level commands, such as device settings and smart home controls, which third-party chatbots typically cannot manage.
Deeper integration with Apple’s own apps is also a key focus. Siri is expected to work more closely with Mail, Photos, Music, and other built-in services, enabling more precise and multi-step commands. These capabilities would be especially useful for users managing large photo libraries or complex workflows on macOS.
Privacy remains a central concern. Apple is reportedly debating how much long-term memory Siri should have, as persistent memory raises questions about data storage and profiling. For now, Apple is expected to rely on its Private Cloud Compute infrastructure, keeping processing within its own controlled environment rather than fully offloading tasks to external servers.
The report comes from Bloomberg's Mark Gurman and suggests Apple is designing Siri's new architecture to be modular. This would allow the company to swap underlying AI models over time, whether for performance reasons, regulatory requirements, or a future shift to fully in-house systems.
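That kind of modularity maps onto a familiar software pattern: hide the model behind a stable interface so that backends can be exchanged without rewriting the assistant around them. The Swift sketch below is a generic illustration of that pattern only; the protocol and type names are invented for this example and do not describe Apple's actual implementation.

```swift
import Foundation

// Hypothetical sketch: all names here are invented for illustration
// and do not reflect any real Apple API.

// A stable interface that any underlying AI model must satisfy.
protocol AssistantModel {
    var name: String { get }
    func respond(to prompt: String) async -> String
}

// One possible backend, standing in for a hosted third-party model.
struct RemoteModel: AssistantModel {
    let name = "remote-foundation-v10"
    func respond(to prompt: String) async -> String {
        // A real implementation would call out to a hosted service here.
        "[\(name)] response to: \(prompt)"
    }
}

// Another backend, standing in for a fully in-house replacement.
struct InHouseModel: AssistantModel {
    let name = "in-house-v1"
    func respond(to prompt: String) async -> String {
        "[\(name)] response to: \(prompt)"
    }
}

// The assistant depends only on the protocol, so the backing model
// can be replaced at runtime without touching the rest of the system.
final class Assistant {
    private var model: AssistantModel
    init(model: AssistantModel) { self.model = model }

    func swapModel(to newModel: AssistantModel) {
        model = newModel
    }

    func handle(_ prompt: String) async -> String {
        await model.respond(to: prompt)
    }
}

// Usage: start with one backend, then swap in another mid-session.
let assistant = Assistant(model: RemoteModel())
print(await assistant.handle("What's on my screen?"))
assistant.swapModel(to: InHouseModel())
print(await assistant.handle("Dim the living room lights."))
```

The payoff of this indirection is what the report describes: a performance-driven or regulatory model change becomes a contained swap rather than a rewrite of the assistant itself.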
If these changes arrive as described, Siri’s evolution will be less about novelty and more about closing long-standing gaps. The real test will be whether Apple can deliver a chatbot experience that feels reliable, restrained, and genuinely useful, rather than simply reactive to industry pressure.
