Apple Is Turning Siri Into a Platform. Users Will Choose Which AI Brain Powers Their iPhone.

Apple will open Siri to rival AI chatbots in iOS 27 via a new Extensions system. Users choose between Gemini, Claude, ChatGPT, and more. Siri becomes a multi-model AI platform across 2.2 billion devices. The model-agnostic era reaches every pocket.

The most closed ecosystem in consumer technology just signalled the most open AI strategy in the industry.

Yesterday, Bloomberg's Mark Gurman reported that Apple plans to open Siri to rival AI assistants in iOS 27. Not just one partner. Not just ChatGPT. Any AI chatbot installed from the App Store will be able to integrate with Siri through a new “Extensions” system.

Users will choose which AI brain handles their queries from the Settings app. Gemini for complex research. Claude for creative coding. ChatGPT for general conversation. Or any other AI service that enables the integration. The user picks. Siri routes. The best AI for each task wins.

This is not an incremental feature update. This is Apple transforming Siri from a standalone assistant into a multi-model AI platform — and it changes the competitive landscape for every company building AI products, enterprise platforms, and customer-facing systems.

What Apple Is Actually Building

The details from Bloomberg paint a clear architectural picture.

Apple is developing an “Extensions” system that allows AI chatbot apps installed via the App Store to integrate directly with Siri. The options will be available in the Apple Intelligence and Siri section of the Settings app, with Apple providing download links for chatbot apps. Users can enable or disable individual AI services, giving them granular control over which chatbots are active.

The system will work across iPhone, iPad, and Mac — unified across Apple's entire device ecosystem through iOS 27, iPadOS 27, and macOS 27. Apple plans to announce the feature at the WWDC 2026 keynote on June 8 and release it in the fall.

Currently, Siri can already hand off questions to ChatGPT through Apple's partnership with OpenAI. When Siri encounters a question it cannot handle, it suggests sending it to ChatGPT. Users can also ask Siri to query ChatGPT directly. The Extensions system expands this from an exclusive partnership to an open marketplace of AI capabilities.

This is happening alongside several other Siri initiatives. Apple is preparing a dedicated Siri app with a new interface. It is unifying Siri with its Spotlight search feature. It is adding new entry points like “Ask Siri” and “Write with Siri” toggles. And it is continuing to develop its own Gemini-powered Siri chatbot — the complete overhaul codenamed “Campos” — as the default intelligence layer.

The Extensions system sits on top of all of this. Apple's own Gemini-powered Siri handles the baseline. Third-party chatbots provide specialised capabilities on demand. The user controls the routing. The platform orchestrates everything.

Why This Is Apple's Biggest Strategic Pivot in a Decade

To understand why this matters, you need to understand what Apple is giving up.

For 15 years, Apple's competitive advantage was the closed ecosystem. Hardware, software, and services — all designed, controlled, and optimised by Apple. The premium users paid for iPhones was a premium for integration: everything works together because one company controls everything.

Opening Siri to rival AI assistants breaks that pattern fundamentally. Apple is saying, explicitly, that no single AI — not even one it controls — is the best answer for every user's needs. Different users want different AI capabilities. Different tasks require different AI strengths. And Apple's job is not to build the best AI for every situation. It is to build the best platform for accessing the best AI for each situation.

This is the platform play. The same strategic logic that made iOS the dominant mobile platform — not by building every app, but by building the best platform for accessing every app — is now being applied to AI assistants.

The business logic also makes sense. Bloomberg noted that expanding Siri to multiple AI chatbots allows Apple to generate revenue from third-party AI subscriptions made through the App Store. Every Claude subscription, every Gemini subscription, every ChatGPT subscription purchased through the Extensions system generates Apple's standard App Store commission.

Apple does not need to win the AI model race. It needs to be the platform where users access AI models — and collect a percentage of every subscription along the way.

The Model-Agnostic Era Is Now Universal

Stand back and look at what happened in March 2026 — the full sequence.

Week one: Google embedded Gemini into every Workspace document. Microsoft built Copilot Cowork on Anthropic's Claude and launched Agent 365 as a multi-model orchestration platform. Microsoft's VP said it plainly: “Every 60 days, there's a new king of the hill.”

Week two: Nvidia revealed a $1 trillion infrastructure roadmap with NemoClaw as a hardware-agnostic enterprise agent platform. The Nemotron 4 coalition united competitors (Perplexity, Mistral, Cursor) to build an open frontier model together. Nvidia's survey confirmed 88% of enterprises reporting AI-driven revenue gains, with 85% saying open-source models matter to their strategy.

Week three: Apple shipped Gemini-powered Siri to 2.2 billion devices via iOS 26.4. Anthropic invested $100 million in the Claude Partner Network and launched an enterprise marketplace.

Week four: Apple announces it will turn Siri into a multi-model platform where users choose their preferred AI.

Every major technology company — Google, Microsoft, Nvidia, Apple, Anthropic — has independently reached the same architectural conclusion within a single month. No single AI model is best for every task. The platform that orchestrates multiple models, lets users or systems select the best one for each task, and applies consistent governance across all of them wins.

This is no longer a trend. It is not a prediction. It is the settled architecture of the AI industry, confirmed by every company with the resources and market position to have chosen differently.

What the Extensions Architecture Tells Enterprise Leaders

Apple's Extensions system is a consumer product, but the architectural principle it embodies applies directly to enterprise AI.

The Extensions model works like this: a platform layer (Siri) sits above multiple AI providers. The user (or the system) selects which provider handles each request. The platform maintains context, identity, and governance regardless of which AI runs underneath. New providers can be added without changing the platform. Existing providers can be swapped without disrupting the user experience.
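The pattern described above can be pictured as a small provider registry: the platform keeps a catalogue of AI services, lets the user enable, disable, and select each one, and routes every request through itself. A minimal sketch follows; all names and the API shape are hypothetical, since Apple has not published the actual Extensions interface.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional

# Hypothetical sketch of an Extensions-style registry. The platform layer owns
# the catalogue, the user toggles, and the routing; providers only answer.

@dataclass
class ExtensionsRegistry:
    providers: Dict[str, Callable[[str], str]] = field(default_factory=dict)
    enabled: Dict[str, bool] = field(default_factory=dict)
    selected: Optional[str] = None

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        # New providers plug in without any change to the platform itself.
        self.providers[name] = handler
        self.enabled[name] = True

    def toggle(self, name: str, on: bool) -> None:
        # The equivalent of the user's switch in Settings.
        self.enabled[name] = on

    def select(self, name: str) -> None:
        if not self.enabled.get(name):
            raise ValueError(f"{name} is not enabled")
        self.selected = name

    def ask(self, query: str) -> str:
        # The platform routes; the selected provider handles the request.
        return self.providers[self.selected](query)

registry = ExtensionsRegistry()
registry.register("gemini", lambda q: f"[gemini] {q}")
registry.register("claude", lambda q: f"[claude] {q}")
registry.select("claude")
print(registry.ask("draft a function"))  # → [claude] draft a function
```

Swapping providers is a single `select` call, which is the property the article attributes to the Extensions design: existing providers can be replaced without disturbing anything above the routing layer.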

This is precisely the architecture that defines enterprise AI orchestration. An enterprise platform sits above multiple AI models. The orchestration layer selects which model handles each task based on performance, cost, compliance, and data sensitivity requirements. The governance framework — audit trails, access controls, data residency, compliance reporting — applies consistently regardless of which model runs underneath. New models can be integrated without changing enterprise workflows. Better models automatically improve enterprise operations.
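The enterprise version of the same idea adds a policy step before routing: governance constraints filter the candidate models first, optimisation (here, simply cost) picks within the compliant set, and the audit trail is written by the platform regardless of which model runs. A hedged sketch, with invented model names and a deliberately simplified policy:

```python
# Hypothetical sketch of policy-driven model selection in an orchestration
# layer. Residency acts as a hard governance filter; cost is the soft
# optimisation criterion; the audit log is platform-owned and model-agnostic.

audit_log = []

MODELS = {
    "onprem-model":  {"residency": "eu", "cost": 3},
    "cloud-model-a": {"residency": "us", "cost": 1},
    "cloud-model-b": {"residency": "us", "cost": 2},
}

def select_model(task: dict) -> str:
    # Governance first: a task tagged "eu" may only run on EU-resident models.
    # A task tagged "any" is unconstrained.
    candidates = {
        name: spec for name, spec in MODELS.items()
        if task["residency"] in (spec["residency"], "any")
    }
    # Then optimise within the compliant set, here by cost alone.
    chosen = min(candidates, key=lambda name: candidates[name]["cost"])
    # The audit trail applies consistently, whichever model was chosen.
    audit_log.append({"task": task["id"], "model": chosen})
    return chosen

print(select_model({"id": 1, "residency": "eu"}))   # → onprem-model
print(select_model({"id": 2, "residency": "any"}))  # → cloud-model-a
```

Adding a new model is one dictionary entry; no workflow above `select_model` changes, which mirrors the claim that better models automatically improve operations once the orchestration layer exists.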

Apple just validated this architecture for consumers across 2.2 billion devices. Enterprise leaders who have been debating whether to commit to a single AI provider or build for model flexibility now have their answer from the world's most premium consumer technology company: build for flexibility. Build the platform. Let the models compete. Let the best one win for each task.

The enterprises that designed their architecture this way — with orchestration layers that route tasks to the best available model, governance frameworks that apply consistently across providers, and deployment flexibility that adapts as the model landscape evolves every 60 days — are the ones positioned to capture value from every improvement, from every provider, automatically.

The enterprises locked to a single provider will watch their competitors benefit from every new model release while they wait for their one provider to catch up.

What This Means for Voice AI Specifically

Apple's move carries specific implications for every company building voice-based customer interactions.

When 2.2 billion device owners experience a Siri that lets them choose the best AI for each query — routing research questions to one model, creative tasks to another, and code generation to a third — their expectations for every voice interaction change permanently.

Enterprise voice systems that offer a single, rigid AI backend will feel primitive by comparison. Customers who are accustomed to choosing their AI brain on their personal devices will not accept being forced into a single-model experience when they call a business, interact with a support chatbot, or engage with an enterprise voice assistant.

The implication for enterprise voice AI is clear: build for multi-model. Design voice systems where the orchestration layer can route different types of interactions to different AI models based on what each interaction requires. Customer support queries that require empathy and contextual understanding route to models that excel at conversational intelligence. Technical queries route to models that excel at reasoning and accuracy. Multilingual queries route to models with the strongest capability in the customer's language.
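The routing described in the paragraph above can be sketched as a classify-then-dispatch step. Everything here is illustrative: the model names are placeholders, and a production classifier would be a model itself rather than keyword matching.

```python
# Hypothetical sketch of multi-model routing for a voice AI system:
# classify the interaction type, then dispatch to the model suited to it.

ROUTES = {
    "technical":    "reasoning-model",      # accuracy-critical queries
    "multilingual": "polyglot-model",       # strongest language coverage
    "empathy":      "conversational-model", # support tone and context
}

def classify(utterance: str) -> str:
    text = utterance.lower()
    # Toy heuristic standing in for a real intent classifier.
    if any(word in text for word in ("error", "configure", "api")):
        return "technical"
    if not text.isascii():
        return "multilingual"
    return "empathy"

def route(utterance: str) -> str:
    return ROUTES[classify(utterance)]

print(route("My API call returns an error"))  # → reasoning-model
print(route("I'm upset about my bill"))       # → conversational-model
```

The point is the shape, not the heuristic: once routing is a separate layer, swapping in a better model for any interaction type is a configuration change, not a rebuild.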

The voice AI systems that match the flexibility Apple is building into Siri will meet the expectations of customers who interact with that flexibility every day on their personal devices.

The xAI Lawsuit and the Platform Stakes

There is a footnote to the Bloomberg report that reveals the competitive intensity behind Apple's decision. Elon Musk's xAI startup has sued Apple and OpenAI, accusing the two companies of conspiring to “ensure their continued dominance” in the AI market through their exclusive Siri-ChatGPT partnership.

Apple's response — opening Siri to every AI provider rather than defending exclusivity — is strategically elegant. By making Siri a platform for all AI assistants, Apple eliminates the antitrust argument entirely. There is no exclusive arrangement to challenge if every AI provider has equal access to Siri integration.

But the move also raises the competitive stakes for every AI provider. In a world where Siri users can choose between Gemini, Claude, ChatGPT, Grok, and any other AI chatbot, the models must compete on merit — on the quality of their responses, the accuracy of their reasoning, the naturalness of their conversation, and the usefulness of their capabilities for each specific task.

For AI companies, the Siri Extensions system creates both opportunity and pressure. The opportunity is access to 2.2 billion devices. The pressure is that users can switch to a competing model with a toggle in Settings. Customer retention becomes a function of quality, not distribution lock-in.

This competitive dynamic — where models compete on merit within a common platform — is exactly what drives the model improvement cycle that benefits everyone. Every model provider has an incentive to improve because users can switch instantly. Every improvement benefits the platform's users because they have access to every provider. The platform owner (Apple) benefits because better models attract more users and more subscription revenue.

Enterprise AI platforms that adopt the same architecture — letting models compete within an orchestration framework, routing each task to the best option, and making it easy to adopt new models as they improve — capture the same competitive dynamic. The orchestration layer benefits from every model improvement. The enterprise benefits from every improvement the orchestration layer captures.

The Architecture Question Is Answered

For three years, the enterprise AI world debated the right architecture. Single model or multi-model? Closed ecosystem or open platform? Lock in to one provider or build for flexibility?

March 2026 answered the question. Not with a whitepaper or a conference keynote, but with the actual product decisions of five of the most powerful technology companies on Earth.

Google chose multi-model for Workspace. Microsoft chose multi-model for Copilot. Nvidia chose hardware-agnostic for NemoClaw. Anthropic built a partner ecosystem around Claude while enabling its use within Microsoft's multi-model platform. And Apple — the company that defined the closed ecosystem — chose to open Siri to every AI provider in the industry.

When Apple goes multi-model, the debate is over.

The platform wins. The orchestration layer wins. The architecture that routes each task to the best available model, governs consistently across providers, and captures value from every improvement wins.

Every enterprise building AI systems in 2026 should design for this reality. Not because one company said so, but because all of them did — independently, within a single month, by shipping products that embody the same architectural principle.

The model-agnostic era is not approaching. It arrived. In data centres. In enterprise platforms. In the world's most premium consumer devices. And now, through the Settings app on your iPhone.

“Apple spent 15 years building the most closed ecosystem in technology. Yesterday, it announced that Siri will become the most open AI platform in the industry — letting users choose between Gemini, Claude, ChatGPT, and any other AI through a new Extensions system in iOS 27. When the company that defined the closed ecosystem goes multi-model, the architecture debate is settled. Not by argument. By the product decisions of every major technology company on Earth, within a single month.”

What to Watch

WWDC 2026 on June 8. Apple will announce iOS 27 and the Extensions system. Watch for the specific architecture of how third-party AI integrates with Siri — the APIs, the data access policies, the user controls, and the revenue-sharing model. These details will shape how AI companies compete on Apple's platform for years.

Which AI companies enable Extensions first. Speed of adoption signals competitive confidence. The companies that integrate with Siri Extensions at launch believe their models can win in a direct comparison. The ones that delay may be less confident about head-to-head competition.

Enterprise voice AI architecture decisions. Apple's move accelerates the timeline for enterprise voice systems to go multi-model. Enterprises designing voice AI deployments in 2026 should build for model flexibility from day one — because the consumer baseline just moved to multi-model, and enterprise systems that cannot match it will feel outdated immediately.

The subscription economics. If Apple takes its standard App Store commission on AI subscriptions accessed through Siri Extensions, the economics of AI distribution change significantly. AI companies that currently acquire customers directly will need to factor Apple's commission into their pricing and customer acquisition strategy.

March 2026 started with Google embedding AI into Workspace. It ends with Apple turning the iPhone into a multi-model AI platform. In between, every major technology company confirmed the same principle: the future of AI is not one model. It is the platform that orchestrates all of them.