Apple Opens iOS 27 to Claude, Gemini, ChatGPT
Apple's iOS 27 'Extensions' feature lets users swap Claude, Gemini, or ChatGPT into Siri, Writing Tools, and Image Playground - the first time rival AI models can power Apple Intelligence natively.

Apple is turning its AI stack into a platform. Bloomberg's Mark Gurman reported Tuesday that iOS 27 will ship a feature called "Extensions" that lets users route Apple Intelligence queries through third-party AI models - Claude, Gemini, and ChatGPT are the confirmed targets. It's the first time the system has allowed rival models anywhere near Siri, Writing Tools, or Image Playground.
TL;DR
- Apple's "Extensions" in iOS 27 lets users install a Claude or Gemini app, then select it as the AI powering Siri, Writing Tools, and Image Playground
- Google's Gemini stays as the contractually embedded default - Claude, ChatGPT, and others arrive as opt-in additions through the App Store
- Different voices distinguish which model Siri is using during a conversation
- Apple disclaims responsibility for output from third-party models
- Formal details land at WWDC on June 8; iOS 27 ships this fall
How Extensions Actually Works
Apple didn't invent a new concept here. Extensions follow a pattern that's already standard on desktop - the AI equivalent of a browser's default search engine selector. The user installs an app (say, the Anthropic Claude app), that app registers support for the Extensions interface, and the user can then promote it as the active model for one or more Apple Intelligence features.
The integration points disclosed so far are Siri, Writing Tools, and Image Playground. That covers the three places where Apple Intelligence is most visible to everyday users: conversational queries, in-app text editing, and AI image creation.
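The selection flow described above can be sketched as a small per-feature registry. Everything here is hypothetical - the type names, the provider names, and the fallback behavior are guesses at the shape of an API Apple has not announced:

```swift
// Hypothetical sketch of per-feature model selection (all names invented;
// Apple's real Extensions API has not been published).
enum IntelligenceFeature: Hashable {
    case siri, writingTools, imagePlayground
}

struct ModelProvider: Equatable {
    let name: String  // e.g. "Claude", registered by its App Store app
}

struct ExtensionsRegistry {
    // Assumed fallback when the user hasn't chosen a third-party model.
    let systemDefault = ModelProvider(name: "Apple Foundation Models")
    private var selections: [IntelligenceFeature: ModelProvider] = [:]

    // The user promotes an installed provider for one feature at a time.
    mutating func select(_ provider: ModelProvider, for feature: IntelligenceFeature) {
        selections[feature] = provider
    }

    // Features without an explicit choice fall back to the system default.
    func provider(for feature: IntelligenceFeature) -> ModelProvider {
        selections[feature] ?? systemDefault
    }
}

var registry = ExtensionsRegistry()
registry.select(ModelProvider(name: "Claude"), for: .writingTools)
print(registry.provider(for: .writingTools).name)  // Claude
print(registry.provider(for: .siri).name)          // Apple Foundation Models
```

The point of the sketch is the browser-search-engine analogy from above: selection is per feature, and anything the user hasn't overridden keeps routing to the system layer.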
The Gemini Layer Underneath
One distinction worth tracking carefully: Gemini isn't participating as an Extension. According to reporting on Apple's deal with Google, Gemini is embedded contractually at a deeper level, powering the rebuilt Siri backbone under a roughly $1 billion per year arrangement. The Core AI framework that replaces Core ML in iOS 27 will bake in Apple Foundation Models - themselves distilled from Gemini training data.
Claude, ChatGPT, and any other provider that ships Extensions support sit on top of that layer. The distinction matters: even after you select Claude as your "preferred" model, the system's foundation is still Google's.
Voice as a Trust Signal
Apple's solution to "which model is talking right now" is surprisingly practical: different voices. When Siri hands off a query to Claude, the response comes back in a voice distinct from Siri's default. Users don't get a label in small print - they get an audible signal. It's an interesting design choice for voice-first interaction, though it remains to be seen how disorienting users find it when Siri's voice changes partway through a conversation.
Compatibility and Requirements
| Component | Requirement |
|---|---|
| OS version | iOS 27, iPadOS 27, macOS 27 (fall 2026) |
| Provider setup | Install provider app from App Store |
| App Store section | Dedicated "AI Extensions" listing (new) |
| Supported features | Siri, Writing Tools, Image Playground |
| Data routing | Queries go directly to the third-party model |
| Privacy coverage | Apple's policies don't apply to third-party output |
The App Store gatekeeping detail is worth noting. Apple isn't exposing an open API that any developer can call. Providers need an App Store presence and must implement the Extensions interface spec - which won't be public until WWDC on June 8. This is still a controlled marketplace, not open infrastructure.
What Developers Are Looking At
Before WWDC, the implementation details remain behind closed doors. Apple's past extension systems suggest the interface will follow the App Extension model: a declared capability in the app's Info.plist, a handler object that Apple calls when the feature is invoked, and a response returned through a defined schema.
```xml
<!-- Conceptual shape (pre-WWDC speculation): the provider app would
     declare the extension point in its Info.plist -->
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.intelligence.model-provider</string>
    <key>NSExtensionPrincipalClass</key>
    <string>ClaudeIntelligenceProvider</string>
</dict>
<!-- Apple calls the handler with a prompt object; the provider returns a
     structured response, which Apple renders in the Siri / Writing Tools UI -->
```
That's speculation based on how existing App Extensions work - Apple hasn't disclosed the real API. Developers signing up for the beta track will get access at the June 8 session. The actual protocol will determine whether Extensions are truly flexible or whether Apple bakes in constraints that limit what providers can do.
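If the interface does follow the App Extension pattern, the principal class would plausibly conform to a protocol along these lines. Every name below is invented for illustration - the real protocol, types, and schema arrive with the June 8 spec:

```swift
// Hypothetical handler protocol (all names invented; not Apple's real API).
struct IntelligencePrompt {
    let feature: String  // "siri", "writingTools", or "imagePlayground"
    let text: String     // the user's query or selected text
}

struct IntelligenceResponse {
    let text: String
    let attribution: String  // which model answered, for the voice/UI signal
}

protocol IntelligenceModelProvider {
    // Apple would invoke this when the user's selected model is this provider.
    func handle(_ prompt: IntelligencePrompt) throws -> IntelligenceResponse
}

// A provider's principal class would forward the prompt to its own backend.
final class ClaudeIntelligenceProvider: IntelligenceModelProvider {
    func handle(_ prompt: IntelligencePrompt) throws -> IntelligenceResponse {
        // A real implementation would call the provider's API here;
        // this stub just echoes the prompt for shape.
        IntelligenceResponse(text: "(model output for: \(prompt.text))",
                             attribution: "Claude")
    }
}
```

Note the `attribution` field: some channel like it would have to exist for the system to pick the distinct voice that signals a third-party model is answering.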
Apple Intelligence currently powers Siri, Writing Tools, and Image Playground - the three surfaces that Extensions will open to Claude, Gemini, and other third-party models in iOS 27.
Source: macrumors.com
Where It Falls Short
The default layer stays closed. Gemini runs at the system level under contract. Extensions give users a choice layer on top, but the embedded default isn't up for auction. If Apple's own Foundation Models or Gemini produce a bad result before the user's selected model is invoked, Extensions won't help.
App Store as tollbooth. Every AI provider that wants iOS 27 access needs to ship an App Store app and comply with Apple's review process. Apple can reject or restrict any provider at any point. The openness is real but it's conditional, and the conditions are Apple's to set.
Privacy is undefined. Apple's statement that it "isn't responsible for content generated by any of the selected third-party models" is legally clear but technically vague. The routing path for a Writing Tools request that goes to Claude isn't disclosed. Does the full document context travel to Anthropic's servers? Partial context? Only the selected text? The data boundary between Apple's system and the extension provider's API is unknown until Apple publishes the spec.
Voice friction is real. The dual-voice approach for distinguishing Siri from third-party models works in theory. In practice, users who have spent years building the mental model of "this voice = assistant" will find the switch disorienting. It's an engineering solution to a UX problem, and it may need more polish before general release.
No indication of Android parity. This is iOS-specific. Android users selecting Claude or Gemini for their assistant context are already in a different world - both models have direct Android presence without Apple's intermediary layer.
Apple's WWDC 2026 promotional graphic - the glowing ring design is understood to preview the new Siri visual interface coming in iOS 27. Extensions will be formally announced at the June 8 keynote.
Source: macrumors.com
The Bigger Shift
What Apple is building isn't an open AI marketplace - it's a controlled one. Extensions give Anthropic, Google, and OpenAI a seat at the table in iOS, but Apple owns the table. That's a deliberate choice and arguably the right one given the safety and privacy surface area of a 1-billion-device platform.
The strategic effect is significant regardless. Every major AI lab now has a reason to ship a polished iOS app and maintain Extensions support. Apple gets competitive pressure on AI quality without having to win the AI race itself. Users get a meaningful choice they didn't have in iOS 26.
Whether that plays out the way Apple intends depends heavily on the June 8 spec. Extensions built on a well-designed API with clear data contracts could be genuinely useful. Extensions built on a restricted interface that neuters providers' capabilities would be checkbox marketing. Developers will have answers in five weeks.
Sources:
- iOS 27 will let you choose between Gemini, Claude, and more for AI features - 9to5Mac
- iOS 27 Will Let You Pick Claude or Gemini Instead of ChatGPT for Apple Intelligence - MacRumors
- Apple to let users choose between Anthropic, Google, and OpenAI models - Sherwood News
- Apple plans to make iOS 27 a Choose Your Own Adventure of AI models - TechCrunch
