Apple's Core AI Will Replace Core ML in iOS 27

Apple plans to unveil Core AI at WWDC 2026, a modernized framework replacing Core ML that opens the door to third-party AI models and MCP integration across its entire ecosystem.

Apple is killing Core ML. Bloomberg's Mark Gurman reports that the company will unveil a new developer framework called Core AI at WWDC in June, replacing the machine learning toolkit that has powered on-device intelligence since 2017.

TL;DR

  • Apple will replace Core ML with a new "Core AI" framework at WWDC 2026, launching with iOS 27
  • The framework keeps Core ML's on-device focus but adds integration with third-party AI models and potentially the Model Context Protocol (MCP)
  • Apple Foundation Models trained on Google Gemini will be baked into the new framework
  • Both frameworks may coexist temporarily, but the rebrand signals Apple's full pivot from machine learning to generative AI

How Core ML Got Us Here

Core ML launched at WWDC 2017. It let developers run machine learning models directly on iPhones and iPads without sending data to the cloud - a privacy-first approach that was ahead of its time. Over eight years and three major revisions, the framework grew to support on-device model training (Core ML 3, 2019), custom model creation through Create ML, and eventually generative AI workloads including large language models and diffusion models.

But the name became the problem. "Machine learning" describes a subset of what developers now need. The industry shifted to LLMs, multimodal systems, and agentic architectures. Core ML's API surface didn't keep pace.

What Changed

The gap became obvious when Apple introduced the Foundation Models framework with iOS 26, giving developers access to on-device language models for the first time. Core ML still handled model inference, but the real action - prompt engineering, tool use, structured outputs - lived in a separate framework. Developers had to stitch together multiple APIs to build anything resembling a modern AI feature.

[Image: circuit board with processor and integrated components] Apple's Neural Engine has grown from 600 billion operations per second on the A11 to over 38 trillion on the M4 - hardware that has outpaced the software framework designed to use it.

What Core AI Actually Does

According to Gurman's reporting, Core AI's general purpose remains the same as Core ML's - helping developers integrate AI models into their apps. But the scope is wider. Here is what the framework is expected to deliver.

On-Device Model Execution

Core AI inherits Core ML's core strength: running models locally on the Neural Engine, GPU, and CPU without a network connection. Privacy guarantees remain intact. Your data stays on your device.

Third-Party Model Integration

This is the real shift. Core AI reportedly introduces a standardized way for developers to plug external AI models into their apps. Gurman noted that one possible mechanism is MCP (Model Context Protocol), the open standard originally developed by Anthropic that lets AI systems interact with apps and data sources through a universal interface.

If Apple builds MCP support into Core AI, third-party models like ChatGPT, Gemini, or Claude could perform in-app actions on iPhone, iPad, and Mac - not just answer questions in a chat bubble, but actually manipulate data, write to files, and trigger workflows inside native apps.
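Apple has published no Core AI API, but MCP itself is an open, JSON-RPC 2.0-based specification, so we can sketch the kind of message a third-party model would send to trigger an in-app action. The envelope below follows the public MCP spec's `tools/call` method; the tool name `create_note` and its arguments are hypothetical stand-ins for whatever a native app might expose.

```python
import json

# MCP is a JSON-RPC 2.0 protocol: a host app exposes "tools" that any
# connected model can invoke. This is the standard shape of a tool
# invocation; "create_note" is a made-up tool for a notes app.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",  # MCP's standard method for invoking a tool
    "params": {
        "name": "create_note",  # hypothetical tool name
        "arguments": {"title": "WWDC 2026", "body": "Watch the Core AI session."},
    },
}

wire = json.dumps(request)   # what actually crosses the transport
decoded = json.loads(wire)
print(decoded["method"])     # -> tools/call
```

Because the payload is plain JSON-RPC, any model vendor can speak it - which is exactly what would make an MCP-backed Core AI model-agnostic.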

Apple Foundation Models

The new framework will ship with Apple's own Foundation Models, trained using Google Gemini technology on Apple's Private Cloud Compute infrastructure. These models power the upgraded Siri that Apple has been working toward since iOS 26, and Core AI gives developers the same building blocks.

[Image: developer workspace with multiple screens showing code] Apple's developer ecosystem spans over 34 million registered developers. Core AI aims to give all of them a single entry point for building AI-powered features.

Core ML vs Core AI

| Feature | Core ML (2017-present) | Core AI (expected 2026) |
| --- | --- | --- |
| On-device inference | Yes | Yes |
| Model format | .mlmodel, .mlpackage | TBD (likely backward-compatible) |
| LLM support | Added late (iOS 26) | Native, first-class |
| Third-party model integration | Manual, fragmented | Standardized (possibly MCP) |
| Apple Foundation Models | Separate framework | Built-in |
| Generative AI tools | Limited | Expanded |
| Agentic capabilities | None | Expected |
| Name reflects scope | No | Yes |

Why It Matters Now

The timing is not accidental. Three forces are converging.

The naming problem is real. Every major competitor - Google, Microsoft, Samsung, Meta - talks about "AI." Apple's developer documentation still sends people to the "Machine Learning" section. In a market where perception shapes adoption, Apple was branding itself with a term that sounds like 2018.

MCP is winning. The Model Context Protocol has become the de facto standard for connecting AI models to external tools and data. As we covered in our Xcode 26.3 analysis, Apple has already been moving toward agentic coding patterns. Core AI with MCP support would let any AI model - not just Apple's own - interact with iOS apps through a single, secure interface.
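The flip side of invoking a tool is advertising one. Under the public MCP spec, an app answers a `tools/list` request with a JSON Schema description of each capability it exposes - a sketch of what a native app's "AI surface" could look like if Core AI adopts the protocol. The `create_note` tool here is again hypothetical.

```python
import json

# How an app would advertise a capability to connected models under MCP.
# The response to "tools/list" describes each tool with JSON Schema, so
# any model can discover what actions are available and how to call them.
tool = {
    "name": "create_note",  # hypothetical tool exposed by a notes app
    "description": "Create a note in the app's local store.",
    "inputSchema": {        # MCP uses camelCase "inputSchema" per the spec
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["title"],
    },
}

response = {"jsonrpc": "2.0", "id": 2, "result": {"tools": [tool]}}
print(json.dumps(response, indent=2))
```

Discovery via schema is what makes the interface "single and secure": the model never touches app internals, only the declared tools.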

The ecosystem is massive. Apple has more than two billion active devices worldwide. A standardized AI framework across iOS, iPadOS, macOS, watchOS, and visionOS could be what Gurman described as "a crucial turning point for the entire generative AI industry to truly move towards mass-market application on mobile devices."

"The switch from 'ML' to 'AI' is significant - Apple knows that 'machine learning' is a dated term that no longer resonates with developers or consumers," Mark Gurman wrote in his Power On newsletter.

[Image: developer team collaborating at workstations] The move to Core AI could standardize how millions of third-party developers build AI features across Apple's platforms.

Apple has not confirmed the details. The framework exists only in Gurman's reporting, and specifics - MCP support, model format changes, migration paths for existing Core ML apps - remain unverified. There's a real chance this ends up being mostly a rename with gradual API additions rather than the ground-up rethinking the industry wants.

What is clear is direction. Apple is no longer treating AI as a branch of its machine learning efforts. It's reframing the entire developer platform around it.

WWDC 2026 is expected in June. If Core AI delivers on even half of what these reports suggest, it'll be the most significant change to Apple's AI developer story since Core ML launched nine years ago.

For related coverage, see our guide on what MCP is and how it works, our teardown of Xcode 26.3's agentic coding features, and our analysis of Apple Intelligence and GPT integration in iOS 26.

About the author

Elena, Senior AI Editor & Investigative Journalist, is a technology journalist with over eight years of experience covering artificial intelligence, machine learning, and the startup ecosystem.