OpenAI Moves to AWS One Day After Microsoft Exclusivity Ends

One day after Microsoft's exclusive license ended, OpenAI launched GPT-5.4, Codex, and jointly built Managed Agents on Amazon Bedrock - all in limited preview.


OpenAI spent April 27 legally separated from Microsoft's cloud lock-in. It spent April 28 on stage at a San Francisco hotel announcing three products for Amazon Web Services.

AWS CEO Matt Garman and OpenAI chief revenue officer Denise Dresser shared a stage at Amazon's invite-only "What's Next with AWS" event to unveil three Bedrock additions in limited preview: GPT-5.4, Codex, and a jointly built Managed Agents platform. OpenAI CEO Sam Altman joined via recorded video - his in-person calendar that day was occupied by Elon Musk's lawsuit against him in an Oakland courthouse 30 miles away.

TL;DR

  • GPT-5.4 is in preview on Amazon Bedrock today; GPT-5.5 lands "within weeks" per Garman
  • Codex, with 4M+ weekly users, now supports AWS credential auth - usage applies toward existing AWS cloud commitments
  • Amazon Bedrock Managed Agents, built on AgentCore with OpenAI's harness, runs fully inside the customer's VPC
  • Microsoft's exclusive commercial license on OpenAI products formally ended April 27 - AWS launched the next day
  • No pricing, no general availability dates; all three are limited preview

AWS CEO Matt Garman and OpenAI CRO Denise Dresser at the "What's Next with AWS" event in San Francisco on April 28, 2026. Source: techxplore.com

What Ended the Day Before

Microsoft's original 2019 deal granted it an exclusive commercial license to OpenAI's products. That exclusivity was progressively loosened through 2025 negotiations and formally wrapped on April 27, 2026, as part of a restructuring that also ended a revenue-sharing arrangement between the two companies. Microsoft keeps its position as OpenAI's primary cloud provider - most of OpenAI's training and inference still runs on Azure. But the new arrangement lets OpenAI distribute its models and tools across any cloud through 2032.

AWS is the first to receive them. Dresser put the motivation plainly: the Microsoft partnership had "limited our ability to meet enterprises where they are."

The partnership wasn't improvised overnight. Garman said it's been "in motion for the last six to nine months," which puts the serious discussions around September 2025 - roughly when the earlier AWS investment deal took shape. Under that arrangement, AWS agreed to invest up to $35 billion in OpenAI tied to a requirement that OpenAI eventually deploy two gigawatts of Amazon Trainium accelerators. The Bedrock products are the commercial output of that infrastructure bet.

Three Things on Bedrock Now

GPT-5.4 and the GPT-5.5 Preview

GPT-5.4 is available today through Bedrock's standard Converse API. The model shows up alongside other Bedrock models - same IAM roles, same PrivateLink endpoints, same CloudTrail audit trail. Customers with existing Bedrock access don't need new accounts or separate OpenAI API keys.

GPT-5.5, which brought improved long-context reasoning when it launched earlier this year, is on deck. Garman said it would arrive on Bedrock "within weeks." A boto3 call looks the same as for any other model on the platform:

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="openai.gpt-5-4",   # model ID format subject to change at GA
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize this support ticket backlog."}]
        }
    ],
    inferenceConfig={"maxTokens": 2048, "temperature": 0.3}
)

print(response["output"]["message"]["content"][0]["text"])

IAM policies, VPC routing, and KMS encryption that teams already have configured for Bedrock apply automatically.
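Teams scoping that access down often want a role that can invoke only the new model. A minimal sketch of such a least-privilege policy, assuming the model ARN follows Bedrock's usual foundation-model format (the `openai.gpt-5-4` ID and ARN are assumptions pending GA):

```python
import json

# Hypothetical least-privilege policy: allow invoking only GPT-5.4 on
# Bedrock. The model ID and ARN format are assumptions subject to change
# at general availability.
gpt54_invoke_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InvokeGpt54Only",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
            ],
            "Resource": "arn:aws:bedrock:us-east-1::foundation-model/openai.gpt-5-4",
        }
    ],
}

print(json.dumps(gpt54_invoke_policy, indent=2))
```

Attached to the role the Bedrock client already assumes, this requires no application code changes.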

Codex

Codex, OpenAI's coding agent, wires into Bedrock differently from the API models. Authentication runs through AWS credentials rather than OpenAI API keys. Customer codebases don't reach OpenAI's training pipeline - the agent's inference stays inside Bedrock infrastructure. For security teams that have blocked direct OpenAI API access by policy, this is the gap-filler.

The same CLI, desktop app, and VS Code extension that over 4 million developers use weekly now accept AWS credential auth as an alternative. One operational detail worth flagging: Codex usage through Bedrock counts toward existing AWS cloud spend commitments. For large enterprise contracts structured around AWS Reserved Capacity or Enterprise Discount Programs, that's a meaningful incentive to route Codex through Bedrock rather than openai.com.

Managed Agents

Amazon Bedrock Managed Agents, powered by OpenAI, is the joint product built on Amazon's AgentCore platform. It combines OpenAI's agent harness - the reasoning loop, tool use, and task orchestration layer - with Bedrock's governance stack: IAM for access control, guardrails for content filtering, CloudTrail for audit logs.

Altman described the product direction: "The next phase of AI is going from you supply some text to an agent and get more text back...to we are going to have these agents running inside of a company doing all different kinds of work."

The entire session runs inside the customer's VPC. "The whole thing kind of stays within your VPC and so data is protected inside of the Bedrock environment," Garman said. In regulated industries - banking, insurance, government agencies - that VPC boundary is often the single deciding factor between pilot and production.
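AWS hasn't published the Managed Agents API shape. Assuming it surfaces through the existing Bedrock Agent Runtime `invoke_agent` call, which streams its reply as chunk events, client code might look like this sketch (agent and alias IDs are placeholders):

```python
def invoke_managed_agent(client, agent_id, alias_id, session_id, prompt):
    """Send a prompt to a Bedrock agent and join its streamed reply.

    Sketch only: assumes Managed Agents reuses the existing Agent
    Runtime invoke_agent interface, which may differ at GA.
    """
    response = client.invoke_agent(
        agentId=agent_id,
        agentAliasId=alias_id,
        sessionId=session_id,
        inputText=prompt,
    )
    # invoke_agent returns an event stream; text arrives as chunk events.
    parts = []
    for event in response["completion"]:
        chunk = event.get("chunk")
        if chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)

# Usage (IDs are placeholders):
# client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")
# print(invoke_managed_agent(client, "AGENT_ID", "ALIAS_ID", "session-1",
#                            "Triage the open incident queue."))
```

Because the client traffic rides the same SDK plumbing as other Bedrock calls, PrivateLink endpoints and CloudTrail logging would apply without extra work.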

Garman also framed the broader problem it's solving: "Customers were kind of forced to pull that together themselves...By building this thing together, we make it much easier for customers to much more rapidly get to value."

Deployment Requirements

| Product | What You Need | Access Path | Applies to AWS Commitment |
|---|---|---|---|
| GPT-5.4 / GPT-5.5 | Bedrock preview access | Converse API | No |
| Codex | AWS credentials, Bedrock access | CLI, desktop app, VS Code | Yes |
| Managed Agents | AgentCore-enabled account | Bedrock Agent Runtime API | Yes |

All three are limited preview. AWS has a sign-up form but hasn't published timeline commitments for general availability.

OpenAI's infrastructure products now route through AWS Bedrock's existing networking layer, including PrivateLink and VPC endpoints. Source: pexels.com

Where It Falls Short

No pricing was announced. For a product pitched squarely at enterprise budget cycles, that silence is loud. Teams can't build a business case around a limited preview with no cost model.

The VPC isolation that Garman positioned as a feature is also a structural constraint. Managed Agents running inside a customer's VPC can't easily call external third-party APIs without additional networking configuration - NAT gateways, security group rules, egress inspection. Agents doing research, sending notifications, or pulling data from SaaS tools need that outbound path. The more locked down the VPC, the more of that egress plumbing teams have to build before an agent can do anything outward-facing.
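As a concrete sketch of that plumbing (the CIDR and group ID below are illustrative), the narrowest opening an agent needs to reach one external HTTPS API is a single egress rule passed to EC2's authorize_security_group_egress:

```python
def https_egress_rule(cidr, description="agent outbound HTTPS"):
    """Build one security-group egress permission allowing outbound
    HTTPS (TCP 443) to a single CIDR block - the narrowest opening an
    agent needs to call one external SaaS API from a closed VPC."""
    return {
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": cidr, "Description": description}],
    }

# Usage (group ID and CIDR are placeholders):
# ec2 = boto3.client("ec2")
# ec2.authorize_security_group_egress(
#     GroupId="sg-0123456789abcdef0",
#     IpPermissions=[https_egress_rule("203.0.113.0/24")],
# )
```

Each additional external endpoint means another rule like this, plus a NAT gateway if the agent's subnets are private - which is why egress setup scales with how much the agents are expected to do.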

Altman acknowledged the deeper architectural gap himself. Agents running inside corporate environments will need identity - a persistent, auditable way for internal systems to recognize them, grant scoped credentials, and track what they did. "We don't even have a primitive to think about that," he said, "but we may quickly need to figure that out as we have agents join the workforce." No platform has solved this yet, Bedrock included.

The Codex-on-Bedrock offering is also shallower than it might look. Routing authentication through AWS doesn't change what Codex can do - it's the same agent with the same capabilities, just with a different credential flow. Teams that already have OpenAI API access get nothing new from running it through Bedrock except the spend commitment credit and the policy bypass.


The three products represent a credible start for a newly multi-cloud OpenAI. Garman's framing was direct: "how do we get OpenAI technologies in the hands of AWS customers - that's what they wanted, and that's what we wanted." What's still missing is the answer to what it costs, when it ships widely, and whether the agent identity problem gets solved before enterprises try to put these systems into production.


About the author
Sophie Zhang, AI Infrastructure & Open Source Reporter

Sophie is a journalist and former systems engineer who covers AI infrastructure, open-source models, and the developer tooling ecosystem.