OpenAI’s DevDay 2025 in San Francisco dropped a wave of new tools, hardware hints, and a startling design vision.
In a closing dialogue, Jony Ive called the pace of change in AI “extraordinary,” underscoring the challenge of designing amid rapid upheaval.
Over a day of keynotes, live demos, and a closing fireside chat, OpenAI revealed App SDK, AgentKit, ChatKit, stronger Codex capabilities, and a multi-gigawatt chip pact with AMD — all while laying bare its ambitions to fuse design, infrastructure, and AI into a singular platform.
Key Takeaways
- OpenAI’s new toolkits aim to turn ChatGPT into a platform, not just a model
- App SDK lets developers embed full apps inside ChatGPT interfaces
- AgentKit and ChatKit push agent-based logic and chat embedding forward
- AMD deal commits 6 gigawatts of GPU capacity to OpenAI’s infrastructure
- Jony Ive signals hardware ambition but cautions about AI’s breakneck pace
During DevDay 2025, OpenAI announced App SDK, AgentKit, enhanced Codex, and a major AMD chip deal. In a closing discussion with Sam Altman, Jony Ive said the pace of change in AI is “extraordinary,” highlighting the design challenges of working in such a fast-moving environment.
DevDay Returns Bigger Than Ever
OpenAI’s DevDay returned on October 6, 2025, at Fort Mason in San Francisco, drawing more than 1,500 developers for its largest edition yet.
The company views DevDay as a core moment to reveal what’s next and to orient developers toward its evolving platform strategy.
In prior years, OpenAI used DevDay for incremental updates; this time, the stakes felt larger — a signal that OpenAI is repositioning itself not merely as a model provider but as an AI platform company.
Major Announcements: Tools, Interfaces, and Agents
App SDK: Embedding Apps Within Chat
One of the day’s most talked-about launches was App SDK. With it, OpenAI gives developers the ability to embed external apps inside ChatGPT’s interface. Users will be able to call upon services (e.g., Spotify, Canva) directly through the chat experience, with follow-up edits and interactivity built in.
In a demo, OpenAI showed how a user could ask ChatGPT to generate poster mockups for a dog-walking business using Canva, then request edits — all inside the chat interface.
The broader aim: make ChatGPT a front door for apps rather than just a chat layer.
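The pattern behind this — an external app exposing named, structured actions that a chat layer can invoke and then re-invoke with edits — can be sketched in a few lines. To be clear, this is a hypothetical toy for illustration only: the `ActionRegistry` class, the `generate_poster` action, and the JSON request shape are all invented here and are not App SDK's real interface.

```python
# Hypothetical sketch of the "apps callable from chat" pattern.
# None of these names come from OpenAI's actual App SDK.
import json
from typing import Any, Callable, Dict


class ActionRegistry:
    """Lets an app register actions a chat layer could invoke by name."""

    def __init__(self) -> None:
        self.actions: Dict[str, Callable[..., Any]] = {}

    def action(self, name: str):
        # Decorator that registers a function under a stable action name.
        def register(fn: Callable[..., Any]) -> Callable[..., Any]:
            self.actions[name] = fn
            return fn
        return register

    def invoke(self, request_json: str) -> Any:
        # The chat layer sends a structured request naming the action
        # and supplying its arguments.
        request = json.loads(request_json)
        return self.actions[request["action"]](**request["args"])


# An invented Canva-like app exposing one action.
design_app = ActionRegistry()


@design_app.action("generate_poster")
def generate_poster(business: str, style: str) -> dict:
    return {"asset": f"{style} poster for {business}", "editable": True}


result = design_app.invoke(
    '{"action": "generate_poster",'
    ' "args": {"business": "dog walking", "style": "playful"}}'
)
print(result["asset"])  # prints: playful poster for dog walking
```

Because the result is marked `editable`, a follow-up chat turn could invoke the same action again with revised arguments — the interaction loop the Canva demo showed.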
AgentKit: Prototyping to Production
Next up is AgentKit, which helps developers build agentic workflows — essentially programmatic agents that chain tools, API calls, and decision logic into autonomous tasks. In the keynote, a demo built an AI agent from scratch in about eight minutes.
AgentKit aims to bridge the gap between prototype and production: visually designing agents, wiring in tools and widgets, previewing, testing, and deploying — all in one flow.
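The core idea of an agentic workflow — a plan executed as a chain of tools, each feeding its output to the next — can be shown with a minimal sketch. This is an assumption-laden toy, not AgentKit's actual API: the `Agent` class, `tool` decorator, and `run` method are invented names for illustration.

```python
# Hypothetical sketch of tool-chaining in an agentic workflow.
# These names are invented; AgentKit's real interface is not shown here.
from typing import Callable, Dict, List


class Agent:
    def __init__(self) -> None:
        self.tools: Dict[str, Callable[[str], str]] = {}

    def tool(self, name: str):
        # Decorator that registers a callable as a named tool.
        def register(fn: Callable[[str], str]) -> Callable[[str], str]:
            self.tools[name] = fn
            return fn
        return register

    def run(self, plan: List[str], payload: str) -> str:
        # Execute the planned steps in order, piping each tool's
        # output into the next tool.
        for step in plan:
            payload = self.tools[step](payload)
        return payload


agent = Agent()


@agent.tool("fetch")
def fetch(query: str) -> str:
    return f"results for '{query}'"


@agent.tool("summarize")
def summarize(text: str) -> str:
    return f"summary of {text}"


print(agent.run(["fetch", "summarize"], "dog-walking posters"))
# prints: summary of results for 'dog-walking posters'
```

In a real system the plan would come from a model's decisions rather than a hard-coded list; the point is the chaining structure that AgentKit's visual builder wires up for you.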
ChatKit: Embedding Chat Everywhere
With ChatKit, developers can embed ChatGPT-powered chat windows into their apps or web products, maintaining brand identity and custom workflows. OpenAI pitched this as the “chat as interface” model.
This move suggests OpenAI sees conversational interfaces as a key layer in product UX going forward.
Codex & Extended Developer Tools
OpenAI also rolled out improved Codex capabilities — making it generally available with extra integrations (e.g., Slack, enterprise tooling).
They also introduced or refreshed supplemental tools: Sora 2, GPT-5 Pro, and a compact voice model called gpt-realtime-mini.
These additions underscore the push to make not only chat but multimodal input and synthesis more seamless.

Infrastructure & Scale: AMD, Compute Constraints, and Strategy
The AMD Deal: 6 GW of GPU Commitment
A central infrastructure announcement came via a partnership with AMD: OpenAI will deploy up to 6 gigawatts of AMD’s Instinct GPUs over multiple years.
The deal reportedly includes warrants giving OpenAI the option to take up to 10% of AMD shares, linked to deployment milestones.
This is part of a broader compute deal blitz: OpenAI already has relationships with Nvidia, Oracle, and others, but AMD marks one of the deepest commitments yet.
Compute as Bottleneck
Repeatedly through DevDay, OpenAI’s leadership flagged compute scarcity as a central constraint. Greg Brockman said, “there just simply is not enough,” implying that demand for AI exceeds current infrastructure supply.
Altman also mentioned that markets had reacted “strangely” to DevDay announcements, as stocks of companies like HubSpot and Coursera surged after mentions — evidence of how sensitive the ecosystem is to OpenAI’s movements.
In remarks to reporters, OpenAI reiterated that for now, growth and investment outweigh short-term profitability. Altman said profit is a future concern, not in his “top 10” immediate priorities.
The Altman–Ive Fireside Chat: Design, Disruption & Focus
Ive on the Extraordinary Pace
In the event’s emotional apex, Sam Altman and Jony Ive took the stage (not livestreamed) for a discussion about design and AI’s future. During this, Ive was asked about unexpected challenges; he laughed and said the pace of change is “extraordinary.” “From one week to the next, there’s something else and then something else,” he added.
That statement underscores a real design tension: how do you focus when your field itself is shifting underneath you?
Reimagining Relationship with Technology
Ive has long critiqued how we relate to our devices today. On stage, he emphasized wanting to build tools that make us happy, fulfilled, peaceful, less anxious, less disconnected.
He criticized existing smartphones and tablets (including ones he helped design) as anxiety-inducing and called for a rethinking of how humans and technology interact. He said his team is juggling 15 to 20 hardware concepts in collaboration with OpenAI, though none has been finalized or revealed.
In an earlier OpenAI post, the io team (Ive’s device startup) formally merged with OpenAI, giving him and LoveFrom deep design responsibilities across the company.
He framed this moment as the convergence of his decades in design with what he sees as the generational moment of AI. “Everything I have learned over the last 30 years has led me to this moment,” he said in that post.
The Challenge of Hardware
But hardware is notoriously difficult. Though Ive and OpenAI alluded to ambitious vision, they revealed few technical specifics. Press reports note possible delays, thermal/compute constraints, interface trade-offs, and the challenge of defining a new device category that sits between phone, wearable, and ambient AI.
Ive’s remarks suggest humility: he affirmed the difficulty of designing new form factors, and the need to rethink interface norms.
He also rejected an overly serious, exclusive design ethos: “we want interfaces that make people smile, not be just another deeply serious sort of exclusive thing.”
Context & Backdrop: Why This DevDay Feels Different
OpenAI’s Strategy Shift
The 2025 DevDay feels like the moment OpenAI is articulating a shift: not just releasing models, but building an entire AI platform stack — infrastructure, front-end interfaces, embedded agents, and even hardware design.
By offering App SDK, AgentKit, ChatKit, upgraded models, and committing to new compute, OpenAI is stacking layers in its own ecosystem, rather than merely enabling third parties to use its models.
The Design Imperative
That OpenAI brought in Jony Ive to lead design vision internally speaks to their recognition that user experience and interface design will be differentiators in AI’s next era. It’s not enough to have the most capable model — you need to shape how people meaningfully interact with it.
Ive’s presence and his caution about pace amplify the message: design must keep up with capability, or else AI tools risk becoming unusable or alienating.
Compute Arms Race & Financial Risks
OpenAI’s compute commitments are gargantuan in scale. The AMD deal, plus commitments to Nvidia and others, suggest multi-billion dollar bets.
The question: can OpenAI translate that infrastructure into sustainable business models before capital dries up? The risk is that compute costs outrun monetization, especially if revenue levers (e.g., app monetization in ChatGPT) stagnate.
Industry Reactions
The broader tech sphere is watching closely. Embedding apps in ChatGPT threatens the traditional app economy. OpenAI’s push into hardware is seen as encroachment into territory dominated by Apple, Google, and Meta.
Analysts are alert to how partners (Spotify, Canva, Figma) respond — early stock bumps reflect investor sensitivity to OpenAI’s ecosystem strategy.
There’s also potential regulatory scrutiny ahead — control over compute, data flows, confluences of design and algorithm, and platform dominance all raise antitrust and privacy flags.
What Comes Next: Watch These Trajectories
- App ecosystem inside ChatGPT
How many high-quality apps get launched via App SDK? Will OpenAI adopt an App Store–style revenue share, or allow open monetization?
- AgentKit adoption & standards
AgentKit’s success hinges on portability and interoperability. The freshly published Open Agent Specification (Agent Spec) (Oct 2025) seeks to standardize agent workflows across frameworks.
- Hardware prototypes & leaks
Will we see tangible units in 2026? The signals will come via filings, partner leaks, or reveals at subsequent events.
- Compute scaling pressure
The first tranche of AMD GPU deployments (alongside those from Nvidia and others) will test whether OpenAI’s infrastructure can keep up with runaway demand.
- Regulatory and competitive pushback
As OpenAI tightens its grip across stack layers, governments, rivals, and regulatory bodies will probe fairness, openness, and dominance.
Conclusion
OpenAI DevDay 2025 crystallized a larger vision: to turn ChatGPT from a model into a platform, to blend design and infrastructure, and to push into hardware in pursuit of a new human-AI paradigm. The new SDKs, compute pledges, and design dialogue make clear they believe the future is conversational, ambient, and design-driven. But, as Jony Ive warned, keeping pace in an era where change happens weekly won’t be easy — and the margin for misstep is vanishingly slim.