AMD came to CES 2026 with a message it rarely delivers this confidently: it’s no longer content playing catch-up in AI hardware.
On stage in Las Vegas, Lisa Su unveiled a new lineup of data-center AI chips, teased a far more powerful next-generation platform, and brought OpenAI leadership onstage to underline just how serious this moment is for the company.
A sharper push into the AI data center
The headline hardware was AMD’s MI455 AI processor, designed for large-scale data centers—the kind that power models like ChatGPT. These chips are already slated for deployments with major customers, including OpenAI, as part of a deal AMD signed late last year.
Alongside it, AMD introduced the MI440X, an enterprise-focused AI chip built for companies that want AI acceleration without rebuilding their entire infrastructure. It’s designed to drop into traditional on-premises server environments, a quieter but potentially important play for businesses not ready for hyperscale AI clusters.
OpenAI’s quiet endorsement
One of the most telling moments came when Greg Brockman joined Su on stage. Brockman stressed that steady advances in chip performance are critical to OpenAI’s long-term ambitions—a public signal that AMD’s silicon is becoming a real part of OpenAI’s computing future.
That matters because Nvidia still dominates AI hardware, selling every accelerator it can manufacture. AMD may be behind, but OpenAI’s backing gives it credibility that’s hard to buy.
Looking ahead: the MI500 preview
AMD also teased its longer-term bet: the MI500 series, expected to arrive in 2027. Su claimed the platform could deliver performance gains on the order of 1,000 times over earlier generations.
It’s an eye-catching number—and one that comes with a long timeline—but it frames how AMD sees the next phase of AI adoption. Su argued that billions of people could be using AI tools daily within a few years, forcing the industry to massively expand global computing capacity.
Helios vs. Nvidia’s rack-scale systems
AMD didn’t stop at individual chips. Su showed off Helios, a full rack-scale AI system built around dozens of AMD GPUs. The company positioned it as a direct answer to Nvidia’s NVL systems, which currently define the standard for AI supercomputing racks.
Matching Nvidia feature-for-feature won’t be easy. But Helios signals that AMD wants to compete not just on components, but on complete AI platforms.
AI everywhere: PCs and robots included
CES wasn’t just about data centers. AMD also launched its Ryzen AI 400 Series processors for AI PCs, promising stronger on-device AI performance, better efficiency, and improved graphics. New Ryzen AI Max+ chips target premium laptops and compact workstations.
And in a nod to the broader AI ecosystem, AMD brought out Generative Bionics’ humanoid robot, GENE.01, powered by AMD CPUs and GPUs. Generative Bionics says the first commercial units could arrive in 2026.
The bigger picture
AMD still trails Nvidia by a wide margin in AI revenue. But CES 2026 showed something new: a company willing to challenge Nvidia across chips, systems, and long-term roadmaps—with OpenAI standing close by.