Bloomberg Audio Studios. Podcasts. Radio. News.

Ed Ludlow
Welcome back to Bloomberg Tech. Lisa Su, Helios with the MI455X, AMD's first rack-scale system solution. But inside it, AMD's first, and the world's first, 2-nanometer chip of that type. A lot was made of it when you actually stood on stage and held it in your hand for the first time. Why is it significant?
Lisa Su
Well, first of all, Ed, it's great to be here with you at CES. I think this is always a great way to kick off the year because you get so much perspective. So it was fun giving the keynote last night. Look, Helios is a massive system. You can see it in the background here. And MI455 is just an incredibly powerful chip. And probably the context I would give, Ed, is, you know, one of the things that we're so clear about is that the demand for AI compute is just continuing to increase. We have seen that over the last five years. When you think about just how much new capability has come on board, we've now seen a real inflection in the number of people who are using AI. So today there are probably more than a billion active users using AI, and we expect that to scale to over 5 billion users over the next five years. For all of that, you need compute, and lots and lots of compute. And from that standpoint, MI455 is a significant leap forward in terms of technology capability. It's made up of 2- and 3-nanometer chips, 320 billion transistors. Just a lot of performance.
Ed Ludlow
Talk about the timeline for it to be deployed in the real world, then. And who will be the principal first user of it?
Lisa Su
You'll see it in 2H26 and it will ramp from there. And we have very strong partnerships. OpenAI's Greg Brockman was on stage with us last night talking about all of the use cases that they see. We've announced a partnership with Oracle, and many others as well.
Ed Ludlow
So given that, it's in full production now. It's getting ready to, we are...
Lisa Su
We are absolutely getting ready to ship it.
Ed Ludlow
That's at one end of the sort of scale and spectrum. At the other you have the MI440X, which is for small data centers. I'm trying to simplify it, but it's basically an enterprise product. What was it that you were trying to solve for with that?
Lisa Su
Yeah, I think what we're trying to solve for is, you know, the world is a very heterogeneous world. You have all kinds of use cases for AI, from the very biggest cloud data centers that are doing large-scale training and inference, to enterprise applications, as well as supercomputers. And so we actually have a family of chips. At the highest end is our MI455 for the cloud environment. But we announced last night the MI440, which actually uses the same basic building blocks but is really focused on enterprise applications, so that you can go into, let's call it, current data centers with the new technology. So we're excited about that as well. Enterprises are starting to increase their adoption of AI, and in some cases they want their own control of their data centers in terms of on-prem environments.
Ed Ludlow
What are they doing with it, though? I mean, you know, we've been so fixated on frontier models with hundreds of billions of parameters and the scale of infrastructure needed for that. With MI440 we're talking about something slightly different. I think it would be really interesting if you could explain what the demand is from those enterprises, what they want with the technology.
Lisa Su
Well, I think you see many enterprises now using AI all throughout their business processes, whether you're talking about things in their workflow. Even at AMD, we're using AI through every part of our development process. We're seeing a lot of applications in financial services and in health care. These are areas, especially in financial services, where people actually don't want everything necessarily in the cloud. They'd like to be able to have their own on-prem deployments or private cloud deployments. And in this case you don't want to have to build a brand new data center for a new generation of chip. MI440 allows us to use some of those existing data centers and upgrade them with the new capabilities.
Ed Ludlow
If you're watching us on Bloomberg Television or you're listening on Bloomberg Radio, we're live in Las Vegas and we're with AMD CEO Lisa Su, and we're talking about the latest generation of accelerators. What makes this generation of AMD accelerators the better option, particularly for on-prem and at the edge, over what Nvidia is offering right now?
Lisa Su
Well, the best way to think about it, Ed, is we're in this place where AI is at an inflection point. We're seeing AI now in every part of compute. We see it in the largest models, when you're thinking about things like ChatGPT and Gemini and Grok. We're also seeing many use cases in new capabilities like video production, entertainment, employment, health care, where you're doing drug discovery, all of these various things. Our claim to fame is really outstanding performance at a very advantaged total cost of ownership. And the other thing that we believe very strongly in is an open ecosystem and deep partnerships, with our overall ecosystem coming together. So when you put those things in perspective, I think we have a great set of applications that will take advantage of these newest-generation chips.
Ed Ludlow
You mentioned that Greg Brockman, who's the OpenAI president, was on stage with you last night. And one of the basic points that he made was there are tools and functions they would love to release and put out into the world, but they're compute-constrained. I often ask you to quantify demand, but is there a way to quantify the severity of the lack of compute, you know, the deficit that's out there right now?
Lisa Su
Well, let me just give you some numbers to kind of ground what we think the demand environment is looking like. Today we have about a billion active users, and we're ramping that to 5 billion over the next five years. And we have about, let's call it, 100 zettaflops of compute all around the world. I mean, that's just a generic number that aggregates all of it. We think we have to increase compute by another 100 times as you go over the next four or five years. And I introduced a term last night, the yottaflop. People are like, what is a yottaflop? A yottaflop is actually 10 to the 24th in terms of flops. So that's a one followed by 24 zeros. And to give you a view of just how much things have really increased, that's another 100 times more compute than we have today. Now, what are you going to use all that compute for? The truth is, the models that we have today are great. They do amazing things. We talked about a number of use cases. Perhaps one that hits very close to home is writing software. People are using AI tools right now to significantly enhance the productivity of software developers. It's good, but it can get so much better. And I think that's the key point. We like to say that AI is really going to be everywhere and it's really for everyone, for each one of us to make our businesses more productive, each one of us more productive going forward. And so we're still in the very early innings of really unlocking the power of AI.
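As a reader's aside, the yottaflop definition above is easy to sanity-check in a couple of lines (a quick sketch for illustration, not part of the interview):

```python
# A yottaflop is 10^24 FLOPS: a one followed by 24 zeros,
# matching the definition given in the interview.
yottaflop = 10 ** 24
assert len(str(yottaflop)) == 25   # the leading "1" plus 24 zeros
print(f"{yottaflop:.1e}")          # prints 1.0e+24
```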
Ed Ludlow
So where we stand is, okay, there's a compute deficit, and software has kind of hit the limits of what current-generation compute can offer. Help us understand the bottlenecks and barriers to deploying that compute. A lot at the moment is about memory chips. What else? Energy? Electricity? What's crossing your desk, Lisa, that gives you pause and makes you say, this is a problem right now?
Lisa Su
Well, our job as a technology industry is to push the bleeding edge. I mean, that is our job. And so when we think about the MI455, deploying 2-nanometer and 3-nanometer chips, having the latest-generation high-bandwidth memory that is out there, and really deploying these big systems, the important thing is that the entire ecosystem come together and that we plan together for this next big inflection in compute. And that's exactly what we're doing right now. We're working very closely with the entire supply chain to ensure that we have the resources for this compute environment. And yes, some of the things that you mentioned are, let's call it, constrained. But I don't think it's any one thing. I think we're all looking at how do we build faster. Certainly power is one of those areas; throughout the world, power is being brought online as fast as possible. From a silicon standpoint, we're ramping our production capabilities with our partners. From a memory standpoint, our partners are ramping as well. So it's not any one thing. All of these things have to go in tandem, and that's why partnership is just so important in this business.
Ed Ludlow
We started this conversation talking about Helios, the first rack-scale architecture and infrastructure from AMD. Could you talk about the future and how much of the content you want to own in a server? You know, we started this story with the GPU. Frankly, if you look at what Nvidia is doing, they increasingly want to own all of what's inside the box. Is that something that AMD is focused on too?
Lisa Su
You know, what's most important for us is to ensure that we have turnkey solutions that are very, very easy for our customers to deploy. Because when you think about, you know, how do you use all of this AI compute most effectively? You want it to go into the data center and really be up and running on day one. And for that you have to optimize a full system. But from that standpoint, you know, we are very focused on an open ecosystem. So yes, we design the CPUs and the GPUs and some of the networking elements, but we also work, you know, really with a broad ecosystem of partners with industry standards. It's all about ensuring that we get the best of all worlds when we put our solutions together.
Ed Ludlow
Looking ahead to MI500 in 2027: that has 1,000 times the performance of the MI300 generation. So versus your last generation of real-world deployed gear, something's coming that's a thousand times better. How did you make it a thousand times better?
Lisa Su
It is just incredible engineering at every level. So MI455 is 10 times better than the chip that we just launched six months ago, the MI355. And MI500 is another 10X on top of that. We are using the most advanced technology out there. We have a very clear focus on hardware, software, and system co-design, and it is clearly pushing the bleeding edge of capabilities.
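For readers keeping score, these generational claims compose multiplicatively. A quick back-of-the-envelope sketch (my arithmetic; the MI300-to-MI355 step is an assumed roughly 10x, which is what would make the 1,000x figure Ed cites work out):

```python
# Stated in the interview: MI455 ~ 10x MI355, MI500 ~ another 10x on top.
# The MI300 -> MI355 factor below is an assumption, not stated on air.
gains = {
    "MI300 -> MI355": 10,  # assumed
    "MI355 -> MI455": 10,  # stated
    "MI455 -> MI500": 10,  # stated
}
total = 1
for factor in gains.values():
    total *= factor
print(total)  # prints 1000
```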
Ed Ludlow
What is the status of AMD's ability to sell products into China right now?
Lisa Su
So, you know, China is an important market for us. We actually sell a broad range of chips into China, including our PC chips as well as other embedded chips.
Ed Ludlow
In the data center context, of course.
Lisa Su
Sorry, in the data center context, we certainly see China as an important market. We did get some licenses from the U.S. government late last year as it relates to some of our previous-generation chips, the MI308. And we are in the process of applying for new licenses for our MI325 chips, which we were recently allowed to apply for. We haven't gotten those licenses yet, but we continue to view China as an important market for us.
Ed Ludlow
The reason I ask about it is in part because a lot of the work on open-source models, and on bridging the gap between open and closed, is happening in China to some extent. There's been a lot of discussion about the demand being there in China. But could you reflect a little bit on that demand, and also on what the Chinese government's attitude is to you taking a later generation of tech to the country?
Lisa Su
Well, I do think the demand for AI in general, and in China, is high for all the reasons that we talked about. I think we are in a demand environment where more compute is beneficial across the world. We think China is an important market for us, and it's very active in having our solutions deployed. So we continue to view it as something that's important. We're working with the U.S. government as well as our Chinese customers to find good solutions there.
Ed Ludlow
And there are signs from both governments that the license process is moving. Commerce is kind of notorious for things sitting on a desk for quite a long time.
Lisa Su
I think we are optimistic that we'll have an opportunity to get some of those licenses granted.
Ed Ludlow
You're watching Bloomberg Television. You're listening to Bloomberg Radio. This is Bloomberg Tech, and we're live in Las Vegas with AMD CEO Lisa Su. Last question, really, in the data center context: the markets and investors want data and signs that you're taking market share. What would the metrics be that you'd point to, either that already exist or over the coming 12 months, that would evidence that?
Lisa Su
Well, I think MI455 is a clear inflection point in both our technology capability and the deep partnerships that we have across the industry. So we're excited about what we see in front of us. And we've talked about tens of billions of dollars in AI revenue as we get into 2027. I think these are important metrics for us as a company when we think about the AI potential.
Ed Ludlow
For all the focus on data centers, some forget that AMD is a leader in PCs in many respects. The forecasters have very different opinions of what will happen in the market this year: some see it shrinking, some see modest growth driven literally just by AI PCs. You've been able to take market share and grow irrespective of what the broader conditions are, but they haven't been great. How have you done that, and do you expect that to continue to be the case?
Lisa Su
Well, the PC market is a very good market for us. We grew a ton in the PC market in 2025, and that really came from the strength of our product portfolio. We bet early on AI PCs, so it was a clear area where we believed the technology would generate demand. We also went through a refresh cycle with Windows 11. And as we go into 2026, I think we'll want to see how a few quarters play out. The general demand for computing is certainly there. There are some supply chain constraints that we're working through and want to watch going forward. But our case is one where we are still underrepresented in parts of the market. We are very strong in gaming, we're very strong in consumer; I think we're underrepresented in enterprise laptops. And we view this as a growth area for us.
Ed Ludlow
Is AI PC a change?
Lisa Su
AI PCs absolutely help in terms of just the upgrade cycle coming in. We're excited about some of our work with development systems as well. We announced a new AI development system last night that we think will also be very attractive.
Ed Ludlow
Those constraints you were talking about in the PC context, are they specifically DRAM, or is it broader than that?
Lisa Su
It's more around the memory side. When you think about memory overall, we have so much demand coming from, let's call it, AI data center compute that we want to see how it impacts the rest of the memory market out there.
Ed Ludlow
One of the other areas that you discussed with Greg Brockman of OpenAI on stage was the net, or broad, economic impact of AI, not just on the companies; I think you were talking more about the global economy. Again, it's very difficult, Lisa, but how does one measure progress in whether AI has or has not had a direct positive economic impact around the world in any given year?
Lisa Su
You know, it's true, it's hard to deconvolve all of the things that are happening. But from a sense of what we see in the business, many people want to see a direct return on investment for a particular set of investments. What I would say is that we know that AI is making a difference in the productivity of companies. I can see that within AMD: as we deploy AI, we're able to get products to market faster, and we're able to significantly improve some of our business processes. As we go forward over the next several years, I think you're going to see that much more broadly in enterprises. Every CEO that I talk to is talking about AI. It is front and center in terms of how to build a better company, how to build a better portfolio. And so I think what Greg was talking about is, when you aggregate all of that, AI has to impact the world at a GDP level, and we'll see that over the next few years.
Ed Ludlow
You're watching Bloomberg Television. You're listening to Bloomberg Radio. This is Bloomberg Tech, and we're live in Las Vegas. We're speaking to AMD CEO Lisa Su. You are an investor in Generative Bionics, and also a technology partner, and they have unveiled a humanoid robot here at CES in Las Vegas. In fact, if the magic of television can happen, we'll cut to the wide shot in the background.
Lisa Su
Right.
Ed Ludlow
You know, this is the first tangible sign I feel we've seen from AMD on how you intend to play in physical AI.
Lisa Su
Yes.
Ed Ludlow
Explain your strategy. It is the next big market, right?
Lisa Su
Yes. And I wouldn't say it's the first time, but it's probably one of the areas we don't highlight as much, because there's so much focus on data center and cloud and the opportunities there are very much in front of us. But when we look at physical AI, starting from all of the work we've done in FPGAs and embedded real-time capability, we have been in this space for some time. We already power a lot of robotic applications out there. But I think as we go into the humanoid capability, and we're excited about our partnership with Bionics and the work on Gen 1, that takes us to another level in terms of capability and intelligence and what we're trying to do.
Ed Ludlow
So is the business model to be the brain inside of the humanoid robot, on the inference side, with the underlying software being trained on AMD accelerators? What's the go-to-market, I guess, is what I'm asking.
Lisa Su
You should expect that our partnerships extend through all of those levels. So we have the components that can power the humanoid robots, the sort of real-time local capability, which is very, very important. And then we also have the technology behind that in terms of how to train and run inference on these humanoids.
Ed Ludlow
When last we met in person, it was in Washington, D.C., and the President had just outlined a broad strategy for American AI, and it really centered around infrastructure deregulation, allowing those building the infrastructure to move faster. That was in the second half of last year. In the months that have followed, have you seen any signs that it worked, anything that you could point to that says, yeah, people are able to build faster, maybe to address some of the compute deficits we discussed?
Lisa Su
Well, I can say for sure, the President's AI action plan, when we met, I think this was back in July when it came out, I was very optimistic about having a really forward-leaning strategy on the whole view of what it takes for the US to lead in AI. And I think we've made a ton of progress along the way. I had Michael Kratsios join us last night on stage as well to talk about the Genesis Mission, which is another public-private partnership approach to really advance science in the United States. And when you look at all of these things: building faster, ensuring that we have the right export controls so that we are able to have the US stack adopted across...
Ed Ludlow
The right export controls.
Lisa Su
Currently we are certainly working very closely with the various parties in the US government to ensure that we have the right balance there. And we also have this notion of how do we invest more here and ensure that in the United States we are running as fast as possible to bring capacity online, to help us in science and the broader economic benefits.
Ed Ludlow
Lisa, what happens in 2026? What happens in the world of AI, and what do you think will define this year in terms of the progress that your industry hopes to make?
Lisa Su
Well, I started our keynote last night with the sense that, you know, you ain't seen nothing yet. That's really how I feel. I mean, we're sitting here in January and it's just amazing how much progress is made every week and every month. When we see how these models are developing, when we see how the use cases are developing, and then when we see the tangible results on businesses and outcomes, I believe we saw a good amount of that come to fruition in 2025, and we're going to see much more of it in 2026. So everyone should understand that AI is not just hype out there. It's not just things that people are talking about in the investment community. It's things that people are using every day, real time, and feeling like, hey, my life is better because I have this technology. And I think we're going to see that in 2026.
Ed Ludlow
Lisa Su, AMD CEO. AMD, with its, and the world's, first 2-nanometer chip going into Helios, its first rack-scale system solution.
Date: January 6, 2026
Host: Ed Ludlow (Bloomberg)
Guest: Dr. Lisa Su, CEO, AMD
Location: CES, Las Vegas
In this episode taped live at CES 2026, Bloomberg’s Ed Ludlow talks with AMD CEO Dr. Lisa Su about the company’s groundbreaking new data center chip—MI455X—which is the world’s first 2-nanometer chip for rack-scale systems, the accelerating global demand for AI compute, AMD’s enterprise and edge strategies, expanding markets like China, and the role of AI across industries and the broader economy.
“The demand for AI compute is just continuing to increase... MI455 is a significant leap forward in terms of technology capability.” — Dr. Lisa Su (01:01)
“Enterprises are starting to increase their adoption of AI... In some cases they want their own control of their data centers.” — Dr. Lisa Su (03:40)
“Our claim to fame is really outstanding performance at very advantaged total cost of ownership… and an open ecosystem.” — Dr. Lisa Su (05:42)
“The models we have today are great... but it can get so much better... we’re still in the early innings of unlocking the power of AI.” — Dr. Lisa Su (07:45)
“It's not any one thing… we’re all looking at how do we build faster?... that's why partnership is just so important.” — Dr. Lisa Su (09:49)
“We are very focused on an open ecosystem... it’s all about ensuring that we get the best of all worlds.” — Dr. Lisa Su (11:10)
“It is just incredible engineering at every level.” — Dr. Lisa Su (12:16)
“…every CEO that I talk to is talking about AI. It is front and center in terms of how to build a better company.” — Dr. Lisa Su (18:48)
“You ain’t seen nothing yet... AI is not just hype out there... it's things that people are using every day, real time and feeling like, hey, my life is better because I have this technology.” — Dr. Lisa Su (23:23)
Predicting the “Yottaflop” Revolution:
“A yottaflop is 10 to the 24th in terms of flops. To give you a view... that's another 100 times more compute than we have today.” — Dr. Lisa Su (07:12)
AI’s Perceived Limits and Potential:
“The models... today are great...but it can get so much better... we’re still in the early innings.” — Dr. Lisa Su (07:45)
Market Share & AI Revenue Outlook:
“We’ve talked about tens of billions of dollars in AI revenue as we get into 2027.” — Dr. Lisa Su (15:27)
On Accelerated Engineering:
“MI455 is 10 times better than the chip that we just launched six months ago... MI500 is another 10X on top of that.” — Dr. Lisa Su (12:15)
This discussion is energetic, technically rich, and optimistic throughout, with Su expressing excitement for both the state of AI adoption in 2026 and the explosive hardware advances driven by AMD. The conversation balances technical details with big-picture analysis of AI’s economic and societal impact, always with a candid and forward-looking slant.