AMD can see big growth with AI in 2024

The earnings whirlwind for technology and silicon companies continues this week with AMD reporting its Q4 and full-year 2023 results. I know that many other stories and analysts will dive into the dollars more deeply than I will, but it’s worth recounting a couple of the key points that I think are relevant to how we look at AMD’s ability to sustain its momentum into 2024.

From a Q4’23 perspective, looking at year-on-year comparisons to a 2022 that was by all accounts pretty bad for AMD (and the chip market in general), revenue was up 10%, gross profit was up 10%, margin was essentially flat, and operating income jumped 12% to over $1.4B. Breaking it down by key business units, the data center group, which comprises the EPYC CPUs and the Instinct GPUs and AI accelerators, saw a 38% increase in revenue and a 50% increase in operating income. The client segment, which accounts for the company’s CPU chips for consumer desktops and laptops, saw revenue increase 62% and operating income rise 136%.

For the first time in any substantial way, the data center business unit is now the largest component of AMD’s revenue.

The full-year 2023 results, compared to 2022, are a bit misleading because of just how bad the first half of the year was.

The best news for investors during the earnings call came from CEO Lisa Su, who announced that AMD was raising its 2024 revenue projection for the MI300 family of data center AI accelerators from $2B to $3.5B, a massive 75% jump. This is based on better-than-expected progress in product validation with customers and the increased demand that has followed. I wouldn’t be surprised to see that $3.5B figure prove conservative; Su mentioned in the Q&A that AMD has built up its supply chain to ship “substantially more” than the $3.5B mark if the demand is there.

AMD continues to tout a projected market size of $400B for AI accelerators by 2027, and though details on this projection are still a bit light, if the company can capture just 5% of that market by 2027 we are looking at a revenue target of $20B, which implies a steep ramp up from 2024.
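As a quick back-of-the-envelope check on the numbers above, the sketch below recomputes the 75% guidance raise and the 5%-of-$400B scenario; the extra share scenarios are purely illustrative assumptions on my part, not AMD guidance.

```python
# Back-of-the-envelope check on the MI300 / AI accelerator figures cited above.
# Inputs are the numbers quoted in this article; the share scenarios beyond 5%
# are illustrative assumptions, not AMD guidance.

prior_2024_guide = 2.0   # $B, original MI300 revenue projection for 2024
raised_2024_guide = 3.5  # $B, raised projection from the Q4'23 call

raise_pct = (raised_2024_guide / prior_2024_guide - 1) * 100
print(f"Projection raise: {raise_pct:.0f}%")  # -> 75%

tam_2027 = 400.0  # $B, AMD's projected AI accelerator market size for 2027
for share in (0.05, 0.10, 0.15):  # hypothetical capture scenarios
    print(f"{share:.0%} of ${tam_2027:.0f}B TAM = ${tam_2027 * share:.0f}B revenue")
# 5% -> $20B, the figure discussed above; anything beyond that implies an even
# steeper ramp from the $3.5B starting point in 2024.
```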

Investors are likely questioning whether the company can sustain this kind of growth and momentum, and whether it has the expertise to compete with the likes of NVIDIA while holding off the rise of chip startups and even in-house silicon expansion at cloud service companies. Su has built an execution machine at AMD, giving customers like Microsoft and Lenovo the confidence to commit their product lines and futures to AMD’s roadmap, something that would have been unthinkable 10 years ago. When the CEO mentioned casually on the earnings call that AMD was speeding up its AI chip release schedule, similar to the faster product cadence NVIDIA announced last year, few should doubt that she can make it happen.

On the client side of things, the future is a bit murkier. The company recently announced its Ryzen 8000-series chips for laptops, which include a new, faster AI accelerator, making them one of several new CPU families coming to market for the AI PC. Later this year AMD will have its “Strix” family of chips, which promises to improve AI performance by a factor of three while also introducing a new CPU architecture dubbed “Zen 5.”

So, while the product family that AMD has for 2024 in the consumer space looks to be high performance and offer compelling AI and graphics features, market growth in the PC space is projected to be much lower than in the data center. Even though most analysts see a “supercycle” coming in the PC space, thanks to compelling AI use cases driving consumers to buy new systems, AMD is hedging a bit here.

Risks for AMD are bigger in the client space than in the data center segment, in my view. For enterprise and cloud service providers integrating AI accelerators and new chips, the ability to build and customize software for AMD products is a small portion of the overall cost of doing business. If AMD only has to worry about 10-20 key applications for its MI300 family of AI GPUs, it can focus its engineering efforts. But in the consumer space, where AI applications and workloads will number in the dozens or hundreds, Intel has a massive software development team that it can put to work, something AMD does not. As a result, AI software may be compatible with Intel AI chips sooner than with AMD’s.

Another part of the risk is that Intel’s latest chips, like its Core Ultra family, are really good, and in a market where OEMs and consumers aren’t starving for new processors (like we see in the data center today), AMD will have to compete more directly on its value proposition. Those Core Ultra chips tend to target higher-priced systems, though, so in much of the market AMD will only have to be better than Intel’s last-generation products to win.

I still expect to see unit and revenue share increases for AMD in the client space through 2024. The company continued to grow its footprint relative to Intel in the laptop space through 2022 and 2023, and like the data center business, AMD benefits from its record of consistent execution. Having edged up from a 15% market share in client chips in 2022 to almost 20% as of late 2023, it seems inevitable that AMD will be able to continue growing to upwards of 30% in short order.