1 Super Semiconductor Stock (Besides Nvidia or AMD) to Buy Hand Over Fist for the Artificial Intelligence (AI) Revolution


The artificial intelligence (AI) revolution wouldn’t be possible without the semiconductor industry. Most AI development happens inside data centers filled with graphics processing units (GPUs) from leading suppliers like Nvidia (NASDAQ: NVDA) and Advanced Micro Devices (NASDAQ: AMD).

However, AI workloads are also demanding more memory and storage capacity, not just in data centers but also in personal computers (PCs) and smartphones, and Micron Technology (NASDAQ: MU) is a top supplier of that hardware. The company just released its results for its fiscal 2025 second quarter (ended Feb. 27), which revealed soaring revenue led by AI-related demand.

Here’s why Micron stock could be a great buy for investors seeking exposure to the AI boom.


AI requires an increasing amount of memory

Memory is a critical component in data-intensive AI training and AI inference workloads. It stores information in a ready state so it can be called upon by GPUs instantly, which speeds up processing time. Micron’s HBM3E (high-bandwidth memory) for the data center is the best in the industry, delivering 50% more capacity than competing solutions while consuming 30% less energy.

Nvidia is using Micron’s HBM3E in its industry-leading Blackwell GB200 GPU, which is currently the gold standard for AI development. Nvidia will also use it in the upcoming Blackwell Ultra GB300 GPU, which will deliver even more processing power for AI applications. Since Nvidia’s top four customers have ordered 3.6 million Blackwell chips already, it’s no surprise Micron is completely sold out of its HBM3E solutions for calendar year 2025, and is already experiencing strong demand for its 2026 supply.

Micron says the market for HBM was worth $16 billion in 2024, and it’s set to more than double to $35 billion this year. It could then be worth $100 billion by 2030, so there will be astronomical financial rewards for staying ahead of the competition. To do that, Micron plans to launch its next-generation HBM4 solution for the data center in 2026, which will deliver a whopping 60% increase in bandwidth over HBM3E.

But Micron’s AI opportunity transcends the data center. As chips become more powerful, AI workloads will shift to PCs and smartphones, allowing chatbots and other applications to run offline. This will create a faster user experience and make those applications accessible from anywhere, even without an internet connection. The company says AI PCs already require a minimum DRAM (memory) capacity of 16 gigabytes, up from an average of 12 gigabytes for non-AI PCs last year.

Similarly, most AI smartphones now require 12 gigabytes of capacity or more, up from 8 gigabytes last year. Micron’s smartphone memory solutions are used in a number of Android-powered devices from top manufacturers including Samsung.

More memory capacity translates into more revenue for Micron, so the company is positioned to win whether AI workloads are processed in data centers or on devices.

Micron’s revenue is soaring, led by the data center

Micron generated $8 billion in total revenue during its fiscal 2025 second quarter, which represented a 38% increase compared to the year-ago period. However, there was a much bigger growth story beneath the surface of the headline number.

Revenue from Micron’s compute and networking segment, which is where it accounts for its data center memory sales, soared by a whopping 109% to a record $4.6 billion. HBM alone contributed $1 billion of that figure, which was also a record high.

On the flip side, revenue from Micron’s mobile segment shrank by 33% to $1 billion, as customers had some built-up inventories that softened demand. However, the company expects to see modest growth in the mobile business this calendar year overall, especially as AI smartphone adoption kicks into high gear.

Micron’s strong top-line result led to a significant increase in its profitability, with earnings per share (EPS) doubling to $1.41. That trend is likely to continue in the current fiscal 2025 third quarter — the company is forecasting $8.8 billion in revenue and $1.37 in EPS, representing year-over-year growth of 29% and 356%, respectively.

Micron stock looks like a bargain relative to its peers

Since some of Micron’s hardware, like its HBM3E memory, is already sold out for this year, there is a degree of predictability to the company’s financial results. Wall Street’s consensus forecast (according to Yahoo! Finance) suggests its EPS will come in at $6.93, placing the stock at a forward price-to-earnings (P/E) ratio of just 13.6.

That’s a 40% discount to AMD’s forward P/E ratio of 22.7, and an even steeper 47% discount to Nvidia’s forward P/E ratio of 25.9:

NVDA PE Ratio (Forward) data by YCharts
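For readers who want to verify those discount figures, here is a quick back-of-the-envelope check in Python. It's a minimal sketch using only the numbers cited above; the roughly $94 implied share price is inferred from the $6.93 estimate and the 13.6 multiple rather than quoted anywhere in this article.

# Rough check of the valuation gap using the figures cited above.
# The implied Micron share price is an inference, not a quoted number.
mu_eps_estimate = 6.93     # Wall Street consensus EPS estimate (Yahoo! Finance)
mu_forward_pe = 13.6
amd_forward_pe = 22.7
nvda_forward_pe = 25.9

implied_mu_price = mu_eps_estimate * mu_forward_pe      # roughly $94 per share
discount_vs_amd = 1 - mu_forward_pe / amd_forward_pe    # roughly 40%
discount_vs_nvda = 1 - mu_forward_pe / nvda_forward_pe  # roughly 47%

print(f"Implied MU share price: ${implied_mu_price:.0f}")
print(f"Discount vs. AMD: {discount_vs_amd:.0%} | vs. Nvidia: {discount_vs_nvda:.0%}")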

As I highlighted earlier, Nvidia is using Micron’s HBM3E in its flagship GPUs like the GB200 and GB300. Since Nvidia already has orders for millions of those chips, Micron is likely to experience monumental sales growth in tandem with Nvidia over the next couple of years. As a result, it’s hard to justify the steep discount on Micron stock.

Plus, Micron will benefit significantly as AI workloads shift from data centers to PCs and smartphones, so the company is perfectly positioned to capitalize on this technological revolution. As a result, I think Micron stock could be a great addition to any balanced portfolio.


