- The buildout of artificial intelligence (AI) infrastructure could be a $7 trillion opportunity.
- GPU designers are the most obvious beneficiaries of rising data center infrastructure spending.
- Memory specialist Micron is positioned to grow alongside GPU stocks as hyperscalers continue to pour immense sums into building AI data centers.
When it comes to artificial intelligence (AI) stocks, the semiconductor industry is one of the most closely followed.
Parallel processors from chip designers like Nvidia, Advanced Micro Devices, and Broadcom have become the hardware backbone of generative AI. Meanwhile, Taiwan Semiconductor Manufacturing remains perhaps the most lucrative pick-and-shovel AI trade in the market, given its leading position in the chip fabrication space.
Another chip stock that has put on an impressive show throughout 2025 is Micron Technology (NASDAQ: MU) -- its shares have soared 188% this year alone. After this kind of rally, investors might fear they've missed their chance to get into the stock, but I'd encourage taking a longer-term perspective.
Because Micron is such a critical piece of the broader chip landscape, there's a case to be made that even now, it remains an underrated AI stock.
Much of the chatter surrounding the semiconductor industry relates to how graphics processing units (GPUs) and custom application-specific integrated circuits (ASICs) are being deployed to provide processing power in data centers. For this reason, specialty players like Micron have largely been overshadowed by the likes of Nvidia, AMD, and Broadcom.
The interesting nuance to understand here, however, is that Micron doesn't even compete with these companies. Rather, it's positioned to grow alongside its larger semiconductor peers.
At its core, Micron builds memory and storage chips for electronic devices and data centers. The company's DRAM (short-term memory), NAND (long-term storage), and high-bandwidth memory (HBM) solutions are essential for allowing the processors handling AI workloads to communicate with one another and process information efficiently across GPU clusters.
A recent report published by management consulting firm McKinsey & Company suggests that investment in AI infrastructure could reach roughly $7 trillion over the next five years. In addition, Goldman Sachs is forecasting that the major hyperscalers -- Microsoft, Alphabet, Amazon, and Meta Platforms -- could spend about $500 billion on AI capital expenditures (capex) over the next year.
