The gamble carries enormous risk and potential reward. Memory shortages have already sent prices soaring, reshaping global supply chains and profit margins. For Micron, every delayed shipment or misstep could ripple across the AI ecosystem, influencing how cloud providers, chip designers, and AI startups scale in the coming years.

For decades, memory chips were low-margin commodities. Now, demand has outstripped supply as AI workloads explode. Micron Technology, the largest American maker of memory chips, is racing to expand production capacity with a $200 billion investment in new factories.
In Boise, Idaho, controlled explosions shake the ground each afternoon as engineers flatten bedrock for a new semiconductor plant. The company is spending $50 billion to more than double its 450-acre campus, including two massive chip factories. The first is expected to produce DRAM wafers for high-bandwidth memory chips, or HBM, by mid-2027. Both plants should be online by the end of 2028. Each factory will cover 600,000 square feet, using 70,000 tons of steel and 300,000 cubic yards of concrete.
Micron is also investing in other sites. Near Syracuse, New York, the company broke ground on a $100 billion fab complex, marking the state’s largest private investment. It also announced a $9.6 billion fab in Hiroshima, Japan. Competitor SK Hynix is building a $13 billion fab in South Korea and a $4 billion facility in Indiana.
The rush to expand comes as large language models grow in size and complexity. Firms like OpenAI, Oracle, xAI, and Anthropic are building data centers at an unprecedented scale, increasing demand for faster, higher-capacity memory chips. Processors from Nvidia, Google, Broadcom, and AMD require more memory for both training AI models and performing inference.
“I’ve been here for 28 years, and I’ve never seen anything so disruptive as AI,” Scott Gatzemeier, the Micron vice president overseeing the U.S. expansion, told the WSJ. “As we started to transfer from training to inference, the amount of data required just exploded, and we just didn’t have enough clean-room capacity to satisfy demand. We realized we had a huge problem.”
The shortage has fueled a gold rush. Since April 2025, Micron’s share price has more than sextupled to around $414, giving the company a market value near half a trillion dollars. As it moves toward more profitable products such as HBM chips for data centers, gross margins have jumped from 18.5 percent in early 2024 to 56 percent in the most recent quarter. Micron expects margins in the current quarter to reach 68 percent.
“Our business is on an extraordinary trajectory,” said Mark Murphy, Micron chief financial officer, at an investor conference. The company currently meets only about half to two-thirds of demand for some key clients. Buyers are seeking multiyear contracts to lock in supply and avoid price spikes.
Historically, memory-chip makers have been vulnerable to boom-and-bust cycles. After the pandemic, rising interest rates and inflation caused a pullback in device sales, leaving warehouses full of unsold chips and wiping out tens of billions of dollars in market value. Companies cut production to stabilize prices, but the AI surge since 2024 has reversed the trend. Contract prices for DRAM chips rose more than 170 percent over the last year, and prices for DDR5 chips used in AI servers have jumped nearly 500 percent since September 2025.
“We’re nowhere near the end of the shortage,” said Brad Gastwirth, head of global research at Circular Technology, a data-center reseller. “I think it lasts through the end of 2026 and into at least the first half of 2027.”
Sumit Sadana, Micron chief business officer, said the company has accelerated construction of its second Boise fab as data-center projects expanded. “Memory has gone from being a system component to being a strategic asset,” he said. “The promise of AI is all ahead of us.”
Even as demand surges, Micron faces pressure from competitors and customer perceptions. Earlier in February, SemiAnalysis reported that Nvidia had rejected Micron’s HBM4 chips for its new Vera Rubin AI servers. Sadana called the reports inaccurate, stating that Micron is already shipping HBM4 and HBM3e chips, with supplies sold out through the end of the year.
Micron’s investments reflect a transformation in the memory market. Once a commodity business, memory is now central to AI’s future, and the company is betting billions to avoid becoming a bottleneck.