Tags: ai | memory | investment
OPINION

Memory Is the Next AI Bottleneck—and the Next Profit Opportunity


By Stephen McBride | Tuesday, 14 October 2025 09:49 AM EDT

Does the artificial intelligence (AI) megatrend feel a little long in the tooth?

ChatGPT will soon celebrate its third birthday.

And it’s been over two years since Nvidia’s (NVDA) blockbuster earnings lit the fuse on what’s become the most explosive megatrend of our time.

So it’s natural to wonder… is it time to move on?

Not even close.

A decade of investing in disruption has taught me a hard lesson: Megatrends last longer than you think. The real money is made by holding your winners—not rotating out of them too early.

AI is still the megatrend of our time… and the AI buildout is only accelerating.

When the music’s turned up to volume 10, you don’t sit out. You dance.

We’re still early in a multi-decade AI cycle. But the money is shifting—and that’s where the next wave of opportunity lies.

This isn’t a bubble. It’s a boom.

This year alone, big tech is expected to spend over $350 billion building AI data centers.

Many investors see that kind of spending and think “bubble.”

Some common pushbacks I hear, and my take:

“Where’s the revenue?”

The biggest AI spenders are shelling out far more on AI than they're earning from it. But revenue follows usage. And usage is exploding.

ChatGPT is the fastest-growing product in human history, with over 700 million weekly users. Google’s (GOOGL) Gemini recently shot to the top of the App Store.

This is what mass adoption looks like.

“They’re overspending.”

That’s not how the insiders see it.

Google co-founder Larry Page has said internally that he’s willing to go bankrupt rather than lose the AI race.

Meta Platforms (META) founder Mark Zuckerberg put it even more bluntly:

"If we end up misspending a couple hundred billion dollars, I think that is going to be very unfortunate, obviously. But what I'd say is I actually think the risk is higher on the other side."

Anyone trying to call the top of the AI spending boom needs to remember those two quotes.

The people spending the money aren’t optimizing near-term “return on investment.” They believe the prize is a “digital-god-scale” platform worth trillions over decades.

When the prize is that large, the rational move is to keep investing.

“They’ll run out of cash.”

Not likely.

Big tech companies have so much spare cash lying around, they don’t know what to do with it.

They don’t have to spend this aggressively. They’re choosing to because they know what’s at stake by being left behind in the AI race.

The AI trade is entering Phase 2

Phase 1 of the AI boom was simple: Buy the companies building cutting-edge GPUs.

Thanks to that trend, Nvidia—the undisputed king of AI chips—became a trillion-dollar juggernaut almost overnight.

But we’re now entering Phase 2: feeding the GPUs.

AI data centers aren’t just rooms full of Nvidia chips. They’re three projects running at once: the brains, the plumbing, and the real estate.

For every $100 spent on a new AI data center, here’s roughly where the money goes:

  • The brains: $55–$65. AI chips, plus the high-speed networking gear, fiber optics, and switches that let chips talk to each other at light speed.
  • Power and cooling: $25–$35. All the equipment that gets electricity in and heat out. Giant transformers and the new generation of liquid-cooling loops that keep racks from melting down.
  • Land and buildings: $10–$20. The dirt, the concrete, and the pipes. Buying the site and putting up the shell of the building.
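The split above is easy to sanity-check with a quick back-of-envelope calculation. Here's a minimal sketch using the midpoints of the ranges quoted above; the category names and dollar figures are the article's rough estimates, not audited numbers:

```python
# Back-of-envelope: where each $100 of AI data center capex goes.
# The (low, high) ranges are the article's rough estimates, not exact figures.
ranges = {
    "brains (chips + networking)": (55, 65),
    "power and cooling": (25, 35),
    "land and buildings": (10, 20),
}

# Take the midpoint of each range as a point estimate.
midpoints = {name: (lo + hi) / 2 for name, (lo, hi) in ranges.items()}

for name, dollars in midpoints.items():
    print(f"{name}: ~${dollars:.0f} per $100 of capex")

# The midpoints sum to $105, not $100: these are loose, overlapping
# ranges, not a strict accounting.
print(f"midpoint total: ${sum(midpoints.values()):.0f}")
```

Note that the midpoints add to roughly $105 per $100 spent, which is expected: the ranges are approximations, not line items from a budget.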

Three years ago, 75% of AI spending went to chips. Today, that number is closer to 50% and falling.

Why? Because the industry is shifting from training AI models to actually running them.

And that subtle shift is creating a new round of AI winners.

Memory is the new bottleneck

There are two main types of computer chips:

  1. Logic: chips such as Nvidia's GPUs, which process data.
  2. Memory: chips that store and deliver data.

AI models require logic chips to process massive amounts of data, but also memory chips that can store and release it quickly. That’s why AI servers are memory hogs.

A typical AI server uses roughly 8x the memory of a traditional server. And with each generation, that multiple climbs higher.

Think of it like this: If the GPU is the chef, memory is the pantry.

If the pantry is down the hall, the chef wastes time running back and forth to grab ingredients. But if it’s right next to the stove, the chef can cook much faster.

Right now, most AI models spend over 90% of their processing time waiting for data. This is called the “memory wall”—and it’s slowing everything down.

In other words, to make better AIs, we need a new way of feeding them data.

HBM: The hidden engine of Phase 2

To break through the memory wall, we need a new type of memory chip.

Enter High-Bandwidth Memory (HBM).

Instead of laying memory chips flat, HBM stacks them like a skyscraper right next to the GPU. The result? Data only has to travel millimeters, not inches.

Less distance = more data per second = better AI.

Every next-gen AI system is now being designed around HBM. And that’s setting off what I believe to be a new memory supercycle.

Three powerful forces are driving this:

  1. Longer conversations: AI models that once handled short paragraphs are chewing through entire documents. That requires more on-hand memory, especially next to the GPU.
  2. Exploding usage: Google now processes 980 trillion AI "tokens" every month. AI agents are booking travel, running code, and even conducting experiments. This level of usage requires always-on memory… and lots of it.
  3. New buyers: Nvidia is still the top HBM buyer, but Google, Amazon (AMZN), and Broadcom (AVGO) are all rapidly snapping up supply to power their custom AI chips.

All three forces are hitting at once. That convergence is what turns an ordinary upturn into a supercycle.

With megatrends, it pays to own the bottleneck.

Phase 1’s bottleneck was GPUs. Phase 2’s bottleneck is HBM.

PS: The AI boom is far from over… but its center of gravity is shifting. To stay ahead of the curve, you need to know where the money is moving next.

This is a topic I plan to cover closely in my free investing letter, The Jolt. To learn more about the technologies driving Phase 2 of the AI boom, and the best ways to invest in them, make sure you’re on the reading list. Go here to join The Jolt today.

______________

Stephen McBride is Chief Analyst, RiskHedge. To get more ideas like this sent straight to your inbox every Monday and Friday, make sure to sign up for The Jolt, a free investment letter focused on profiting from disruption.

© 2025 Newsmax Finance. All rights reserved.

