The Next Trillion-Dollar AI Opportunity Is Here

(Kittipong Jirasukhanont/Dreamstime)

By Stephen McBride | Thursday, 25 September 2025 11:29 AM EDT

Artificial intelligence (AI) is no longer just a buzzword—it’s the driving force behind the biggest infrastructure buildout in history.

Tech giants are spending over $300 billion this year alone to expand their AI capabilities. That’s more than Switzerland spends running its entire government.

And yet, we’re only in Phase 1.

The next leg of the AI boom—Phase 2—will be even more disruptive and profitable.

Phase 1 was about Nvidia (NVDA) GPUs and training giant AI models.

Phase 2 is about three massive shifts now underway that will define where the next set of AI winners come from.

Let’s walk through each shift below…

Shift #1: Sovereign AI, a brand-new, deep-pocketed customer

People often compare today’s AI boom to the dot-com bubble, with Nvidia as the Cisco Systems (CSCO) of its day.

The big difference: Cisco’s customers were debt-fueled. Nvidia’s customers are the largest, most profitable companies in the world.

As mentioned, US big tech companies will spend over $300 billion building out their AI data centers this year. That was Phase 1.

Phase 2 is even bigger: Sovereign AI.

Sovereign AI means every country building its own “AI factory”: AI models trained on local data, running on local compute, under local control.


If you believe, as I do, that AI will run healthcare systems, defense planning, and education in the coming decades… do you really want those systems sitting on foreign chips in foreign data centers?

Imagine running America’s nuclear fleet on Chinese AI. Exactly.

Governments are already writing billion-dollar checks to make sovereign AI possible:

  • Europe is building out national supercomputers to give startups access to AI compute.
  • The UK just switched on its most powerful supercomputer ever.
  • India approved a 38,000 GPU national cluster to train models in 22 local languages.
  • Saudi Arabia earmarked $100 billion to turn itself into a regional AI powerhouse.

Canada, Japan, and South Korea are all in, too.

Soon, governments will spend more money on AI than big tech. Nvidia’s Jensen Huang put it best: “Every country will have an AI factory, just like every country has a telco.”

This is effectively a giant wealth transfer from the rest of the world to the US companies that make AI gear.

And that’s a great thing for clued-in investors.

Shift #2: From training to inference: compute goes prime time

When people talk about AI infrastructure, they usually mean thousands of GPUs training a giant model.

That’s the “training” phase.

Training is like building the engine of a Ferrari. You do it a handful of times, and it’s costly and complex. But once it’s built, it’s ready to race.

The real money is in driving the Ferrari every day, aka “inference.”

Every time you ask ChatGPT a question or have it generate an image, that’s inference. And usage is exploding.

Last spring, Google’s (GOOGL) Gemini models were processing 480 trillion tokens per month. That number has since more than doubled to 980 trillion. ChatGPT will soon hit 1 billion monthly active users!

This changes everything about AI.

Training can be done in a remote desert data center using cheap power. Inference needs to sit closer to the user, run 24/7, and deliver answers in real time.

But that makes inference much more costly.


Nvidia isn’t the big winner here. Walk into a data center, and alongside the GPUs you’ll see giant cooling fans… storage disks… memory chips… networking cables… and so on.

More “inference” means more heat… more energy consumed… more memory needed… and more data flowing through thick networking cables.

This shift changes where the money flows.

To move data faster between chips, these clusters need more connections per rack and faster optical links, driving a massive upgrade cycle in optics.

On Nvidia’s new GB200 racks, the optics bill alone can top $500,000. And demand for low-latency communication favors cutting-edge, high-speed Ethernet networking for AI.

The key takeaway is we’re shifting from huge, one-off training runs to continuous infrastructure spend. Inference creates sustained demand for AI gear. If the training era was Nvidia’s show, the inference era is where the broader ecosystem shines.

Get ready for a mad dash to overhaul AI data centers.

Shift #3: Data center spending is rotating to power, cooling, and networking

In 2006, Google built its first serious data center in Oregon. It cost about $600 million, a big deal at the time.

Fast-forward to 2025, and the bill for OpenAI’s “Stargate” project with Oracle Corp. (ORCL) and SoftBank is coming in at $500 billion! Individual sites within this project will consume more power than a small city.

These aren’t your father’s data centers. One AI rack now guzzles the same amount of electricity as a dozen legacy racks.

The big change is that AI data centers must act as one giant computer, not a warehouse of single servers.

Training and inference require every GPU to talk to every other GPU at blistering speeds. That’s why companies are packing chips closer together and spending billions on optical networking gear.

The bottleneck is no longer chips. It’s power.

A rack of servers stuffed with AI chips needs about 10X more power than a normal “cloud” server. Meta’s Hyperion cluster in Louisiana alone will consume roughly as much power as all of San Francisco.

Electricity = oil for AI.

AI’s huge power demands are also forcing the whole industry to shift to liquid cooling.

Old-school cloud centers were cooled like office buildings with giant air conditioners pushing cold air through rows of servers. But AI chips run so hot, air can’t carry the heat away fast enough to keep them from melting (literally).

That’s why liquid cooling is becoming the go-to. Water is 17X more effective than air at removing heat.

What does this mean for AI investors?

The next trillion dollars of AI spending won’t go toward chips alone…

It will flow to the companies providing the pipes and power needed to keep AI data centers running. This is where the Phase 2 fortunes will be made.

If you want to keep up with these shifts as they unfold in real time—along with the best ways to profit from them—make sure you’re signed up for my free disruption-focused letter, The Jolt.

I regularly discuss AI in The Jolt and the best ways to profit from this fast-moving megatrend.

It’s quick and easy to sign up. Just go here to join today.

______________

Stephen McBride is Chief Analyst, RiskHedge. To get more ideas like this sent straight to your inbox every Monday and Friday, make sure to sign up for The Jolt, a free investment letter focused on profiting from disruption.

© 2025 Newsmax Finance. All rights reserved.

