The AI gold rush has a dirty secret: most infrastructure investors are backing the wrong operators. As tech giants spend more than $320 billion on data centers this year alone, up from $241 billion in 2024, one question will separate the winners from the losers. It is not who has the most equipment. It is who can actually scale.
“Not all infrastructure investments are created equal,” said Nnadozi Odinaka, a strategic finance expert with an MBA from Georgia Tech who has advised business leaders in Africa, Europe and North America on the growing challenge, in an interview with the Guardian. “Investors are focused on availability and power capacity. The real question is whether the target operator has the balance sheet to scale at the speed AI demands.”
The numbers reveal the stakes. AI workloads consume 4 to 6 times more power per server than traditional computing. Racks packed with advanced GPUs can now draw up to 500 kilowatts, enough to power hundreds of homes. A single AI training cluster can require over 100 megawatts, comparable to the electricity demand of a small city.
But Odinaka, a specialist in financial risk advisory at a major consulting firm, sees investors consistently underestimating two risks: capital intensity and execution complexity.
“If data center operators cannot secure substations, cooling infrastructure and construction permits in advance, they will miss the market window entirely,” he warns. “By the time they compete for megawatts, they’ve already lost.”
Drawing on his Big Four audit and finance experience analyzing infrastructure risk, Odinaka points to several red flags.
The first is the lack of a power procurement pipeline. “Operators that are winning today secured power contracts two years in advance,” he points out. “If an operator is only starting to negotiate now, it’s already late.”
The second is outdated cooling technology. AI-scale rack density requires liquid cooling systems, which most legacy operators lack. Retrofit costs can exceed 50 percent of the original construction budget.
The third is thin financial headroom. “Cost overruns of 50 to 100 percent are common in this sector,” Odinaka says. “Operators need financial breathing room.”
For infrastructure investors accustomed to real estate fundamentals, Odinaka recommends a different framework.
“Stop looking at current capacity utilization; it’s backward-looking,” he advises. “Ask whether the operator can install two to three times its current capacity within 18 months. Ask whether it has reserved power capacity for the next five years. Ask whether its capital structure can absorb unexpected events.”
Winning operators share three characteristics: strong utility partnerships, modular construction capabilities, and diverse funding sources. “They treat scalability as a financial engineering problem, not just a construction problem,” he explains.
Economic research projects that AI could add $15 trillion to global GDP by 2030, but only if infrastructure expands proportionately. For prepared investors, that is the potential to create generational wealth. For the poorly prepared, it is a stranded asset.
“In five years, we will see which operators understand that data centers are the foundation of the AI economy,” Odinaka says. “The foundation determines how high you can build and how much value you can capture.”
His message to the infrastructure investment community is blunt: AI is real, infrastructure demand is real, but not all bets will pay off.
“This is not about choosing the operator with the most facilities,” he concludes. “The key is to identify who has the financial discipline, strategic foresight, and execution ability to scale profitably. The winners are the operators building smarter financial frameworks and securing critical inputs ahead of demand.”
For investors betting on the infrastructure layer of AI, scalability is not optional. Neither is the financial sophistication required to deliver it.