An analyst reviews AI funding cycles and adoption trends, raising concerns about the reality of AI growth demand.
Imagine a closed-loop system where one part is perpetually paying another to fuel an impression of growth. That is the essence of the concern surrounding the current AI investment frenzy.
AI growth demand has reached historic levels, yet as former Intel CEO Pat Gelsinger recently cautioned, a significant portion of this investment may be “circular”: major AI firms are effectively funding their own growth rather than demonstrating genuine, sustainable real-world adoption.
This scrutiny arrives at a critical juncture. The AI sector is witnessing unprecedented capital inflow, with billions poured into startups, infrastructure, and chip development.
But is this surge driven by end-user demand for AI-powered products or by the internal financing ecosystem of the tech giants themselves? Understanding this distinction is vital to grasping the true stability of the AI economy.
The Ecosystem: A Closed Circuit of Capital
What exactly is “circular financing” in the context of technology? It is a scenario where the primary buyers of a service or product are the companies that sell the foundational tools for that service. Think of it like a chain reaction within a small community of major players.
For AI, this loop involves several key components:
- The Infrastructure Sellers (e.g., Cloud and Chip Makers): These companies sell the indispensable processing power, specialized chips, and cloud services required to train and run large AI models.
- The AI Model Developers (e.g., Generative AI Startups): These firms raise capital from venture capitalists and tech giant corporate venture arms. They then immediately spend the vast majority of that capital on buying chips and cloud compute time from the Infrastructure Sellers to train their models.
- The Financiers (e.g., Tech Giants and VCs): They invest in the AI Model Developers, completing the loop by recirculating capital back to the Infrastructure Sellers, whom they may also own or rely on heavily.
The result is a dizzying spike in revenue for the Infrastructure Sellers, which is then cited as evidence of massive AI growth demand. However, the money is moving in a circle, fueling internal production without yet demonstrating widespread, profitable, real-world utility for the AI models themselves.
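The loop above can be sketched as a toy model. All figures, names, and the spend ratio below are hypothetical illustrations, not data from any real company: venture capital flows to a model developer, which immediately spends most of it on compute, inflating the infrastructure seller's headline revenue even when external customers contribute little.

```python
# Toy model of circular AI financing (all figures hypothetical).

def run_funding_cycle(vc_investment, compute_spend_ratio, external_revenue):
    """Trace one funding round through the closed loop.

    vc_investment        -- capital raised by the AI model developer
    compute_spend_ratio  -- fraction of that capital spent on chips/cloud
    external_revenue     -- revenue the developer earns from real end users
    """
    # Capital recirculated straight back to the infrastructure sellers
    infra_revenue_from_loop = vc_investment * compute_spend_ratio
    # Headline "AI demand" counts internal and external spending alike
    headline_demand = infra_revenue_from_loop + external_revenue
    # Share of the headline figure that is self-funded, not end-user driven
    circular_share = infra_revenue_from_loop / headline_demand
    return headline_demand, circular_share

# A startup raises $500M, spends 80% on compute, earns $20M externally
headline, circular = run_funding_cycle(500e6, 0.80, 20e6)
print(f"Headline demand: ${headline / 1e6:.0f}M, circular share: {circular:.0%}")
# → Headline demand: $420M, circular share: 95%
```

The point of the sketch is the last line: a headline demand figure dominated by recirculated capital looks identical, on the seller's income statement, to demand driven by paying end users.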
The Question of True Utility
The core problem is separating the production metric from the consumption metric. The industry is currently measuring success by how many chips are sold or how much compute time is billed, which reflects production capacity. A more stable metric would be how much value is created downstream for non-tech businesses or everyday consumers; that is, real consumption.
If a startup raises a half-billion dollars and spends 80% of it on computing infrastructure, that $400 million counts as AI growth demand for the cloud provider. Yet, if the final AI product the startup creates cannot generate significant revenue from external customers, the financial engine is effectively running on its own fuel and may stall when the internal financing dries up.
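To make the stall condition concrete, here is a minimal sketch (all numbers hypothetical) of what happens to the infrastructure seller's revenue once new funding stops and only external revenue replenishes the developer's capital:

```python
# Hypothetical sketch: infrastructure revenue once internal financing dries up.
# Each round, the developer spends a fixed fraction of its remaining capital
# on compute; only external revenue tops the capital back up.

def simulate_rounds(capital, external_revenue_per_round, compute_spend_ratio, rounds):
    infra_revenue = []
    for _ in range(rounds):
        spend = capital * compute_spend_ratio   # compute purchased this round
        infra_revenue.append(spend)
        capital = capital - spend + external_revenue_per_round
    return infra_revenue

# $500M raised once, $20M/round from real customers, 80% spent on compute
revenue = simulate_rounds(500e6, 20e6, 0.80, 5)
for i, r in enumerate(revenue, 1):
    print(f"Round {i}: infrastructure revenue ${r / 1e6:.0f}M")
```

Under these assumptions the seller's revenue collapses from $400M in the first round toward roughly the level that external customers alone can sustain, which is the abrupt correction the production-side metric conceals.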
Strategic and Societal Implications
This dynamic carries two major implications. Strategically, the potential for a bubble is real. If the surge in hardware and cloud revenue is tied more to a funding cycle than a usage cycle, the market correction could be abrupt if the AI Model Developers fail to monetize their products or if investment slows.
Societally, this model centralizes control. The few companies that control the infrastructure are the primary beneficiaries, creating massive moats around their businesses. This concentration of power raises competitive and ethical questions about who dictates the future of artificial intelligence.
The key takeaway is a call for discernment. The sheer technological progress in AI is undeniable and awe-inspiring. But for investors, business leaders, and policymakers, the real challenge is looking past the headline revenue figures and focusing on where the money is actually coming from and, crucially, whether it is generating genuine, profitable use cases that justify the massive infrastructure spend.
The future of AI relies less on how much compute power we can build and more on how much lasting value that power can deliver to the wider economy.