Dispelling the boogeyman of AI circularity
Portfolio Manager Denny Fish draws a contrast between recent artificial intelligence (AI) transactions and those of the dot-com era, noting that rather than being an example of unproductive “circularity,” the current crop of deals represents a potentially virtuous circle for a nascent industry.

5-minute read
Key takeaways:
- A series of large transactions between major players within the AI ecosystem have led some analysts to question the economic merits of these deals.
- Rather than the sometimes-dubious practice of vendor financing, the current spate of transactions, which often include equity stakes, seeks to align the nascent AI ecosystem with its objective of meeting massive demand for computing capacity.
- Investments between AI platforms and chipmakers should create a flywheel premised on greater computing capacity and innovation leading to increased demand and ultimately resulting in monetization and additional investment.
Since the 2022 release of ChatGPT, AI has been the driving force behind the equity market’s historic gains. Invariably, when stocks leap from record close to record close, naysayers appear, seeking to poke holes in a rally’s underlying thesis. Such critiques are an essential component of markets operating efficiently. Some criticisms, however, hold up better than others.
Those concerned that the AI rally has become unmoored from economic fundamentals have recently bandied about the term circularity. It’s not a compliment. Instead, it’s a not-so-veiled reference to the sometimes-dubious business practice of vendor financing that gained popularity during the 1990s rise of the Internet.
Critics point to the recent slew of deals between AI ecosystem heavyweights. Chief among these is graphics processing unit (GPU) maker Nvidia inking a $100 billion transaction with ChatGPT creator OpenAI – a tie-up that includes a 10% equity stake in the startup.
On this transaction’s heels, OpenAI struck another deal with chipmaker AMD, under which it gains access to roughly six gigawatts of GPU power. In contrast to the Nvidia deal, the equity flows the other way: OpenAI gains the ability to take up to a 10% position in AMD should the partnership achieve certain milestones. And just this week, Broadcom got in on the action with its own partnership with OpenAI for data center access.
In contrast to the vendor financing of an earlier era, these transactions, in our view, represent an effort by a nascent industry to address a gaping shortfall in AI computing capacity. That shortfall reflects widely accepted forecasts of voracious demand over the next decade: some estimates call for AI infrastructure investment to rise from this year’s $600 billion to potentially as high as $4 trillion by decade’s end1.
The players
To understand why these AI pioneers are taking steps to align their interests, one must assess the current landscape – namely, the frenetic race to acquire still-to-be-produced computing capacity. The initial assumption that the AI training phase would be the most compute-intensive largely proved off target. The rise of test-time inference – AI’s operational phase – has required far more computing power than anticipated, given the need for models to digest the vast quantities of new data created with each iteration of AI queries.
To fortify their models’ rapidly expanding capabilities, AI platforms such as OpenAI need additional computing capacity in the form of advanced GPUs. As evidenced by the values attached to recent deals, this costs a lot of money. Given its swollen order books, Nvidia has funds to invest. For its part, OpenAI will deploy this capital by building out data centers powered by Nvidia chips. Central to this thesis is the anticipated demand from a massive opportunity set of global customers seeking to leverage OpenAI’s capabilities.
OpenAI’s deal with AMD also aims to secure additional computing capacity over the next several years. In addition to OpenAI purchasing multiple generations of AMD chips, the companies will become even more aligned due to the possibility of the AI platform taking up to a 10% ownership stake in the chipmaker. In both transactions, these players are synchronizing their economic interests so they can together achieve the computing capacity necessary to propel AI to its next stage: widespread implementation across the broader economy.
Hope is not a strategy
Vendor financing has long been an accepted – and often viable – business practice, provided the vendor’s intention is to help customers maintain sufficient cash flow in the ordinary course of operations. This was often not the case in the 1990s, when vendors were not helping customers meet downstream demand but rather selling their products on credit in the hope that future demand would ultimately materialize. In many instances, it didn’t.
This is where critics’ AI circularity argument falls flat. In contrast to the “if you build it, they will come” mantra of many early Internet platforms and fiber networks, the rapid advancement of AI platforms – and their equally rapid deployment by corporations, governments, and research institutions – hints, in our view, that current estimates could undershoot eventual demand.
The 1990s Internet analogy also fails to consider the differences between debt and equity financing. Vendor financing was much more transactional, seeking to achieve a near-term commercial objective. The risk – beyond demand for the purchased capacity never materializing – was that vendors entered these lending agreements merely to inflate their own order books.
By its nature, the residual ownership of an equity stake means the financing partner benefits only when its customer is successful. In this respect, the current crop of AI-related transactions is the most recent in a series where key players seek to ensure this nascent ecosystem has sufficient infrastructure to meet anticipated demand. Earlier deals following similar strategies include Microsoft’s seed investment in OpenAI and Amazon investing roughly $8 billion in Anthropic, which entailed the AI startup using Amazon Web Services to host its models.
A virtuous circle?
Within this context, we don’t believe circularity – a slight, whether intended or not – is the most apt description of the collaboration occurring within a quickly evolving AI ecosystem.
More accurate, in our view, is a flywheel. The term implies a virtuous circle in which ongoing innovation and greater capabilities lead to higher demand and, ultimately, monetization. In formalizing relationships that could not only put AI adoption on a higher trajectory but also improve their own profitability, these companies are seeking to carry out one of their most important tasks: allocating capital effectively.
Lastly – and for needed perspective – the recently announced deals between AI players represent only a sliver of the revenue streams potentially in play. Nvidia’s deals at maturity, for example, could account for just 10% of its possible sales over this time horizon. The cross-fertilization between AI infrastructure companies should, in our view, lay the groundwork for accessing the much larger earnings pie of tech hyperscalers2, sovereign buyers, and innumerable other AI end users across the global economy.
2 Hyperscalers tend to be megacap technology and internet companies whose massive investment in capital expenditure is a key source of growth for technology infrastructure providers (e.g. GPU producers).