Over the past few years, the commercial adoption of large language models (LLMs) has made substantial progress, driving massive and rapid inflows of capital into the AI investment market. As AI technology continues to advance, the competitive and cooperative dynamics among AI infrastructure providers and platform operators are taking shape. AI has evolved from “single-point innovation” to “system-level competition,” where suppliers and cloud providers are forming interconnected ecosystems. Nvidia’s recent wave of strategic investments serves as a prime example.
Below, we take an in-depth look at the major developments currently shaping the AI investment landscape.
Key Highlights in Recent AI Investments

Oracle & OpenAI
OpenAI and Oracle announced a $300 billion deal last month, scheduled to begin in 2027. The partnership is part of Project Stargate: Oracle will provide the cloud infrastructure and power management systems OpenAI needs. Under the agreement, Oracle will supply up to 4.5 GW of computing power capacity, while OpenAI has committed to long-term usage of Oracle Cloud Infrastructure (OCI) services.
Want to learn more about Project Stargate? You can read this article: What is the Stargate Project? What are the challenges? Which Taiwanese concept stocks might be involved?
Oracle & Nvidia
To support future computational growth, Oracle plans to significantly expand its Nvidia chip procurement. One of the first deployment sites will be the new hyperscale data center in Abilene, Texas, with procurement alone estimated at around $40 billion. This will substantially boost Oracle’s AI computing power on its cloud platform and strengthen its role as one of OpenAI’s key infrastructure providers under the Project Stargate framework.
Nvidia & OpenAI
Nvidia plans to invest up to $100 billion in OpenAI in multiple stages, starting with an initial $10 billion for the construction of Project Stargate data centers. Under the agreement, OpenAI will purchase large quantities of Nvidia’s Blackwell-series accelerator chips in the coming years. This marks another deep collaboration following Nvidia’s earlier participation in OpenAI’s funding round.
OpenAI & AMD
OpenAI and AMD have reached a new strategic partnership aimed at reducing OpenAI's dependency on a single supplier. Under the deal, AMD will gradually supply OpenAI with GPU computing capacity totaling up to 6 GW over the coming years. The first phase, deploying 1 GW of Instinct MI450 GPUs, will start in the second half of 2026, with expansion to the full 6 GW planned later. In exchange, AMD granted OpenAI warrants to purchase up to 160 million AMD common shares at a symbolic exercise price of $0.01 per share, with vesting tied to AMD's stock price and the scale of GPU deployment.
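To see why a $0.01 exercise price is described as symbolic, a quick back-of-the-envelope calculation helps. The warrant size (up to 160 million shares) and the $0.01 exercise price come from the announcement above; the market price used in the sketch below is purely a hypothetical placeholder for illustration, not a real quote.

```python
# Back-of-the-envelope look at the AMD warrant terms described above.
# Known from the announcement: up to 160 million shares at a $0.01 exercise price.
# The market price below is a HYPOTHETICAL assumption for illustration only.

warrant_shares = 160_000_000        # shares OpenAI may purchase under the warrants
exercise_price = 0.01               # USD per share (the "symbolic" price)
assumed_market_price = 200.00       # USD per share, hypothetical placeholder

exercise_cost = warrant_shares * exercise_price          # what OpenAI would pay to exercise
gross_value = warrant_shares * assumed_market_price      # market value of the shares received
intrinsic_value = gross_value - exercise_cost            # value net of the exercise cost

print(f"Cost to exercise all warrants:  ${exercise_cost:,.0f}")   # $1,600,000
print(f"Value at assumed market price:  ${gross_value:,.0f}")     # $32,000,000,000
print(f"Implied intrinsic value:        ${intrinsic_value:,.0f}") # $31,998,400,000
```

At any plausible market price, the exercise cost is negligible relative to the value of the shares, which is why the warrants function less like an option and more like a deployment-linked equity grant.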
Nvidia & Intel
Nvidia plans to invest $5 billion in Intel, acquiring approximately 215 million shares at about $23 per share as part of a broader strategic investment. The market widely views this as a move to help Intel strengthen its position in AI chips and collaborate with Nvidia on optimized CPU-GPU integration.
Nvidia & CoreWeave
Nvidia has long cultivated its AI ecosystem and has built a close partnership with GPU-focused cloud provider CoreWeave. CoreWeave, an early recipient of Nvidia's investment and technical support, has since become one of Nvidia's most important computing partners in North America.
Interactions Under the New Capital Model: The Perpetual AI Capital Machine
As investment deals among the AI giants come to light, the collaboration between Nvidia, OpenAI, and Oracle has created what analysts are calling a "Perpetual AI Capital Machine." These companies are no longer just suppliers and customers; they are also cross-investors, forming a self-reinforcing cycle of capital and computing power. By investing directly in clients like OpenAI, Nvidia effectively secures long-term demand for its GPUs.
The Controversy: Bubble Risk
The “win-win loop” Nvidia is creating implies a feedback cycle—more investment leads to more computing power, which produces stronger models. Better models generate better products and attract more end users, which in turn drives more demand for computing power.
However, some analysts warn that this seemingly virtuous cycle carries risks. If market growth slows or commercial applications fail to meet expectations, this mutual-investment model could lead to overlapping revenue recognition and capital recycling, producing circular revenue in which investments and sales are effectively double-counted across corporate financial statements.
Summary
From a strategic standpoint, this cross-investment model highlights a form of vertical integration in the AI ecosystem: Nvidia controls chip and compute supply, Oracle builds the cloud and energy infrastructure, and OpenAI leads innovation at the application and model layer. This alignment could accelerate growth among top AI leaders—but it may also reduce overall market diversity and competition.
