

🔥🚀 Jensen Huang is playing a much bigger chess game: $NVIDIA(NVDA.US) is not just selling computing power, but is "deciding who can use computing power".
In the past, $NVIDIA(NVDA.US) was always seen as the "shovel seller".
But over the past six months, I've started to realize that this judgment is outdated.
The real change is this: Jensen Huang is no longer just supplying GPUs; he is using capital to redesign the entire power structure of the AI industry.
The nearly $19 billion invested over the past six months is not a set of passive financial positions; it looks more like a complete "ecosystem control system".
When I break it down, the logic becomes very clear: this is not about spreading bets, but about building systematically around two core questions: how is computing power demand amplified, and who will consume that computing power?
The first layer, which I think is the most critical: controlling the underlying bottlenecks.
$Synopsys(SNPS.US)
$Intel(INTC.US)
When AI reaches the scale of 100,000-GPU clusters, the real constraint is no longer single-GPU performance, but design efficiency, packaging capacity, and data transmission.
Investing in EDA ($Synopsys(SNPS.US)) is essentially about accelerating how fast the next generation of GPUs can be born.
Investing in $Intel(INTC.US) adds a strategic backup for advanced packaging and foundry capacity, reducing dependence on a single supply chain.
Looking further, the optical communication line is more direct.
$Lumentum(LITE.US)
$Coherent Corp.(COHR.US)
Without 800G and 1.6T transmission capability, even the strongest GPUs are islands.
The logic here is simple: the stronger the computing power, the more valuable the network.
The second layer, which I think is the most "ambitious" part: the distribution rights of computing power.
$Coreweave(CRWV.US)
$Nebius(NBIS.US)
Traditional cloud providers are both its customers and its potential competitors.
If future computing power ends up controlled entirely by platforms like AWS and Azure, then $NVIDIA(NVDA.US) is effectively "platformized": reduced to a supplier beneath someone else's platform.
So instead, it is backing a group of cloud providers that sell nothing but GPU computing power.
$Coreweave(CRWV.US) getting priority supply of the most advanced GPUs is not ordinary cooperation; it is the creation of new computing power distribution centers.
In other words, $NVIDIA(NVDA.US) is avoiding being marginalized.
The third layer, which I think many people underestimate: energy.
AI is not a software problem; it's essentially a power problem.
When data center scale explodes, electricity will become the real ceiling.
Investing in energy-plus-computing models like Crusoe is essentially aimed at solving one problem:
"When global computing power demand explodes, where does the electricity come from?"
If this line plays out, it will become a fully independent major track of its own.
The fourth layer is the future demand engine.
This is the part I think is most worth watching.
$Eli Lilly(LLY.US)
$Recursion Pharmaceuticals(RXRX.US)
$Nokia(NOK.US)
AI drug discovery, edge communications, humanoid robots, autonomous driving…
These seem scattered, but the logic behind them is the same:
Pulling AI from "inside the screen" into the "physical world".
Once AI enters the real world, computing power demand does not grow linearly; it is amplified exponentially.
Training will converge, but inference will explode.
And inference is the long-term cash flow.
The fifth layer is the true core of the moat: software binding.
OpenAI, xAI, Mistral, Cohere, Perplexity…
These are not "investment hotspots", but the "lock-in mechanism" of the CUDA ecosystem.
As long as these models continue to rely on CUDA for optimized training and inference, the hardware advantage will not be easily replaced.
This is the deepest layer.
Looking back at this $19 billion now, I no longer see it as "investment".
It's more like an active offense:
Not waiting for demand to appear, but creating the demand itself.
Not waiting for customers to grow, but locking in future customers in advance.
Not selling products, but defining the operating method of the entire industry.
This is also why I increasingly feel that the valuation debate about $NVIDIA(NVDA.US) is not really about "expensive or not", but about this:
Has the market understood that it has transformed from a "chip company" into "ecosystem-level infrastructure"?
If this judgment holds true, then future opportunities will likely not only be concentrated in $NVIDIA(NVDA.US) itself, but will continuously spread along the flow of its capital.
Optical communication, energy, edge computing, AI applications…
These are the areas that will be amplified.
The question also becomes more specific:
When a company starts to decide "who can use computing power", where exactly are its boundaries?
Do you prefer to continue betting only on $NVIDIA(NVDA.US) itself, or follow its investment map to find the next layer of opportunity?

The copyright of this article belongs to the original author/organization.
The views expressed herein are solely those of the author and do not reflect the stance of the platform. The content is intended for investment reference purposes only and shall not be considered as investment advice. Please contact us if you have any questions or suggestions regarding the content services provided by the platform.

