
Who is actually paying for computing power? An analysis of the AI infrastructure funding chain from JPMorgan's perspective

NVIDIA's largest and most core source of revenue currently is its data center business, which essentially involves selling GPU/computing power systems.
Therefore, long-term investors in NVIDIA must think clearly about two questions: Who is actually paying for this computing power? Can it form a commercial closed loop?
The market constantly worries about excessive capital expenditures and whether the application side can generate profits—this fundamentally boils down to the same question:
Can the investment in computing power ultimately translate into "real cash flow from paying customers"?
Prompted by a recent credit research report from JPMorgan, I'll analyze AI's upstream and downstream from an 'industry chain + financing' perspective: What is the current situation, and where might the risks lie?
Over the past two years, discussions about AI investments have largely revolved around the stock market: Who sells more GPUs, whose models are stronger, whose cloud revenue grows faster.
JPMorgan offers a new perspective: Looking at it from the credit market (corporate bonds, loans, project financing), one obvious fact emerges:
AI is no longer a 'concept'—it is becoming 'hard assets': data centers, GPU clusters, power, distribution, land, fiber networks.
The essence of hard assets is that they require real money, often amplified by debt.
So, who is actually paying for AI computing power?
1. JPMorgan introduces the concept of 'AI debt': What is it?
JPMorgan screened a basket of 'highly AI-related' issuers (several dozen, around seventy in total) from investment-grade corporate bonds. These companies span multiple sectors: technology, unregulated power producers (independent power producers, IPPs), regulated utilities, capital goods & manufacturing, and some media/entertainment and data center/communication infrastructure REITs.
In the tech sector, almost all the familiar names are on the list:
Oracle, Apple, Broadcom, Amazon, Intel, Microsoft, IBM, Cisco, Dell, Qualcomm, NVIDIA, TSMC, Micron, Synopsys, Cadence, Salesforce, Adobe, ServiceNow, etc.
Under JPMorgan's own USD investment-grade index, they found that the weight of these 'AI-related issuers' has risen from 11.5% at the end of 2020 to 14.0% today.
My interpretation: AI’s capital expenditures have grown large enough to 'claim a spot' in the credit market—it can no longer be understood solely through stock narratives.
2. AI debt is growing rapidly—is it a bubble?
Many might assume: If AI stocks are soaring, bonds must also be inflated into a bubble.
But the logic of the credit market is entirely different from stocks:
• Stocks are bought for 'upside elasticity,' where sentiment and valuations can push extremes;
• Bonds are bought for 'repayment certainty,' with limited upside and painful downside, so the bond market is more conservative and focused on cash flow.
JPMorgan's general view: The spreads of this basket of 'AI-related investment-grade bonds' have not shown sustained, extreme outperformance or compression. Overall, the picture is more that higher-quality issuers trade slightly tighter—not a credit bubble-type distortion.
Simply put: They acknowledge that the rapid rise of AI stocks has indeed unsettled some credit investors, who fear stock price corrections could trigger chain reactions in the bond market. But from a fundamental perspective, this concern may be overestimated.
3. The 'primary payers' of AI: Hyperscalers
The real drivers of AI infrastructure are the budgets of a few cloud giants:
• Alphabet, Amazon, Meta, Microsoft, Oracle
What are they doing? In short: buying land and building.
Specifically: Building data centers, buying GPUs, laying networks, upgrading power distribution, securing power (PPAs), and managing cooling.
This is also why many companies exhibit a typical feature:
Capital expenditure growth > Revenue growth
—First, deploy the infrastructure, betting that future workloads will become 'essential.'
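To make the "capital expenditure growth > revenue growth" pattern concrete, here is a minimal sketch with purely hypothetical numbers: when CAPEX compounds faster than revenue, capex intensity (CAPEX as a share of revenue) ratchets up every year.

```python
# Toy illustration (all figures hypothetical, not actual company data):
# if CAPEX grows faster than revenue, capex intensity rises year over year.
revenue, capex = 100.0, 30.0          # year 0, in $bn (hypothetical)
rev_growth, capex_growth = 0.12, 0.40  # hypothetical annual growth rates

for year in range(1, 4):
    revenue *= 1 + rev_growth
    capex *= 1 + capex_growth
    print(f"year {year}: capex intensity = {capex / revenue:.1%}")
```

The point of the sketch: front-loaded buildouts are a bet that future workloads catch up, because the spending ratio cannot rise indefinitely.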
JPMorgan believes the bottleneck for AI is shifting from 'chips' to 'power and supporting infrastructure.'
Computing power can be bought, but grids, grid connections, substations, and land approvals are slower and harder.
4. Trend signal: From 'fighting with cash' to 'fighting with cash + debt'
So far, most big tech firms have relied on strong cash flows to sustain AI investments.
But the market is starting to see a shift: Financing structures are evolving.
A frequently cited example is Oracle:
• Raising funds via a large bond issuance for refinancing and AI-related capital expenditures.
The symbolic significance: In the past, this race resembled 'highly disciplined cash flow investment.' If competition intensifies, some participants may shift toward an arms race supported by both cash flow and debt.
Moreover, financing won’t rely solely on corporate bonds—it will increasingly become a 'combo':
• Corporate bonds (on the parent company’s balance sheet)
• Project financing / SPVs (debt at the project level, repaid by project cash flow)
• Syndicated loans → distributed to loan funds/private credit
• Hybrid capital (equity + debt + mezzanine structures)
In plain terms: In the past, it was 'companies funding data centers themselves.' In the future, it’s more like 'companies + banks + private capital pooling money,' layering different risk and duration funds into projects.
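To illustrate what the layered 'combo' means in practice, here is a minimal sketch, with entirely hypothetical tranche sizes and rates, of how a data-center project's blended cost of debt is computed across corporate bonds, project-level debt, and private credit.

```python
# Toy illustration (all principals and rates hypothetical): blended cost of
# debt for a data-center project funded by the layered financing "combo".
tranches = [
    # (tranche, principal in $bn, assumed annual all-in rate)
    ("corporate bonds (parent balance sheet)", 5.0, 0.050),
    ("project finance / SPV debt", 3.0, 0.065),
    ("syndicated loan -> private credit", 2.0, 0.080),
]

total = sum(principal for _, principal, _ in tranches)
# Weighted-average rate across the stack.
blended = sum(principal * rate for _, principal, rate in tranches) / total
print(f"total debt: ${total:.1f}bn, blended cost of debt: {blended:.2%}")
```

The design point: cheaper on-balance-sheet debt anchors the stack, while pricier project-level and private-credit layers absorb more of the project risk, which is exactly why refinancing costs are the first thing to move if windows tighten.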
This is also one of the market’s concerns: If demand falls short or financing windows tighten, the first to be affected won’t be the 'story' but refinancing costs, project timelines, and credit spreads.
5. The 'middle layer' of AI payers: OpenAI and 'specialized cloud' CoreWeave
OpenAI: The 'bridge' between infrastructure and applications
It may not build many data centers itself, but it packages models into products (API, subscriptions, ChatGPT), selling computing power as 'cognitive services' to businesses and individuals.
Its significance to the industry chain: Turning computing demand into sustainable paying demand (at least, this is the path everyone hopes for).
CoreWeave: GPU-specialized cloud, a classic 'financing-sensitive player'
It’s more like a 'wholesaler/middleman for computing power in the AI era': Serving startups and mid-sized firms while also filling gaps for big clouds during computing shortages.
Its existence implies: Cloud computing may no longer be just 'the big three + small players,' but a cohort of 'specialized clouds' segmented by computing type/scenario/region.
6. Concerns about 'circular financing': How risky is it?
There is indeed market concern:
The intertwined equity and contracts between NVIDIA, CoreWeave, OpenAI, Microsoft, Oracle, etc.—could this create an internal loop of 'self-generated demand'?
JPMorgan’s view: While these entities do have highly intertwined capital and contractual relationships, each still serves a large number of 'third-party real customers.' Current demand and cash flow remain tangible, not entirely reliant on internal loops.
7. AI’s 'supply chain': Not just GPUs, but the entire system
AI's impact on hardware has expanded from a single point (GPUs) to a system-level BOM (bill of materials):
• Computing: GPU / ASIC
• Memory: HBM
• Interconnect: Switches / optical modules / high-speed links
• Manufacturing & packaging: Advanced processes, advanced packaging
• Power & cooling: Power supply, UPS, liquid cooling, etc.
Thus, 'AI beneficiaries' aren’t a single point but an entire chain.
But a caveat: Full-chain benefits ≠ equal profitability. Bargaining power and scarcity vary greatly, as do cyclical attributes.
8. Software, data, media: Who benefits, who might be replaced?
JPMorgan believes AI won’t distribute gains evenly—it will rewrite value allocation.
• Enterprise software platforms (CRM/HR/finance, etc.): Short-term computing cost increases pressure margins; long-term, if AI features can be 'monetized,' they may drive higher ARPU/stronger stickiness.
• Creative/ad tools: AI boosts productivity but lowers barriers, forcing repricing of some premium features.
• Customer service outsourcing and other 'headcount-driven models': May face structural shocks long-term but could pivot to 'intelligent operation services.'
• Vertical data & tools (EDA, industry data): Data moats are often stronger than models, granting long-term pricing power.
Media follows another logic: Not an immediate collapse, but more volatile revenue and higher costs (copyright, compliance, rights protection). Credit spreads face 'moderate widening' pressure. The real buffer lies in turning IP into 'licensed assets' in the AI content chain, not just defense.
9. Power & utilities: The real 'shovel sellers'
JPMorgan sees these as the true 'shovel sellers.' AI data centers, combined with overall power demand growth, are indeed driving U.S. electricity demand to reaccelerate, with bottlenecks in grid connections, approvals, substations, and transmission being real.
My additional take:
• IPPs (independent power producers) are more like 'leverage on electricity prices/scarcity cycles,' benefiting more directly but also more sensitive to load realization, fuel prices, and market rules;
• Regulated utilities are more about 'asset expansion (rate base) + long-term visibility,' but returns are constrained by regulation, depending on grid connections and rate mechanisms.
10. Capital goods & manufacturing: Companies building factories, laying cables, installing cooling—short-term orders may be more 'fixed'
Data center construction will transmit demand to engineering, power distribution, cooling, and equipment manufacturing chains.
Construction spending has one feature: Front-loaded—once a project starts, equipment and engineering orders are often locked in first.
But a risk reminder:
If future data center and power investment pace falls short, the spreads of industrial corporate bonds already trading at 'AI premiums' may revert to peer averages (bond market adjustments resemble 'spread reversion,' unlike stock market volatility).
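The 'spread reversion' risk above can be quantified with standard first-order bond math: the percentage price change is roughly the negative of spread duration times the change in spread. A minimal sketch, with all spread levels and the duration figure being hypothetical:

```python
# Toy example (hypothetical numbers): mark-to-market impact if an industrial
# bond's "AI premium" spread reverts to the peer-sector average.
# First-order approximation: % price change ~= -spread duration * delta spread.
spread_duration = 7.0     # years (hypothetical)
current_spread = 0.0090   # 90 bp, trading tight on the AI story (hypothetical)
peer_average = 0.0120     # 120 bp peer-sector average (hypothetical)

delta_spread = peer_average - current_spread   # widening back to peers
price_change = -spread_duration * delta_spread
print(f"spread widens {delta_spread * 1e4:.0f} bp -> price moves ~{price_change:.2%}")
```

Even a modest reversion of a few tens of basis points translates into a noticeable mark-to-market loss on a longer-duration bond, which is why the bond-market adjustment looks like gradual repricing rather than equity-style volatility.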
In one sentence: Who is paying for AI?
Break it into three layers:
- Direct payers: Hyperscalers (CAPEX, corporate bonds, project financing)
- Those selling computing as products: OpenAI / apps & specialized clouds (turning demand into paying cash flow)
- Shovel sellers: Hardware + power + engineering capital goods chains (orders, asset expansion, credit pricing)
What the credit market truly cares about isn’t 'how hot AI is,' but two things:
• Over the next decade, will capital expenditures push up leverage?
• Is cash flow from real third-party customers, not financing-driven internal loops?
For long-term NVDA investors like me, the thesis holds as long as: (1) hyperscaler CAPEX remains high with no clear cuts; (2) power/grid/data center expansion continues without structural bottlenecks; (3) financing windows don't tighten sharply and refinancing costs don't spike; (4) NVDA's product cadence and pricing power show no sustained weakening; and (5) AI demand gradually translates into cloud vendors' monetization and improved unit economics.
Under these unchanged conditions, short-term volatility won’t alter my long-term thesis; I’ll only adjust 'swing positions' at extreme valuations, not core holdings.
Personal notes only, not investment advice.

