Digital Colliers Daily Briefing — April 30, 2026

Digital Colliers · Apr 30, 2026 · 7 min read

The April 30 news cycle delivers a coherent, if vertiginous, picture of the AI capital cycle entering a new phase. Hyperscaler Q1 prints push combined 2026 capex projections to $725 billion, OpenAI says it has already cleared a Stargate compute target it once set for 2029, and Anthropic is fielding preemptive offers that would value it near $900 billion. Read together, the day's stories describe an industry where supply, not demand, is the binding constraint — and where the financing required to relax that constraint keeps escalating.

1. Hyperscaler Q1: $725B in 2026 Capex, and Google Starts Selling TPUs


What happened. Q1 2026 reports from Microsoft, Alphabet, Amazon, and Meta lifted combined 2026 capex guidance to roughly $725 billion, a 77% jump from $410 billion in 2025, per a Financial Times analysis cited on Techmeme. Google Cloud cleared $20 billion in quarterly revenue (up 63% year-over-year) but warned it was capacity-constrained, with its backlog roughly doubling quarter-over-quarter to about $462 billion; Alphabet expects to recognize half of that within 24 months. AWS grew 28% to $37.6 billion — its fastest growth in 15 quarters, according to TechCrunch — while CEO Andy Jassy conceded free cash flow had fallen to $1.2 billion on a trailing basis, down from $25.9 billion a year earlier, as property and equipment purchases rose by $59.3 billion. Microsoft raised its FY2026 capex outlook by $25 billion to $190 billion, with CFO Amy Hood attributing the increase largely to memory and storage component prices that have, in some cases, more than tripled since last autumn (The Register). Sundar Pichai also confirmed Google will begin shipping TPUs into select customers' own data centers, citing demand from "AI labs, capital markets firms and high-performance computing applications."

Why it matters. The capex curve is now steeper than the revenue curve at every hyperscaler that reported. Microsoft has spent roughly $97 billion over four quarters to support $37 billion in AI ARR (up 123% year-over-year). Alphabet's $35.7 billion quarterly capex implies a 2026 figure between $180 billion and $190 billion. The shift to selling proprietary silicon externally — Google first, with AWS reportedly close behind — also begins to reshape Nvidia's competitive moat at the high end of training compute.
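The headline ratios above can be verified with quick arithmetic. The sketch below is a back-of-envelope check using only the dollar figures cited in this section; it is not drawn from the underlying reporting.

```python
# Back-of-envelope check on the capex figures cited in this section.
# All amounts in billions of US dollars, taken from the briefing text.

capex_2025 = 410.0   # combined hyperscaler capex, 2025
capex_2026 = 725.0   # combined 2026 guidance after the Q1 prints

growth_pct = (capex_2026 / capex_2025 - 1) * 100
print(f"Combined capex growth: {growth_pct:.0f}%")  # ~77%, matching the cited figure

# Microsoft: trailing four-quarter capex vs. AI annual recurring revenue
msft_capex_ttm = 97.0
msft_ai_arr = 37.0
print(f"Microsoft capex per dollar of AI ARR: {msft_capex_ttm / msft_ai_arr:.1f}x")
```

The second ratio is the one to watch quarter to quarter: if capex keeps compounding faster than AI ARR, the gap between spend and monetization widens rather than closes.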

Who is affected. Memory suppliers (SK hynix, Micron, Samsung) are direct beneficiaries of the component squeeze; Nvidia faces a slowly widening competitive front; enterprise buyers face tighter capacity allocation through at least 2026, per Hood's own guidance. OpenAI sits on both sides — a $250 billion-plus Microsoft cloud customer and now a competitor for capacity with hyperscalers' own first-party AI services. Satya Nadella, asked about OpenAI's renegotiated deal, told analysts Microsoft retains royalty-free IP rights through 2032 and "fully plan[s] to exploit it" (TechCrunch).

What to watch next. Whether Q2 brings further capex revisions; the cadence of TPU external shipments and any pricing disclosures; and how quickly Microsoft's pivot of GitHub Copilot to per-token billing — implemented earlier this week — propagates to other AI products under cost pressure.

2. OpenAI Hits 10GW Stargate Target Three Years Early


What happened. OpenAI said it has now contracted 10 gigawatts of US AI compute capacity, the milestone it set for 2029 when Stargate was announced in January 2025. More than 3GW of that was added in the last 90 days. The company also disclosed that GPT-5.5 was trained at the flagship Abilene, Texas site, which runs Nvidia GB200 systems on Oracle Cloud Infrastructure. OpenAI is evaluating additional locations and launched a community engagement program with a donation to the Port Washington-Saukville Education Foundation in Wisconsin alongside Vantage Data Centers and Oracle.

Why it matters. Pulling a 2029 capacity target into early 2026 reframes timelines for everyone in the supply chain — turbines, transformers, switchgear, GPUs, HBM, and skilled trades labor. It also helps explain the hyperscaler component-cost surge Microsoft described: a single tenant is absorbing capacity at a rate the broader power and semiconductor ecosystem was not planning to deliver until late this decade.

Who is affected. Nvidia and Oracle, OpenAI's primary infrastructure partners, are the most direct beneficiaries; neoclouds (CoreWeave, Crusoe, Lambda) sit downstream of the same demand pull. Utilities and ISOs in Texas, Wisconsin, and other Stargate corridors face accelerated interconnection queues. North America's Building Trades Unions, named explicitly in OpenAI's post, gain leverage on workforce terms. Local communities get the dual edge of tax base and load growth — OpenAI took unusual care to detail Abilene's closed-loop cooling economics (annual water use comparable to "four average households" after initial fill), an implicit response to mounting opposition at other data center sites.

What to watch next. OpenAI's next capacity target beyond 10GW; whether Stargate financing structures shift further toward off-balance-sheet vehicles as the dollar amounts grow; and whether power availability — not chips — becomes the gating factor that finally slows the curve.

3. Anthropic in Talks for $50B at $850–900B Valuation


What happened. Anthropic has received multiple preemptive offers to raise $40–50 billion at a valuation between $850 billion and $900 billion, TechCrunch reported, citing half a dozen sources. The company is expected to decide on the round at a May board meeting. Anthropic disclosed an annualized revenue run rate above $30 billion earlier this month; one source told TechCrunch the figure is now closer to $40 billion, up from roughly $9 billion at the end of 2025. The last round, in February, valued the company at $380 billion. One institutional investor prepared to commit $5 billion has reportedly been unable to secure a meeting with CFO Krishna Rao.

Why it matters. A $900 billion mark would put Anthropic at parity with OpenAI's $852 billion February post-money valuation, formalizing a two-horse race at the frontier-lab tier. The revenue trajectory — more than 4x in roughly four months — is being driven primarily by coding (Claude Code and Cowork), which is also where Anthropic's product differentiation against OpenAI is most pronounced. Counterpoint's recent ranking of Anthropic as the LLM revenue leader gives the valuation a defensible numerator.
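The growth multiple and implied revenue multiple behind that valuation band work out as follows — a rough sketch using only the numbers cited in this section, not figures from TechCrunch's reporting.

```python
# Rough arithmetic behind the Anthropic valuation comparison.
# All amounts in billions of US dollars, taken from the briefing text.

rev_end_2025 = 9.0   # annualized run rate at the end of 2025
rev_now = 40.0       # figure one source gave TechCrunch this week

growth_multiple = rev_now / rev_end_2025
print(f"Run-rate growth since end of 2025: {growth_multiple:.1f}x")  # more than 4x

# Implied multiple of current run rate at each end of the reported band
for valuation in (850.0, 900.0):
    print(f"${valuation:.0f}B valuation ≈ {valuation / rev_now:.1f}x run rate")
```

By this math the band prices Anthropic at roughly 21–23x its current run rate, which is why the durability (and customer concentration) of that run rate matters so much to the round.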

Who is affected. Public-market AI proxies (Nvidia, Oracle, hyperscalers) gain another deep-pocketed compute customer; Amazon, Anthropic's primary cloud partner and largest strategic backer, sees its bet appreciate further just as it announced exclusive AI products with OpenAI on Bedrock. For the broader venture market, a $50 billion private round at this scale concentrates LP capital in ways that will continue to crowd out mid-stage AI rounds.

What to watch next. The May board decision and the final composition of the syndicate; whether Anthropic signals an IPO timeline, given TechCrunch's framing of this as potentially the last private round; and any disclosure on Claude Code customer concentration, which is increasingly relevant to the durability of the run rate.

The Through-Line

The same demand signal runs through all three stories: AI compute is being bought faster than it can be built, and the financing required to close the gap is escalating in step. Hyperscalers are absorbing the cost on their balance sheets, OpenAI is absorbing it through partner-financed infrastructure vehicles, and Anthropic is absorbing it through equity issuance at valuations that would have been implausible 12 months ago. The constraint to watch in the back half of 2026 is no longer chips or capital — both are flowing — but power, permitting, and the durability of the enterprise revenue that must eventually justify the buildout.
