Big Tech 4: $700B in AI CapEx for 2026, and More (0501)

1. Big Tech 4 Companies Plan Over $700 Billion in AI Data Center Investment in 2026 — Path to $1 Trillion by 2027 Becoming Visible

• Core Source

“The four major big tech companies (AMZN, GOOGL, META, MSFT) announced plans for over $700 billion in capital expenditures for AI data centers and related infrastructure in 2026 alone, sending a positive signal to the broader ecosystem.”

“Current CY26 hyperscale capex is expected to exceed $800 billion (+67% YoY), with a path to over $1 trillion (+25% YoY) by CY27, with revenues and free cash flow (FCF) continuing to improve throughout the year, which will contribute to justifying continued spending.”

“Microsoft’s FY26 Q3 (January–March) earnings release confirmed that growth is re-accelerating, centered on key growth drivers such as Azure Cloud and M365 Copilot.”

“MSFT specifically stated that its CY26 capex forecast of $190 billion assumes a $25 billion increase due to rising component prices, which may account for approximately 70% of the total capex increase (versus the prior market consensus of approximately $154 billion).”

“OpenAI’s CFO Sarah Friar stated that she is witnessing ‘a vertical wall of demand’ for the company’s products.”

• Expected Impact

The simultaneous capex increases by the four big tech companies represent not merely individual corporate investment plans, but an industry-wide consensus that demand for AI infrastructure is structurally sustainable. As analyzed by BofA, approximately 70% of MSFT’s capex guidance increase stems from rising component prices for GPUs, HBM, and other parts — indirectly proving that AI core component suppliers (NVIDIA, TSMC, memory manufacturers, etc.) hold very strong pricing power.
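The "approximately 70%" figure can be verified with back-of-the-envelope arithmetic. A minimal sketch, assuming the prior consensus of ~$154 billion cited above is the correct baseline for the guidance increase:

```python
# Sanity check: what share of MSFT's CY26 capex guidance increase is
# attributable to the stated $25B component-price effect?
# All figures are from the article; the baseline choice is an assumption.
guidance = 190e9         # CY26 capex guidance, USD
consensus = 154e9        # prior market consensus, USD
component_effect = 25e9  # stated increase from rising component prices

total_increase = guidance - consensus      # ~$36B above consensus
share = component_effect / total_increase  # fraction explained by component prices

print(f"Total increase vs. consensus: ${total_increase / 1e9:.0f}B")
print(f"Component-price share: {share:.0%}")  # ~69%, consistent with "approximately 70%"
```

The derived ~69% matches the "approximately 70%" attributed to BofA, which supports the reading that component-price inflation, not incremental capacity, drives most of the guidance raise.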

On the supply side, multiple hyperscalers have directly stated that computing capacity remains supply-constrained throughout 2026, suggesting that a demand-dominant environment is likely to continue at least through 2027, well beyond a short-term cycle. The OpenAI CFO's remark that a "shortage of computing resources is actually constraining growth" corroborates this.

From an investment standpoint, benefits are expanding simultaneously across the entire AI infrastructure value chain (computing, memory, packaging, optical communications, power infrastructure). While Goldman Sachs recommended overweighting cloud and underweighting semiconductors, the fact that rising component prices are also positively reflected in semiconductor suppliers’ margins means investment opportunities exist across all segments of the value chain.

2. SanDisk (SNDK) Delivers Explosive AI Data Center NAND Growth — Gross Margin Reaches 78.4%, $42 Billion in Long-Term Supply Contracts Secured

• Core Source

“SanDisk’s Q3 revenue of $5.95 billion significantly exceeded market consensus of $4.72 billion, and adjusted EPS of $23.41 far surpassed the consensus of $14.51, delivering strong results.”

“Data center revenue grew 233% quarter-over-quarter, driving overall results. Bit shipments declined slightly, but improved high-value product mix and price increases (+137% q/q) maximized profitability.”

“The price strength observed in Q3 was nothing short of extraordinary.”

“During Q3, SanDisk signed 3 multi-year contracts (New Business Model or NBM), with 2 additional contracts signed so far in Q4, and active negotiations ongoing with ‘several other customers.’ The 3 contracts signed in the quarter are expected to provide a minimum of $42 billion in contracted revenue (a quarterly record), and SanDisk is guaranteed $11 billion in payments in the event of contract termination.”

“Q4 guidance was also well above market expectations, with revenue of $7.75 billion–$8.25 billion (consensus $6.65 billion) and adjusted EPS of $30–$33 (consensus $24.6).”

• Expected Impact

SanDisk’s results are a landmark demonstration that NAND memory is emerging as core infrastructure for the AI inference and agentic AI era. Data center NAND revenue growing 233% quarter-over-quarter and ASP rising 137% quarter-over-quarter reflects not merely volume growth, but a dramatic improvement in profitability driven by a shift to higher-value products.

Particularly noteworthy is the NBM (New Business Model) contract structure. Securing a minimum $42 billion order backlog with $11 billion in contract termination penalties structurally mitigates the cyclical risk that has historically plagued the memory industry. As AI inference demand grows, the role of high-capacity NAND in KV cache (context storage) expands, suggesting that structural supply tightness is likely to persist.

Bernstein has raised its FY27 EPS estimate to $200.47, 60% above market consensus, while BofA significantly raised its target price to $1,550. The NAND market is undergoing a structural redefinition — from “mere storage device” to “core AI compute support component” — in the age of AI.

3. Alphabet (GOOGL): AI Becomes Growth Engine Across All Business Units — Google Cloud Revenue +63.4% YoY, Order Backlog Nearly Doubled

• Core Source

“Alphabet’s FY 1Q26 total revenue was $109.9 billion (YoY +21.8%, QoQ -3.5%), … operating income $39.7 billion (YoY +29.7%, QoQ +10.5%) … Total revenue, revenue ex-TAC, and operating income exceeded FactSet consensus by 2.7%, 3.3%, and 9.5% respectively.”

“Segment revenues were Google Services at $89.6 billion (YoY +16%, QoQ -6.5%) and Google Cloud at $20.0 billion (YoY +63.4%, QoQ +13.4%), exceeding market consensus by 1.9% and 10.9% respectively.”

“The cloud order backlog has nearly doubled to $462 billion, demonstrating that Alphabet is succeeding in monetizing its AI investments, with high visibility.”

“Alphabet is currently generating over 16 billion Gemini tokens per minute through direct customer API usage, a +60% increase compared to the prior quarter.”

“The CEO emphasized that strong results were achieved through AI investment and a Full Stack AI Approach, noting that AI is manifesting across all business areas including search advertising, cloud, subscriptions, enterprise, and autonomous driving.”

• Expected Impact

Alphabet’s results offer the clearest demonstration among big tech companies that AI investment is translating into real revenue. Google Cloud growing 63.4% year-over-year and the order backlog nearly doubling to $462 billion are strong signals that enterprise customers are adopting Google’s AI solutions on a long-term basis.

Notably, search advertising revenue growing +19.1% year-over-year at the fastest pace in years proves that AI (Gemini) is acting as a growth accelerator for the core advertising business, not a cannibalizer. This data directly refutes the “AI erosion of search advertising” scenario that many investors had feared.

The fact that a portion of Alphabet’s quarterly net income of $62.6 billion stemmed from unrealized gains on investments in private companies such as SpaceX and Anthropic shows that Alphabet is strengthening its role as an investor across the broader AI ecosystem, well beyond its identity as a mere advertising company. JPMorgan raised its target price from $395 to $460, and Goldman Sachs from $400 to $450.

4. Qualcomm (QCOM) Officially Enters AI Data Center Market — Custom Chip Shipments for Hyperscale Cloud Customer Expected in December

• Core Source

“Qualcomm is making a full-scale entry into the AI data center market, developing custom chips in collaboration with one hyperscale cloud operator. First shipments are expected to begin in December this year, with three product lineups — CPUs, inference accelerators, and custom ASICs — being pursued simultaneously.”

“This is a strategic pivot made as the existing mobile-centric business faces pressure from Apple and Samsung Electronics strengthening their own chip development. If the data center becomes a new growth axis, it has the potential to mount a meaningful challenge to the NVIDIA-centric market structure.”

“Qualcomm (+15.12%) surged sharply after reporting solid results driven by a surge in the automotive segment and announcing reinforcement of its data center business.”

• Expected Impact

Qualcomm’s entry into the data center market introduces a new variable into the competitive landscape of AI semiconductors. While NVIDIA currently holds an overwhelming market share in AI accelerators, Qualcomm is differentiating itself by simultaneously targeting the CPU, inference accelerator, and ASIC segments — a three-pronged strategy against the backdrop of hyperscalers expanding their own custom ASIC development.

Qualcomm’s strength lies in its low-power design capabilities built up in the mobile and edge markets (ARM-based architecture), which could become a competitive advantage in an environment where data center power efficiency is an increasingly pressing concern. Particularly in a context where agentic AI proliferation is driving explosive growth in inference demand, the inference-specialized accelerator market has the potential to form a new demand segment distinct from NVIDIA’s GPU-centric training market.

This aligns with MediaTek raising its FY26 data center ASIC revenue guidance to $2 billion and projecting a 2027 TAM of $70–$80 billion, showing that the AI ASIC market itself is rapidly expanding. If Qualcomm secures a meaningful share of this market, a structural transformation becomes possible — reducing dependence on the mobile business while adding high-margin data center revenues as a new growth pillar.

5. MLCC and FC-BGA Supply Tightness Intensifies — Murata BB Ratio Hits 1.36, Samsung Electro-Mechanics Target Price Raised to KRW 1.02 Million

• Core Source

“New Capacitor order value reached ¥329.1 billion (+58.8% YoY), setting a new all-time high for the second consecutive quarter.”

“The BB Ratio (orders/revenue, with 1.0 as the boom threshold) also reached 1.36, surpassing both the 1.20 during the 2021 COVID upcycle and the 1.32 during the 2017–2018 MLCC price hike cycle.”

“Q2 2026 MLCC shipment volume is expected to increase +5.5% QoQ, with capacity utilization expected to improve to the mid-90% range. At the same time, Blended ASP is expected to rise an additional +4.0% QoQ, driven by mix improvement effects from expanding AI-related demand.”

“FC-BGA capacity utilization is expected to rise to around 95%, driven by new entry into NVIDIA-related switch production and increased CPU supply volumes for customer A.”

“JP Morgan has determined that Samsung Electro-Mechanics’ core business divisions have officially entered an upcycle, raising its target price to KRW 980,000 … viewing the current level as a buying opportunity. FY26E–28E EPS estimates have been raised by 11–17%, with an operating profit CAGR of 55% and record-high ROE projected for 2026–28.”

• Expected Impact

Murata’s BB Ratio of 1.36 surpasses both the 2021 COVID upcycle (1.20) and the 2017–2018 price hike cycle (1.32), representing an all-time high level and signaling that MLCC supply-demand dynamics have entered an unprecedented state of tightness. Both increasing AI server demand and front-loaded procurement driven by expectations of potential price hikes are simultaneously contributing to order growth.
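The BB (book-to-bill) ratio figures also imply a rough revenue base and the pace of backlog accumulation. A minimal sketch; the implied revenue here is derived from the reported numbers, not disclosed by Murata:

```python
# BB (book-to-bill) ratio = orders / revenue; a reading above 1.0 means
# orders are outpacing shipments, i.e. backlog is building.
# Orders and ratio are from the article; implied revenue is an estimate.
orders = 329.1   # new capacitor orders, JPY billions
bb_ratio = 1.36  # reported book-to-bill ratio

implied_revenue = orders / bb_ratio        # ~JPY 242B (derived, not disclosed)
backlog_build = orders - implied_revenue   # net orders booked but not yet shipped

print(f"Implied capacitor revenue: ~JPY {implied_revenue:.0f}B")
print(f"Net backlog build in the quarter: ~JPY {backlog_build:.0f}B")
```

On these assumptions, roughly ¥87 billion of orders went unshipped in the quarter, which is the mechanical reason a BB ratio of 1.36 signals tightness: backlog compounds until either capacity or prices adjust.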

Murata disclosed data center revenue for the first time in this earnings release, recording ¥176.7 billion (+66.5% YoY) in FY2025, with data center revenue expected to grow +83.9% YoY in FY2026E. While the absence of explicit near-term price hike plans may serve as a short-term headwind for the stock, given that Yageo and Taiyo Yuden have already proactively announced MLCC price increases, broader industry-wide price hike diffusion is seen as a matter of time.

On the FC-BGA side, Samsung Electro-Mechanics’ capacity utilization is expected to rise to around 95%, and Meritz Securities projects package solution segment revenue and operating income to grow 54% and 141% respectively over the next two years. JP Morgan’s target price of KRW 980,000 and Meritz Securities’ target of KRW 1.02 million both reflect this structural upcycle. Given that AI accelerator demand is simultaneously exerting pressure on both MLCC and substrate (FC-BGA), valuation re-rating for component suppliers across the board is likely.

Comment

  1. Thank you. So BofA views the power semiconductor sector more favorably than the optics sector. I will need to keep a close eye on power semiconductors.
