- Nvidia expects $1 trillion in orders for Blackwell and Vera Rubin systems through 2027 — double last year’s projection.
- DLSS 5 uses machine learning to deliver photo-realistic lighting on RTX 50-series GPUs, launching fall 2026.
- NemoClaw turns OpenClaw into an enterprise-grade AI agent platform with built-in security and privacy controls.
- The Nvidia Groq 3 LPU — built from the $20 billion Groq acquisition — ships Q3 2026 and boosts tokens-per-watt performance by 35x.
$1 Trillion in Orders and a New Chip Empire
Jensen Huang walked onstage at GTC 2026 in San Jose on Monday and did what he does best: made the rest of the industry feel behind. Nvidia now expects $1 trillion in purchase orders across its Blackwell and Vera Rubin chip platforms through 2027. That is double the $500 billion revenue opportunity the company projected just last year.
The centerpiece is Vera Rubin, a rack-scale AI system made up of 1.3 million components that Nvidia claims delivers 10x more performance per watt than its predecessor, Grace Blackwell. Seven of the platform’s chips are now in full production. But the real surprise was Nvidia’s integration of Groq technology — a non-Nvidia processor. The Nvidia Groq 3 Language Processing Unit, built from the $20 billion asset purchase Nvidia closed in December, ships in Q3 2026. A dedicated rack housing 256 LPUs sits beside the Vera Rubin system and increases the tokens-per-watt performance of the Rubin GPUs by 35x.
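Tokens-per-watt is simply inference throughput divided by power draw, so a 35x multiplier compounds directly into serving capacity. The toy calculation below makes that concrete; the baseline numbers are illustrative assumptions, not figures from Nvidia’s keynote.

```python
# Toy tokens-per-watt comparison. The baseline throughput and power
# figures are invented for illustration, not Nvidia's numbers.

def tokens_per_watt(tokens_per_second: float, watts: float) -> float:
    """Inference efficiency: tokens generated per second, per watt drawn."""
    return tokens_per_second / watts

# Hypothetical baseline: a GPU serving 10,000 tokens/s at 1,000 W.
baseline = tokens_per_watt(10_000, 1_000)  # 10 tokens per watt

# A 35x tokens-per-watt gain at the same power budget means
# 35x the token throughput for the same energy bill.
boosted = baseline * 35

print(baseline, boosted)  # 10.0 350.0
```

At data-center scale this is the whole pitch: power, not silicon, is the binding constraint, so efficiency gains translate directly into revenue-generating tokens.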
Nvidia also previewed Kyber, its next-generation rack architecture. It stacks 144 GPUs in vertical compute trays instead of horizontal ones to boost density and lower latency. Kyber ships inside the Vera Rubin Ultra system in 2027.
NemoClaw, DLSS 5, and the Agent Operating System
The software announcements were just as aggressive. Nvidia launched NemoClaw, an enterprise-grade platform built on top of OpenClaw, the open-source AI agent framework that, in a matter of weeks, became the most popular open-source project in GitHub history. NemoClaw adds privacy controls, security guardrails, and sandbox orchestration — the exact features enterprises have been demanding before deploying autonomous agents at scale. Nvidia built it in collaboration with OpenClaw creator Peter Steinberger, who recently joined OpenAI.
“Every company in the world today needs to have an OpenClaw strategy, an agentic system strategy,” Huang said. “This is the new computer.”
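The guardrail idea behind platforms like NemoClaw is simple to picture: every action an agent proposes is vetted against policy before it executes. The sketch below illustrates that generic pattern only; the policy names and rules are invented for illustration and are not NemoClaw’s actual API.

```python
# Generic sketch of an agent guardrail: vet each proposed action
# against an allowlist and deny rules before executing it.
# Hypothetical policy for illustration; not NemoClaw's API.

ALLOWED_ACTIONS = {"read_file", "search_docs"}   # hypothetical allowlist
BLOCKED_PATTERNS = ("rm -rf", "curl http")       # hypothetical deny rules

def guardrail(action: str, argument: str) -> bool:
    """Return True only if the agent's proposed action may run."""
    if action not in ALLOWED_ACTIONS:
        return False
    return not any(pattern in argument for pattern in BLOCKED_PATTERNS)

print(guardrail("read_file", "/etc/hostname"))  # True
print(guardrail("run_shell", "rm -rf /"))       # False
```

Real deployments layer this with sandboxed execution and audit logging, which is precisely the packaging enterprises have been waiting for before letting agents act autonomously.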
Then came DLSS 5 — and it is not what anyone expected. Instead of boosting frame rates, DLSS 5 uses machine learning to deliver photo-realistic lighting on RTX 50-series GPUs. The AI model understands scene semantics — it processes skin, hair, water, and metal differently — and applies physically accurate lighting without requiring next-generation hardware. Demos ran on Resident Evil Requiem, Hogwarts Legacy, Assassin’s Creed Shadows, and Starfield. Digital Foundry called the results “frankly astonishing.” It ships fall 2026.
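One way to picture semantics-aware lighting is as a dispatch step: classify what a surface is, then route it to a material-specific lighting response. The sketch below is a deliberately simplified illustration of that idea; the material names and response curves are invented and are not DLSS 5 internals.

```python
# Simplified illustration of semantics-aware shading dispatch.
# Materials and response curves are invented stand-ins, not DLSS 5 code.

SHADING_MODELS = {
    "skin":  lambda light: 0.5 * light + 0.25,     # soft subsurface-style response
    "hair":  lambda light: 0.75 * light,           # anisotropic-highlight stand-in
    "water": lambda light: 0.875 * light + 0.125,  # near-specular response
    "metal": lambda light: light,                  # fully specular stand-in
}

def shade(material: str, incoming_light: float) -> float:
    """Route a classified surface to its material-specific lighting model."""
    model = SHADING_MODELS.get(material, lambda light: 0.5 * light)  # diffuse fallback
    return model(incoming_light)

print(shade("skin", 1.0))   # 0.75
print(shade("metal", 1.0))  # 1.0
```

The point of the real system is that this classification is learned rather than hand-authored, which is what lets one model apply plausible lighting across games that were never built for it.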
Orbital Data Centers and the $4.5 Trillion Confidence Play
Nvidia also announced the Vera Rubin Space-1 Module, a computing platform designed for orbital data centers. As land-based data center capacity runs thin and energy costs spike, Nvidia is betting that space is a viable frontier. OpenAI and xAI have both floated similar ideas, but Nvidia is the first major chipmaker to ship dedicated hardware for it.
The stock closed up 1.7% on Monday. Nvidia is now worth roughly $4.5 trillion — the most valuable public company on Earth. Huang’s message was clear: the inflection point for inference-driven AI has arrived, demand is not slowing, and Nvidia intends to own every layer of the stack, from silicon to orbit.
“If they could just get more capacity, they could generate more tokens, their revenues would go up,” Huang said. He is talking about his customers. He might as well be talking about himself.