For National Robotics Week 2026, NVIDIA laid out an updated stack of simulation, training, and deployment tools aimed at closing the gap between virtual robot learning and physical-world performance.
The push builds on announcements from NVIDIA GTC in March, where the company shipped a full cloud-to-robot workflow connecting simulation, reinforcement learning, and edge inference. The goal: let developers build, train, and deploy robots that can perceive, reason, and act in unstructured environments.
What NVIDIA shipped at GTC
The key releases span foundation models, physics simulation, and synthetic data:
- Isaac GR00T open models give robots the ability to parse natural language instructions and execute multi-step tasks using vision-language-action reasoning.
- Cosmos world models generate synthetic training data at scale, helping robotic systems generalize across environments without relying solely on real-world collection.
- Newton 1.0, now generally available and open source, provides a physics engine tuned for dexterous manipulation — accurate collision detection, realistic contact, and stable simulation of rigid and flexible parts.
- Isaac Sim 6.0, Isaac Lab 3.0, and Omniverse NuRec expand simulation capabilities, letting teams model real-world scenarios and validate systems before physical deployment.
Taken together, the releases form a layered pipeline: Cosmos generates data, Isaac Sim and Newton provide the training environment, GR00T supplies the model architecture, and edge hardware runs inference on the robot itself.
From language commands to robot motion
One demonstration showed NVIDIA NemoClaw integrated with Isaac Sim to control a Nova Carter autonomous robot using plain text commands. A developer types "move two meters forward," and NemoClaw translates the instruction into executable Python code, which is sent to the simulator through a custom REST API. No manual coding is required.
The workflow keeps everything inside Isaac Sim's physics-accurate environment, meaning teams can test language-driven control before touching real hardware.
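To make the translation step concrete, here is a minimal sketch of how a plain-text command might be turned into a structured request for a simulator bridge. The parser, the `to_sim_request` helper, and the JSON payload shape are all illustrative assumptions; NVIDIA has not published NemoClaw's actual API.

```python
import re

# Hypothetical sketch: mapping a plain-text command to a simulator call.
# The payload shape and parsing rules below are assumptions, not
# NemoClaw's real implementation.

WORD_NUMBERS = {"one": 1.0, "two": 2.0, "three": 3.0}

def parse_move_command(text: str):
    """Turn 'move two meters forward' into a (distance_m, direction) pair."""
    m = re.match(r"move (\w+) meters? (forward|backward)", text.strip().lower())
    if not m:
        raise ValueError(f"unrecognized command: {text!r}")
    qty, direction = m.groups()
    # Accept either a number word ("two") or a digit string ("2").
    distance = WORD_NUMBERS.get(qty)
    if distance is None:
        distance = float(qty)
    return distance, direction

def to_sim_request(text: str) -> dict:
    """Build the JSON body a REST bridge might POST to the simulator."""
    distance, direction = parse_move_command(text)
    sign = 1.0 if direction == "forward" else -1.0
    return {"action": "translate", "dx_meters": sign * distance}

print(to_sim_request("move two meters forward"))
# {'action': 'translate', 'dx_meters': 2.0}
```

In the real workflow, a language model rather than a regex would handle arbitrary phrasing, and the resulting payload would be POSTed to the REST endpoint that drives the robot inside Isaac Sim.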
Separately, a surgical robotics company called PeritasAI is using NVIDIA Isaac for Healthcare and the Rheo blueprint to build multi-agent intelligence for operating rooms. Working with Lightwheel and Advent Health Hospitals, the project targets real-time situational awareness, sterile coordination, and instrument management during procedures.
Underwater robotics also got attention. OceanSim, developed at the University of Michigan, uses Isaac Sim and Omniverse libraries to render physics-based sonar and camera imagery for subsea perception research. The GPU-accelerated framework generates synthetic underwater data fast enough for real-time training loops.
NVIDIA did not disclose pricing for the new tools. Isaac GR00T, Newton 1.0, Isaac Sim 6.0, and Isaac Lab 3.0 are available now through NVIDIA's developer portal, with on-demand GTC sessions covering each release in detail.