Since first announcing Intrinsic’s collaboration with NVIDIA, our teams have seen meaningful momentum on bringing our platforms closer. Today at GTC 2025, NVIDIA’s AI developer conference, we're sharing some of our most recent work together.
First, we have successfully integrated NVIDIA Isaac foundation models (FMs) for robot grasping into the Intrinsic platform. Second, we are sharing a preview of a connection between Intrinsic Flowstate and NVIDIA Omniverse™ platform technologies for physically based visualization and factory-scale digital twins. As Intrinsic and NVIDIA continue to build bridges between our platforms and products, our collaboration will help pave the way for developers to deliver real-world value to industry with AI-enabled automation.
Updates to an NVIDIA FM-enabled grasping skill in Intrinsic Flowstate
Since announcing our successful test of NVIDIA Isaac Manipulator FMs for a new robot grasping capability on the Intrinsic platform, our teams have built deeper developer workflows and tighter UI integrations in Flowstate that make NVIDIA’s FMs far easier to access, build with, and deploy. Select developers can now choose their grasp model in Flowstate and combine that capability with many others from Intrinsic to develop and deploy sophisticated AI-enabled solutions in real-world robotics and automation systems.
Practically, this means that rather than hard-coding how a specific gripper grasps a specific object, which is incredibly time-consuming and tedious, robotics app developers get a synthetically trained FM that immediately proposes points for the gripper to grasp the object. This kind of FM can be trained for different types of grippers. Easy developer access to usable, deployable FMs like this will reduce development time and make it easier for non-experts to program valuable solutions.
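To make the workflow concrete, here is a minimal sketch of what calling a grasp-proposal model might look like from application code. Everything here is hypothetical: `GraspCandidate`, `propose_grasps`, and the stubbed scores are illustrative stand-ins, not the Intrinsic or NVIDIA Isaac Manipulator API.

```python
# Illustrative sketch only: GraspCandidate, propose_grasps, and the fixed
# candidates below are hypothetical, not an actual Intrinsic/NVIDIA API.
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    position: tuple   # (x, y, z) grasp point in the camera frame
    approach: tuple   # unit vector the gripper approaches along
    score: float      # model confidence in [0, 1]

def propose_grasps(depth_image, gripper_type="parallel_jaw"):
    """Stand-in for a foundation-model call that returns ranked grasp
    candidates for the given gripper type, instead of hard-coded poses."""
    # A real FM would infer these from the scene; we return fixed examples.
    candidates = [
        GraspCandidate((0.10, 0.02, 0.45), (0.0, 0.0, -1.0), 0.91),
        GraspCandidate((0.12, 0.05, 0.44), (0.0, 0.0, -1.0), 0.67),
    ]
    return sorted(candidates, key=lambda c: c.score, reverse=True)

best = propose_grasps(depth_image=None)[0]
print(best.score)  # the highest-confidence grasp is attempted first
```

The point of the pattern is that the application asks for grasps per gripper type at runtime rather than encoding object-specific poses up front.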
We’re thrilled by this step toward making it faster and simpler to build with NVIDIA technologies on the Intrinsic platform. For solution builders and robotics developers creating next-generation automation across industries, we aim to make world-class AI capabilities practical to build, deploy, and operate day to day. By building bridges between our platforms and making AI an accessible, usable tool for developers, we can help close the gap between the promise of AI and the practical, measurable value it needs to deliver in the physical world of robotics and automation. For now, this FM and Flowstate functionality is available to select partners, but we’re working toward enabling customers to use their own AI alongside other intelligent capabilities in simulated and real-world solutions.
Successful USD streaming prototype to connect Intrinsic Flowstate and NVIDIA Omniverse
We’re also sharing a first look at a prototype OpenUSD streaming connection between Intrinsic Flowstate’s Gazebo-based digital twin and the NVIDIA Omniverse platform. Many roboticists will be familiar with USD, short for Universal Scene Description, a framework originally developed for film animation that is now common in robotics because it lets multiple contributors work on the same project at the same time using distinct layers.
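The layering idea is the key to that multi-contributor workflow. The sketch below imitates USD-style layer composition with plain dictionaries; it is a simplified illustration, not the real OpenUSD (`pxr`) API, but it shows the same rule: stronger layers win per attribute, and weaker layers fill in whatever stronger layers leave unspecified.

```python
# Simplified illustration of USD-style layer composition using plain dicts.
# Real OpenUSD resolves "opinions" per attribute much like this: stronger
# layers override, weaker layers supply defaults. Paths/values are made up.

def compose(layers):
    """Compose layers ordered strongest-first, as in a USD layer stack."""
    scene = {}
    for layer in layers:                      # later (weaker) layers only
        for prim, attrs in layer.items():     # fill gaps, never override
            scene.setdefault(prim, {})
            for name, value in attrs.items():
                scene[prim].setdefault(name, value)
    return scene

base      = {"/Workcell/Robot": {"model": "arm_v1", "color": "gray"}}
sim_layer = {"/Workcell/Robot": {"color": "orange"}}  # one contributor's edit

# The override layer is listed first (strongest), so its color wins while
# the base layer still supplies the model attribute.
composed = compose([sim_layer, base])
print(composed["/Workcell/Robot"])  # {'color': 'orange', 'model': 'arm_v1'}
```

Because each contributor edits only their own layer, teams can work on the same scene concurrently and resolve the result deterministically at composition time.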
This prototype connects Intrinsic’s developer environment — which can be used to build, simulate, deploy, and maintain industrial robotics and automation solutions — with Omniverse, an NVIDIA platform of APIs, SDKs, and services that enable developers to integrate OpenUSD, NVIDIA RTX™ rendering technologies, and generative AI into existing software tools and simulation workflows. Together, we established a working connection that streams live information at the workcell level from Intrinsic Flowstate to Omniverse, allowing workcells to be visualized in a photo-realistic factory-scale digital twin. This advance opens a path to real industrial-scale use cases, such as factory layout planning and end-to-end automation management.
This connection makes it possible, for example, to visualize and better understand how different parts of a factory work together across individual workcells, production lines, and material flow, paving the way to deploy, run, and maintain systems through one cohesive toolchain. By observing how workcells coordinate in real time, teams can also gain a better understanding of synchronization, material flow, and timing across entire production lines. Ultimately, this can help automation providers and operators plan and optimize throughput and overall efficiency.
With these innovations, we envision a future where our technologies work together to enable useful applications across industrial automation: from a more holistic view of operations, to easier collaboration around factory planning, to more autonomous robotics applications. These integrations will lead to more flexible, adaptive, and reusable solutions for millions more businesses and developers.
You can learn more about our work with NVIDIA here, as well as broader efforts around OpenUSD.
If you’ll also be at GTC, Intrinsic CEO Wendy Tan White will be joining a panel on “Physical AI for the Next Frontier of Industrial Digitalization” on Tuesday, March 18 at 2:00 p.m., where she’ll delve into how leaders across industries are leveraging AI and simulation to transform their operations.