
Live from GTC 2024: FiftyOne to Integrate with NVIDIA Omniverse Simulation Services to Accelerate Autonomous Vehicle Development

Today, live from NVIDIA GTC 2024, we announced a new integration that enables Autonomous Vehicle (AV) developers to create, curate, and visualize robust synthetic training data to maximize AI model performance. Check out the press release announcement, NVIDIA’s blog post, or continue reading for a summary of today’s exciting news!

Two technologies, FiftyOne and NVIDIA Omniverse, sit at the heart of the integration, accelerating the path to autonomy. If you’re not yet familiar, FiftyOne is a computer vision platform that provides AI builders with a comprehensive suite of capabilities to efficiently and systematically analyze and optimize their datasets and models. NVIDIA Omniverse Cloud APIs provide high-fidelity, physically based sensor simulation and realistic behavior at cloud scale.

Through the integration, developers can easily visualize and organize ground-truth data generated in simulation for streamlined training and testing. Key benefits of the integration include: 

  • Simplified data generation and analysis
  • Multi-sensor visualization
  • Support for converting data to common computer vision formats (see the sketch below)
  • Unstructured data exploration 
Simulated single-camera AV image sample visualized in FiftyOne
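
To illustrate the format-conversion point above, here is a minimal sketch of exporting ground truth from a FiftyOne dataset to the COCO detection format. The dataset name, label field, and export directory are hypothetical placeholders.

```python
import fiftyone as fo

# Load an existing FiftyOne dataset containing simulated ground truth
# ("synthetic-av-scenes" is a hypothetical dataset name)
dataset = fo.load_dataset("synthetic-av-scenes")

# Export the samples and their labels in COCO detection format
# ("ground_truth" is a placeholder for your label field)
dataset.export(
    export_dir="/tmp/synthetic-av-coco",
    dataset_type=fo.types.COCODetectionDataset,
    label_field="ground_truth",
)
```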

“A rich ecosystem is critical for safe autonomous vehicle development,” said Zvi Greenstein, Vice President of AV Infrastructure at NVIDIA. “The FiftyOne platform will bring seamless data visualization to AV developers relying on high-fidelity, physically based sensor simulation delivered via Omniverse AV Sim APIs, helping them tailor datasets for their training and testing needs.”

The FiftyOne computer vision platform not only plays a crucial role in AV use cases but also enjoys widespread adoption in AI systems across diverse industries, including manufacturing, security, retail, finance, and healthcare.

If you haven’t already tried it, getting started with FiftyOne is incredibly easy. You can explore more than 70 datasets across all industries instantly in your browser at try.fiftyone.ai, or jump straight to the AV datasets there.

For example, you can explore these preloaded AV datasets:

  • BDD100K: Check out this dataset to see FiftyOne’s embeddings visualization in action. Click ‘+’ next to ‘Samples’, select ‘Embeddings’, and choose a key. Color by ‘timeofday.label’ and lasso-select points of interest. Why are samples with daytime labels in the nighttime cluster (and vice versa)? A sketch of computing an embeddings visualization yourself follows this list.
  • nuScenes: This multi-modal sensor dataset contains camera images, LiDAR, and radar point clouds. Try clicking a sample to explore the multiple sensor channels associated with that scene.
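
To generate an embeddings visualization like the one described above programmatically, the sketch below shows the general pattern using the FiftyOne Brain. It uses the small quickstart dataset from the Dataset Zoo rather than BDD100K (which requires a separate download), and the brain key name is arbitrary.

```python
import fiftyone as fo
import fiftyone.brain as fob
import fiftyone.zoo as foz

# Load a small example dataset from the FiftyOne Dataset Zoo
dataset = foz.load_zoo_dataset("quickstart")

# Compute a 2D embeddings visualization (by default this embeds the images
# with a stock model and reduces to 2D with UMAP, which requires umap-learn)
fob.compute_visualization(dataset, brain_key="img_viz")

# Open the App, then add an Embeddings panel and select the "img_viz" key
session = fo.launch_app(dataset)
session.wait()
```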

Alternatively, because FiftyOne is open source, you can download it locally and be up and running in just a few minutes. Learn how to get started with FiftyOne in the docs and get access to the full power of the library, including the Python SDK, App, and the Brain. Check out our Getting Started video series to get up to speed fast.
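
For reference, a minimal local workflow after a `pip install fiftyone` might look like the sketch below; the zoo dataset and the confidence threshold are just examples.

```python
import fiftyone as fo
import fiftyone.zoo as foz
from fiftyone import ViewField as F

# Download a small example dataset from the FiftyOne Dataset Zoo
dataset = foz.load_zoo_dataset("quickstart")

# Use the Python SDK to build a view containing only high-confidence predictions
high_conf = dataset.filter_labels("predictions", F("confidence") > 0.9)
print(high_conf)

# Explore the view interactively in the FiftyOne App
session = fo.launch_app(view=high_conf)
session.wait()
```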

For those of you who are heading to NVIDIA GTC 2024, be sure to visit Voxel51 at booth 632 in the Automotive Pavilion. We’d love to show you FiftyOne in action and answer any questions you may have.

Additionally, we invite you to join the thousands of engineers and data scientists already using FiftyOne to solve some of the most challenging problems in computer vision today!

What’s Next?