A lightweight visualization toolkit for exploring robotics datasets, built on a pre-configured JupyterLab environment with Voila for interactive applications.
Voilab provides a set of tools to interactively view and debug robotics data. The primary workflow is through a custom JupyterLab environment that includes built-in extensions for launching web applications and viewing URDF models directly from the UI.
This repository contains several packages. For more detailed information on each, please refer to their respective documentation files:
- `packages/umi`: Tools and configurations for running SLAM pipelines with UMI datasets.
- `packages/diffusion_policy`: Training the diffusion policy with UMI datasets.
- `diffusion_policy_layers`: Overview of the diffusion policy package layers.
- `ros2_integration`: Overview of the ROS 2 integration.
- Isaac Sim docker setup: How to set up the Docker containers for Isaac Sim.
Voilab uses `uv` for dependency management. You can install everything needed with:
```shell
# Install uv (if not already installed) and project dependencies
make install

# Or manually:
uv sync
```

The main functionalities of Voilab are accessed through a customized JupyterLab instance, which includes pre-installed extensions for visualization.
Start the JupyterLab server using the following command:
```shell
make launch-jupyterlab
```

This will open the JupyterLab interface in your web browser.
The interactive visualization tools are built as Jupyter notebooks that can be run as standalone web applications using Voila.
Usage:
- In the JupyterLab file browser (left panel), navigate to the `nbs/` directory.
- Right-click on an application notebook (e.g., `replay_buffer_viewer.ipynb`).
- Select "Open with -> voila" from the context menu. This will open the application in a new browser tab.
- Location: `nbs/replay_buffer_viewer.ipynb`
- Goal: An interactive tool for exploring UMI-style datasets for debugging, validation, and quick data analysis.
- Location: `nbs/aruco_detection_viewer.ipynb`
- Goal: An interactive tool to detect and visualize ArUco markers in the camera data from a dataset. This is useful for validating marker detection quality, camera calibration, and pose estimation.
- Location: `nbs/dataset_visualizer.ipynb`
- Goal: A visualization tool for data collectors to review and refine their collected human demonstrations. Helps identify issues such as lost SLAM frames, low ArUco detection rates, and trajectory anomalies before final processing.
- Features:
- Pipeline Status: Overview of which UMI processing stages have completed
- Demo Quality Metrics: Detection rates, lost frames, and trajectory quality for each demo
- Trajectory & Video: Side-by-side 3D camera trajectory visualization and frame-by-frame video preview
- ArUco Tags: ArUco marker detection viewer with detection statistics and marker overlays
- CLI Usage:

```shell
# Launch the dataset visualizer web app
uv run voilab launch-dataset-visualizer
```

- Python Usage:

```python
from voilab.applications.dataset_visualizer import show

show("/path/to/session/directory")
```
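As an illustration of the kind of per-demo quality metrics the visualizer surfaces, here is a minimal sketch. The function and field names (`demo_quality`, `aruco_detected`, `slam_tracked`) are hypothetical, not the actual Voilab API:

```python
# Hypothetical sketch of per-demo quality metrics similar to those the
# dataset visualizer reports; not the actual Voilab implementation.

def demo_quality(frames):
    """Compute detection rate and lost-frame count for one demo.

    `frames` is a list of dicts with hypothetical boolean fields
    `aruco_detected` and `slam_tracked`, one dict per video frame.
    """
    total = len(frames)
    if total == 0:
        return {"detection_rate": 0.0, "lost_frames": 0}
    detected = sum(f["aruco_detected"] for f in frames)
    lost = sum(not f["slam_tracked"] for f in frames)
    return {
        "detection_rate": detected / total,  # fraction of frames with a marker
        "lost_frames": lost,                 # frames where SLAM lost tracking
    }

frames = [
    {"aruco_detected": True, "slam_tracked": True},
    {"aruco_detected": False, "slam_tracked": True},
    {"aruco_detected": True, "slam_tracked": False},
    {"aruco_detected": True, "slam_tracked": True},
]
print(demo_quality(frames))  # {'detection_rate': 0.75, 'lost_frames': 1}
```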
The JupyterLab environment comes with a built-in viewer for Universal Robot Description Format (URDF) files.
Usage:
- Use the file browser to locate a `.urdf` file.
- Double-click the file to open it in a new tab with an interactive 3D viewer.
An example model for the Franka Emika Panda robot is provided in `assets/franka_panda`. You can test the viewer by opening `assets/franka_panda/franka_panda.urdf`.
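Since URDF is plain XML, you can also inspect a model programmatically before opening it in the viewer. A quick sketch using only the standard library (the embedded URDF here is a made-up two-link example, not the bundled Panda model):

```python
# Minimal sketch: list the links and joints of a URDF document.
# URDF is plain XML, so xml.etree from the standard library suffices.
import xml.etree.ElementTree as ET

URDF = """
<robot name="demo_bot">
  <link name="base_link"/>
  <link name="arm_link"/>
  <joint name="base_to_arm" type="revolute">
    <parent link="base_link"/>
    <child link="arm_link"/>
  </joint>
</robot>
"""

def summarize_urdf(urdf_text):
    """Return (robot name, link names, joint names) from a URDF string."""
    root = ET.fromstring(urdf_text)
    links = [link.get("name") for link in root.findall("link")]
    joints = [joint.get("name") for joint in root.findall("joint")]
    return root.get("name"), links, joints

name, links, joints = summarize_urdf(URDF)
print(name, links, joints)  # demo_bot ['base_link', 'arm_link'] ['base_to_arm']
```

To check a file on disk instead, read it with `Path.read_text()` and pass the string to `summarize_urdf`.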
Follow the established pattern when adding new applications:
- Notebook interface: Create `.ipynb` files in `nbs/` for interactive development. Ensure they can be rendered correctly with Voila.
- Core logic: Implement visualization components in `src/voilab/applications/`.
- Utilities: Add reusable data loading/processing in `src/voilab/utils/`.
- CLI integration: (Optional) Register new commands in the voilab CLI following existing patterns.
Use `uv sync` to manage dependencies and test changes.


