Code Example: Batching a (Streaming) Dataset

This section shows how to access datasets hosted on Hugging Face using the LeRobotDataset class. We’ll start small and build up, so you can copy–paste the pattern that fits your use case.

First, any dataset on the Hub that follows the three pillars (tabular data, visual data, and relational metadata) can be loaded with a single line of code.

Why windows? Most reinforcement learning (RL) and behavioral cloning (BC) algorithms operate on stacks of observations and actions. For brevity, we refer to joint states and camera frames collectively as a single “frame.” For example, RL often uses a history of observations $o_{t-H_o:t}$ to mitigate partial observability, and BC typically regresses chunks of multiple actions $a_{t:t+H_a}$ rather than single controls.

To support these training patterns, LeRobotDataset provides native temporal windowing. You define the time offsets (before/after) around any frame using delta_timestamps. Unavailable frames are padded and a mask is returned to filter padded elements. This all happens inside LeRobotDataset and is transparent to higher‑level wrappers like torch.utils.data.DataLoader.
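To make the padding behavior concrete, here is a self-contained sketch of how the returned mask is typically consumed. The tensors are dummies standing in for one windowed sample; the `action_is_pad` key name follows LeRobot's convention of returning a boolean `<key>_is_pad` mask alongside each windowed key.

```python
import torch

# Dummy stand-ins for one sample whose action chunk overruns the episode:
# 8 requested steps, of which the last 3 fall past the final frame.
actions = torch.randn(8, 6)  # (chunk_len, action_dim), padded at the end
action_is_pad = torch.tensor(5 * [False] + 3 * [True])

# Drop padded steps before, e.g., computing a behavioral-cloning loss.
valid_actions = actions[~action_is_pad]
print(valid_actions.shape)  # torch.Size([5, 6])
```

In a real sample you would read both tensors from the dictionary returned by the dataset instead of constructing them by hand.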

Temporal Windows Explained:

A single timestep rarely captures motion or intent, so temporal windows are crucial for robot learning, where decisions depend on recent history!

Conveniently, combining LeRobotDataset with a PyTorch DataLoader automatically collates the individual sample dictionaries into a single dictionary of batched tensors for downstream training or inference. LeRobotDataset also natively supports streaming: with a one-line change, users can stream a large dataset hosted on the Hugging Face Hub instead of downloading it. Streaming supports high-performance batch processing (ca. 80-100 it/s, depending on connectivity) and a high degree of frame randomization, both key features for practical BC algorithms, which would otherwise be slow or trained on highly non-i.i.d. data. This improves accessibility: large datasets can be processed without requiring large amounts of local memory and storage.

Here are different ways to set up temporal windows depending on your use case. Skim the options and pick one to start; switching later is just a change to the dictionary.


Basic Behavioral Cloning (learn current action from current observation):

# Simple: current observation → current action
delta_timestamps = {
    "observation.images.wrist_camera": [0.0],  # Just current frame
    "action": [0.0]  # Just current action
}

dataset = LeRobotDataset(
    "lerobot/svla_so101_pickplace", 
    delta_timestamps=delta_timestamps
)
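The history and action-chunking patterns follow the same dictionary shape; only the offset lists change. A sketch, assuming a 30 fps dataset (read `dataset.fps` for the actual rate) and the same feature keys as above:

```python
fps = 30  # assumed frame rate; use dataset.fps for the actual value
dt = 1.0 / fps

# History BC: stack the last 4 observations, predict the current action
history_bc = {
    "observation.images.wrist_camera": [-3 * dt, -2 * dt, -1 * dt, 0.0],
    "observation.state": [-3 * dt, -2 * dt, -1 * dt, 0.0],
    "action": [0.0],
}

# Action chunking: current observation, regress the next 16 actions
action_chunking = {
    "observation.images.wrist_camera": [0.0],
    "action": [i * dt for i in range(16)],
}
```

Either dictionary is passed as `delta_timestamps` to `LeRobotDataset` exactly as in the basic example; windowed keys then come back with a leading time dimension.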

Streaming Large Datasets

When to use streaming: stream when the dataset is too large for local storage or you only need a few passes over it; download when you have the storage and want the fastest repeated access.

Performance: Streaming achieves 80-100 it/s with good connectivity!


Download Dataset (faster training, requires storage):

from lerobot.datasets.lerobot_dataset import LeRobotDataset

# Downloads dataset to local cache
dataset = LeRobotDataset("lerobot/svla_so101_pickplace")

# Fastest access after download
sample = dataset[100]
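Stream Dataset (no full download, slight network overhead): this is the one-line swap mentioned above. A sketch; the import path and iteration pattern below are assumptions, so check the module layout of your installed lerobot version:

```python
# Streaming mode: samples are fetched from the Hub on the fly instead of
# being downloaded to a local cache first.
from lerobot.datasets.streaming_dataset import StreamingLeRobotDataset

dataset = StreamingLeRobotDataset("lerobot/svla_so101_pickplace")

# Streaming datasets are consumed by iteration rather than random indexing;
# internal shuffling buffers provide frame randomization.
for sample in dataset:
    print(sample["action"].shape)
    break
```

The same `delta_timestamps` dictionary works here as well, so switching between download and streaming modes does not change the rest of your pipeline.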

Training Integration

PyTorch DataLoader

import torch
from torch.utils.data import DataLoader

# Create DataLoader for training
dataloader = DataLoader(
    dataset,
    batch_size=16,
    shuffle=True,
    num_workers=4
)

# Training loop
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

for batch in dataloader:
    # Move to device
    observations = batch["observation.state"].to(device)
    actions = batch["action"].to(device)
    images = batch["observation.images.wrist_camera"].to(device)
    
    # Your model training here
    # loss = model(observations, images, actions)
    # loss.backward()
    # optimizer.step()

Why This Matters

This simple API hides significant complexity: video decoding, timestamp alignment, temporal windowing with padding and masking, batch collation, and on-demand streaming from the Hub.

Compare this to traditional robotics data handling, which often requires custom parsers for each dataset format, hand-written synchronization and windowing code, and enough local storage to hold every dataset you train on.

LeRobotDataset standardizes and simplifies all of this!

Section Quiz

Test your understanding of LeRobot and its role in robot learning:

1. What makes LeRobot different from traditional robotics libraries?

2. Which of the following is NOT a key component of LeRobot’s approach?

3. What is the main advantage of LeRobot’s optimized inference stack?

4. Which types of robotic platforms does LeRobot support?

5. What does “end-to-end integration with the robotics stack” mean in the context of LeRobot?

6. What is the primary purpose of the delta_timestamps parameter in LeRobotDataset?

7. Which of the following best describes the three main components of LeRobotDataset?

8. What happens when you use StreamingLeRobotDataset instead of LeRobotDataset?

9. In the context of robot learning, what does “temporal windowing” refer to?

10. What is the main advantage of LeRobotDataset’s approach to storing video data?

11. Which statement about LeRobotDataset’s compatibility is correct?
