This section shows how to access datasets hosted on Hugging Face using the LeRobotDataset class. We’ll start small and build up, so you can copy–paste the pattern that fits your use case.
First, any dataset on the Hub that follows the three pillars (tabular data, visual data, and relational metadata) can be loaded with a single line of code.
Why windows? Most reinforcement learning (RL) and behavioral cloning (BC) algorithms operate on stacks of observations and actions. For brevity, we refer to joint states and camera frames collectively as a single “frame.” For example, RL often uses a history of observations $o_{t-H_o:t}$ to mitigate partial observability, and BC typically regresses chunks of multiple actions ($a_{t:t+H_a}$) rather than single controls.
To support these training patterns, LeRobotDataset provides native temporal windowing. You define the time offsets (before/after) around any frame using delta_timestamps. Unavailable frames are padded and a mask is returned to filter padded elements. This all happens inside LeRobotDataset and is transparent to higher‑level wrappers like torch.utils.data.DataLoader.
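The padding-and-mask behavior can be sketched in plain Python. This is an illustration of the idea only, not LeRobot's actual implementation, and it uses relative frame indices rather than seconds to stay self-contained:

```python
def window(frames, t, offsets):
    """Gather frames at t + offset for each offset; pad out-of-range ones.

    Sketch of the idea behind delta_timestamps: out-of-range requests are
    filled with the nearest valid frame and flagged in a padding mask.
    """
    out, is_pad = [], []
    for off in offsets:
        i = t + off
        if 0 <= i < len(frames):
            out.append(frames[i])
            is_pad.append(False)
        else:
            # Pad by repeating the nearest valid frame, and flag it as padding
            out.append(frames[min(max(i, 0), len(frames) - 1)])
            is_pad.append(True)
    return out, is_pad

# Requesting two past frames at the very start of an episode pads them:
obs, mask = window(["f0", "f1", "f2"], t=0, offsets=[-2, -1, 0])
print(obs)   # ['f0', 'f0', 'f0']
print(mask)  # [True, True, False]
```

In the real dataset, offsets are given in seconds via `delta_timestamps` and matched against frame timestamps at the dataset's recording rate.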
Temporal Windows Explained:

- `[-0.2, -0.1, 0.0]` gives you the frames from 200 ms ago, 100 ms ago, and the current frame.
- `[0.0, 0.1, 0.2]` provides the current action and the next 2 actions (100 ms apart).

This is crucial for robot learning, where decisions depend on recent history!
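At a fixed control rate, these second-valued offsets map directly to relative frame indices. A quick sketch, assuming an illustrative 10 Hz rate (real datasets store their actual fps in their metadata):

```python
FPS = 10  # illustrative rate: one frame every 0.1 s

delta_timestamps = [-0.2, -0.1, 0.0]

# Convert each time offset in seconds to a relative frame index
frame_offsets = [round(dt * FPS) for dt in delta_timestamps]
print(frame_offsets)  # [-2, -1, 0]
```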
Conveniently, pairing LeRobotDataset with a PyTorch DataLoader automatically collates the individual sample dictionaries from the dataset into a single dictionary of batched tensors for downstream training or inference. LeRobotDataset also natively supports streaming: with a one-line change, users can stream a large dataset hosted on the Hugging Face Hub instead of downloading it. Streaming supports high-performance batch processing (roughly 80-100 it/s, depending on connectivity) and a high degree of frame randomization, key features for practical BC algorithms, which would otherwise be slow or train on highly non-i.i.d. data. This feature improves accessibility: large datasets can be processed without requiring large amounts of memory and storage.
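The collation step can be sketched in plain Python. This toy version stacks values into lists instead of tensors; the real work is done by PyTorch's default collate function inside the DataLoader:

```python
def collate(samples):
    # Merge a list of per-sample dicts into one dict of batched values,
    # mirroring what torch.utils.data.default_collate does with tensors.
    return {key: [s[key] for s in samples] for key in samples[0]}

samples = [
    {"observation.state": [0.1], "action": [1.0]},
    {"observation.state": [0.2], "action": [2.0]},
]
batch = collate(samples)
print(batch["action"])  # [[1.0], [2.0]]
```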
Here are different ways to set up temporal windows depending on your use case. Skim the options and pick one to start—switching later is just a change to the dictionary.
Basic Behavioral Cloning (learn current action from current observation):
```python
# Simple: current observation → current action
delta_timestamps = {
    "observation.images.wrist_camera": [0.0],  # Just current frame
    "action": [0.0],                           # Just current action
}

dataset = LeRobotDataset(
    "lerobot/svla_so101_pickplace",
    delta_timestamps=delta_timestamps,
)
```

When to use streaming:
Performance: Streaming achieves 80-100 it/s with good connectivity!
Download Dataset (faster training, requires local storage):

```python
from lerobot.datasets.lerobot_dataset import LeRobotDataset

# Downloads the dataset to the local cache
dataset = LeRobotDataset("lerobot/svla_so101_pickplace")

# Fastest access after the download completes
sample = dataset[100]
```

```python
import torch
from torch.utils.data import DataLoader

# Create a DataLoader for training
dataloader = DataLoader(
    dataset,
    batch_size=16,
    shuffle=True,
    num_workers=4,
)

# Training loop
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
for batch in dataloader:
    # Move tensors to the device
    observations = batch["observation.state"].to(device)
    actions = batch["action"].to(device)
    images = batch["observation.images.wrist_camera"].to(device)

    # Your model training here
    # loss = model(observations, images, actions)
    # loss.backward()
    # optimizer.step()
```

This simple API hides significant complexity:
Compare this to traditional robotics data handling, which often requires custom, per-dataset code for loading, synchronization, and windowing.
LeRobotDataset standardizes and simplifies all of this!
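As one concrete example, the padding masks described earlier let you exclude padded action steps from a loss. A minimal sketch using plain lists in place of tensors; the `action_is_pad` key name is an assumption based on LeRobot's convention of suffixing padding masks with `_is_pad`, so verify it against your installed version:

```python
# Toy batch: lists stand in for tensors to keep the sketch dependency-free.
# The "action_is_pad" key name is assumed from LeRobot's "_is_pad" suffix
# convention; verify against your installed version.
batch = {
    "action": [[0.1, 0.2], [0.0, 0.0], [0.3, 0.4]],  # middle step is padding
    "action_is_pad": [False, True, False],
}

# Keep only the action steps that are real data
valid_actions = [
    a for a, padded in zip(batch["action"], batch["action_is_pad"]) if not padded
]
print(valid_actions)  # [[0.1, 0.2], [0.3, 0.4]]
```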
Test your understanding of LeRobot and its role in robot learning: