
In-Hand Manipulation with Raw Tactile Signals

Quick Project Overview

Goal

Enable a 12-DOF robotic hand to continuously rotate a 40 mm lug nut in the real world, surviving slips and unexpected pushes, by harnessing its built-in raw tactile sensors.

Approach

1) Train a PPO "expert" policy in Isaac Gym.
2) Collect ≈45k real-world trajectories with RGB-D images and 28-channel tactile readings.
3) Distill that data into a Transformer (BAKU) policy that ingests raw tactile + vision directly—no tactile simulation required.
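The distillation in step 3 is, at its core, behavior cloning: fit a student policy to the expert's actions on the collected observations. The sketch below illustrates that idea with a linear least-squares student in place of the Transformer, on synthetic data; the dimensions (28-channel tactile, 12-DOF action) come from the write-up, while the vision-embedding size and all function names are illustrative assumptions, not the project's actual code.

```python
import numpy as np

# Illustrative dimensions: 28-channel tactile, an assumed 64-d vision
# embedding, and a 12-DOF action vector for the hand.
TACTILE_DIM, VISION_DIM, ACTION_DIM = 28, 64, 12

def featurize(tactile, vision):
    """Concatenate raw tactile readings with a vision embedding
    into a single observation vector (no tactile simulation needed)."""
    return np.concatenate([tactile, vision], axis=-1)

rng = np.random.default_rng(0)

# Synthetic stand-in for the collected dataset: observations paired with
# the PPO expert's actions (here generated from a random linear "expert").
obs = rng.normal(size=(1000, TACTILE_DIM + VISION_DIM))
expert_W = rng.normal(size=(TACTILE_DIM + VISION_DIM, ACTION_DIM))
expert_actions = obs @ expert_W

# Behavior cloning: minimize mean-squared error between student and
# expert actions. A linear least-squares fit plays the role of the
# Transformer policy in this toy version.
student_W, *_ = np.linalg.lstsq(obs, expert_actions, rcond=None)
mse = float(np.mean((obs @ student_W - expert_actions) ** 2))
print(f"behavior-cloning MSE: {mse:.2e}")
```

In the real pipeline the student is a sequence model trained with a supervised imitation loss over trajectories, but the objective is the same: match the expert's actions given raw multimodal observations.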

Result

On the hardest test, the multimodal policy sustains rotation for ≈180 s versus ≈13 s for the RL expert, a ~14× gain, demonstrating that raw tactile feedback dramatically improves robustness and self-recovery.

Project Demo Video

Read the Paper
