“Many years later, as he faced the screwdriver, LeRobot was to remember that distant afternoon when he grabbed that red ball…”
Welcome to the Hug-n-Play project by Team MineRobots. Our solution features an SO-100 robot arm powered by AMD ROCm and Hugging Face LeRobot, trained to identify tools, "hug" them securely, and perform a celebratory "winning dance" upon successful retrieval.
Figure 1 (video): MineRobot performing the "Hug" (grasp) and the intentional "Winning Dance" (shake) before handover.
- Team Name: MineRobots
- Project Name: Hug-n-Play
- Task: minerobots-pick-screwdriver
- Goal: Autonomous identification, grasping ("hugging"), celebration dance, and handover of a screwdriver.
- Model: Action Chunking with Transformers (ACT) & XVLA trained on AMD MI300X.
git clone https://github.com/huggingface/lerobot.git --depth 1
cd lerobot
pip install -U pip
pip install -e .

Leader Arm:

lerobot-calibrate --teleop.type=so101_leader --teleop.port=/dev/ttyACM1 --teleop.id=minerobots_leader

Follower Arm:

lerobot-calibrate --robot.type=so101_follower --robot.port=/dev/ttyACM0 --robot.id=minerobots_follower

To record the dataset, we utilized a 3-camera setup (Side, Top, Wrist) to capture the hugging motion.
lerobot-teleoperate \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem5AE60570611 \
--robot.id=so101_follower_mac \
--robot.cameras="{side: {type: opencv, index_or_path: 0, width: 1920, height: 1080, fps: 30}, arm: {type: opencv, index_or_path: 2, width: 1920, height: 1080, fps: 30}, top: {type: opencv, index_or_path: 1, width: 1920, height: 1080, fps: 24}}" \
--teleop.type=so101_leader \
--teleop.port=/dev/tty.usbmodem5AE60848661 \
--teleop.id=so101_leader_mac \
--display_data=true

We utilized the AMD MI300X to train our models.
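Teleoperation alone does not save episodes; a recording run can be sketched as below. The flags mirror the teleoperation setup above and the inference command later in this document, while the dataset repo id, task string, and episode count are placeholder assumptions.

```shell
# Hypothetical dataset-recording run. Camera map and ports mirror the
# teleoperation command above; repo id, task string, and episode count
# are placeholders, not the values the team actually used.
lerobot-record \
  --robot.type=so101_follower \
  --robot.port=/dev/tty.usbmodem5AE60570611 \
  --robot.id=so101_follower_mac \
  --robot.cameras="{side: {type: opencv, index_or_path: 0, width: 1920, height: 1080, fps: 30}, arm: {type: opencv, index_or_path: 2, width: 1920, height: 1080, fps: 30}, top: {type: opencv, index_or_path: 1, width: 1920, height: 1080, fps: 24}}" \
  --teleop.type=so101_leader \
  --teleop.port=/dev/tty.usbmodem5AE60848661 \
  --teleop.id=so101_leader_mac \
  --dataset.repo_id=${HF_USER}/minerobots-pick-screwdriver \
  --dataset.single_task="Pick up the screwdriver" \
  --dataset.num_episodes=50
```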
Used for the minerobots-screwdriver task.
export HF_USER=cfu
export DATASET_NAME=minerobots-screwdriver
export REPO_NAME=minerobots-screwdriver-act-20k
lerobot-train \
--dataset.repo_id=cfu/${DATASET_NAME} \
--steps=30000 \
--policy.type=act \
--policy.pretrained_path=cfu/minerobots-ball-act-100k \
--output_dir=outputs/train/${REPO_NAME} \
--job_name=${REPO_NAME} \
--policy.device=cuda \
--wandb.enable=true \
--policy.push_to_hub=true \
--policy.repo_id=${HF_USER}/${REPO_NAME}

Requires lerobot >= 0.4.3.
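Long training runs on shared GPU time can get interrupted; lerobot supports resuming from the last saved checkpoint. The config path below follows lerobot's default `outputs/train` checkpoint layout and is an assumption, not something the team necessarily ran.

```shell
# Hypothetical resume of the ACT run above from its last checkpoint.
# The config path follows lerobot's default output layout (an assumption).
lerobot-train \
  --config_path=outputs/train/${REPO_NAME}/checkpoints/last/pretrained_model/train_config.json \
  --resume=true
```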
export DATASET_NAME=minerobots-pick-screwdriver
export REPO_NAME=minerobots_screwdriver_xvla_50k
lerobot-train \
--dataset.repo_id=cfu/${DATASET_NAME} \
--policy.optimizer_lr=0.0001 \
--steps=50000 \
--policy.action_mode=auto \
--policy.type=xvla \
--policy.pretrained_path=lerobot/xvla-base \
--output_dir=outputs/train/${REPO_NAME} \
--job_name=${REPO_NAME} \
--policy.device=cuda \
--wandb.enable=true \
--policy.push_to_hub=true \
--policy.repo_id=cfu/${REPO_NAME} \
--policy.resize_imgs_with_padding=[224,224]

Running inference on the edge device to perform the "Hug-n-Play" sequence:
lerobot-record \
--robot.type=so101_follower \
--robot.port=/dev/ttyACM0 \
--robot.cameras="{side: {type: opencv, index_or_path: 2, width: 640, height: 480, fps: 30}, top: {type: opencv, index_or_path: 6, width: 640, height: 480, fps: 30}, wrist: {type: opencv, index_or_path: 4, width: 640, height: 480, fps: 30}}" \
--policy.path=cfu/minerobots-screwdriver-act-20k \
--dataset.repo_id=lerobot/eval_inference_v58 \
--dataset.single_task="inference_act" \
--dataset.num_episodes=10 \
--dataset.reset_time_s=0 \
--display_data=false

If using Feetech STS3215 motors, use the scripts provided in AMDYES.md to scan and change motor IDs.
- Scan Motors: Checks baud rates (1M/500k/250k/115200).
- Change ID: Unlocks EEPROM, writes new ID, and re-locks.
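As an alternative to the AMDYES.md scripts, lerobot ships an interactive motor-setup helper that walks through ID assignment one motor at a time; the port below is an example value, not necessarily the one used here.

```shell
# Interactive helper: prompts you to connect one motor at a time and
# writes the correct ID and baud rate. Port is an example value.
lerobot-setup-motors \
  --robot.type=so101_follower \
  --robot.port=/dev/ttyACM0
```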
Team MineRobots (Alphabetical Order)
- Changqing Fu
- Sichen Su
- Wenzheng Wang
- Zhao et al., "Learning Fine-Grained Bimanual Manipulation with Low-Cost Hardware" (ACT).
- LeRobot by Hugging Face.


