🔥 JUST IN: Open-source robotics dataset built from 100% real-world scenarios! 🤯

Chinese robotics company @AGIBOTofficial just released AGIBOT WORLD 2026, an open-source dataset systematically covering key embodied-AI research directions.

It is built entirely from real-world environments: commercial spaces and homes. Data is collected by AGIBOT G2 robots in free-form collection mode, yielding structured, accurately annotated, high-quality data. Digital-twin technology creates 1:1-scale replicas of the real environments in simulation, and both the real-world and simulation data are open-sourced.

The AGIBOT G2 platform records multiple data types simultaneously: RGB(D) cameras, tactile sensors, force sensors, LiDAR, IMU, and full-body joint states. Whole-body control coordinates arms, waist, and hands for complex tasks, and first-person teleoperation lets operators control the robot from its own perspective.

Tasks covered: fine-grained manipulation, ultra-long-horizon tasks, spatial navigation, dual-arm coordination, and multi-agent/human-robot collaboration.

The dataset also includes annotated error-recovery trajectories. Most datasets show only successful demonstrations; AGIBOT includes failures and how the robot recovers, teaching models how to handle mistakes.

After collection, data quality is validated through policy training and real-robot deployment, then the data goes through industrial quality control with multiple rounds of screening and cleaning.

Open-sourcing all of this accelerates embodied-AI research by giving researchers access to high-quality, real-world robot data at scale.

🇨🇳 Learn more here: agibot-world.com/

~~ ♻️ Join the weekly robotics newsletter, and never miss any news → ziegler.substack.com
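As a rough illustration of what one time step of such multimodal robot logs might look like, here is a minimal Python sketch. All field names, shapes, and the `recovery_ratio` helper are hypothetical, invented for illustration; they are not the actual AGIBOT WORLD schema or API.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of one time step in a multimodal robot trajectory.
# Field names and shapes are illustrative only -- NOT the real
# AGIBOT WORLD format, which is documented at agibot-world.com.

@dataclass
class FrameRecord:
    timestamp: float          # seconds since episode start
    rgb: List[List[int]]      # toy-sized RGB pixel rows
    depth: List[float]        # per-pixel depth values, metres
    tactile: List[float]      # fingertip tactile readings
    force: List[float]        # wrist force/torque (Fx, Fy, Fz, Tx, Ty, Tz)
    lidar: List[float]        # LiDAR range scan, metres
    imu: List[float]          # accelerometer (3) + gyroscope (3)
    joints: List[float]       # full-body joint positions, radians
    is_recovery: bool = False # annotated as part of an error-recovery segment?

def recovery_ratio(frames: List[FrameRecord]) -> float:
    """Fraction of frames annotated as error recovery."""
    if not frames:
        return 0.0
    return sum(f.is_recovery for f in frames) / len(frames)

# Toy usage: a two-frame trajectory where the second frame is a recovery.
frames = [
    FrameRecord(0.0, [[0, 0, 0]], [1.0], [0.1], [0.0] * 6,
                [2.5], [0.0] * 6, [0.0] * 12),
    FrameRecord(0.1, [[0, 0, 0]], [1.0], [0.2], [0.0] * 6,
                [2.4], [0.0] * 6, [0.0] * 12, is_recovery=True),
]
print(recovery_ratio(frames))  # 0.5
```

The `is_recovery` flag mirrors the post's point that failure-and-recovery segments are explicitly annotated, so a training pipeline can weight or filter them separately from clean demonstrations.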
→ View original post on X — @clementdelangue, 2026-04-07 13:30 UTC