Robotics is advancing at lightning speed and expanding into new areas of our lives. And just as quickly, the AI systems that help robots learn and perceive the world are evolving too. This week, Twitter was full of robotics updates: interesting new projects (some open source), fresh features, and new deployments. We couldn't ignore the wave of news, so we pulled together the most interesting highlights for you:
AEON: the first humanoid robot at a BMW plant
AEON is an industrial wheeled humanoid robot that has joined the BMW team in Leipzig as part of the iFACTORY approach. It can assemble batteries and place car components, helping to produce cars in real industrial conditions. → Explore more
EgoScale: 20,000+ hours of human video unlock robot dexterity
Nvidia found a near-perfect log-linear scaling law (R² = 0.998) between human data volume and action loss, directly predicting real-robot success. They built the EgoScale framework, which helped a humanoid with 22-DoF hands learn to assemble model cars, operate syringes, sort cards, and fold shirts from 20,000+ hours of egocentric human video, with no robot in the loop during pretraining. With just 4 hours of robot play data, the policy achieves a 54% gain over training from scratch and even transfers to a 7-DoF Unitree G1 with a 30%+ improvement. This suggests that scaling human motion data may be the most practical path to robot dexterity. → Read more
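To make the scaling-law claim concrete, here is a minimal sketch of how such a log-linear relationship is typically fit and checked. The data points below are hypothetical placeholders, not EgoScale's actual numbers; only the functional form (loss falling linearly in log data-hours) mirrors the paper's finding.

```python
# Illustrative only: fit loss ≈ a + b * log(hours) on made-up data
# and measure the goodness of fit (R²), the same kind of statistic
# behind the reported R² = 0.998.
import numpy as np

hours = np.array([100, 500, 1000, 5000, 10000, 20000], dtype=float)  # hypothetical
loss = np.array([0.90, 0.74, 0.67, 0.51, 0.44, 0.37])                # hypothetical

x = np.log(hours)
b, a = np.polyfit(x, loss, 1)   # slope b should be negative: more data, lower loss
pred = a + b * x
ss_res = np.sum((loss - pred) ** 2)
ss_tot = np.sum((loss - loss.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"slope={b:.3f}, R^2={r2:.3f}")
```

With a fit like this, predicting real-robot loss at a new data scale is a single extrapolation along the line, which is what makes the law useful for planning data collection.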
The Physical Intelligence Layer
Physical Intelligence is building a shared "intelligence layer" for robots: like APIs, but for physical action. They teamed up with Weave Robotics and Ultra Robotics to run π0.6 in real deployments: folding laundry at Sea Breeze Cleaners with 92% autonomy and packaging warehouse orders at 165 items/hour, cutting interventions by up to 50% and improving throughput with each generation. So instead of engineering full stacks from scratch, companies can plug their hardware into the π0–π0.6 models and benefit from shared foundation models to scale real-world deployments. → Read more
SimToolReal
Researchers at Cornell and Stanford proposed a way to teach robots to use tools without hand-crafting every task. Instead of training on one tool at a time, SimToolReal trains a single reinforcement learning policy in simulation on many generated tool-like shapes. As a result, one policy can use new real-world tools zero-shot, and the results are impressive. → Read more
Gemini 3 Flash and VLA models teach a robot to play a children's game
A hands-on project by Paul Ruiz shows how to get a robot arm to play the toddler board game First Orchard. A vision-language-action (VLA) model handles pick-and-place motions, while Gemini 3 Flash tracks the game's rules and state from an overhead camera. Trained on 400 teleoperated episodes, the system can pick up colored fruit pieces and play full rounds. It's a small but practical demo of embodied AI at home. → Read more
Asimov: an open-source bipedal humanoid robot
Menlo Research is going to open-source the complete body design, simulation files, and a full list of actuators for its humanoid robot Asimov v1. For now, we have the open-source Asimov v0: a bipedal leg design for humanoid robots. It uses off-the-shelf motors and components and is built to be compatible with low-volume manufacturing, including MJF 3D printing. Each leg has 6 degrees of freedom (12 total), including an articulated toe and an advanced ankle mechanism. → Explore more
ElRobot
Another interesting open-source project. ElRobot is a low-cost, fully 3D-printed robotic arm designed for physical AI research and imitation learning. It costs around $220 per arm, has 7+1 degrees of freedom, and a 430 mm reach. Built from off-the-shelf servos and printable parts, it's easy to assemble and modify. The system supports teleoperation (leader–follower setup) and includes camera mounts. → Explore more
KV-Tracker
KV-Tracker makes advanced multi-view 3D vision models practical for robotics. It enables a robot to track objects or entire scenes in real time using just a monocular RGB camera. By caching key visual information, it runs up to 15× faster and reaches ~27 FPS. This supports real-time 6-DoF pose tracking and on-the-fly 3D reconstruction for manipulation, navigation and interaction tasks. → Read more
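The speedup comes from a familiar pattern: computing an expensive per-frame representation once and reusing it when the same frame is revisited. The toy sketch below is a hypothetical illustration of that caching idea, not KV-Tracker's actual implementation (which caches key/value features inside the vision transformer).

```python
# Toy illustration of feature caching: a "tracker" slides a 5-frame
# window over a video, but only unseen frames pay the cost of the
# (stand-in) expensive backbone; previously seen frames are reused.
cache = {}
calls = {"expensive": 0}

def extract_features(frame_id):
    """Stand-in for an expensive vision-model forward pass."""
    calls["expensive"] += 1
    return frame_id * 2  # dummy "feature"

def features(frame_id):
    if frame_id not in cache:
        cache[frame_id] = extract_features(frame_id)
    return cache[frame_id]

# Slide a 5-frame window forward by one frame, three times.
# A naive tracker would recompute 5 features per step (15 total);
# with the cache, each of the 7 unique frames is computed once.
for start in range(3):
    window = [features(f) for f in range(start, start + 5)]

print(calls["expensive"])  # → 7
```

Because consecutive windows overlap heavily, the cached version does 7 backbone calls instead of 15; the same overlap structure is what lets multi-view models amortize work across video frames.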
Hyundai Motor Group and Atlas robots
In case you missed it: at CES 2026, Hyundai announced a ~$9B (KRW 9T) innovation hub in Korea. Boston Dynamics' Atlas humanoid will serve as a core platform of the strategy, with mass production planned, starting with factory tasks like assembly and parts sequencing. The new Saemangeum cluster is designed to produce ~30,000 robots a year and support large-scale robot training and manufacturing. → Read more
And just recently, Bloomberg reported that Hyundai (not even Tesla with its Optimus robot) is quietly leading the humanoid robot race. → Read more
