Robodog

Autonomous Legged Robots for Assisted Living

According to a recent WHO report, approximately 2.2 billion people worldwide have a near or distant vision impairment, including 217 million with moderate to severe impairment and 36 million who are blind. This represents a significant portion of the population that faces challenges in navigating and traveling independently.

Traditional aids like white canes and guide dogs offer limited assistance: canes cannot detect distant obstacles, and guide dogs are costly and scarce. Our project aims to enhance navigation aids by developing robotic guide dogs. These are built on relatively affordable, commercially available quadruped robots, which we equip with specialized sensors and computational units. Unlike wheeled robotic platforms, these robotic dogs can navigate stairs, tight spaces, and challenging terrains, overcoming the same obstacles a human would encounter.

This work leverages the expertise of the Center for Project-Based Learning, specializing in low-power embedded systems and novel sensor technologies. Utilizing this specialized knowledge, we equip the robotic guide dogs with sensor fusion and autonomous navigation algorithms. These algorithms enable the robots to guide visually impaired individuals from one location to another in unfamiliar environments while safely avoiding obstacles in real time. This integrated approach combines high-level navigation capabilities, commonly found in smartphone apps, with immediate, low-level obstacle detection traditionally provided by canes or guide dogs.
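As a rough illustration of how a high-level route can be combined with low-level obstacle avoidance, the sketch below blends a goal bearing (as a route planner might provide) with repulsion from nearby sensed obstacles, in the style of a potential field. This is not the project's actual navigation stack; the function name, parameters, and the potential-field approach itself are assumptions for illustration only.

```python
import math

def steer(goal_bearing_rad, obstacles, repulse_radius=1.5, gain=1.0):
    """Blend a high-level goal bearing with low-level obstacle repulsion.

    goal_bearing_rad: desired heading from the route planner (radians,
    robot frame). obstacles: list of (x, y) obstacle points in metres,
    x forward, y left. Returns a steering angle in radians.
    """
    # Attractive component: unit vector toward the goal bearing.
    ax = math.cos(goal_bearing_rad)
    ay = math.sin(goal_bearing_rad)
    # Repulsive components: each obstacle inside the radius pushes the
    # heading away from it, more strongly the closer it is.
    for ox, oy in obstacles:
        d = math.hypot(ox, oy)
        if 1e-6 < d < repulse_radius:
            w = gain * (repulse_radius - d) / (repulse_radius * d)
            ax -= w * ox / d
            ay -= w * oy / d
    return math.atan2(ay, ax)
```

With no obstacles the command is simply the goal bearing; an obstacle slightly to the front-left bends the heading to the right, and vice versa.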

We collaborate with the Barrier-Free at ETH Zurich initiative and integrate the PolyMaps indoor mapping service, allowing visually impaired users to navigate ETH buildings autonomously. The system has already been tested with visually impaired individuals, showcased publicly, and received media attention.

Beyond aiding visually impaired individuals, we are expanding our research by integrating a robotic arm to further assist in daily living tasks. Additionally, we are exploring ways for mobility-impaired users to guide the robot using smart glasses, providing a hands-free control interface for enhanced accessibility.

This project is supported by the ETH Future Computing Laboratory (EFCL) and carried out in collaboration with the Barrier-Free at ETH Zurich initiative.

Join us to work on (Download Flyer, PDF, 1.1 MB):

  • Human-Robot Interaction
    Improving and extending the human-robot interface with vocal and haptic feedback.

  • User-Aware Navigation
    Ensuring safe navigation for both the robot and the user.

Quadruped Robot Challenges

We are developing an autonomous navigation system for a quadrupedal robot, designed to traverse complex terrain and compete in the ICRA Quadruped Robot Challenges. The system integrates advanced perception, state estimation, and learning-based control to enable robust, real-time navigation in dynamic and unstructured environments. By leveraging exteroceptive sensing and adaptive motion planning, the robot can handle obstacles, rough terrain, and varying ground conditions while maintaining stability and precise locomotion. Our approach aims to push the boundaries of quadrupedal autonomy, making legged robots more capable in real-world deployment scenarios.

Join us to work on (Download Flyer, PDF, 1.4 MB):

  • Sensor fusion & robust localization
    Integrating vision, IMU, and legged odometry for precise state estimation

  • Deep reinforcement learning
    Training adaptive locomotion and navigation policies

  • Motion planning & terrain-aware control
    Developing real-time decision-making strategies for complex environments

  • Perception & mapping
    Enhancing scene understanding for obstacle avoidance and traversability analysis
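
To give a flavor of the sensor fusion topic above, here is a minimal one-dimensional Kalman filter that predicts position from IMU acceleration and corrects it with a legged-odometry position measurement. The real system fuses vision, IMU, and leg odometry in full 3D; this 1-D sketch, including the class name and noise parameters, is a simplifying assumption for illustration only.

```python
class Fuse1D:
    """Minimal 1-D Kalman filter: IMU-driven prediction,
    legged-odometry correction (illustrative sketch)."""

    def __init__(self, pos=0.0, vel=0.0, var=1.0):
        self.pos, self.vel, self.var = pos, vel, var

    def predict(self, accel, dt, accel_var=0.1):
        # Propagate the state with the IMU acceleration measurement;
        # process noise grows the position uncertainty over time.
        self.pos += self.vel * dt + 0.5 * accel * dt * dt
        self.vel += accel * dt
        self.var += accel_var * dt * dt

    def update(self, odom_pos, odom_var=0.05):
        # Weight the odometry measurement by the relative uncertainties.
        k = self.var / (self.var + odom_var)  # Kalman gain
        self.pos += k * (odom_pos - self.pos)
        self.var *= (1.0 - k)
```

Each `update` pulls the estimate toward the odometry reading in proportion to how uncertain the prediction has become, which is the core idea behind the fusion of complementary sensors.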