I have undertaken teaching duties in courses related to mobile robots, and have supervised many students for their Master's and semester thesis projects.

Supervised students

  • Peilun Guo (M.Sc, TUM), 2023.06 – Present.
  • Jens Naumann (M.Sc, TUM), 2023.04 – Present.
  • Juan Galvis (M.Sc, TUM), 2023.04 – Present.
  • Hanfeng Wu (M.Sc, TUM), 2023.04 – 2023.11.
  • Yixin Mai (M.Sc, TUM), 2023.02 – 2023.11.
  • Chuanlong Zang (M.Sc, TUM), 2023.02 – 2023.10.
  • Yining Ma (M.Sc, TUM), 2022.10 – 2023.04.
  • Yingye Xin (M.Sc, TUM), 2022.10 – 2023.11.
  • Dongyue Lu (M.Sc, TUM), 2022.09 – 2023.05.
  • Zhijie Ye (M.Sc, TUM), 2021.12 – 2022.08.
  • Jingyuan Ma (M.Sc, ETH), 2020.02 – 2020.08.

Lecturer for Seminar: Multi-Modal Perception for Mobile Robotics (IN, WinSem 2022-2023, IN2107/IN4461, TUM)

An accurate and robust estimation of the robot's environment is at the core of many robotic tasks such as obstacle avoidance or planning. These tasks are essential for robotic applications in real-world scenarios, such as industrial inspection or agricultural robotics. Recent advances have brought new types of sensors into the field, each unlocking new ways to improve the performance of the perception system, but also coming with its own set of challenges. Finding ways to fuse them wisely with conventional sensors is a key challenge in current robotics research.

Lecturer for Seminar: Deep-learning-based 3D scene representation and understanding for mobile robots (IN, SumSem 2022, IN2107/IN4461, TUM)

The representation and understanding of the surroundings are prerequisites for autonomous intelligent robots. The focus of this seminar is to investigate recent advances in enhancing the scene representation and understanding of mobile robots, especially in challenging environments with dynamic objects, time-variant appearance, poor illumination, and bad weather conditions. The related topics include but are not limited to: scene and object representation, neural implicit representation, scene graphs, pose estimation of mobile robots, 3D reconstruction, semantic segmentation, pose and shape estimation of dynamic objects, long-term autonomy, 3D scene understanding, and semantic exploration.

Teaching Assistant for Mobile Robotics (IN2404, WinSem 2021-2022 & WinSem 2022-2023, TUM):

Mobile robots have been sent to Mars, can vacuum clean our homes, mow the lawn, and are promised to carry us around in the form of self-driving cars or even flying taxis. In this module, you will learn about the different components of such mobile robots and their interactions: from perception to estimation, mapping, and control. You will also learn to work with different mathematical representations of robot states and maps. Since deployment in open-ended environments requires sophisticated perception, localisation, and mapping approaches, we will dedicate a substantial part of the course to probabilistic multi-sensor fusion, modern Simultaneous Localisation and Mapping (SLAM), and more general Spatial AI systems – including elements of Machine Learning. In the last part, you will then learn how to use these representations of the robot's state and surroundings for navigation and control.

Teaching Assistant for Mobile Robotics Practicals (IN2106, IN4306, WinSem 2021-2022 & WinSem 2022-2023, TUM):

This practical course accompanies Mobile Robotics, following its theoretical content with the aim of applying it practically on a drone flying in simulation and ultimately in the real world (indoors).