Autonomous Navigation and Landing in
Urban Environments
Video Demonstration
Integrated stereo vision and lidar sensors to accurately identify landing sites, navigate complex urban environments, and perform dynamic landing maneuvers.
State Estimation & Self-Localization
Localization and tracking of the landing site are performed by fusing stereo camera and lidar data with an Unscented Kalman Filter (UKF).
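The fusion step can be sketched as a generic UKF that stacks the camera and lidar position readings into one measurement vector. This is a minimal illustrative sketch, not the project's implementation: the 1-D constant-velocity state, the noise covariances, and the function names (`sigma_points`, `ukf_step`) are all assumptions chosen for brevity.

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights for the unscented transform."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)  # columns are the sigma-point offsets
    pts = [x] + [x + L[:, i] for i in range(n)] + [x - L[:, i] for i in range(n)]
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return np.array(pts), Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One predict/update cycle of an Unscented Kalman Filter."""
    # Predict: propagate sigma points through the motion model f
    X, Wm, Wc = sigma_points(x, P)
    Xp = np.array([f(s) for s in X])
    x_pred = Wm @ Xp
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(Wc, Xp - x_pred))
    # Update: propagate through the measurement model h, then correct
    X, Wm, Wc = sigma_points(x_pred, P_pred)
    Zp = np.array([h(s) for s in X])
    z_pred = Wm @ Zp
    S = R + sum(w * np.outer(d, d) for w, d in zip(Wc, Zp - z_pred))
    C = sum(w * np.outer(dx, dz) for w, dx, dz in zip(Wc, X - x_pred, Zp - z_pred))
    K = C @ np.linalg.inv(S)
    return x_pred + K @ (z - z_pred), P_pred - K @ S @ K.T

# Toy fusion example: camera and lidar both observe the 1-D landing-site position
dt = 0.1
f = lambda s: np.array([s[0] + dt * s[1], s[1]])  # constant-velocity motion model
h = lambda s: np.array([s[0], s[0]])              # both sensors measure position
Q = np.diag([1e-4, 1e-4])
R = np.diag([0.25, 0.04])  # lidar assumed more precise than the stereo camera
x, P = np.array([0.0, 0.0]), np.eye(2)
rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 1.0
for _ in range(100):
    true_pos += dt * true_vel
    z = true_pos + rng.normal(0.0, [0.5, 0.2])    # noisy camera + lidar readings
    x, P = ukf_step(x, P, z, f, h, Q, R)
```

Because the lidar variance in `R` is smaller, the filter automatically weights it more heavily, which is the practical payoff of fusing the two sensors rather than averaging them.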
Trajectory Generation and Tracking
The system generates a trajectory that avoids nearby obstacles and terminates within the vicinity of the targeted landing surface.
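One common way to realize this step is to sample a smooth rest-to-rest polynomial between the current pose and a point above the landing surface, then reject candidates that pass too close to known obstacles. The sketch below uses a minimum-jerk profile and a simple clearance check; the shape, function names, and margin value are illustrative assumptions, not the project's actual planner.

```python
import numpy as np

def min_jerk_trajectory(p0, pf, n=50):
    """Rest-to-rest minimum-jerk path between 3-D points p0 and pf."""
    s = np.linspace(0.0, 1.0, n)
    shape = 10 * s**3 - 15 * s**4 + 6 * s**5  # zero boundary velocity/acceleration
    return p0 + shape[:, None] * (pf - p0)

def clear_of_obstacles(path, obstacles, margin):
    """Accept the path only if every waypoint keeps `margin` distance from all obstacles."""
    return all(np.linalg.norm(path - ob, axis=1).min() > margin for ob in obstacles)

# Hover point to a pose 0.5 m above the landing pad
start = np.array([0.0, 0.0, 10.0])
goal = np.array([5.0, 5.0, 0.5])
path = min_jerk_trajectory(start, goal)
ok = clear_of_obstacles(path, [np.array([20.0, 20.0, 5.0])], margin=1.0)
```

In practice the rejected candidates would be replanned through intermediate waypoints; the check here only shows where that decision plugs in.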
Execution of Landing Maneuver
The robot executes a landing policy trained with deep reinforcement learning (DeepRL), conditioned on velocity and onboard state estimates.
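The runtime structure of such a policy is a closed loop: build an observation from the onboard state estimate, query the policy, and apply its command. In the sketch below the trained network is replaced by a hand-written proportional-derivative stand-in so the example is self-contained; the observation layout, gains, and function names are all assumptions for illustration only.

```python
import numpy as np

def landing_policy(obs):
    """Stand-in for the trained DeepRL policy (the real system would run a
    neural-network forward pass here). Maps observation -> velocity command."""
    rel_pos, vel = obs[:3], obs[3:]
    kp, kd = 1.0, 0.8
    return np.clip(-kp * rel_pos - kd * vel, -2.0, 2.0)  # m/s, saturated

def run_landing(start, pad, dt=0.05, steps=400):
    """Closed-loop landing: observe, query policy, apply command, repeat."""
    pos, vel = start.astype(float), np.zeros(3)
    for _ in range(steps):
        obs = np.concatenate([pos - pad, vel])  # onboard state estimate
        cmd = landing_policy(obs)
        vel += 2.0 * (cmd - vel) * dt           # first-order velocity tracking
        pos += vel * dt
        if np.linalg.norm(pos - pad) < 0.05 and np.linalg.norm(vel) < 0.1:
            break                               # touched down gently
    return pos, vel

pos, vel = run_landing(np.array([2.0, 2.0, 5.0]), np.array([0.0, 0.0, 0.0]))
```

The key design point is that the policy never sees raw sensor data, only the fused state estimate, which keeps the learned component decoupled from the sensing stack.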