Runway Detection and Tracking for Autonomous Landing of a UAV [video]
Authors
McCarthy, Tyler
Date of Issue
2017-04-12
Date
Wednesday, 12 April 2017
Abstract
This presentation outlines an approach to developing a robust, real-time runway detection and tracking system for an unmanned aerial vehicle (UAV) using computer vision and standard guidance, navigation, and control (GNC) equipment. Feature extraction techniques, specifically the Hough transform, are used to identify the position and orientation of a runway from cameras on board the aircraft. Information relating the position of the aircraft to the runway is then integrated into a control system for the final approach and landing phases of UAV flight. While conceptual in nature, the presentation uses modeling and simulation to demonstrate the feasibility of a computer-vision-based autoland capability for a UAV. The approach could also apply in other autonomous domains by illustrating the practicality of vision-based feedback for the guidance and control of unmanned autonomous systems.
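The detection step described in the abstract, extracting runway edge lines from onboard camera imagery with the Hough transform, can be illustrated with a short sketch. The following Python/OpenCV code is only an illustrative outline under assumed conditions, not the presenter's implementation; the sample frame name, Canny/Hough thresholds, and the steep-segment heuristic for picking out runway edges are assumptions.

```python
# Illustrative sketch: Hough-transform line extraction for runway detection.
# Assumes a single BGR frame from an onboard camera; all thresholds are
# placeholder values, not tuned parameters from the presentation.
import cv2
import numpy as np

def detect_runway_lines(frame_bgr):
    """Return candidate runway edge segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # edge map fed to the Hough transform
    # Probabilistic Hough transform: returns line segments in pixel coordinates.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=20)
    return [] if lines is None else [tuple(seg[0]) for seg in lines]

def runway_orientation_estimate(segments):
    """Rough runway orientation (radians) from near-vertical segments,
    i.e. the runway edges as they appear on final approach."""
    angles = []
    for x1, y1, x2, y2 in segments:
        angle = np.arctan2(y2 - y1, x2 - x1)
        if abs(abs(angle) - np.pi / 2) < np.pi / 6:  # keep steep segments only
            angles.append(angle)
    return float(np.mean(angles)) if angles else None

if __name__ == "__main__":
    frame = cv2.imread("approach_frame.png")  # hypothetical sample frame
    if frame is not None:
        segments = detect_runway_lines(frame)
        print(len(segments), "candidate edges, orientation:",
              runway_orientation_estimate(segments))
```

In a full system of the kind the abstract describes, the estimated runway position and orientation would be fed back into the GNC loop as a guidance reference for the final approach and landing.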
Type
Video
Presentation
Description
TechCon2017 (CRUSER)
Presented by ENS Tyler McCarthy, USN: NPS Systems Engineering
Includes slides
Department
Systems Engineering (SE)
Sponsors
NPS CRUSER
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.