Research Objective
To develop an autonomous navigation method for civil aircraft precision approach and landing in GPS-denied and low visibility conditions, integrating inertial data, FLIR images, and runway geographic information to achieve accuracy comparable to INS/DGPS integration.
Research Results
The proposed method achieves motion estimation accuracy comparable to INS/DGPS integration, with improved robustness and performance in low visibility and GPS-denied environments. It is suitable for civil aircraft precision landing, but future work should focus on reducing the computational load with a derivative UKF.
Research Limitations
The SR-UKF may incur redundant computation because parts of the process model are linear, potentially limiting real-time performance. The method also relies on accurate runway geographic information and was not evaluated under extreme weather conditions.
1. Experimental Design and Method Selection:
The method uses a visual-inertial fusion framework based on SR-UKF to integrate inertial measurements, vision observations from FLIR images, and runway geographic data. It includes improved runway detection algorithms and feature extraction.
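To illustrate the filtering machinery behind the fusion framework, the following is a minimal one-dimensional sketch of the unscented transform used by UKF-family filters. It is an illustrative assumption, not the paper's implementation: the actual SR-UKF propagates Cholesky (square-root) factors of the covariance for numerical stability, and its state and measurement models fuse inertial data with runway feature observations; the process model `f`, measurement model `h`, and noise values here are placeholders.

```python
import math

# Scalar unscented-transform sketch (illustrative only). The paper's SR-UKF
# propagates square-root covariance factors and uses its own INS/vision models.
ALPHA, BETA, KAPPA, N = 1e-3, 2.0, 0.0, 1
LAM = ALPHA**2 * (N + KAPPA) - N

def sigma_points(mean, var):
    # 2N+1 sigma points; for N=1 that is the mean plus one symmetric pair.
    s = math.sqrt((N + LAM) * var)
    return [mean, mean + s, mean - s]

def weights():
    wm = [LAM / (N + LAM)] + [1.0 / (2 * (N + LAM))] * 2
    wc = [wm[0] + 1 - ALPHA**2 + BETA] + wm[1:]
    return wm, wc

def ukf_step(mean, var, f, h, q, r, z):
    """One predict/update cycle: f = process model, h = measurement model,
    q/r = process/measurement noise variances, z = measurement."""
    wm, wc = weights()
    # Predict: push sigma points through the process model.
    xs = [f(x) for x in sigma_points(mean, var)]
    m_pred = sum(w * x for w, x in zip(wm, xs))
    v_pred = sum(w * (x - m_pred) ** 2 for w, x in zip(wc, xs)) + q
    # Update: regenerate sigma points, map them through the measurement model.
    pts = sigma_points(m_pred, v_pred)
    zs = [h(x) for x in pts]
    z_pred = sum(w * zz for w, zz in zip(wm, zs))
    s = sum(w * (zz - z_pred) ** 2 for w, zz in zip(wc, zs)) + r
    c = sum(w * (x - m_pred) * (zz - z_pred) for w, x, zz in zip(wc, pts, zs))
    k = c / s  # Kalman gain
    return m_pred + k * (z - z_pred), v_pred - k * s * k
```

For linear `f` and `h` this reduces exactly to the standard Kalman update, which is why the paper notes that linear parts of the process model make sigma-point propagation partly redundant.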
2. Sample Selection and Data Sources:
Real flight data is gathered from a general aviation aircraft (Y-12F) equipped with sensors, including a SWIR camera, INS, flight parameter recorder, flight video recorder, air data computer, and radio altimeter, under various weather conditions at Pucheng airport in China.
3. List of Experimental Equipment and Materials:
Equipment includes SWIR camera (NIP PHK03M100CSW0), INS (Applanix AV510), flight parameter recorder (AMPEX miniR 700), flight video recorder (VM-4), air data computer (XSC-6E), radio altimeter (Honeywell KRA405b), DGPS ground station (Trimble R5), and embedded computer (Nvidia Jetson TX2).
4. Experimental Procedures and Operational Workflow:
Flight data is recorded and synchronized with time stamps. The embedded computer processes sensor data and airport geographic information, outputting motion states through visual-inertial fusion. Runway detection proceeds in two stages: coarse ROI estimation, followed by fine feature extraction using the EDLines detector and RANSAC fitting.
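The RANSAC fitting stage above can be sketched as follows. This is a generic, assumed implementation for fitting one dominant line to noisy 2-D points; in the paper, the candidate points come from EDLines edge segments inside the runway ROI, and the fitted lines are the runway sidelines and threshold. Function names and parameters here are illustrative.

```python
import random

def fit_line(p, q):
    """Line through two points as (a, b, c) with a*x + b*y + c = 0, unit normal."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    norm = (a * a + b * b) ** 0.5
    return a / norm, b / norm, -(a * x1 + b * y1) / norm

def ransac_line(points, n_iters=200, tol=1.0, seed=0):
    """Fit one dominant line to 2-D points while tolerating outliers.
    Sketch of the RANSAC stage applied after edge detection; the paper
    fits runway features from EDLines segments rather than raw points."""
    rng = random.Random(seed)
    best_line, best_inliers = None, []
    for _ in range(n_iters):
        p, q = rng.sample(points, 2)  # minimal sample: two points define a line
        if p == q:
            continue
        a, b, c = fit_line(p, q)
        # Inliers lie within 'tol' perpendicular distance of the candidate line.
        inliers = [pt for pt in points if abs(a * pt[0] + b * pt[1] + c) < tol]
        if len(inliers) > len(best_inliers):
            best_line, best_inliers = (a, b, c), inliers
    return best_line, best_inliers
```

A final least-squares refit over the inlier set is commonly added after the consensus step; runway sideline pairs can additionally be constrained to intersect near the vanishing point.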
5. Data Analysis Methods:
Motion estimation errors are analyzed using RMS calculations and compared against other methods (e.g., INS/GPS integration, an EPnP-based algorithm). Statistical analysis includes Monte Carlo simulation of the error distribution.
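The RMS and Monte Carlo error analysis can be sketched as below. This is a generic outline under assumed inputs: `estimator` stands in for one run of the navigation pipeline against a known reference (e.g., the DGPS ground truth), and the Gaussian noise level in the usage example is hypothetical, not a result from the paper.

```python
import math
import random

def rms(errors):
    """Root-mean-square of a list of scalar estimation errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def monte_carlo_rms(estimator, truth, n_runs=1000, seed=0):
    """Repeat a noisy estimation n_runs times and summarize the error spread.
    Sketch of a Monte Carlo error analysis: 'estimator' stands in for one
    pipeline run, 'truth' for the reference (e.g., DGPS) value."""
    rng = random.Random(seed)
    errors = [estimator(rng) - truth for _ in range(n_runs)]
    mean_err = sum(errors) / len(errors)
    return mean_err, rms(errors)

# Hypothetical example: an unbiased position estimate with 0.5 m noise.
mean_err, rms_err = monte_carlo_rms(lambda rng: 100.0 + rng.gauss(0.0, 0.5), 100.0)
```

For an unbiased estimator the RMS error converges to the noise standard deviation as the number of runs grows, which is why RMS is the headline accuracy figure when comparing against INS/DGPS integration.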
- Embedded computer (Nvidia Jetson TX2): processes sensor data and runs the visual-inertial fusion algorithm for motion estimation.
- SWIR camera (NIP PHK03M100CSW0): captures forward-looking infrared (FLIR) images for runway detection and feature extraction in low visibility conditions.
- INS (Applanix AV510): provides inertial measurements (accelerations and rotational velocities) for motion estimation and integration with vision data.
- Flight parameter recorder (AMPEX miniR 700): records flight parameters and synchronizes data with time stamps.
- Flight video recorder (VM-4): records flight video data.
- Air data computer (XSC-6E): measures air pressure altitude for navigation.
- Radio altimeter (Honeywell KRA405b): measures altitude using radio waves for landing navigation.
- DGPS ground station (Trimble R5): provides differential GPS data as ground truth for the integrated navigation solution.