Multimodal Sensing and Data Fusion Power Next-Generation Intelligent Unmanned Aerial Vehicles
2026-02-23
Advanced perception is rapidly emerging as the cornerstone of UAV autonomy, with multimodal perception and data fusion serving as foundational enabling technologies. The inherent limits of single sensors in unstructured and dynamic airspaces have positioned deep vision–LiDAR integration as a leading solution.
Baya et al. leveraged convolutional neural networks with LiDAR data to significantly enhance dynamic obstacle identification and tracking, maintaining robust flight safety in highly dynamic environments (Fig. 3(a)). Ullah et al. advanced vision–LiDAR fusion architectures, shifting emphasis from dynamic obstacle avoidance to cross-environment adaptability, addressing the varied demands of complex operational scenarios.
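The core of such vision–LiDAR fusion is associating 3-D LiDAR returns with 2-D camera detections. The following is a minimal sketch, not the authors' method: it assumes LiDAR points already transformed into the camera frame and a hypothetical pinhole intrinsic matrix `K`, projects the points into the image, and attaches a median depth to each detected bounding box.

```python
import numpy as np

def project_points(points_cam, K):
    """Project 3-D points (already in the camera frame) onto the image plane."""
    pts = points_cam[points_cam[:, 2] > 0]       # keep points in front of the camera
    uv = (K @ pts.T).T                           # homogeneous pixel coordinates
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective divide
    return uv, pts[:, 2]                         # pixel positions and their depths

def depth_for_boxes(points_cam, K, boxes):
    """Attach a median LiDAR depth to each 2-D detection box (x1, y1, x2, y2)."""
    uv, z = project_points(points_cam, K)
    depths = []
    for x1, y1, x2, y2 in boxes:
        inside = ((uv[:, 0] >= x1) & (uv[:, 0] <= x2) &
                  (uv[:, 1] >= y1) & (uv[:, 1] <= y2))
        depths.append(float(np.median(z[inside])) if inside.any() else None)
    return depths

# Hypothetical intrinsics, points, and one detection box, for illustration only.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
points = np.array([[0.0, 0.0, 10.0],   # projects near the image center
                   [0.1, 0.0, 10.0],
                   [5.0, 5.0,  2.0]])  # projects far outside the box
boxes = [(300, 220, 340, 260)]
print(depth_for_boxes(points, K, boxes))  # → [10.0]
```

Real pipelines replace the box-level median with learned association, but the projection step and extrinsic/intrinsic calibration it relies on are common to most fusion architectures.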
Sensing lets drones observe their surroundings; data fusion turns those observations into an accurate, reliable understanding. Xu et al. introduced a multimodal neural fusion framework that refreshes environmental models in real time and refines path planning, delivering exceptional performance in complex terrain (Fig. 3(b)). Jiang et al. proposed a multi-scale infrared–visible fusion algorithm optimized for low-visibility environments. These complementary innovations elevate UAV precision, robustness, and operational versatility across challenging real-world conditions.
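To make the multi-scale idea concrete, here is a two-scale fusion sketch under simple assumptions (it is not Jiang et al.'s algorithm): each registered, single-channel image is split into a low-pass base layer and a detail layer; the base layers are averaged, and at each pixel the stronger detail of the two modalities is kept, so infrared hot spots and visible texture both survive.

```python
import numpy as np

def box_blur(img, k=3):
    """Box filter used as the low-pass step of a two-scale decomposition."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse_two_scale(ir, vis):
    """Fuse registered infrared and visible images:
    average the base layers, keep the stronger per-pixel detail."""
    ir_base, vis_base = box_blur(ir), box_blur(vis)
    ir_det, vis_det = ir - ir_base, vis - vis_base
    base = 0.5 * (ir_base + vis_base)
    detail = np.where(np.abs(ir_det) >= np.abs(vis_det), ir_det, vis_det)
    return base + detail

# Tiny synthetic example: a hot spot in IR, texture in the visible image.
ir = np.zeros((8, 8)); ir[3:5, 3:5] = 1.0
vis = np.tile([0.2, 0.4], (8, 4))
fused = fuse_two_scale(ir, vis)
print(fused.shape)  # → (8, 8)
```

Published methods typically use full Laplacian pyramids or learned decompositions with more levels, but the decompose–fuse–reconstruct structure is the same.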