Modern drone navigation systems must function reliably in environments that challenge individual sensor capabilities. Inertial measurement units drift over time with errors accumulating to 1-2% of distance traveled, while GPS signals attenuate by 20-30 dB when passing through building materials. Visual sensors struggle with frame rates and dynamic range limitations in changing light conditions, and lidar performance degrades significantly in precipitation, with range reduction of up to 60% in moderate rainfall.

The fundamental challenge lies in balancing sensor complementarity, computational efficiency, and robustness against environmental uncertainties while maintaining real-time navigation performance.
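To make the trade-off concrete, the minimal sketch below (illustrative Python, not drawn from any patent in this report) shows the classic complementary-filter compromise: a smooth but drifting dead-reckoning signal is trusted at short timescales, while a noisy absolute reference slowly bleeds off the accumulated bias. The blend constant `alpha` is an assumed tuning value.

```python
def complementary_filter(drifting, absolute, alpha=0.98):
    """Fuse a smooth-but-drifting signal with a noisy absolute reference.

    alpha near 1 trusts the drifting signal's increments short-term,
    while the absolute reference slowly corrects accumulated bias.
    """
    fused = absolute[0]
    prev = drifting[0]
    out = []
    for d, a in zip(drifting, absolute):
        # High-pass the drifting signal's increment, low-pass the reference.
        fused = alpha * (fused + (d - prev)) + (1 - alpha) * a
        prev = d
        out.append(fused)
    return out

# Dead reckoning accumulates 0.01 m of bias per step; the reference is
# unbiased but noisy. The fused track stays near the truth.
true = [0.1 * i for i in range(500)]
drifting = [t + 0.01 * i for i, t in enumerate(true)]
absolute = [t + (0.3 if i % 2 else -0.3) for i, t in enumerate(true)]
fused = complementary_filter(drifting, absolute)
```

With `alpha = 0.98`, the drift that would grow without bound in the dead-reckoned signal is held to a roughly constant residual, at the cost of admitting a small fraction of the reference noise.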

This page brings together solutions from recent research—including hierarchical multi-sensor architectures with federated path planning, physics-based thrust-drag estimation for wind compensation, dynamic mode-switching self-position estimation methods, and nonlinear state estimation techniques for optical-motion sensor integration. These and other approaches demonstrate practical implementations that maintain navigation integrity even when individual sensor modalities become compromised.

1. Dual-Redundancy Drone Flight Control System with Parallel Sensor and Actuator Integration

CASIC SIMULATION TECHNOLOGY CO LTD, 2025

A redundant flight control and navigation system for drones that ensures high safety and reliability through dual-redundancy architecture. The system comprises multiple sensors, a data fusion and filtering unit, a sensor information determination system, a flight control computer, and a control information determination system, all of which operate in parallel to provide fault-tolerant operation. The system can also interface with multiple actuators to execute control decisions.

Patent drawing: WO2025060691A1

2. Autonomous Drone with Integrated Sensor Fusion for 3D Mapping and Path Planning

KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY, 2025

Autonomous drone system for exploration and reconnaissance in unknown environments. The drone acquires data from cameras, lidar, and an IMU to estimate its pose, recognize targets, and generate a 3D map. It plans safe paths using ray-casting and sensor fusion, then follows those paths autonomously, allowing it to explore unknown areas, accurately locate targets, avoid obstacles, and return home.

Patent drawing: US2025036138A1
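A minimal sketch of the ray-casting idea on a 2D occupancy grid, checking that a candidate path segment stays in free space. The grid encoding (0 = free, 1 = occupied) and the fixed sampling density are illustrative assumptions, not the patented method:

```python
def ray_is_clear(grid, start, end, steps=100):
    """Sample points along the segment start -> end and report whether
    every sampled cell in the occupancy grid is free (0)."""
    (x0, y0), (x1, y1) = start, end
    for i in range(steps + 1):
        t = i / steps
        x = int(round(x0 + t * (x1 - x0)))
        y = int(round(y0 + t * (y1 - y0)))
        if grid[y][x]:          # occupied cell on the ray
            return False
    return True

# A 10x10 grid with a wall at x = 5, leaving a gap in the bottom row.
grid = [[0] * 10 for _ in range(10)]
for y in range(9):
    grid[y][5] = 1
```

A planner would cast such rays from candidate waypoints and keep only segments that clear all obstacles, then fuse the result with other sensor-derived constraints.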

3. UAV Altitude and Posture Control System Utilizing Secondary Barometric Compensation for Wind-Induced Pressure Variations

INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, 2024

Controlling a UAV's flight altitude and posture in environments where satellite signals are poor or blocked, like flying under bridges, by using a secondary barometer to compensate for wind-induced air pressure changes. The UAV's onboard barometer provides the initial air pressure reading. This is synchronized with an external reference barometer's reading and recalculated to account for any wind effects. The compensated air pressure is then fused with other sensor data to determine the target altitude and posture. This allows accurate flight control even when satellite signals are unavailable.
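The compensation step might be sketched as follows, using the standard ISA barometric formula; the synchronization scheme (subtracting the pressure shift observed at a fixed reference barometer) is a simplified reading of the abstract, and the constants are standard-atmosphere values:

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """International barometric formula (troposphere approximation)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def compensated_altitude(p_onboard, p_ref_now, p_ref_sync, p0=1013.25):
    """Remove the wind-induced pressure change seen at a fixed reference
    barometer from the onboard reading before converting to altitude."""
    wind_delta = p_ref_now - p_ref_sync   # pressure shift not due to altitude
    return pressure_to_altitude(p_onboard - wind_delta, p0)
```

A gust that raises pressure at both barometers by the same amount then cancels out of the altitude estimate instead of appearing as a spurious descent.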

4. Hierarchical Multi-Drone and Sensor Platform with Federated Path Planning and Information Lateralization

SOTER TECHNOLOGY INC, 2024

Managing paths for unmanned vehicles using a hierarchical multi-drone/sensor platform with information lateralization and federated path planning. The platform involves multiple drones, ground robots, and sensors with complementary functions acquiring heterogeneous information at different resolutions. This information is integrated and fed back to adjust planned paths for missions. The platform is modeled after brain lateralization: drones and sensors take on specialized roles, much as the hemispheres of the human brain do.

Patent drawing: US12007791B2

5. Calibration Method for Camera and Lidar Sensors Using Patterned and Reflective Calibration Board

SICK AG, 2024

A method for calibrating a camera and lidar sensor using a calibration board with known patterns and reflection areas. The method determines the intrinsic camera parameters and then uses the lidar sensor's projected laser lines to detect the calibration board's pose. The calibration board's known patterns and reflection areas enable simultaneous calibration of both sensors without requiring prior camera calibration.

6. Autonomous UAV Control System with Machine Learning-Based Pixel-Level Image Labeling for Landing Space Detection

WING AVIATION LLC, 2024

An autonomous control system for unmanned aerial vehicles (UAVs) that uses machine learning to detect landing spaces from camera images. The system generates pixel-level labels for the images, identifying landing spaces, occupied spaces, and non-landing spaces, and uses these labels to determine the UAV's relative position and control its movements.

7. Dynamic Mode-Switching Self-Position Estimation Method for Autonomous Mobile Objects

SONY CORP, 2024

An autonomous mobile object's self-position estimation method is dynamically switched between two modes based on the object's state. When the object is moving, it uses a successive estimation method that integrates internal sensor data. When the object is stopped, it switches to a discrete estimation method that uses external sensor data, enabling more accurate positioning in environments where GPS is unreliable.

Patent drawing: US11926038B2
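In skeleton form (illustrative Python; the 1-D state and update rule are assumptions, not Sony's implementation), the switching logic reduces to:

```python
class ModeSwitchingEstimator:
    """Dead-reckon from internal (odometry) data while moving; when
    stopped, snap to a discrete fix from an external sensor."""

    def __init__(self, x=0.0):
        self.x = x

    def update(self, moving, odom_delta=0.0, external_fix=None):
        if moving:
            self.x += odom_delta          # successive estimation mode
        elif external_fix is not None:
            self.x = external_fix         # discrete estimation at rest
        return self.x

est = ModeSwitchingEstimator()
for _ in range(5):
    est.update(True, odom_delta=1.0)      # integrate while moving
est.update(False, external_fix=4.7)       # correct drift when stopped
```

The appeal of switching at rest is that the external measurement can be averaged over time without motion blur or latency concerns, giving a clean anchor for the next dead-reckoning leg.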

8. Sensor Fusion System Integrating Visible, SWIR, and LWIR Cameras with Automatic Modality Switching for Enhanced Object Detection

GM GLOBAL TECHNOLOGY OPERATIONS LLC, 2024

A sensor fusion system for vehicle control applications that integrates visible light cameras with shortwave infrared (SWIR) and longwave infrared (LWIR) cameras to enhance object detection and recognition capabilities in adverse weather conditions. The system determines environmental conditions, such as low light or glare, and automatically switches to alternative sensor modalities to maintain reliable object detection and tracking. The fused images from multiple sensors are processed to generate a comprehensive understanding of the vehicle's surroundings, enabling proactive hazard detection and prediction.
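A toy version of the switching rule, assuming coarse scene statistics (mean brightness and the fraction of glare-saturated pixels) as inputs; the thresholds and the mapping of conditions to modalities are hypothetical:

```python
def select_modality(mean_brightness, glare_fraction,
                    low_light=0.15, glare_limit=0.10):
    """Pick a camera modality from coarse scene statistics.

    Illustrative rule of thumb: LWIR (thermal) works in darkness,
    SWIR tolerates glare and haze better than visible light.
    """
    if mean_brightness < low_light:
        return "LWIR"
    if glare_fraction > glare_limit:
        return "SWIR"
    return "visible"
```

A real system would fuse the selected streams rather than switch exclusively, but the gating logic above captures the automatic fallback the abstract describes.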

9. Drone Navigation System with Onboard Sensor Integration and Physics-Based Thrust-Drag Estimation for Wind Compensation

THE REGENTS OF THE UNIVERSITY OF MICHIGAN, 2024

Autonomous drone navigation control for stable flight in extreme wind conditions using onboard sensors and physics-based modeling. The drone estimates thrust and drag forces from rotor geometry, measures wind, and uses feedforward flight control to compensate for it. It also characterizes the relationship between rotor RPM and throttle input. This allows accurate flight-path following in severe winds without relying on complex aerodynamic models.
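One way to sketch the feedforward idea: with a quadratic body-drag model, the steady-state pitch angle that cancels wind drag follows from a simple force balance. The drag model, the `cd_area` product, and the numbers below are illustrative assumptions, not the patent's rotor-geometry-based estimates:

```python
import math

RHO = 1.225  # air density at sea level, kg/m^3

def drag_force(v_air, cd_area):
    """Quadratic body drag for relative airspeed v_air (m/s)."""
    return 0.5 * RHO * cd_area * v_air * abs(v_air)

def feedforward_tilt(wind_speed, mass, cd_area, g=9.81):
    """Pitch angle (rad) whose horizontal thrust component cancels the
    steady wind drag while the vertical component holds the weight."""
    drag = drag_force(wind_speed, cd_area)
    return math.atan2(drag, mass * g)
```

Feeding this angle forward means the feedback loop only has to reject gusts around the mean wind, rather than fight the entire disturbance reactively.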

10. Navigation System Integrating Optical and Motion Sensor Data Using Nonlinear State Estimation Technique

Dylan Krupity, 2024

A navigation system for autonomous vehicles that integrates optical sensor data with motion sensor data to provide accurate positioning in environments where traditional GNSS systems are unreliable. The system uses a nonlinear state estimation technique that incorporates a measurement model for optical samples, enabling direct integration of optical data with motion sensor data to generate a precise navigation solution. The system can operate in environments with limited GNSS visibility, such as dense urban areas, and can also be used in pedestrian navigation applications.

Patent drawing: US11875519B2
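Reduced to one dimension and a linear model (the patent describes a nonlinear estimator; this is only the skeleton of the idea), direct integration of optical samples with motion-sensor propagation becomes a predict/update loop:

```python
class KalmanFusion1D:
    """1-D Kalman filter: propagate position with a motion-sensor
    velocity, correct it with direct optical position samples."""

    def __init__(self, x=0.0, p=1.0, q=0.01, r=0.25):
        self.x, self.p = x, p        # state estimate and its variance
        self.q, self.r = q, r        # process and measurement noise

    def predict(self, velocity, dt):
        self.x += velocity * dt
        self.p += self.q

    def update(self, z):
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Motion sensor reports a 10%-biased velocity; optical samples are
# unbiased but noisy. The filter tracks the true position.
kf = KalmanFusion1D()
for i in range(100):
    kf.predict(1.1, 0.1)                    # biased dead reckoning
    truth = (i + 1) * 0.1
    kf.update(truth + 0.1 * (-1) ** i)      # optical position sample
```

The measurement model for optical samples is what lets them enter the update step directly, rather than being pre-processed into a separate position solution.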

11. Multi-Sensor Calibration System Utilizing Third Sensor for 3D Measurement of Calibration Objects

SONY GROUP CORP, 2023

An information processing device, method, and system for accurate calibration of multiple sensors, including cameras and lidars, regardless of sensor type or range. The system uses a third sensor to measure 3D information of calibration objects placed within the measurement ranges of the first and second sensors, and calculates the relative positions and orientations of the sensors based on the third sensor data and sensor data from the first and second sensors.

Patent drawing: WO2023243374A1

12. Pose Estimation System Utilizing Integrated Inertial, Kinematic, and Odometry Sensors with Noise Adjustment

VOLVO CAR CORP, 2023

A lightweight pose estimation system for autonomous vehicles that determines vehicle position and orientation using a combination of inertial, kinematic, and odometry sensors. The system generates a pose value by integrating sensor readings and can adjust measurements based on observed noise. It enables efficient and real-time pose estimation for autonomous maneuvers, particularly in emergency braking applications.

Patent drawing: EP4293318A1
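The noise-adjustment idea can be sketched as innovation-based adaptation: estimate the measurement-noise variance from a recent window of residuals (measurement minus prediction). This is a common adaptive-filtering trick, assumed here as an illustration rather than taken from the patent:

```python
def adapt_measurement_noise(innovations, r_min=1e-4):
    """Estimate measurement-noise variance R from a window of recent
    innovations (measurement minus prediction)."""
    n = len(innovations)
    mean = sum(innovations) / n
    var = sum((v - mean) ** 2 for v in innovations) / max(n - 1, 1)
    return max(var, r_min)   # keep R positive even for a quiet window
```

A pose estimator would recompute R periodically, down-weighting a sensor whose residuals have grown noisy, which is what keeps the fused pose usable during aggressive maneuvers like emergency braking.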

13. Environment Sensor Calibration System Utilizing Dual Marking Element Pose Analysis

VALEO SCHALTER UND SENSOREN GMBH, 2023

Calibration of an environment sensor system of an infrastructure device, such as a camera system or lidar system, for autonomous vehicle navigation. The system identifies two independent marking elements in the sensor's field of view, determines their relative pose, and compares it to a predetermined target value. Based on the comparison result, the system corrects the sensor's pose information in a reference coordinate system.

14. Navigation and Positioning System Utilizing Inertial, Binocular, and Radar Data with Nonlinear Graph Optimization for Drones

SHENZHEN UNIVERSITY, 2023

A high-reliability and high-precision navigation and positioning method for drones in GPS-denied environments. The method combines inertial sensor data with binocular camera images and ranging radar measurements to achieve accurate pose estimation. A nonlinear graph optimization algorithm based on a sliding window is used to fuse the sensor data and obtain high-precision pose estimates. The system also includes a loop closure detection module that enables four-degree-of-freedom pose graph optimization when the drone revisits a previously mapped location. The optimized pose data is then packaged into a pseudo-GPS signal that is input to the drone's flight controller for positioning and route planning.

15. Position Determination Method for Unmanned Aerial Systems Using Radar Node Network with Slant Distance and Elevation Angle Measurements

FIRST RF CORP, 2023

A method for determining the position of an unmanned aerial system (UAS) using a network of radar nodes. Each node measures the slant distance and elevation angle to the UAS, and transmits its position and azimuthal bounds to neighboring nodes. By combining these measurements, the nodes can solve for the UAS position without explicit measurement of its azimuthal position, providing a unique solution without ambiguity.

Patent drawing: US11709251B1
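The geometry is straightforward to sketch: each slant distance and elevation angle yields a horizontal range and an altitude, and two nodes' horizontal-range circles intersect in at most two points, which the transmitted azimuthal bounds would then disambiguate. The node layout below is an assumed example:

```python
import math

def horizontal_range_and_alt(slant, elev_rad):
    """Resolve a slant measurement into ground range and altitude."""
    return slant * math.cos(elev_rad), slant * math.sin(elev_rad)

def intersect_circles(p1, r1, p2, r2):
    """Both intersection points of two horizontal-range circles;
    a node's azimuthal bounds would select the physical one."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1**2 - r2**2 + d**2) / (2 * d)      # along-axis offset
    h = math.sqrt(max(r1**2 - a**2, 0.0))     # off-axis offset
    xm, ym = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d    # unit normal to the axis
    return (xm + h * ux, ym + h * uy), (xm - h * ux, ym - h * uy)
```

Note that no node ever measures azimuth directly; the UAS bearing falls out of combining ranges across the network.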

16. Visual Positioning Method Utilizing Sensor Data and Coarse Map Information for Mobile Device Position Estimation

QUALCOMM INC, 2023

A method for determining a position estimate of a mobile device using visual positioning. The method includes obtaining sensor information, detecting identifiable features in the sensor information, determining a range to the features, obtaining coarse map information, and determining the position estimate based on the range and map information. The method can utilize various sensors, including cameras, lidar, and radar, and can leverage coarse map data from remote servers.

17. Autonomous UAV Navigation System Utilizing GNSS, IMU Data, and Quadcopter Aerodynamics

SKYDIO INC, 2023

Autonomous navigation system for unmanned aerial vehicles (UAVs) that enables reliable flight in environments where traditional navigation sensors are unreliable. The system uses a combination of GNSS location signals, inertial measurement unit (IMU) data from accelerometers and gyroscopes, and quadcopter aerodynamics to determine the UAV's position, velocity, and orientation. By leveraging the unique characteristics of quadcopters, the system can maintain stable flight and navigate through challenging environments without relying on cameras, compasses, or magnetometers.

Patent drawing: US2023204797A1

18. Radar-to-Lidar Calibration Method Using Point Cloud Registration and Entropy Minimization

GM CRUISE HOLDINGS LLC, 2023

A method for radar-to-lidar calibration in autonomous vehicles that eliminates the need for specialized calibration targets. The method uses point cloud registration and entropy minimization to align radar and lidar point clouds gathered from different vehicle poses, enabling calibration in unstructured environments. The process aggregates radar and lidar point clouds using vehicle odometry and SLAM data, and then minimizes entropy over multiple drive segments to achieve accurate calibration.

Patent drawing: EP4180834A1

19. System for Automated Extrinsic Calibration of Vehicle and Robot Sensors with Turntable, Calibration Target, and Distributed Imaging Components

KINETIC AUTOMATION INC, 2023

Automated extrinsic calibration system for lidars, cameras, radars, and ultrasonic sensors on vehicles and robots, comprising a turntable system, calibration target system, and distributed imaging systems. The system enables precise calibration of sensor systems through automated scanning, target configuration, and image capture, eliminating the need for manual measurement and technician expertise.

Patent drawing: WO2023081870A1

20. Object Ranging Apparatus with Dual Estimation and Adaptive Result Combination

NEC CORP, 2023

An object ranging apparatus for improving accuracy and stability of object ranging in autonomous vehicles, comprising: an object recognition unit; a first distance estimation unit using depth estimation; a second distance estimation unit using motion parallax; and a combining unit that combines the results of the two estimation methods based on factors such as steering wheel angle and acceleration.

Patent drawing: US2023075659A1
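A toy version of the combining rule, with hypothetical steering and acceleration thresholds (motion parallax degrades when ego-motion is rotational or abrupt, so weight shifts toward the depth-estimation result in those regimes):

```python
def combine_ranges(d_depth, d_parallax, steer_angle, accel,
                   steer_limit=0.35, accel_limit=2.0):
    """Blend two range estimates to the same object.

    d_depth:    range from learned monocular depth estimation (m)
    d_parallax: range from motion parallax across frames (m)
    Weight on the parallax estimate drops to zero as steering angle
    (rad) or acceleration (m/s^2) approaches the assumed limits.
    """
    penalty = min(abs(steer_angle) / steer_limit, 1.0)
    penalty = max(penalty, min(abs(accel) / accel_limit, 1.0))
    w_parallax = 0.5 * (1.0 - penalty)
    return w_parallax * d_parallax + (1.0 - w_parallax) * d_depth
```

On a straight, steady run the two estimates average; in a hard turn the output falls back entirely on depth estimation, which is the stability property the abstract targets.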

21. Vision-Guided LIDAR System for Three-Dimensional Localization of Moving Platforms

22. Navigation Solution Validation System with Dynamic Sensor Integrity Monitoring and Faulty Data Rejection

23. Hierarchical Multi-Modal Sensing System with Adaptive Resolution for Environmental Mapping

24. Drone Interface Device for Real-Time Data Processing and Frequency Conversion Between Onboard Computer and Flight Controller

25. Sensor Configuration Method for Autonomous Vehicles Using Comparative Analysis of Target and Capability Specification Maps
