Modern drone navigation systems must function reliably in environments that challenge individual sensor capabilities. Inertial measurement units drift over time, with errors accumulating to 1-2% of distance traveled; GPS signals attenuate by 20-30 dB when passing through building materials; visual sensors are constrained by frame-rate and dynamic-range limitations under changing light; and lidar performance degrades significantly in precipitation, with range reductions of up to 60% in moderate rainfall.

The fundamental challenge lies in balancing sensor complementarity, computational efficiency, and robustness against environmental uncertainties while maintaining real-time navigation performance.

This page brings together solutions from recent research—including hierarchical multi-sensor architectures with federated path planning, physics-based thrust-drag estimation for wind compensation, dynamic mode-switching self-position estimation methods, and nonlinear state estimation techniques for optical-motion sensor integration. These and other approaches demonstrate practical implementations that maintain navigation integrity even when individual sensor modalities become compromised.

1. Dual-Redundancy Drone Flight Control System with Parallel Sensor and Actuator Integration

CASIC SIMULATION TECHNOLOGY CO LTD, 2025

A redundant flight control and navigation system for drones that ensures high safety and reliability through dual-redundancy architecture. The system comprises multiple sensors, a data fusion and filtering unit, a sensor information determination system, a flight control computer, and a control information determination system, all of which operate in parallel to provide fault-tolerant operation. The system can also interface with multiple actuators to execute control decisions.

WO2025060691A1-patent-drawing

2. Autonomous Drone with Integrated Sensor Fusion for 3D Mapping and Path Planning

KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY, 2025

Autonomous drone system for exploration and reconnaissance in unknown environments. The drone acquires data from cameras, lidar, and an IMU to estimate its pose, recognize targets, and generate a 3D map, then plans safe paths using ray-casting and sensor fusion. Following those paths autonomously, it can explore unknown areas, accurately locate targets, avoid obstacles, and return home.

US2025036138A1-patent-drawing

3. UAV Altitude and Posture Control System Utilizing Secondary Barometric Compensation for Wind-Induced Pressure Variations

INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, 2024

Controlling a UAV's flight altitude and posture in environments where satellite signals are poor or blocked, like flying under bridges, by using a secondary barometer to compensate for wind-induced air pressure changes. The UAV's onboard barometer provides the initial air pressure reading. This is synchronized with an external reference barometer's reading and recalculated to account for any wind effects. The compensated air pressure is then fused with other sensor data to determine the target altitude and posture. This allows accurate flight control even when satellite signals are unavailable.
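
A minimal sketch of this kind of secondary-barometer compensation, assuming a simple wind-error model and the standard barometric altitude formula (function names, the calm-air reference reading, and all constants are illustrative, not taken from the patent):

```python
import math

def compensated_altitude(p_onboard_pa, p_reference_pa, p_reference_calm_pa,
                         sea_level_pa=101325.0, temperature_k=288.15):
    """Hypothetical secondary-barometer compensation: the external reference
    barometer's deviation from its calm-air reading approximates the
    wind-induced pressure error, which is removed from the onboard reading
    before converting pressure to altitude."""
    wind_error_pa = p_reference_pa - p_reference_calm_pa
    p_corrected = p_onboard_pa - wind_error_pa

    # Standard barometric formula for the troposphere (6.5 K/km lapse rate).
    L, R, g, M = 0.0065, 8.31446, 9.80665, 0.0289644
    return (temperature_k / L) * (1.0 - (p_corrected / sea_level_pa) ** (R * L / (g * M)))

# Onboard reads 100800 Pa while the reference reads 25 Pa above its calm value.
print(round(compensated_altitude(100800.0, 101350.0, 101325.0), 1), "m")
```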

4. Hierarchical Multi-Drone and Sensor Platform with Federated Path Planning and Information Lateralization

SOTER TECHNOLOGY INC, 2024

Facilitating managing of paths for unmanned vehicles using a hierarchical multi-drone/sensor platform with information lateralization and federated path planning. The platform involves multiple drones, ground robots, and sensors with complementary functions acquiring heterogeneous information at different resolutions. This information is integrated and fed back to adjust planned paths for missions. The platform is modeled after brain lateralization, where drones/sensors have specialized roles like human hemispheres.

US12007791B2-patent-drawing

5. Calibration Method for Camera and Lidar Sensors Using Patterned and Reflective Calibration Board

SICK AG, 2024

A method for calibrating a camera and lidar sensor using a calibration board with known patterns and reflection areas. The method determines the intrinsic camera parameters and then uses the lidar sensor's projected laser lines to detect the calibration board's pose. The calibration board's known patterns and reflection areas enable simultaneous calibration of both sensors without requiring prior camera calibration.

6. Autonomous UAV Control System with Machine Learning-Based Pixel-Level Image Labeling for Landing Space Detection

WING AVIATION LLC, 2024

An autonomous control system for unmanned aerial vehicles (UAVs) that uses machine learning to detect landing spaces from camera images. The system generates pixel-level labels for the images, identifying landing spaces, occupied spaces, and non-landing spaces, and uses these labels to determine the UAV's relative position and control its movements.

7. Dynamic Mode-Switching Self-Position Estimation Method for Autonomous Mobile Objects

SONY CORP, 2024

An autonomous mobile object's self-position estimation method is dynamically switched between two modes based on the object's state. When the object is moving, it uses a successive estimation method that integrates internal sensor data. When the object is stopped, it switches to a discrete estimation method that uses external sensor data, enabling more accurate positioning in environments where GPS is unreliable.

US11926038B2-patent-drawing
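
The switching logic can be illustrated with a small sketch; the speed threshold, sensor names, and class interface below are assumptions for illustration rather than details from the patent:

```python
import numpy as np

class ModeSwitchingLocalizer:
    """While moving, dead-reckon by integrating internal (IMU/odometry)
    velocity; when stopped, adopt a discrete fix from external sensors
    (e.g. a visual landmark) to bound accumulated drift."""

    def __init__(self, initial_position):
        self.position = np.asarray(initial_position, dtype=float)

    def update(self, velocity_body, dt, external_fix=None, speed_stop_threshold=0.05):
        speed = float(np.linalg.norm(velocity_body))
        if speed > speed_stop_threshold or external_fix is None:
            # Successive estimation: integrate internal sensor data while moving.
            self.position += np.asarray(velocity_body, dtype=float) * dt
            mode = "successive (internal sensors)"
        else:
            # Discrete estimation: adopt the external observation while stopped.
            self.position = np.asarray(external_fix, dtype=float)
            mode = "discrete (external sensors)"
        return self.position.copy(), mode

loc = ModeSwitchingLocalizer([0.0, 0.0])
print(loc.update([1.0, 0.0], dt=0.1))                              # moving: integrate
print(loc.update([0.0, 0.0], dt=0.1, external_fix=[0.12, 0.01]))   # stopped: external fix
```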

8. Sensor Fusion System Integrating Visible, SWIR, and LWIR Cameras with Automatic Modality Switching for Enhanced Object Detection

GM GLOBAL TECHNOLOGY OPERATIONS LLC, 2024

A sensor fusion system for vehicle control applications that integrates visible light cameras with shortwave infrared (SWIR) and longwave infrared (LWIR) cameras to enhance object detection and recognition capabilities in adverse weather conditions. The system determines environmental conditions, such as low light or glare, and automatically switches to alternative sensor modalities to maintain reliable object detection and tracking. The fused images from multiple sensors are processed to generate a comprehensive understanding of the vehicle's surroundings, enabling proactive hazard detection and prediction.
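
A hedged sketch of this style of automatic modality switching; the lux values and condition thresholds are invented placeholders, and a real system would derive the conditions from image statistics rather than take them as arguments:

```python
def select_modalities(ambient_lux, glare_detected, fog_or_smoke, precipitation):
    """Illustrative switching policy: start from the visible camera and fall
    back to SWIR/LWIR when scene conditions degrade visible imagery."""
    active = []
    if ambient_lux > 50 and not glare_detected:
        active.append("visible")
    if glare_detected or fog_or_smoke:
        active.append("SWIR")          # tolerates glare, penetrates haze better
    if ambient_lux <= 50 or precipitation:
        active.append("LWIR")          # thermal contrast at night / in rain
    return active or ["LWIR"]          # never return an empty sensor set

print(select_modalities(ambient_lux=5, glare_detected=False,
                        fog_or_smoke=True, precipitation=False))
```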

9. Drone Navigation System with Onboard Sensor Integration and Physics-Based Thrust-Drag Estimation for Wind Compensation

THE REGENTS OF THE UNIVERSITY OF MICHIGAN, 2024

Autonomous drone navigation control for stable flight in extreme wind conditions using onboard sensors and physics-based modeling. The drone estimates thrust and drag forces from rotor geometry, measures wind, and uses feedforward flight control to compensate for it. It also characterizes the relationship between rotor RPM and throttle input. This allows accurate flight-path following in severe winds without relying on complex aerodynamic models.
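
A minimal sketch of physics-based feedforward wind compensation; it uses a generic quadratic drag model rather than the patent's rotor-geometry and RPM-throttle formulation, and all parameter values are assumed:

```python
import math

def feedforward_tilt_for_wind(wind_speed_ms, mass_kg, drag_coeff, ref_area_m2, rho=1.225):
    """Estimate the drag force a steady wind exerts on a hovering drone and the
    tilt/thrust feedforward that cancels it (quadratic drag model)."""
    drag_n = 0.5 * rho * drag_coeff * ref_area_m2 * wind_speed_ms ** 2
    weight_n = mass_kg * 9.80665
    tilt_rad = math.atan2(drag_n, weight_n)          # lean into the wind
    thrust_n = math.hypot(drag_n, weight_n)          # extra thrust to hold altitude
    return math.degrees(tilt_rad), thrust_n

tilt_deg, thrust = feedforward_tilt_for_wind(12.0, mass_kg=1.5, drag_coeff=1.0, ref_area_m2=0.03)
print(f"tilt {tilt_deg:.1f} deg, thrust {thrust:.1f} N")
```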

10. Navigation System Integrating Optical and Motion Sensor Data Using Nonlinear State Estimation Technique

Dylan Krupity, 2024

A navigation system for autonomous vehicles that integrates optical sensor data with motion sensor data to provide accurate positioning in environments where traditional GNSS systems are unreliable. The system uses a nonlinear state estimation technique that incorporates a measurement model for optical samples, enabling direct integration of optical data with motion sensor data to generate a precise navigation solution. The system can operate in environments with limited GNSS visibility, such as dense urban areas, and can also be used in pedestrian navigation applications.

US11875519B2-patent-drawing
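
A toy state estimator in this spirit, assuming the optical sensor yields a velocity sample (e.g. from optical flow); the state layout, noise values, and linear measurement model are illustrative simplifications of the nonlinear technique described:

```python
import numpy as np

class OpticalMotionEKF:
    """Motion-sensor acceleration drives the prediction; an optical velocity
    sample is fused through a measurement update."""

    def __init__(self):
        self.x = np.zeros(4)                       # [px, py, vx, vy]
        self.P = np.eye(4)
        self.Q = np.diag([0.01, 0.01, 0.1, 0.1])   # process noise (assumed)
        self.R = np.diag([0.2, 0.2])               # optical measurement noise (assumed)

    def predict(self, accel, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        self.x = F @ self.x + np.array([0, 0, accel[0] * dt, accel[1] * dt])
        self.P = F @ self.P @ F.T + self.Q

    def update_optical_velocity(self, v_meas):
        H = np.zeros((2, 4)); H[0, 2] = H[1, 3] = 1.0
        y = np.asarray(v_meas) - H @ self.x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

ekf = OpticalMotionEKF()
ekf.predict(accel=[0.5, 0.0], dt=0.1)
ekf.update_optical_velocity([0.04, 0.0])
print(ekf.x.round(3))
```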

11. Multi-Sensor Calibration System Utilizing Third Sensor for 3D Measurement of Calibration Objects

SONY GROUP CORP, 2023

An information processing device, method, and system for accurate calibration of multiple sensors, including cameras and lidars, regardless of sensor type or range. The system uses a third sensor to measure 3D information of calibration objects placed within the measurement ranges of the first and second sensors, and calculates the relative positions and orientations of the sensors based on the third sensor data and sensor data from the first and second sensors.

WO2023243374A1-patent-drawing

12. Pose Estimation System Utilizing Integrated Inertial, Kinematic, and Odometry Sensors with Noise Adjustment

VOLVO CAR CORP, 2023

A lightweight pose estimation system for autonomous vehicles that determines vehicle position and orientation using a combination of inertial, kinematic, and odometry sensors. The system generates a pose value by integrating sensor readings and can adjust measurements based on observed noise. It enables efficient and real-time pose estimation for autonomous maneuvers, particularly in emergency braking applications.

EP4293318A1-patent-drawing

13. Environment Sensor Calibration System Utilizing Dual Marking Element Pose Analysis

VALEO SCHALTER UND SENSOREN GMBH, 2023

Calibration of an environment sensor system of an infrastructure device, such as a camera system or lidar system, for autonomous vehicle navigation. The system identifies two independent marking elements in the sensor's field of view, determines their relative pose, and compares it to a predetermined target value. Based on the comparison result, the system corrects the sensor's pose information in a reference coordinate system.

14. Navigation and Positioning System Utilizing Inertial, Binocular, and Radar Data with Nonlinear Graph Optimization for Drones

SHENZHEN UNIVERSITY, 2023

A high-reliability and high-precision navigation and positioning method for drones in GPS-denied environments. The method combines inertial sensor data with binocular camera images and ranging radar measurements to achieve accurate pose estimation. A nonlinear graph optimization algorithm based on a sliding window is used to fuse the sensor data and obtain high-precision pose estimates. The system also includes a loop closure detection module that enables four-degree-of-freedom pose graph optimization when the drone revisits a previously mapped location. The optimized pose data is then packaged into a pseudo-GPS signal that is input to the drone's flight controller for positioning and route planning.
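
The final packaging step can be sketched with a flat-earth conversion from the fused local pose back to geodetic coordinates; this is an assumed short-range approximation for illustration, not the patent's exact method:

```python
import math

def enu_to_pseudo_gps(east_m, north_m, up_m, lat0_deg, lon0_deg, alt0_m):
    """Convert a fused local ENU pose estimate back to latitude/longitude
    around a known take-off reference, so it can be fed to a flight controller
    that expects GPS-style input (flat-earth approximation)."""
    R_EARTH = 6378137.0                                # WGS-84 semi-major axis
    lat = lat0_deg + math.degrees(north_m / R_EARTH)
    lon = lon0_deg + math.degrees(east_m / (R_EARTH * math.cos(math.radians(lat0_deg))))
    return lat, lon, alt0_m + up_m

print(enu_to_pseudo_gps(120.0, -45.0, 10.0, lat0_deg=22.533, lon0_deg=113.930, alt0_m=5.0))
```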

15. Position Determination Method for Unmanned Aerial Systems Using Radar Node Network with Slant Distance and Elevation Angle Measurements

FIRST RF CORP, 2023

A method for determining the position of an unmanned aerial system (UAS) using a network of radar nodes. Each node measures the slant distance and elevation angle to the UAS, and transmits its position and azimuthal bounds to neighboring nodes. By combining these measurements, the nodes can solve for the UAS position without explicit measurement of its azimuthal position, providing a unique solution without ambiguity.

US11709251B1-patent-drawing
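
A sketch of the underlying two-node geometry: each node's slant range and elevation angle define a horizontal-range circle and an altitude, and intersecting two such circles gives the x-y position without any azimuth measurement. The azimuthal-bounds disambiguation is reduced here to returning both mirror solutions, and all numbers are illustrative:

```python
import math

def uas_position(node1, node2, r1, elev1, r2, elev2):
    """Intersect the horizontal-range circles implied by two radar nodes'
    slant-range and elevation measurements."""
    (x1, y1, z1), (x2, y2, z2) = node1, node2
    d1, d2 = r1 * math.cos(elev1), r2 * math.cos(elev2)          # horizontal ranges
    alt = 0.5 * ((z1 + r1 * math.sin(elev1)) + (z2 + r2 * math.sin(elev2)))
    D = math.hypot(x2 - x1, y2 - y1)
    a = (d1**2 - d2**2 + D**2) / (2 * D)
    h = math.sqrt(max(d1**2 - a**2, 0.0))                        # half-chord length
    mx, my = x1 + a * (x2 - x1) / D, y1 + a * (y2 - y1) / D
    px, py = -(y2 - y1) / D, (x2 - x1) / D                       # unit perpendicular
    # Two mirror-image solutions; the nodes' azimuthal bounds would pick one.
    return [(mx + h * px, my + h * py, alt), (mx - h * px, my - h * py, alt)]

print(uas_position((0, 0, 0), (1000, 0, 0), r1=824.6, elev1=math.radians(14.0),
                   r2=640.3, elev2=math.radians(18.2)))
```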

16. Visual Positioning Method Utilizing Sensor Data and Coarse Map Information for Mobile Device Position Estimation

QUALCOMM INC, 2023

A method for determining a position estimate of a mobile device using visual positioning. The method includes obtaining sensor information, detecting identifiable features in the sensor information, determining a range to the features, obtaining coarse map information, and determining the position estimate based on the range and map information. The method can utilize various sensors, including cameras, lidar, and radar, and can leverage coarse map data from remote servers.

17. Autonomous UAV Navigation System Utilizing GNSS, IMU Data, and Quadcopter Aerodynamics

SKYDIO INC, 2023

Autonomous navigation system for unmanned aerial vehicles (UAVs) that enables reliable flight in environments where traditional navigation sensors are unreliable. The system uses a combination of GNSS location signals, inertial measurement unit (IMU) data from accelerometers and gyroscopes, and quadcopter aerodynamics to determine the UAV's position, velocity, and orientation. By leveraging the unique characteristics of quadcopters, the system can maintain stable flight and navigate through challenging environments without relying on cameras, compasses, or magnetometers.

US2023204797A1-patent-drawing

18. Radar-to-Lidar Calibration Method Using Point Cloud Registration and Entropy Minimization

GM CRUISE HOLDINGS LLC, 2023

A method for radar-to-lidar calibration in autonomous vehicles that eliminates the need for specialized calibration targets. The method uses point cloud registration and entropy minimization to align radar and lidar point clouds gathered from different vehicle poses, enabling calibration in unstructured environments. The process aggregates radar and lidar point clouds using vehicle odometry and SLAM data, and then minimizes entropy over multiple drive segments to achieve accurate calibration.

EP4180834A1-patent-drawing

19. System for Automated Extrinsic Calibration of Vehicle and Robot Sensors with Turntable, Calibration Target, and Distributed Imaging Components

KINETIC AUTOMATION INC, 2023

Automated extrinsic calibration system for lidars, cameras, radars, and ultrasonic sensors on vehicles and robots, comprising a turntable system, calibration target system, and distributed imaging systems. The system enables precise calibration of sensor systems through automated scanning, target configuration, and image capture, eliminating the need for manual measurement and technician expertise.

WO2023081870A1-patent-drawing

20. Object Ranging Apparatus with Dual Estimation and Adaptive Result Combination

NEC CORP, 2023

An object ranging apparatus for improving accuracy and stability of object ranging in autonomous vehicles, comprising: an object recognition unit; a first distance estimation unit using depth estimation; a second distance estimation unit using motion parallax; and a combining unit that combines the results of the two estimation methods based on factors such as steering wheel angle and acceleration.

US2023075659A1-patent-drawing

21. Vision-Guided LIDAR System for Three-Dimensional Localization of Moving Platforms

TELEDYNE SCIENTIFIC & IMAGING LLC, 2023

Determining the location and/or navigation path of a moving platform. The method includes using a vision system on a moving platform to identify a region of interest, classifying objects within the region of interest, directing random-access LIDAR to ping one or more of the classified objects, and locating the platform in three dimensions using data from the vision system and LIDAR.

US11598878B2-patent-drawing

22. Navigation Solution Validation System with Dynamic Sensor Integrity Monitoring and Faulty Data Rejection

TRX SYSTEMS INC, 2023

System for validating navigation solution outputs by monitoring sensor integrity and automatically eliminating faulty data. The system continuously assesses the accuracy and reliability of multiple navigation sensors, including GNSS, accelerometers, gyroscopes, and others, and dynamically adjusts the navigation solution based on sensor performance. When sensor integrity is compromised, the system automatically rejects the affected data to ensure reliable position and timing outputs.

US2023065658A1-patent-drawing
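
One common way to realize this kind of integrity check is a normalized-innovation (chi-square) test against the current fused solution; the sketch below uses that approach as an assumption, since the patent does not disclose its specific test:

```python
import numpy as np

def gate_measurements(fused_state, fused_cov, measurements, chi2_threshold=9.21):
    """Reject sensor fixes whose normalized innovation against the fused
    solution exceeds a chi-square gate (9.21 = 99% for 2 degrees of freedom)."""
    accepted, rejected = [], []
    for name, z, R in measurements:                 # R: measurement covariance
        innovation = np.asarray(z) - np.asarray(fused_state)
        S = np.asarray(fused_cov) + np.asarray(R)
        nis = float(innovation @ np.linalg.solve(S, innovation))
        (accepted if nis < chi2_threshold else rejected).append(name)
    return accepted, rejected

fused = [10.0, 5.0]
P = np.diag([1.0, 1.0])
sensors = [("gnss",         [10.3, 5.2], np.diag([2.0, 2.0])),
           ("visual",       [9.8, 4.9],  np.diag([0.5, 0.5])),
           ("gnss_spoofed", [42.0, -7.0], np.diag([2.0, 2.0]))]
print(gate_measurements(fused, P, sensors))   # the spoofed fix is rejected
```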

23. Hierarchical Multi-Modal Sensing System with Adaptive Resolution for Environmental Mapping

LAWRENCE LIVERMORE NATIONAL SECURITY LLC, 2023

A multi-modal sensing approach for environmental mapping in autonomous systems that adapts data capture to object identification. The system uses a hierarchical structure to mimic human visual processing, initially capturing low-resolution data over a wide field of view and then selectively applying high-resolution sensing to areas of interest. Object recognition algorithms inform the distribution of sensing resources, enabling efficient data collection and object classification.

US11585933B2-patent-drawing

24. Drone Interface Device for Real-Time Data Processing and Frequency Conversion Between Onboard Computer and Flight Controller

UNIV DEGLI STUDI DI FIRENZE, 2023

A drone system that enables advanced autonomous flight capabilities through a novel interface device that connects an onboard computer to the flight controller and inertial measurement unit. The interface device performs real-time data processing and frequency conversion to enable efficient communication between the onboard computer and flight controller, while also supporting the integration of additional sensors. This enables the development of advanced autonomous flight modes, including AI-powered control systems that can learn from human pilots.

WO2023286097A1-patent-drawing

25. Sensor Configuration Method for Autonomous Vehicles Using Comparative Analysis of Target and Capability Specification Maps

WAYMO LLC, 2022

Optimizing sensor configuration for autonomous vehicles by comparing target and capability specification maps to identify regions where sensors are insufficient for a task. The method involves determining a target specification map indicating sensor parameters needed for a task, and a capability map showing what sensors can provide. Comparing the maps identifies regions where sensors fall short, allowing modifications to improve coverage. This allows evaluating sensor sufficiency before deployment.

US11529973B1-patent-drawing
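
A hedged sketch of the map comparison, assuming both maps are grids of a single sensing parameter (e.g. minimum resolvable object size per azimuth-range cell); the array layout and threshold logic are illustrative:

```python
import numpy as np

def insufficient_regions(target_map, capability_map):
    """Cells where the achievable value is coarser than the required value mark
    regions where the sensor suite falls short of the task."""
    target = np.asarray(target_map, dtype=float)
    capability = np.asarray(capability_map, dtype=float)
    shortfall = np.maximum(capability - target, 0.0)   # how far short each cell falls
    return shortfall > 0.0, shortfall

# 3 azimuth sectors x 4 range bins: required vs. achievable resolution (m)
target = [[0.2, 0.5, 1.0, 2.0]] * 3
capability = [[0.2, 0.4, 1.5, 2.0],
              [0.3, 0.5, 1.0, 2.0],
              [0.2, 0.5, 1.0, 5.0]]
mask, shortfall = insufficient_regions(target, capability)
print(np.argwhere(mask))      # cells where sensors are insufficient for the task
```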

26. Vehicle Navigation Method Utilizing Weighted Multi-Source Data Integration with Kalman Filter

GE AVIATION SYSTEMS LLC, 2022

A method of operating a vehicle that improves navigation accuracy by combining data from multiple sources with statistical weights based on their reliability. The method collects navigation parameters from sensors, GPS, and inertial systems, determines their uncertainties, and assigns weights to each parameter based on its reliability. A navigational solution is then formed by blending the weighted parameters using a Kalman filter, providing an optimized navigation solution with overall uncertainty estimates.
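
The core weighting step can be illustrated with inverse-variance blending, the building block that a Kalman filter generalizes over time; the sources and variances below are assumed examples:

```python
import numpy as np

def blend_parameters(estimates, variances):
    """Weight each source's navigation parameter by the inverse of its
    uncertainty; the blended estimate carries a combined uncertainty smaller
    than any single source's."""
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = float(np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w))
    fused_variance = float(1.0 / np.sum(w))
    return fused, fused_variance

# Altitude (m) from GPS, barometer, and inertial integration, with variances
print(blend_parameters([102.4, 101.8, 103.9], [4.0, 1.0, 9.0]))
```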

27. Camera Pose and Scale Estimation Method with Prior-Informed Cost Function Adjustment

MICROSOFT TECHNOLOGY LICENSING LLC, 2022

Method for estimating camera pose and scale that achieves both high speed and high accuracy by incorporating prior knowledge of rotation and scale into the estimation process. The method uses prior parameters derived from inertial sensor measurements to bias the cost function, accelerating the estimation process and improving accuracy compared to conventional approaches. The prior parameters selectively influence the cost function through rotation and scale weights, which are adjusted based on sensor noise. The method determines the camera pose and scale by optimizing the similarity transformation that minimizes the cost function below a threshold.

US11443455B2-patent-drawing

28. Map Generation Apparatus with Feature Point Extraction and Density Adjustment Based on Landmark Significance

HONDA MOTOR CO LTD, 2022

Map generation apparatus for vehicle positioning that extracts feature points from sensor data, generates a map incorporating these points, recognizes landmarks, determines their importance, and adjusts the map's feature point density based on landmark significance.

29. Indoor Autonomous Aerial System with Machine Learning-Driven Navigation and Visual Data Processing for Micro Aerial Vehicles

FLYVIZ INDOOR LTD, 2022

Indoor autonomous aerial system that enables efficient and precise navigation of micro aerial vehicles (MAVs) using machine learning algorithms. The system combines MAVs with image capture capabilities to autonomously navigate to desired locations within a deployable space, leveraging visual input to guide the MAV. By analyzing visual patterns and features, the system extracts location data and presents it to the MAV controller, enabling precise navigation and sign-presentation applications. The system also monitors battery level through onboard sensors and operates without traditional GPS.

30. System for Automatic Calibration of Vehicular Sensors Using Multi-Frame Object Detection and Sensor Data Integration

NETRADYNE INC, 2022

Automatic calibration of vehicular sensor systems using visual data from cameras and other sensors. The system detects stationary objects, such as traffic signs, and tracks their positions across multiple frames to estimate camera pose and sensor offsets. It filters object detection tracks based on quality metrics and jointly computes camera calibration parameters and object locations using the filtered tracks. The system can also incorporate data from other sensors, such as GPS, IMU, and wheel odometry, to improve accuracy and robustness.

US2022270358A1-patent-drawing

31. Robotic Control System Integrating Inertial Measurement Unit and String-Encoder Sensors for 3D Position and Orientation Calculation

JEANOLOGIA TEKNOLOJI AS, 2022

A 3D position and orientation calculation and robotic application structure that enables precise and repeatable robotic operations by combining inertial measurement unit (IMU) and string-encoder position sensors. The system records the movements of a portable recording apparatus using the IMU and string-encoder sensors, and then applies these movements to a robot for unmanned operation. The system provides high accuracy and precision, eliminates human error, and enables the creation of precise robotic programs for repetitive tasks.

US2022193919A1-patent-drawing

32. Unmanned Aerial Vehicle Flight Control System with Multi-Sensor Data Fusion for Landing Coordination

HONEYWELL INTERNATIONAL INC, 2022

Computing flight controls for safe landing of unmanned aerial vehicles (UAVs) using sensor data fusion. The system involves a UAV with a multi-sensor suite including radar, cameras, AHRS, and GPS. Before descent, the UAV receives landing confirmation from a service. During descent, the sensors cross-check to confirm clearance, alignment, altitude, descent rate, obstacles, and beacon visibility. If conflicts arise, the UAV stops. If all checks pass, the UAV lands. This comprehensive sensor fusion enables safe and reliable landing by coordinating multiple sensors and data sources.
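
A minimal sketch of the cross-check gate; the specific check names, limits, and stop-on-conflict rule below are assumptions illustrating the described behaviour, not values from the patent:

```python
def landing_checks(radar_clear, camera_aligned, altitude_agl_m, descent_rate_ms,
                   obstacle_detected, beacon_visible, max_descent_rate_ms=1.5):
    """Every sensor-derived condition must pass before descent continues; any
    failed check halts the UAV."""
    checks = {
        "clearance":    radar_clear and not obstacle_detected,
        "alignment":    camera_aligned,
        "descent_rate": 0.0 < descent_rate_ms <= max_descent_rate_ms,
        "altitude":     altitude_agl_m > 0.0,
        "beacon":       beacon_visible,
    }
    return all(checks.values()), [name for name, ok in checks.items() if not ok]

print(landing_checks(radar_clear=True, camera_aligned=True, altitude_agl_m=12.0,
                     descent_rate_ms=0.8, obstacle_detected=False, beacon_visible=False))
```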

33. Device for Adaptive Self-Position Estimation Method Selection in Autonomous Mobile Objects

SONY GROUP CORP, 2022

A device for more appropriate action control in autonomous mobile objects like drones. It selects the best self-position estimation method from multiple options based on indexes like GPS accuracy. This allows optimized positioning for different situations. The selected positioning is used to obtain the mobile object's state, and actions are then controlled based on that state using predefined plans.

US2022157179A1-patent-drawing

34. Distributed Localization System with Visual Feature Matching and Inertial Data Integration for GPS-Denied Environments

SRI INTERNATIONAL, 2022

A collaborative localization system for multiple platforms that enables accurate positioning in GPS-denied environments through real-time, distributed information sharing. The system uses visual feature matching and inertial measurement data to determine platform poses, with each device contributing to a shared map of geo-referenced visual features. When a device encounters a new feature, it uses IMU data to estimate its position, while matching features to the shared map enables precise localization. The system's distributed architecture enables organic collaboration without requiring special behaviors, making it suitable for applications where multiple platforms operate independently.

35. Mobile Ground Vehicle with Pose Sensors for UAV Localization and Tracking

PICKER DRONES INC, 2022

System for accurate localization and tracking of unmanned aerial vehicles (UAVs) using a mobile ground vehicle. The system has a wheeled chassis with pose sensors that track UAV positions. A processor calculates UAV poses relative to the chassis using the sensor data. This allows UAV navigation even in nested environments where fixed sensors can't see them. The chassis can also provide extended power and computation resources for the UAVs.

36. Autonomous Aerial Drone with AI-Driven Navigation and Sensor-Integrated Obstacle Avoidance System

KARBASI ARDAVAN, 2022

Autonomous aerial drone that enables self-sustaining flight through advanced AI-driven navigation and obstacle avoidance. The drone integrates multiple sensors, including cameras, to detect and respond to environmental changes, while its onboard AI system continuously optimizes flight paths and collision-avoidance strategies. The system enables autonomous flight, route planning, and real-time obstacle avoidance without human operator intervention.

37. Method for Vehicle Positioning Using Stable Landmark Identification and Multi-Source Data Integration

TOPCON POSITIONING SYSTEMS INC, 2022

A method for determining a vehicle's position in urban environments using a combination of a last known position, stable landmarks, and a reference map. The method identifies stable landmarks by analyzing multiple images of the area over time, and uses these landmarks in conjunction with GNSS and IMU data to accurately determine the vehicle's current position when GNSS accuracy falls below a threshold.

US2022074757A1-patent-drawing

38. Radar-Based Aircraft Positioning System with Multi-Antenna and Multi-Receiver Configuration for Three-Dimensional Determination

JOBY AERO INC, 2022

A radar-based positioning system for aircraft that provides accurate, high-integrity, and continuous positioning in all weather conditions. The system uses multiple antennas with multiple receivers to measure range and velocity from reflected radar signals, enabling determination of position, velocity, and orientation in three dimensions. The system's radar componentry operates independently of GPS, providing a redundant and reliable positioning solution.

US2022066015A1-patent-drawing

39. Vision-Based Aircraft Navigation System Utilizing Multi-Sensor Image Analysis and Georeferenced Databases

FREDDY RABBAT NETO, 2022

A vision-based navigation system for aircraft that leverages onboard sensors and databases to output accurate latitude-longitude coordinates without GPS. The system uses computer vision techniques to analyze images from visible light, night vision, and infrared cameras, as well as synthetic aperture radar, to determine aircraft position and orientation. It references georeferenced databases of high-resolution satellite images, DTED, and NIR data to achieve GPS-like precision.

US2022057213A1-patent-drawing

40. Radar-Based Odometry System Utilizing Deep Neural Network for Static Object Detection

INVENSENSE INC, 2022

A method and system for providing odometry information for autonomous vehicles using radar sensors. The system employs a deep neural network to process radar measurements and detect static objects, which are then used to estimate the vehicle's pose and motion. The radar sensor can be a frequency-modulated continuous wave (FMCW) radar or a pulse-based radar, and the system can operate in various frequency bands, including 24 GHz and 77 GHz. The neural network-based approach enables accurate odometry estimation even in the absence of GPS signals, making it suitable for applications where GNSS is unreliable or unavailable.

WO2022036332A1-patent-drawing

41. Sensor System for Synchronized Image Capture and Lidar Scans with Correlated Double Sampling

WAYMO LLC, 2022

Synchronizing image capture and lidar scans in sensor systems like autonomous vehicles to provide temporally coordinated sensor information. The technique involves capturing high resolution images, low resolution images, and lidar scans simultaneously during a single scan interval. It uses a technique called correlated double sampling to remove defective pixel noise from the images. By synchronizing the sensor functions, it allows precise spatial correlation between the lidar point cloud and the images for better perception and sensor fusion.

42. System for Determining Physical State of Movable Object Using Multi-Sensor Data Deviation Analysis

SZ DJI TECHNOLOGY CO LTD, 2022

Determining the physical state of a movable object to improve the operational safety margin of unmanned vehicles. The method obtains a distinct set of sensor data from each of a plurality of sensing systems coupled to the movable object, determines, for each sensing system, the deviation between its data and the data of the other sensing systems, selects one or more sets of sensor data based at least in part on those deviations, and determines the physical state of the movable object from the selected data.
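
A sketch of the deviation analysis under an assumed selection rule (median pairwise deviation under a threshold); the patent does not specify this particular rule, and the sensor names and numbers are illustrative:

```python
import numpy as np

def select_consistent_sensors(estimates, threshold):
    """Compute pairwise deviations between each sensing system's estimate and
    keep only the sensors whose median deviation from the others stays under
    the threshold."""
    names = list(estimates)
    vals = {n: np.asarray(v, dtype=float) for n, v in estimates.items()}
    selected = []
    for n in names:
        deviations = [np.linalg.norm(vals[n] - vals[m]) for m in names if m != n]
        if np.median(deviations) < threshold:
            selected.append(n)
    return selected

estimates = {"gnss":             [10.2, 4.9, 30.1],
             "visual_odometry":  [10.0, 5.0, 30.0],
             "barometer_ins":    [10.1, 5.1, 29.8],
             "faulty_ultrasonic": [10.0, 5.0, 4.0]}   # inconsistent altitude reading
print(select_consistent_sensors(estimates, threshold=2.0))
```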

43. Self-Localization Device with Acquisition, Estimation, Key Frame Registration, Map Reading, and Re-Localization Units

MITSUBISHI HEAVY INDUSTRIES LTD, 2022

Self-localization device that can be used for building a map and for steering control of a mobile object based on the detected position and posture. The device includes an acquisition unit, an estimation unit, a key frame registering unit, a map reading unit, and a re-localization processing unit.

US11216973B2-patent-drawing

44. Autonomous UAV Landing System with Monocular Camera-Based Marker Tracking and Pose Estimation

INFINIUM ROBOTICS PTE LTD, 2021

Autonomous landing system for unmanned aerial vehicles (UAVs) that enables precise landing on moving ground vehicles (GVs) in GPS-denied environments. The system uses multiple monocular cameras to detect and track markers on the GVs, and employs a calibration phase to establish camera-marker relationships. During flight, the cameras generate pose estimates that are fused to provide accurate position and orientation data for the UAV's flight controller. The system enables autonomous takeoff, tracking, and landing of UAVs on moving GVs, with applications in surveillance, inspection, and logistics.

US2021405654A1-patent-drawing

45. Integrated Stereo Vision and Multi-Sensor System for 3D Positioning and Obstacle Detection

DONGGUAN PANRAYS INCORPORATED LTD, 2021

An automatic driving object detection and positioning system that integrates stereo vision, inertial navigation, and satellite navigation to achieve high-accuracy positioning and obstacle detection. The system uses a direct binocular method to calculate 3D coordinates without feature extraction, and tightly couples the measurement data from the three sensors to correct inertial navigation errors and improve overall positioning accuracy.

WO2021248636A1-patent-drawing

46. Autonomous Vehicle Localization via Onboard Signal-Based Landmark Detection and Remote Association Storage

NAUTO INC, 2021

Precision localization for autonomous vehicles using signal analysis instead of GPS. The method involves detecting, tracking, and classifying distinctive landmarks from signals like images or LiDAR scans. By associating landmarks with locations, vehicles can determine their position without relying on GPS. This enables real-time, high-precision mapping and localization without GPS, which is useful for autonomous driving and other applications where GPS is unavailable. The landmark detection, tracking, classification, and parameter extraction are done onboard the vehicle. The landmark associations are stored remotely and retrieved to provide global position.

47. Unmanned Aircraft Navigation System Utilizing Optical Guidance Lines with Integrated Collision Avoidance Mechanism

UNIVERSITE DE BORDEAUX, 2021

Guidance system for unmanned aircraft that enables precise navigation along predetermined air routes, particularly in environments where GPS signals are unreliable or unavailable. The system employs a network of optical guidance lines and beams that are detected by onboard sensors, allowing the aircraft to maintain precise alignment and trajectory control. The system also includes a collision avoidance mechanism for managing multiple aircraft operating in the same airspace.

48. Calibration System Utilizing Polyhedral Target for Multi-Sensor Alignment

GM CRUISE HOLDINGS LLC, 2021

A calibration system for vehicle sensors that uses a polyhedral target to simultaneously calibrate multiple sensor types, including cameras and LIDAR systems, by capturing data from multiple angles and mapping sensor representations to a common coordinate system. The system enables accurate sensor calibration despite manufacturing variations and environmental degradation, ensuring reliable sensor data for autonomous vehicles and sensor-equipped vehicles.

49. Method for Position Estimation of Moveable Vehicle Elements Using Sensor Data Fusion and Iterative Refinement

APTIV TECHNOLOGIES LTD, 2021

A method for determining the position of a moveable element in a vehicle, comprising acquiring position-related data from multiple sensors, selecting the most accurate data based on predetermined criteria, and fusing the selected data to determine the element's position. The method iteratively refines the position estimate as more accurate data becomes available, enabling precise tracking of elements like cameras and steering columns despite movement and thermal expansion.

50. Method and Apparatus for UAV Path Generation Using Particle Swarm Optimization with Dynamic Obstacle Avoidance

ELECTRONICS & TELECOMMUNICATIONS RES INST, 2021

A method and apparatus for generating an optimal path for an unmanned aerial vehicle (UAV) that avoids obstacles detected by a sensor during autonomous flight. The method uses a particle swarm optimization algorithm to determine a smooth trajectory connecting waypoints, with the algorithm iteratively updating particle positions and velocity vectors based on fitness function values. The optimal path is generated by establishing an integrated path plan that takes into account both waypoints and the connecting trajectory, with the algorithm converging to a solution that balances exploration and exploitation.

US2021287556A1-patent-drawing
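
A compact sketch of particle swarm optimization applied to via-point placement between fixed waypoints, with an obstacle penalty in the fitness function; the scenario, PSO coefficients, and penalty weighting are conventional defaults and assumptions rather than the patented formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy scenario: fly from start to goal in 2D past one circular obstacle.
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 0.0])
OBSTACLE_C, OBSTACLE_R = np.array([5.0, 0.0]), 2.0
N_VIA = 3            # intermediate via-points optimized by the swarm

def fitness(flat_via):
    """Path length plus a penalty for path samples inside the inflated obstacle."""
    pts = np.vstack([START, flat_via.reshape(N_VIA, 2), GOAL])
    segs = np.diff(pts, axis=0)
    length = np.sum(np.linalg.norm(segs, axis=1))
    t = np.linspace(0.0, 1.0, 10)[:, None]
    samples = np.concatenate([pts[i] + t * segs[i] for i in range(len(segs))])
    d = np.linalg.norm(samples - OBSTACLE_C, axis=1)
    penalty = np.sum(np.maximum(OBSTACLE_R + 0.5 - d, 0.0)) * 50.0
    return length + penalty

# Plain global-best PSO with conventional coefficients.
n_particles, dim, iters = 30, N_VIA * 2, 200
w, c1, c2 = 0.7, 1.5, 1.5
x = rng.uniform(-2, 12, size=(n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best via-points:", gbest.reshape(N_VIA, 2).round(2))
print("fitness:", round(fitness(gbest), 2))
```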

51. Single Checkerboard-Based Calibration Method for Sensor Coordinate Transformation in Heterogeneous Autonomous Driving Systems

52. Position Estimation Device with Multi-Sensor State Evaluation and Confidence-Based Sensor Selection

53. Autonomous Vehicle Navigation System Utilizing Inertial Measurement Unit as Primary Sensor with Correction from Auxiliary Sensors

54. Single-Frame Object Detection and Velocity Estimation via Integrated Camera, Lidar, and Radar Feature Maps

55. SLAM System with Dynamic Sensor Parameter Adjustment Based on Environmental and Positional Data
