This page presents patents and research papers on multi-sensor integration architectures and data fusion algorithms for accurate UAV positioning, navigation, and obstacle avoidance in GPS-denied areas, covering:

  • Inertial-Visual-Lidar Fusion – Extended Kalman filtering of inertial data, visual odometry, and tag recognition; tightly coupled nonlinear state estimation; binocular camera and inertial sensing with point/line feature extraction and ranging radar; vision-lidar coupling with Bayesian fusion for SLAM.
  • Multi-Sensor Positioning Algorithms – Sliding window optimization combining LiDAR, IMU, and wheel speedometer data; radar correlation with 3D maps; weighted data integration with dynamic coefficients based on reliability assessment; factor graph optimization with inertial reference for state vectors.
  • Landing and Flight Control Fusion – Multi-sensor suites with radar, cameras, AHRS, and GPS for descent cross-checking; LED illumination with IMU, camera, laser radar, and ultrasonic distance meter for tunnel defect detection with neural-network pose estimation.
  • Autonomous Navigation and Path Planning – Scene perception with semantic maps for obstacle avoidance and path generation; 3D mapping with ray-casting; high-resolution image capture with feature-matching algorithms.

1. Autonomous Vehicle Navigation System with Integrated Motion and Perception Sensor Data Using Tightly-Coupled Nonlinear State Estimation

INVENSENSE INC, 2025

A navigation system for autonomous vehicles that integrates motion sensor data with perception sensor data to provide accurate positioning in real-time. The system builds an online map during navigation using perception sensor data and revises the navigation solution using a tightly-coupled nonlinear state estimation technique that incorporates both motion and perception sensor data. This approach enables reliable navigation in environments with weak or obstructed GNSS signals, such as urban areas with tall buildings or tunnels.

Patent drawing: WO2025207878A1

2. Binocular Camera and Inertial Sensor-Based Navigation System with Feature Extraction and Ranging Radar for UAVs

SHENZHEN UNIVERSITY, 2025

A high-precision navigation and positioning system for UAVs in GPS-denied environments, utilizing a binocular camera and an inertial sensor to achieve accurate positioning and attitude estimation. The system extracts and tracks both point and line features from the camera images and fuses this data with inertial measurements for robust and reliable positioning. It also incorporates a ranging radar for altitude measurement and loop-closure detection for graph optimization, providing efficient and accurate positioning signals for autonomous UAV flight.

3. UAV Formation Positioning System with Multi-Sensor Data Fusion and Autonomous Flight Capabilities

SHANGHAI FUYA INTELLIGENT TECH CO LTD, 2024

A UAV formation positioning system and method that enables high-precision positioning and autonomous flight in complex environments. The system integrates multiple sensors, including GPS, IMU, magnetometer, barometer, laser radar, millimeter-wave radar, and a high-resolution camera, to provide accurate position, attitude, and environmental data. A data fusion and positioning algorithm processes the sensor data to achieve global consistency and optimize positioning results. The system also employs machine learning, autonomous obstacle avoidance, and swarm intelligence to enable self-organization and collaboration among UAVs in formation flight.

4. Multi-Sensor Fusion System with Integrated Scene Perception and Path Generation for Autonomous Tilting Wing Drones

Heisha Technology Co., Ltd., 2024

Multi-sensor fusion obstacle avoidance system for autonomous tilting-wing drones that uses cameras, lidar, radar, and an IMU to detect and avoid obstacles accurately and robustly. The system fuses the sensor data to provide complete scene perception and semantic maps for intelligent obstacle avoidance, and combines satellite navigation with local obstacle sensing to avoid single-sensor failures and inaccurate positioning. The fused data is used to generate optimal avoidance paths and to control the drone's flight speed and heading so it can navigate complex environments safely.

5. Aircraft Navigation and Targeting System Utilizing Inertial Reference and Factor Graph Optimization

ROCKWELL COLLINS INC, 2023

A method and system for aircraft-based precision navigation and targeting in GPS-challenged environments, enabling accurate targeting and navigation through airspace where GPS and communication systems are compromised or denied. The system uses inertial reference systems to determine aircraft state vectors and target information, which are combined with subsequent GPS-derived positions through factor graph optimization to generate a precise targeting solution. The system can also incorporate ranging signals and data from companion aircraft to enhance accuracy and synchronization.
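
A factor-graph formulation of this kind can be illustrated with a small least-squares sketch: inertial state propagation, intermittent GPS fixes, and ranging to a companion aircraft each contribute residual factors over a window of aircraft states, and a nonlinear solver refines all states jointly. The 2D state layout, measurement models, and noise values below are illustrative assumptions, not the patent's implementation.

    import numpy as np
    from scipy.optimize import least_squares

    # Illustrative data: 5 aircraft states (x, y), inertial deltas between them,
    # GPS fixes on the first and last states, and ranges to a companion at a known position.
    N = 5
    inertial_deltas = np.array([[10.0, 0.5]] * (N - 1))        # per-epoch displacement from the INS
    gps_fixes = {0: np.array([0.0, 0.0]), 4: np.array([40.5, 2.0])}
    companion = np.array([20.0, 30.0])
    ranges = {1: 31.2, 3: 30.2}                                # measured slant ranges (m)

    SIG_INS, SIG_GPS, SIG_RNG = 0.5, 2.0, 1.0                  # assumed noise standard deviations

    def residuals(flat_states):
        states = flat_states.reshape(N, 2)
        res = []
        # Inertial (odometry-like) factors between consecutive states
        for k in range(N - 1):
            res.append((states[k + 1] - states[k] - inertial_deltas[k]) / SIG_INS)
        # GPS position factors
        for k, fix in gps_fixes.items():
            res.append((states[k] - fix) / SIG_GPS)
        # Ranging factors to the companion aircraft
        for k, r in ranges.items():
            res.append([(np.linalg.norm(states[k] - companion) - r) / SIG_RNG])
        return np.concatenate(res)

    x0 = np.cumsum(np.vstack([[0.0, 0.0], inertial_deltas]), axis=0).ravel()  # dead-reckoned guess
    solution = least_squares(residuals, x0)
    print(solution.x.reshape(N, 2))                            # jointly optimized state vectors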

6. Indoor Drone Navigation System Utilizing Sensor Fusion and Extended Kalman Filter

SAMSUNG ELECTRONICS CO LTD, 2023

System for indoor autonomous drone navigation that does not rely on GPS. The drone fuses data from inertial sensors, camera-based visual odometry, and tag recognition through an extended Kalman filter to estimate its position and navigate indoors.

Patent drawing: US11619952B2
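
A minimal sketch of the extended Kalman filter fusion described above, assuming a simple 2D constant-velocity state and treating visual-odometry and tag-recognition outputs as direct position measurements; the models and noise values are illustrative, not the patent's.

    import numpy as np

    dt = 0.02
    F = np.array([[1, 0, dt, 0],       # state: [x, y, vx, vy]
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]])
    H = np.array([[1, 0, 0, 0],        # visual odometry / tag fixes observed as position
                  [0, 1, 0, 0]])
    Q = 0.01 * np.eye(4)               # assumed process noise (driven by IMU errors)
    R_vo, R_tag = 0.05 * np.eye(2), 0.01 * np.eye(2)   # tag fixes assumed more precise

    x, P = np.zeros(4), np.eye(4)

    def predict(x, P, accel):
        """IMU-driven propagation of the state and covariance."""
        x = F @ x
        x[2:] += accel * dt
        P = F @ P @ F.T + Q
        return x, P

    def update(x, P, z, R):
        """Fuse a position-like measurement (visual odometry or tag recognition)."""
        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        return x, P

    x, P = predict(x, P, np.array([0.1, 0.0]))
    x, P = update(x, P, np.array([0.002, 0.0]), R_vo)
    x, P = update(x, P, np.array([0.001, 0.0]), R_tag)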

7. Vision-Lidar Coupled UAV System for Poor-Texture Tunnel Modeling with Bayesian Data Fusion

TONGJI UNIVERSITY, 2023

Method and system for modeling poor-texture tunnels using a vision-lidar coupling on an unmanned aerial vehicle (UAV). The system integrates a depth camera and lidar for simultaneous localization and mapping (SLAM), leveraging the wide-range information of the lidar and local details of the depth camera to improve accuracy. The system fuses point cloud data, raster maps, and pose information using Bayesian fusion, and iteratively refines the map model through feature matching between successive frames. The system also employs positioning UAVs and auxiliary lighting to enhance data quality and accuracy.
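
The Bayesian fusion of lidar and depth-camera evidence into a raster map can be sketched with a standard log-odds occupancy update; the grid size and inverse-sensor probabilities below are assumptions for illustration, not the patent's parameters.

    import numpy as np

    def prob_to_logodds(p):
        return np.log(p / (1.0 - p))

    def logodds_to_prob(l):
        return 1.0 / (1.0 + np.exp(-l))

    grid = np.zeros((200, 200))                 # log-odds raster map, 0 = unknown (p = 0.5)

    # Assumed inverse-sensor models: lidar evidence is trusted more than the depth camera's.
    L_OCC_LIDAR = prob_to_logodds(0.85)
    L_OCC_DEPTH = prob_to_logodds(0.65)
    L_FREE      = prob_to_logodds(0.35)

    def fuse_cell(grid, i, j, occupied, sensor):
        """Bayesian (log-odds) update of one raster cell from one sensor observation."""
        if occupied:
            grid[i, j] += L_OCC_LIDAR if sensor == "lidar" else L_OCC_DEPTH
        else:
            grid[i, j] += L_FREE
        return grid

    # Both sensors observe the same cell: evidence accumulates additively.
    grid = fuse_cell(grid, 50, 60, occupied=True, sensor="lidar")
    grid = fuse_cell(grid, 50, 60, occupied=True, sensor="depth_camera")
    print(logodds_to_prob(grid[50, 60]))        # fused occupancy probability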

8. Unmanned Aerial Vehicle-Based Tunnel Defect Detection System with Integrated Pose Estimation and Real-Time Neural Network Analysis

TONGJI UNIVERSITY, 2022

A tunnel defect detection system using an unmanned aerial vehicle (UAV) that enables accurate defect detection in GPS-denied environments. The system integrates an LED module for illumination, IMU, camera, laser radar, and ultrasonic distance meter to estimate the UAV's pose. A trained neural network model detects defects in real-time from images collected by the camera and LED module, and the UAV hovers for further inspection when a defect is detected. The system achieves accurate defect detection and pose estimation in tunnels with no GPS signals and highly symmetrical structures.

9. Unmanned Aerial Vehicle Flight Control System with Multi-Sensor Data Fusion for Landing Coordination

HONEYWELL INTERNATIONAL INC, 2022

Computing flight controls for safe landing of unmanned aerial vehicles (UAVs) using sensor data fusion. The system involves a UAV with a multi-sensor suite including radar, cameras, AHRS, and GPS. Before descent, the UAV receives landing confirmation from a service. During descent, the sensors cross-check one another to confirm clearance, alignment, altitude, descent rate, obstacles, and beacon visibility. If any check conflicts, the UAV halts the descent; if all checks pass, it lands. This sensor fusion coordinates multiple sensors and data sources to enable safe and reliable landing.
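
The descent cross-checking amounts to a gating decision over independent sensor checks; the schematic sketch below uses illustrative check names and an assumed abort policy rather than Honeywell's specification.

    from dataclasses import dataclass

    @dataclass
    class DescentChecks:
        # Each flag reflects one sensor-derived condition during descent (illustrative names).
        landing_confirmed: bool     # confirmation received from the landing service
        zone_clear: bool            # radar/camera agree the pad is free of obstacles
        aligned: bool               # camera alignment with the pad within tolerance
        altitude_ok: bool           # radar altitude consistent with GPS/AHRS altitude
        descent_rate_ok: bool       # AHRS/GPS descent rate within limits
        beacon_visible: bool        # landing beacon detected by the camera

    def landing_decision(c: DescentChecks) -> str:
        if not c.landing_confirmed:
            return "HOLD"                      # do not begin descent
        checks = (c.zone_clear, c.aligned, c.altitude_ok,
                  c.descent_rate_ok, c.beacon_visible)
        # Any conflicting cross-check halts the descent; all-pass allows landing.
        return "LAND" if all(checks) else "ABORT_DESCENT"

    print(landing_decision(DescentChecks(True, True, True, True, True, True)))   # LAND
    print(landing_decision(DescentChecks(True, True, False, True, True, True)))  # ABORT_DESCENT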

10. Multi-Sensor Fusion Positioning Method with LiDAR, IMU, and Wheel Speedometer Data Using Sliding Window Optimization

BEIJING TUSEN WEILAI TECH CO LTD, 2022

A multi-sensor fusion positioning method for autonomous vehicles and robots that combines LiDAR, IMU, and wheel speedometer data to achieve accurate positioning in GPS-denied environments. The method synchronizes the sensor data, performs preprocessing and data association, and jointly optimizes the vehicle's pose states within a sliding window framework. The solution enables robust and accurate positioning when GPS signals are lost, LiDAR observations are degraded, or the vehicle's pose jumps abruptly.

11. Autonomous Vehicle Positioning System Utilizing Radar and 3D Map Correlation

HONEYWELL INTERNATIONAL INC, 2021

Reliable and precise positioning for autonomous vehicles in urban environments using radar and 3D maps when GPS signals are weak or lost. The system emits radar beams to scan the surroundings, receives returns, correlates them with the 3D map database, and determines vehicle position and velocity. This provides backup positioning when GPS is unavailable or degraded in urban canyons.

12. Multi-Sensor Fusion System for UAS Location Determination Using Weighted Data Integration

RAYTHEON CO, 2020

Determining the location of an unmanned aircraft system (UAS) through multi-sensor fusion, leveraging data from GPS, RF, radar, lidar, and other sensors. The method assigns weights to each sensor's data based on a reliability assessment, combines the weighted data into a three-dimensional location map, and infers the UAS's precise position from that map. This approach enables UAS operators to maintain situational awareness even when primary sensor signals are compromised or unavailable.
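
The weighted integration described above reduces, in its simplest form, to normalizing per-sensor reliability scores and averaging the position estimates; the reliability values and linear weighting in this sketch are illustrative assumptions.

    import numpy as np

    # Illustrative per-sensor position estimates (x, y, z in metres) and reliability scores.
    estimates = {
        "gps":   (np.array([120.0, 45.0, 30.0]), 0.2),   # degraded GPS -> low reliability
        "radar": (np.array([118.5, 44.2, 31.0]), 0.9),
        "lidar": (np.array([118.9, 44.5, 30.6]), 0.8),
        "rf":    (np.array([121.0, 46.0, 29.0]), 0.4),
    }

    def fuse_weighted(estimates):
        """Combine sensor positions using weights proportional to assessed reliability."""
        weights = np.array([rel for _, rel in estimates.values()])
        weights = weights / weights.sum()                       # normalize to sum to 1
        positions = np.stack([pos for pos, _ in estimates.values()])
        return weights @ positions                              # weighted average position

    print(fuse_weighted(estimates))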

13. Mobile Robot with Integrated LiDAR and Camera Sensors for Hybrid SLAM

LG ELECTRONICS INC, 2020

A mobile robot that combines LiDAR and camera sensors to achieve robust simultaneous localization and mapping (SLAM) in various environments. The robot uses LiDAR-based geometry information to estimate its location in low-illuminance environments, while camera-based feature matching is used in well-lit areas. The system integrates odometry information from both sensors to minimize scale drift and enables loop closing in areas with sufficient illumination. This hybrid approach enables the robot to maintain accurate location estimation and map creation in diverse environments.

14. Laser-Based Positioning System for UAV Navigation Using Transmitter-Receiver Pair with Beam Reflection Angle Detection

TRANSPORTATION IP HOLDINGS LLC, 2020

A system for precise navigation and control of unmanned aerial vehicles (UAVs) in GPS-denied environments, such as indoors, using a laser-based positioning system. The system employs a laser transmitter and receiver mounted on separate UAVs, with the transmitter emitting a beam that is reflected back to the receiver. The receiver's position is determined by the angle of the reflected beam, allowing the UAV to maintain precise control and navigation. The system enables reliable and accurate positioning in environments where GPS signals are unavailable, making it suitable for applications such as indoor inspection and surveillance.

Patent drawing: US2020094411A1

15. UAV Sensor Fusion System Integrating Inertial and Image Data for Enhanced State Estimation and Calibration

THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY, SZ DJI TECHNOLOGY CO LTD, 2020

Enhancing UAV operation accuracy and flexibility using sensor fusion. The method fuses inertial and image sensor data from multiple sensors onboard the UAV to improve initialization, error recovery, calibration, and state estimation, enabling more accurate autonomous navigation, obstacle avoidance, and mapping than a single sensor type could provide.

Patent drawing: US10565732B2

16. Drone Visual Positioning System with High-Resolution Image Capture and Feature Matching Algorithm

SHANGHAI AUTOFLIGHT CO LTD, 2020

Visual positioning system for drones that enables autonomous navigation through areas with GPS signal degradation. A camera-based vision system captures high-resolution images at high frame rates, and a matching algorithm compares features across consecutive images, allowing the drone to determine its position, orientation, and velocity while flying through complex environments.
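
Feature matching between consecutive high-resolution frames is commonly implemented with a detector/descriptor pipeline plus geometric pose recovery; the OpenCV-based sketch below (ORB features, brute-force Hamming matching, essential-matrix pose) is one plausible realization, not the patent's specific algorithm.

    import cv2
    import numpy as np

    def relative_pose(prev_gray, curr_gray, K):
        """Estimate the camera's relative rotation/translation between consecutive frames."""
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # Essential matrix + cheirality check give rotation R and unit-scale translation t.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t

    # K would be the drone camera's intrinsic matrix (illustrative values here);
    # relative_pose(prev_gray, curr_gray, K) is called on consecutive grayscale frames.
    K = np.array([[800.0, 0, 640.0], [0, 800.0, 360.0], [0, 0, 1.0]])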

17. Multimodal UAV Detection via Synchronized Sensor Data Fusion with Dynamic Weighting Coefficients

CHENGDU SKYDEFENCE TECHNOLOGY CO LTD, 2025

A method for detecting unmanned aerial vehicles (UAVs) using multimodal detection information fusion. The method synchronizes data acquisition from multiple sensors, calculates weighting coefficients based on time differences and confidence levels, and performs weighted fusion of the sensor data to produce accurate UAV target information.

18. Autonomous Drone with Integrated Sensor Fusion for 3D Mapping and Path Planning

KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY, 2025

Autonomous drone system for exploration and reconnaissance in unknown environments. The drone acquires data from cameras, lidar, and an IMU to estimate its pose, recognize targets, and generate a 3D map, then plans safe paths using ray-casting and sensor fusion and flies them autonomously. This allows it to explore unknown areas, locate targets accurately, avoid obstacles, and return home.

Patent drawing: US2025036138A1
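
Ray-casting for 3D mapping, as used in the system above, typically steps each range measurement through a voxel grid, marking traversed voxels free and the endpoint occupied; the uniform grid and half-voxel sampling below are simplifying assumptions.

    import numpy as np

    VOXEL = 0.2                                   # assumed voxel edge length (m)
    grid = {}                                     # sparse map: voxel index -> "free"/"occupied"

    def to_voxel(p):
        return tuple(np.floor(p / VOXEL).astype(int))

    def raycast_update(origin, endpoint, grid):
        """March from the sensor origin to a measured hit, labelling voxels along the ray."""
        direction = endpoint - origin
        length = np.linalg.norm(direction)
        steps = int(length / (VOXEL * 0.5)) + 1          # sample at half-voxel resolution
        for s in range(steps):
            p = origin + direction * (s / steps)
            v = to_voxel(p)
            if grid.get(v) != "occupied":                # never overwrite occupied evidence here
                grid[v] = "free"
        grid[to_voxel(endpoint)] = "occupied"
        return grid

    origin = np.array([0.0, 0.0, 1.0])                   # drone position
    hit = np.array([3.2, 1.5, 1.0])                      # lidar return
    grid = raycast_update(origin, hit, grid)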

19. UAV Altitude and Posture Control System Utilizing Secondary Barometric Compensation for Wind-Induced Pressure Variations

INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, 2024

Controlling a UAV's flight altitude and posture in environments where satellite signals are poor or blocked, like flying under bridges, by using a secondary barometer to compensate for wind-induced air pressure changes. The UAV's onboard barometer provides the initial air pressure reading. This is synchronized with an external reference barometer's reading and recalculated to account for any wind effects. The compensated air pressure is then fused with other sensor data to determine the target altitude and posture. This allows accurate flight control even when satellite signals are unavailable.
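
The secondary-barometer compensation amounts to differencing the reference station's pressure change against the onboard reading before converting to altitude; the standard-atmosphere conversion and simple offset correction below are illustrative assumptions about how such a scheme could be computed.

    def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
        """Standard-atmosphere conversion from static pressure to altitude (metres)."""
        return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

    def compensated_altitude(p_onboard, p_reference_now, p_reference_at_takeoff):
        """
        Remove wind-induced pressure fluctuation using a synchronized reference barometer:
        any change seen by the stationary reference sensor is treated as a disturbance and
        subtracted from the onboard reading before the altitude conversion.
        """
        wind_disturbance = p_reference_now - p_reference_at_takeoff
        corrected = p_onboard - wind_disturbance
        return pressure_to_altitude(corrected)

    # Example: a gust raises pressure by 0.6 hPa at both sensors; the compensated
    # altitude stays close to the true value instead of dipping by roughly 5 m.
    print(compensated_altitude(p_onboard=1008.0 + 0.6,
                               p_reference_now=1013.25 + 0.6,
                               p_reference_at_takeoff=1013.25))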

20. Hierarchical Multi-Drone and Sensor Platform with Federated Path Planning and Information Lateralization

SOTER TECHNOLOGY INC, 2024

Facilitates path management for unmanned vehicles using a hierarchical multi-drone/sensor platform with information lateralization and federated path planning. The platform involves multiple drones, ground robots, and sensors with complementary functions acquiring heterogeneous information at different resolutions; this information is integrated and fed back to adjust planned mission paths. The platform is modeled after brain lateralization, with drones and sensors taking specialized roles analogous to the human brain's hemispheres.

Patent drawing: US12007791B2

21. Calibration Method for Camera and Lidar Sensors Using Patterned and Reflective Calibration Board

SICK AG, 2024

A method for calibrating a camera and lidar sensor using a calibration board with known patterns and reflection areas. The method determines the intrinsic camera parameters and then uses the lidar sensor's projected laser lines to detect the calibration board's pose. The calibration board's known patterns and reflection areas enable simultaneous calibration of both sensors without requiring prior camera calibration.

22. Dynamic Mode-Switching Self-Position Estimation Method for Autonomous Mobile Objects

SONY CORP, 2024

An autonomous mobile object's self-position estimation method is dynamically switched between two modes based on the object's state. When the object is moving, it uses a successive estimation method that integrates internal sensor data. When the object is stopped, it switches to a discrete estimation method that uses external sensor data, enabling more accurate positioning in environments where GPS is unreliable.

Patent drawing: US11926038B2

23. Sensor Fusion System Integrating Visible, SWIR, and LWIR Cameras with Automatic Modality Switching for Enhanced Object Detection

GM GLOBAL TECHNOLOGY OPERATIONS LLC, 2024

A sensor fusion system for vehicle control applications that integrates visible light cameras with shortwave infrared (SWIR) and longwave infrared (LWIR) cameras to enhance object detection and recognition capabilities in adverse weather conditions. The system determines environmental conditions, such as low light or glare, and automatically switches to alternative sensor modalities to maintain reliable object detection and tracking. The fused images from multiple sensors are processed to generate a comprehensive understanding of the vehicle's surroundings, enabling proactive hazard detection and prediction.

24. Navigation System Integrating Optical and Motion Sensor Data Using Nonlinear State Estimation Technique

Dylan Krupity, 2024

A navigation system for autonomous vehicles that integrates optical sensor data with motion sensor data to provide accurate positioning in environments where traditional GNSS systems are unreliable. The system uses a nonlinear state estimation technique that incorporates a measurement model for optical samples, enabling direct integration of optical data with motion sensor data to generate a precise navigation solution. The system can operate in environments with limited GNSS visibility, such as dense urban areas, and can also be used in pedestrian navigation applications.

Patent drawing: US11875519B2

25. Multi-Sensor Calibration System Utilizing Third Sensor for 3D Measurement of Calibration Objects

SONY GROUP CORP, 2023

An information processing device, method, and system for accurate calibration of multiple sensors, including cameras and lidars, regardless of sensor type or range. The system uses a third sensor to measure 3D information of calibration objects placed within the measurement ranges of the first and second sensors, and calculates the relative positions and orientations of the sensors based on the third sensor data and sensor data from the first and second sensors.

Patent drawing: WO2023243374A1

26. Pose Estimation System Utilizing Integrated Inertial, Kinematic, and Odometry Sensors with Noise Adjustment

VOLVO CAR CORP, 2023

A lightweight pose estimation system for autonomous vehicles that determines vehicle position and orientation using a combination of inertial, kinematic, and odometry sensors. The system generates a pose value by integrating sensor readings and can adjust measurements based on observed noise. It enables efficient and real-time pose estimation for autonomous maneuvers, particularly in emergency braking applications.

Patent drawing: EP4293318A1

27. Environment Sensor Calibration System Utilizing Dual Marking Element Pose Analysis

VALEO SCHALTER UND SENSOREN GMBH, 2023

Calibration of an environment sensor system of an infrastructure device, such as a camera system or lidar system, for autonomous vehicle navigation. The system identifies two independent marking elements in the sensor's field of view, determines their relative pose, and compares it to a predetermined target value. Based on the comparison result, the system corrects the sensor's pose information in a reference coordinate system.

28. Navigation and Positioning System Utilizing Inertial, Binocular, and Radar Data with Nonlinear Graph Optimization for Drones

SHENZHEN UNIVERSITY, 2023

A high-reliability and high-precision navigation and positioning method for drones in GPS-denied environments. The method combines inertial sensor data with binocular camera images and ranging radar measurements to achieve accurate pose estimation. A nonlinear graph optimization algorithm based on a sliding window is used to fuse the sensor data and obtain high-precision pose estimates. The system also includes a loop closure detection module that enables four-degree-of-freedom pose graph optimization when the drone revisits a previously mapped location. The optimized pose data is then packaged into a pseudo-GPS signal that is input to the drone's flight controller for positioning and route planning.
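
Packaging the optimized pose as a "pseudo-GPS signal" implies re-encoding it in a format the flight controller already accepts; one common choice (assumed here, not specified by the patent) is an NMEA GGA sentence, sketched below.

    from functools import reduce

    def to_nmea_latlon(deg, is_lat):
        """Convert decimal degrees to NMEA ddmm.mmmm / dddmm.mmmm with hemisphere letter."""
        hemi = ("N" if deg >= 0 else "S") if is_lat else ("E" if deg >= 0 else "W")
        deg = abs(deg)
        d = int(deg)
        minutes = (deg - d) * 60.0
        width = 2 if is_lat else 3
        return f"{d:0{width}d}{minutes:07.4f}", hemi

    def make_gga(utc_hms, lat_deg, lon_deg, alt_m, num_sats=12, hdop=0.9):
        """Package an optimized pose estimate as a pseudo-GPS NMEA GGA sentence."""
        lat, ns = to_nmea_latlon(lat_deg, is_lat=True)
        lon, ew = to_nmea_latlon(lon_deg, is_lat=False)
        body = (f"GPGGA,{utc_hms},{lat},{ns},{lon},{ew},1,{num_sats:02d},"
                f"{hdop:.1f},{alt_m:.1f},M,0.0,M,,")
        checksum = reduce(lambda a, c: a ^ ord(c), body, 0)      # XOR of all chars after '$'
        return f"${body}*{checksum:02X}"

    # Example: feed the fused SLAM pose to the flight controller as if it were a GPS fix.
    print(make_gga("123519.00", 22.5320, 113.9305, 87.3))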

29. Position Determination Method for Unmanned Aerial Systems Using Radar Node Network with Slant Distance and Elevation Angle Measurements

FIRST RF CORP, 2023

A method for determining the position of an unmanned aerial system (UAS) using a network of radar nodes. Each node measures the slant distance and elevation angle to the UAS, and transmits its position and azimuthal bounds to neighboring nodes. By combining these measurements, the nodes can solve for the UAS position without explicit measurement of its azimuthal position, providing a unique solution without ambiguity.

Patent drawing: US11709251B1
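
The geometry above can be worked through directly: each node's slant range and elevation angle give a height and a horizontal range, so two nodes constrain the UAS to the intersection of two horizontal circles. The sketch below solves that intersection for an assumed pair of node positions and omits the azimuthal-bound disambiguation the patent describes.

    import numpy as np

    def horizontal_circle(node_xy, slant_range, elevation_rad):
        """Each radar node constrains the UAS to a circle: centre, horizontal radius, height."""
        r_horiz = slant_range * np.cos(elevation_rad)
        height = slant_range * np.sin(elevation_rad)
        return np.asarray(node_xy, float), r_horiz, height

    def intersect(node1, node2):
        (c1, r1, h1), (c2, r2, h2) = node1, node2
        d = np.linalg.norm(c2 - c1)
        a = (r1**2 - r2**2 + d**2) / (2 * d)            # distance from c1 to the chord midpoint
        half_chord = np.sqrt(max(r1**2 - a**2, 0.0))
        mid = c1 + a * (c2 - c1) / d
        perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
        z = 0.5 * (h1 + h2)                             # heights from both nodes should agree
        # Two mirror-image candidates; the patent's azimuthal bounds would pick one.
        return [np.array([*(mid + half_chord * perp), z]),
                np.array([*(mid - half_chord * perp), z])]

    n1 = horizontal_circle([0.0, 0.0],    slant_range=1200.0, elevation_rad=np.radians(20))
    n2 = horizontal_circle([1500.0, 0.0], slant_range=1000.0, elevation_rad=np.radians(24))
    print(intersect(n1, n2))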

30. Visual Positioning Method Utilizing Sensor Data and Coarse Map Information for Mobile Device Position Estimation

QUALCOMM INC, 2023

A method for determining a position estimate of a mobile device using visual positioning. The method includes obtaining sensor information, detecting identifiable features in the sensor information, determining a range to the features, obtaining coarse map information, and determining the position estimate based on the range and map information. The method can utilize various sensors, including cameras, lidar, and radar, and can leverage coarse map data from remote servers.

31. Autonomous UAV Navigation System Utilizing GNSS, IMU Data, and Quadcopter Aerodynamics

SKYDIO INC, 2023

Autonomous navigation system for unmanned aerial vehicles (UAVs) that enables reliable flight in environments where traditional navigation sensors are unreliable. The system uses a combination of GNSS location signals, inertial measurement unit (IMU) data from accelerometers and gyroscopes, and quadcopter aerodynamics to determine the UAV's position, velocity, and orientation. By leveraging the unique characteristics of quadcopters, the system can maintain stable flight and navigate through challenging environments without relying on cameras, compasses, or magnetometers.

Patent drawing: US2023204797A1

32. Radar-to-Lidar Calibration Method Using Point Cloud Registration and Entropy Minimization

GM CRUISE HOLDINGS LLC, 2023

A method for radar-to-lidar calibration in autonomous vehicles that eliminates the need for specialized calibration targets. The method uses point cloud registration and entropy minimization to align radar and lidar point clouds gathered from different vehicle poses, enabling calibration in unstructured environments. The process aggregates radar and lidar point clouds using vehicle odometry and SLAM data, and then minimizes entropy over multiple drive segments to achieve accurate calibration.

Patent drawing: EP4180834A1

33. System for Automated Extrinsic Calibration of Vehicle and Robot Sensors with Turntable, Calibration Target, and Distributed Imaging Components

KINETIC AUTOMATION INC, 2023

Automated extrinsic calibration system for lidars, cameras, radars, and ultrasonic sensors on vehicles and robots, comprising a turntable system, calibration target system, and distributed imaging systems. The system enables precise calibration of sensor systems through automated scanning, target configuration, and image capture, eliminating the need for manual measurement and technician expertise.

Patent drawing: WO2023081870A1

34. Object Ranging Apparatus with Dual Estimation and Adaptive Result Combination

NEC CORP, 2023

An object ranging apparatus for improving accuracy and stability of object ranging in autonomous vehicles, comprising: an object recognition unit; a first distance estimation unit using depth estimation; a second distance estimation unit using motion parallax; and a combining unit that combines the results of the two estimation methods based on factors such as steering wheel angle and acceleration.

Patent drawing: US2023075659A1

35. Vision-Guided LIDAR System for Three-Dimensional Localization of Moving Platforms

TELEDYNE SCIENTIFIC & IMAGING LLC, 2023

Determining the location and/or navigation path of a moving platform. The method includes using a vision system on a moving platform to identify a region of interest, classifying objects within the region of interest, directing random-access LIDAR to ping one or more of the classified objects, and locating the platform in three dimensions using data from the vision system and LIDAR.

Patent drawing: US11598878B2

36. Navigation Solution Validation System with Dynamic Sensor Integrity Monitoring and Faulty Data Rejection

TRX SYSTEMS INC, 2023

System for validating navigation solution outputs by monitoring sensor integrity and automatically eliminating faulty data. The system continuously assesses the accuracy and reliability of multiple navigation sensors, including GNSS, accelerometers, gyroscopes, and others, and dynamically adjusts the navigation solution based on sensor performance. When sensor integrity is compromised, the system automatically rejects the affected data to ensure reliable position and timing outputs.

Patent drawing: US2023065658A1
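
One standard way to implement this kind of integrity monitoring is a chi-square test of each measurement's innovation against the filter's predicted covariance, rejecting data whose innovation is statistically implausible; the gate value and 2D position measurement below are assumed for illustration.

    import numpy as np

    CHI2_GATE_2DOF = 9.21          # ~99% gate for a 2-dimensional measurement

    def passes_integrity_check(z, z_pred, H, P, R):
        """
        Chi-square innovation test: reject the measurement if its normalized
        innovation squared exceeds the gate, indicating a likely sensor fault.
        """
        nu = z - z_pred                                  # innovation
        S = H @ P @ H.T + R                              # predicted innovation covariance
        nis = float(nu.T @ np.linalg.inv(S) @ nu)        # normalized innovation squared
        return nis <= CHI2_GATE_2DOF

    # Example: a GNSS fix that disagrees wildly with the predicted state gets rejected
    # and excluded from the navigation solution until its integrity recovers.
    P = np.diag([4.0, 4.0, 1.0, 1.0])                    # state covariance [x, y, vx, vy]
    H = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])
    R = np.diag([25.0, 25.0])
    x_pred = np.array([100.0, 50.0, 5.0, 0.0])
    print(passes_integrity_check(np.array([103.0, 52.0]), H @ x_pred, H, P, R))   # True
    print(passes_integrity_check(np.array([160.0, 90.0]), H @ x_pred, H, P, R))   # False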

37. Hierarchical Multi-Modal Sensing System with Adaptive Resolution for Environmental Mapping

LAWRENCE LIVERMORE NATIONAL SECURITY LLC, 2023

A multi-modal sensing approach for environmental mapping in autonomous systems that adapts data capture to object identification. The system uses a hierarchical structure to mimic human visual processing, initially capturing low-resolution data over a wide field of view and then selectively applying high-resolution sensing to areas of interest. Object recognition algorithms inform the distribution of sensing resources, enabling efficient data collection and object classification.

Patent drawing: US11585933B2

38. Drone Interface Device for Real-Time Data Processing and Frequency Conversion Between Onboard Computer and Flight Controller

UNIV DEGLI STUDI DI FIRENZE, 2023

A drone system that enables advanced autonomous flight capabilities through a novel interface device that connects an onboard computer to the flight controller and inertial measurement unit. The interface device performs real-time data processing and frequency conversion to enable efficient communication between the onboard computer and flight controller, while also supporting the integration of additional sensors. This enables the development of advanced autonomous flight modes, including AI-powered control systems that can learn from human pilots.

Patent drawing: WO2023286097A1

39. Sensor Configuration Method for Autonomous Vehicles Using Comparative Analysis of Target and Capability Specification Maps

WAYMO LLC, 2022

Optimizing sensor configuration for autonomous vehicles by comparing target and capability specification maps to identify regions where sensors are insufficient for a task. The method determines a target specification map indicating the sensor parameters a task requires and a capability map showing what the sensors can provide. Comparing the maps identifies regions where the sensors fall short, allowing configuration changes that improve coverage and enabling evaluation of sensor sufficiency before deployment.

Patent drawing: US11529973B1

40. Autonomous Aircraft Positioning System with Multispectral Sensor Suite for Object-Based Navigation

ROCKWELL COLLINS INC, 2022

Autonomous aircraft positioning system using multispectral sensors to enable precise landing and navigation without relying solely on traditional avionics such as GPS or ILS. A custom multispectral sensor suite onboard the autonomous aircraft detects and identifies specific objects such as runway lights, airfield structures, or hatchery ponds, using a wide range of spectral bands beyond visible light to enhance detection. An onboard object identification and positioning system compares the sensor data with stored historical data to identify objects and calculate the aircraft's position and trajectory, enabling autonomous landing without external visual cues.

Patent drawing: US11532237B2

41. Vehicle Navigation Method Utilizing Weighted Multi-Source Data Integration with Kalman Filter

GE AVIATION SYSTEMS LLC, 2022

A method of operating a vehicle that improves navigation accuracy by combining data from multiple sources with statistical weights based on their reliability. The method collects navigation parameters from sensors, GPS, and inertial systems, determines their uncertainties, and assigns weights to each parameter based on its reliability. A navigational solution is then formed by blending the weighted parameters using a Kalman filter, providing an optimized navigation solution with overall uncertainty estimates.
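
For a single navigation parameter, reliability-based statistical weighting is equivalent to inverse-variance blending (which is also what a scalar Kalman update performs); the sketch below applies it to an altitude estimate from three sources with assumed uncertainties.

    import numpy as np

    def blend(values, sigmas):
        """
        Inverse-variance weighted blend of one navigation parameter from several sources.
        Returns the fused value and its overall 1-sigma uncertainty.
        """
        values, sigmas = np.asarray(values, float), np.asarray(sigmas, float)
        weights = 1.0 / sigmas**2
        fused = np.sum(weights * values) / np.sum(weights)
        fused_sigma = np.sqrt(1.0 / np.sum(weights))
        return fused, fused_sigma

    # Altitude (m) from GPS, barometric sensing, and inertial propagation, each with an
    # assumed uncertainty; the more reliable sources dominate the blended result.
    altitude, sigma = blend(values=[152.0, 149.0, 155.0], sigmas=[3.0, 1.5, 6.0])
    print(f"blended altitude = {altitude:.1f} m ± {sigma:.1f} m")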

42. Camera Pose and Scale Estimation Method with Prior-Informed Cost Function Adjustment

MICROSOFT TECHNOLOGY LICENSING LLC, 2022

Method for estimating camera pose and scale that achieves both high speed and high accuracy by incorporating prior knowledge of rotation and scale into the estimation process. The method uses prior parameters derived from inertial sensor measurements to bias the cost function, accelerating the estimation process and improving accuracy compared to conventional approaches. The prior parameters selectively influence the cost function through rotation and scale weights, which are adjusted based on sensor noise. The method determines the camera pose and scale by optimizing the similarity transformation that minimizes the cost function below a threshold.

Patent drawing: US11443455B2

43. Map Generation Apparatus with Feature Point Extraction and Density Adjustment Based on Landmark Significance

HONDA MOTOR CO LTD, 2022

Map generation apparatus for vehicle positioning that extracts feature points from sensor data, generates a map incorporating these points, recognizes landmarks, determines their importance, and adjusts the map's feature point density based on landmark significance.

44. Indoor Autonomous Aerial System with Machine Learning-Driven Navigation and Visual Data Processing for Micro Aerial Vehicles

FLYVIZ INDOOR LTD, 2022

Indoor autonomous aerial system that enables efficient and precise navigation of micro aerial vehicles (MAVs) using machine learning algorithms. The system combines MAVs with image capture capabilities to autonomously navigate to desired locations within a deployable space, leveraging input from visual data to guide the MAV. By analyzing visual patterns and features, the system can extract valuable location data and present it to the MAV controller, enabling precise navigation and sign presentation applications. The system also incorporates battery level monitoring through onboard sensors, eliminating the need for traditional GPS.

45. System for Automatic Calibration of Vehicular Sensors Using Multi-Frame Object Detection and Sensor Data Integration

NETRADYNE INC, 2022

Automatic calibration of vehicular sensor systems using visual data from cameras and other sensors. The system detects stationary objects, such as traffic signs, and tracks their positions across multiple frames to estimate camera pose and sensor offsets. It filters object detection tracks based on quality metrics and jointly computes camera calibration parameters and object locations using the filtered tracks. The system can also incorporate data from other sensors, such as GPS, IMU, and wheel odometry, to improve accuracy and robustness.

Patent drawing: US2022270358A1

46. Robotic Control System Integrating Inertial Measurement Unit and String-Encoder Sensors for 3D Position and Orientation Calculation

JEANOLOGIA TEKNOLOJI AS, 2022

A 3D position and orientation calculation and robotic application structure that enables precise and repeatable robotic operations by combining inertial measurement unit (IMU) and string-encoder position sensors. The system records the movements of a portable recording apparatus using the IMU and string-encoder sensors, and then applies these movements to a robot for unmanned operation. The system provides high accuracy and precision, eliminates human error, and enables the creation of precise robotic programs for repetitive tasks.

Patent drawing: US2022193919A1

47. Device for Adaptive Self-Position Estimation Method Selection in Autonomous Mobile Objects

SONY GROUP CORP, 2022

A device for more appropriate action control in autonomous mobile objects such as drones. It selects the most suitable self-position estimation method from multiple options based on indicators such as GPS accuracy, enabling positioning optimized for the situation. The selected estimate is used to obtain the mobile object's state, and actions are then controlled based on that state using predefined plans.

Patent drawing: US2022157179A1

48. Distributed Localization System with Visual Feature Matching and Inertial Data Integration for GPS-Denied Environments

SRI INTERNATIONAL, 2022

A collaborative localization system for multiple platforms that enables accurate positioning in GPS-denied environments through real-time, distributed information sharing. The system uses visual feature matching and inertial measurement data to determine platform poses, with each device contributing to a shared map of geo-referenced visual features. When a device encounters a new feature, it uses IMU data to estimate its position, while matching features to the shared map enables precise localization. The system's distributed architecture enables organic collaboration without requiring special behaviors, making it suitable for applications where multiple platforms operate independently.

49. Mobile Ground Vehicle with Pose Sensors for UAV Localization and Tracking

PICKER DRONES INC, 2022

System for accurate localization and tracking of unmanned aerial vehicles (UAVs) using a mobile ground vehicle. The system has a wheeled chassis with pose sensors that track UAV positions. A processor calculates UAV poses relative to the chassis using the sensor data. This allows UAV navigation even in nested environments where fixed sensors can't see them. The chassis can also provide extended power and computation resources for the UAVs.

50. Autonomous Aerial Drone with AI-Driven Navigation and Sensor-Integrated Obstacle Avoidance System

KARBASI ARDAVAN, 2022

Autonomous aerial drone that enables self-sustaining flight through advanced AI-driven navigation and obstacle avoidance. The drone integrates multiple onboard sensors, including cameras, to detect and respond to environmental changes, while its onboard AI system continuously optimizes flight paths and collision avoidance strategies. The system enables autonomous flight, route planning, and real-time obstacle avoidance without human operator intervention.

51. Method for Vehicle Positioning Using Stable Landmark Identification and Multi-Source Data Integration

52. Radar-Based Aircraft Positioning System with Multi-Antenna and Multi-Receiver Configuration for Three-Dimensional Determination

53. Vision-Based Aircraft Navigation System Utilizing Multi-Sensor Image Analysis and Georeferenced Databases

54. Radar-Based Odometry System Utilizing Deep Neural Network for Static Object Detection

55. Sensor System for Synchronized Image Capture and Lidar Scans with Correlated Double Sampling
