SLAM Technology for GPS-Denied Navigation in UAVs
This page presents patents that enable accurate UAV navigation and positioning in GPS-denied environments through:
- Vision-Based Localization Systems – Camera-based positioning using semantic image label comparison, depth map alignment with machine learning, and point-line feature tracking with inertial sensors for autonomous navigation.
- Multi-Sensor Fusion Architectures – Integration of cameras, inertial sensors, lidar, radar, and ultrasonic systems with Bayesian data fusion and Kalman filtering for robust GPS-independent positioning.
- Neural Network-Based Navigation – Onboard trained neural networks for trajectory prediction, CNN-based object/landmark mapping with geometric transformation, and real-time tunnel defect detection.
- Alternative Positioning Technologies – mm-wave beacon systems with time-synchronized signals for indoor positioning, wireless positioning tags for relative positioning, and signal transmission delay-based localization methods.
- GNSS Validation and Backup Systems – Visual localization for GNSS error validation, camera-based backup navigation with feature tracking, and sensor-derived digital maps for indoor flight path generation.
1. Binocular Camera and Inertial Sensor-Based Navigation System with Point and Line Feature Tracking for UAVs
SHENZHEN UNIVERSITY, 2025
A high-precision navigation and positioning system for UAVs in GPS-denied environments that uses a binocular camera and an inertial sensor to achieve accurate positioning and attitude estimation. The system extracts and tracks both point and line features from the camera images and fuses this data with inertial measurements for robust, reliable positioning. It also incorporates a ranging radar for altitude measurement and loop-closure detection for graph optimization, producing efficient and accurate positioning for autonomous UAV flight.
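The point-feature half of that pipeline can be illustrated with a few lines of OpenCV. The sketch below is a heavily simplified assumption, not the patented system: it tracks corners between two synthetic frames with Lucas-Kanade optical flow and blends the resulting image motion with an invented IMU prediction.

```python
# Minimal sketch (not the patented system): track point features between two
# frames with OpenCV and blend the visual motion estimate with an assumed IMU
# prediction. Frame data, the IMU value, and the fusion weight are illustrative.
import numpy as np
import cv2

def visual_translation(prev_gray, curr_gray):
    """Estimate a robust mean 2D image translation from tracked point features."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return None
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if good.sum() < 10:
        return None
    flow = (nxt[good] - pts[good]).reshape(-1, 2)
    return np.median(flow, axis=0)              # robust to a few bad tracks

# Synthetic frames: the second frame is the first shifted by a known offset.
rng = np.random.default_rng(0)
frame0 = (rng.random((240, 320)) * 255).astype(np.uint8)
frame0 = cv2.GaussianBlur(frame0, (5, 5), 0)    # give LK usable gradients
frame1 = np.roll(frame0, shift=(4, 7), axis=(0, 1))

vis = visual_translation(frame0, frame1)        # ≈ [7, 4] in (x, y) pixels
imu_pred = np.array([6.5, 4.3])                 # assumed IMU-derived prediction
if vis is not None:
    fused = 0.7 * vis + 0.3 * imu_pred          # simple weighted fusion
    print("visual:", vis, "fused:", fused)
```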
2. Method for UAV Localization Using Semantic Image Label Comparison
WING AVIATION LLC, 2024
A method for determining the location of an unmanned aerial vehicle (UAV) using imagery, particularly in environments where traditional positioning systems are unreliable. The method involves generating semantic labels for objects in a captured image, comparing these labels to reference labels in a map to estimate the current location, and accumulating these estimates to determine a precise location.
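The accumulation step in this entry can be sketched as a simple per-cell voting scheme. The map, label sets, and scoring rule below are hypothetical; they only illustrate how semantic matches might be accumulated across frames to pick the most likely cell.

```python
# Minimal sketch (assumptions, not the patented method): score candidate map
# cells by how many semantic labels seen in the current image match the labels
# stored for that cell, and accumulate scores across frames.
from collections import Counter

# Hypothetical reference map: cell id -> set of semantic labels expected there.
reference_map = {
    (0, 0): {"road", "tree", "roof"},
    (0, 1): {"road", "car", "pool"},
    (1, 0): {"tree", "roof", "solar_panel"},
}

def score_frame(observed_labels, ref_map):
    """Return per-cell match counts for one frame's semantic labels."""
    obs = set(observed_labels)
    return {cell: len(obs & labels) for cell, labels in ref_map.items()}

def accumulate(frames, ref_map):
    """Sum per-cell scores over several frames and return the best cell."""
    totals = Counter()
    for labels in frames:
        totals.update(score_frame(labels, ref_map))
    return totals.most_common(1)[0]

frames = [["tree", "solar_panel"], ["roof", "solar_panel"], ["tree", "roof"]]
print(accumulate(frames, reference_map))   # ((1, 0), 6)
```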
3. UAV Navigation System Utilizing mm-Wave Beacons with Time-Synchronized Wideband Signal Transmission
FRAUNHOFER-GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V, 2024
A navigation system for unmanned aerial vehicles (UAVs) that uses mm-wave beacons to enable indoor and urban positioning without relying on GPS. The system transmits time-synchronized wideband signals from spaced-apart base stations, using beams facing each other to create a flight path for the UAV. The UAV receives the signals and determines its position relative to the base stations based on reception times and intensities.
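Because the transmitters are time-synchronized, each reception time maps to a range, and a position follows from standard linearized trilateration. The geometry, timestamps, and least-squares formulation below are illustrative assumptions, not the patented signal design.

```python
# Minimal sketch (assumed geometry, not the patented protocol): with
# time-synchronized base stations, each reception time gives a range
# r_i = c * (t_rx_i - t_tx); the position follows from linearized
# least-squares trilateration.
import numpy as np

C = 299_792_458.0                                # propagation speed, m/s

def localize(anchors, rx_times, tx_time):
    """Least-squares 2D position from synchronized one-way time of flight."""
    ranges = C * (np.asarray(rx_times) - tx_time)
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (p0 - anchors[1:])                 # rows: 2 * (p0 - p_i)
    b = (ranges[1:] ** 2 - r0 ** 2
         - np.sum(anchors[1:] ** 2, axis=1) + np.sum(p0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
true_pos = np.array([12.0, 30.0])
tx_time = 0.0
rx_times = np.linalg.norm(anchors - true_pos, axis=1) / C + tx_time
print(localize(anchors, rx_times, tx_time))      # ~[12. 30.]
```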
4. UAV Localization via Camera-Based Depth and Semantic Map Alignment Using Machine Learning
WING AVIATION LLC, 2024
Using machine learning to localize an unmanned aerial vehicle (UAV) in an environment without relying solely on GPS. The method involves capturing a 2D image of the environment with the UAV's camera, applying a trained machine learning model to the image to generate a depth map and a semantic segmentation map, retrieving reference depth and semantic data for the environment, and aligning the generated depth map with the reference depth map to determine the UAV's location. The alignment associates the semantic labels from the generated map with the reference labels from the reference data.
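The alignment step can be approximated by an exhaustive search over candidate offsets that trades off depth error against semantic disagreement. The tile sizes, weights, and synthetic data below are assumptions for illustration only.

```python
# Minimal sketch (assumptions, not the patented model): slide the
# camera-derived depth/semantic maps over a reference tile and pick the
# offset that minimizes depth error while rewarding semantic agreement.
import numpy as np

def align(depth, labels, ref_depth, ref_labels, search=5, w_sem=0.5):
    """Return the (dy, dx) offset with the best combined score."""
    h, w = depth.shape
    best, best_off = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rd = ref_depth[search + dy: search + dy + h,
                           search + dx: search + dx + w]
            rl = ref_labels[search + dy: search + dy + h,
                            search + dx: search + dx + w]
            depth_err = np.mean(np.abs(depth - rd))
            sem_mismatch = np.mean(labels != rl)
            cost = depth_err + w_sem * sem_mismatch
            if cost < best:
                best, best_off = cost, (dy, dx)
    return best_off

# Synthetic example: the reference tile is padded so every shift stays in bounds.
rng = np.random.default_rng(1)
ref_depth = rng.uniform(5, 40, size=(40, 40))
ref_labels = rng.integers(0, 4, size=(40, 40))
true = (2, -3)
depth = ref_depth[5 + true[0]:35 + true[0], 5 + true[1]:35 + true[1]]
labels = ref_labels[5 + true[0]:35 + true[0], 5 + true[1]:35 + true[1]]
print(align(depth, labels, ref_depth, ref_labels))   # (2, -3)
```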
5. UAV GNSS Position Validation System with Redundant Positioning and Visual Localization
WING AVIATION LLC, 2024
A system for validating GNSS position of unmanned aerial vehicles (UAVs) without reference to geofiducials, enabling precise navigation in urban environments and remote areas. The system employs redundant positioning systems, including visual localization, to validate GNSS data and mitigate errors introduced by signal reflections and dynamic environmental factors. Visual localization is achieved through computer vision routines that detect and recognize unique visual patterns on landing pads, eliminating the need for surveyed geofiducials and enabling rapid expansion of UAV networks to new locations.
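A minimal, hedged sketch of the validation idea: compare the GNSS fix against the independent visual fix and reject the GNSS solution when the discrepancy exceeds a gate derived from assumed error budgets (all numbers invented).

```python
# Minimal sketch (thresholds and positions are illustrative): cross-check the
# GNSS fix against an independent visual-localization fix and flag the GNSS
# solution if the two disagree by more than the expected combined error.
import math

def validate_gnss(gnss_xy, visual_xy, gnss_sigma_m=3.0, visual_sigma_m=1.0,
                  gate=3.0):
    """Return True if the GNSS fix is consistent with the visual fix."""
    dx = gnss_xy[0] - visual_xy[0]
    dy = gnss_xy[1] - visual_xy[1]
    dist = math.hypot(dx, dy)
    limit = gate * math.hypot(gnss_sigma_m, visual_sigma_m)
    return dist <= limit

print(validate_gnss((104.2, 88.9), (103.5, 89.6)))   # True: fixes agree
print(validate_gnss((131.0, 88.9), (103.5, 89.6)))   # False: likely multipath
```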
6. Indoor UAV Navigation Positioning System with GPS-Based Flight Path Generation Using Sensor-Derived Digital Maps
VERIZON PATENT AND LICENSING INC, 2023
A positioning system for indoor UAV navigation that generates GPS-based flight paths using a digital map of the environment created from sensor data, enabling accurate localization and path planning in GPS-denied environments.
7. Vision-Lidar Coupling System for UAV-Based Tunnel Modeling with Bayesian Data Fusion
TONGJI UNIVERSITY, 2023
Method and system for modeling poor-texture tunnels using a vision-lidar coupling on an unmanned aerial vehicle (UAV). The system integrates a depth camera and lidar for simultaneous localization and mapping (SLAM), leveraging the wide-range information of the lidar and local details of the depth camera to improve accuracy. The system fuses point cloud data, raster maps, and pose information using Bayesian fusion, and iteratively refines the map model through feature matching between successive frames. The system also employs positioning UAVs and auxiliary lighting to enhance data quality and accuracy.
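A common simplification of Bayesian occupancy fusion is to add per-cell log-odds from the two sensors under an independence assumption. The sketch below is that textbook form with invented probability grids, not the patent's full pipeline.

```python
# Minimal sketch (not the patented pipeline): fuse lidar and depth-camera
# occupancy evidence for the same grid with independent-sensor log-odds
# updates, a common simplification of Bayesian occupancy fusion.
import numpy as np

def to_log_odds(p):
    return np.log(p / (1.0 - p))

def to_prob(l):
    return 1.0 / (1.0 + np.exp(-l))

def fuse(p_lidar, p_depth, prior=0.5):
    """Combine two per-cell occupancy probability grids into one."""
    l = to_log_odds(p_lidar) + to_log_odds(p_depth) - to_log_odds(prior)
    return to_prob(l)

p_lidar = np.array([[0.9, 0.5], [0.3, 0.7]])   # wide-range, coarse evidence
p_depth = np.array([[0.8, 0.6], [0.4, 0.7]])   # local, detailed evidence
print(fuse(p_lidar, p_depth))
```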
8. Method for Object and Landmark Mapping Using UAV Footage with Convolutional Neural Network and Geometric Transformation
THE TEXAS A&M UNIVERSITY SYSTEM, 2023
Method for identifying, locating, and mapping objects and landmarks using UAV camera footage in GPS-denied environments, comprising: obtaining aerial video footage; detecting targets of interest using a CNN; defining new reference points and pixel coordinates for each frame; applying geometric transformation to obtain real-world positions; and projecting targets onto an orthogonal map.
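The geometric-transformation step is conventionally a ground-plane homography fitted from a few reference points. The correspondences below are made up; the code only shows how detected pixel coordinates might be projected onto an orthogonal map.

```python
# Minimal sketch (correspondences are invented): map CNN-detected pixel
# coordinates to ground-plane positions with a homography fitted from four
# reference points, so the targets can be dropped onto an orthogonal map.
import numpy as np
import cv2

# Four image points (px) and their known ground positions (m) in this frame.
img_pts = np.float32([[100, 400], [540, 410], [520, 120], [130, 110]])
world_pts = np.float32([[0, 0], [20, 0], [20, 30], [0, 30]])
H = cv2.getPerspectiveTransform(img_pts, world_pts)

# Pixel centers of detected targets (e.g. CNN bounding-box centroids).
detections = np.float32([[320, 260], [450, 150]]).reshape(-1, 1, 2)
ground_xy = cv2.perspectiveTransform(detections, H).reshape(-1, 2)
print(ground_xy)       # real-world (x, y) per detection, metres
```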
9. Navigation System for Unmanned Vehicles Utilizing Image-Based Feature Detection and Decoding
BATTELLE ENERGY ALLIANCE LLC, 2022
A navigation system for unmanned vehicles that uses visual features, such as image-based codes, to determine vehicle location and control movement within environments without GPS. The system captures images from the vehicle, detects and decodes the visual features, and uses the decoded information to calculate the vehicle's location and send control commands to navigate the vehicle through the environment.
10. Unmanned Aerial Vehicle-Based Tunnel Defect Detection System with Integrated Pose Estimation and Real-Time Neural Network Analysis
UNIV TONGJI, 2022
A tunnel defect detection system using an unmanned aerial vehicle (UAV) that enables accurate defect detection in GPS-denied environments. The system integrates an LED module for illumination with an IMU, camera, laser radar, and ultrasonic distance meter to estimate the UAV's pose. A trained neural network model detects defects in real time from images captured by the camera under LED illumination, and the UAV hovers for further inspection when a defect is detected. The system achieves accurate defect detection and pose estimation in tunnels with no GPS signal and highly symmetrical structures.
11. Target State Estimation Method for UAVs Using Image Recognition, Point Cloud Processing, and Extended Kalman Filtering
AUTEL ROBOTICS CO LTD, 2022
A target state estimation method for unmanned aerial vehicles (UAVs) that achieves high precision without relying on ground plane assumptions or height data. The method combines image recognition, point cloud processing, and extended Kalman filtering to estimate target location, speed, and other states. It uses multiple measurement sources, including image locations and point cloud data, to improve estimation accuracy and robustness.
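The patent describes an extended Kalman filter over image and point-cloud measurements; the sketch below is a deliberately simplified linear variant that tracks a target's 2D position and velocity from noisy position fixes, with all noise levels assumed.

```python
# Minimal sketch: a linear constant-velocity Kalman filter as a stand-in for
# the entry's extended Kalman filter (all models and noise values assumed).
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)      # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)      # we measure position only
Q = np.eye(4) * 0.01                            # process noise
R = np.eye(2) * 0.25                            # measurement noise

x = np.zeros(4)                                 # state: [px, py, vx, vy]
P = np.eye(4)

def step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z = [px, py]
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

rng = np.random.default_rng(2)
for k in range(50):                             # target moving at (1.0, 0.5) m/s
    z = np.array([k * dt * 1.0, k * dt * 0.5]) + rng.normal(0, 0.5, 2)
    x, P = step(x, P, z)
print(x)                                        # position and velocity estimate
```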
12. Unmanned Aerial Vehicle Positioning System Using Wireless Tags and Relative Positioning in Satellite-Denied Environments
SZ DJI TECHNOLOGY CO LTD, 2022
System for unmanned aerial vehicles (UAVs) to accurately locate and position themselves while performing tasks in an environment where satellite signals are blocked. The system uses multiple UAVs equipped with wireless positioning tags and onboard satellite positioning, which exchange wireless signals with a central control device. The relative positions between the main UAV and the positioning UAVs are determined, and the central device uses the satellite positions reported by the positioning UAVs to compute an accurate position for the main UAV. This allows the main UAV to locate objects in the environment while performing tasks, even when its own satellite signal is blocked.
13. Autonomous UAV Flight System Utilizing Onboard Neural Networks for GPS-Independent Positioning
UNITED STATES OF AMERICA AS REPRESENTED BY THE ADMINISTRATOR OF NASA, 2021
Enabling autonomous flight of unmanned aerial vehicles (UAVs) in urban environments without relying on GPS or other external positioning systems. The method trains onboard neural networks to provide real-time trajectory and location information. The networks are trained on sensor data from the UAV's surroundings to predict wind conditions and determine the UAV's position in environments where GPS signals may be blocked.
14. Camera-Based Position Determination System for UAVs with Visual Feature Tracking Parameters
WING AVIATION LLC, 2021
A backup navigation system for unmanned aerial vehicles (UAVs) that uses a camera to determine position when the primary GNSS system fails. The camera system generates a map with tracking parameters that indicate the ease of determining location based on visual features, allowing the UAV to plan and follow a route to a destination even in areas with limited visual references.
15. UAV Localization Method Using Signal Transmission Delay and Terminal Device Position Data
SHANGHAI FEILAI INFORMATION TECH CO LTD, 2021
Method for determining the location of an unmanned aerial vehicle (UAV) using signal transmission delay information between the UAV and a terminal device. The method involves obtaining location points of the terminal device, determining signal transmission delay information between the terminal device and the UAV at each location point, and determining the communication distance between the terminal device and the UAV based on the delay information. The method further involves receiving UAV position information from the UAV, determining the validity of the UAV position based on the communication distance, and using the communication distance to estimate the UAV's location when the UAV's GPS fails or drifts excessively.
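The delay-to-distance conversion and the validity check can be expressed in a few lines. The delay value, tolerance, and coordinates below are illustrative; the code simply shows the consistency test the entry describes.

```python
# Minimal sketch (numbers are illustrative): convert a measured one-way signal
# delay into a communication distance and use it to sanity-check the position
# the UAV reports.
import math

C = 299_792_458.0          # propagation speed, m/s

def delay_to_distance(delay_s):
    return C * delay_s

def position_is_valid(terminal_xy, reported_uav_xy, delay_s, tol_m=30.0):
    """Reported UAV position is valid if it matches the delay-derived range."""
    measured = delay_to_distance(delay_s)
    implied = math.dist(terminal_xy, reported_uav_xy)
    return abs(measured - implied) <= tol_m

delay = 1.2e-6                                          # ~360 m one-way delay
print(delay_to_distance(delay))                         # ≈ 359.8 m
print(position_is_valid((0, 0), (300, 200), delay))     # True (range ≈ 360.6 m)
print(position_is_valid((0, 0), (900, 0), delay))       # False: GPS drift
```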
16. Multi-UAV Control System with Real-Time Sensor-Based Subject Tracking and Dynamic Path Adjustment
DALAN LANE, 2021
A system for controlling unmanned aerial vehicles (UAVs) that enables precise tracking and movement coordination of multiple subjects within a geographical area. The system uses a combination of sensors, including image sensors and physical markers, to track subjects and adjust the UAV's flight path in real-time to maintain optimal positioning and avoid obstacles. The system also enables dynamic path adjustments based on changes in subject movement, weather conditions, and other factors.
17. Unmanned Aerial Vehicle System with AGV-Controlled Movement Using Position-Unrelated Sensing Data
SAMSUNG ELECTRONICS CO LTD, 2020
An unmanned aerial vehicle (UAV) system that enables autonomous inventory management without GPS or high-end sensors. The system comprises a UAV equipped with a wireless communication circuit, sensor, and processor, and an automated guided vehicle (AGV) that controls the UAV's movement. The AGV receives position-unrelated sensing information from the UAV, determines the UAV's position, and transmits movement commands to the UAV. The UAV executes the commands while transmitting sensing information to the AGV, which continuously updates the UAV's position.
18. Autonomous UAV Flight Control System with GPS-Independent Route Switching Mechanism
ALPINE ELECTRONICS INC, 2020
Flight control for unmanned aerial vehicles (UAVs) to enable autonomous flight without GPS in areas where GPS signals are weak or unavailable. The UAV follows a pre-planned route based on measured positions. If GPS accuracy degrades below a threshold, the UAV switches to a different route that does not require GPS. This allows the UAV to continue flying with secured positioning when GPS is lost. The alternative route may involve ascending while maintaining the original latitude and longitude, or following a different route entirely. The goal is to keep the UAV from colliding with obstacles when GPS is lost.
19. UAV Localization Method Using Image-Based Ground Structure Recognition and Peer-to-Peer Information Sharing
AMAZON TECHNOLOGIES INC, 2020
A method for unmanned aerial vehicles (UAVs) to determine their location without relying on GPS or other external systems. The method involves UAVs capturing images of known ground structures during flight, extracting the location of those structures, and then sharing that temporal and spatial information with other nearby UAVs. By triangulating the time of flight of the shared information, UAVs can determine their own locations using the known ground structures as reference points.
20. Indoor Drone Navigation System Utilizing GPS-Formatted Path Generation from Sensor-Derived Digital Maps
VERIZON PATENT AND LICENSING INC, 2025
System for enabling indoor GPS-like navigation for drones using existing drone software and hardware. The system generates GPS-formatted flight paths inside buildings by creating a digital map of the indoor environment from sensor data, localizing the drone on the map, and generating GPS-like coordinates for indoor waypoints. This allows indoor drone navigation without relying on GPS signals, which are unavailable indoors.
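Producing GPS-formatted waypoints from an indoor map usually amounts to offsetting a surveyed anchor coordinate by each waypoint's local east/north distances. The anchor and waypoints below are invented; the conversion uses the standard metres-per-degree approximation.

```python
# Minimal sketch (anchor coordinates are made up): convert indoor waypoints,
# expressed in metres on the building's digital map, into GPS-formatted
# latitude/longitude so existing autopilot software can fly them.
import math

M_PER_DEG_LAT = 111_320.0          # approximate metres per degree of latitude

def local_to_gps(anchor_lat, anchor_lon, east_m, north_m):
    """Offset a known anchor lat/lon by local east/north distances in metres."""
    lat = anchor_lat + north_m / M_PER_DEG_LAT
    lon = anchor_lon + east_m / (M_PER_DEG_LAT * math.cos(math.radians(anchor_lat)))
    return lat, lon

# Anchor: surveyed lat/lon of the building's map origin (assumed value).
waypoints_m = [(5.0, 2.0), (5.0, 20.0), (30.0, 20.0)]   # (east, north) metres
for e, n in waypoints_m:
    print(local_to_gps(40.7128, -74.0060, e, n))
```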
21. Aerial Vehicle Navigation System with Camera-Based Terrain Object Detection and Electronic Map Integration
TOMAHAWK ROBOTICS INC, 2025
A navigation system for aerial vehicles enables autonomous flight in GPS-denied areas by using a camera to record images of terrain and detecting objects within those images. The system compares detected objects with an electronic map to determine vehicle location and generates flight instructions based on that location and a target location. The system can build the electronic map by processing multiple images and associating object locations, shapes, and colors.
22. SLAM System with Integrated Camera and TOF Sensor Utilizing Sparse Depth Data
RUICHI ZHIHUI TECHNOLOGY CO LTD, 2024
Enhancing SLAM system performance by leveraging sparse depth data from depth sensors to reduce computational complexity. The system integrates camera and TOF sensor data into a single SLAM framework, enabling simultaneous processing of environmental data and inertial information. The sparse depth measurements from the TOF sensor are fed directly to the SLAM system, while the camera image captures environmental context. This integrated approach reduces the time and computational resources required for traditional depth preparation, significantly improving SLAM performance and operational speed.
23. Handheld 3D Scanning System with Projector and Cameras Utilizing Graph-Based SLAM and Machine Learning for Natural Feature Detection
FARO TECHNOLOGIES INC, 2024
A handheld 3D scanning system that generates 2D and 3D scans of an environment using a projector and cameras. The system employs a graph-based SLAM approach to enable simultaneous scanning, mapping, and trajectory generation while the scanner is moving. The system automatically detects natural features using machine learning, eliminating the need for artificial targets or manual feature identification. The system determines the scanner's pose by matching natural features across timepoints, enabling accurate registration of scans taken from different positions.
24. Mobile Scanning System with Graph-Based SLAM and Motion Compensation Integration
FARO TECHNOLOGIES INC, 2024
A mobile scanning system that generates 2D and 3D scans of an environment while integrated with an automated transporter robot. The system uses a graph-based SLAM approach to simultaneously locate the scanning device and map the environment, enabling autonomous or semi-autonomous scanning. The system determines a compensation vector and rotation for scan data based on the transporter robot's motion, allowing for accurate registration of scan data and generation of a comprehensive map.
25. Image-Based Localization and Tracking Method Using Laser Scanner Data with Surface Feature Matching and Position Refinement
FARO TECHNOLOGIES INC, 2024
A method for image-based localization and tracking using laser scanner data. A sensor associated with the processing system collects first data comprising surface points within an environment, and a simultaneous localization and mapping algorithm analyzes that data to estimate the system's position. A three-dimensional (3D) coordinate measuring device then collects second data comprising surface points, surface features from the first data set are matched to surface features from the second, and the estimated position is refined based on the matching. Finally, an augmented-reality representation of the second data, based at least in part on the refined position, is displayed on the processing system's display.
26. Motion Estimation Device with Pixel Distance and Reliability Calculation Circuits for Visual SLAM
TOSHIBA ELECTRONIC DEVICES & STORAGE CORP, KABUSHIKI KAISHA TOSHIBA, 2024
Motion estimation device for improving accuracy of motion estimation in visual simultaneous localization and mapping (SLAM) applications. The device includes a receiving circuit for input frames and a calculation circuit that estimates pixel distances and reliability based on pixel information, enabling more accurate motion estimation.
27. Real-Time 3D Occupancy Grid Map Generation Using Voxel Division and Region Map Origin Updating for Drones
AGENCY FOR DEFENSE DEVELOPMENT, 2024
A real-time map generation method for drones that creates a 3D occupancy grid map by dividing the environment into voxels and updating the map based on sensor data and vehicle location. The method compensates for the limitations of traditional voxel arrays by using a region map origin updating technique and inflation-based occupancy state propagation.
28. Monocular Camera-Based Positioning System with Virtual Floor Zoning for Unmanned Aerial Vehicles
INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, 2024
Positioning system that uses monocular cameras to provide accurate positioning information for unmanned aerial vehicles (UAVs) in areas where satellite signals are weak. The system generates positioning information for virtual floors of a field from monocular camera images and analyzes markings at each position to evaluate accuracy. The virtual floors are divided into zones based on that accuracy, and UAVs operate in the appropriate zones. Sensing data from the UAVs, such as height, motion, and images, can be fused to further improve positioning.
29. Simultaneous Positioning and Map Construction Method for Heterogeneous Unmanned Systems Using Binocular and Fisheye Camera Collaboration
SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY CHINESE ACADEMY OF SCIENCES, 2024
A method for simultaneous positioning and map construction of heterogeneous unmanned systems in degraded perception environments, where multiple robots with binocular cameras and fisheye cameras collaborate to achieve accurate positioning through mutual observation and map fusion. The method involves a stationary reference robot and a mobile exploration robot, where the reference robot maintains a stable position while the exploration robot generates a safe area map and follows a trajectory planned based on real-time visual observations and relative position adjustments.
30. 3D Map-Based Unmanned Aircraft Navigation System with Integrated Real-Time Positioning and Obstacle Detection
EHANG INTELLIGENT EQUIPMENT GUANGZHOU CO LTD, 2024
A 3D map-based unmanned aircraft flight control method, system, and medium that enables precise positioning and obstacle avoidance through integration of 3D map data and real-time positioning information. The system uses reference points in the 3D map to correlate with the aircraft's position, enabling accurate navigation and obstacle detection. The method combines 3D map data with real-time sensor data to perform obstacle avoidance control, ensuring safe and efficient flight operations.
31. Hybrid Localization System for Delivery Robots Using Image-Based and Structure-Based Analysis
DELIVERS AI ROBOTIK OTONOM SURUS BILGI TEKNOLOJILERI AS, 2024
A delivery system for autonomous or semi-autonomous delivery robots that enables location determination without relying on GPS signals. The system uses a hybrid localization method combining image-based and structure-based analysis, where depth cameras and environmental cameras capture images of the surroundings, and a location verification unit matches these images against a dataset to estimate the robot's position. The estimated position is then refined through structure-based calculation using point cloud maps constructed from the images.
32. Navigation System for Autonomous Vehicles with Integrated Optical and Motion Sensor Data Using Nonlinear State Estimation
Dylan Krupity, 2024
A navigation system for autonomous vehicles that integrates optical sensor data with motion sensor data to provide accurate positioning in environments where traditional GNSS systems are unreliable. The system uses a nonlinear state estimation technique that incorporates a measurement model for optical samples, enabling direct integration of optical data with motion sensor data to generate a precise navigation solution. The system can operate in environments with limited GNSS visibility, such as dense urban areas, and can also be used in pedestrian navigation applications.
33. Camera-Based Aircraft Navigation System Utilizing Visual Landmark Recognition and Machine Learning Algorithms
AURORA FLIGHT SCIENCES CORP, 2023
Aircraft navigation system that enables autonomous navigation through visual landmarks by leveraging camera-based scanning. The system employs a camera system to capture images of the aircraft's surroundings, which are then processed to determine the aircraft's position and orientation. This position information is used to guide the aircraft to its destination, eliminating the need for traditional navigation systems. The system employs machine learning algorithms to automatically identify and track landmarks in the aircraft's environment, enabling precise navigation through complex terrain.
34. Unmanned Aerial Vehicle System with Integrated Ultra-Wideband Ranging, Inertial Measurement, and Self-Localization Techniques
NANYANG TECHNOLOGICAL UNIVERSITY, 2023
An unmanned aerial vehicle (UAV) system that combines ultra-wideband (UWB) ranging with inertial measurement unit (IMU) data and onboard self-localization (OSL) techniques to achieve robust and reliable state estimation. The system uses UWB nodes placed on the UAV at offset positions to receive signals from anchor nodes in the environment, which are then combined with IMU data and OSL information to estimate the UAV's pose state. The system can also integrate lidar and visual features from onboard sensors to further enhance localization accuracy.
35. Robot with Dynamic Node Generation for Topological Mapping Using Real-Time Lidar and Camera Data
LG ELECTRONICS INC, 2023
A moving robot that generates a topological map by dynamically creating nodes based on real-time sensor data, including lidar and camera information, to accurately represent the robot's environment and enable efficient navigation. The robot determines open movement directions and creates new nodes when necessary, while also updating existing nodes based on sensor data. This approach enables the robot to generate a highly accurate map of its environment and navigate through it efficiently.
36. Snow Grooming Vehicle with Onboard Terrain Scanning and Reference Model-Based Navigation System
PRINOTH SPA, 2023
A snow grooming vehicle that can operate independently of GPS signals, using a reference model of the terrain created by scanning the area with sensors mounted on the vehicle itself. The vehicle determines its position and orientation relative to the reference model, allowing it to maintain its position and navigate through areas with poor satellite signal reception.
37. Visual Localization Map Update System with Consecutive Image Pose Calculation and Dynamic Adaptation Mechanism
NAVER LABS CORP, 2023
A method and system for updating a visual localization (VL) map to enable continuous location-based services in dynamic environments. The method calculates relative pose relationships between consecutive images, determines absolute poses based on these relationships, and updates the VL map accordingly. This approach enables the map to adapt to changes in the environment, such as new objects or structures, by leveraging consecutive query information from the device.
38. Autonomous Drone Navigation System with Iterative 3D Mapping and Optical Sensor-Based Pathfinding
GAL ZUCKERMAN, 2023
System and method for autonomous drone navigation in urban environments. The system iteratively maps and approaches urban areas using a fleet of drones, each equipped with optical sensors. The drones generate and execute flight paths that avoid ground-related features while converging on roads, with the 3D model evolving through each iteration to improve accuracy. The system enables safe and efficient drone operation at low altitudes, with applications in logistics, transportation, and infrastructure inspection.
39. Drone System with Stereo Vision, Optical Flow, and Depth Sensors for Autonomous Navigation and Obstacle Avoidance
DIGIT7 INDIA PRIVATE LTD, 2023
A drone system for inventory management in warehouses, enabling autonomous navigation and obstacle avoidance through a combination of stereo vision, optical flow, and depth sensors. The system generates a 3D map of the environment, estimates the drone's spatial position and orientation, and uses a collision prevention feature to find the shortest path between nodes while avoiding static and dynamic obstacles.
40. Loop Closure Detection System with Visual Laser Image Optimization and Multi-Sensor Fusion Integration
SHENZHEN PUDU TECHNOLOGY CO LTD, 2023
A loop closure detection system for improving the speed and accuracy of loop closure detection in SLAM systems, particularly in environments with changing viewing angles, brightness, and weak textures. The system uses a visual laser image optimization module to correct accumulated errors based on pose information, laser matching constraints, and loop closure pose constraints. The system is integrated into a multi-sensor fusion SLAM system and a robot, enabling efficient and reliable loop closure detection for long-term operation in diverse environments.
41. Detachable Autonomous Navigation System with Integrated 360-Degree Sensor Suite and Onboard AI Processing Unit
SPLEENLAB GMBH, 2023
A detachable control and navigation system for autonomous vehicles such as drones that provides autonomous navigation without external connectivity. The system uses onboard sensors such as lidar, a fisheye camera, and radar for 360-degree coverage, and an AI processing unit analyzes the sensor data to generate navigation signals. This allows autonomous operation without relying on external communications. The sensor, processing, and communication components are detachably mounted on the vehicle for flexibility.
42. Navigation System Utilizing Position Probability Density Function Filter for GNSS-Denied Environments
HONEYWELL INTERNATIONAL INC, 2022
A navigation system for vehicles operating in GNSS-denied environments uses a position probability density function (PDF) filter to estimate real-time measurement errors of position and angular orientation. The system receives image data from onboard vision sensors and map data from a database, generates PDFs of image and map features, convolves them to form a measurement vector PDF, and estimates a position vector PDF using a nonlinear filter. The estimated position vector PDF is then used to generate statistics, including real-time measurement errors, which are used to improve navigation filter performance.
43. Drone Navigation System with Real-Time 3D Occupancy Mapping and Obstacle Prediction
SONY GROUP CORP, 2022
Enabling high-speed autonomous flight of drones by generating real-time 3D occupancy maps that predict potential obstacles. The system combines self-position estimation and distance measurements to build a continuous 3D map of the environment, from which it generates an occupancy grid that predicts potential obstacles. The grid is updated continuously as the drone moves, ensuring accurate obstacle avoidance even in dynamic environments.
44. Graph-Based SLAM Method Incorporating Manhattan Nodes for Mobile Agent Localization
ROBERT BOSCH GMBH, 2022
Method for determining the location of a mobile agent within an environment, comprising: determining a pose of the mobile agent from sensor data; establishing a graph-based SLAM algorithm with nodes and edges representing the pose and movement of the mobile agent; adding a Manhattan node and edge to the graph based on the Manhattan orientation of the mobile agent; and minimizing an error function over the graph to determine the exact pose of the mobile agent.
45. Mobile Scanning System with SLAM-Based Graph Representation for Continuous 2D and 3D Environmental Scans
FARO TECH INC, 2022
A mobile scanning system that generates 2D and 3D scans of an environment while the scanner is in motion. The system uses simultaneous localization and mapping (SLAM) to register scan data from different positions, enabling continuous scanning without manual registration. The SLAM approach abstracts raw sensor measurements into a graph-based representation, allowing for efficient registration of scan data from different positions. The system also enables users to select features from scan data and previously captured submaps, facilitating registration and map construction.
46. Distributed Multi-Camera System with Edge Computing and Relaxed Sensor Synchronization for Visual SLAM
NATIONAL UNIVERSITY OF SINGAPORE, 2022
A multi-camera system for visual simultaneous location and mapping (SLAM) that eliminates the need for a central processing unit by distributing computation across edge computing processors in each camera module. Each module contains a monocular camera and inertial measurement unit (IMU), enabling independent operation and collaborative data sharing between modules. The system employs a relaxed sensor synchronization method that eliminates the need for dedicated hardware and communication channels, allowing for wireless communication and scalability. The distributed architecture enables fault tolerance and redundancy, with each module capable of operating in monocular mode or contributing to multi-camera SLAM when multiple modules are present.
47. Method for Camera Pose Estimation Using Filtered Rotation and Displacement Matrices in Monocular Visual SLAM
SHENZHEN REOLINK TECHNOLOGY CO LTD, 2022
A method to improve accuracy of monocular visual SLAM (simultaneous localization and mapping) for camera localization in unknown environments without GPS. The method involves filtering the rotation and displacement matrices obtained from feature point matching in two frames to select the accurate pose of the camera. This is done by constructing epipolar constraints using feature point pairs, calculating matrices from pixel coordinates, then filtering with reference matrices from camera motion to select the correct pose.
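OpenCV's standard two-view pipeline illustrates the idea: estimate the essential matrix from matched points, then let the cheirality check in recoverPose keep the one rotation/translation pair with points in front of both cameras, a stand-in for the filtering step the entry alludes to. The scene, intrinsics, and motion below are synthetic.

```python
# Minimal sketch (synthetic data, OpenCV's standard pipeline rather than the
# patented filtering scheme): estimate the essential matrix from matched
# points, then recover the rotation/translation consistent with the scene.
import numpy as np
import cv2

rng = np.random.default_rng(3)
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])

# Synthetic scene: 3D points; camera 2 is rotated 5 deg about Y and translated.
pts3d = rng.uniform([-2, -2, 4], [2, 2, 8], size=(60, 3))
ang = np.deg2rad(5)
R_true = np.array([[np.cos(ang), 0, np.sin(ang)],
                   [0, 1, 0],
                   [-np.sin(ang), 0, np.cos(ang)]])
t_true = np.array([0.3, 0.0, 0.1])

def project(P, R, t):
    cam = (R @ P.T).T + t
    uv = (K @ cam.T).T
    return (uv[:, :2] / uv[:, 2:3]).astype(np.float64)

pts1 = project(pts3d, np.eye(3), np.zeros(3))
pts2 = project(pts3d, R_true, t_true)

E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K)
print("recovered rotation:\n", R)             # close to R_true
print("translation direction:", t.ravel())    # scale-free, ~parallel to t_true
```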
48. Temporal Depth Fusion Using Gaussian Mixture Models for 3D Representation in Autonomous Vehicle Navigation
CALIFORNIA INSTITUTE OF TECHNOLOGY, 2022
Efficient temporal depth fusion for collision avoidance in autonomous vehicles using Gaussian mixture models (GMMs) to provide live obstacle avoidance for micro air vehicles (MAVs) and ground vehicles. The technique fuses consecutive depth maps from sensors like stereo, lidar, or SLAM to create dense, consistent 3D representations for path planning and collision avoidance. Each pixel is represented as a mixture of Gaussians, projected between frames using ego motion estimates, and updated with new depth observations. This compact, efficient approach unites the benefits of sparse depth model updates in SLAM/VIO with dense representation of multiview stereo.
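The patent maintains a per-pixel mixture of Gaussians; the sketch below keeps only a single Gaussian per pixel and fuses each new depth frame by inverse-variance weighting, which captures the flavor of the temporal fusion with assumed noise levels.

```python
# Minimal sketch: simplified single-Gaussian-per-pixel temporal depth fusion
# (the patent uses a full mixture of Gaussians; all noise levels assumed).
import numpy as np

def fuse_depth(mean, var, obs, obs_var):
    """Per-pixel Gaussian update of a depth map with a new observation."""
    k = var / (var + obs_var)                  # Kalman-style gain per pixel
    new_mean = mean + k * (obs - mean)
    new_var = (1.0 - k) * var
    return new_mean, new_var

rng = np.random.default_rng(4)
true_depth = np.full((4, 4), 10.0)             # flat wall 10 m away
mean = rng.normal(true_depth, 1.0)             # first (noisy) depth frame
var = np.full_like(mean, 1.0)

for _ in range(10):                            # fuse ten more noisy frames
    obs = rng.normal(true_depth, 1.0)
    mean, var = fuse_depth(mean, var, obs, obs_var=1.0)

print(mean.round(2))                           # converges toward 10.0
print(var.max())                               # uncertainty shrinks each frame
```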
49. Vehicle Autonomous Navigation System with Switchable Satellite and SLAM-Based Localization Modules
KUBOTA CORP, 2022
Automatic driving system for vehicles such as tractors that can switch between satellite positioning and SLAM-based localization to provide stable autonomous operation in various environments. A satellite positioning module calculates the vehicle position from GPS signals, while a SLAM module calculates it from onboard sensors such as lidar. The system uses satellite positioning during normal operation and switches to SLAM positioning when GPS quality degrades, enabling reliable autonomous driving even in areas with weak satellite signals or indoor environments without GPS coverage.
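The switching logic reduces to a quality gate on the GNSS fix. The satellite-count and HDOP thresholds below are invented placeholders, not values from the patent.

```python
# Minimal sketch (quality metrics and thresholds are assumed): pick the
# positioning source as the entry describes -- satellite positioning when the
# fix is healthy, SLAM-based positioning when it degrades.
def select_position(gnss_fix, slam_fix, min_satellites=6, max_hdop=2.0):
    """Return (source, position) based on simple GNSS quality gates."""
    ok = (gnss_fix is not None
          and gnss_fix["satellites"] >= min_satellites
          and gnss_fix["hdop"] <= max_hdop)
    return ("gnss", gnss_fix["position"]) if ok else ("slam", slam_fix)

good = {"position": (35.01, 135.77), "satellites": 9, "hdop": 0.9}
poor = {"position": (35.01, 135.77), "satellites": 3, "hdop": 6.5}
print(select_position(good, slam_fix=(12.4, -3.1)))   # ('gnss', ...)
print(select_position(poor, slam_fix=(12.4, -3.1)))   # ('slam', ...)
```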
50. Method for Stationary Feature Identification and Moving Feature Filtering in Stereo Image-Based SLAM
AMAZON TECHNOLOGIES INC, 2022
A method for improving simultaneous localization and mapping (SLAM) performance in autonomous mobile devices (AMDs) by identifying and filtering out moving features from stereo images. The method involves determining stationary features across multiple images, predicting their position using inertial sensor data, and using these stationary features to enhance SLAM localization and trajectory estimation. By reducing the number of moving features processed by SLAM, the method significantly improves SLAM performance, reduces latency, and enhances overall AMD operation.
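Filtering moving features typically means predicting each feature's image displacement from the device's own motion and discarding features whose observed displacement disagrees. The ego-motion value, feature set, and tolerance below are synthetic.

```python
# Minimal sketch (ego-motion and features are synthetic): keep only features
# whose observed image displacement matches the ego-motion prediction; moving
# objects fail the check and are excluded from SLAM.
import numpy as np

def filter_stationary(prev_pts, curr_pts, predicted_flow, tol_px=2.0):
    """Return a boolean mask of features consistent with ego-motion."""
    observed_flow = curr_pts - prev_pts
    residual = np.linalg.norm(observed_flow - predicted_flow, axis=1)
    return residual <= tol_px

prev = np.array([[100.0, 50.0], [200.0, 80.0], [320.0, 240.0]])
predicted_flow = np.array([5.0, -2.0])          # from IMU / stereo ego-motion
curr = prev + predicted_flow
curr[2] += np.array([18.0, 7.0])                # third feature is on a moving car
mask = filter_stationary(prev, curr, predicted_flow)
print(mask)                                     # [ True  True False]
```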
