Drone SLAM (Simultaneous Localization and Mapping) systems operate under stringent constraints, processing sensor data at rates exceeding 30 Hz while maintaining positional accuracy within centimeters. Field tests reveal that visual-inertial SLAM algorithms typically consume 75-120 MB of RAM and utilize 15-30% of onboard computing resources during flight operations, with performance degradation in low-texture environments where feature detection rates drop by 40-60%.

The fundamental challenge lies in balancing computational efficiency against mapping fidelity while accommodating the limited payload capacity and power budget of aerial platforms.

This page brings together solutions from recent research—including camera-TOF sensor integration frameworks that leverage sparse depth data, graph-based SLAM approaches for simultaneous scanning and mapping, real-time 3D occupancy grid generation with voxel division, and collaborative positioning methods using heterogeneous unmanned systems. These and other approaches demonstrate practical implementations that enable reliable autonomous navigation in GPS-denied environments without exceeding the resource constraints of commercial drone platforms.

1. Indoor Drone Navigation System Utilizing GPS-Formatted Path Generation from Sensor-Derived Digital Maps

VERIZON PATENT AND LICENSING INC, 2025

A system that enables GPS-like navigation for drones inside buildings. It creates a digital map of the indoor environment from sensor data, localizes the drone on that map, and generates GPS-formatted coordinates for indoor waypoints. Because the flight paths use standard GPS formatting, existing drone software and hardware can navigate indoors without relying on satellite signals that are unavailable there.

US12243433B2-patent-drawing
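
The core step of turning indoor map coordinates into GPS-formatted waypoints can be pictured as a local-tangent-plane conversion around an anchor coordinate for the building. The sketch below is not Verizon's implementation; it is a minimal illustration under a flat-Earth, small-area assumption, and the anchor coordinate and waypoint values are made up.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def map_to_gps(anchor_lat_deg, anchor_lon_deg, east_m, north_m, up_m=0.0):
    """Convert local map offsets (meters, ENU) around a building anchor into
    GPS-formatted (lat, lon, alt) that standard waypoint software can consume.
    Small-area approximation: treats the Earth as locally flat."""
    dlat = (north_m / EARTH_RADIUS_M) * (180.0 / math.pi)
    dlon = (east_m / (EARTH_RADIUS_M * math.cos(math.radians(anchor_lat_deg)))) * (180.0 / math.pi)
    return anchor_lat_deg + dlat, anchor_lon_deg + dlon, up_m

# Example: waypoints expressed in the indoor map frame (meters from the anchor).
anchor = (40.7128, -74.0060)          # hypothetical building reference coordinate
indoor_path_m = [(0.0, 0.0, 1.5), (12.0, 3.0, 1.5), (12.0, 18.5, 2.0)]

gps_path = [map_to_gps(*anchor, e, n, u) for e, n, u in indoor_path_m]
for lat, lon, alt in gps_path:
    print(f"{lat:.7f}, {lon:.7f}, alt {alt:.1f} m")
```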

2. Aerial Vehicle Navigation System with Camera-Based Terrain Object Detection and Electronic Map Integration

TOMAHAWK ROBOTICS INC, 2025

A navigation system for aerial vehicles enables autonomous flight in GPS-denied areas by using a camera to record images of terrain and detecting objects within those images. The system compares detected objects with an electronic map to determine vehicle location and generates flight instructions based on that location and a target location. The system can build the electronic map by processing multiple images and associating object locations, shapes, and colors.

US2025054175A1-patent-drawing

3. SLAM System with Integrated Camera and TOF Sensor Utilizing Sparse Depth Data

RUICHI ZHIHUI TECHNOLOGY CO LTD, 2024

Enhancing SLAM performance by leveraging sparse depth data from a time-of-flight (TOF) sensor to reduce computational complexity. The system integrates camera and TOF images into a single SLAM framework, processing environmental data and inertial information simultaneously: the sparse TOF samples supply depth measurements directly, while the camera image captures environmental context. This integration reduces the time and computational resources required for traditional depth preparation, improving SLAM performance and operational speed.
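
One way to picture how sparse TOF depth can shortcut depth preparation in a visual SLAM front end: instead of triangulating every feature across frames, features that land near a valid TOF sample are back-projected directly. The sketch below only illustrates that association step, with made-up camera intrinsics and a toy sparse depth image; it is not the patented pipeline.

```python
import numpy as np

# Hypothetical pinhole intrinsics (fx, fy, cx, cy) for the RGB camera, assuming
# the sparse TOF depth has already been reprojected into this camera view.
FX, FY, CX, CY = 450.0, 450.0, 320.0, 240.0

def depth_for_feature(sparse_depth, u, v, radius=3):
    """Return the nearest valid TOF depth within `radius` pixels of (u, v),
    or None if no sample is available (feature falls back to triangulation)."""
    h, w = sparse_depth.shape
    u0, u1 = max(0, u - radius), min(w, u + radius + 1)
    v0, v1 = max(0, v - radius), min(h, v + radius + 1)
    patch = sparse_depth[v0:v1, u0:u1]
    valid = patch > 0
    if not valid.any():
        return None
    vs, us = np.nonzero(valid)
    d2 = (vs + v0 - v) ** 2 + (us + u0 - u) ** 2
    i = d2.argmin()
    return float(patch[vs[i], us[i]])

def backproject(u, v, depth):
    """Lift a pixel with known depth to a 3D point in the camera frame."""
    return np.array([(u - CX) / FX * depth, (v - CY) / FY * depth, depth])

# Toy sparse depth image: zeros everywhere except a few TOF returns.
depth_img = np.zeros((480, 640))
depth_img[241, 322] = 2.7
depth_img[100, 500] = 5.1

for u, v in [(320, 240), (501, 99), (50, 50)]:       # detected feature pixels
    d = depth_for_feature(depth_img, u, v)
    print((u, v), "->", backproject(u, v, d) if d else "no TOF sample, triangulate")
```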

4. Handheld 3D Scanning System with Projector and Cameras Utilizing Graph-Based SLAM and Machine Learning for Natural Feature Detection

FARO TECHNOLOGIES INC, 2024

A handheld 3D scanning system that generates 2D and 3D scans of an environment using a projector and cameras. A graph-based SLAM approach enables simultaneous scanning, mapping, and trajectory generation while the scanner is moving. Natural features are detected automatically with machine learning, eliminating the need for artificial targets or manual feature identification, and the scanner's pose is determined by matching those features across timepoints, enabling accurate registration of scans taken from different positions.

5. Mobile Scanning System with Graph-Based SLAM and Motion Compensation Integration

FARO TECHNOLOGIES INC, 2024

A mobile scanning system that generates 2D and 3D scans of an environment while integrated with an automated transporter robot. The system uses a graph-based SLAM approach to simultaneously locate the scanning device and map the environment, enabling autonomous or semi-autonomous scanning. The system determines a compensation vector and rotation for scan data based on the transporter robot's motion, allowing for accurate registration of scan data and generation of a comprehensive map.

US12053895B2-patent-drawing
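
The compensation vector and rotation described above can be pictured as a de-skewing step: each scan point is corrected by the motion the transporter accumulates between that point's capture time and the end of the sweep. The sketch below assumes planar motion at constant linear and angular velocity over one scan and uses a first-order pose approximation; it is an illustration of the idea, not the patented method.

```python
import numpy as np

def deskew_scan(points, timestamps, v_xy, yaw_rate):
    """Transform each scan point into the pose at the end of the sweep.

    points     : (N, 3) points in the sensor frame at their capture time
    timestamps : (N,) capture times in seconds
    v_xy       : (2,) transporter linear velocity in its own frame [m/s]
    yaw_rate   : angular velocity about z [rad/s]
    """
    t_end = timestamps.max()
    out = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        dt = t_end - t                       # motion that still happens after this point
        dyaw = yaw_rate * dt                 # compensation rotation
        dx, dy = v_xy * dt                   # compensation vector (planar, first-order)
        c, s = np.cos(-dyaw), np.sin(-dyaw)  # express the point in the end-of-scan frame
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        out[i] = R @ (p - np.array([dx, dy, 0.0]))
    return out

# Toy example: three points captured over a 0.1 s sweep while moving forward and turning.
pts = np.array([[5.0, 0.0, 0.2], [0.0, 4.0, 0.2], [-3.0, 1.0, 0.2]])
ts = np.array([0.00, 0.05, 0.10])
print(deskew_scan(pts, ts, v_xy=np.array([1.0, 0.0]), yaw_rate=0.3))
```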

6. Image-Based Localization and Tracking Method Using Laser Scanner Data with Surface Feature Matching and Position Refinement

FARO TECHNOLOGIES INC, 2024

A method for image-based localization and tracking using laser scanner data. A sensor associated with the processing system first collects surface points in the environment, and a simultaneous localization and mapping algorithm analyzes them to estimate the system's position. A three-dimensional (3D) coordinate measuring device then collects a second set of surface points; surface features extracted from the two data sets are matched, and the estimated position is refined based on the matches. The refined position is used to display an augmented reality representation of the second data set on the system's display.

7. Motion Estimation Device with Pixel Distance and Reliability Calculation Circuits for Visual SLAM

TOSHIBA ELECTRONIC DEVICES & STORAGE CORP, KABUSHIKI KAISHA TOSHIBA, 2024

A motion estimation device for improving the accuracy of motion estimation in visual simultaneous localization and mapping (SLAM) applications. The device includes a receiving circuit for input frames and a calculation circuit that estimates pixel distances and reliability values from the pixel information, which are then used to produce a more accurate motion estimate.

US12033335B2-patent-drawing

8. Real-Time 3D Occupancy Grid Map Generation Using Voxel Division and Region Map Origin Updating for Drones

AGENCY FOR DEFENSE DEVELOPMENT, 2024

A real-time map generation method for drones that creates a 3D occupancy grid map by dividing the environment into voxels and updating the map based on sensor data and vehicle location. The method compensates for the limitations of traditional voxel arrays by using a region map origin updating technique and inflation-based occupancy state propagation.
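
A fixed-size voxel array cannot cover an unbounded flight area, which is what the region map origin update addresses: when the drone drifts away from the grid center, the array contents are shifted and the origin follows the vehicle. The plain-numpy sketch below shows that recentering idea plus a trivial occupancy update; it is a simplified illustration, not the patented algorithm, and the inflation-based propagation step is omitted.

```python
import numpy as np

class RollingOccupancyGrid:
    """Fixed-size 3D occupancy grid whose origin follows the vehicle."""

    def __init__(self, size=64, resolution=0.2):
        self.size = size                      # voxels per axis
        self.res = resolution                 # meters per voxel
        self.origin = np.zeros(3)             # world position of voxel (0, 0, 0)
        self.grid = np.zeros((size,) * 3, dtype=np.int8)   # 1 = occupied

    def recenter(self, vehicle_pos):
        """Shift the grid so the vehicle sits near its center (region-origin update)."""
        desired_origin = vehicle_pos - 0.5 * self.size * self.res
        shift = np.round((desired_origin - self.origin) / self.res).astype(int)
        if not shift.any():
            return
        if np.any(np.abs(shift) >= self.size):
            self.grid[:] = 0                  # moved a whole grid-length: nothing to keep
        else:
            new = np.zeros_like(self.grid)
            src = [slice(max(0, s), min(self.size, self.size + s)) for s in shift]
            dst = [slice(max(0, -s), min(self.size, self.size - s)) for s in shift]
            new[tuple(dst)] = self.grid[tuple(src)]     # keep the overlapping region
            self.grid = new
        self.origin = self.origin + shift * self.res

    def mark_occupied(self, points_world):
        idx = np.floor((points_world - self.origin) / self.res).astype(int)
        inside = np.all((idx >= 0) & (idx < self.size), axis=1)
        self.grid[tuple(idx[inside].T)] = 1

grid = RollingOccupancyGrid()
grid.recenter(np.array([0.0, 0.0, 2.0]))
grid.mark_occupied(np.array([[1.0, 0.5, 2.0], [1.2, 0.5, 2.1]]))
grid.recenter(np.array([5.0, 0.0, 2.0]))     # vehicle moved; map origin follows
print(grid.origin, int(grid.grid.sum()))
```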

9. Monocular Camera-Based Positioning System with Virtual Floor Zoning for Unmanned Aerial Vehicles

INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, 2024

A positioning system for unmanned aerial vehicles (UAVs) that uses monocular cameras to provide accurate position information in areas where satellite signals are weak. The system generates positioning information for virtual floors of a field from monocular camera images and analyzes markings at each position to evaluate accuracy. The virtual floors are divided into zones according to that accuracy, and UAVs operate in the appropriate zones. Sensing data from the UAVs, such as height, motion, and images, can be fused to further improve positioning.

10. Simultaneous Positioning and Map Construction Method for Heterogeneous Unmanned Systems Using Binocular and Fisheye Camera Collaboration

SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY CHINESE ACADEMY OF SCIENCES, 2024

A method for simultaneous positioning and map construction of heterogeneous unmanned systems in degraded perception environments, where multiple robots with binocular cameras and fisheye cameras collaborate to achieve accurate positioning through mutual observation and map fusion. The method involves a stationary reference robot and a mobile exploration robot, where the reference robot maintains a stable position while the exploration robot generates a safe area map and follows a trajectory planned based on real-time visual observations and relative position adjustments.

WO2024109837A1-patent-drawing
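
The mutual-observation idea reduces to a small coordinate transform: because the stationary reference robot has a trusted pose, an observation of the exploration robot made in the reference robot's frame pins the explorer in the global frame. The snippet below is a minimal planar sketch under those assumptions; it is not the full collaborative map-fusion method, and the example numbers are invented.

```python
import math

def explorer_global_pose(ref_pose, rel_obs):
    """Place the exploration robot in the global frame from a mutual observation.

    ref_pose : (x, y, yaw) of the stationary reference robot in the global frame
    rel_obs  : (dx, dy, dyaw) of the exploration robot measured in the reference
               robot's frame (e.g. from fisheye/binocular detection)
    """
    rx, ry, ryaw = ref_pose
    dx, dy, dyaw = rel_obs
    c, s = math.cos(ryaw), math.sin(ryaw)
    gx = rx + c * dx - s * dy
    gy = ry + s * dx + c * dy
    gyaw = (ryaw + dyaw + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi)
    return gx, gy, gyaw

# Reference robot parked at the area entrance; it sees the explorer 3 m ahead, 1 m left.
print(explorer_global_pose((10.0, 5.0, math.pi / 2), (3.0, 1.0, 0.1)))
```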

11. UAV Navigation System Utilizing mm-Wave Beacons with Time-Synchronized Wideband Signal Transmission

FRAUNHOFER-GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V, 2024

A navigation system for unmanned aerial vehicles (UAVs) that uses mm-wave beacons to enable indoor and urban positioning without relying on GPS. The system transmits time-synchronized wideband signals from spaced-apart base stations, using beams facing each other to create a flight path for the UAV. The UAV receives the signals and determines its position relative to the base stations from the reception times and intensities.
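
Positioning from time-synchronized transmissions is essentially a multilateration problem: each reception time gives a pseudorange to a base station, with one shared unknown receiver clock offset. The Gauss-Newton sketch below illustrates that geometry in numpy; it is a generic solver, not Fraunhofer's method, and it ignores the intensity information mentioned above.

```python
import numpy as np

C = 299_792_458.0   # speed of light [m/s]

def solve_position(beacons, rx_times, iters=10):
    """Estimate receiver position and clock bias from reception times of
    time-synchronized beacon transmissions (all sent at t = 0 in beacon time).

    beacons  : (N, 3) known base-station positions [m]
    rx_times : (N,) reception times at the UAV [s]
    Returns (position, clock_bias_m).
    """
    pseudoranges = C * np.asarray(rx_times)
    x = np.hstack([beacons.mean(axis=0), 0.0])     # [px, py, pz, bias_m], start at centroid
    for _ in range(iters):
        diff = x[:3] - beacons                     # (N, 3)
        dists = np.linalg.norm(diff, axis=1)       # (N,)
        residuals = dists + x[3] - pseudoranges
        J = np.hstack([diff / dists[:, None], np.ones((len(beacons), 1))])
        dx, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
        x += dx
    return x[:3], x[3]

# Toy setup: four beacons, receiver at (2, 3, 1.5) with a 20 ns clock offset.
beacons = np.array([[0, 0, 0], [10, 0, 2], [0, 10, 2], [10, 10, 0]], dtype=float)
true_pos, clock_off = np.array([2.0, 3.0, 1.5]), 20e-9
rx_times = np.linalg.norm(beacons - true_pos, axis=1) / C + clock_off
pos, bias = solve_position(beacons, rx_times)
print(np.round(pos, 3), round(bias, 3))
```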

12. 3D Map-Based Unmanned Aircraft Navigation System with Integrated Real-Time Positioning and Obstacle Detection

EHANG INTELLIGENT EQUIPMENT GUANGZHOU CO LTD, 2024

A 3D map-based unmanned aircraft flight control method, system, and medium that enables precise positioning and obstacle avoidance through integration of 3D map data and real-time positioning information. The system uses reference points in the 3D map to correlate with the aircraft's position, enabling accurate navigation and obstacle detection. The method combines 3D map data with real-time sensor data to perform obstacle avoidance control, ensuring safe and efficient flight operations.

WO2024067133A1-patent-drawing

13. Hybrid Localization System for Delivery Robots Using Image-Based and Structure-Based Analysis

DELIVERS AI ROBOTIK OTONOM SURUS BILGI TEKNOLOJILERI AS, 2024

A delivery system for autonomous or semi-autonomous delivery robots that enables location determination without relying on GPS signals. The system uses a hybrid localization method combining image-based and structure-based analysis, where depth cameras and environmental cameras capture images of the surroundings, and a location verification unit matches these images against a dataset to estimate the robot's position. The estimated position is then refined through structure-based calculation using point cloud maps constructed from the images.

WO2024015031A1-patent-drawing
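
The structure-based refinement step described above is, at heart, aligning a locally captured point cloud to the stored point-cloud map around the coarse image-based estimate. The numpy sketch below does this with a few nearest-neighbor/Kabsch iterations, i.e. a bare-bones ICP; it is an illustrative stand-in for whatever alignment the patent actually specifies, and the toy clouds are synthetic.

```python
import numpy as np

def icp_refine(local_pts, map_pts, init_R=np.eye(3), init_t=np.zeros(3), iters=20):
    """Refine a coarse pose by aligning a local scan to the point-cloud map.

    local_pts : (N, 3) points observed by the robot's depth cameras
    map_pts   : (M, 3) points from the stored map near the coarse estimate
    Returns (R, t) such that R @ local + t ~ map.
    """
    R, t = init_R.copy(), init_t.copy()
    for _ in range(iters):
        moved = local_pts @ R.T + t
        # Nearest map point for every scan point (brute force; fine for small clouds).
        d2 = ((moved[:, None, :] - map_pts[None, :, :]) ** 2).sum(-1)
        matches = map_pts[d2.argmin(axis=1)]
        # Kabsch: best rigid transform between the matched pairs.
        mu_s, mu_m = moved.mean(0), matches.mean(0)
        H = (moved - mu_s).T @ (matches - mu_m)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T
        dt = mu_m - dR @ mu_s
        R, t = dR @ R, dR @ t + dt
    return R, t

# Toy check: the "map" is the local cloud moved by a known small rotation and offset.
rng = np.random.default_rng(0)
local = rng.uniform(-2, 2, size=(300, 3))
yaw = 0.05
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0], [np.sin(yaw), np.cos(yaw), 0], [0, 0, 1]])
map_cloud = local @ Rz.T + np.array([0.2, -0.1, 0.05])
R_est, t_est = icp_refine(local, map_cloud)
print(np.round(R_est, 3), np.round(t_est, 3))
```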

14. Navigation System for Autonomous Vehicles with Integrated Optical and Motion Sensor Data Using Nonlinear State Estimation

Dylan Krupity, 2024

A navigation system for autonomous vehicles that integrates optical sensor data with motion sensor data to provide accurate positioning in environments where traditional GNSS systems are unreliable. The system uses a nonlinear state estimation technique that incorporates a measurement model for optical samples, enabling direct integration of optical data with motion sensor data to generate a precise navigation solution. The system can operate in environments with limited GNSS visibility, such as dense urban areas, and can also be used in pedestrian navigation applications.

US11875519B2-patent-drawing

15. Indoor UAV Navigation Positioning System with GPS-Based Flight Path Generation Using Sensor-Derived Digital Maps

VERIZON PATENT AND LICENSING INC, 2023

A positioning system for indoor UAV navigation that generates GPS-based flight paths using a digital map of the environment created from sensor data, enabling accurate localization and path planning in GPS-denied environments.

US2023386350A1-patent-drawing

16. Camera-Based Aircraft Navigation System Utilizing Visual Landmark Recognition and Machine Learning Algorithms

AURORA FLIGHT SCIENCES CORP, 2023

An aircraft navigation system that enables autonomous navigation by visual landmarks using camera-based scanning. A camera system captures images of the aircraft's surroundings, which are processed to determine the aircraft's position and orientation; that position information then guides the aircraft to its destination, eliminating the need for traditional navigation systems. Machine learning algorithms automatically identify and track landmarks in the environment, enabling precise navigation through complex terrain.

US11808578B2-patent-drawing

17. Unmanned Aerial Vehicle System with Integrated Ultra-Wideband Ranging, Inertial Measurement, and Self-Localization Techniques

NANYANG TECHNOLOGICAL UNIVERSITY, 2023

An unmanned aerial vehicle (UAV) system that combines ultra-wideband (UWB) ranging with inertial measurement unit (IMU) data and onboard self-localization (OSL) techniques to achieve robust and reliable state estimation. The system uses UWB nodes placed on the UAV at offset positions to receive signals from anchor nodes in the environment, which are then combined with IMU data and OSL information to estimate the UAV's pose state. The system can also integrate lidar and visual features from onboard sensors to further enhance localization accuracy.
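
A common way to combine body-mounted UWB ranges with IMU propagation is an extended Kalman filter whose range model accounts for the node's mounting offset (lever arm) on the airframe. The sketch below shows a position/velocity-only EKF of that form; attitude is assumed known from the IMU/OSL side, the noise values are guesses, and nothing here is taken from the patent beyond the general idea.

```python
import numpy as np

class UwbImuEkf:
    """Minimal position/velocity EKF: IMU acceleration predicts, UWB ranges correct."""

    def __init__(self, pos, vel):
        self.x = np.hstack([pos, vel]).astype(float)   # [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 1.0
        self.Q = np.eye(6) * 0.05                      # process noise (tuning guess)
        self.r_var = 0.05 ** 2                         # UWB range noise variance [m^2]

    def predict(self, accel_world, dt):
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        self.x = F @ self.x
        self.x[:3] += 0.5 * accel_world * dt ** 2
        self.x[3:] += accel_world * dt
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_range(self, anchor, body_offset, R_body_to_world, measured_range):
        """Correct with one UWB range between a fixed anchor and a node mounted
        at `body_offset` on the UAV (attitude R assumed known, e.g. from OSL)."""
        node = self.x[:3] + R_body_to_world @ body_offset
        diff = node - anchor
        pred = np.linalg.norm(diff)
        H = np.zeros((1, 6))
        H[0, :3] = diff / pred                          # d(range)/d(position)
        S = H @ self.P @ H.T + self.r_var
        K = self.P @ H.T / S
        self.x = self.x + (K * (measured_range - pred)).ravel()
        self.P = (np.eye(6) - K @ H) @ self.P

# One predict/update cycle with made-up numbers.
ekf = UwbImuEkf(pos=[0.0, 0.0, 1.0], vel=[0.5, 0.0, 0.0])
ekf.predict(accel_world=np.array([0.0, 0.0, 0.0]), dt=0.02)
ekf.update_range(anchor=np.array([5.0, 0.0, 2.0]),
                 body_offset=np.array([0.2, 0.0, 0.0]),
                 R_body_to_world=np.eye(3),
                 measured_range=4.75)
print(np.round(ekf.x, 3))
```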

18. Robot with Dynamic Node Generation for Topological Mapping Using Real-Time Lidar and Camera Data

LG ELECTRONICS INC, 2023

A moving robot that generates a topological map by dynamically creating nodes from real-time sensor data, including lidar and camera information. The robot determines open movement directions and creates new nodes when necessary, while updating existing nodes based on sensor data, producing an accurate representation of its environment that supports efficient navigation.

US11774976B2-patent-drawing
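
The dynamic node-generation rule can be pictured as: check the lidar scan for open directions, and drop a new graph node whenever the robot is far enough from every existing node and still has an opening to pursue. The sketch below encodes that rule with a plain dictionary graph; the thresholds and sectoring are arbitrary illustrations, not LG's actual criteria.

```python
import math

OPEN_RANGE_M = 2.0      # a sector counts as open if the lidar sees at least this far
NODE_SPACING_M = 1.5    # minimum distance between topological nodes

def open_directions(scan, n_sectors=8):
    """Reduce a 360-degree range scan to the set of open sector indices."""
    per = len(scan) // n_sectors
    return {s for s in range(n_sectors)
            if min(scan[s * per:(s + 1) * per]) > OPEN_RANGE_M}

def maybe_add_node(graph, positions, robot_xy, scan, current_node):
    """Create a new node (and edge from the current one) when the robot has moved
    far enough from all existing nodes and still sees an open direction."""
    opens = open_directions(scan)
    nearest = min(math.dist(robot_xy, p) for p in positions.values())
    if opens and nearest > NODE_SPACING_M:
        node_id = len(positions)
        positions[node_id] = tuple(robot_xy)
        graph.setdefault(node_id, set())
        graph[current_node].add(node_id)
        graph[node_id].add(current_node)
        return node_id
    return current_node

# Start with a single node at the origin, then move the robot and feed a fake scan.
graph, positions = {0: set()}, {0: (0.0, 0.0)}
fake_scan = [3.0] * 180 + [0.8] * 180        # open ahead, wall behind
current = maybe_add_node(graph, positions, (2.0, 0.5), fake_scan, current_node=0)
print(current, positions, graph)
```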

19. SLAM Method with Enhanced Loop Closure via Wide-Field Spatial Feature Pose Estimation and Hybrid Bundle Adjustment

SAMSUNG ELECTRONICS CO LTD, 2023

A simultaneous localization and mapping (SLAM) method that improves loop closure (LC) performance by estimating the relative pose between a query image and a search image from spatial features with a wider detection field of view. The global map is then optimized with a hybrid bundle adjustment (HBA) approach that combines incremental bundle adjustment (IBA) and full bundle adjustment (FBA) based on pose drift information.
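
The drift-based switch between incremental and full bundle adjustment can be expressed as a small scheduling rule: when the pose drift revealed by a loop closure stays small, only a local window is re-optimized; when it exceeds a threshold, the whole map is. The sketch below shows only that decision logic; `incremental_ba` and `full_ba` are hypothetical stand-ins for a real optimization backend, and the threshold and window size are arbitrary.

```python
import numpy as np

DRIFT_THRESHOLD_M = 0.5   # arbitrary: above this, the whole map gets re-optimized
LOCAL_WINDOW = 20         # keyframes touched by incremental bundle adjustment

def pose_drift(odometry_pose, loop_closure_pose):
    """Translation drift exposed by a loop closure: how far the odometry chain
    ended up from where the loop-closure relative pose says it should be."""
    return float(np.linalg.norm(odometry_pose[:3, 3] - loop_closure_pose[:3, 3]))

def hybrid_bundle_adjustment(keyframes, drift, incremental_ba, full_ba):
    """Run incremental BA on a recent window for small drift, full BA otherwise."""
    if drift > DRIFT_THRESHOLD_M:
        return full_ba(keyframes)                       # global correction
    return incremental_ba(keyframes[-LOCAL_WINDOW:])    # cheap local correction

# Demo with stub optimizers that just report what they were asked to do.
incremental_ba = lambda kfs: f"incremental BA over {len(kfs)} keyframes"
full_ba = lambda kfs: f"full BA over {len(kfs)} keyframes"

odom = np.eye(4); odom[:3, 3] = [10.0, 2.0, 0.0]
loop = np.eye(4); loop[:3, 3] = [10.9, 2.1, 0.0]
keyframes = list(range(300))
print(hybrid_bundle_adjustment(keyframes, pose_drift(odom, loop), incremental_ba, full_ba))
```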

20. Snow Grooming Vehicle with Onboard Terrain Scanning and Reference Model-Based Navigation System

PRINOTH SPA, 2023

A snow grooming vehicle that can operate independently of GPS signals, using a reference model of the terrain created by scanning the area with sensors mounted on the vehicle itself. The vehicle determines its position and orientation relative to the reference model, allowing it to maintain its position and navigate through areas with poor satellite signal reception.

21. Cloud-Based Collaborative 3D Mapping System for Autonomous Vehicles with Scalable SLAM Architecture

22. Visual Localization Map Update System with Consecutive Image Pose Calculation and Dynamic Adaptation Mechanism

23. Vision-Lidar Coupling System for UAV-Based Tunnel Modeling with Bayesian Data Fusion

24. Autonomous Drone Navigation System with Iterative 3D Mapping and Optical Sensor-Based Pathfinding

25. Drone System with Stereo Vision, Optical Flow, and Depth Sensors for Autonomous Navigation and Obstacle Avoidance
