140 patents in this list

LiDAR systems require precise alignment between transmitter and receiver components, with typical tolerances in the sub-millimeter range. Even small misalignments can result in signal degradation, reduced detection range, and incorrect distance measurements that compound into positioning errors of several centimeters at typical autonomous vehicle operating distances.

The fundamental challenge lies in maintaining precise optical alignment while operating in dynamic environments subject to vibration, thermal expansion, and mechanical stress.

This page brings together solutions from recent research—including camera-assisted alignment techniques, iterative transformation optimization methods, rotating calibration platforms, and sensor fusion approaches. These and other approaches focus on achieving and maintaining calibration accuracy under real-world operating conditions.

1. Mobile Robot LIDAR Calibration via Spinning Data Acquisition at Varying Distances

Boston Dynamics, Inc., 2024

Automated calibration technique for LIDAR systems on mobile robots that leverages the robot's mobility to gather data for calibration. The technique involves capturing LIDAR measurements while the robot spins at different locations with varying distances to a calibration target. These measurements are processed to determine calibration data for aligning the LIDAR units. This allows accurate localization and avoids misalignment issues when using fixed-plane LIDARs.

2. Lidar Sensor Calibration System Using Lane Line Detection for Relative Positioning and Orientation

SHANGHAI YOUDAO ZHITU TECHNOLOGY CO LTD, 2024

Automatic calibration of multiple lidar sensors on autonomous vehicles using lane lines. The method involves using lane lines detected by the lidars to calibrate the relative positions and orientations of the lidars. By finding common lane line features visible to multiple lidars, geometric constraints can be derived to determine the lidar-to-lidar and lidar-to-vehicle transformations. This allows flexible, scene-agnostic calibration without markers or prior models.

Patent drawing: CN118131192A
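The geometric constraint behind lane-line calibration can be illustrated with a minimal planar sketch. Assuming the lane line's direction angle and one matchable point on it (say, a dash endpoint) have been extracted in both lidars' frames, the lidar-to-lidar yaw and translation follow directly. The function name and the 2D simplification are illustrative assumptions, not from the patent:

```python
import math

def lane_line_yaw_translation(dir_a, dir_b, pt_a, pt_b):
    """Planar (yaw, tx, ty) lidar-to-lidar transform from one shared lane line.

    dir_a, dir_b: direction angle (rad) of the same lane line as seen by
                  lidar A and lidar B.
    pt_a, pt_b:   the same identifiable lane-line point (e.g. a dash
                  endpoint) in each lidar's frame.
    Returns (yaw, tx, ty) mapping lidar-B coordinates into lidar-A's frame.
    """
    yaw = dir_a - dir_b                  # rotation aligning B's line with A's
    c, s = math.cos(yaw), math.sin(yaw)
    # solve pt_a = R(yaw) @ pt_b + t for the translation t
    tx = pt_a[0] - (c * pt_b[0] - s * pt_b[1])
    ty = pt_a[1] - (s * pt_b[0] + c * pt_b[1])
    return yaw, tx, ty
```

Note the need for a distinguishable point: the line direction alone fixes yaw and the perpendicular translation component, but not the translation along the line.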

3. Calibration Method for Lidar Sensors Using Deep Learning-Based Point Cloud Recognition and Scan Matching

KOREA ELECTRONICS TECHNOLOGY INSTITUTE, 2024

Method for calibrating multiple lidar sensors using deep learning and point cloud recognition to quickly and accurately extract inter-sensor calibration matrices. The method involves finding common objects in the lidar point clouds using a neural network, extracting those object areas, and applying scan matching to precisely calculate the calibration matrices. It provides a faster, more precise, and automated way to calibrate lidar sensors compared to manual methods.
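Once the network has matched common-object points between two clouds, the scan-matching step reduces to a least-squares rigid alignment. A minimal sketch using the standard Kabsch/SVD solution; the patent does not specify its exact scan-matching algorithm, so this is an assumed stand-in:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of matched points from the common object
    extracted from each lidar's cloud. Kabsch/SVD solution: the core of a
    scan-matching refinement once correspondences are known.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)        # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                   # reflection-safe rotation
    t = cd - R @ cs
    return R, t
```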

4. Extrinsic Calibration Method for Non-Overlapping LiDAR Sensors with Iterative Alignment and Intrinsic Parameter Refinement

FORD GLOBAL TECHNOLOGIES LLC, 2024

Calibrating LiDAR sensors mounted on a vehicle that have non-overlapping fields of view (FOVs) to accurately represent the environment scanned by multiple sensors. The calibration involves collecting sensor data from the non-overlapping LiDARs in a calibration environment, aligning the data in a global coordinate system, and iteratively aligning the data from each sensor until the aligned sensor views match. This extrinsically calibrates the sensors without assuming overlap. Intrinsic calibration further refines the sensor parameters. Validation checks calibration accuracy over time.

5. LiDAR Sensor Calibration Method Using Relative Point Cloud Alignment on Moving Vehicle

HESAI TECHNOLOGY CO LTD, 2024

Calibrating a LiDAR sensor on a moving vehicle without requiring specific trajectories or high-precision inertial navigation systems. The method involves calculating the relative position and orientation of the LiDAR between point clouds as the vehicle travels. This provides the necessary orientation data to determine the calibration parameters from the LiDAR coordinate system to the vehicle forward coordinate system. The method improves convenience and accuracy of LiDAR calibration by leveraging the vehicle's normal motion instead of specialized maneuvers.

Patent drawing: WO2024104418A1
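One observable piece of this idea is easy to sketch: during roughly straight driving, the inter-frame lidar translation (obtained from point-cloud registration) should point along the vehicle's forward axis, so its mean heading estimates the lidar-to-vehicle yaw. A 2D illustration under that simplifying assumption, not the patent's full method:

```python
import math

def lidar_to_vehicle_yaw(lidar_motions):
    """Estimate lidar-to-vehicle yaw from ordinary straight-line driving.

    lidar_motions: list of (dx, dy) inter-frame lidar translations from
    point-cloud registration. If the lidar were perfectly aligned, all
    motion would fall on the vehicle's forward (+x) axis, so the circular
    mean of the observed motion headings is the yaw misalignment.
    """
    sx = sum(math.cos(math.atan2(dy, dx)) for dx, dy in lidar_motions)
    sy = sum(math.sin(math.atan2(dy, dx)) for dx, dy in lidar_motions)
    return math.atan2(sy, sx)            # circular mean of headings
```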

6. Plane-Based Calibration Method for Relative Positioning of Multi-LiDAR Point Clouds

KOREA UNIVERSITY RESEARCH AND BUSINESS FOUNDATION, 2024

Calibrating the relative position between point clouds collected from multiple 3D LiDAR sensors in an autonomous navigation system. The method involves extracting planes from the point clouds, finding corresponding planes between sensors using similarity, initializing extrinsic parameters, and then optimizing them to minimize measurement point variance. The plane-based approach allows calibration using environmental features without special objects or trajectory estimation.
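Two building blocks of such a pipeline, fitting a plane to a point cluster and matching planes across sensors by normal similarity, can be sketched as follows. The greedy matching strategy and the similarity threshold are illustrative assumptions:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n . x + d = 0 to an (N, 3) point cluster via SVD.

    The unit normal n is the right singular vector of the centered points
    with the smallest singular value (the last row of numpy's Vt).
    """
    pts = np.asarray(points, float)
    c = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - c)
    n = Vt[-1]
    return n, -float(n @ c)

def match_planes(normals_a, normals_b, min_cos=0.95):
    """Greedy plane correspondence between two sensors by normal similarity.

    Pairs each sensor-A normal with the sensor-B normal of largest
    |cosine similarity|, keeping pairs above min_cos (sign-agnostic,
    since a fitted normal's direction is arbitrary).
    """
    pairs = []
    for i, na in enumerate(normals_a):
        sims = [abs(float(np.dot(na, nb))) for nb in normals_b]
        j = max(range(len(sims)), key=sims.__getitem__)
        if sims[j] >= min_cos:
            pairs.append((i, j))
    return pairs
```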

7. Lidar-IMU Calibration Using Factor Graph-Based Model for Rotation and Translation Estimation

NEOLIX TECHNOLOGIES CO LTD, 2024

Calibrating the external parameters between a lidar sensor and an inertial measurement unit (IMU) in an autonomous navigation system. The calibration method involves optimally solving a factor graph-based model to estimate the rotation and translation relationship between the lidar and IMU coordinate systems. It separates the calibration from the odometer, allows simultaneous rotation and translation estimation, and avoids plane feature requirements.

8. LiDAR Sensor Calibration System Using Turntable-Based Intrinsic and Extrinsic Alignment

FORD GLOBAL TECHNOLOGIES LLC, 2024

Calibrating multiple LiDAR sensors with non-overlapping fields of view on an autonomous vehicle. The calibration method involves using a vehicle turntable system to transition the sensors from an uncalibrated state to a calibrated state. For intrinsic calibration, the sensors collect data from a calibration environment and optimize a model to estimate the intrinsic parameters like offsets. For extrinsic calibration, the sensors align sweeps using inter-sweep alignment and then globally align using a pose graph. Periodic calibration validation is also performed.

9. LiDAR Sensor Calibration Validation via Point Cloud Plane Segmentation and Normal Distance Measurement

FORD GLOBAL TECHNOLOGIES, LLC, 2024

Validating the calibration of LiDAR sensors mounted on vehicles to ensure accurate perception and navigation. The method involves collecting sensor data from the LiDAR in a calibration environment, segmenting the point cloud into planes, and measuring normal distances between the sweeps and planes. A validation score is determined using these distances. If the score is below a threshold, the calibration is validated. If not, it indicates a calibration issue. This allows proactive calibration maintenance and remediation.
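A minimal sketch of the scoring step, assuming the score is the mean absolute point-to-plane distance (the patent does not fix the exact metric here) and that lower scores indicate better calibration:

```python
def validation_score(points, planes):
    """Mean absolute point-to-plane distance over one sweep.

    points: iterable of (x, y, z) sweep points.
    planes: list of (nx, ny, nz, d) with unit normals, from segmenting
            the calibration environment. Each point is scored against its
            nearest plane; the sweep score is the mean of those distances.
    """
    dists = [min(abs(nx * x + ny * y + nz * z + d)
                 for nx, ny, nz, d in planes)
             for x, y, z in points]
    return sum(dists) / len(dists)

def calibration_ok(points, planes, threshold=0.02):
    """Pass/fail check: a score below the threshold validates the calibration."""
    return validation_score(points, planes) < threshold
```

The 2 cm threshold is an arbitrary placeholder; a real deployment would tune it to the sensor and environment.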

10. Lidar Calibration Method Using Point Cloud Data and Transformation Matrices

SUZHOU AGV ROBOT CO LTD, 2024

Efficient lidar calibration method for autonomous vehicles that avoids the need for complex mechanical devices. The method uses point cloud data captured from the lidar at a target position to calculate the external parameters. The initial point cloud is transformed through matrices based on target parameters, centroid differences, and homogeneous transformations to recover the lidar's translational and rotational degrees of freedom relative to the vehicle's coordinate system.

Patent drawing: CN117876504A
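The transformation chain can be illustrated under the simplifying assumption that the rotation relative to the vehicle is already known from the target parameters, so only the translation comes from the centroid difference. All names are illustrative:

```python
import numpy as np

def lidar_extrinsics(cloud, target_cloud, R_target):
    """Homogeneous lidar-to-vehicle transform from a known target pose.

    cloud:        (N, 3) target points as seen by the lidar.
    target_cloud: (N, 3) the same target points in vehicle coordinates.
    R_target:     3x3 rotation of the lidar relative to the vehicle,
                  assumed known from the target parameters.
    The translation is the centroid difference after rotating, and both
    parts are packed into a single 4x4 homogeneous matrix.
    """
    cloud = np.asarray(cloud, float)
    target = np.asarray(target_cloud, float)
    t = target.mean(axis=0) - R_target @ cloud.mean(axis=0)
    T = np.eye(4)
    T[:3, :3] = R_target
    T[:3, 3] = t
    return T
```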

11. Automatic Calibration System for Multi-Laser Lidar Arrays Using Relative Pose Estimation

YUNNAN KSEC INTELLIGENT EQUIPMENT CO LTD, 2024

Automatic calibration of multi-laser lidars on autonomous vehicles like AGVs without the need for manual calibration. The calibration involves calculating the relative poses between the lidars using the difference in calculated poses when they are stationary. This allows the lidars to self-calibrate without needing accurate measurement of their installation poses. By using the same map for all lidars, the relative poses can be obtained through navigation calculations.

Patent drawing: CN117849768A

12. Calibration Method for RGB-D Cameras and Lidar Sensors Using Anisotropic Error Models and Feature Point Extraction

TSINGHUA UNIVERSITY, 2024

Calibrating RGB-D cameras and lidar sensors together for mobile robot perception systems. The calibration involves extracting feature points on a calibration plate using both sensors, and then iteratively refining the external parameter matrix between the two coordinate systems using anisotropic error models for lidar and RGB-D cameras. This accounts for the inherently anisotropic error distributions of each sensor type. The calibration improves the accuracy of converting depth measurements from both sensors into a common coordinate system for better fusion and perception.

Patent drawing: CN115166701B

13. Dual Lidar Calibration Method for Diagonally Mounted Lidar Systems Using Vehicle Motion Data

ZHEJIANG UNIVERSITY OF TECHNOLOGY, 2024

Calibration method for autonomous vehicles with diagonally mounted dual lidars. The calibration involves capturing data from both lidars while the vehicle moves in a straight line. Lidar points and angles are computed from the known vehicle position and motion, and used to determine the actual lidar offsets from the nominal installation positions. The offsets are then applied to correct the lidar data for accurate mapping and navigation. Using the vehicle's motion to triangulate the lidar positions addresses the difficulty of calibrating diagonally mounted lidars that share minimal overlap.

Patent drawing: CN117805785A

14. Feature-Based Calibration Method for Lidar and Inertial Measurement Unit External Parameters

ADVANCED TECHNOLOGY RESEARCH INSTITUTE, BEIJING INSTITUTE OF TECHNOLOGY, 2024

A method for calibrating the external parameters between lidar and inertial measurement units (IMU) in autonomous vehicles, to improve the accuracy of sensing data fusion. The calibration uses a feature-based approach that extracts distinctive points from the lidar point clouds and aligns them with the IMU data, avoiding the need for specialized equipment or indoor environments. The method involves: 1) identifying feature points in the lidar point clouds using a feature extraction algorithm; 2) extracting corresponding points from the IMU data using a projection method; 3) matching the lidar and IMU points based on their feature descriptions; and 4) estimating the external parameters (e.g., position and orientation) between the lidar and IMU from the matched points. This enables calibration from ordinary sensor data, without additional equipment or prepared environments.

15. Lidar-Camera Calibration Method Utilizing 3D-to-2D Point Conversion and Error Optimization

GUANGZHOU ALPCER INFORMATION TECHNOLOGY CO LTD, 2024

Calibration method for lidar and camera on autonomous vehicles to improve perception accuracy after assembly. The method involves aligning calibration points generated by scans from the lidar and images from the camera. The 3D points are converted to 2D camera coordinates. Coordinate errors are computed for the 2D points and features. Initial transformation matrix is optimized based on these errors to accurately represent the lidar-camera relative position/attitude. This joint calibration resolves coordinate discrepancies between sensors after vehicle assembly.
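The 3D-to-2D conversion and the error being optimized can be sketched with a standard pinhole camera model. The intrinsics and variable names here are generic assumptions, not taken from the patent:

```python
import numpy as np

def project_points(points_3d, R, t, fx, fy, cx, cy):
    """Pinhole projection of lidar points into pixel coordinates.

    points_3d: (N, 3) lidar-frame points; R, t: candidate lidar-to-camera
    extrinsics; fx, fy, cx, cy: camera intrinsics. Returns (N, 2) pixels.
    """
    cam = np.asarray(points_3d, float) @ np.asarray(R, float).T + t
    z = cam[:, 2]
    return np.stack([fx * cam[:, 0] / z + cx,
                     fy * cam[:, 1] / z + cy], axis=1)

def reprojection_error(points_3d, pixels, R, t, intr):
    """Mean pixel distance between projected lidar points and matched image
    features: the quantity a joint-calibration optimizer drives down."""
    proj = project_points(points_3d, R, t, *intr)
    return float(np.linalg.norm(proj - np.asarray(pixels, float), axis=1).mean())
```

An optimizer would perturb R and t to minimize `reprojection_error`, yielding the refined transformation matrix the entry describes.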

16. Bracket System for Joint Calibration of Multiple Lidar and Camera Sensors with Adjustable Mounting

NATIONAL UNIVERSITY OF DEFENSE TECHNOLOGY OF PLA, 2024

Bracket system enabling rapid, accurate joint calibration of multiple lidars and cameras in unstructured scenes, without calibration plates or structured features. The method involves adjusting the relative positions and heights of the bracket and sensor mounts to align the sensors' fields of view, allowing joint calibration without external reference points. The bracket design supports on-the-fly calibration of sensors mounted on unmanned vehicles in real-world scenarios, with no pre-calibration or structured environment required.

17. Joint Calibration Method for External Parameters of Multi-Sensor Systems Using Calibration Plate with Markings

HUZHOU LITIAN INTELLIGENT TECHNOLOGY CO LTD, 2024

A method for joint calibration of external parameters of multiple sensors like lidar, depth camera and RGB camera used in autonomous driving. The calibration is done using a device with a calibration plate and markings. The sensors simultaneously collect point cloud, depth and RGB data of the calibration device. The calibration plate center is calculated in each sensor's coordinate system using the markings. This allows calculating the calibration between any two sensors without resetting the device or using different calibration methods.

18. Calibration Method for Lidar and Position/Orientation Sensors Using Planar Feature Extraction and Voxel Grid Optimization

WUXI INTERNET OF THINGS INNOVATION CENTER CO LTD, ZHONGWEI WUCHUANG INTELLIGENT TECH SHANGHAI CO LTD, ZHONGWEI WUCHUANG INTELLIGENT TECHNOLOGY CO LTD, 2024

Method for accurately calibrating the external parameters between lidar and position/orientation sensors in autonomous vehicles. The calibration improves accuracy in environments with limited plane features and provides 6D accuracy compared to 5D for previous methods. The calibration involves extracting planar features from lidar point clouds, projecting them into world coordinates, finding corresponding points across frames, and optimizing parameters using voxel grids and ground reference points.

Patent drawing: CN117788599A
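The voxel-grid machinery used in the optimization step can be illustrated with a simple spatial hash: points are bucketed into integer cells so that cross-frame correspondence candidates are limited to shared cells. A data-structure sketch only, with an arbitrary cell size:

```python
from collections import defaultdict

def voxel_index(points, voxel_size=0.5):
    """Hash points into integer voxel cells.

    Returns {(ix, iy, iz): [point, ...]} so that candidate matches across
    frames can be restricted to points falling in the same cell.
    """
    grid = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)  # floor to cell indices
        grid[key].append(p)
    return grid

def common_voxels(frame_a, frame_b, voxel_size=0.5):
    """Voxel cells occupied in both frames: cheap correspondence seeds."""
    return set(voxel_index(frame_a, voxel_size)) & set(voxel_index(frame_b, voxel_size))
```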

19. Lidar Sensor Calibration via Hybrid Algorithm with Common Coordinate System Conversion

DONGFENG YUEXIANG TECHNOLOGY CO LTD, SUZHOU BIG DATA GROUP CO LTD, 2024

Hybrid calibration method for lidar sensors in autonomous driving to improve accuracy and efficiency compared to traditional calibration methods. The method involves converting point cloud data from both lidars to a common coordinate system, calibrating each lidar separately, then adjusting their parameters using a hybrid algorithm. This leverages the strengths of both lidars and avoids manual intervention.

Patent drawing: CN117706527A

20. Calibration Method and System for Multiple Lidars and Integrated Navigation Systems with Combined Parameter Adjustment

INSTITUTE OF AUTOMATION CHINESE ACADEMY OF SCIENCES, QINGDAO HUITUO INTELLIGENT MACHINE CO LTD, 2024

A method and system for calibrating multiple lidars and integrated navigation systems that improves accuracy and versatility compared to existing methods. The combined approach requires no additional inputs or preparation: initial external parameters between each lidar and the integrated navigation system are estimated, then adjusted for factors like clock differences and vehicle vibration to yield the final calibrated parameters.

Patent drawing: CN117706530A

21. Lidar Sensor Calibration System with Lane Line Association and High-Precision Positioning

22. Multi-Lidar Calibration Method with Combined Offline and Online Parameter Adjustment

23. Lidar-GNSS External Parameter Calibration Method Using Iterative Least Squares and Time-Synchronized Data Processing

24. Lidar Calibration Method Using Non-Overlapping Fields via Vertical Plane Alignment in Restricted Environments

25. Vehicle Sensor Array Calibration via Relative Orientation Adjustment and Deviation Correction

Request the full report with complete details of these +120 patents for offline reading.

The patents shown here demonstrate a variety of methods for enhancing LiDAR calibration and alignment. Some techniques focus on correcting misalignments in sensor arrays mounted on vehicles. Others, such as radiometric calibration for aerial LiDAR systems, serve purposes like classifying ground objects.