13 patents in this list

Accurate LiDAR alignment and calibration are crucial for autonomous vehicles to perceive their environment reliably. Misalignments or calibration errors can lead to distorted data, affecting navigation and decision-making. In dynamic environments, even minor discrepancies can compromise safety and efficiency, highlighting the importance of precise sensor integration.

Professionals face challenges such as maintaining alignment under varying conditions and integrating data from multiple sensors. Temperature fluctuations, mechanical vibrations, and environmental factors can all impact the accuracy of LiDAR systems. Ensuring consistent performance requires robust calibration techniques that adapt to these variables.

This page explores a range of solutions from recent research, including sensor array calibration, camera-assisted alignment, and sensor fusion methods. These approaches enhance the precision of LiDAR systems, ensuring reliable data integration and improved performance in autonomous vehicle systems. By addressing these challenges, the solutions contribute to safer and more efficient navigation.

1. Vehicle Sensor Array Calibration via Relative Orientation Adjustment and Deviation Correction

Lyft, Inc., 2024

Calibrating sensors in a vehicle sensor array to correct misalignments and improve accuracy. The technique involves determining the relative orientation of the sensor array to the vehicle and comparing it to the expected orientation. If deviations are detected, calibration factors are calculated to correct the sensor measurements. Methods for this calibration include comparing sensor outputs, using camera markers, or analyzing vehicle motion data.

Patent drawing: US11878632B2
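
As a rough illustration of the deviation-correction idea, the sketch below (not taken from the patent) compares a measured sensor-array orientation with the expected one and applies the resulting rotation as a calibration factor to raw measurements; the yaw-only deviation and all numeric values are assumptions.

```python
import numpy as np

def rotation_z(yaw_rad):
    """Rotation about the vertical axis by the given yaw angle."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Expected vs. measured orientation of the sensor array relative to the vehicle.
# The deviation here is a small yaw error; real deviations are full 3D rotations.
R_expected = rotation_z(np.deg2rad(90.0))
R_measured = rotation_z(np.deg2rad(91.5))        # 1.5 degree mounting deviation

# Calibration factor: re-expresses measurements from the deviated sensor frame
# as if the array had its expected orientation relative to the vehicle.
R_correction = R_expected.T @ R_measured

# Apply the correction to raw sensor measurements (N x 3 points).
rng = np.random.default_rng(0)
raw_points = rng.uniform(-10.0, 10.0, (100, 3))
corrected_points = raw_points @ R_correction.T
```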

2. Radiation Calibration Method for Airborne Hyperspectral Imaging LiDAR with Monochromator and Rotating Whiteboard

HEFEI INSTITUTE OF PHYSICAL SCIENCE, CHINESE ACADEMY OF SCIENCES, 2024

Radiation calibration method for airborne hyperspectral imaging LiDAR system. The method calibrates the radiation detection of the LiDAR system so that the spectral data captured by the hyperspectral LiDAR can be used accurately for tasks like ground target classification. The calibration involves two steps: spectrum calibration using a monochromator to determine channel wavelengths, and radiation calibration using a rotating whiteboard to determine channel sensitivities.
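A minimal sketch of the two-step idea, under the assumption that radiation calibration reduces to a per-channel sensitivity coefficient (known whiteboard radiance divided by the recorded raw signal); the channel count, units, and values are illustrative, not from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (spectrum calibration): wavelengths assigned to each detection channel,
# e.g. from monochromator sweeps that locate each channel's response peak.
channel_wavelengths_nm = np.linspace(450.0, 900.0, 16)       # assumed 16 channels

# Step 2 (radiation calibration): the rotating whiteboard provides a known
# reference radiance; the recorded raw counts per channel yield a sensitivity
# coefficient that converts instrument counts to radiance.
reference_radiance = 0.8 * np.ones(16)                       # assumed whiteboard radiance
whiteboard_counts = rng.uniform(2000, 4000, 16)              # raw instrument response
sensitivity = reference_radiance / whiteboard_counts         # radiance per count

# Applying the calibration to a flight-line measurement:
raw_scene_counts = rng.uniform(500, 3500, 16)
calibrated_radiance = raw_scene_counts * sensitivity
```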

3. LiDAR System with Camera-Assisted Alignment of Transmitter and Receiver Blocks

Waymo LLC, 2023

Ensuring precise alignment of LiDAR transmitter and receiver blocks to enhance LiDAR performance. A camera images the positions of the light sources in the transmitter block and the detectors in the receiver block. Offsets between these positions are calculated and used to adjust the alignment of the blocks. This ensures that beams from the sources are accurately directed to the corresponding detectors, even if they are initially misaligned.
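The offset computation might look something like the sketch below, which assumes the camera reports a pixel position for each light source and each detector, and that a mean pixel offset, scaled by an assumed focal-plane resolution, drives the block adjustment.

```python
import numpy as np

# Pixel positions of the light sources (transmitter block) and detectors
# (receiver block) as seen by the alignment camera. Values are illustrative.
source_px = np.array([[101.2, 50.1], [101.4, 80.3], [101.1, 110.2]])
detector_px = np.array([[99.0, 49.6], [99.3, 79.9], [98.9, 109.8]])

# Per-channel offsets and the mean offset used to drive the mechanical
# adjustment of one block relative to the other.
offsets = source_px - detector_px
mean_offset = offsets.mean(axis=0)

# Convert the pixel offset to a physical shift using the camera's assumed
# scale at the focal plane.
mm_per_pixel = 0.005
shift_mm = mean_offset * mm_per_pixel
print("Adjust receiver block by (x, y) mm:", shift_mm)
```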

4. Autonomous Vehicle Positioning System Integrating Inertial Measurements with LiDAR Point Cloud and Local Map Data

APOLLO INTELLIGENT DRIVING TECHNOLOGY (BEIJING) CO., LTD., 2023

Accurate, robust positioning for autonomous vehicles, using data from onboard sensors without relying on external maps. The method integrates inertial measurements with LiDAR point clouds and local maps to determine a vehicle's position. It compensates vehicle motion using inertial data, matches LiDAR points to maps, and probabilistically combines the data sources to optimize positioning.
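Of the steps listed, the motion-compensation one lends itself to a short sketch. The version below assumes a constant yaw rate and velocity over a single sweep (a simplification of integrating full inertial measurements) and an illustrative frame convention; it is not the patent's implementation.

```python
import numpy as np

def deskew_sweep(points, timestamps, yaw_rate, velocity):
    """Transform every LiDAR return into the sensor pose at the end of the
    sweep, assuming constant yaw rate and velocity during the sweep. Signs
    assume the rate and velocity are expressed in the end-of-sweep frame."""
    t_end = timestamps.max()
    out = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        dt = t_end - t                          # time from this return to sweep end
        a = -yaw_rate * dt                      # heading of the old frame in the end frame
        c, s = np.cos(a), np.sin(a)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        out[i] = R @ p - velocity * dt          # rotate, then undo the travelled distance
    return out

rng = np.random.default_rng(2)
points = rng.uniform(-30.0, 30.0, (2000, 3))    # one LiDAR sweep (N x 3, metres)
timestamps = np.linspace(0.0, 0.1, 2000)        # 100 ms sweep
compensated = deskew_sweep(points, timestamps,
                           yaw_rate=0.2,                          # rad/s from the gyroscope
                           velocity=np.array([10.0, 0.0, 0.0]))   # m/s from IMU integration
```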

5. Sensor Alignment Method Using Iterative Transformation and Mutual Information Metric

Luminar, LLC, 2022

Optimizing alignment of LiDAR and camera sensors to merge data for autonomous driving systems. The method involves iteratively adjusting transformation parameters to align sensor data sets from sensors with overlapping fields of view. The alignment is optimized using a metric of mutual information between the sensor datasets.

Patent drawing: US11360197B2
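
A minimal sketch of the metric and the iterative adjustment, using a histogram-based mutual information estimate and a single transformation parameter (an image shift) on synthetic data; the actual method operates on full sensor data sets and richer transformation parameters, so treat this only as an illustration of the optimization loop.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally sized intensity arrays."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Synthetic overlap region: the LiDAR intensity image is the camera image
# shifted by an unknown offset (the "misalignment") plus noise.
rng = np.random.default_rng(3)
camera = rng.random((200, 200))
true_shift = 7
lidar = np.roll(camera, true_shift, axis=1) + 0.05 * rng.standard_normal((200, 200))

# Iteratively adjust the transformation parameter (here a column shift) and
# keep the value that maximizes mutual information between the data sets.
candidates = range(-15, 16)
scores = [mutual_information(camera, np.roll(lidar, -s, axis=1)) for s in candidates]
best = list(candidates)[int(np.argmax(scores))]
print("estimated shift:", best)                  # should recover ~7
```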

6. Rotating Platform-Based Calibration System for LiDAR and Camera Alignment in Autonomous Vehicles

Lyft, Inc., 2021

Calibrating LiDAR sensors with cameras in autonomous vehicles to ensure accurate alignment of data for environment perception. The calibration involves using a rotating platform with markers to generate 3D point clouds from the cameras and LiDAR. By optimizing the LiDAR position/rotation to match the cameras, misalignments can be determined and corrected.
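One way the "optimize the LiDAR position/rotation to match the cameras" step could be posed is as a least-squares rigid fit over corresponding marker positions; the Kabsch-style sketch below, with synthetic marker data, is an assumption about that optimization rather than the patent's procedure.

```python
import numpy as np

def estimate_rigid_transform(lidar_pts, camera_pts):
    """Least-squares rotation/translation (Kabsch) mapping LiDAR marker
    positions onto the camera-derived positions of the same markers."""
    mu_l, mu_c = lidar_pts.mean(axis=0), camera_pts.mean(axis=0)
    H = (lidar_pts - mu_l).T @ (camera_pts - mu_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_c - R @ mu_l
    return R, t

# 3D marker positions seen by the cameras as the platform rotates, and the
# same markers in the (misaligned) LiDAR frame -- values are illustrative.
rng = np.random.default_rng(4)
camera_markers = rng.uniform(0.0, 5.0, (20, 3))
a = np.deg2rad(1.5)
true_R = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
true_t = np.array([0.10, -0.05, 0.02])
lidar_markers = (camera_markers - true_t) @ true_R   # simulated misalignment

R_cal, t_cal = estimate_rigid_transform(lidar_markers, camera_markers)
aligned = lidar_markers @ R_cal.T + t_cal            # corrected LiDAR points
```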

7. Coherent LiDAR System with Waveguide-Coupled Lens Alignment and Phase-Calibrated Optical Modulators

GM GLOBAL TECHNOLOGY OPERATIONS LLC, 2021

Calibration and alignment of coherent LiDAR systems for vehicles to maximize performance and range. The LiDAR system uses FMCW technology with phase modulation. The receiving lens is aligned using an added waveguide coupler that allows a second light source to transmit through the lens. Optical phase modulators are calibrated to match the phase of the combined signals from the target and local oscillator. This ensures maximum signal strength when detected by photodetectors.
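In a highly simplified model, the phase-calibration step could amount to sweeping the modulator phase and keeping the setting that maximizes the detected power of the combined target and local-oscillator signals; the toy model below is an assumption for illustration, not the patent's implementation.

```python
import numpy as np

# Toy model: the photodetector output is the coherent sum of the target return
# and the local oscillator; its magnitude peaks when the modulator phase offset
# cancels the residual phase difference between the two paths.
residual_phase = np.deg2rad(37.0)                    # unknown path mismatch (assumed)

def detected_power(modulator_phase):
    lo = np.exp(1j * 0.0)
    target = 0.8 * np.exp(1j * (residual_phase - modulator_phase))
    return np.abs(lo + target) ** 2

# Calibration sweep: try candidate phase settings and keep the one that
# maximizes the detected signal strength.
candidates = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
best_phase = candidates[np.argmax([detected_power(p) for p in candidates])]
print("calibrated modulator phase (deg):", np.rad2deg(best_phase))   # ~37
```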

8. Sensor Calibration System for LiDAR and Camera Alignment Using Corner and Edge Extraction

DEEPMAP INC., 2020

Calibration of sensors like LiDAR and cameras on autonomous vehicles. The calibration allows for generating and maintaining high-definition maps for safe autonomous driving. The system uses captured LiDAR and camera data to calibrate the sensors. It extracts corners from LiDAR points to align with camera edges. This provides a transform for mapping between LiDAR and camera coordinates. The calibrated sensors are then used to generate high-precision maps that allow precise autonomous vehicle positioning in lanes for safe driving.

Patent drawing: US10841496B2
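
A sketch of the underlying fitting problem, assuming a pinhole camera model: corners extracted from the LiDAR cloud are projected through candidate extrinsics and compared against the matching camera features, and the transform with the smallest reprojection error is kept. The intrinsics, corner positions, and yaw-only search are illustrative assumptions.

```python
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],        # assumed pinhole intrinsics
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

def project(points, R, t):
    """Project 3D points into the image with candidate extrinsics (R, t)."""
    cam = points @ R.T + t
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def yaw(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Corners extracted from the LiDAR point cloud and the pixel locations of the
# matching edges/corners found in the camera image (illustrative values).
lidar_corners = np.array([[1.0, 0.5, 5.0], [-1.0, 0.5, 5.0], [0.0, 1.0, 6.0]])
true_R, true_t = yaw(np.deg2rad(2.0)), np.array([0.05, 0.0, 0.1])
image_corners = project(lidar_corners, true_R, true_t)

# Search over candidate rotations for the transform whose projected LiDAR
# corners best line up with the camera features.
angles = np.deg2rad(np.linspace(-5, 5, 201))
errors = [np.linalg.norm(project(lidar_corners, yaw(a), true_t) - image_corners)
          for a in angles]
best_yaw = np.rad2deg(angles[int(np.argmin(errors))])
print("estimated yaw offset (deg):", best_yaw)       # ~2.0
```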

9. LiDAR Optics Alignment Method Utilizing Camera with Interposed Apertures

Waymo LLC, 2020

A method to align LiDAR optics using a camera. The method involves obtaining images with the camera while interposing different apertures between the camera and the LiDAR device. By analyzing the images, alignment offsets between the LiDAR transmitter and receiver can be determined.
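One plausible reading of that offset analysis, sketched below, is to compare intensity-weighted spot centroids between images captured through the different apertures; the Gaussian spots and the centroid approach are assumptions for illustration only.

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (row, col) of a camera image."""
    rows = np.arange(image.shape[0])[:, None]
    cols = np.arange(image.shape[1])[None, :]
    return np.array([(image * rows).sum(), (image * cols).sum()]) / image.sum()

def gaussian_spot(shape, center, sigma=3.0):
    """Synthetic camera image of a single beam spot."""
    r = np.arange(shape[0])[:, None]
    c = np.arange(shape[1])[None, :]
    return np.exp(-((r - center[0]) ** 2 + (c - center[1]) ** 2) / (2 * sigma ** 2))

# Images captured with two different apertures interposed between the camera
# and the LiDAR: one isolating the transmit path, one the receive path.
img_transmit_aperture = gaussian_spot((120, 120), center=(60.0, 60.0))
img_receive_aperture = gaussian_spot((120, 120), center=(62.5, 58.0))

offset = centroid(img_transmit_aperture) - centroid(img_receive_aperture)
print("transmitter-receiver alignment offset (pixels):", offset)   # ~(-2.5, 2.0)
```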

10. Sensor Fusion Method for Aligning Camera Images with LiDAR Reflections in Autonomous Navigation

Mobileye Vision Technologies Ltd., 2019

Using camera and LiDAR sensors together to enable autonomous navigation. The method involves aligning camera images with LiDAR reflections to correlate objects identified in both sensor outputs. This allows attributing LiDAR depth information to camera-detected objects. Combining camera object recognition with LiDAR distance measurement provides more accurate navigation information for autonomous vehicles.
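The attribution step might be sketched as projecting LiDAR returns into the image and assigning each camera detection the median range of the points that fall inside its bounding box; the intrinsics, the median statistic, and the synthetic data below are assumptions.

```python
import numpy as np

K = np.array([[800.0, 0.0, 640.0],         # assumed camera intrinsics
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])

def project_points(points_cam):
    """Project 3D points (already in the camera frame) to pixel coordinates."""
    uv = points_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def depth_for_detection(points_cam, box):
    """Median range of the LiDAR returns that fall inside a camera detection box."""
    u_min, v_min, u_max, v_max = box
    uv = project_points(points_cam)
    inside = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
              (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max) &
              (points_cam[:, 2] > 0))
    if not inside.any():
        return None
    return float(np.median(np.linalg.norm(points_cam[inside], axis=1)))

# Illustrative data: LiDAR returns already transformed to the camera frame and
# a bounding box produced by the camera's object detector.
rng = np.random.default_rng(5)
points_cam = rng.uniform([-10, -2, 1], [10, 2, 40], size=(5000, 3))
detection_box = (600, 300, 700, 420)       # (u_min, v_min, u_max, v_max) in pixels

print("object range (m):", depth_for_detection(points_cam, detection_box))
```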

11. LiDAR Distance Measurement Method Using Split Laser Pulses for Temperature-Compensated Photodiode Bias Adjustment

Uber Technologies, Inc., 2018

A method to measure distances with LiDAR that compensates for temperature variations of the sensor and allows using lower cost components. The method involves splitting the laser pulse into a calibration pulse and an external pulse. The calibration pulse is directed to the avalanche photodiode to measure its response, which indicates the bias voltage needed to compensate for temperature changes. The external pulse is directed to an object to measure the reflected pulse and determine its time of flight for distance calculation. By adjusting the bias voltage based on the calibration pulse response, the photodiode gain can be maintained constant regardless of temperature changes.
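A compact sketch of the two pulse paths, assuming a simple proportional rule for the bias adjustment and the standard round-trip time-of-flight conversion; the update rule and all constants are illustrative assumptions.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0             # m/s

def update_bias(current_bias_v, calib_response, target_response, gain_v_per_unit=0.5):
    """Nudge the APD bias voltage so the response to the internal calibration
    pulse stays at its target level, keeping the gain constant over temperature.
    The proportional rule and constants are assumptions for illustration."""
    error = target_response - calib_response
    return current_bias_v + gain_v_per_unit * error

def distance_from_tof(t_emit_s, t_return_s):
    """Round-trip time of flight of the external pulse converted to range."""
    return SPEED_OF_LIGHT * (t_return_s - t_emit_s) / 2.0

# The laser pulse is split: the calibration pulse goes straight to the APD,
# the external pulse goes out to the target.
bias_v = 150.0
calib_response = 0.92                      # APD response dipped as it warmed up (assumed units)
bias_v = update_bias(bias_v, calib_response, target_response=1.00)

print("adjusted bias (V):", bias_v)                     # 150.04
print("range (m):", distance_from_tof(0.0, 333e-9))     # ~49.9 m
```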

12. Flash LIDAR System with Segmented Field of View and Randomized Illumination Timing and Direction

Harvey Weinberg, 2018

Flash LIDAR system that improves range and jamming resistance by scanning the field of view in segments with randomized illumination timing and direction. The system divides the FOV into segments that are sequentially illuminated by dedicated illuminators, with corresponding subsets of detectors receiving the reflected light. Illuminating one segment at a time concentrates optical power over a smaller area, increasing illumination density on the target and improving range. Randomizing the timing and direction of illumination makes the emission pattern difficult to predict, providing jamming resistance.

Patent drawing: US20180074196A1
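
A sketch of what a randomized illumination schedule could look like, assuming a fixed number of segments, a random firing order, and per-slot timing jitter; the specific scheduling rule is an assumption, not the patent's.

```python
import numpy as np

rng = np.random.default_rng(6)

# Divide the field of view into segments, each paired with its own illuminator
# and detector subset.
num_segments = 16
frame_period_s = 0.1
nominal_slot_s = frame_period_s / num_segments

# Randomize both the order in which segments are illuminated (direction) and
# the firing time within each slot (timing jitter).
segment_order = rng.permutation(num_segments)
timing_jitter = rng.uniform(0.0, 0.2 * nominal_slot_s, size=num_segments)
fire_times = np.arange(num_segments) * nominal_slot_s + timing_jitter

for t, seg in list(zip(fire_times, segment_order))[:4]:
    print(f"t={t:.5f}s -> illuminate segment {seg}, read its detector subset")
```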

13. LIDAR Data Alignment Using Planar Feature Comparison with Reference Point Cloud

Google Inc., 2015

Aligning LIDAR data by comparing planar features in the data to corresponding features in a reference point cloud. This allows two 3D point clouds that are misaligned because of different sensor positions to be brought into alignment.
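A minimal sketch of the planar-feature comparison, assuming each plane is summarized by an SVD-fitted normal and the aligning rotation is recovered with a Kabsch-style fit between matched normals; the patch generation and the rotation-only alignment are simplifications, not the patent's full method.

```python
import numpy as np

rng = np.random.default_rng(7)

def plane_normal(points):
    """Unit normal of a roughly planar patch of points, via SVD."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]                                    # direction of least variance

def aligning_rotation(normals_src, normals_ref):
    """Rotation that best maps the source cloud's plane normals onto the
    matching normals in the reference cloud (Kabsch on unit vectors)."""
    H = normals_src.T @ normals_ref
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def make_patch(axis, n=200):
    """Noisy planar patch whose normal points along the given coordinate axis."""
    pts = rng.uniform(-1.0, 1.0, (n, 3))
    pts[:, axis] = 0.01 * rng.standard_normal(n)
    return pts

# Reference cloud: normals of three planar features (e.g. ground and two walls).
ref_normals = np.array([plane_normal(make_patch(axis)) for axis in range(3)])

# Misaligned cloud: the same planes observed from a sensor rotated by 4 degrees.
a = np.deg2rad(4.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
obs_normals = ref_normals @ R_true.T

R_align = aligning_rotation(obs_normals, ref_normals)    # recovers R_true.T
aligned_normals = obs_normals @ R_align.T                # ≈ ref_normals
```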

The patents shown here demonstrate a variety of methods for enhancing LiDAR calibration and alignment. Some techniques focus on correcting misalignments in sensor arrays mounted on vehicles, while others, such as radiation calibration for airborne LiDAR systems, serve purposes like ground-target classification.