Joint calibration method, device, equipment and medium for radar and integrated navigation system

Document No.: 1612735    Publication date: 2020-01-10

Note: This technology, "Joint calibration method, device, equipment and medium for radar and integrated navigation system", was created by 曹家伟, 黄玉辉 and 钱炜 on 2019-09-25. Its main content is as follows.

Abstract: The embodiments of the present application provide a joint calibration method, device, equipment, and medium for a radar and an integrated navigation system. The method acquires the point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, together with the position and attitude data collected by the integrated navigation system at the same two times; registers the two point clouds to obtain a first homogeneous transformation matrix; calculates a second homogeneous transformation matrix from the position and attitude data; and completes the joint calibration between the laser radar and the integrated navigation system from the first and second homogeneous transformation matrices. For the joint calibration of a radar and an integrated navigation system, the scheme provided by the embodiments of the present application places low demands on environmental conditions, weakens the influence of those conditions on calibration accuracy, and thereby improves calibration accuracy.

1. A joint calibration method for a radar and an integrated navigation system, characterized by comprising the following steps:

acquiring point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and acquiring position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, wherein the point cloud data collected at the first time and the second time comprise point clouds of a same object, and the positions and/or attitudes of the autonomous vehicle at the first time and the second time are different;

performing registration processing on the point cloud data collected by the laser radar at the first time and the point cloud data collected by the laser radar at the second time, to obtain a first homogeneous transformation matrix between the laser radar coordinate systems of the laser radar at the first time and the second time;

calculating a second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time;

and determining a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system based on the first homogeneous transformation matrix and the second homogeneous transformation matrix, determining a distance between an obstacle detected by the laser radar and the autonomous vehicle according to the third homogeneous transformation matrix, and controlling the autonomous vehicle according to the distance.

2. The method of claim 1, wherein a time interval between the first time and the second time is less than a preset time length.

3. The method of claim 1, wherein a distance between the position of the autonomous vehicle at the first time and the position of the autonomous vehicle at the second time is less than a preset distance.

4. The method according to any one of claims 1-3, wherein the calculating a second homogeneous transformation matrix between the integrated navigation coordinate systems at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time comprises:

converting the position collected by the integrated navigation system at the first time into a first position in a UTM coordinate system, and converting the position collected at the second time into a second position in the UTM coordinate system;

determining a first transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the first position and the attitude data collected at the first time, and determining a second transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the second position and the attitude data collected at the second time;

and calculating the second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time according to the first transformation matrix and the second transformation matrix.

5. A control device, comprising:

an acquisition module, configured to acquire point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and to acquire position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, wherein the point cloud data collected at the first time and the second time comprise point clouds of a same object, and the positions and/or attitudes of the autonomous vehicle at the first time and the second time are different;

a registration processing module, configured to perform registration processing on the point cloud data collected by the laser radar at the first time and the point cloud data collected by the laser radar at the second time, to obtain a first homogeneous transformation matrix between the laser radar coordinate systems of the laser radar at the first time and the second time;

a calculation module, configured to calculate a second homogeneous transformation matrix between the integrated navigation coordinate systems at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time;

a determining module, configured to determine a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system based on the first homogeneous transformation matrix and the second homogeneous transformation matrix;

and a control module, configured to determine a distance between an obstacle detected by the laser radar and the autonomous vehicle according to the third homogeneous transformation matrix, and to control the autonomous vehicle according to the distance.

6. The control device of claim 5, wherein a time interval between the first time and the second time is less than a preset time length.

7. The control device of claim 5, wherein a distance between the position of the autonomous vehicle at the first time and the position of the autonomous vehicle at the second time is less than a preset distance.

8. The control device according to any one of claims 5 to 7, wherein the calculation module includes:

a coordinate conversion submodule, configured to convert the position collected by the integrated navigation system at the first time into a first position in a UTM coordinate system, and to convert the position collected at the second time into a second position in the UTM coordinate system;

a determining submodule, configured to determine a first transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the first position and the attitude data collected at the first time, and to determine a second transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the second position and the attitude data collected at the second time;

and a calculation submodule, configured to calculate the second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time according to the first transformation matrix and the second transformation matrix.

9. An autonomous vehicle, comprising a laser radar, an integrated navigation system, a processor, and a memory, wherein:

the laser radar is configured to detect obstacles around the autonomous vehicle to obtain point cloud data;

the integrated navigation system is configured to collect position and attitude data of the autonomous vehicle; and

the memory stores instructions which, when executed by the processor, cause the processor to perform the method of any one of claims 1-4.

10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of claims 1-4.

Technical Field

The embodiments of the present application relate to the technical field of automatic driving, and in particular, to a joint calibration method, device, equipment, and medium for a radar and an integrated navigation system.

Background

In an autonomous vehicle, the laser radar and the INS/GPS integrated navigation system are important components. The laser radar is used to detect obstacles around the vehicle, and the INS/GPS integrated navigation system is used to detect the position and attitude of the vehicle. In actual use, the laser radar and the INS/GPS integrated navigation system need to be jointly calibrated so that the coordinates of obstacles detected by the laser radar can be converted from the laser radar coordinate system to the INS/GPS integrated navigation system coordinate system.

At present, the related technologies for joint calibration between the laser radar coordinate system and the INS/GPS integrated navigation system coordinate system fall into two main types. The first is manual calibration, which has large errors and low accuracy and cannot reliably estimate the rotation between the laser radar coordinate system and the INS/GPS integrated navigation system coordinate system. The second is three-dimensional laser radar extrinsic calibration based on a hand-eye calibration model, which requires data collected in both uphill and turning environments, places high demands on environmental conditions, and uses a calibration model that is sensitive to environmental noise; it is therefore easily disturbed by noise and suffers from low calibration accuracy.

Disclosure of Invention

The embodiments of the present application provide a joint calibration method, device, equipment, and medium for a radar and an integrated navigation system, which achieve the joint calibration between a laser radar and an integrated navigation system while reducing the calibration's requirements on environmental conditions, weakening the influence of environmental conditions on calibration accuracy, and improving calibration accuracy.

A first aspect of the embodiments of the present application provides a joint calibration method for a radar and an integrated navigation system. The method includes: acquiring point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and acquiring position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, where the point cloud data collected at the two times contain point clouds of the same object and the positions and/or attitudes of the autonomous vehicle at the two times differ; performing registration processing on the two point clouds to obtain a first homogeneous transformation matrix between the laser radar coordinate systems at the first time and the second time; calculating a second homogeneous transformation matrix between the integrated navigation coordinate systems at the first time and the second time according to the position and attitude data; and determining a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system based on the first and second homogeneous transformation matrices, determining the distance between an obstacle detected by the laser radar and the autonomous vehicle according to the third homogeneous transformation matrix, and controlling the autonomous vehicle according to that distance.

A second aspect of the embodiments of the present application provides a control device, including:

an acquisition module, configured to acquire point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and to acquire position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, where the point cloud data collected at the first time and the second time contain point clouds of the same object, and the positions and/or attitudes of the autonomous vehicle at the first time and the second time are different;

a registration processing module, configured to perform registration processing on the point cloud data collected by the laser radar at the first time and at the second time, to obtain a first homogeneous transformation matrix between the laser radar coordinate systems at the first time and the second time;

a calculation module, configured to calculate a second homogeneous transformation matrix between the integrated navigation coordinate systems at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time;

a determining module, configured to determine a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system based on the first homogeneous transformation matrix and the second homogeneous transformation matrix;

and a control module, configured to determine the distance between an obstacle detected by the laser radar and the autonomous vehicle according to the third homogeneous transformation matrix, and to control the autonomous vehicle according to the distance.

A third aspect of the embodiments of the present application provides an autonomous vehicle, including a laser radar, an integrated navigation system, a processor, and a memory. The laser radar is configured to detect obstacles around the autonomous vehicle to obtain point cloud data; the integrated navigation system is configured to collect position and attitude data of the autonomous vehicle; and the memory stores instructions which, when executed by the processor, cause the processor to perform the method of the first aspect.

A fourth aspect of the embodiments of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method according to the first aspect.

Based on the above aspects, in the joint calibration method, device, equipment, and medium provided by the embodiments of the present application, point cloud data collected by the laser radar of an autonomous vehicle at a first time and a second time, together with position and attitude data collected by the integrated navigation system at the same two times, are acquired; the two point clouds are registered to obtain a first homogeneous transformation matrix; a second homogeneous transformation matrix is calculated from the position and attitude data; and the joint calibration between the laser radar and the integrated navigation system is completed from the first and second homogeneous transformation matrices. On one hand, the embodiments only require that the data used for joint calibration be collected at times at which the position and/or attitude of the autonomous vehicle differs, and impose no requirement on the environment, so the environmental demands of the joint calibration are reduced. On the other hand, the calibration relationship between the laser radar and the integrated navigation system is determined from the homogeneous transformation matrices of the laser radar coordinate system at different times and of the integrated navigation system at different times, without the hand-eye calibration model or similar models of the related art; environmental noise therefore has little influence on the calibration result, and calibration accuracy is improved.

It should be understood that what is described in this summary is not intended to limit key or critical features of the embodiments of the application, nor is it intended to limit the scope of the application. Other features of the present application will become apparent from the following description.

Drawings

FIG. 1 is a schematic view of an automatic driving scenario provided by an embodiment of the present application;

FIG. 2 is a flowchart of a joint calibration method for a radar and an integrated navigation system according to an embodiment of the present application;

FIG. 3a is a schematic diagram of data acquisition provided by an embodiment of the present application;

FIG. 3b is a schematic diagram of another data acquisition provided by an embodiment of the present application;

FIG. 4 is a flowchart of another method for jointly calibrating a radar and a combined navigation system according to an embodiment of the present application;

FIG. 5 is a schematic structural diagram of a control device according to an embodiment of the present application;

FIG. 6 is a schematic structural diagram of another control device provided in the embodiments of the present application;

FIG. 7 is a schematic structural diagram of an autonomous vehicle according to an embodiment of the present application.

Detailed Description

Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present application. It should be understood that the drawings and embodiments of the present application are for illustration purposes only and are not intended to limit the scope of the present application.

The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the embodiments of the application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.

Fig. 1 is a schematic view of an automatic driving scenario provided by an embodiment of the present application, in which a vehicle 10 performs an automatic driving operation. The vehicle 10 carries a laser radar 11, an integrated navigation system 12, and a controller 14. The laser radar 11 collects information around the vehicle 10 and obtains point cloud data of the surroundings (for example, in front of the vehicle). The integrated navigation system 12 includes an inertial measurement unit (IMU) and a positioning device (for ease of description, a GPS device is taken as an example). The IMU detects the attitude of the vehicle 10, such as its yaw, pitch, and roll angles. The positioning device detects the position of the vehicle 10; for example, when the positioning device is a GPS device, the detected position includes the longitude and latitude of the vehicle. The controller 14 controls the operating state of the vehicle 10 according to the data collected by the laser radar 11 and the integrated navigation system 12, for example adjusting the trajectory of the vehicle 10 to avoid an obstacle.

As shown in Fig. 1, if the moving vehicle 10 encounters an obstacle 13, the laser radar 11 obtains the position of the obstacle 13 in its own coordinate system (the laser radar coordinate system) and sends it to the controller 14. The controller 14 maps the position of the obstacle 13 from the laser radar coordinate system into the integrated navigation coordinate system according to the calibration relationship between the laser radar and the integrated navigation system, and determines the relative position of the obstacle and the vehicle 10 from the vehicle position detected by the integrated navigation system. If the distance between the obstacle 13 and the vehicle 10 is smaller than a preset distance, a corresponding automatic driving strategy is executed according to that relative position so as to avoid the obstacle.

Of course, Fig. 1 is only one possible application scenario; in some embodiments, the embodiments of the present application may also be applied to scenarios such as automatic reversing. Whatever the scenario, the calibration relationship between the laser radar and the integrated navigation system is crucial to automatic driving; yet, as the related technologies introduced in the Background show, existing calibration methods are complex, impose harsh conditions, and achieve low accuracy. Overcoming these drawbacks with a convenient, reliable, effective, and accurate calibration method is therefore important.

In view of the defects in the related art, an embodiment of the present application provides a joint calibration scheme for a radar and an integrated navigation system. Fig. 2 is a flowchart of a joint calibration method for a radar and an integrated navigation system provided by an embodiment of the present application. The method may be executed by a control device mounted on an automatic driving device; for example, a laser radar and an integrated navigation system calibrated by this method may be applied to the scenario shown in Fig. 1 to implement automatic obstacle avoidance. As shown in Fig. 2, the method includes the following steps:

Step 201: acquire point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, where the point cloud data collected at the first time and the second time contain point clouds of the same object, and the positions and/or attitudes of the autonomous vehicle at the first time and the second time are different.

For example, in the present embodiment, the autonomous vehicle (hereinafter, the vehicle) may travel in any driving state other than standing still, along any route and in any environment; for example, it may turn in place or move in at least one of the forward, backward, left, and right directions, so that its position and/or attitude (for example, yaw, pitch, and roll angles) changes. While the vehicle travels, the laser radar mounted on it scans the surrounding environment and obtains point cloud data at a first time and a second time; at the same times, the integrated navigation system mounted on the vehicle collects the position and attitude data of the vehicle.

For example, Fig. 3a is a schematic diagram of data acquisition provided by an embodiment of the present application. In Fig. 3a, the laser radar and the integrated navigation system mounted on the vehicle collect data at a preset time interval; if the vehicle speed is not constant, the distances between adjacent sampling positions may therefore differ. After collection is completed, the objects in the point cloud data collected at each time are identified, and the position and attitude of the vehicle at each time are examined. If the point cloud data at times t1 and t3 contain the same object A and the position and/or attitude of the vehicle has changed between them, t1 and t3 are taken as the first time and the second time, respectively, and the data collected at t1 and t3 serve as the input of the joint calibration. Note that although the first time and the second time are not two adjacent acquisition times in Fig. 3a, in other application scenarios they may well be adjacent.

In addition, in some embodiments, to ensure that point clouds of the same object appear in the point cloud data of adjacent or different times, the preset time interval may be set small as needed, for example smaller than a preset time length chosen as required. In some examples, the preset time length may be set to 1.5 seconds, and the first time and the second time may be two times 1 second apart; that is, the first time and the second time referred to in this embodiment may be two times whose interval is smaller than the preset time length.

Fig. 3b is a schematic diagram of another data acquisition mode provided by an embodiment of the present application. In Fig. 3b, the laser radar and the integrated navigation system mounted on the vehicle collect data at a preset distance interval; that is, they collect data once every time the vehicle moves the preset distance, the collection positions being z1, z2, z3, and z4. If the point cloud data collected at positions z1 and z2 contain point clouds of the same object B, the times at which the vehicle reaches z1 and z2 are taken as the first time and the second time, respectively, and the data collected at z1 and z2 serve as the input of the joint calibration.

Similarly, to ensure that point clouds of the same object appear in the point cloud data of adjacent or different times, the preset distance may be set small as needed. In some examples, the preset distance may be set to 10 meters, and the first time and the second time may be two acquisition times less than 10 meters apart; that is, the distance between the position of the vehicle at the first time and its position at the second time is smaller than the preset distance. This is only an example and is not the sole limitation on the preset distance.
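To make this pair-selection gating concrete, the following minimal Python sketch (the helper name and sample values are hypothetical, not from the patent) keeps only those pairs of sample times whose time gap stays below the preset time length and whose travel distance stays below the preset distance; the same-object check on the point clouds is omitted here:

```python
import numpy as np

# Hypothetical samples: (timestamp in seconds, vehicle position in metres).
samples = [
    (0.0, np.array([0.0, 0.0, 0.0])),
    (0.5, np.array([3.1, 0.2, 0.0])),
    (1.0, np.array([6.0, 0.9, 0.0])),
]

MAX_DT_S = 1.5     # preset time length (example value from the text)
MAX_DIST_M = 10.0  # preset distance (example value from the text)

def candidate_pairs(samples, max_dt, max_dist):
    """Yield index pairs (i, j), not necessarily adjacent, whose time gap
    and travel distance both stay below the preset thresholds."""
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            dt = samples[j][0] - samples[i][0]
            dist = np.linalg.norm(samples[j][1] - samples[i][1])
            if dt < max_dt and dist < max_dist:
                yield i, j

print(list(candidate_pairs(samples, MAX_DT_S, MAX_DIST_M)))
```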

Step 202: perform registration processing on the point cloud data collected by the laser radar at the first time and the point cloud data collected by the laser radar at the second time, to obtain a first homogeneous transformation matrix between the laser radar coordinate systems of the laser radar at the first time and the second time.

For example, in this embodiment, the point cloud data collected at the first time and the second time may be taken as input, and the ICP algorithm or another point cloud registration algorithm may be used to register the two clouds, yielding the first homogeneous transformation matrix of the laser radar coordinate system from the first time to the second time.
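As one possible realization of this registration step, the sketch below uses the open-source Open3D library's point-to-point ICP; the patent does not prescribe a particular library, and the voxel size and correspondence distance are assumed values:

```python
import numpy as np
import open3d as o3d

def first_homogeneous_transform(cloud_t1, cloud_t2, voxel_size=0.2):
    """Register the (N, 3) point cloud from the first time onto the cloud
    from the second time with point-to-point ICP and return the 4x4
    homogeneous transformation matrix between the two lidar frames."""
    src = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(cloud_t1, dtype=np.float64)))
    dst = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(cloud_t2, dtype=np.float64)))
    # Downsampling keeps ICP fast and less sensitive to point density.
    src = src.voxel_down_sample(voxel_size)
    dst = dst.voxel_down_sample(voxel_size)
    result = o3d.pipelines.registration.registration_icp(
        src, dst,
        1.0,        # max correspondence distance in metres (assumed value)
        np.eye(4),  # identity initial guess
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 numpy array: the first matrix Q
```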

Step 203: calculate a second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time.

In an actual implementation, the vehicle position and attitude data collected at the first time and the second time are taken as input. The position (for example, longitude and latitude) collected by the integrated navigation system at the first time is converted into a first position (x1, y1, z1) in the UTM coordinate system, and the position collected at the second time is converted into a second position (x2, y2, z2) in the UTM coordinate system.

Further, a first transformation matrix $P_a$ from the integrated navigation coordinate system to the UTM coordinate system is calculated from the first position (x1, y1, z1) and the attitude data collected by the integrated navigation system at the first time. For example, if the yaw, pitch, and roll angles collected at the first time are α, β, and γ respectively, the first transformation matrix can be written as the homogeneous matrix

$$P_a=\begin{bmatrix}R_a & t_a\\ 0 & 1\end{bmatrix},\qquad t_a=\begin{bmatrix}x_1\\ y_1\\ z_1\end{bmatrix},$$

where $R_a$ is the rotation composed from the three attitude angles, for example with the standard yaw-pitch-roll (Z-Y-X) convention:

$$R_a=R_z(\alpha)\,R_y(\beta)\,R_x(\gamma)=\begin{bmatrix}\cos\alpha&-\sin\alpha&0\\ \sin\alpha&\cos\alpha&0\\ 0&0&1\end{bmatrix}\begin{bmatrix}\cos\beta&0&\sin\beta\\ 0&1&0\\ -\sin\beta&0&\cos\beta\end{bmatrix}\begin{bmatrix}1&0&0\\ 0&\cos\gamma&-\sin\gamma\\ 0&\sin\gamma&\cos\gamma\end{bmatrix}.$$

Further, a second transformation matrix from the integrated navigation coordinate system to the UTM coordinate system is determined in the same way from the second position (x2, y2, z2) and the attitude data collected at the second time; denote it $P_b$. The second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time is then

$$P = P_b^{-1} P_a.$$
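A minimal numeric sketch of step 203, assuming latitude/longitude/altitude input, the third-party `utm` package for the UTM projection, and the Rz(yaw) Ry(pitch) Rx(roll) composition shown above (conventions differ between integrated navigation products, so this is illustrative only; the sample coordinates are hypothetical):

```python
import numpy as np
import utm  # third-party "utm" package, one possible lat/lon -> UTM choice

def pose_to_utm_matrix(lat_deg, lon_deg, alt_m, yaw, pitch, roll):
    """Homogeneous transformation from the integrated navigation coordinate
    system to the UTM coordinate system for one sample (angles in radians)."""
    easting, northing, _, _ = utm.from_latlon(lat_deg, lon_deg)
    ca, sa = np.cos(yaw), np.sin(yaw)
    cb, sb = np.cos(pitch), np.sin(pitch)
    cg, sg = np.cos(roll), np.sin(roll)
    Rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cg, -sg], [0.0, sg, cg]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx            # R = Rz(yaw) Ry(pitch) Rx(roll)
    T[:3, 3] = [easting, northing, alt_m]
    return T

# Hypothetical samples at the first and second times (same UTM zone assumed).
Pa = pose_to_utm_matrix(31.2304, 121.4737, 4.0, 0.10, 0.00, 0.00)
Pb = pose_to_utm_matrix(31.2305, 121.4738, 4.0, 0.12, 0.01, 0.00)
P = np.linalg.inv(Pb) @ Pa              # second homogeneous matrix: P = Pb^-1 Pa
```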

and 204, determining a third homogeneous transformation matrix between the laser radar coordinate system and the combined navigation coordinate system based on the first homogeneous transformation matrix and the second homogeneous transformation matrix, determining the distance between the obstacle detected by the laser radar and the automatic driving vehicle according to the third homogeneous transformation matrix, and controlling the automatic driving vehicle according to the distance.

Specifically, if the first homogeneous transformation matrix obtained in step 202 is Q and the second homogeneous transformation matrix obtained in step 203 is P, the third homogeneous transformation matrix X between the laser radar coordinate system and the integrated navigation coordinate system can be calculated from the relationship $PX = XQ$.

Further, after the third homogeneous transformation matrix X has been obtained, it may be applied during automatic driving. When the vehicle is driving automatically, the laser radar detects the environment around the vehicle; when an obstacle is detected, the controller mounted on the vehicle maps the coordinates of the obstacle from the laser radar coordinate system into the integrated navigation coordinate system according to X, and determines the distance between the vehicle and the obstacle from the vehicle position detected by the integrated navigation system, so that the corresponding automatic driving strategy can be executed according to that distance.
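A sketch of this runtime use of X (variable names and the safety threshold are hypothetical): the obstacle point is mapped homogeneously from the laser radar frame into the integrated navigation frame, and its distance to the vehicle gates the driving strategy:

```python
import numpy as np

def obstacle_distance(X, p_lidar):
    """Map an obstacle point from the laser radar coordinate system into the
    integrated navigation coordinate system using the third homogeneous
    transformation matrix X, and return its distance to the vehicle origin."""
    p_h = np.append(np.asarray(p_lidar, dtype=float), 1.0)  # homogeneous point
    p_nav = X @ p_h
    return float(np.linalg.norm(p_nav[:3]))

# Example with an identity calibration and an obstacle 12.3 m ahead.
X = np.eye(4)
d = obstacle_distance(X, [12.3, 0.0, 0.0])
if d < 15.0:  # hypothetical safety threshold in metres
    print(f"obstacle at {d:.1f} m: execute the avoidance strategy")
```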

In this embodiment, the point cloud data collected by the laser radar of the autonomous vehicle at the first time and the second time, together with the position and attitude data collected by the integrated navigation system at the same two times, are acquired; the two point clouds are registered to obtain the first homogeneous transformation matrix; the second homogeneous transformation matrix is calculated from the position and attitude data; and the joint calibration between the laser radar and the integrated navigation system is completed from the first and second homogeneous transformation matrices. On one hand, this embodiment only requires that the calibration data be collected at times at which the position and/or attitude of the vehicle differs, and imposes no requirement on the environment, which lowers the environmental demands of the joint calibration. On the other hand, the calibration relationship between the laser radar and the integrated navigation system is determined from the homogeneous transformation matrices of the laser radar coordinate system at different times and of the integrated navigation system at different times, without the hand-eye calibration model or similar models of the related art; environmental noise therefore has little influence on the calibration result, and calibration accuracy is improved.

Fig. 4 is a flowchart of another joint calibration method for a radar and an integrated navigation system provided by an embodiment of the present application. As shown in Fig. 4, the method includes the following steps:

step 401, multiple groups of data pairs collected by a laser radar and an integrated navigation system carried on an automatic driving vehicle are obtained, and each group of data pairs comprises point cloud data, vehicle positions and vehicle attitude data which are collected by the laser radar and the integrated navigation system at two moments.

For each data pair, the point cloud data collected by the laser radar at the two times contain point clouds of the same object, and the positions and/or attitudes of the vehicle collected by the integrated navigation system at the two times are different.

For example, assume the point cloud data collected at time t1 is d1, the position is w1, and the attitude data is z1; at time t2 they are d2, w2, and z2; at time t3 they are d3, w3, and z3; and at time t4 they are d4, w4, and z4. If the point cloud data d1 and d2 contain point clouds of the same object and the positions w1 and w2 differ, the data at times t1 and t2 may form one data pair G1((t1, d1, w1, z1), (t2, d2, w2, z2)). If the point cloud data d3 and d4 contain point clouds of the same object and the positions w3 and w4 are the same but the attitudes z3 and z4 differ, the data at times t3 and t4 may form another data pair G2((t3, d3, w3, z3), (t4, d4, w4, z4)). Of course, this is merely an example.

Step 402: calculate a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system from each acquired data pair.

In this embodiment, the calculation of the homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system from one data pair follows the method of the embodiment of Fig. 2 and is not repeated here. For example, if the acquired data pairs are G1((t1, d1, w1, z1), (t2, d2, w2, z2)) and G2((t3, d3, w3, z3), (t4, d4, w4, z4)), two third homogeneous transformation matrices C1 and C2 are calculated from G1 and G2, respectively.

Step 403: from each acquired data pair and the third homogeneous transformation matrix calculated from it, determine, using the Frobenius norm, the target homogeneous transformation matrix with the smallest error among the calculated third homogeneous transformation matrices, so that the distance between an obstacle detected by the laser radar and the autonomous vehicle can be determined according to the target homogeneous transformation matrix and the autonomous vehicle controlled according to that distance.

For example, a GPU may be used to accelerate the search for the homogeneous transformation matrix with the smallest error among the calculated third homogeneous transformation matrices. Before the search, the search range of the calibration parameters configured by the user may be obtained through a preset configuration interface; the range covers the three coordinate axis directions and the yaw, pitch, and roll directions in the integrated navigation coordinate system. For example, in some settings the search range along the three coordinate axes may be set to -5 meters to 5 meters, and the search range in the yaw, pitch, and roll directions to -3 to 3 radians.

Further, after the search range of the calibration parameters is obtained, the third homogeneous transformation matrices calculated in step 402 may each be converted into the corresponding calibration parameters, i.e., the displacements along the three coordinate axes and the offset angles in the yaw, pitch, and roll directions. One or more third homogeneous transformation matrices that fall within the user-defined search range are then selected, and each of them, together with its corresponding data pairs, is substituted into the following Frobenius norm relation to obtain its error; the matrix with the smallest error is taken as the target homogeneous transformation matrix:

$$e(X)=\sum_{i=1}^{k}\left\|P_i X - X Q_i\right\|_F$$

where k is the number of third homogeneous transformation matrices that fall within the search range (one per data pair), X is a candidate third homogeneous transformation matrix within that range, and $Q_i$ and $P_i$ are, respectively, the first and second homogeneous transformation matrices calculated from the data pair corresponding to the i-th such candidate.
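The selection in step 403 might look like the following pure-NumPy sketch; the parameter extraction assumes the same Rz(yaw) Ry(pitch) Rx(roll) composition used above, and the GPU acceleration and configuration interface are omitted:

```python
import numpy as np

def matrix_to_params(X):
    """Extract (tx, ty, tz, yaw, pitch, roll) from a homogeneous matrix,
    assuming the R = Rz(yaw) Ry(pitch) Rx(roll) composition used above."""
    R, t = X[:3, :3], X[:3, 3]
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return t[0], t[1], t[2], yaw, pitch, roll

def frobenius_error(X, pairs):
    """e(X): sum of ||P_i X - X Q_i||_F over the (P_i, Q_i) data pairs."""
    return sum(np.linalg.norm(P @ X - X @ Q, ord="fro") for P, Q in pairs)

def select_target_matrix(candidates, pairs,
                         t_range=(-5.0, 5.0), ang_range=(-3.0, 3.0)):
    """Keep the candidate third homogeneous matrices whose calibration
    parameters fall inside the user-configured search range, then return
    the one with the smallest Frobenius error."""
    def in_range(X):
        tx, ty, tz, yaw, pitch, roll = matrix_to_params(X)
        return (all(t_range[0] <= v <= t_range[1] for v in (tx, ty, tz))
                and all(ang_range[0] <= v <= ang_range[1]
                        for v in (yaw, pitch, roll)))
    admissible = [X for X in candidates if in_range(X)]
    # Raises ValueError if no candidate is admissible (empty search result).
    return min(admissible, key=lambda X: frobenius_error(X, pairs))
```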

In this embodiment, multiple third homogeneous transformation matrices are obtained by the method of the embodiment of Fig. 2, the target homogeneous transformation matrix is then selected from them using the Frobenius norm, and that target matrix is used as the calibration result. This further improves calibration accuracy, and the precision of the calibration result can be controlled by setting the search range so as to meet the different requirements of different scenarios.

Fig. 5 is a schematic structural diagram of a control device provided by an embodiment of the present application. As shown in Fig. 5, the control device 50 includes:

the acquisition module 51, configured to acquire point cloud data collected at a first time and a second time by a laser radar mounted on an autonomous vehicle, and to acquire position and attitude data collected at the first time and the second time by an integrated navigation system mounted on the autonomous vehicle, where the point cloud data collected at the first time and the second time contain point clouds of the same object, and the positions and/or attitudes of the autonomous vehicle at the first time and the second time are different;

the registration processing module 52, configured to perform registration processing on the point cloud data collected by the laser radar at the first time and at the second time, to obtain a first homogeneous transformation matrix between the laser radar coordinate systems at the first time and the second time;

the calculation module 53, configured to calculate a second homogeneous transformation matrix between the integrated navigation coordinate systems at the first time and the second time according to the position and attitude data collected by the integrated navigation system at the first time and the second time;

the determining module 54, configured to determine a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system based on the first homogeneous transformation matrix and the second homogeneous transformation matrix;

and the control module 55, configured to determine the distance between an obstacle detected by the laser radar and the autonomous vehicle according to the third homogeneous transformation matrix, and to control the autonomous vehicle according to the distance.

In one embodiment, the time interval between the first time and the second time is less than a preset time length, or the distance between the position of the autonomous vehicle at the first time and its position at the second time is less than a preset distance.

In one embodiment, the calculation module 53 includes:

and the coordinate conversion submodule is used for respectively converting the position acquired by the integrated navigation system at the first moment into a first position under a UTM coordinate system and converting the position acquired at the second moment into a second position under the UTM coordinate system.

the determining submodule, configured to determine a first transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the first position and the attitude data collected at the first time, and to determine a second transformation matrix from the integrated navigation coordinate system to the UTM coordinate system according to the second position and the attitude data collected at the second time;

and the calculation submodule, configured to calculate the second homogeneous transformation matrix between the integrated navigation coordinate systems of the integrated navigation system at the first time and the second time according to the first transformation matrix and the second transformation matrix.

The control device provided in this embodiment is implemented in a similar manner and has similar beneficial effects to the embodiment of Fig. 2, which are not repeated here.

Fig. 6 is a schematic structural diagram of another control device provided by an embodiment of the present application. As shown in Fig. 6, the control device 60 includes:

the acquisition module 61 is configured to acquire multiple sets of data pairs acquired by a laser radar and an integrated navigation system mounted on an autonomous vehicle, where each set of data pairs includes point cloud data, vehicle position data, and vehicle attitude data acquired by the laser radar and the integrated navigation system at two times.

for each data pair, the point cloud data collected by the laser radar at the two times contain point clouds of the same object, and the positions and/or attitudes of the vehicle collected by the integrated navigation system at the two times are different;

the calculation module 62, configured to calculate a third homogeneous transformation matrix between the laser radar coordinate system and the integrated navigation coordinate system from each acquired data pair;

and the determining module 63, configured to determine, from each acquired data pair and the third homogeneous transformation matrix calculated from it, the target homogeneous transformation matrix with the smallest error among the calculated third homogeneous transformation matrices using the Frobenius norm, so as to determine the distance between an obstacle detected by the laser radar and the autonomous vehicle according to the target homogeneous transformation matrix and to control the autonomous vehicle according to that distance.

The control device provided in this embodiment is implemented in a similar manner and has similar beneficial effects to the embodiment of Fig. 4, which are not repeated here.

Fig. 7 is a schematic structural diagram of an autonomous vehicle 70 provided by an embodiment of the present application. As shown in Fig. 7, the autonomous vehicle 70 includes a laser radar 71, an integrated navigation system 72, a processor 73, and a memory 74. The laser radar 71 is configured to detect obstacles around the autonomous vehicle to obtain point cloud data; the integrated navigation system 72 is configured to collect position and attitude data of the autonomous vehicle; and the memory 74 stores instructions which, when executed by the processor 73, perform the method of the embodiment of Fig. 2 or Fig. 4 described above.

An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method shown in the embodiment of Fig. 2 or Fig. 4, with a similar execution manner and similar beneficial effects, which are not repeated here.

The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), an application-specific standard product (ASSP), a system on a chip (SOC), a complex programmable logic device (CPLD), and the like.

Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.

In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
