Sensor joint calibration method and device, electronic equipment and storage medium

Document No.: 761795    Published: 2021-04-06

Note: This technology, "Sensor joint calibration method and device, electronic equipment and storage medium", was created by 陈海波 (Chen Haibo) and 陈安东 (Chen Andong) on 2020-12-07. Its main content is as follows. The application relates to the field of automatic driving and provides a sensor joint calibration method, a sensor joint calibration device, electronic equipment and a storage medium. The method includes: acquiring a three-dimensional point cloud and a two-dimensional image that contain a plurality of circular calibration holes, the three-dimensional point cloud being acquired by a vehicle-mounted laser radar and the two-dimensional image by a vehicle-mounted camera; determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud, and the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image; and determining, based on the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located. The method, device, electronic equipment and storage medium reduce the calibration difficulty and improve the accuracy of the calibration result.

1. A sensor joint calibration method, characterized by comprising the following steps:

acquiring a three-dimensional point cloud and a two-dimensional image which comprise a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera;

determining a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image;

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

2. The sensor joint calibration method according to claim 1, wherein the plurality of circular calibration holes are distributed on at least two calibration plates.

3. The sensor joint calibration method according to claim 2, wherein the at least two calibration plates are arranged at different positions in the field of view of the vehicle-mounted camera.

4. The sensor joint calibration method according to claim 3, wherein determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud comprises:

determining a point cloud plane where each calibration plate is located;

determining point clouds corresponding to the circular calibration holes on each calibration plate based on the point cloud plane where each calibration plate is located;

and performing least square fitting on the point cloud corresponding to the circular calibration holes on each calibration plate, and determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud.

5. The sensor joint calibration method according to claim 1, wherein determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image comprises:

and performing circle detection on the two-dimensional image based on Hough transform, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image.

6. The sensor joint calibration method according to any one of claims 1 to 5, wherein the number of the circular calibration holes is not less than 9.

7. The sensor joint calibration method according to any one of claims 1 to 5, wherein the determining a coordinate transformation relationship between a point cloud coordinate system in which the three-dimensional point cloud is located and an image coordinate system in which the two-dimensional image is located based on the three-dimensional coordinates and the two-dimensional coordinates of the center of each circular calibration hole comprises:

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on at least one of an EPnP algorithm, a UPnP algorithm, a DLT algorithm and a P3P algorithm, and the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

8. A sensor joint calibration device, characterized by comprising:

an acquiring unit, used for acquiring a three-dimensional point cloud and a two-dimensional image that contain a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera;

the determining unit is used for determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud and determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image;

and the calibration unit is used for determining the coordinate transformation relation between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

9. The sensor joint calibration device according to claim 8, wherein the plurality of circular calibration holes are distributed on at least two calibration plates.

10. The sensor joint calibration device according to claim 9, wherein the at least two calibration plates are arranged at different positions in the field of view of the vehicle-mounted camera.

11. The sensor joint calibration device according to claim 10, wherein the determining unit comprises a three-dimensional coordinate determination subunit, the three-dimensional coordinate determination subunit comprising:

the plane determining module is used for determining a point cloud plane where each calibration plate is located;

the point cloud determining module is used for determining point clouds corresponding to the circular calibration holes on each calibration plate based on the point cloud plane where each calibration plate is located;

and the three-dimensional coordinate determination module is used for performing least square fitting on the point cloud corresponding to the circular calibration holes on each calibration plate and determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud.

12. The sensor joint calibration device according to claim 8, wherein the determining unit further comprises:

and the two-dimensional coordinate determining subunit is used for performing circle detection on the two-dimensional image based on Hough transform and determining the two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image.

13. The sensor joint calibration device according to any one of claims 8 to 12, wherein the number of the circular calibration holes is not less than 9.

14. The sensor joint calibration device according to any one of claims 8 to 12, wherein the calibration unit is specifically configured to:

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on at least one of an EPnP algorithm, a UPnP algorithm, a DLT algorithm and a P3P algorithm, and the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

15. An electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the sensor joint calibration method according to any one of claims 1 to 7.

16. A non-transitory computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the sensor joint calibration method according to any one of claims 1 to 7.

Technical Field

The application relates to the technical field of automatic driving, in particular to a sensor joint calibration method and device, electronic equipment and a storage medium.

Background

With the continuous development of computer vision technology, sensor technology, connected-vehicle technology and the like, automatic driving technology has made great progress. An automatic driving system senses the environment around the vehicle through a plurality of sensors, fuses their data, and automatically controls the vehicle according to the fused data. The basis of this data fusion is the joint calibration of the various sensors.

In the prior art, when a laser radar and a camera are jointly calibrated, the camera photographs a two-dimensional code to obtain world coordinate points and image coordinate points, and the three-dimensional coordinates of the target points are then back-calculated using the intrinsic parameters of the camera.

Disclosure of Invention

The application provides a sensor joint calibration method and device, electronic equipment and a storage medium, which reduce the calibration difficulty and improve the accuracy of the calibration result.

The application provides a sensor joint calibration method, which comprises the following steps:

acquiring a three-dimensional point cloud and a two-dimensional image which comprise a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera;

determining a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image;

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

According to the sensor joint calibration method provided by the application, the plurality of circular calibration holes are distributed on at least two calibration plates.

According to the sensor joint calibration method provided by the application, the at least two calibration plates are arranged at different positions in the field of view of the vehicle-mounted camera.

According to the sensor joint calibration method provided by the application, determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud comprises the following steps:

determining a point cloud plane where each calibration plate is located;

determining point clouds corresponding to the circular calibration holes on each calibration plate based on the point cloud plane where each calibration plate is located;

and performing least square fitting on the point cloud corresponding to the circular calibration holes on each calibration plate, and determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud.

According to the sensor joint calibration method provided by the application, determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image comprises the following steps:

and performing circle detection on the two-dimensional image based on Hough transform, and determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image.

According to the sensor joint calibration method provided by the application, the number of the circular calibration holes is not less than 9.

According to the sensor joint calibration method provided by the application, determining the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located, based on the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, comprises the following steps:

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on at least one of an EPnP algorithm, a UPnP algorithm, a DLT algorithm and a P3P algorithm, and the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

The application also provides a sensor joint calibration device, which includes:

an acquiring unit, used for acquiring a three-dimensional point cloud and a two-dimensional image that contain a plurality of circular calibration holes;

the determining unit is used for determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud and determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image;

and the calibration unit is used for determining the coordinate transformation relation between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

The present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of any of the sensor joint calibration methods described above when executing the computer program.

The present application also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for joint calibration of a sensor as described in any of the above.

According to the sensor joint calibration method and device, the electronic device and the storage medium provided by the application, a three-dimensional point cloud and a two-dimensional image containing a plurality of circular calibration holes are acquired, and the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located is determined from the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, so that the vehicle-mounted laser radar and the vehicle-mounted camera are jointly calibrated.

Drawings

In order to more clearly illustrate the technical solutions in the present application or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.

FIG. 1 is a schematic flow chart of a sensor joint calibration method provided in the present application;

FIG. 2 is a schematic flow chart of a three-dimensional coordinate determination method provided in the present application;

FIG. 3 is a schematic structural diagram of a calibration plate provided herein;

FIG. 4 is a schematic structural diagram of a sensor joint calibration device provided in the present application;

FIG. 5 is a schematic structural diagram of a three-dimensional coordinate determination subunit provided in the present application;

FIG. 6 is a schematic structural diagram of an electronic device provided in the present application.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.

FIG. 1 is a schematic flow chart of a sensor joint calibration method provided in the present application. As shown in FIG. 1, the method includes:

step 110, acquiring a three-dimensional point cloud and a two-dimensional image containing a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera.

Specifically, the sensors in the present application include a vehicle-mounted laser radar and a vehicle-mounted camera, which can be jointly calibrated using circular calibration holes. Compared with calibration holes of other geometric shapes, circular calibration holes have distinct shape features and are easy to extract. To ensure a good calibration effect, a certain number of circular calibration holes can be used for the joint calibration.

Before the joint calibration is carried out, a plurality of circular calibration holes are arranged within the same field of view in the advancing direction of the vehicle; a three-dimensional point cloud containing the circular calibration holes is acquired by the vehicle-mounted laser radar mounted on the vehicle, and a two-dimensional image containing the circular calibration holes is acquired by the vehicle-mounted camera mounted on the vehicle.

The size of the circular calibration hole can be set according to actual needs, for example, the diameter of the circular calibration hole can be set to 15 cm.

Step 120, determining a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image.

Specifically, feature extraction is performed on the acquired three-dimensional point cloud to obtain the point cloud corresponding to each circular calibration hole, and the circle center of each circular calibration hole and its three-dimensional coordinates are then obtained by fitting. For example, feature extraction of the three-dimensional point cloud may be implemented based on PCL (Point Cloud Library) to obtain the circle center and three-dimensional coordinates of each circular calibration hole.

Circle detection is performed on the acquired two-dimensional image to obtain the circular calibration holes in the image, together with the circle center and two-dimensional coordinates of each hole. For example, at least one of Hough circle detection, contour tracking, and least-squares fitting may be applied to the two-dimensional image to obtain the circular calibration holes.

And step 130, determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on the three-dimensional coordinates and the two-dimensional coordinates of the circle center of each circular calibration hole.

Specifically, the purpose of the joint calibration in the present application is to obtain a coordinate transformation relationship between a point cloud coordinate system in which a three-dimensional point cloud is located and an image coordinate system in which a two-dimensional image is located. For example, the point cloud coordinate system in which the three-dimensional point cloud is located may be a spatial rectangular coordinate system established with the location of the vehicle-mounted laser radar as an origin. The image coordinate system of the two-dimensional image can be a plane rectangular coordinate system established by taking the position of the vehicle-mounted camera as an origin.

The coordinate transformation relationship between the point cloud coordinate system of the three-dimensional point cloud and the image coordinate system of the two-dimensional image can be represented by a rotation matrix and a translation vector. For example, the rotation matrix may be a 3 × 3 matrix, and the translation vector a 3 × 1 vector. The rotation matrix represents the orientation of the coordinate axes of the point cloud coordinate system relative to those of the image coordinate system, and the translation vector represents the position of the origin of the point cloud coordinate system in the image coordinate system. Together, the rotation matrix and the translation vector describe how a point in the point cloud coordinate system is converted into the corresponding pixel in the image coordinate system.
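For concreteness, this relationship can be written in the standard pinhole-camera form (the notation below is ours rather than the patent's, and it assumes the camera intrinsic matrix $K$ is known from a prior intrinsic calibration):

$$ s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \left( R\,P + t \right), \qquad R \in \mathbb{R}^{3 \times 3},\ t \in \mathbb{R}^{3 \times 1}, $$

where $P$ is a circle center expressed in the point cloud coordinate system, $(u, v)$ is the corresponding pixel in the image coordinate system, and $s$ is a depth scale factor.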

According to the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, a PnP (Perspective-n-Point) algorithm is used to solve for the rotation matrix and the translation vector, yielding the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located.

According to the sensor joint calibration method described above, a three-dimensional point cloud and a two-dimensional image containing a plurality of circular calibration holes are acquired, and the coordinate transformation relationship between the point cloud coordinate system of the three-dimensional point cloud and the image coordinate system of the two-dimensional image is determined from the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, so that the vehicle-mounted laser radar and the vehicle-mounted camera are jointly calibrated.

According to any of the above embodiments, the plurality of circular calibration holes are distributed on the at least two calibration plates.

Specifically, a plurality of circular calibration holes can be arranged on the calibration plate, and the calibration plate is placed in the visual fields of the vehicle-mounted laser radar and the vehicle-mounted camera, so that the three-dimensional point cloud and the two-dimensional image containing the plurality of circular calibration holes can be conveniently acquired. The size and shape of the calibration plate can be set as required, for example, the calibration plate can be set to be rectangular, with a length of 1.5 meters and a width of 1 meter.

The circular calibration holes are arranged on the calibration plate, and circle detection in the three-dimensional point cloud and the two-dimensional image can exploit the known positional relationship among the holes on the plate. For example, if 4 circular calibration holes are arranged on 1 calibration plate at the 4 vertices of a square with a side length of 24 centimeters, i.e., the distance between the centers of adjacent holes is 24 centimeters, the corresponding regions can be found quickly in the three-dimensional point cloud and the two-dimensional image by using this positional relationship during circle detection, as sketched below.
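As a toy illustration of that idea, the following Python sketch (a hypothetical helper, not code from the patent) checks whether four detected hole centers, for example fitted in point cloud coordinates in meters, match the expected 24 cm square pattern by comparing their sorted pairwise distances:

```python
import numpy as np
from itertools import combinations

def matches_square_pattern(centers, side=0.24, tol=0.01):
    """Return True if 4 centers form a square with the given side length.

    A square has 6 pairwise distances: four sides of length `side` and two
    diagonals of length side * sqrt(2); `tol` (meters) absorbs fitting noise.
    """
    if len(centers) != 4:
        return False
    dists = sorted(float(np.linalg.norm(np.subtract(a, b)))
                   for a, b in combinations(centers, 2))
    expected = [side] * 4 + [side * np.sqrt(2)] * 2
    return all(abs(d - e) < tol for d, e in zip(dists, expected))
```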

The plurality of circular calibration holes may be distributed over at least two calibration plates, so that the positions of the holes can be varied by arranging the plates separately. The number and positions of the circular calibration holes on each calibration plate can be set according to the actual situation. For example, 12 circular calibration holes may be distributed over 3 calibration plates with the holes on each plate arranged in a square, or 12 circular calibration holes may be distributed over 4 calibration plates with the holes on each plate arranged in an equilateral triangle.

Based on any of the embodiments described above, the at least two calibration plates are arranged at different positions in the field of view of the onboard camera.

Specifically, the calibration plate may be arranged at different positions in the field of view of the onboard camera before acquiring the three-dimensional point cloud and the two-dimensional image containing a plurality of circular calibration holes.

Here, the vehicle-mounted laser radar is generally mounted on the roof of the vehicle and performs distance detection of the surrounding environment by sweeping a plurality of laser pulses through 360 degrees around its axis. The vehicle-mounted camera is generally fixedly mounted and acquires images in the forward direction of the vehicle. During joint calibration, the field of view of the vehicle-mounted camera is therefore also the common field of view of the vehicle-mounted camera and the vehicle-mounted laser radar.

For example, when there are 2 calibration plates, they may be placed on the left and right sides of the shared field of view; when there are 4, they may be placed at the upper left, lower left, upper right and lower right of the shared field of view.

According to the sensor joint calibration method described above, the calibration plates are arranged separately so that the circular calibration holes are dispersed at different positions within the common field of view corresponding to the three-dimensional point cloud and the two-dimensional image. This prevents the holes from being concentrated in a single area of the fields of view of the vehicle-mounted laser radar and the vehicle-mounted camera, realizes dispersed measurement, and reduces error.

Based on any of the above embodiments, FIG. 2 is a schematic flow chart of the three-dimensional coordinate determination method provided in the present application. As shown in FIG. 2, step 120 includes:

Step 1201, determining a point cloud plane where each calibration plate is located.

Specifically, a Random Sample Consensus (RANSAC) algorithm, a Singular Value Decomposition (SVD) algorithm, or the like may be adopted to extract feature points from the three-dimensional point cloud corresponding to each calibration plate and compute the plane in which each calibration plate lies.

For example, determining the plane of each calibration plate with the RANSAC algorithm includes the following steps (a code sketch follows the list):

(1) determining an approximate area of the calibration plate in the three-dimensional point cloud, and randomly selecting 3 points from the approximate area to form a fitting plane;

(2) setting a preset fitting threshold and calculating the distance from every other point to the fitting plane; if the distance exceeds the threshold, the point is judged to be invalid data; if the distance is less than the threshold, the point is judged to be valid data and is considered to lie in the fitting plane;

(3) if the number of points on the fitting plane exceeds a preset number, storing the fitting plane and marking the points in it as matched;

(4) taking the fitted plane with the most matched points as the plane of the calibration plate.
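A minimal NumPy sketch of steps (1)-(4) might look as follows; the threshold and iteration values are illustrative placeholders rather than values from the patent:

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.02, min_inliers=100, iters=500, seed=None):
    """Fit a plane n.x + d = 0 to an Nx3 point cloud following steps (1)-(4)."""
    rng = np.random.default_rng(seed)
    best_plane, best_inliers = None, np.array([], dtype=int)
    for _ in range(iters):
        # (1) randomly pick 3 points and form a candidate fitting plane
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                  # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ p0
        # (2) distance from every point to the candidate plane
        dist = np.abs(points @ n + d)
        inliers = np.flatnonzero(dist < dist_thresh)
        # (3)/(4) keep the plane supported by the most matched points
        if len(inliers) >= min_inliers and len(inliers) > len(best_inliers):
            best_plane, best_inliers = (n, d), inliers
    return best_plane, best_inliers
```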

Step 1202, determining point clouds corresponding to the circular calibration holes on each calibration plate based on the point cloud plane where each calibration plate is located.

Specifically, according to the shape characteristics of the calibration plates, the point cloud plane where each calibration plate lies is segmented, invalid point clouds outside the calibration plates are removed, and the point clouds corresponding to the circular calibration holes on each calibration plate are determined.

Step 1203, performing least square fitting on the point cloud corresponding to the circular calibration holes on each calibration plate, and determining the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud.

Specifically, a least-squares method is adopted to fit the point clouds corresponding to the circular calibration holes on each calibration plate, obtaining parameters such as the three-dimensional coordinates of the circle center and the radius of each circular calibration hole in the three-dimensional point cloud.
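One concrete way to do this, sketched below, is to express the hole-boundary points in a 2D basis of the fitted plane and apply an algebraic (Kåsa) least-squares circle fit; this particular technique is our choice rather than the patent's, and the plane parameters n and d are assumed to come from the RANSAC step above:

```python
import numpy as np

def fit_circle_on_plane(points, n, d):
    """Least-squares circle fit for Nx3 points lying near the plane n.x + d = 0.

    `n` is the unit plane normal. Returns the 3D circle center and radius.
    """
    # Build an orthonormal basis (u, v) spanning the plane.
    u = np.cross(n, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-6:          # n happened to be parallel to x-axis
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    origin = -d * np.asarray(n)           # a point on the plane
    # Express the points in 2D plane coordinates.
    q = points - origin
    xy = np.stack([q @ u, q @ v], axis=1)
    # Algebraic fit: x^2 + y^2 + a*x + b*y + c = 0 is linear in (a, b, c).
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
    rhs = -(xy[:, 0] ** 2 + xy[:, 1] ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    radius = np.sqrt(cx ** 2 + cy ** 2 - c)
    center_3d = origin + cx * u + cy * v  # map the 2D center back to 3D
    return center_3d, radius
```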

According to the sensor joint calibration method described above, the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud are determined from the point cloud plane where each calibration plate lies. The procedure is simple and easy to implement, reducing the difficulty of sensor joint calibration.

Based on any of the above embodiments, step 120 further includes:

and performing circle detection on the two-dimensional image based on Hough transform, and determining the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image.

Specifically, the principle of the Hough transform is to use the global features of an image to connect edge pixels into closed region boundaries, converting the image space into a parameter space in which points are described so as to detect the edges of the image. Statistics are accumulated over all points that could lie on an edge, and the degree to which a point belongs to an edge is determined from the accumulated result. In essence, the Hough transform converts image coordinates into parameter coordinates, making the transformed result easier to identify and detect.

Performing circle detection on the two-dimensional image with the Hough transform includes the following steps (an OpenCV-based sketch follows the list):

(1) converting the pixel points of the two-dimensional image from two-dimensional spatial coordinates to a polar-coordinate space;

(2) normalizing the intensity of each pixel point in the polar-coordinate space so that the intensity range is 0-255;

(3) searching the pixel points of the two-dimensional image for those whose polar radius equals the input parameter, where the input parameter is the radius of the circular calibration hole;

(4) marking the found pixel points of the two-dimensional image;

(5) returning all the found pixel points of the two-dimensional image.
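In practice this kind of circle detection is often realized with OpenCV's Hough-gradient detector. The sketch below is one hedged example; the file name and all parameter values are placeholders that would need tuning for a given camera, distance and hole size:

```python
import cv2
import numpy as np

img = cv2.imread("calibration_view.png")           # hypothetical image file
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)                     # suppress noise before voting

# minDist keeps detections apart; minRadius/maxRadius bracket the expected
# hole radius in pixels for the camera setup at hand.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=100, param2=30, minRadius=20, maxRadius=80)

centers_2d = []
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        centers_2d.append((float(x), float(y)))        # pixel center coordinates
        cv2.circle(img, (int(x), int(y)), int(r),
                   (0, 255, 0), 2)                      # visualize the detection
```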

According to the sensor joint calibration method described above, circle detection is performed on the two-dimensional image with the Hough transform, which is simple and easy to implement.

Based on the above embodiment, the number of the circular calibration holes is not less than 9.

Specifically, each circle center of a circular calibration hole provides one point for solving the rotation matrix and the translation vector. To ensure a good calibration effect, a certain number of circular calibration holes can be used for the joint calibration. For a 3 × 3 rotation matrix and a 3 × 1 translation vector, at least 9 points may be selected, i.e., the number of circular calibration holes is not less than 9. For example, the number of circular calibration holes may be set to 12, so that the centers of 12 holes are used for the sensor joint calibration, improving the calibration accuracy.

Based on any of the above embodiments, step 130 includes:

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on at least one of an EPnP algorithm, a UPnP algorithm, a DLT algorithm and a P3P algorithm, and the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

Specifically, according to the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, a PnP (Perspective-n-Point) algorithm may be used to determine the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located.

There are many solutions to the PnP problem, such as the EPnP algorithm, UPnP algorithm, DLT algorithm, and P3P algorithm.
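For illustration, OpenCV exposes several of these solvers through cv2.solvePnP. The minimal sketch below uses the EPnP flag; the function name is ours, and the matched center arrays, the intrinsic matrix K and the distortion coefficients are assumed to be available from the previous steps and from a prior intrinsic calibration:

```python
import cv2
import numpy as np

def solve_extrinsics(centers_3d, centers_2d, K, dist_coeffs):
    """Solve the R, t mapping lidar-frame points into the camera frame via EPnP.

    centers_3d: Nx3 hole centers in the point cloud coordinate system.
    centers_2d: Nx2 matching pixel coordinates, in the same order.
    K, dist_coeffs: camera intrinsics from a prior intrinsic calibration.
    """
    obj = np.asarray(centers_3d, dtype=np.float64).reshape(-1, 1, 3)
    img = np.asarray(centers_2d, dtype=np.float64).reshape(-1, 1, 2)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, dist_coeffs,
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation matrix from the rotation vector
    return R, tvec               # tvec is the 3x1 translation vector
```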

According to the sensor joint calibration method described above, the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located is solved with a PnP algorithm, which is simple, easy to implement, and of low time complexity.

Based on any one of the embodiments, the application provides a sensor joint calibration method, which is used for calibrating a vehicle-mounted laser radar and a vehicle-mounted camera.

FIG. 3 is a schematic structural diagram of the calibration plate provided in the present application. As shown in FIG. 3, the calibration plate is rectangular, with a length L of 1.5 m and a width W of 1 m. The plate contains 4 circular calibration holes of the same size with a diameter D1 of 15 cm; the 4 holes are distributed at the 4 vertices of a square with a side length D2 of 24 cm, i.e., the distance between the centers of adjacent holes is 24 cm.

Firstly, 3 calibration plates are arranged at different positions within the same field of view in the advancing direction of the vehicle; a three-dimensional point cloud containing the 12 circular calibration holes is acquired by the vehicle-mounted laser radar mounted on the vehicle, and a two-dimensional image containing the 12 circular calibration holes is acquired by the vehicle-mounted camera mounted on the vehicle.

Secondly, the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud are determined, and the two-dimensional coordinates of the circle center of each circular calibration hole in the two-dimensional image are determined.

Thirdly, based on the EPnP algorithm and the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located is determined; this relationship can be expressed as a 3 × 3 rotation matrix and a 3 × 1 translation vector.

The sensor joint calibration device provided by the present application is described below, and the sensor joint calibration device described below and the sensor joint calibration method described above may be referred to correspondingly.

Based on any of the above embodiments, FIG. 4 is a schematic structural diagram of a sensor joint calibration device provided in the present application. As shown in FIG. 4, the device includes:

an acquiring unit 410, configured to acquire a three-dimensional point cloud and a two-dimensional image that contain a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera;

a determining unit 420, configured to determine a three-dimensional coordinate of the center of each circular calibration hole in the three-dimensional point cloud, and determine a two-dimensional coordinate of the center of each circular calibration hole in the two-dimensional image;

the calibration unit 430 is configured to determine a coordinate transformation relationship between a point cloud coordinate system in which the three-dimensional point cloud is located and an image coordinate system in which the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the center of each circular calibration hole.

Specifically, the acquiring unit 410 is configured to acquire a three-dimensional point cloud and a two-dimensional image containing a plurality of circular calibration holes, and the determining unit 420 is configured to determine the three-dimensional coordinates of the circle center of each circular calibration hole in the three-dimensional point cloud and the two-dimensional coordinates in the two-dimensional image. The calibration unit 430 is configured to determine the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located.

The sensor joint calibration device provided by the application acquires a three-dimensional point cloud and a two-dimensional image containing a plurality of circular calibration holes and, from the three-dimensional and two-dimensional coordinates of the circle center of each circular calibration hole, determines the coordinate transformation relationship between the point cloud coordinate system where the three-dimensional point cloud is located and the image coordinate system where the two-dimensional image is located, thereby jointly calibrating the vehicle-mounted laser radar and the vehicle-mounted camera. Because the circular calibration holes have distinct shape features and are easy to detect, the calibration difficulty is reduced; and because a plurality of circular calibration holes are used, the accuracy of the calibration result is improved.

According to any of the above embodiments, the plurality of circular calibration holes are distributed on the at least two calibration plates.

Based on any of the embodiments described above, the at least two calibration plates are arranged at different positions in the field of view of the onboard camera.

Based on any of the above embodiments, FIG. 5 is a schematic structural diagram of the three-dimensional coordinate determination subunit provided in this application. As shown in FIG. 5, the determining unit 420 includes a three-dimensional coordinate determination subunit 4201, and the three-dimensional coordinate determination subunit 4201 includes:

a plane determining module 42011, configured to determine a point cloud plane where each calibration board is located;

a point cloud determining module 42012, configured to determine, based on a point cloud plane where each calibration plate is located, a point cloud corresponding to a circular calibration hole on each calibration plate;

the three-dimensional coordinate determination module 42013 is configured to perform least square fitting on the point cloud corresponding to the circular calibration hole on each calibration board, and determine a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud.

Based on any of the above embodiments, the determining unit 420 further includes:

and the two-dimensional coordinate determining subunit is used for performing circle detection on the two-dimensional image based on Hough transform and determining the two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image.

Based on any one of the above embodiments, the number of the circular calibration holes is not less than 9.

Based on any of the above embodiments, the calibration unit 430 is specifically configured to:

and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on at least one of an EPnP algorithm, a UPnP algorithm, a DLT algorithm and a P3P algorithm, and the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

The sensor joint calibration device provided in the embodiments of the present application is used for executing the sensor joint calibration method described above; its specific implementation is consistent with that of the method and achieves the same beneficial effects, so it is not described here again.

Based on any of the above embodiments, FIG. 6 is a schematic structural diagram of an electronic device provided in the present application. As shown in FIG. 6, the electronic device may include: a processor 610, a communications interface 620, a memory 630 and a communications bus 640, where the processor 610, the communications interface 620 and the memory 630 communicate with one another through the communications bus 640. The processor 610 may call logic instructions in the memory 630 to execute the method provided by the above embodiments, the method including:

acquiring a three-dimensional point cloud and a two-dimensional image which comprise a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera; determining a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image; and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

In addition, the logic instructions in the memory 630 may be implemented in the form of software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.

The processor in the electronic device provided in the embodiments of the present application may call the logic instructions in the memory to implement the sensor joint calibration method; the specific implementation is consistent with that of the method and achieves the same beneficial effects, so it is not described here again.

The present application further provides a non-transitory computer-readable storage medium, which is described below, and the non-transitory computer-readable storage medium described below and the above-described sensor joint calibration method can be referred to correspondingly.

Embodiments of the present application provide a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the method provided in the foregoing embodiments, the method including:

acquiring a three-dimensional point cloud and a two-dimensional image which comprise a plurality of circular calibration holes; the three-dimensional point cloud is acquired based on a vehicle-mounted laser radar, and the two-dimensional image is acquired based on a vehicle-mounted camera; determining a three-dimensional coordinate of the circle center of each circular calibration hole in the three-dimensional point cloud, and determining a two-dimensional coordinate of the circle center of each circular calibration hole in the two-dimensional image; and determining a coordinate transformation relation between a point cloud coordinate system where the three-dimensional point cloud is located and an image coordinate system where the two-dimensional image is located based on the three-dimensional coordinate and the two-dimensional coordinate of the circle center of each circular calibration hole.

When the computer program stored on the non-transitory computer-readable storage medium provided in the embodiments of the present application is executed, the sensor joint calibration method described above is implemented; the specific implementation is consistent with that of the method and achieves the same beneficial effects, so it is not described here again.

The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.

Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly also by hardware. With this understanding, the above technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, and which includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute the method according to the embodiments or parts thereof.

Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
