Combined calibration method and device for camera and laser radar and storage medium

Document No.: 1405168    Publication date: 2020-03-06

Reading note: this technique, a combined calibration method and device for a camera and a laser radar and a storage medium (摄像机与激光雷达的联合标定方法、装置及存储介质), was designed and created by 唐得志, 赛影辉, 阴山慧, 肖飞, 陈开祥 and 李垚 on 2019-11-28. Its main content is as follows: the application discloses a combined calibration method and device for a camera and a laser radar and a storage medium, belonging to the technical field of computer vision. The method comprises the following steps: determining the spatial coordinates of N feature points of a target calibration target according to the laser point cloud of the target calibration target scanned by a laser radar, where N is determined by the shape of the target calibration target; detecting the plane coordinates of the N feature points through a camera; and determining attitude information of the laser radar and the camera according to the spatial coordinates and the plane coordinates of the N feature points, so as to realize the combined calibration of the camera and the laser radar. The spatial coordinates of the feature points are determined through the laser point cloud of the target calibration target obtained from the laser radar, the plane coordinates of the feature points in the image are detected through the camera, and the attitude information of the laser radar and the camera is determined through the spatial coordinates and the plane coordinates; the calibration of the camera and the laser radar can be completed in a single pass, which improves the efficiency and accuracy of the calibration.

1. A combined calibration method for a camera and a laser radar is characterized by comprising the following steps:

according to laser point cloud of a target calibration target scanned by a laser radar, determining the spatial coordinates of N characteristic points of the target calibration target, wherein N is determined by the shape of the target calibration target;

detecting the plane coordinates of the N characteristic points through a camera;

and determining attitude information of the laser radar and the camera according to the space coordinates of the N characteristic points and the plane coordinates of the N characteristic points so as to realize the combined calibration of the camera and the laser radar.

2. The method of claim 1, wherein before determining the spatial coordinates of the N feature points of the target calibration target from the laser point cloud of the target calibration target scanned by the laser radar, further comprising:

acquiring laser point cloud of the laser radar;

and according to the laser point cloud, carrying out levelness calibration on the laser radar.

3. The method of claim 2, wherein the carrying out levelness calibration on the laser radar according to the laser point cloud comprises:

projecting the laser point cloud to obtain a top view of the laser point cloud;

in the top view of the laser point cloud, determining a cross reticle with the position of the laser radar as its center, wherein the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

in the top view of the laser point cloud, when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, determining that the laser radar is level with the road surface on which it is located, so as to complete the levelness calibration of the laser radar.

4. The method of claim 1, wherein the determining the spatial coordinates of the N feature points of the target calibration target from the laser point cloud of the target calibration target scanned by the laser radar comprises:

clustering laser point clouds into a plurality of scanning line segments according to the distance between each laser radar point in the laser point clouds along the laser scanning direction and the change of the laser radar point direction;

clustering the scanning line segments into object segments to obtain a laser radar point set of a target plane of the target calibration target;

according to the point cloud intensity value and the intensity threshold value of the laser point cloud, eliminating interference points from the laser radar point set;

determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud;

and determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

5. The method of claim 4, wherein determining N boundary points of a preset area in the target calibration target based on the difference of the reflection intensity of each lidar point in the laser point cloud comprises:

for a laser radar subset of an ith scanning laser in a plurality of scanning lasers, determining whether a reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1;

and when the reflection intensity difference value between any two adjacent laser radar points is greater than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

6. A camera and lidar combined calibration apparatus, the apparatus comprising:

the first determination module is used for determining the spatial coordinates of N characteristic points of a target calibration target according to the laser point cloud of the target calibration target scanned by a laser radar, wherein N is determined by the shape of the target calibration target;

the detection module is used for detecting the plane coordinates of the N characteristic points through a camera;

and the second determining module is used for determining the attitude information of the laser radar and the camera according to the space coordinates of the N characteristic points and the plane coordinates of the N characteristic points so as to realize the combined calibration of the camera and the laser radar.

7. The apparatus of claim 6, wherein the apparatus further comprises:

the acquisition module is used for acquiring laser point cloud of the laser radar;

and the calibration module is used for carrying out levelness calibration on the laser radar according to the laser point cloud.

8. The apparatus of claim 7, wherein the calibration module comprises:

the projection submodule is used for projecting the laser point cloud to obtain a top view of the laser point cloud;

the first determining submodule is used for determining a cross reticle in the top view of the laser point cloud with the position of the laser radar as its center, wherein the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

and the second determining submodule is used for determining, in the top view of the laser point cloud, that the laser radar is level with the road surface when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, so as to complete the levelness calibration of the laser radar.

9. The apparatus of claim 6, wherein the first determining module comprises:

the first clustering submodule is used for clustering the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction;

the second clustering submodule is used for clustering the scanning line segments into object segments to obtain a laser radar point set of a target plane of the target calibration target;

the eliminating sub-module is used for eliminating interference points from the laser radar point set according to the point cloud intensity value and the intensity threshold value of the laser point cloud;

the third determining submodule is used for determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud;

and the fourth determining submodule is used for determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

10. A computer-readable storage medium, characterized in that the storage medium has stored therein a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.

Technical Field

The present application relates to the field of computer vision technologies, and in particular, to a method and an apparatus for jointly calibrating a camera and a laser radar, and a storage medium.

Background

With the development of unmanned vehicles, lidar and cameras have been applied to the field of unmanned vehicles as an important part of environmental perception. The objects such as pedestrians and vehicles in the image can be detected through the laser radar and the camera, and the laser radar and the camera are usually calibrated in advance to ensure the detection accuracy.

Currently, in calibration, the vertices of a polygonal plate, which may be a chessboard or a triangular plate, may be found in the point cloud obtained by a laser radar and the image captured by a camera, and the vertices are estimated by constructing a convex hull of the point cloud extracted from the plate to achieve calibration. However, since the vertices only capture geometric features of the outer contour of the polygonal plate, and these features are formed by the discontinuity of the scanning lines, the point cloud is insufficiently utilized and the calibration is inaccurate.

Disclosure of Invention

The application provides a camera and laser radar combined calibration method, device and storage medium, which can solve the problem of inaccurate calibration in the related technology. The technical scheme is as follows:

in one aspect, a joint calibration method for a camera and a laser radar is provided, where the method includes:

according to laser point cloud of a target calibration target scanned by a laser radar, determining the spatial coordinates of N characteristic points of the target calibration target, wherein N is determined by the shape of the target calibration target;

detecting the plane coordinates of the N characteristic points through a camera;

and determining attitude information of the laser radar and the camera according to the space coordinates of the N characteristic points and the plane coordinates of the N characteristic points so as to realize the combined calibration of the camera and the laser radar.

In some embodiments, before determining the spatial coordinates of the N feature points of the target calibration target according to the laser point cloud of the target calibration target scanned by the laser radar, the method further includes:

acquiring laser point cloud of the laser radar;

and according to the laser point cloud, carrying out levelness calibration on the laser radar.

In some embodiments, the carrying out levelness calibration on the laser radar according to the laser point cloud includes:

projecting the laser point cloud to obtain a top view of the laser point cloud;

in the top view of the laser point cloud, determining a cross reticle with the position of the laser radar as its center, wherein the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

in the top view of the laser point cloud, when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, determining that the laser radar is level with the road surface on which it is located, so as to complete the levelness calibration of the laser radar.

In some embodiments, the determining the spatial coordinates of the N feature points of the target calibration target according to the laser point cloud of the target calibration target scanned by the laser radar includes:

clustering laser point clouds into a plurality of scanning line segments according to the distance between each laser radar point in the laser point clouds along the laser scanning direction and the change of the laser radar point direction;

clustering the scanning line segments into object segments to obtain a laser radar point set of a target plane of the target calibration target;

according to the point cloud intensity value and the intensity threshold value of the laser point cloud, eliminating interference points from the laser radar point set;

determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud;

and determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

In some embodiments, the determining N boundary points of a preset region in the target calibration target according to the reflection intensity difference of each lidar point in the laser point cloud includes:

for a laser radar subset of an ith scanning laser in a plurality of scanning lasers, determining whether a reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1;

and when the reflection intensity difference value between any two adjacent laser radar points is greater than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

In another aspect, a combined calibration apparatus for a camera and a lidar is provided, where the apparatus includes:

the first determination module is used for determining the spatial coordinates of N characteristic points of a target calibration target according to the laser point cloud of the target calibration target scanned by a laser radar, wherein N is determined by the shape of the target calibration target;

the detection module is used for detecting the plane coordinates of the N characteristic points through a camera;

and the second determining module is used for determining the attitude information of the laser radar and the camera according to the space coordinates of the N characteristic points and the plane coordinates of the N characteristic points so as to realize the combined calibration of the camera and the laser radar.

In some embodiments, the apparatus further comprises:

the acquisition module is used for acquiring laser point cloud of the laser radar;

and the calibration module is used for carrying out levelness calibration on the laser radar according to the laser point cloud.

In some embodiments, the calibration module comprises:

the projection submodule is used for projecting the laser point cloud to obtain a top view of the laser point cloud;

the first determining submodule is used for determining a cross reticle in the top view of the laser point cloud with the position of the laser radar as its center, wherein the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

and the second determining submodule is used for determining, in the top view of the laser point cloud, that the laser radar is level with the road surface where it is located when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, so as to complete the levelness calibration of the laser radar.

In some embodiments, the first determining module comprises:

the first clustering submodule is used for clustering the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction;

the second clustering submodule is used for clustering the scanning line segments into object segments to obtain a laser radar point set of a target plane of the target calibration target;

the eliminating sub-module is used for eliminating interference points from the laser radar point set according to the point cloud intensity value and the intensity threshold value of the laser point cloud;

the third determining submodule is used for determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud;

and the fourth determining submodule is used for determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

In some embodiments, the third determination submodule is to:

for a laser radar subset of an ith scanning laser in a plurality of scanning lasers, determining whether a reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1;

and when the reflection intensity difference value between any two adjacent laser radar points is greater than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

In another aspect, a terminal is provided, where the terminal includes a memory and a processor, the memory is used to store a computer program, and the processor is used to execute the computer program stored in the memory, so as to implement the steps of the camera and lidar joint calibration method described above.

In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when being executed by a processor, implements the steps of the camera and lidar joint calibration method described above.

In another aspect, a computer program product comprising instructions is provided, which when run on a computer, causes the computer to perform the steps of the above-described method for joint calibration of a camera and a lidar.

The technical scheme provided by the application can at least bring the following beneficial effects:

in the application, the spatial coordinates of the feature points can be determined through the laser point cloud of the target calibration target obtained from the laser radar, the plane coordinates of the feature points in the image are detected through the camera, and the attitude information of the laser radar and the camera is determined through the spatial coordinates and the plane coordinates. The calibration of the camera and the laser radar can be completed in a single pass, which improves calibration efficiency and accuracy.

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.

FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;

fig. 2 is a flowchart of a method for jointly calibrating a camera and a lidar according to an embodiment of the present disclosure;

FIG. 3 is a flowchart of another method for jointly calibrating a camera and a lidar according to an embodiment of the present disclosure;

FIG. 4 is a schematic diagram of a target calibration target according to an embodiment of the present disclosure;

FIG. 5 is a schematic structural diagram of a combined calibration device for a camera and a lidar, provided by an embodiment of the present application;

FIG. 6 is a schematic structural diagram of another combined calibration device for a camera and a lidar, provided by an embodiment of the present application;

FIG. 7 is a schematic structural diagram of a calibration module according to an embodiment of the present disclosure;

FIG. 8 is a schematic structural diagram of a first determining module provided in an embodiment of the present application;

fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application.

Detailed Description

To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.

Before explaining the combined calibration method of the camera and the laser radar provided by the embodiment in detail, an application scenario and an implementation environment provided by the embodiment of the present application are introduced.

First, an application scenario related to the embodiment of the present application is described.

In order to realize unmanned driving, environment sensing is usually performed through a laser radar and a camera, and in order to ensure the accuracy of the environment sensing, the laser radar and the camera need to be calibrated in advance.

Currently, in calibration, the vertices of a polygonal plate, which may be a chessboard or a triangular plate, may be found in the point cloud obtained by a laser radar and the image captured by a camera, and the vertices are estimated by constructing convex hulls of the point cloud of an extraction plate to achieve calibration. However, since the vertex only includes geometric features of the outer contour of the polygonal plate, which are formed by the discontinuity of the scanning line, and the internal information is not used, the utilization of the point cloud is insufficient, and the calibration is inaccurate.

Based on the scene, the embodiment of the application provides a camera and laser radar combined calibration method for improving the sufficiency and calibration accuracy of point cloud utilization.

Next, a system architecture according to an embodiment of the present application will be described.

Referring to FIG. 1, FIG. 1 is a schematic diagram illustrating an implementation environment in accordance with an example embodiment. The implementation environment includes at least one terminal 101, and the terminal 101 may be a terminal communicatively connected to the automobile 102 or an in-vehicle terminal installed in the automobile 102. The communication connection may be a wired or wireless connection, which is not limited in this application.

The terminal 101 may be any electronic product capable of human-computer interaction with a user through one or more modes such as a keyboard, a touch pad, a touch screen, a remote controller, voice interaction, or handwriting equipment, for example a PC (Personal Computer), a mobile phone, a smartphone, a PDA (Personal Digital Assistant), a wearable device, a pocket PC, a tablet computer, a smart car, a smart television, a smart sound box, and the like.

The automobile 102 may be any automobile having a lidar and a camera mounted thereon.

Those skilled in the art will appreciate that the terminal 101 and the car 102 are only examples, and other existing or future terminals or cars may be suitable for the application and are included within the scope of the present application and are hereby incorporated by reference.

The combined calibration method for the camera and the lidar provided by the embodiments of the present application will be explained in detail with reference to the accompanying drawings.

Fig. 2 is a flowchart of a method for jointly calibrating a camera and a lidar according to an embodiment of the present disclosure, where the method is applied to a terminal. Referring to fig. 2, the method includes the following steps.

Step 201: according to the laser point cloud of a target calibration target scanned by a laser radar, determining the space coordinates of N characteristic points of the target calibration target, wherein N is determined by the shape of the target calibration target.

Step 202: The plane coordinates of the N feature points are detected by a camera.

Step 203: Attitude information of the laser radar and the camera is determined according to the spatial coordinates of the N feature points and the plane coordinates of the N feature points, so as to realize the combined calibration of the camera and the laser radar.

In the method, the spatial coordinates of the feature points are determined through the laser point cloud of the target calibration target obtained from the laser radar, the plane coordinates of the feature points in the image are detected through the camera, and the attitude information of the laser radar and the camera is determined through the spatial coordinates and the plane coordinates. The calibration of the camera and the laser radar can be completed in a single pass, which improves calibration efficiency and accuracy.

In some embodiments, before determining the spatial coordinates of the N feature points of the target calibration target according to the laser point cloud of the target calibration target scanned by the laser radar, the method further includes:

acquiring laser point cloud of the laser radar;

and according to the laser point cloud, carrying out levelness calibration on the laser radar.

In some embodiments, carrying out levelness calibration on the laser radar according to the laser point cloud comprises:

projecting the laser point cloud to obtain a top view of the laser point cloud;

in the top view of the laser point cloud, determining a cross reticle with the position of the laser radar as its center, wherein the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

in the top view of the laser point cloud, when any laser point cloud of the laser radar passes through the four vertexes of the cross line, the level of the laser radar and the road surface where the laser radar is located is determined, so that the level calibration of the laser radar is completed.

In some embodiments, determining the spatial coordinates of N feature points of a target calibration target according to a laser point cloud of the target calibration target scanned by a laser radar includes:

clustering the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction;

clustering the scanning line segments into object segments to obtain a laser radar point set of a target plane of the target calibration target;

according to the point cloud intensity value and the intensity threshold value of the laser point cloud, eliminating interference points from the laser radar point set;

determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud;

and determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

In some embodiments, determining N boundary points of a preset region in the target calibration target according to the reflection intensity difference of each lidar point in the laser point cloud includes:

for a laser radar subset of an ith scanning laser in a plurality of scanning lasers, determining whether a reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1;

and when the difference value of the reflection intensity between any two adjacent laser radar points is larger than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

All the above optional technical solutions can be combined arbitrarily to form an optional embodiment of the present application, and the present application embodiment is not described in detail again.

Fig. 3 is a flowchart of a method for calibrating a camera and a lidar according to an embodiment of the present disclosure, and referring to fig. 3, the method includes the following steps.

Step 301: and the terminal calibrates the levelness of the laser radar.

It should be noted that, when performing levelness calibration on the laser radar, the laser radar can be mounted on a tripod and the tripod fixed on the roof of the car. A relatively level road surface with few obstructions is selected, the laser radar is operated to obtain a laser point cloud, and the laser point cloud is projected onto a top view; a slide bar is provided to adjust the size of the laser radar top view, and the position of the laser radar is marked on the top view.

As an example, the terminal may perform levelness calibration on the laser radar through a cross intersection method. That is, the terminal can acquire the laser point cloud of the laser radar and, according to the laser point cloud, carry out levelness calibration on the laser radar.

As an example, the operation of the terminal performing levelness calibration on the laser radar according to the laser point cloud may be: projecting the laser point cloud to obtain a top view of the laser point cloud; in the top view of the laser point cloud, determining a cross reticle centered on the position of the laser radar and symmetrical about that center; and, in the top view of the laser point cloud, when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, determining that the laser radar is level with the road surface on which it is located, so as to complete the levelness calibration of the laser radar.
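To make the cross intersection method concrete, the following is a minimal sketch, assuming the crosshair half-length and the pass tolerance as illustrative parameters (the patent does not specify numeric values):

```python
import numpy as np

def passes_four_vertices(ring_xy: np.ndarray, half_len: float,
                         tol: float = 0.05) -> bool:
    """ring_xy: (N, 2) top-view points of one laser ring, with the lidar
    position at the origin; half_len: distance from center to each vertex."""
    vertices = np.array([[half_len, 0.0], [-half_len, 0.0],
                         [0.0, half_len], [0.0, -half_len]])
    # Distance from every ring point to every crosshair vertex.
    dists = np.linalg.norm(ring_xy[:, None, :] - vertices[None, :, :], axis=2)
    # Level if the ring passes within tol of all four vertices.
    return bool(np.all(dists.min(axis=0) < tol))
```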

In some embodiments, the terminal may calibrate the levelness of the laser radar in the above manner, or in other manners. For example, because the laser radar generates a ring of annular point cloud in the top view of the laser point cloud, if the annular point cloud is close to a circle, the laser radar is approximately level with the road surface on which it is located. Therefore, the terminal can determine the similarity between the shape of the annular point cloud and a circle; when the similarity is greater than a similarity threshold, it is determined that the laser radar is level with the road surface, completing the levelness calibration of the laser radar. When the similarity is less than or equal to the similarity threshold, it is determined that the laser radar is not level with the road surface; the position of the laser radar is then adjusted, and the terminal returns to the levelness calibration operation until the laser radar is level with the road surface.

It should be noted that the similarity threshold may be set in advance according to requirements, for example, the similarity threshold may be 90%, 95%, and so on.
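A minimal sketch of this ring-circularity variant is shown below; the similarity metric (one minus the normalized spread of the ring radii) is an assumption, since the patent does not specify how similarity to a circle is computed:

```python
import numpy as np

def ring_circularity(points_xy: np.ndarray) -> float:
    """points_xy: (N, 2) top-view coordinates of one annular point cloud,
    expressed relative to the lidar position at the origin."""
    radii = np.linalg.norm(points_xy, axis=1)
    # A perfect circle centered on the lidar has zero radius spread.
    return 1.0 - np.std(radii) / np.mean(radii)

def is_level(points_xy: np.ndarray, similarity_threshold: float = 0.95) -> bool:
    return ring_circularity(points_xy) > similarity_threshold
```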

Step 302: The terminal determines the spatial coordinates of N feature points of the target calibration target according to the laser point cloud of the target calibration target scanned by the laser radar, where N is determined by the shape of the target calibration target.

It should be noted that the target calibration target is a calibration target preset for implementing calibration, and the shape of the target calibration target may be a triangle, a hollowed-out triangle, a quadrangle, and the like; for example, the target calibration target may be the quadrangle shown in fig. 4.

In some embodiments, the terminal may know the shape of the target calibration target in advance to determine the value of N, or perform image recognition on the target calibration target through a camera mounted on the vehicle to determine the value of N.

As an example, the operation of determining, by the terminal, the spatial coordinates of N feature points of the target calibration target according to the laser point cloud of the laser radar may be: clustering the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction; clustering a plurality of scanning line segments into object segments to obtain a laser radar point set of a target plane of a target calibration target; according to the point cloud intensity value and the intensity threshold value of the laser point cloud, eliminating interference points from the laser radar point set; determining N boundary points of a preset area in the target calibration target according to the reflection intensity difference value of each laser radar point in the laser point cloud; and determining the space coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the space coordinates of the N characteristic points of the target calibration target.

It should be noted that the terminal may process the point cloud by a scanning line-based segmentation method, that is, the terminal clusters the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction, and then clusters the plurality of scanning line segments into an object segment to obtain a laser radar point set of the target plane of the target calibration target.

It should be noted that the distance between each lidar point may refer to a distance between a preset number of consecutive points, and the preset number may be set in advance, for example, the preset number may be 5, 6, and so on.
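As an illustration of this scan-line segmentation step, the following sketch splits one ordered scan line into segments wherever the gap between consecutive points or the change of direction exceeds a threshold; the threshold values are illustrative assumptions:

```python
import numpy as np

def split_scan_line(points: np.ndarray, gap_thresh: float = 0.1,
                    angle_thresh: float = np.deg2rad(20)) -> list:
    """points: (N, 3) lidar points of one scan line, ordered along the scan."""
    segments, current = [], [points[0]]
    for prev, cur in zip(points[:-1], points[1:]):
        step = cur - prev
        gap = np.linalg.norm(step)
        # Direction change relative to the previous step of the segment.
        if len(current) >= 2:
            prev_step = current[-1] - current[-2]
            cos_angle = np.dot(step, prev_step) / (
                np.linalg.norm(step) * np.linalg.norm(prev_step) + 1e-9)
            turn = np.arccos(np.clip(cos_angle, -1.0, 1.0))
        else:
            turn = 0.0
        if gap > gap_thresh or turn > angle_thresh:
            segments.append(np.array(current))  # close the current segment
            current = [cur]
        else:
            current.append(cur)
    segments.append(np.array(current))
    return segments
```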

As an example, the terminal can decompose all the three-dimensional points in the plurality of scanning line segments along three principal components (the three coordinate directions) by principal component analysis, and cluster them into object segments according to similarity, obtaining the laser radar point set $\{p^L\}$ of the target plane of the target calibration target, where $p$ represents a laser radar point and $L$ represents the laser radar.
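The principal component analysis step can be sketched as follows: for the flat calibration target, the smallest principal component of a candidate segment's covariance is close to zero (its direction is the plane normal). The flatness threshold is an illustrative assumption:

```python
import numpy as np

def plane_fit_pca(points: np.ndarray):
    """points: (N, 3). Returns (centroid, plane normal, flatness ratio)."""
    centroid = points.mean(axis=0)
    # Eigen-decomposition of the covariance gives the principal components,
    # with eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(np.cov((points - centroid).T))
    normal = eigvecs[:, 0]                  # direction of smallest variance
    flatness = eigvals[0] / eigvals.sum()   # near zero for a planar segment
    return centroid, normal, flatness

def is_target_plane(points: np.ndarray, flatness_thresh: float = 0.01) -> bool:
    return plane_fit_pca(points)[2] < flatness_thresh
```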

In some embodiments, since there may be interference points in the set of lidar points, which may interfere with the accuracy of the calibration, the terminal may determine the intensity threshold to exclude the interference points. The intensity threshold may be an intensity threshold region.

Because the target plane has only two colors, black and white, the terminal can obtain a point cloud intensity map of the target plane by using the correspondence between reflection intensity and pattern color. Then, taking the point cloud intensity value as the abscissa and the number of point cloud points corresponding to each intensity value as the ordinate, the terminal can find the peak abscissas $(r_l, r_h)$ of the point cloud counts on the left and right sides, and determine an intensity threshold region from these peak abscissas.

As an example, the terminal may determine the intensity threshold region $[\tau_l, \tau_h]$ from the peak abscissas by the first formula (1) (given only as an image in the original document), where $r_l$ is the intensity value at the left peak of the point cloud count (the low-intensity peak), $r_h$ is the intensity value at the right peak (the high-intensity peak), $\tau_l$ is the lowest intensity threshold, and $\tau_h$ is the highest intensity threshold.

As an example, the terminal may compare the intensity value of each point in the laser radar point set with the intensity threshold region to determine whether it is an interference point: when the intensity value $r$ of a point $p^L$ satisfies $\tau_l \le r \le \tau_h$, the point is determined to belong to the target plane; otherwise, it is an interference point. When an interference point is determined, the interference point is excluded.
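A minimal sketch of this intensity-based interference rejection follows. Locating the two histogram peaks and padding them by a margin to obtain $[\tau_l, \tau_h]$ is an assumption, since the patent's formula (1) is given only as an image:

```python
import numpy as np

def intensity_threshold_region(intensities: np.ndarray, bins: int = 64,
                               margin: float = 5.0):
    """Find the black and white intensity peaks and pad them by a margin."""
    counts, edges = np.histogram(intensities, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mid = bins // 2
    r_l = centers[:mid][np.argmax(counts[:mid])]   # left (black) peak
    r_h = centers[mid:][np.argmax(counts[mid:])]   # right (white) peak
    return r_l - margin, r_h + margin              # (tau_l, tau_h)

def remove_interference(points: np.ndarray, intensities: np.ndarray):
    tau_l, tau_h = intensity_threshold_region(intensities)
    keep = (intensities >= tau_l) & (intensities <= tau_h)
    return points[keep], intensities[keep]
```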

The N feature points are the intersection points of the boundary lines of the target calibration target, so the terminal needs to obtain the boundary lines; to this end, it can determine the inner and outer boundary points of the preset area in the target calibration target.

It should be noted that, when the target calibration target is the calibration target shown in fig. 4, the preset area may be a black area shown in fig. 4.

As an example, the operation of determining, by the terminal, N boundary points of a preset region in the target calibration target according to the reflection intensity difference of each lidar point in the laser point cloud may be: for a laser radar subset of the ith scanning laser in the multiple scanning lasers, determining whether the reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1; and when the reflection intensity difference value between any two adjacent laser radar points is greater than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

Because the laser radar scans with a plurality of scanning lasers, a plurality of point clouds are obtained for each scanning laser, so the laser radar point set comprises a plurality of laser radar subsets, and each laser radar subset corresponds to one scanning laser. The laser radar subset of the ith scanning laser may be written as $\{p^L_{i,j}\}$.

The terminal may determine whether the reflection intensity difference between any two adjacent laser radar points in the subset is greater than the difference threshold by the second formula; when the reflection intensity difference between two adjacent laser radar points is greater than the difference threshold, the boundary point is determined to lie between those two adjacent points, so the average coordinate value of the two adjacent laser radar points can be determined as the coordinate value of the boundary point:

$$\left| r^L_{i,j} - r^L_{i,j+1} \right| > \tau \qquad (2)$$

In the second formula (2), $r^L_{i,j}$ is the intensity value corresponding to the coordinate of the jth point, $r^L_{i,j+1}$ is the intensity value corresponding to the coordinate of the (j+1)th point, and $\tau$ is the difference threshold.

As an example, the terminal may determine the average coordinate value of two adjacent laser radar points by the following third formula:

$$m_{ik} = \frac{p^L_{i,j} + p^L_{i,j+1}}{2} \qquad (3)$$

In the third formula (3), $m_{ik}$ is the average coordinate value, $p^L_{i,j}$ is the coordinate of the jth point, and $p^L_{i,j+1}$ is the coordinate of the (j+1)th point.

In some embodiments, when the ith scanning laser intersects the preset area of the target calibration target, i.e., the black area, there may be two boundary points, in which case N is 2, or four boundary points, in which case N is 4. When there are two boundary points, the scanning laser crosses the upper or lower part of the target calibration target (above or below the hollowed-out area); when there are four boundary points, the scanning laser crosses the middle part of the target calibration target (the hollowed-out area). The terminal may define the boundary points of the black area detected by the laser radar as a vector: for example, when there are two boundary points, the defined vector may be $(m_{i1}, \times, \times, m_{i4})$, where $\times$ denotes absent; when there are four boundary points, the defined vector may be $(m_{i1}, m_{i2}, m_{i3}, m_{i4})$. That is, after writing the black area boundary points in vector form, the vectors may include $(m_{11}, \times, \times, m_{14}), (m_{21}, m_{22}, m_{23}, m_{24}), \ldots, (m_{i1}, \times, \times, m_{i4})$.

Note that the vectors $(m_{11}, m_{21}, \ldots, m_{i1})$ and $(m_{14}, m_{24}, \ldots, m_{i4})$ represent the left and right outer boundary line points respectively, and $(\times, m_{22}, \ldots, m_{(i-1)2}, \times)$ and $(\times, m_{23}, \ldots, m_{(i-1)3}, \times)$ (here $i > 1$) represent the left and right inner boundary line points respectively; from these, the terminal can determine the eight inner and outer boundary lines, thereby obtaining eight boundary line intersection points.

In some embodiments, the terminal may record the intersection points of the outer boundary lines of the black area in the laser radar coordinate system counterclockwise as $p_{L1}, p_{L2}, p_{L3}, p_{L4}$, and the intersection points of the inner boundary lines counterclockwise as $p'_{L1}, p'_{L2}, p'_{L3}, p'_{L4}$, obtaining eight three-dimensional feature points.
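One way to turn the boundary points into these feature points is sketched below: fit a line to each set of boundary points by total least squares (PCA), then take the point of closest approach of two 3D lines as their intersection. This is an illustrative reconstruction, not necessarily the patent's exact procedure:

```python
import numpy as np

def fit_line(points: np.ndarray):
    """points: (N, 3) boundary points. Returns (point on line, unit direction)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]  # first right singular vector = main direction

def intersect_lines(c1, d1, c2, d2):
    """Closest point between two 3D lines c + t * d; they nearly intersect
    because all boundary points lie on the target plane."""
    # Solve [d1, -d2] [t1, t2]^T = c2 - c1 in the least-squares sense.
    A = np.stack([d1, -d2], axis=1)
    t, *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
    p1, p2 = c1 + t[0] * d1, c2 + t[1] * d2
    return 0.5 * (p1 + p2)
```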

Step 303: and the terminal detects the plane coordinates of the N characteristic points through the camera.

It should be noted that the camera may obtain one frame of image including the target, pre-process the obtained image, and detect the plane coordinates of the N feature points from the pre-processed image.

As an example, the operation of the terminal to pre-process the image may be: carrying out graying processing on the image to obtain a grayscale image; and carrying out binarization on the gray level image to obtain a binary image.

As an example, the operation of the terminal detecting the plane coordinates of the N feature points from the preprocessed image may be: when the gray value of a pixel in the binary image is greater than the pixel threshold, the gray value of that pixel is set to the pixel threshold; otherwise, the gray value of the pixel is left unchanged. Then, the terminal can perform pixel-level corner detection in a 5 × 5 window according to the Harris corner detection algorithm to obtain pixel-level corner points, and carry out a weighted centering operation within the optimal window through the Förstner operator to obtain sub-pixel corner points, thereby determining the plane coordinates of the N feature points.
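A minimal OpenCV sketch of this corner-detection step follows. OpenCV has no built-in Förstner operator, so cv2.cornerSubPix is used here as a stand-in for the sub-pixel refinement; the threshold value, corner count, and quality parameters are assumptions:

```python
import cv2
import numpy as np

def detect_feature_points(image_bgr: np.ndarray, n_corners: int = 8):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Truncation thresholding: values above the threshold are clamped to it.
    _, trunc = cv2.threshold(gray, 128, 255, cv2.THRESH_TRUNC)
    # Pixel-level Harris corners (strongest n_corners responses).
    corners = cv2.goodFeaturesToTrack(trunc, maxCorners=n_corners,
                                      qualityLevel=0.01, minDistance=20,
                                      useHarrisDetector=True)
    corners = corners.astype(np.float32)
    # Sub-pixel refinement in a 5x5 neighborhood.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 1e-3)
    cv2.cornerSubPix(trunc, corners, (5, 5), (-1, -1), criteria)
    return corners.reshape(-1, 2)
```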

In some embodiments, the terminal may set the number of feature points to 8 and, after determining the plane coordinates of the N feature points, output accurate corner point coordinates. The terminal can record the feature points of the outer boundary lines of the black area in the camera coordinate system counterclockwise as $p_{C1}, p_{C2}, p_{C3}, p_{C4}$, and the feature points of the inner boundary lines counterclockwise as $p'_{C1}, p'_{C2}, p'_{C3}, p'_{C4}$, i.e., the plane coordinates of the eight feature points.

Step 304: and the terminal determines attitude information of the laser radar and the camera according to the space coordinates of the N characteristic points and the plane coordinates of the N characteristic points so as to realize the combined calibration of the camera and the laser radar.

As an example, the terminal may determine the absolute attitude information of the laser radar and the camera by a unified perspective-n-point method according to the spatial coordinates of the N feature points and the plane coordinates of the N feature points, so as to obtain the rotation matrix $R$ and the translation vector $t$ from the laser radar coordinate system to the camera coordinate system.
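This pose-recovery step can be sketched with OpenCV's PnP solver, which returns the rotation and translation mapping the 3D (laser radar frame) points into the camera frame; the camera intrinsic matrix K and the distortion coefficients dist are assumed to be known from a prior intrinsic calibration:

```python
import cv2
import numpy as np

def lidar_camera_pose(pts_lidar: np.ndarray, pts_image: np.ndarray,
                      K: np.ndarray, dist: np.ndarray):
    """pts_lidar: (8, 3) feature points in the lidar frame;
    pts_image: (8, 2) matching pixel coordinates."""
    ok, rvec, tvec = cv2.solvePnP(pts_lidar.astype(np.float64),
                                  pts_image.astype(np.float64),
                                  K, dist)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec
```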

In some embodiments, the terminal may adjust the lidar and the camera based on the attitude information of the lidar and the camera.

In the embodiment of the application, the terminal can perform levelness calibration on the laser radar; the spatial coordinates of the feature points are determined through the laser point cloud of the target calibration target obtained from the laser radar, the plane coordinates of the feature points in the image are detected through the camera, and the attitude information of the laser radar and the camera is determined through the spatial coordinates and the plane coordinates. The calibration of the camera and the laser radar can be completed in a single pass, which improves calibration efficiency and accuracy.

After explaining the combined calibration method for the camera and the lidar provided by the embodiment of the present application, a combined calibration device for the camera and the lidar provided by the embodiment of the present application is introduced next.

Fig. 5 is a schematic structural diagram of a combined calibration apparatus for a camera and a lidar according to an embodiment of the present disclosure, where the combined calibration apparatus for a camera and a lidar may be implemented by software, hardware, or a combination of the two to be part or all of a terminal, and the terminal may be the terminal shown in fig. 1. Referring to fig. 5, the apparatus includes: a first determination module 501, a detection module 502 and a second determination module 503.

The first determining module 501 is configured to determine spatial coordinates of N feature points of a target calibration target according to a laser point cloud of the target calibration target scanned by a laser radar, where N is determined by a shape of the target calibration target;

a detection module 502, configured to detect, by a camera, plane coordinates of the N feature points;

a second determining module 503, configured to determine, according to the spatial coordinates of the N feature points and the plane coordinates of the N feature points, attitude information of the laser radar and the camera, so as to implement joint calibration of the camera and the laser radar.

In some embodiments, referring to fig. 6, the apparatus further comprises:

an obtaining module 504, configured to obtain a laser point cloud of the laser radar;

and the calibration module 505 is configured to carry out levelness calibration on the laser radar according to the laser point cloud.

In some embodiments, referring to fig. 7, the calibration module 505 comprises:

the projection submodule 5051 is used for projecting the laser point cloud to obtain a top view of the laser point cloud;

the first determining submodule 5052 is configured to determine a cross reticle in the top view of the laser point cloud with the position of the laser radar as its center, where the cross reticle is centered on the position of the laser radar and is symmetrical about that center;

and a second determining sub-module 5053, configured to determine, in the top view of the laser point cloud, that the laser radar is level with the road surface where it is located when any laser point cloud of the laser radar passes through the four vertexes of the cross reticle, so as to complete the levelness calibration of the laser radar.

In some embodiments, referring to fig. 8, the first determining module 501 comprises:

the first clustering submodule 5011 is used for clustering the laser point cloud into a plurality of scanning line segments according to the distance between each laser radar point in the laser point cloud along the laser scanning direction and the change of the laser radar point direction;

the second clustering submodule 5012 is configured to cluster the plurality of scan line segments into object segments, so as to obtain a laser radar point set of a target plane of the target calibration target;

the eliminating submodule 5013 is used for eliminating interference points from the laser radar point set according to the point cloud intensity value and the intensity threshold of the laser point cloud;

the third determining submodule 5014 is configured to determine N boundary points of a preset area in the target calibration target according to the reflection intensity difference of each laser radar point in the laser point cloud;

the fourth determining submodule 5015 is configured to determine the spatial coordinates of the intersection points of the boundary lines corresponding to the N boundary points as the spatial coordinates of the N feature points of the target calibration target.

In some embodiments, the third determination submodule 5014 is configured to:

for a laser radar subset of an ith scanning laser in a plurality of scanning lasers, determining whether a reflection intensity difference value between any two adjacent laser radar points in the laser radar subset is greater than a difference threshold value, wherein i is a positive integer greater than or equal to 1;

and when the reflection intensity difference value between any two adjacent laser radar points is greater than the difference threshold value, determining the average coordinate value of any two adjacent laser radar points as the coordinate value of any boundary point.

In the embodiment of the application, the terminal can perform levelness calibration on the laser radar; the spatial coordinates of the feature points are determined through the laser point cloud of the target calibration target obtained from the laser radar, the plane coordinates of the feature points in the image are detected through the camera, and the attitude information of the laser radar and the camera is determined through the spatial coordinates and the plane coordinates. The calibration of the camera and the laser radar can be completed in a single pass, which improves calibration efficiency and accuracy.

It should be noted that: in the combined calibration device for the camera and the lidar, when the camera and the lidar are calibrated, only the division of the functional modules is used for illustration, and in practical application, the function distribution can be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the combined calibration device for the camera and the laser radar and the combined calibration method for the camera and the laser radar provided by the embodiments belong to the same concept, and specific implementation processes are detailed in the method embodiments and are not described herein again.

Fig. 9 is a block diagram of a terminal 900 according to an embodiment of the present disclosure. The terminal 900 may be a portable mobile terminal such as a smartphone, a tablet computer, a notebook computer, or a desktop computer. Terminal 900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like. It may also be a vehicle-mounted terminal.

In general, terminal 900 includes: a processor 901 and a memory 902.

Processor 901 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 901 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 901 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.

Memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement the camera and lidar joint calibration method provided by the method embodiments herein.

In some embodiments, terminal 900 can also optionally include: a peripheral interface 903 and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 904, a touch display screen 905, a camera 906, an audio circuit 907, a positioning component 908, and a power supply 909.

The peripheral interface 903 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902 and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.

The Radio Frequency circuit 904 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 904 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.

The display screen 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 905 is a touch display screen, the display screen 905 also has the ability to capture touch signals on or over the surface of the display screen 905. The touch signal may be input to the processor 901 as a control signal for processing. At this point, the display 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 905 may be one, providing the front panel of the terminal 900; in other embodiments, the number of the display panels 905 may be at least two, and each of the display panels is disposed on a different surface of the terminal 900 or is in a foldable design; in still other embodiments, the display 905 may be a flexible display disposed on a curved surface or a folded surface of the terminal 900. Even more, the display screen 905 may be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display panel 905 can be made of LCD (liquid crystal Display), OLED (Organic Light-Emitting Diode), and the like.

The camera assembly 906 is used to capture images or video. Optionally, camera assembly 906 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 906 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.

Audio circuit 907 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 901 for processing, or inputting the electric signals to the radio frequency circuit 904 for realizing voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of the terminal 900. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuit 907 may also include a headphone jack.

The positioning component 908 is used to locate the current geographic location of the terminal 900 to implement navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.

Power supply 909 is used to provide power to the various components in terminal 900. The power source 909 may be alternating current, direct current, disposable or rechargeable. When the power source 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.

In some embodiments, terminal 900 can also include one or more sensors 910.

Those skilled in the art will appreciate that the configuration shown in fig. 9 does not constitute a limitation of terminal 900, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.

In some embodiments, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the joint calibration method for a camera and a lidar in the above embodiments. For example, the computer readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.

It is noted that the computer-readable storage medium referred to herein may be a non-volatile storage medium, in other words, a non-transitory storage medium.

It should be understood that all or part of the steps for implementing the above embodiments may be implemented by software, hardware, firmware or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.

That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of the above-described method for joint calibration of a camera and a lidar.

The above-mentioned embodiments are provided not to limit the present application, and any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.
