Sensor calibration method and sensor calibration device


This technology, "Sensor calibration method and sensor calibration device", was designed and created by 陈亦伦 and 李涵 on 2019-10-08. Its main content is as follows: The application relates to the field of artificial intelligence, in particular to the calibration of a laser radar and a camera. In the laser radar calibration method, edge straight lines of a laser radar calibration area are fitted from the laser radar scanning points within the calibration area, the intersection points of the fitted straight lines are solved, and the laser radar is calibrated according to the coordinates of the intersection points in the laser radar coordinate system. The technical scheme provided by the application helps to improve the calibration accuracy and the calibration speed of the laser radar.

1. A sensor calibration method is characterized by comprising the following steps:

acquiring a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a laser radar coordinate system, of scanning points in a laser radar calibration area on a calibration plate, and the laser radar calibration area comprises two non-parallel sides;

performing straight line fitting according to the first point coordinate set to obtain expressions of a plurality of straight lines, wherein the plurality of straight lines comprise the straight lines where the two non-parallel sides are located;

estimating three-dimensional coordinates of the laser radar calibration points on the calibration plate under the laser radar coordinate system according to the expressions of the straight lines;

and performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

2. The method of claim 1, wherein the performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system comprises:

acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration point on the calibration board in an image coordinate system of the camera;

determining the relative pose between the laser radar and the camera according to the second point coordinate set, the internal reference of the camera and the position relation between the camera calibration area and the laser radar calibration area;

and calibrating the external reference of the laser radar according to the external reference of the camera and the relative pose between the laser radar and the camera.

3. The method of claim 2, wherein determining the relative pose between the lidar and the camera based on the second set of point coordinates, the internal reference of the camera, and the positional relationship between the camera calibration region and the lidar calibration region comprises:

determining the three-dimensional coordinates of the laser radar calibration point under the camera coordinate system of the camera according to the second point coordinate set, the internal reference of the camera and the position relationship between the camera calibration area and the laser radar calibration area;

and determining the relative pose between the laser radar and the camera according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system and the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

4. The method of claim 1, further comprising:

acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera;

and calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

5. The method according to claim 4, wherein said calibrating the camera based on the second set of point coordinates, the positional relationship between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system comprises:

determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set;

determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points;

and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

6. The method according to claim 5, wherein the second set of point coordinates comprises point coordinates of a plurality of images taken of the calibration board by the camera;

wherein the method further comprises:

determining coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions;

and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each image of the plurality of images to obtain the internal reference of the camera.

7. A sensor calibration method is characterized by comprising the following steps:

acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera;

and calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

8. The method according to claim 7, wherein said calibrating the camera based on the second set of point coordinates, the positional relationship between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system comprises:

determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set;

determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points;

and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

9. The method according to claim 8, wherein the second set of point coordinates includes point coordinates of a plurality of images taken of the calibration board by the camera;

wherein the method further comprises:

determining coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions;

and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each image of the plurality of images to obtain the internal reference of the camera.

10. A sensor calibration device, comprising:

an acquisition module, used for acquiring a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a laser radar coordinate system, of scanning points in a laser radar calibration area on a calibration plate, and the laser radar calibration area comprises two non-parallel sides;

the fitting module is used for performing straight line fitting according to the first point coordinate set to obtain an expression of a plurality of straight lines, wherein the straight lines comprise straight lines where the two non-parallel sides are located;

the estimation module is used for estimating the three-dimensional coordinates of the laser radar calibration points on the calibration plate under the laser radar coordinate system according to the expression of the straight lines;

and the calibration module is used for carrying out external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point under the laser radar coordinate system.

11. The apparatus of claim 10, wherein the acquisition module is further configured to: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration point on the calibration board in an image coordinate system of the camera;

the calibration module is specifically configured to:

determining the relative pose between the laser radar and the camera according to the second point coordinate set, the internal reference of the camera and the position relation between the camera calibration area and the laser radar calibration area;

and calibrating the external reference of the laser radar according to the external reference of the camera and the relative pose between the laser radar and the camera.

12. The apparatus of claim 11, wherein the calibration module is specifically configured to:

determining the three-dimensional coordinates of the laser radar calibration point under the camera coordinate system of the camera according to the second point coordinate set, the internal reference of the camera and the position relationship between the camera calibration area and the laser radar calibration area;

and determining the relative pose between the laser radar and the camera according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system and the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

13. The apparatus of claim 10, wherein the acquisition module is further configured to: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera;

the calibration module is further configured to: and calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

14. The apparatus of claim 13, wherein the calibration module is specifically configured to:

determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set;

determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points;

and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

15. The apparatus according to claim 14, wherein the second set of point coordinates includes point coordinates from a plurality of images of the calibration board captured by the camera;

wherein, the calibration module is further specifically configured to:

determining coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions;

and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each image of the plurality of images to obtain the internal reference of the camera.

16. A sensor calibration device, comprising:

the acquisition module is used for acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera;

and the calibration module is used for calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration point in a world coordinate system.

17. The apparatus of claim 16, wherein the calibration module is specifically configured to:

determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set;

determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points;

and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

18. The apparatus according to claim 17, wherein the second set of point coordinates includes point coordinates from a plurality of images of the calibration board captured by the camera;

wherein, the calibration module is further specifically configured to:

determining coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions;

and calibrating internal reference of the camera according to the coordinates of the camera calibration point in each image of the plurality of images to obtain the internal reference of the camera.

19. A calibration device, comprising one or more laser radar calibration areas, wherein each laser radar calibration area comprises two non-parallel sides, and the intersection point of the straight lines where the two non-parallel sides are located is used for calibrating external parameters of a laser radar.

20. The apparatus of claim 19, wherein a first side of a first lidar calibration area and a first side of a second lidar calibration area of the plurality of lidar calibration areas lie on a first straight line, wherein a second side of the first lidar calibration area and a second side of the second lidar calibration area lie on a second straight line, wherein the first side and the second side of the first lidar calibration area are not parallel, and wherein the first side and the second side of the second lidar calibration area are not parallel.

21. The apparatus of claim 20, wherein a first intersection of the first line and the second line is a first vertex of the first lidar calibration area and the first intersection is a first vertex of the second lidar calibration area.

22. The apparatus of claim 20 or 21, wherein a first side of a third lidar calibration area and a second side of a fourth lidar calibration area of the plurality of lidar calibration areas lie on a third straight line, a second side of the third lidar calibration area and a first side of the fourth lidar calibration area lie on a fourth straight line, the first side of the third lidar calibration area is not parallel to the second side, and the first side of the fourth lidar calibration area is not parallel to the second side;

wherein the first line is not parallel to the third line and the second line is not parallel to the fourth line.

23. The apparatus of claim 22, wherein a second intersection of the second line and the fourth line is a second vertex of the first lidar calibration area, the second intersection is a first vertex of the third lidar calibration area, a third intersection of the third line and the fourth line is a second vertex of the third lidar calibration area, the third intersection is a first vertex of the fourth lidar calibration area, a fourth intersection of the first line and the third line is a second vertex of the fourth lidar calibration area, and the fourth intersection is a second vertex of the second lidar calibration area.

24. The apparatus of any of claims 19 to 21, wherein the lidar calibration area is triangular in shape.

25. The apparatus according to any one of claims 19 to 24, wherein the apparatus further comprises a plurality of camera calibration regions, and a positional relationship between any one of the plurality of camera calibration regions and its adjacent camera calibration regions is different from a positional relationship between any other one of the plurality of camera calibration regions and its adjacent camera calibration regions.

26. The apparatus of claim 25, wherein the camera calibration area is circular in shape.

27. A calibration device, comprising a plurality of camera calibration areas, wherein a positional relationship between any one of the camera calibration areas and its adjacent camera calibration areas is different from a positional relationship between any other one of the camera calibration areas and its adjacent camera calibration areas.

28. The apparatus of claim 27, wherein the camera calibration area is circular in shape.

29. A computer-readable storage medium storing instructions for execution by a computing device to implement a sensor calibration method as claimed in any one of claims 1 to 9.

30. A sensor calibration system, characterized by comprising a sensor calibration device according to any one of claims 10 to 15 and/or a calibration device according to any one of claims 19 to 26.

31. A sensor calibration system, characterized by comprising a sensor calibration device according to any one of claims 16 to 18 and/or a calibration device according to any one of claims 27 to 28.

Technical Field

The present application relates to the field of artificial intelligence, and more particularly, to a sensor calibration method and a sensor calibration apparatus.

Background

The multi-line laser radar transmits a plurality of laser scanning lines simultaneously so as to meet the requirement of rapidly acquiring large-range environment information. In particular, in the field of automatic driving or intelligent driving, lidar has been increasingly used for vehicle surroundings sensing. For example, a multi-line laser radar installed on an intelligent vehicle can provide richer, more comprehensive, and more accurate vehicle surrounding environment information for the intelligent vehicle.

The raw data obtained by the scanning of the multi-line laser radar mainly comprises distance information and angle information. The distance information indicates the distance from the scanning point to the multi-line laser radar, and the angle information indicates the pitch angle of the scanning line on which the scanning point is located.

For convenience of subsequent processing, the raw data obtained by scanning the multiline lidar needs to be converted into a coordinate system of the intelligent device to which the multiline lidar belongs.

Converting raw data obtained by scanning a multiline lidar to the coordinate system of the smart device to which the multiline lidar belongs, typically includes the following two steps: converting original data (namely distance information and angle information) obtained by scanning the multi-line laser radar into three-dimensional coordinates under a coordinate system of the multi-line laser radar; and converting the three-dimensional coordinates under the laser radar coordinate system obtained by conversion into the three-dimensional coordinates under the intelligent equipment coordinate system.
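A minimal sketch of these two conversion steps follows, purely as an illustration and not as the application's own implementation. It assumes each raw return is given as a range together with a pitch angle and an azimuth angle, and that the lidar external parameters are a rotation matrix R and translation vector t from the lidar frame to the smart-device frame; all names are illustrative.

```python
import numpy as np

def polar_to_lidar_frame(distance, pitch, azimuth):
    """Convert one raw lidar return (range in metres, pitch and azimuth in
    radians) to Cartesian coordinates in the lidar coordinate system."""
    x = distance * np.cos(pitch) * np.cos(azimuth)
    y = distance * np.cos(pitch) * np.sin(azimuth)
    z = distance * np.sin(pitch)
    return np.array([x, y, z])

def lidar_to_device_frame(p_lidar, R, t):
    """Apply the lidar external parameters (rotation R, translation t) to
    express a point in the smart-device (e.g. vehicle body) frame."""
    return R @ p_lidar + t

# Example: one return at 10 m, 2 degrees pitch, 30 degrees azimuth,
# with identity extrinsics used only as placeholder values.
p = polar_to_lidar_frame(10.0, np.deg2rad(2.0), np.deg2rad(30.0))
p_vehicle = lidar_to_device_frame(p, np.eye(3), np.zeros(3))
```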

Converting the three-dimensional coordinates in the laser radar coordinate system into three-dimensional coordinates in the intelligent-equipment coordinate system requires the external parameters of the multi-line laser radar; that is, the position of the multi-line laser radar in the vehicle body coordinate system needs to be determined first. Determining the position of the multi-line laser radar in the vehicle body coordinate system is referred to as external reference calibration of the multi-line laser radar, or simply calibration of the multi-line laser radar.

Therefore, how to perform external reference calibration on the multi-line laser radar is a technical problem to be solved urgently.

Disclosure of Invention

The application provides a sensor calibration method, a sensor calibration device, a calibration device and a calibration system, which are beneficial to improving the calibration precision and the calibration speed of a laser radar.

In a first aspect, the present application provides a sensor calibration method. The method comprises the following steps: acquiring a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a laser radar coordinate system, of scanning points in a laser radar calibration area on a calibration plate, and the laser radar calibration area comprises two non-parallel sides; performing straight line fitting according to the first point coordinate set to obtain expressions of a plurality of straight lines, wherein the plurality of straight lines comprise the straight lines where the two non-parallel sides are located; estimating three-dimensional coordinates of the laser radar calibration points on the calibration plate in the laser radar coordinate system according to the expressions of the straight lines; and performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

In this method, the coordinates of the laser radar calibration point in the laser radar coordinate system are estimated from straight lines obtained through fitting, and these straight lines are in turn fitted from the coordinates of the scanning points in the laser radar coordinate system. This improves the accuracy of the coordinates of the laser radar calibration point in the laser radar coordinate system, and therefore the accuracy of the external reference of the laser radar.
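The sketch below illustrates the underlying geometry under simple assumptions and is not the application's prescribed implementation: a 3D line is fitted to the scan points of one edge by principal-component analysis, and the "intersection" of two fitted lines is taken as the midpoint of the shortest segment between them, since two fitted 3D lines rarely intersect exactly. In practice the scan points would first be grouped by the edge they fall on; the function names are illustrative.

```python
import numpy as np

def fit_line_3d(points):
    """Fit a 3D line to an (N, 3) array of scan points.
    Returns a point on the line (the centroid) and a unit direction vector."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]          # dominant direction of the point cloud

def intersect_lines_3d(p1, d1, p2, d2):
    """Least-squares 'intersection' of two 3D lines p1 + s*d1 and p2 + t*d2.
    Returns the midpoint of the shortest segment joining the two lines."""
    # Normal equations for the parameters s, t (valid for non-parallel lines).
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    s, t = np.linalg.solve(a, b)
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Usage with two groups of edge points (illustrative):
# p1, d1 = fit_line_3d(edge1_points)
# p2, d2 = fit_line_3d(edge2_points)
# lidar_calibration_point = intersect_lines_3d(p1, d1, p2, d2)
```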

In some possible implementation manners of the first aspect, the performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system includes: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration point on the calibration board in an image coordinate system of the camera; determining the relative pose between the laser radar and the camera according to the second point coordinate set, the internal reference of the camera, and the position relation between the camera calibration area and the laser radar calibration area; and calibrating the external reference of the laser radar according to the external reference of the camera and the relative pose between the laser radar and the camera.

In the implementation mode, external reference calibration is carried out on the laser radar according to internal reference and external reference of the camera.

Optionally, the determining a relative pose between the lidar and the camera according to the second point coordinate set, the internal reference of the camera, and the position relationship between the camera calibration region and the lidar calibration region includes: determining the three-dimensional coordinates of the laser radar calibration point under the camera coordinate system of the camera according to the second point coordinate set, the internal reference of the camera and the position relationship between the camera calibration area and the laser radar calibration area; and determining the relative pose between the laser radar and the camera according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system and the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.
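One way the first of these sub-steps could be realized is sketched below with OpenCV, purely as an illustration: solve the board-to-camera pose from the 2D-3D correspondences of the camera calibration points with a PnP solver, then transform the known board-frame position of the lidar calibration point into the camera coordinate system. The board coordinates, pixel positions, and intrinsics used here are placeholders, not values from the application.

```python
import numpy as np
import cv2

# Known positions of camera calibration points on the planar board (board
# frame, metres) and their detected pixel coordinates (the "second point set").
board_pts = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                      [0.5, 0.5, 0.0], [0.0, 0.5, 0.0]], dtype=np.float64)
image_pts = np.array([[320.0, 240.0], [480.0, 236.0],
                      [478.0, 400.0], [322.0, 402.0]], dtype=np.float64)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])          # camera internal reference
dist = np.zeros(5)                        # assume negligible distortion

ok, rvec, tvec = cv2.solvePnP(board_pts, image_pts, K, dist)
R_board_to_cam, _ = cv2.Rodrigues(rvec)

# Board-frame position of a lidar calibration point (placeholder), known from
# the positional relation between the camera and lidar calibration areas.
lidar_pt_board = np.array([0.25, 0.25, 0.0])
lidar_pt_cam = R_board_to_cam @ lidar_pt_board + tvec.ravel()
```

The relative pose between the laser radar and the camera can then be recovered from the lidar calibration points expressed in both the camera and the lidar coordinate systems, for example with the SVD-based rigid alignment sketched later in this document.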

In a second possible implementation manner of the first aspect, the method further includes: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera; and calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

In the implementation mode, the laser radar and the camera on the same equipment are calibrated synchronously, so that the calibration efficiency can be improved.

In addition, whether the camera captures all of the patterns on the calibration board or only part of them, the feature points in the image shot by the camera can be identified as specific feature points on the calibration board according to the positional relationships among the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be looked up from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. When this calibration board is used to calibrate the camera, the distance between the camera and the calibration board is not restricted and the calibration board does not need to be moved, so automatic calibration of the camera can be achieved.

Optionally, the calibrating the camera according to the second point coordinate set, the position relationship between the camera calibration regions, and the three-dimensional coordinates of the camera calibration point in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points; and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from a plurality of images of the calibration board captured by the camera. The method then further comprises: determining the coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions; and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each of the plurality of images to obtain the internal reference of the camera.
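A hedged sketch of how this internal reference (intrinsic) calibration from several images could be carried out with OpenCV, assuming the camera calibration points detected in each image have already been matched to their known board coordinates; the function and variable names are illustrative of one possible realization, not the procedure mandated by the application.

```python
import cv2

def calibrate_intrinsics(board_points_per_image, image_points_per_image, image_size):
    """Estimate the camera matrix and distortion coefficients from several
    images of the calibration board.

    board_points_per_image: list of (N, 3) float32 arrays of board coordinates
    image_points_per_image: list of (N, 2) float32 arrays of matching pixels
    image_size: (width, height) of the images
    """
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        board_points_per_image, image_points_per_image, image_size, None, None)
    return K, dist, rms
```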

In a second aspect, the present application provides a sensor calibration method, including: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera; and calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

With this method, whether the camera captures all of the patterns on the calibration board or only part of them, the feature points in the image shot by the camera can be identified as specific feature points on the calibration board according to the positional relationships among the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be looked up from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. When this calibration board is used to calibrate the camera, the distance between the camera and the calibration board is not restricted and the calibration board does not need to be moved, so automatic calibration of the camera can be achieved.
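How the feature points are identified from their positional relationships is not spelled out at this point. One possible scheme, sketched below purely as an illustrative assumption, characterizes every point by the scale-normalized distances to its nearest neighbours and matches each detected point to the pre-stored board point with the most similar signature; the normalization makes pixel-space and board-space signatures roughly comparable for a near fronto-parallel view, and a real implementation would also need to handle perspective distortion.

```python
import numpy as np

def distance_signature(points, k=3):
    """For each point, the sorted distances to its k nearest neighbours,
    normalised by their mean so the signature is scale-invariant."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    sig = np.sort(d, axis=1)[:, 1:k + 1]      # drop the zero self-distance
    return sig / sig.mean(axis=1, keepdims=True)

def match_to_board(detected_pts, board_pts, k=3):
    """For each detected point (image coordinates), return the index of the
    pre-stored board point whose neighbour-distance signature is closest."""
    sig_det = distance_signature(detected_pts, k)
    sig_board = distance_signature(board_pts, k)
    cost = np.linalg.norm(sig_det[:, None, :] - sig_board[None, :, :], axis=-1)
    return cost.argmin(axis=1)
```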

In some possible implementations, the calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration regions, and the three-dimensional coordinates of the camera calibration point in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points; and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from a plurality of images of the calibration board captured by the camera. The method then further comprises: determining the coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions; and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each of the plurality of images to obtain the internal reference of the camera.

In a third aspect, the present application provides a sensor calibration apparatus, including: an acquisition module, used for acquiring a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a laser radar coordinate system, of scanning points in a laser radar calibration area on a calibration plate, and the laser radar calibration area comprises two non-parallel sides; a fitting module, used for performing straight line fitting according to the first point coordinate set to obtain expressions of a plurality of straight lines, wherein the plurality of straight lines comprise the straight lines where the two non-parallel sides are located; an estimation module, used for estimating the three-dimensional coordinates of the laser radar calibration points on the calibration plate in the laser radar coordinate system according to the expressions of the straight lines; and a calibration module, used for performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

In a first possible implementation manner of the third aspect, the acquisition module is further configured to: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration point on the calibration board in an image coordinate system of the camera. The calibration module is specifically configured to: determining the relative pose between the laser radar and the camera according to the second point coordinate set, the internal reference of the camera, and the position relation between the camera calibration area and the laser radar calibration area; and calibrating the external reference of the laser radar according to the external reference of the camera and the relative pose between the laser radar and the camera.

Optionally, the calibration module is specifically configured to: determining the three-dimensional coordinates of the laser radar calibration point under the camera coordinate system of the camera according to the second point coordinate set, the internal reference of the camera and the position relationship between the camera calibration area and the laser radar calibration area; and determining the relative pose between the laser radar and the camera according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system and the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system.

In a second possible implementation manner of the third aspect, the acquisition module is further configured to: acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration point in the image coordinate system of the camera. The calibration module is further configured to: calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

Optionally, the calibration module is specifically configured to: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points; and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from a plurality of images of the calibration board captured by the camera. The calibration module is then further specifically configured to: determining the coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions; and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each of the plurality of images to obtain the internal reference of the camera.

In a fourth aspect, the present application provides a sensor calibration apparatus, including: an acquisition module, used for acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of a camera calibration point in an image coordinate system of a camera; and a calibration module, used for calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration point in a world coordinate system.

In some possible implementations, the calibration module is specifically configured to: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal reference of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the position relation among the camera calibration points; and performing external reference calibration on the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from a plurality of images of the calibration board captured by the camera. The calibration module is then further specifically configured to: determining the coordinates of the camera calibration point in each of the plurality of images from the second point coordinate set according to the positional relationship between the camera calibration regions; and performing internal reference calibration on the camera according to the coordinates of the camera calibration point in each of the plurality of images to obtain the internal reference of the camera.

In a fifth aspect, a sensor calibration apparatus is provided, the apparatus comprising: a memory for storing a program; a processor configured to execute the program stored in the memory, and when the program stored in the memory is executed, the processor is configured to perform the method in any one of the implementations of the first aspect.

Optionally, the apparatus may further comprise a communication interface.

In a sixth aspect, a sensor calibration apparatus is provided, the apparatus comprising: a memory for storing a program; a processor for executing the program stored in the memory, and when the program stored in the memory is executed, the processor is configured to perform the method in any one of the implementations of the second aspect.

Optionally, the apparatus may further comprise a communication interface.

In a seventh aspect, a computer-readable medium is provided, which stores instructions for execution by a device to perform the method of the first aspect.

In an eighth aspect, a computer-readable medium is provided that stores instructions for execution by a device to perform the method of the second aspect.

In a ninth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above.

A tenth aspect provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the second aspect described above.

In an eleventh aspect, a chip is provided, where the chip includes a processor and a communication interface, and the processor reads instructions stored in a memory through the communication interface to perform the method in the first aspect.

Optionally, the chip may further comprise a memory, the memory having instructions stored therein, and the processor being configured to execute the instructions stored on the memory, and when the instructions are executed, the processor being configured to perform the method of the first aspect.

In a twelfth aspect, a chip is provided, where the chip includes a processor and a communication interface, and the processor reads instructions stored in a memory through the communication interface to execute the method in the second aspect.

Optionally, the chip may further comprise a memory, wherein instructions are stored in the memory, and the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to execute the method of the second aspect.

In a thirteenth aspect, an electronic device is provided, which includes the sensor calibration apparatus in the third aspect.

In a fourteenth aspect, an electronic device is provided, which includes the sensor calibration apparatus in the fourth aspect.

In a fifteenth aspect, the present application provides a calibration apparatus, comprising one or more laser radar calibration areas, wherein each laser radar calibration area comprises two non-parallel sides, and the intersection point of the straight lines where the two non-parallel sides are located is used for calibrating external parameters of a laser radar.

Each laser radar calibration area comprises at least two non-parallel edges, and the intersection point of the straight lines on which the non-parallel edges are located can be used as a laser radar calibration point to calibrate the laser radar.

In some possible implementation manners, in the plurality of laser radar calibration areas, a first edge of a first laser radar calibration area and a first edge of a second laser radar calibration area are located on a first straight line, a second edge of the first laser radar calibration area and a second edge of the second laser radar calibration area are located on a second straight line, the first edge and the second edge of the first laser radar calibration area are not parallel, and the first edge and the second edge of the second laser radar calibration area are not parallel.

In this implementation, edges of different laser radar calibration areas are located on the same straight line, so that the straight line can be fitted from the scanning points near both edges. This improves the accuracy of the fitted straight line and hence the accuracy of the laser radar calibration point.

Optionally, a first intersection point of the first straight line and the second straight line is a first vertex of the first lidar calibration area, and the first intersection point is a first vertex of the second lidar calibration area.

That is, different lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration plate more compact. As a result, more calibration areas can be laid out on a calibration plate of the same area, or calibration areas of the same total area can be laid out on a smaller calibration plate.

In some possible implementation manners, in the plurality of laser radar calibration areas, a first edge of a third laser radar calibration area and a second edge of a fourth laser radar calibration area are located on a third straight line, a second edge of the third laser radar calibration area and a first edge of the fourth laser radar calibration area are located on a fourth straight line, the first edge and the second edge of the third laser radar calibration area are not parallel, and the first edge and the second edge of the fourth laser radar calibration area are not parallel. Wherein the first line is not parallel to the third line and the second line is not parallel to the fourth line.

In this implementation, because the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line, the same number of lidar calibration areas produces more intersection points and therefore more lidar calibration points, which in turn improves the calibration accuracy of the lidar.

Optionally, a second intersection point of the second straight line and the fourth straight line is a second vertex of the first lidar calibration area, the second intersection point is a first vertex of the third lidar calibration area, a third intersection point of the third straight line and the fourth straight line is a second vertex of the third lidar calibration area, the third intersection point is a first vertex of the fourth lidar calibration area, a fourth intersection point of the first straight line and the third straight line is a second vertex of the fourth lidar calibration area, and the fourth intersection point is a second vertex of the second lidar calibration area.

That is, different lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration plate more compact. As a result, more calibration areas can be laid out on a calibration plate of the same area, or calibration areas of the same total area can be laid out on a smaller calibration plate.

Optionally, the shape of the lidar calibration area is a triangle, a quadrangle, or the like.

In some possible implementations, the apparatus further includes a plurality of camera calibration regions, and the positional relationship between any one of the plurality of camera calibration regions and its adjacent camera calibration regions is different from the positional relationship between any other one of the plurality of camera calibration regions and its adjacent camera calibration regions.

The apparatus in this implementation allows the laser radar and the camera to be calibrated synchronously. In addition, with this apparatus, whether the camera captures all of the patterns on the calibration board or only part of them, the feature points in the image shot by the camera can be identified as specific feature points on the calibration board according to the positional relationships among the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be looked up from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. When this calibration board is used to calibrate the camera, the distance between the camera and the calibration board is not restricted and the calibration board does not need to be moved, so automatic calibration of the camera can be achieved.

Optionally, the shape of the camera calibration area is a circle, a triangle, a quadrangle, or the like.

In a sixteenth aspect, the present application provides a calibration apparatus, comprising a plurality of camera calibration areas, wherein the positional relationship between any one of the camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the camera calibration areas and its adjacent camera calibration areas.

This calibration apparatus can realize the calibration of the laser radar and the camera synchronously. In addition, with this apparatus, whether the camera captures all of the patterns on the calibration board or only part of them, the feature points in the image shot by the camera can be identified as specific feature points on the calibration board according to the positional relationships among the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be looked up from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. When this calibration board is used to calibrate the camera, the distance between the camera and the calibration board is not restricted and the calibration board does not need to be moved, so automatic calibration of the camera can be achieved.

Optionally, the shape of the camera calibration area is a circle, a triangle, a quadrangle, or the like.

In a seventeenth aspect, the present application provides a sensor calibration system comprising the sensor calibration device of the third or fifth aspect, and the calibration device of the fifteenth aspect.

In an eighteenth aspect, the present application provides a sensor calibration system comprising the sensor calibration device of the fourth or sixth aspect, and the calibration device of the sixteenth aspect.

In a nineteenth aspect, the present application provides a sensor calibration system comprising the sensor calibration device of the fourth aspect and the calibration device of the sixteenth aspect.

Drawings

FIG. 1 is a schematic diagram of an application scenario of the technical solution of the embodiment of the present application;

FIG. 2 is a schematic block diagram of a calibration apparatus according to an embodiment of the present application;

FIG. 3 is a schematic flow chart diagram of a sensor calibration method according to an embodiment of the present application;

FIG. 4 is a schematic structural view of a calibration apparatus according to another embodiment of the present application;

FIG. 5 is a schematic structural view of a calibration apparatus according to another embodiment of the present application;

FIG. 6 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 7 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 8 is a schematic structural view of a calibration apparatus according to another embodiment of the present application;

FIG. 9 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 10 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 11 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 12 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 13 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 14 is a schematic flow chart diagram of a sensor calibration method according to another embodiment of the present application;

FIG. 15 is a schematic block diagram of a sensor calibration apparatus according to an embodiment of the present application;

FIG. 16 is a schematic deployment view of a sensor calibration apparatus of another embodiment of the present application;

FIG. 17 is a schematic deployment view of a sensor calibration apparatus of another embodiment of the present application;

FIG. 18 is a schematic block diagram of a computing device according to another embodiment of the present application.

Detailed Description

It should be understood that the smart device in the embodiments of the present application refers to any device, apparatus, or machine having computing and processing capability. The smart device in the embodiments of the present application may be a robot, an autonomous vehicle, an intelligent assisted-driving vehicle, an unmanned aerial vehicle, an intelligent assisted aircraft, a smart home device, and the like. The embodiments of the present application do not place any limitation on the smart device; any device on which a laser radar and/or a camera can be mounted falls within the scope of the smart device described here.

FIG. 1 is a schematic diagram of an application scenario of the technical solution of the embodiment of the present application. The scene can comprise an automobile, a front calibration plate and four groups of binocular cameras.

The four-wheel vision positioning system is formed by four pairs of binocular cameras, and each pair of binocular cameras is responsible for detecting the three-dimensional coordinates of the center of one wheel in the world coordinate system. According to the three-dimensional coordinates of the centers of the four wheels in the world coordinate system, an automobile coordinate system can be established, and the position and posture of the automobile coordinate system relative to the world coordinate system can be further determined.
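The application does not prescribe how the automobile coordinate system is built from the four wheel centres; one plausible construction, sketched below under the assumption that the centres are given in the order front-left, front-right, rear-left, rear-right and with an illustrative x-forward, y-left, z-up convention, is the following.

```python
import numpy as np

def vehicle_frame_from_wheels(fl, fr, rl, rr):
    """Build a vehicle coordinate frame from four wheel centres given in the
    world frame. Returns the rotation from vehicle to world and the origin."""
    origin = (fl + fr + rl + rr) / 4.0                 # vehicle centre
    x_axis = ((fl + fr) - (rl + rr)) / 2.0             # points forward
    x_axis /= np.linalg.norm(x_axis)
    y_ref = ((fl + rl) - (fr + rr)) / 2.0              # roughly leftward
    z_axis = np.cross(x_axis, y_ref)                   # up, right-handed frame
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    R_vehicle_to_world = np.column_stack([x_axis, y_axis, z_axis])
    return R_vehicle_to_world, origin
```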

The front calibration plate and the binocular cameras are respectively fixed at corresponding positions. A multi-line laser radar is mounted at the front of the vehicle.

FIG. 2 is a schematic diagram of a calibration plate 200 according to an embodiment of the present application. The triangular areas are made of an infrared-reflective material within the operating frequency band of the laser radar, and the background is made of a black light-absorbing material. For convenience of description, the four connected triangular regions are referred to as a set of calibration patterns. In the embodiment of the present application, there may be more or fewer sets of calibration patterns on the calibration board; FIG. 2 shows three sets only as an example.

In one set of calibration patterns, the upper left triangular region is referred to as a first region, the lower left triangular region is referred to as a second region, the upper right triangular region is referred to as a third region, and the lower right triangular region is referred to as a fourth region.

One vertex of the first region and one vertex of the second region are the same point; for convenience of the following description, this point is referred to as point A. One edge of the first region and one edge of the second region are located on the same straight line; for convenience of description, this edge of the first region is referred to as the first edge of the first region, this edge of the second region is referred to as the first edge of the second region, and this straight line is referred to as the first straight line. The other edge of the first region and the other edge of the second region are located on the same straight line; for convenience of subsequent description, this edge of the first region is referred to as the second edge of the first region, this edge of the second region is referred to as the second edge of the second region, and this straight line is referred to as the second straight line.

Another vertex of the first region is the same point as one vertex of the third region; for convenience of subsequent description, this point is referred to as point B.

One vertex of the third region and one vertex of the fourth region are the same point; for convenience of the following description, this point is referred to as point C. One edge of the third region and one edge of the fourth region are located on the same straight line; for convenience of description, this edge of the third region is referred to as the first edge of the third region, this edge of the fourth region is referred to as the second edge of the fourth region, and this straight line is referred to as the third straight line. The other edge of the third region and the other edge of the fourth region are located on the same straight line; for convenience of subsequent description, this edge of the third region is referred to as the second edge of the third region, this edge of the fourth region is referred to as the first edge of the fourth region, and this straight line is referred to as the fourth straight line.

Another vertex of the second region is the same point as one vertex of the fourth region; for convenience of the following description, this point is referred to as point D.

Based on the application scenario shown in FIG. 1 and the calibration board shown in FIG. 2, a schematic flow chart of a sensor calibration method according to an embodiment of the present application is described below with reference to FIG. 3.

The method shown in FIG. 3 may include S310 to S390. It should be understood that these steps or operations are only examples. More or fewer steps or operations may be included in the solution presented in the present application, or variations of the individual operations in FIG. 3 may be performed.

And S310, pre-storing three-dimensional coordinates of the feature points on the calibration board in a world coordinate system.

For example, when point A, point B, point C, and point D in the calibration patterns on the calibration board shown in FIG. 2 are taken as the feature points, the three-dimensional coordinates of the plurality of sets of points A, B, C, and D on the calibration board in the world coordinate system are pre-stored. The three-dimensional coordinates of these groups of points in the world coordinate system can be obtained through total station measurement.

And S320, detecting whether the vehicle enters a designated area; if so, executing S330; otherwise, executing S340.

Images of the four wheels can be captured by the four-wheel visual positioning system, and whether the vehicle has entered the designated area is judged from these images. If the position of the vehicle meets the calibration requirement, it is determined that the vehicle has entered the designated area.

For a more detailed implementation of this step, reference may be made to the prior art, which is not described herein again.

And S330, detecting the centers of the wheels through a four-wheel vision positioning system, and establishing an automobile coordinate system.

Four groups of binocular cameras observe the wheels and the side-looking camera calibration plates simultaneously. The external parameters of the binocular cameras are solved according to the feature points on the side-looking camera calibration plates, and the three-dimensional coordinates of the centers of the four wheels in the world coordinate system are reconstructed from the images acquired by the binocular cameras. The automobile coordinate system is then established according to the three-dimensional coordinates of the four wheel centers in the world coordinate system, and the pose of the automobile coordinate system relative to the world coordinate system is solved.
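
The patent defers to the prior art for this construction. Purely as an illustration, the sketch below shows one way such a vehicle frame could be built from the four wheel-center coordinates; the axis convention (x forward, y to the left, z up, origin at the centroid of the wheel centers) is an assumption of this sketch, not something specified by the patent.

```python
import numpy as np

def build_vehicle_frame(fl, fr, rl, rr):
    """Illustrative vehicle coordinate frame from four wheel centers.

    fl, fr, rl, rr: (3,) numpy arrays with the world coordinates of the
    front-left, front-right, rear-left and rear-right wheel centers.
    Returns a 4x4 homogeneous transform mapping vehicle coordinates to
    world coordinates (i.e. the pose of the vehicle frame in the world).
    """
    origin = (fl + fr + rl + rr) / 4.0
    x_axis = (fl + fr) / 2.0 - (rl + rr) / 2.0   # rear-axle midpoint -> front-axle midpoint
    x_axis /= np.linalg.norm(x_axis)
    y_raw = (fl + rl) / 2.0 - (fr + rr) / 2.0    # right-side midpoint -> left-side midpoint
    z_axis = np.cross(x_axis, y_raw)             # roughly "up"
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)            # re-orthogonalized "left" axis
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x_axis, y_axis, z_axis, origin
    return T
```

The returned transform (or its inverse, depending on the direction needed) then plays the role of the pose of the automobile coordinate system relative to the world coordinate system mentioned above.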

For a more detailed implementation of this step, reference may be made to the prior art, which is not described herein again.

And S340, prompting to adjust the vehicle position.

And S350, determining the three-dimensional coordinates of the scanning point in the laser radar coordinate system.

The laser radar transmits laser beams, receives the reflected signals, and determines the three-dimensional coordinates of the scanning points in the laser radar coordinate system according to the reflected signals. The three-dimensional coordinates of the plurality of scanning points in the laser radar coordinate system form a three-dimensional point coordinate set.

Specifically, the laser radar emits a plurality of laser scanning lines outwards, and the spatial coordinates, namely the three-dimensional coordinates, of the scanning points formed where each laser scanning line intersects the calibration plate are measured in the laser radar coordinate system.

Taking as an example four laser scanning lines of the lidar scanning the calibration pattern in fig. 2, the effective scanning points of the lidar in the calibration pattern are shown as the four dotted lines in fig. 4.

And S360, determining the three-dimensional coordinates of the feature points in the laser radar coordinate system according to the three-dimensional coordinates of the scanning points in the laser radar coordinate system.

Specifically, fitting expressions of a first straight line, a second straight line, a third straight line and a fourth straight line according to the three-dimensional point coordinate set; determining three-dimensional coordinates of the point A, the point B, the point C and the point D in a laser radar coordinate system according to fitting expressions of the first straight line, the second straight line, the third straight line and the fourth straight line; and carrying out external reference calibration on the laser radar according to the three-dimensional coordinates of the point A, the point B, the point C and the point D in the laser radar coordinate system and the three-dimensional coordinates of the point A, the point B, the point C and the point D in the world coordinate system.

For example, three-dimensional coordinates of a start scanning point and an end scanning point of each scanning line in each area are determined from a three-dimensional point coordinate set of the scanning points in a laser radar coordinate system. Taking fig. 4 as an example, three-dimensional coordinates of the points labeled with the four-pointed star, the five-pointed star, the hexagon, and the seven-pointed star are determined from the three-dimensional point coordinate set.

Fitting an expression of a first straight line according to three-dimensional coordinates of four points marked by a quadrangle star under a laser radar coordinate system, and specifically fitting to obtain parameters in the expression of the first straight line; fitting an expression of a second straight line according to three-dimensional coordinates of the four points marked by the five-pointed star under the laser radar coordinate system, and specifically fitting to obtain parameters in the expression of the second straight line; fitting an expression of a third straight line according to three-dimensional coordinates of the four points marked by the seven-pointed star under the laser radar coordinate system, and specifically fitting to obtain parameters in the expression of the third straight line; and fitting an expression of a fourth straight line according to the three-dimensional coordinates of the four points marked by the hexagons in the laser radar coordinate system, and specifically fitting to obtain parameters in the expression of the fourth straight line.

The expression of each straight line may be fitted in various ways, for example by the least square method, or by linear fitting of spatial three-dimensional scattered-point data.
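
As a non-authoritative illustration of such a least-squares fit, the sketch below represents each fitted straight line by a point (the centroid of the selected scan points) and a unit direction (the dominant singular vector of the centered points); this is one common choice, not necessarily the exact fitting manner used in the patent.

```python
import numpy as np

def fit_line_3d(points):
    """Fit a 3D line to an (N, 3) array of scan points.

    Returns (p0, d): a point on the line (the centroid of the points) and
    a unit direction vector (the principal direction of the point set).
    """
    points = np.asarray(points, dtype=float)
    p0 = points.mean(axis=0)
    # The first right singular vector of the centered points is the
    # direction of largest spread, i.e. the fitted line direction.
    _, _, vt = np.linalg.svd(points - p0)
    d = vt[0] / np.linalg.norm(vt[0])
    return p0, d
```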

Then, solving a three-dimensional coordinate of a point A under a laser radar coordinate system according to the expression of the first straight line obtained through fitting and the expression of the second straight line obtained through fitting; solving the three-dimensional coordinate of the point B in the laser radar coordinate system according to the expression of the second straight line obtained through fitting and the expression of the fourth straight line obtained through fitting; solving a three-dimensional coordinate of a point C under a laser radar coordinate system according to the expression of the third straight line obtained through fitting and the expression of the fourth straight line obtained through fitting; and solving the three-dimensional coordinate of the point D under the laser radar coordinate system according to the expression of the third straight line obtained by fitting and the expression of the first straight line obtained by fitting.
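
Because two lines fitted from noisy scan points generally do not intersect exactly in 3D, one reasonable way to realize this "solving" step is to take the midpoint of the shortest segment between the two fitted lines as the intersection estimate. This closest-point construction is an assumption of the sketch below; the patent only states that the intersection is solved from the two fitted expressions.

```python
import numpy as np

def line_intersection_3d(p1, d1, p2, d2):
    """Approximate intersection of two 3D lines in point-direction form.

    Solves for the parameters of the closest points on each line and
    returns the midpoint of the connecting segment (e.g. an estimate of
    point A from the first and second fitted lines).
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Closest-point conditions: the connecting segment is orthogonal to d1 and d2.
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    c1 = p1 + t1 * d1
    c2 = p2 + t2 * d2
    return (c1 + c2) / 2.0
```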

By scanning a plurality of groups of calibration patterns with a plurality of scanning lines of the laser radar, a three-dimensional point coordinate set is obtained, and the three-dimensional coordinates of a plurality of groups of points A, B, C, and D in the laser radar coordinate system are determined.

And S370, carrying out external reference calibration on the laser radar according to the three-dimensional coordinates of the characteristic points in the laser radar coordinate system and the three-dimensional coordinates of the characteristic points in the world coordinate system.

For example, determining the pose of the laser radar in a world coordinate system according to the three-dimensional coordinates of a plurality of groups of points A, B, C and D in the laser radar coordinate system and the three-dimensional coordinates of a plurality of groups of points A, B, C and D in the world coordinate system; and determining the pose of the laser radar in the automobile coordinate system according to the pose of the laser radar in the world coordinate system, namely realizing the external reference calibration of the laser radar. For a specific implementation of this operation, reference may be made to the prior art.
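
The patent refers to the prior art for this pose estimation. As an illustrative sketch only, the classical SVD (Kabsch-style) alignment below estimates a rotation and translation mapping the points expressed in the lidar frame onto the same points expressed in the world frame; the patent does not prescribe this particular method.

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. points A, B, C, D
    in the lidar coordinate system and in the world coordinate system
    (N >= 3, not all collinear).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    s = np.eye(3)
    s[2, 2] = np.sign(np.linalg.det(vt.T @ u.T))   # guard against a reflection
    r = vt.T @ s @ u.T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t
```

The resulting (R, t) is the pose of the lidar in the world coordinate system; combining it with the previously established pose of the automobile coordinate system relative to the world coordinate system gives the pose of the lidar in the automobile coordinate system, i.e. the lidar external parameters.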

Because the first straight line, the second straight line, the third straight line, and the fourth straight line are each fitted from a plurality of points, the accuracy of the fitted straight-line expressions is improved; this in turn improves the accuracy of the three-dimensional coordinates of the feature points solved from the fitted expressions, and therefore the accuracy of the external parameters of the laser radar.

And S380, judging whether the external reference calibration error of the laser radar meets the requirement, if so, executing S390, and otherwise, executing S320 again.

The external reference calibration precision and stability of the laser radar are critical to the automatic driving algorithm, so that if the external reference calibration error of the laser radar does not meet the requirement, the operation returns to S320, and the operation from S320 to S380 is executed again until the external reference calibration error of the laser radar meets the requirement.

And S390, outputting and storing the external reference calibration result of the laser radar.

For example, the external reference calibration result of the laser radar is displayed and stored in the storage unit.

Optionally, the relative pose of the lidar and other sensors may also be determined from the external parameters of the lidar.

For example, the relative pose between the lidar and the front camera is solved according to the external parameters of the lidar and the external parameters of the front camera on the vehicle.

It will be appreciated that, in the calibration plate shown in fig. 2, the reflective area used for calibrating the lidar external parameters may have other shapes, for example a quadrilateral.

It is to be understood that the scenario shown in fig. 1 is merely an example, and that more or fewer devices may be included in a scenario to which the sensor calibration method shown in fig. 3 may be applied.

For example, the scene shown in fig. 1 may not have binocular cameras, and accordingly, the car coordinate system may be established according to other ways.

For example, in the scenario shown in fig. 1, a front camera may also be mounted on the vehicle. Under the condition that a front camera is installed on a vehicle, if the internal reference and the external reference of the front camera are calibrated, the external reference calibration of the laser radar can be carried out according to the internal reference and the external reference of the camera; if the internal reference and the external reference of the front camera are not calibrated, the internal reference and the external reference of the camera and the external reference of the laser radar can be calibrated synchronously.

The internal reference of the camera is a set of parameters related to the characteristics of the camera itself, determined by the optical lens and the photoelectric sensor in the camera and their installation positions. The internal reference of the camera may include: focal length, pixel size, optical distortion, white balance parameters, resolution, contrast, and/or vignetting, etc.
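
For concreteness, the geometric part of the internal reference is commonly written as a 3x3 intrinsic matrix plus a vector of distortion coefficients. The sketch below only illustrates this widespread parameterization; the numeric values are placeholders, not values taken from the patent.

```python
import numpy as np

# Pinhole intrinsic matrix: fx, fy are focal lengths in pixels,
# (cx, cy) is the principal point. All values below are placeholders.
fx, fy, cx, cy = 1000.0, 1000.0, 640.0, 360.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Radial/tangential distortion coefficients (k1, k2, p1, p2, k3), placeholders.
dist = np.array([0.0, 0.0, 0.0, 0.0, 0.0])
```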

The camera external reference is a set of parameters describing the camera in the world coordinate system, determined by how the camera is placed, and includes, for example, the position and the orientation (rotation) of the camera in the world coordinate system.

In the scenario where the internal reference and the external reference of the front camera have been calibrated in advance and the external reference of the laser radar is calibrated according to the internal reference and the external reference of the camera, a schematic diagram of the calibration pattern of the calibration plate is shown in fig. 5.

The triangular regions in the calibration board 500 shown in fig. 5 have the same meaning as the triangular regions in the calibration board shown in fig. 2, and are not described in detail here. The circular area in the calibration plate 500 can be made of white reflective material and is used for shooting by a camera; the background is made of black light absorption materials.

The positional relationship of any one circular region and its adjacent circular region in the calibration plate 500 is different from the positional relationship of another arbitrary circular region and its adjacent circular region.

Based on the above application scenario and the calibration board shown in fig. 5, a schematic flow chart of a sensor calibration method according to an embodiment of the present application is described below with reference to fig. 6.

The method shown in fig. 6 may include S610 to S694. It should be understood that these steps or operations are only examples. More or fewer steps or operations may be included in the solution presented in the present application, or variations of the individual operations in fig. 6 may be performed.

S610, pre-storing the positional relationship between the feature points on the calibration board, as well as the camera internal reference and the camera external reference. The camera external reference is the pose of the camera in the automobile coordinate system.

For example, when point A, point B, point C, point D, and the center points of the circular areas in the calibration pattern on the calibration board shown in fig. 5 are taken as the feature points, the positional relationships between point A, point B, point C, point D, and the center points of the circular areas on the calibration board are prestored. Specifically, the relative positions between point A, point B, point C, point D, and the center points of the circular areas may be prestored.

For the purpose of the following description, the point a, the point B, the point C, and the point D are referred to as intersection feature points, and the center point of the circular region is referred to as a circle center feature point.

S620, detecting whether the vehicle enters a designated area, if so, executing S630, otherwise, executing S640. This step may refer to S320.

And S630, detecting the wheel center through a four-wheel vision positioning system, and establishing an automobile coordinate system. This step may refer to S330.

And S640, prompting to adjust the vehicle position.

And S650, determining the three-dimensional coordinates of the scanning point in the laser radar coordinate system. This step can be referred to as S350, and is not described herein.

And S660, determining the three-dimensional coordinates of the intersection point feature point in the laser radar coordinate system according to the three-dimensional coordinates of the scanning point in the laser radar coordinate system.

This step can refer to S360, which is not described herein.

And S670, acquiring a two-dimensional coordinate of the circle center feature point in an image coordinate system through the image shot by the camera.

For example, a calibration pattern is captured by a camera to obtain an image containing a plurality of circular areas. And fitting each circular area in the image to obtain the two-dimensional coordinates of the circle center characteristic points in the image coordinate system.
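
The patent does not fix a particular fitting method for the circular areas. Purely as an illustrative sketch, the circle centers could be extracted roughly as follows; the Otsu thresholding and the area filter are assumptions about how the white circles on the black background might be segmented.

```python
import cv2
import numpy as np

def detect_circle_centers(image_bgr, min_area=50.0):
    """Return an (N, 2) array of circle-center pixel coordinates."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # White reflective circles on a black light-absorbing background.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4.x convention: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue                      # drop small speckles
        (x, y), _radius = cv2.minEnclosingCircle(c)
        centers.append((x, y))
    return np.array(centers, dtype=float)
```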

And S680, determining the three-dimensional coordinates of the intersection point feature points in the camera coordinate system according to the two-dimensional coordinates of the circle center feature points in the image coordinate system and the position relationship between the feature points.

For example, two-dimensional coordinates of the intersection feature point in the image coordinate system are calculated according to two-dimensional coordinates of the circle center feature point in the image coordinate system and the position relationship between the feature points, and the two-dimensional coordinates of the intersection feature point in the image coordinate system are converted into three-dimensional coordinates in the camera coordinate system.
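
One way to perform this conversion, given the camera intrinsics and the known planar layout of the calibration board, is to estimate the board's pose from the circle centers with a PnP solver and then map the board coordinates of the intersection feature points into the camera frame. The sketch below is only an illustration under those assumptions; the patent does not prescribe PnP.

```python
import cv2
import numpy as np

def intersection_points_in_camera(board_circle_xyz, image_circle_uv,
                                  board_corner_xyz, K, dist):
    """Camera-frame 3D coordinates of the intersection feature points.

    board_circle_xyz: (N, 3) circle centers in the calibration-board frame.
    image_circle_uv:  (N, 2) matching circle centers in the image.
    board_corner_xyz: (M, 3) points A, B, C, D in the calibration-board frame.
    K, dist: camera intrinsic matrix and distortion coefficients.
    """
    ok, rvec, tvec = cv2.solvePnP(board_circle_xyz.astype(np.float32),
                                  image_circle_uv.astype(np.float32), K, dist)
    assert ok, "PnP failed"
    R, _ = cv2.Rodrigues(rvec)
    # Map board-frame points into the camera frame: X_cam = R @ X_board + t.
    return (R @ board_corner_xyz.T).T + tvec.reshape(1, 3)
```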

And S690, performing external reference calibration on the laser radar according to the three-dimensional coordinates of the intersection point characteristic point in the camera coordinate system and the three-dimensional coordinates of the intersection point characteristic point in the laser radar coordinate system.

For example, the relative pose of the laser radar and the camera is determined according to the three-dimensional coordinates of the intersection point characteristic point in the camera coordinate system and the three-dimensional coordinates of the intersection point characteristic point in the laser radar coordinate system; and determining the pose of the laser radar under the automobile coordinate system according to the external parameters of the camera and the relative poses of the laser radar and the camera, namely realizing the external parameter calibration of the laser radar.

And S692, judging whether the external reference calibration error of the laser radar meets the requirement, if so, executing S694, and otherwise, executing S620 again.

The external reference calibration precision and stability of the laser radar are critical to the automatic driving algorithm, so that if the external reference calibration error of the laser radar does not meet the requirement, the operation returns to S620, and S620 to S692 are executed again until the external reference calibration error of the laser radar meets the requirement.

And S694, outputting and storing an external reference calibration result of the laser radar.

For example, the external reference calibration result of the laser radar is displayed and stored in the storage unit.

It is understood that in the calibration plate shown in fig. 5, the light-reflecting region for calibrating the external parameter of the laser radar may have other shapes, for example, a quadrilateral shape; the reflective area for camera imaging may be other shapes such as triangle, quadrangle or checkerboard.

In the embodiment of the application, the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points. Therefore, no matter whether the camera captures all of the patterns on the calibration board or only some of them, which feature points on the calibration board appear in the captured image can be determined according to the positional relationship between the feature points, and the three-dimensional coordinates of the intersection feature points in the camera coordinate system can then be determined accordingly. When the camera shoots the calibration plate, the distance between the camera and the calibration plate is not limited, so automatic calibration of the lidar external parameters can be achieved.

In a scene where the internal reference and the external reference of the front camera have not been calibrated, and the internal reference and the external reference of the camera are calibrated synchronously with the external reference of the laser radar, a schematic diagram of the calibration pattern of the calibration plate is shown in fig. 5.

Based on the above scenario and the calibration board shown in fig. 5, a schematic flow chart of a sensor calibration method according to an embodiment of the present application is described below with reference to fig. 7.

The method shown in fig. 7 may include S710 to S796. It should be understood that these steps or operations are only examples. More or fewer steps or operations may be included in the solution presented in the present application, or variations of the individual operations in fig. 7 may be performed.

S710, pre-storing three-dimensional coordinates of the feature points on the calibration board in a world coordinate system and the position relation among the feature points.

For example, when the point a, the point B, the point C, the point D, and the center point of the circular area in the calibration pattern on the calibration board shown in fig. 5 are taken as the feature points, three-dimensional coordinates of the point a, the point B, the point C, the point D, and the center point of the circular area on the calibration board in the world coordinate system are prestored, and a positional relationship between the center points of the circular area is prestored.

For the purpose of the following description, the point a, the point B, the point C, and the point D are referred to as intersection feature points, and the center point of the circular region is referred to as a circle center feature point.

S720, detecting whether the vehicle enters the designated area, if so, executing S730, otherwise, executing S740. This step may refer to S320.

And S730, detecting the centers of the wheels through a four-wheel vision positioning system, and establishing an automobile coordinate system. This step may refer to S330.

And S740, prompting to adjust the vehicle position.

And S750, determining the three-dimensional coordinates of the scanning point in the laser radar coordinate system. This step may refer to S350.

And S760, determining the three-dimensional coordinates of the intersection point characteristic points in the laser radar coordinate system according to the three-dimensional coordinates of the scanning points in the laser radar coordinate system. This step can refer to S360, which is not described herein.

And S770, performing external reference calibration on the laser radar according to the three-dimensional coordinates of the intersection point characteristic point in a laser radar coordinate system and the three-dimensional coordinates of the intersection point characteristic point in a world coordinate system. This step may refer to S370.

And S780, acquiring two-dimensional coordinates of the circle center characteristic point in the image coordinate system through the image shot by the camera. This step may refer to S670.

The image taken by the camera may include the entire calibration plate, or may be a portion of the calibration plate.

And S790, calibrating the internal reference and the external reference of the camera according to the two-dimensional coordinates of the circle center characteristic point in the image coordinate system and the three-dimensional coordinates of the circle center characteristic point in the world coordinate system.

Specifically, which feature points on the calibration plate have been captured by the camera is determined according to the positional relationship between the circle center feature points; the three-dimensional coordinates of these feature points in the world coordinate system are selected from the prestored three-dimensional coordinates; and the internal and external parameters of the camera are calibrated according to the coordinates of these feature points in the image coordinate system and their three-dimensional coordinates in the world coordinate system.

For the implementation of calibrating the internal and external parameters of the camera from the coordinates of the feature points in the image and their three-dimensional coordinates in the world coordinate system, reference may be made to the prior art, and details are not repeated here.
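
The patent again defers to the prior art for this step. As a non-authoritative sketch, and assuming OpenCV is available and that several views of the planar board are used so that the intrinsics are well constrained, the joint estimation of intrinsics and per-view extrinsics could look as follows.

```python
import cv2
import numpy as np

def calibrate_from_circle_centers(world_pts_per_view, image_pts_per_view, image_size):
    """Estimate camera intrinsics and per-view extrinsics.

    world_pts_per_view: list of (N, 3) arrays of feature-point world coordinates.
    image_pts_per_view: list of (N, 2) arrays of matching pixel coordinates.
    image_size: (width, height) of the images.
    """
    obj = [np.asarray(p, np.float32) for p in world_pts_per_view]
    img = [np.asarray(p, np.float32).reshape(-1, 1, 2) for p in image_pts_per_view]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj, img, image_size, None, None)
    return rms, K, dist, rvecs, tvecs
```

The per-view extrinsics returned here describe the pose of the world (board) points relative to the camera; converting them into the automobile coordinate system uses the previously established pose of the automobile coordinate system relative to the world coordinate system.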

And S792, judging whether the internal reference calibration error and the external reference calibration error of the camera and the external reference calibration error of the laser radar meet the requirements, if so, executing S794, otherwise, executing S720 again.

The calibration precision and stability of the camera and the laser radar are critical to the automatic driving algorithm, so that if the calibration error of the camera and the laser radar does not meet the requirement, the operation returns to S720, and S720 to S792 are executed again until the calibration error of the camera and the laser radar meets the requirement.

And S794, outputting and storing the calibration results of the camera and the laser radar.

For example, the calibration result is displayed and stored in the storage unit.

Further, the relative pose between the camera and the laser radar can be determined according to the external parameters of the camera and the laser radar.

Of course, the relative poses between the camera and other sensors and between the laser radar and other sensors can also be determined according to the external parameters of the camera and the laser radar.

It is understood that in the calibration plate shown in fig. 5, the light-reflecting region for calibrating the external parameter of the laser radar may have other shapes, for example, a quadrilateral shape; the reflective area for camera imaging may be other shapes such as triangle, quadrangle or checkerboard.

In the embodiment of the application, the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points. Therefore, no matter whether the camera captures all of the patterns on the calibration board or only some of them, which feature points on the calibration board appear in the captured image can be determined according to the positional relationship between the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the prestored coordinates, and the internal and external parameters of the camera can then be determined. When the calibration plate is used to calibrate the camera, the distance between the camera and the calibration plate is not limited and the calibration plate does not need to be moved, so automatic calibration of the camera can be achieved.

In the scenario shown in fig. 1, a schematic diagram of the calibration pattern of the calibration plate is optionally shown in fig. 8. This scenario can be used to calibrate a front-facing camera on a vehicle.

The circular area in the calibration board shown in fig. 8 may be made of a white reflective material for camera shooting; the background is made of black light absorption materials. The position relationship between any circular area and the adjacent circular area is different from the position relationship between another any circular area and the adjacent circular area.

Based on the above scenario and the calibration board shown in fig. 8, a schematic flow chart of a sensor calibration method according to an embodiment of the present application is described below with reference to fig. 9.

The method shown in fig. 9 may include S910 to S980. It should be understood that these steps or operations are only examples. More or fewer steps or operations may be included in the solution presented in the present application, or variations of the individual operations in fig. 9 may be performed.

S910, pre-storing the three-dimensional coordinates of the feature points on the calibration board in a world coordinate system and the position relationship between the feature points.

For example, when the center point of the circular area on the calibration board shown in fig. 8 is used as the feature point, the three-dimensional coordinates of the center point of the circular area on the calibration board in the world coordinate system are pre-stored. For the purposes of the following description, the center point of a circular region is referred to as a circle center feature point.

And S920, detecting whether the vehicle enters a designated area, if so, executing S930, and otherwise, executing S940. This step may refer to S320.

And S930, detecting the centers of the wheels through a four-wheel vision positioning system, and establishing an automobile coordinate system. This step may refer to S330.

And S940, prompting to adjust the vehicle position.

And S950, acquiring two-dimensional coordinates of the circle center feature point in an image coordinate system through the image shot by the camera. This step may refer to S670.

S960, calibrating the internal reference and the external reference of the camera according to the two-dimensional coordinates of the circle center characteristic point in the image coordinate system and the three-dimensional coordinates of the circle center characteristic point in the world coordinate system. This step may refer to S790.

And S970, judging whether the internal reference calibration error and the external reference calibration error of the camera meet the requirements, if so, executing S980, and otherwise, executing S920 again.

The internal and external reference calibration accuracy and stability of the camera are critical to the automatic driving algorithm, so if the calibration error of the camera does not meet the requirement, the operation returns to S920, and S920 to S970 are executed again until the calibration error of the camera meets the requirement.

And S980, outputting and storing the calibration result of the camera.

For example, the calibration result is displayed and stored in the storage unit.

Further, the relative pose between the camera and other sensors can be determined according to external parameters of the camera.

It is understood that, in the calibration plate shown in fig. 8, the reflective areas used for camera imaging may have other shapes, such as triangles, quadrangles, or a checkerboard pattern.

In the embodiment of the application, the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points. Therefore, no matter whether the camera captures all of the patterns on the calibration board or only some of them, which feature points on the calibration board appear in the captured image can be determined according to the positional relationship between the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the prestored coordinates, and the internal and external parameters of the camera can then be calibrated. When the calibration plate is used to calibrate the camera, the distance between the camera and the calibration plate is not limited, so automatic calibration of the camera can be achieved.

Fig. 10 is a schematic flow chart of a sensor calibration method provided in the present application. The method may include S1010 to S1040.

S1010, a first point coordinate set is obtained, the first point coordinate set comprises three-dimensional coordinates of scanning points in a laser radar area on a calibration plate under a laser radar coordinate system, and the laser radar calibration area comprises two unparallel sides.

The laser radar is a multi-line laser radar; the multiple scanning lines may be all of the scanning lines of the laser radar, or only some of them.

One or more sets of calibration patterns may be included on the calibration plate, where each set of calibration patterns may include one or more lidar calibration regions.

The laser radar calibration area is a reflective area used for calibrating the laser radar. For example, the laser radar calibration area is made of a material that reflects infrared light in the working frequency band of the laser radar. The laser radar calibration area may be of any shape.

The lidar region includes a plurality of edges, wherein at least two of the edges are non-parallel, or wherein straight lines on which at least two of the edges lie intersect. For example, the laser radar area is a triangle or a quadrangle.

When there are multiple sets of calibration patterns on the calibration plate, the multiple sets of calibration patterns may be the same or different. When the calibration pattern includes a plurality of lidar calibration areas, the shapes of the plurality of lidar calibration areas may be the same or different.

An example of a calibration plate comprising a plurality of sets of calibration patterns, each set comprising a plurality of lidar calibration areas, is shown in fig. 2 or 5.

And scanning the laser radar area on the calibration plate by a plurality of scanning lines of the laser radar to generate a plurality of scanning points. For convenience of description, a set of three-dimensional coordinates of the plurality of scanning points in the lidar coordinate system is referred to as a first point coordinate set.

The origin of the lidar coordinate system is usually the center of the lidar, and the x-axis direction of the lidar coordinate system usually points to the opposite direction of an output cable of the lidar; if the laser radar is installed in a manner of pointing to the front of the automobile, the y-axis direction of the laser radar coordinate system usually points to the left side of the automobile; the z-axis of the lidar coordinate system is typically pointed skyward.

S1020, performing straight line fitting according to the first point coordinate set to obtain an expression of a plurality of straight lines, wherein the straight lines comprise straight lines where the two non-parallel sides are located.

In some possible designs, when performing straight line fitting according to the first point coordinate set, the three-dimensional coordinates of the start scanning point and the end scanning point of each scanning line as it crosses the laser radar calibration area between the two non-parallel sides may be selected. Since multiple scanning lines cross the laser radar calibration area, the three-dimensional coordinates of a plurality of start scanning points and a plurality of end scanning points are obtained. Straight line fitting is then performed according to the three-dimensional coordinates of the start scanning points and the end scanning points.
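
A minimal sketch of this selection is given below. It assumes the scan points are already grouped by scan-line index and restricted to one calibration region, and that each scan line progresses monotonically along the y axis of the lidar frame; both assumptions are for illustration only.

```python
import numpy as np

def start_end_points(points_xyz, line_ids):
    """Pick the start and end scan point of each scan line inside one region.

    points_xyz: (N, 3) lidar-frame coordinates of the scan points in the region.
    line_ids:   (N,) integer scan-line index of each point.
    Returns two lists of (3,) points: start points and end points, one per line.
    """
    starts, ends = [], []
    for lid in np.unique(line_ids):
        pts = points_xyz[line_ids == lid]
        order = np.argsort(pts[:, 1])     # sort along the assumed scan direction (y)
        starts.append(pts[order[0]])
        ends.append(pts[order[-1]])
    return starts, ends
```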

For example, a set of calibration patterns on the calibration board includes the first area as shown in fig. 4. When the laser radar scans from the first edge to the second edge of the first area, the scanning points marked by the five-pointed star are the start scanning points, and the scanning points marked by the four-pointed star are the end scanning points. In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the laser radar coordinate system, of the two points marked by the four-pointed star in the first area, and the expression of the second straight line is fitted according to the three-dimensional coordinates of the two points marked by the five-pointed star in the first area.

For another example, a set of calibration patterns on the calibration board includes a first area and a second area as shown in fig. 4. When the scanning direction of the laser radar is from the first edge to the second edge of the first area, the scanning points marked by the five-pointed star in the first area are the start scanning points and the points marked by the four-pointed star are the end scanning points; in the second area, the points marked by the five-pointed star are the end scanning points and the points marked by the four-pointed star are the start scanning points. In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the laser radar coordinate system, of the four points marked by the four-pointed star in the first area and the second area, and the expression of the second straight line is fitted according to the three-dimensional coordinates of the four points marked by the five-pointed star in the first area and the second area.

In this implementation, because more points are used to fit each straight line, the accuracy of the fitted expressions can be improved; this in turn improves the accuracy of the coordinates of the laser radar feature points solved from the expressions in the laser radar coordinate system, and therefore the accuracy of the calibration result of the laser radar.

For another example, a set of calibration patterns on the calibration board includes a first region, a second region, a third region, and a fourth region as shown in fig. 4. When the scanning direction of the laser radar is from the first edge to the second edge of the first region, the scanning points marked by the five-pointed star in the first region are the start scanning points and the points marked by the four-pointed star are the end scanning points; in the second region, the points marked by the five-pointed star are the end scanning points and the points marked by the four-pointed star are the start scanning points; in the third region, the points marked by the seven-pointed star are the start scanning points and the points marked by the six-pointed star are the end scanning points; and in the fourth region, the points marked by the six-pointed star are the start scanning points and the points marked by the seven-pointed star are the end scanning points.

In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the laser radar coordinate system, of the four points marked by the four-pointed star in the first region and the second region; the expression of the second straight line is fitted according to the three-dimensional coordinates of the four points marked by the five-pointed star in the first region and the second region; the expression of the third straight line is fitted according to the three-dimensional coordinates of the four points marked by the seven-pointed star in the third region and the fourth region; and the expression of the fourth straight line is fitted according to the three-dimensional coordinates of the four points marked by the six-pointed star in the third region and the fourth region.

In this implementation, for a given increase in the number of scanning points, more intersection feature points can be obtained, so the accuracy of the calibration result of the laser radar can be further improved.

It is to be understood that the implementation of the straight line fitting according to the first point coordinate set is not limited to the above manner. For example, when a scanning line crosses the laser radar calibration area between the two non-parallel sides, the average z coordinate of all of its scanning points may be used as the z coordinate of the start scanning point and the end scanning point; alternatively, the spacing between the scanning points along the y axis may be computed, and the y coordinate of the start scanning point or the end scanning point may be obtained by adding or subtracting a certain number of such spacings to the y coordinate of the scanning point located in the middle of that scanning line.

In this step, after the three-dimensional coordinates of the starting scanning point and the ending scanning point in the laser radar coordinate system are determined, when linear fitting is performed according to the three-dimensional coordinates of these points, the linear fitting may be performed in various ways, for example, by a least square three-dimensional linear fitting method or a linear fitting method of spatial three-dimensional scattered point data.

And S1030, estimating the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system according to the expressions of the straight lines.

The laser radar calibration point is a point on the calibration plate for calibrating the external reference of the laser radar. Specifically, the lidar calibration point includes an intersection point of straight lines where the two non-parallel sides of the lidar calibration area are located. Therefore, the lidar calibration points may also be referred to as intersection feature points.

For example, when the first region in fig. 2 is included on the calibration board, the lidar characteristic point includes point a. For another example, when the calibration board includes the first region, the second region, the third region, and the fourth region in fig. 2, the laser radar feature points include point a, point B, point C, and point D.

And according to the expression obtained by fitting, estimating the three-dimensional coordinates of the laser radar feature points in the laser radar coordinate system, wherein the estimating comprises the following steps: and (3) constructing an equation set consisting of expressions, solving the equation set, and taking the obtained coordinates as the three-dimensional coordinates of the laser radar characteristic points in the laser radar coordinate system.

For example, when the calibration board includes the first region in fig. 2, or includes the first region and the second region in fig. 2, the equation set formed by the expression of the first straight line and the expression of the second straight line is solved, and the obtained coordinates are the three-dimensional coordinates of the point a in the lidar coordinate system.

For another example, when the calibration board includes the first area, the second area, the third area, and the fourth area in fig. 2, an equation set formed by expressions of the first straight line, the second straight line, the third straight line, and the fourth straight line is solved, and the obtained coordinates are three-dimensional coordinates of the point a, the point B, the point C, and the point D in the laser radar coordinate system.

And S1040, performing external reference calibration on the laser radar according to the three-dimensional coordinate of the laser radar calibration point in the laser radar coordinate system.

In some designs, as shown in fig. 11, performing external reference calibration on the lidar based on three-dimensional coordinates of the lidar calibration point in the lidar coordinate system may include: and S1041, performing external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system and the three-dimensional coordinates of the laser radar calibration point in the world coordinate system. One example is the method shown in fig. 3.

The three-dimensional coordinates of the laser radar calibration point in the world coordinate system can be measured in advance with a total station. For the implementation of performing the external reference calibration of the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system and its three-dimensional coordinates in the world coordinate system, reference may be made to the prior art, and details are not repeated here.

In this design, optionally, as shown in fig. 12, it may further include: s1050 and S1060.

And S1050, acquiring a second point coordinate set, wherein the second point coordinate set comprises coordinates of the camera calibration points in an image coordinate system of the camera.

The camera calibration points comprise points in a camera calibration area on the calibration plate. The camera and the laser radar are located on the same intelligent device. The image shot by the camera can comprise all camera calibration areas on the calibration board, and can also comprise part of camera calibration areas on the calibration board.

The camera calibration area is an area for calibrating the camera external reference or calibrating the camera internal reference and the camera external reference. The camera calibration area can be made of white reflective materials. The camera calibration area can be in any shape, for example, it can be in a circle, triangle or checkerboard shape.

The calibration board usually includes a plurality of camera calibration regions, wherein the positional relationship between any one camera calibration region and its adjacent region is different from the positional relationship between any another camera calibration region and its adjacent region.

The camera calibration points are located in the camera calibration area, and therefore, the positional relationship between the camera calibration areas can also be understood as the positional relationship between the camera calibration points.

For example, the positional relationship between the center point of any one camera calibration region and the center points of its adjacent regions is different from the positional relationship between the center point of any other camera calibration region and the center points of its adjacent regions.

S1060, calibrating the camera according to the second point coordinate set, the position relationship between the camera calibration regions, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.

The calibration of the camera comprises calibrating internal parameters and/or external parameters of the camera.

For example, the coordinates in the second point coordinate set are converted from the image coordinate system to the camera coordinate system according to the camera internal parameters, so as to obtain the three-dimensional coordinates of the camera calibration points in the camera coordinate system; which calibration points on the calibration plate are the camera calibration points captured by the camera is determined according to the positional relationship between the camera calibration points; and the camera external parameters are calibrated according to the three-dimensional coordinates of these calibration points in the world coordinate system and their three-dimensional coordinates in the camera coordinate system. For the specific implementation, reference may be made to the prior art.
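
The patent relies on the uniqueness of each calibration region's positional relationship to its neighbours but does not specify a matching algorithm. Purely as an illustrative sketch, one could describe each point by the sorted distances to its nearest neighbours, normalized by the closest one, and match these signatures between the detected points and the stored board layout. The signature choice and the scale normalization are assumptions; perspective distortion makes pixel-space distance ratios only approximately equal to board-space ones, so this is illustrative only.

```python
import numpy as np

def neighbour_signature(points, k=3):
    """Scale-invariant signature: sorted distances to the k nearest neighbours,
    divided by the distance to the closest one. points: (N, d) array, N > k."""
    pts = np.asarray(points, dtype=float)
    sig = []
    for p in pts:
        d = np.sort(np.linalg.norm(pts - p, axis=1))[1:k + 1]  # skip self (zero distance)
        sig.append(d / d[0])
    return np.array(sig)

def match_points(detected, board, k=3):
    """For each detected point, return the index of the most similar board point."""
    ds = neighbour_signature(detected, k)
    bs = neighbour_signature(board, k)
    return np.argmin(np.linalg.norm(ds[:, None, :] - bs[None, :, :], axis=2), axis=1)
```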

The three-dimensional coordinates of the camera calibration points in the world coordinate system may be obtained by measuring with a total station, and of course may also be measured in other ways, which is not limited in this application.

The internal parameters of the camera can be calibrated in advance or calibrated according to the calibration plate.

For example, the camera shoots the calibration plate from multiple angles to obtain multiple images; according to the positional relationship between the camera calibration points, the coordinates of each camera calibration point can be determined in each of the images; and the internal parameters of the camera are calibrated according to the coordinates of the camera calibration points in the multiple images. For the specific implementation, reference may be made to the prior art.

In the step, the camera is calibrated according to the position relation between the camera calibration areas, so that the automatic calibration of the camera can be realized.

In some designs, performing external reference calibration on the lidar according to three-dimensional coordinates of the lidar calibration point in a lidar coordinate system may include: and carrying out external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system and the external reference of the camera.

For example, as shown in fig. 13, S1040 may include S1042 through S1046.

S1042, a second point coordinate set is obtained, where the second point coordinate set includes coordinates of the camera calibration points on the calibration board in an image coordinate system of the camera.

This step may refer to S1050.

And S1044, carrying out external reference calibration on the laser radar according to the second point coordinate set, the internal and external reference of the camera and the position relation between the camera calibration area and the laser radar calibration area.

In some designs, three-dimensional coordinates of the lidar calibration point in a camera coordinate system of the camera may be determined based on the second point coordinate set, camera internal parameters, and a positional relationship between the camera calibration region and the lidar calibration region; and carrying out external reference calibration on the laser radar according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system, the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system and the external reference of the camera.

For example, the coordinates of the central point of the camera calibration region in the image coordinate system can be determined according to the second point coordinate set; determining the coordinates of the laser radar calibration point in an image coordinate system according to the position relationship between the camera calibration area and the laser radar calibration area and the coordinates of the central point of the camera calibration area in the image coordinate system; and determining the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system according to the coordinates of the laser radar calibration point in the image coordinate system and the internal parameters of the camera.

Or after the coordinates of the central point of the calibration area of the camera in the image coordinate system are determined, converting the coordinates into three-dimensional coordinates in the camera coordinate system according to camera internal parameters; and determining the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system according to the three-dimensional coordinates of the central point of the camera calibration area in the camera coordinate system and the position relationship between the camera calibration area and the laser radar calibration point.

For example, the relative pose of the camera and the laser radar is determined according to the three-dimensional coordinates of the laser radar calibration point in the camera coordinate system and the three-dimensional coordinates of the laser radar calibration point in the laser radar coordinate system; and determining the pose (namely the external reference of the laser radar) of the laser radar in the vehicle coordinate system according to the relative pose and the pose (namely the external reference of the camera) of the camera in the vehicle coordinate system.
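
As a compact illustration of this last composition step, homogeneous 4x4 transforms make it explicit. The frame-naming convention T_a_b, meaning "maps coordinates in frame b to frame a", is an assumption of this sketch.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def lidar_extrinsics(T_veh_cam, T_cam_lidar):
    """Lidar pose in the vehicle frame from the camera external parameters
    (camera -> vehicle) and the camera-lidar relative pose (lidar -> camera):
    T_veh_lidar = T_veh_cam @ T_cam_lidar."""
    return T_veh_cam @ T_cam_lidar
```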

The application also provides a sensor calibration method. FIG. 14 is a schematic flow chart diagram of a sensor calibration method according to an embodiment of the present application. The sensor calibration method shown in fig. 14 may include S1410 and S1420.

S1410, acquiring a second point coordinate set, where the second point coordinate set includes coordinates of the camera calibration points in an image coordinate system of the camera. This step may refer to S1050.

And S1420, calibrating the camera according to the second point coordinate set, the position relation among the camera calibration areas and the three-dimensional coordinates of the camera calibration point in a world coordinate system. This step may refer to S1060.

In the embodiment of the application, no matter whether the camera captures all of the patterns on the calibration board or only some of them, which feature points on the calibration board appear in the captured image can be determined according to the positional relationship between the feature points, so that the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the prestored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. When the calibration plate is used to calibrate the camera, the distance between the camera and the calibration plate is not limited and the calibration plate does not need to be moved, so automatic calibration of the camera can be achieved.

The application also provides a calibration device. The application provides a calibration device includes: one or more lidar calibration areas that include two non-parallel sides. The calibration device may also be referred to as a calibration plate.

One or more sets of calibration patterns may be included on the calibration plate, where each set of calibration patterns may include one or more lidar calibration regions.

The laser radar calibration area is a reflective area used for calibrating the laser radar. For example, the laser radar calibration area is made of a material that reflects infrared light in the working frequency band of the laser radar. The laser radar calibration area may be of any shape.

The lidar region includes a plurality of edges, wherein at least two of the edges are non-parallel, or wherein straight lines on which at least two of the edges lie intersect. For example, the laser radar area is a triangle or a quadrangle. For convenience of description, two sides of the laser radar area that are not parallel are referred to as a first side and a second side, respectively.

When there are multiple sets of calibration patterns on the calibration plate, the multiple sets of calibration patterns may be the same or different. When the calibration pattern includes a plurality of lidar calibration areas, the shapes of the plurality of lidar calibration areas may be the same or different.

An example of a calibration plate comprising a plurality of sets of calibration patterns, each set comprising a plurality of lidar calibration areas, is shown in fig. 2 or 5.

The laser radar calibration area comprises at least two nonparallel edges, and the intersection point of straight lines where the nonparallel edges are located can be used as a laser radar calibration point to realize the calibration of the laser radar.

In some designs, among the plurality of lidar calibration areas of a set of calibration patterns, a first edge of one lidar calibration area and a first edge of another lidar calibration area are located on the same straight line. For convenience of description, the former lidar calibration area is referred to as the first lidar calibration area, the latter is referred to as the second lidar calibration area, and the straight line is referred to as the first straight line. The second edge of the first lidar calibration area and the second edge of the second lidar calibration area are located on the same straight line, which is referred to as the second straight line for descriptive purposes.

In the design, the edges of a plurality of laser radar areas are located on the same straight line, so that the straight line can be fitted according to scanning points near the edges, the accuracy of the fitted straight line can be improved, and the accuracy of laser radar calibration points is improved.

In this design, optionally, the intersection of the first line and the second line is a vertex of the first lidar calibration area. For convenience of description, an intersection of the first straight line and the second straight line is referred to as a first intersection. The first intersection point is simultaneously a vertex of the second laser radar calibration area.

That is, different lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration plate more compact. This makes it possible to lay out more calibration areas on a calibration plate of the same size, or to lay out calibration areas of the same total size on a smaller calibration plate.

In some designs, in the plurality of lidar calibration areas, a first edge of a third lidar calibration area and a second edge of a fourth lidar calibration area are located on a third straight line, and a second edge of the third lidar calibration area and a first edge of the fourth lidar calibration area are located on a fourth straight line; wherein the first line is not parallel to the third line and the second line is not parallel to the fourth line.

In this design, because the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line, more intersection points, and therefore more lidar calibration points, can be obtained for the same number of lidar calibration areas, which improves the calibration precision of the lidar.

Optionally, a second intersection point of the second straight line and the fourth straight line is another vertex of the first lidar calibration area, the second intersection point is one vertex of the third lidar calibration area, a third intersection point of the third straight line and the fourth straight line is another vertex of the third lidar calibration area, the third intersection point is one vertex of the fourth lidar calibration area, a fourth intersection point of the first straight line and the third straight line is another vertex of the fourth lidar calibration area, and the fourth intersection point is another vertex of the second lidar calibration area.

Here again, different lidar calibration areas share the same lidar calibration point as a vertex, which keeps the layout of the lidar calibration areas on the calibration plate compact, with the same benefits described above.

An example of the design is shown in fig. 2, in which the first area, the second area, the third area, and the fourth area correspond to the first lidar calibration area, the second lidar calibration area, the third lidar calibration area, and the fourth lidar calibration area, respectively.

Another calibration apparatus provided by the present application comprises a plurality of camera calibration areas, wherein the positional relationship between any one camera calibration area and its adjacent camera calibration areas is different from the positional relationship between any other camera calibration area and its adjacent camera calibration areas.

The camera calibration area is an area used for calibrating the camera external reference, or for calibrating both the camera internal reference and the camera external reference. The camera calibration area can be made of a white reflective material and can have any shape, for example a circle, a triangle, or a checkerboard pattern.

The calibration plate usually includes a plurality of camera calibration areas, wherein the positional relationship between any one camera calibration area and its adjacent areas is different from the positional relationship between any other camera calibration area and its adjacent areas.

A point in the camera calibration area may be used as a camera calibration point; for example, the center point of the camera calibration area may be used as the camera calibration point. The positional relationship between camera calibration areas can therefore also be understood as the positional relationship between camera calibration points.

In the embodiments of the application, regardless of whether the camera captures all the patterns on the calibration plate or only some of them, the feature points in the captured image can be matched to the feature points on the calibration plate according to the positional relationship between the feature points. The three-dimensional coordinates of these feature points in the world coordinate system can then be looked up from the pre-stored three-dimensional coordinates, and the internal and external references of the camera can be determined from the correspondences. When this calibration plate is used to calibrate the camera, the distance between the camera and the calibration plate is not restricted and the calibration plate does not need to be moved, so automatic calibration of the camera can be realized.
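A minimal sketch of the pose-recovery step, assuming OpenCV is available and that the pre-stored three-dimensional coordinates have already been matched to the detected image points by their positional relationship (the function name estimate_camera_extrinsics is an assumption, not part of this application):

```python
import numpy as np
import cv2  # OpenCV is an assumption; any PnP solver could be used instead

def estimate_camera_extrinsics(object_points, image_points,
                               camera_matrix, dist_coeffs):
    """Recover the camera pose from matched calibration points.

    object_points: (N, 3) pre-stored world coordinates of camera calibration
                   points, identified by their positional relationship.
    image_points:  (N, 2) pixel coordinates of the same points in the image.
    camera_matrix, dist_coeffs: the camera internal reference.
    Requires at least four non-degenerate correspondences.
    Returns a 3x3 rotation matrix and a translation vector (world -> camera).
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP failed: check the point correspondences")
    rotation, _ = cv2.Rodrigues(rvec)  # axis-angle -> rotation matrix
    return rotation, tvec
```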

An example of this design is shown in fig. 8.

In some designs, the calibration apparatus may further include the lidar calibration areas of one of the aforementioned calibration apparatuses. For details of the lidar calibration areas, reference is made to the foregoing description, which is not repeated here. An example of this design is shown in fig. 5.

The application also provides a sensor calibration device. The sensor calibration device comprises means for performing the respective operations in the sensor calibration method described in fig. 3, 6, 7, 9, 10, 11, 12, 13 or 14. Alternatively, the sensor calibration device includes a plurality of modules, each module being configured to implement a corresponding process in the method described in fig. 3, 6, 7, 9, 10, 11, 12, 13 or 14.

Fig. 15 is a schematic structural diagram of a sensor calibration apparatus 1500 according to an embodiment of the present application. The sensor calibration apparatus 1500 may include an acquisition module 1510, a fitting module 1520, an estimation module 1530, and a calibration module 1540.

In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 is configured to perform S1020, the estimating module 1530 is configured to perform S1030, and the calibrating module 1540 is configured to perform S1040, so as to implement the sensor calibration method shown in fig. 10.

In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 is configured to perform S1020, the estimating module 1530 is configured to perform S1030, and the calibrating module 1540 is configured to perform S1041, so as to implement the sensor calibration method shown in fig. 11.

In some designs, the obtaining module 1510 is configured to perform S1010 and S1050, the fitting module 1520 is configured to perform S1020, the estimating module 1530 is configured to perform S1030, and the calibrating module 1540 is configured to perform S1041 and S1060, so as to implement the sensor calibration method shown in fig. 12.

In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 is configured to perform S1020, the estimating module 1530 is configured to perform S1030, and the calibrating module 1540 is configured to perform S1042 and S1044, so as to implement the sensor calibration method shown in fig. 13.

It is to be understood that the specific processes of the modules for executing the corresponding steps are already described in detail in the above method embodiments, and are not described herein again.

It is to be understood that fig. 15 shows only an exemplary division of the structural and functional modules of the sensor calibration device of the present application, and the present application does not limit the specific division thereof.

Fig. 16 is a deployment schematic diagram of a sensor calibration device according to an embodiment of the present application. The sensor calibration device may be deployed in a cloud environment, which is an entity that provides cloud services to users by using basic resources in a cloud computing mode. A cloud environment includes a cloud data center and a cloud service platform; the cloud data center holds a large number of infrastructure resources (including computing, storage, and network resources) owned by the cloud service provider, and its computing resources may include a large number of computing devices (e.g., servers).

The sensor calibration device can be a server in the cloud data center used for calibrating the sensor; it can also be a virtual machine created in the cloud data center for calibrating the sensor; it may also be a software device deployed on a server or a virtual machine in the cloud data center. Such a software device is used for calibrating the sensor and may be deployed on a plurality of servers in a distributed manner, on a plurality of virtual machines in a distributed manner, or on a combination of virtual machines and servers.

As shown in fig. 16, the sensor calibration device is abstracted by the cloud service provider into a sensor calibration cloud service on a cloud service platform and provided to users. After a user purchases the cloud service on the cloud service platform, the cloud environment provides the sensor calibration cloud service to the user by using the sensor calibration device. The user may upload the point coordinate set acquired by the laser radar to be calibrated to the cloud environment through an application program interface (API) or a web interface provided by the cloud service platform. The sensor calibration device receives the point coordinate set and calibrates the laser radar to be calibrated, and the calibration result is returned by the sensor calibration device to the user's terminal, or stored in the cloud environment, for example presented on a web page of the cloud service platform for the user to view.

When the sensor calibration device is a software device, different modules of the sensor calibration device may be deployed in different environments or apparatuses. For example, as shown in fig. 17, one part of the sensor calibration device is deployed in a terminal computing device (e.g., a vehicle, a smart phone, a laptop, a tablet, a personal desktop computer, or a smart camera), and the other part is deployed in a data center (specifically, on a server or a virtual machine in the data center). The data center may be a cloud data center or an edge data center, where an edge data center is a collection of edge computing devices deployed close to the terminal devices.

The parts of the sensor calibration device deployed in different environments or devices cooperate to realize the function of the sensor calibration method. For example, the acquisition module of the sensor calibration device is deployed on a vehicle; after the vehicle acquires the first point coordinate set, it sends the set to a data center over a network; the fitting module, the estimation module, and the calibration module are deployed in the data center and further process the first point coordinate set to obtain the calibration result; the data center then sends the calibration result back to the vehicle.
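A minimal sketch of the vehicle-side upload under the assumption of an HTTP interface (the endpoint URL, the payload format, and the use of the requests library are illustrative assumptions; the application does not specify the transport):

```python
import requests  # illustrative transport only; the application does not prescribe one

# Hypothetical endpoint of the data-center service that hosts the fitting,
# estimation and calibration modules.
CALIBRATION_URL = "https://datacenter.example.com/api/v1/lidar-calibration"

def request_calibration(first_point_set):
    """Vehicle side: upload the first point coordinate set, receive the result."""
    payload = {"points": [[float(x), float(y), float(z)]
                          for (x, y, z) in first_point_set]}
    response = requests.post(CALIBRATION_URL, json=payload, timeout=30)
    response.raise_for_status()
    # Hypothetical response format, e.g. {"rotation": [...], "translation": [...]}
    return response.json()
```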

It can be understood that the present application does not limit which parts of the sensor calibration device are deployed in the terminal computing device and which parts are deployed in the data center; in practice they can be deployed adaptively according to the computing capability of the terminal computing device or the specific application requirement. It is noted that, in an embodiment, the sensor calibration device may also be deployed in three parts: one part in the terminal computing device, one part in an edge data center, and another part in a cloud data center.

When the sensor calibration apparatus is a software apparatus, the sensor calibration apparatus may also be deployed on a computing device in any environment (for example, on a terminal computing device or on a computing device in a data center).

As shown in fig. 18, computing device 1800 includes a bus 1801, a processor 1802, a communication interface 1803, and memory 1804. The processor 1802, memory 1804, and communication interface 1803 communicate via a bus 1801.

The memory 1804 stores executable codes (codes for realizing functions of the modules) included in the sensor calibration apparatus, and the processor 1802 reads the executable codes in the memory 1804 to execute the sensor calibration method. The memory 1804 may also include other software modules required to run processes, such as an operating system. The operating system may be LINUX™, UNIX™, WINDOWS™, and the like.

The present application also provides a computing device 1800 as shown in fig. 18.

The present application also provides a chip that includes a bus 1801, a processor 1802, a communication interface 1803, and a memory 1804. The processor 1802, memory 1804, and communication interface 1803 communicate via a bus 1801. The memory 1804 stores executable codes (codes for realizing functions of the modules) included in the sensor calibration apparatus, and the processor 1802 reads the executable codes in the memory 1804 to execute the sensor calibration method. The memory 1804 may also include other software modules required to run processes, such as an operating system.

The chip may be a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a system on chip (SoC), a Central Processing Unit (CPU), a Network Processor (NP), a digital signal processing circuit (DSP), a Micro Controller Unit (MCU), a Programmable Logic Device (PLD) or other integrated chips.

It should be understood that each module in the present application may also be referred to as a corresponding unit, for example, an acquisition module may also be referred to as an acquisition unit, an estimation module may also be referred to as an estimation unit, and so on.

In implementation, the steps of the respective methods may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in a processor. The software module may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable or electrically erasable programmable memory, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware. To avoid repetition, details are not described here again.

It should be noted that the processor in the embodiments of the present application may be an integrated circuit chip having a signal processing capability. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable or electrically erasable programmable memory, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.

It will be appreciated that the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD). The volatile memory may be a random access memory (RAM), which acts as an external cache.

By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.

The present application further provides a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of any of the preceding method embodiments.

The present application also provides a computer-readable medium having stored program code which, when run on a computer, causes the computer to perform the method of any of the method embodiments described above.

The application also provides a system, which comprises any one of the foregoing sensor calibration devices or computing devices, and the foregoing calibration apparatus.

The descriptions of the flows corresponding to the above-mentioned figures have respective emphasis, and for parts not described in detail in a certain flow, reference may be made to the related descriptions of other flows.

In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may be wholly or partially in the form of a computer program product. The computer program product comprises one or more computer program instructions; when the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are wholly or partially implemented.

The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, or digital subscriber line) or in a wireless manner (e.g., infrared, radio, or microwave). The computer-readable storage medium may be, for example, a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., an SSD).
