Combined calibration method for monocular camera and millimeter wave radar

Document No. 114808 · Published 2021-10-19

This combined calibration method for a monocular camera and a millimeter wave radar was designed by 程德心, 张伟, 胡早阳, 汤戈 and 蔡幼波 on 2021-03-25. The invention provides a combined calibration method for a monocular camera and a millimeter wave radar: first, the millimeter wave radar and the camera are time-synchronized and their spatial data fused; then, based on the spatial data fusion result, the installation angles and installation positions of the millimeter wave radar and the camera are adjusted so that, when the same detection point is detected, the coordinates of the detection point detected by the millimeter wave radar are consistent with those detected by the camera. The method determines the coordinate relation between the camera and the millimeter wave radar mounted on an unmanned vehicle and then calibrates them jointly, improving the calibration precision of both sensors. In addition, the calibration support used by the invention consists of a checkerboard calibration plate and a radar corner reflector fixed on a bracket, and is simple and cheap to manufacture.

1. A combined calibration method for a monocular camera and a millimeter wave radar, characterized by comprising the following steps:

S1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;

S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that when the same detection point is detected, the coordinates of the detection point detected by the millimeter wave radar are consistent with the coordinates of the detection point detected by the camera.

2. The method for the combined calibration of the monocular camera and the millimeter wave radar according to claim 1, wherein the time synchronization of the millimeter wave radar and the monocular camera comprises:

The millimeter wave radar and the monocular camera are given unified GPS timing, so that time synchronization between the millimeter wave radar and the monocular camera is realized.

3. The method for jointly calibrating a monocular camera and a millimeter wave radar according to claim 1, wherein the spatial data fusion of the millimeter wave radar and the monocular camera comprises:

establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image;

and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.

4. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 3, wherein the pixel-level data fusion equation of the millimeter wave radar and the camera is:

s [u, v, 1]^T = K_c (R X + T),   K_c = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]]

where (u, v) are the pixel coordinates of the image point in the camera image; s is the projective scale factor; X is a millimeter wave radar data point; K_c is the internal parameter matrix of the camera, with (u_0, v_0) the image pixel center coordinates and (f_x, f_y) the equivalent focal lengths in the x and y directions; R is the 3 × 3 rotation matrix and T = (t_1, t_2, t_3)^T is the translation vector; R and T respectively represent the installation angle and the installation position of the camera relative to the vehicle, also called the external parameters of the camera.

The installation angles of the camera and the millimeter wave radar are determined by the heading angle γ, the pitch angle δ and the roll angle ξ, as follows:

R = R_z(γ) R_y(δ) R_x(ξ)

where R_z, R_y, R_x are the elementary rotations about the z, y and x axes, and t_1, t_2, t_3 indicate the mounting position of the camera.

5. The method for the combined calibration of the monocular camera and the millimeter wave radar according to claim 4, wherein step S2 specifically includes:

s21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle;

s22, according to the installation position of the camera and the millimeter wave radar, a calibration support is arranged in front of the vehicle, and the calibration support comprises a checkerboard calibration plate and a radar corner reflector which are fixed on the support;

s23, adjusting the installation angle and the installation position of the millimeter wave radar and the camera, and when the millimeter wave radar and the camera detect the same detection point on the calibration support, calculating through a pixel-level data fusion equation to obtain that the coordinates of the detection point detected by the radar are consistent with the coordinates of the detection point detected by the camera.

6. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 5, further comprising, after step S2:

S3, moving the calibration support, repeating step S2, and calibrating the millimeter wave radar and the camera multiple times.

7. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 5, wherein arranging a calibration support in front of the vehicle according to the installation positions of the camera and the millimeter wave radar comprises:

The height difference between the camera and the millimeter wave radar at their installation positions is set to H, and the height difference between the checkerboard calibration plate and the radar corner reflector on the calibration support is likewise set to H.

Correspondingly, the height difference between the detection point of the camera on the checkerboard calibration board and the detection point of the millimeter wave radar on the radar corner reflector is H.

Technical Field

The invention relates to the technical field of sensor calibration, in particular to a combined calibration method of a monocular camera and a millimeter wave radar.

Background

Advanced Driver Assistance Systems (ADAS) and high-level automated driving aim to reduce road traffic accidents and improve driver comfort. Advanced driving assistance must first solve the perception problem, and perception necessarily relies on different sensors, such as cameras, millimeter wave radars and laser radars. When multiple sensors are mounted on a vehicle, the coordinate relations between them must be determined; sensor calibration is therefore a basic requirement of automatic driving and an important basis for judging whether a perception system is correct.

In addition, the relationship between sensor input and output can be determined through sensor calibration. Since a highly automated vehicle driving on the road must plan a safe and fast path by recognizing its surrounding environment, sensor calibration is the foundation of advanced driver assistance and high-level automatic driving.

Disclosure of Invention

In order to solve the above problems, embodiments of the present invention provide a method for jointly calibrating a monocular camera and a millimeter wave radar, which overcomes or at least partially solves the above problems.

The embodiment of the invention provides a combined calibration method for a monocular camera and a millimeter wave radar, which comprises the following steps:

S1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;

S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that when the same detection point is detected, the coordinates of the detection point detected by the millimeter wave radar are consistent with the coordinates of the detection point detected by the camera.

Preferably, the time synchronization of the millimeter wave radar and the monocular camera includes:

The millimeter wave radar and the monocular camera are given unified GPS timing, so that time synchronization between the millimeter wave radar and the monocular camera is realized.

Preferably, the spatial data fusion of the millimeter wave radar and the monocular camera includes:

establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image;

and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.

Preferably, the pixel-level data fusion equation of the millimeter wave radar and the camera is:

s [u, v, 1]^T = K_c (R X + T),   K_c = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]]

where (u, v) are the pixel coordinates of the image point in the camera image; s is the projective scale factor; X is a millimeter wave radar data point; K_c is the internal parameter matrix of the camera, with (u_0, v_0) the image pixel center coordinates and (f_x, f_y) the equivalent focal lengths in the x and y directions; R is the 3 × 3 rotation matrix and T = (t_1, t_2, t_3)^T is the translation vector; R and T respectively represent the installation angle and the installation position of the camera relative to the vehicle, also called the external parameters of the camera.

The installation angles of the camera and the millimeter wave radar are determined by the heading angle γ, the pitch angle δ and the roll angle ξ, as follows:

R = R_z(γ) R_y(δ) R_x(ξ)

where R_z, R_y, R_x are the elementary rotations about the z, y and x axes, and t_1, t_2, t_3 indicate the mounting position of the camera.

Preferably, step S2 specifically includes:

s21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle;

s22, according to the installation position of the camera and the millimeter wave radar, a calibration support is arranged in front of the vehicle, and the calibration support comprises a checkerboard calibration plate and a radar corner reflector which are fixed on the support;

S23, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera so that, when the millimeter wave radar and the camera detect the same detection point on the calibration support, the coordinates of the detection point detected by the radar, obtained through the pixel-level data fusion equation, are consistent with the coordinates of the detection point detected by the camera.

Preferably, after step S2, the method further comprises:

S3, moving the calibration support, repeating step S2, and calibrating the millimeter wave radar and the camera multiple times.

Preferably, arranging the calibration support in front of the vehicle according to the installation positions of the camera and the millimeter wave radar comprises:

The height difference between the camera and the millimeter wave radar at their installation positions is set to H, and the height difference between the checkerboard calibration plate and the radar corner reflector on the calibration support is likewise set to H.

Correspondingly, the height difference between the detection point of the camera on the checkerboard calibration board and the detection point of the millimeter wave radar on the radar corner reflector is H.

Compared with the prior art, the monocular camera and millimeter wave radar combined calibration method provided by the embodiment of the invention has the following beneficial effects:

1) the millimeter wave radar and the camera are subjected to time synchronization and spatial data fusion, the coordinate relation between the camera installed on the unmanned automobile and the millimeter wave radar is determined, then the multi-sensor combined calibration is carried out, and the calibration precision of the camera and the millimeter wave radar is improved.

2) Because the calibration precision of the camera and the millimeter wave radar is improved, the precision of obstacle fusion detection by the camera and the millimeter wave radar can also be improved.

3) The calibration support used by the invention consists of the chessboard pattern calibration plate fixed on the support and the radar corner reflector, and has the advantages of simple manufacture and low cost.

Drawings

In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.

Fig. 1 is a schematic flow chart of a combined calibration method for a monocular camera and a millimeter wave radar according to an embodiment of the present invention;

fig. 2 is a schematic view illustrating an installation of a camera and a millimeter wave radar according to an embodiment of the present invention;

fig. 3 is a schematic structural diagram of a calibration bracket according to an embodiment of the present invention.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.

Advanced driving assistance must first solve the perception problem, and perception necessarily relies on different sensors, such as cameras, millimeter wave radars and laser radars. When multiple sensors are mounted on a vehicle, the coordinate relations between them must be determined; sensor calibration is therefore a basic requirement of automatic driving and an important basis for judging whether a perception system is correct. In addition, the relationship between sensor input and output can be determined through sensor calibration. Since a highly automated vehicle driving on the road must plan a safe and fast path by recognizing its surrounding environment, sensor calibration is the foundation of advanced driver assistance and high-level automatic driving.

Therefore, the invention provides a combined calibration method for a monocular camera and a millimeter wave radar: the millimeter wave radar and the camera are time-synchronized and their spatial data fused, the coordinate relation between the camera and the millimeter wave radar mounted on the unmanned vehicle is determined, and multi-sensor combined calibration is then carried out, improving the calibration precision of the camera and the millimeter wave radar. Various embodiments are described below with reference to the drawings.

Sensor calibration can be divided into two parts: internal parameter (intrinsic) calibration and external parameter (extrinsic) calibration. The internal parameters determine the mapping inside the sensor, such as a camera's focal length, eccentricity and pixel aspect ratio (distortion coefficients); the external parameters determine the transformation between the sensor and an external coordinate system, such as attitude parameters (rotation and translation).
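As a concrete illustration of the intrinsic/extrinsic split described above, the sketch below builds the camera intrinsic matrix and an extrinsic rotation from heading, pitch and roll angles. The z-y-x composition order and the function names are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def intrinsic_matrix(fx, fy, u0, v0):
    """Camera intrinsic matrix K_c: equivalent focal lengths (fx, fy)
    and the image pixel center (u0, v0)."""
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

def rotation_from_angles(heading, pitch, roll):
    """Extrinsic rotation from heading (about z), pitch (about y) and
    roll (about x); the z-y-x order is an assumed convention."""
    cz, sz = np.cos(heading), np.sin(heading)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx
```

Any rotation built this way is orthonormal with determinant 1, which is a quick sanity check when debugging a calibration pipeline.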

The invention addresses the extrinsic calibration between different sensors, specifically between a millimeter wave radar and a monocular camera, that is, the problems of temporal data fusion and spatial data fusion of the two, thereby achieving the purpose of calibration. The "camera" in this application refers to a "monocular camera".

Fig. 1 is a schematic flow chart of a combined calibration method for a monocular camera and a millimeter wave radar according to an embodiment of the present invention, and as shown in fig. 1, the combined calibration method for a monocular camera and a millimeter wave radar according to the embodiment of the present invention includes, but is not limited to, the following steps:

s1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;

and S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that the coordinates of the detection points detected by the millimeter wave radar are consistent with the coordinates of the detection points detected by the camera when the same detection point is detected.

Specifically, the invention synchronizes the millimeter wave radar and the camera in time. The acquisition channels of different sensors differ: for example, the millimeter wave radar acquires data through a CAN bus, while the camera acquires data through a CAN interface or a USB interface. The acquisition periods also differ; the acquisition period of the millimeter wave radar may be 60 ms and that of the camera 50 ms, so the sensors must be synchronized in time. The invention adopts GPS timing: the millimeter wave radar and the camera receive unified time from the GPS, and GPS time is used as the standard to timestamp both sensors, achieving time synchronization. Furthermore, a memory of a certain size is designed as a buffer pool to cache the current millimeter wave radar and camera data, which solves the problem of data lag and guarantees that the data to be processed are the latest synchronized data.
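The buffer-pool pairing described above can be sketched as follows. The tuple layout, the tolerance value and the function name are illustrative assumptions, not the patent's implementation.

```python
def pair_latest(radar_buf, cam_buf, tol):
    """Return the newest (radar_time, cam_time, radar_frame, cam_frame) pair
    whose GPS timestamps differ by at most tol seconds, or None if no pair
    matches. Buffers hold (timestamp, frame) tuples."""
    best = None
    for tr, radar_frame in radar_buf:
        for tc, cam_frame in cam_buf:
            if abs(tr - tc) <= tol and (best is None or tr + tc > best[0] + best[1]):
                best = (tr, tc, radar_frame, cam_frame)
    return best
```

With a 60 ms radar period and a 50 ms camera period, nearby frames drift in and out of alignment, so a tolerance around half the shorter period is a reasonable starting point.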

Further, in step S1, performing spatial data fusion on the millimeter wave radar and the monocular camera, including: establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image; and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.

When the millimeter wave radar and the camera are installed, the millimeter wave radar is rigidly connected to the vehicle; ignoring vehicle vibration, its relative attitude and displacement with respect to the vehicle are fixed, so every data point returned by the millimeter wave radar corresponds to a unique position coordinate in the world coordinate system. Similarly, the monocular camera is rigidly connected to the vehicle with fixed relative attitude and displacement, and every point in world coordinates corresponds to a unique image pixel. Therefore, each millimeter wave radar data point in the shared space has a unique corresponding point in image space.

Therefore, by establishing reasonable millimeter wave radar and camera coordinate systems and using the spatial constraint relation between radar data points and the camera image, the spatial transformation between the two coordinate systems can be solved, completing the spatial fusion of the millimeter wave radar and the camera. The spatial fusion problem is thus converted into a function-fitting problem over corresponding radar and image points.
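The function-fitting view of spatial fusion can be made concrete with a small least-squares sketch. Assuming the intrinsics K and the rotation R are already known, the projection model s[u, v, 1]^T = K(RX + t) is linear in the translation t, so t can be recovered from a few radar/pixel correspondences. The function name and this reduced setup (rotation known, translation fitted) are illustrative assumptions.

```python
import numpy as np

def fit_translation(K, R, radar_pts, pixels):
    """Estimate the translation t from radar points and their image pixels,
    given intrinsics K and rotation R, by linear least squares.
    From u = u0 + fx*(r1.X + t1)/(r3.X + t3), rearranging gives
    fx*t1 - (u - u0)*t3 = (u - u0)*(r3.X) - fx*(r1.X), linear in t."""
    fx, fy = K[0, 0], K[1, 1]
    u0, v0 = K[0, 2], K[1, 2]
    A, b = [], []
    for X, (u, v) in zip(radar_pts, pixels):
        rX = R @ np.asarray(X, dtype=float)
        A.append([fx, 0.0, -(u - u0)])
        b.append((u - u0) * rX[2] - fx * rX[0])
        A.append([0.0, fy, -(v - v0)])
        b.append((v - v0) * rX[2] - fy * rX[1])
    t, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return t
```

Each correspondence contributes two linear rows, so two or more non-degenerate points suffice; in practice more points are used and the residual indicates calibration quality.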

After the external parameters of the camera are solved through the pixel-level data fusion equation, the relation between the millimeter wave radar, the camera and the environment coordinate system is completely determined, so that millimeter wave radar data points can be projected onto the image pixel coordinate system through the camera model. The pixel-level data fusion equation is:

s [u, v, 1]^T = K_c (R X + T),   K_c = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]]

where (u, v) are the pixel coordinates of the image point in the camera image; s is the projective scale factor; X is a millimeter wave radar data point; K_c is the internal parameter matrix of the camera, with (u_0, v_0) the image pixel center coordinates and (f_x, f_y) the equivalent focal lengths in the x and y directions; R is the 3 × 3 rotation matrix and T = (t_1, t_2, t_3)^T is the translation vector; R and T respectively represent the installation angle and the installation position of the camera relative to the vehicle, also called the external parameters of the camera.

The installation angles of the camera and the millimeter wave radar are determined by the heading angle γ, the pitch angle δ and the roll angle ξ, as follows:

R = R_z(γ) R_y(δ) R_x(ξ)

where R_z, R_y, R_x are the elementary rotations about the z, y and x axes, and t_1, t_2, t_3 indicate the mounting position of the camera.
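Projecting a radar data point onto the image pixel plane with the fusion equation can be sketched as follows, assuming K, R and T have already been calibrated; the function name is illustrative.

```python
import numpy as np

def radar_to_pixel(K, R, T, X_radar):
    """Apply s[u, v, 1]^T = K (R X + T): rotate/translate the radar point
    into the camera frame, multiply by the intrinsics, then dehomogenize."""
    Xc = R @ np.asarray(X_radar, dtype=float) + T
    uvw = K @ Xc
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

A point on the optical axis lands exactly on the image pixel center (u_0, v_0), which is a convenient first check of the matrices.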

The calibration of the camera and the millimeter wave radar requires computing, by the above method, a detection point detected by both sensors at the same moment and projecting it onto the pixel plane. Fig. 2 is a schematic view of the installation of the camera and the millimeter wave radar according to an embodiment of the present invention. Referring to fig. 2, in this embodiment the millimeter wave radar is installed in the front bumper of the vehicle and the camera behind the windshield, with a height difference between them. Calibration is considered successful only when multiple radar observation points are found to coincide with the corresponding camera pixel points.

Based on the content of the foregoing embodiment, step S2 specifically includes:

s21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle; as shown with reference to fig. 2.

S22, arranging a calibration support in front of the vehicle according to the installation positions of the camera and the millimeter wave radar, the calibration support comprising a checkerboard calibration plate and a radar corner reflector fixed on the support; fig. 3 is a schematic structural diagram of the calibration support according to an embodiment of the present invention. The height difference between the camera and the millimeter wave radar at their installation positions is H, and the height difference between the checkerboard calibration plate and the radar corner reflector on the calibration support is likewise set to H.

S23, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera so that, when the millimeter wave radar and the camera detect the same detection point on the calibration support, the coordinates of the detection point detected by the radar, obtained through the pixel-level data fusion equation, are consistent with the coordinates of the detection point detected by the camera.

Referring to figs. 2 and 3, the calibration support comprises a checkerboard calibration plate and a radar corner reflector. The checkerboard calibration plate is used to calibrate the camera: according to the pixel-level data fusion equation, the projection of the camera detection point on the checkerboard calibration plate onto the camera pixel plane is found, and at the same time the projection of the radar corner reflector's reflection point onto the camera pixel plane is found. In this embodiment, the height difference between the camera's detection point on the checkerboard calibration plate and the radar's detection point on the corner reflector is made equal to the height difference between the camera and the radar actually mounted on the vehicle, that is, H. This arrangement improves calibration precision and reduces computational difficulty. The installation angles and positions of the camera and the millimeter wave radar are adjusted so that the same detection point coincides on the camera's pixel plane; because the radar corner reflector sits a height H below the checkerboard calibration plate on the support, the height difference H must be subtracted during projection so that the converted radar pixel point coincides with the camera's pixel point.
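The height-offset bookkeeping in this step can be sketched as follows. The sign convention (image-style y axis pointing down, reflector H meters below the board) and all names are assumptions for illustration.

```python
import numpy as np

def points_coincide(K, R, T, radar_pt, cam_px, H, tol=1.0):
    """Shift the radar detection point up by the known height difference H
    (subtract along the downward y axis), project it with the fusion
    equation, and check coincidence with the camera pixel within tol pixels."""
    shifted = np.asarray(radar_pt, dtype=float) - np.array([0.0, H, 0.0])
    uvw = K @ (R @ shifted + T)
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    return abs(u - cam_px[0]) <= tol and abs(v - cam_px[1]) <= tol
```

Forgetting to subtract H leaves a systematic vertical offset in the projected radar points, which is exactly the failure mode this step guards against.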

Further, starting 1 m in front of the vehicle, the calibration support is moved once every 0.5 m, step S2 is repeated, and the millimeter wave radar and the camera are calibrated multiple times according to this method to improve the calibration precision. Calibration tests show that the precision improves significantly after more than 4 calibrations.
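The repetition schedule above (start 1 m ahead, advance 0.5 m per pass, at least 4 passes) can be written down as a small helper; this is purely illustrative and the parameter names are assumptions.

```python
def calibration_positions(start_m=1.0, step_m=0.5, repetitions=5):
    """Distances (in meters) in front of the vehicle at which the calibration
    support is placed; the text reports clear gains from 4+ repetitions."""
    return [start_m + i * step_m for i in range(repetitions)]
```

Each position yields one set of correspondences, and aggregating residuals over all positions gives a more robust extrinsic estimate than any single placement.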

Compared with the prior art, the monocular camera and millimeter wave radar combined calibration method provided by the embodiment of the invention has the following beneficial effects:

1) the time synchronization and the spatial data fusion are carried out on the millimeter wave radar and the camera, the coordinate relation between the camera installed on the unmanned automobile and the millimeter wave radar is determined, the camera and the millimeter wave radar are jointly calibrated, and the calibration precision of the camera and the millimeter wave radar is improved.

2) Because the calibration precision of the camera and the millimeter wave radar is improved, the precision of obstacle fusion detection by the camera and the millimeter wave radar can also be improved.

3) The calibration support used by the invention consists of the chessboard pattern calibration plate fixed on the support and the radar corner reflector, and has the advantages of simple manufacture and low cost.

The embodiments of the present invention can be arbitrarily combined to achieve different technical effects.

Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.

Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
