Laser radar calibration method and device, electronic equipment and storage medium

Document No.: 359125    Publication date: 2021-12-07

Note: This technology, "A laser radar calibration method, apparatus, electronic device and storage medium" (Laser radar calibration method and device, electronic equipment and storage medium), was designed and created by Lin Jinbiao (林金表) and Dong Bo (董博) on 2020-10-10. Its main content is as follows: the embodiments of the invention disclose a laser radar calibration method and apparatus, an electronic device and a storage medium. The method comprises: respectively acquiring point cloud data and image data that are collected at the same time and contain a calibration object; screening out, from the point cloud data, the calibration data belonging to the calibration object, and respectively determining the edge coordinates of each edge under a radar coordinate system according to the calibration data; determining the vertex coordinates of each vertex of the calibration object under the radar coordinate system according to the edge coordinates, and determining the radar coordinates of the calibration point under the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object; and taking the camera coordinates of the calibration point in the camera coordinate system together with its radar coordinates as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on a plurality of groups of feature point pairs. The technical scheme of the embodiments solves the problem that extracting feature points from point cloud data places heavy restrictions on the size and placement position of the calibration object.

1. A laser radar calibration method is characterized by comprising the following steps:

respectively acquiring point cloud data and image data which are acquired at the same time and contain a calibration object, wherein the calibration object is a polygon, each edge of the polygon is intersected with at least two point cloud lines, and the point cloud lines comprise a plurality of point cloud data;

screening calibration data belonging to the calibration object from the point cloud data, and respectively determining the edge coordinate of each edge under a radar coordinate system according to the calibration data;

determining a vertex coordinate of each vertex in the calibration object under the radar coordinate system according to each edge coordinate, and determining a radar coordinate of the calibration point under the radar coordinate system according to each vertex coordinate and a relative position of the calibration point in the calibration object;

and taking the camera coordinates of the calibration points in a camera coordinate system and the radar coordinates as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.

2. The method of claim 1, wherein said screening calibration data pertaining to said calibration object from each of said point cloud data comprises:

screening out plane data belonging to the same plane from the point cloud data, and screening out end point data from a plane set formed by the plane data;

and comparing the first length of the diagonal line formed by the end point data with the second length of the diagonal line corresponding to the calibration object, and screening the calibration data belonging to the calibration object from the point cloud data according to the comparison result.

3. The method of claim 2, wherein said screening the calibration data belonging to the calibration object from the point cloud data according to the comparison result comprises:

if the same plane is determined not to be the plane where the calibration object is located according to the comparison result, the plane data are removed from the point cloud data, and the point cloud data are updated according to the removal result;

repeatedly performing the step of screening out plane data belonging to the same plane from the point cloud data until the same plane is the plane where the calibration object is located;

and taking the plane data as calibration data belonging to the calibration object.

4. The method of claim 1, wherein said determining edge coordinates of each of said edges in a radar coordinate system based on each of said calibration data comprises:

and acquiring a plurality of calibration lines formed by the calibration data, determining edge data of each calibration line in the scanning direction, and respectively determining the edge coordinates of each edge under a radar coordinate system according to the edge data.

5. The method of claim 1, wherein said calibration object comprises a quadrilateral calibration object having a curved edge shape disposed therein that is tangent to at least two of said edges, said calibration point being a center point of said curved edge shape.

6. The method of claim 5, wherein the polygon comprises a square, the curved edge shape comprises a circle, and the circle is tangent to two of the sides connected to the same vertex;

correspondingly, the determining the radar coordinates of the calibration point in the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object includes:

each vertex comprises an upper vertex, a left vertex and a right vertex, and the radar coordinate P_top of the calibration point adjacent to the upper vertex in the radar coordinate system is determined through the following formula:

wherein C_top is the coordinate of the upper vertex in said radar coordinate system, C_left is the coordinate of the left vertex in said radar coordinate system, C_right is the coordinate of the right vertex in said radar coordinate system, w is the length of a first side among the sides, h is the length of a second side, perpendicular to the first side, among the sides, and r is the radius of the circle.

7. The method of claim 1, wherein the calibrating the camera coordinate system and the radar coordinate system based on the plurality of sets of pairs of feature points comprises:

and determining the coordinate transformation relationship from the camera coordinate system to the radar coordinate system based on the plurality of groups of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system according to the coordinate transformation relationship.

8. A laser radar calibration device is characterized by comprising:

the data acquisition module is used for respectively acquiring point cloud data and image data which are acquired at the same time and comprise a calibration object, wherein the calibration object is a polygon, each edge of the polygon is intersected with at least two point cloud lines, and the point cloud lines comprise a plurality of point cloud data;

the edge coordinate determination module is used for screening calibration data belonging to the calibration object from the point cloud data and determining the edge coordinate of each edge under a radar coordinate system according to the calibration data;

the radar coordinate determination module is used for respectively determining the vertex coordinate of each vertex in the calibration object under the radar coordinate system according to each edge coordinate, and determining the radar coordinate of the calibration point under the radar coordinate system according to each vertex coordinate and the relative position of the calibration point in the calibration object;

and the laser radar calibration module is used for taking the camera coordinates of the calibration points in a camera coordinate system and the radar coordinates as a group of characteristic point pairs and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of characteristic point pairs.

9. An electronic device, comprising:

one or more processors;

a memory for storing one or more programs;

when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the lidar calibration method as defined in any one of claims 1-7.

10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a lidar calibration method according to any of claims 1 to 7.

Technical Field

The embodiment of the invention relates to the field of laser radar calibration, in particular to a laser radar calibration method and device, electronic equipment and a storage medium.

Background

With the rapid development of artificial intelligence technology, automation equipment capable of replacing human work, such as unmanned vehicles, unmanned aerial vehicles and manipulators, receives more and more attention. Such equipment is often provided with a plurality of sensors to sense the surrounding environment during operation; since the sensing information of a laser radar and a camera can complement each other, the laser radar and the camera are common sensors on automation equipment.

For automation equipment equipped with both a laser radar and a camera, calibrating the laser radar and the camera is a precondition for fusing the data of the two sensors. The calibration work from the laser radar to the camera consists in determining a three-dimensional rotation matrix and a translation vector that represent the transformation between the laser radar coordinate system and the camera coordinate system; according to this transformation, point cloud data in the laser radar coordinate system can be projected onto the camera coordinate system, thereby realizing the fusion of point cloud data and image data.

The calibration work from the laser radar to the camera that is commonly used at present is mainly accomplished with a calibration object having obvious features: feature points corresponding to those features are extracted from the point cloud data and the image data respectively to form feature point pairs, and the radar coordinate system and the camera coordinate system are then calibrated according to the feature point pairs.

In the process of implementing the invention, the inventors found the following technical problem in the prior art: the existing schemes for extracting feature points from point cloud data place heavy restrictions on the size and placement position of the calibration object, which greatly interferes with the convenience of the calibration work.

Disclosure of Invention

The embodiments of the invention provide a laser radar calibration method, a laser radar calibration apparatus, an electronic device and a storage medium, which solve the problem that the size and placement position of a calibration object are heavily restricted when feature points are extracted from point cloud data.

In a first aspect, an embodiment of the present invention provides a laser radar calibration method, which may include:

respectively acquiring point cloud data and image data which are acquired at the same time and contain a calibration object, wherein the calibration object is a polygon, each edge of the polygon intersects at least two point cloud lines, and each point cloud line comprises a plurality of point cloud data; screening calibration data belonging to the calibration object from the point cloud data, and respectively determining the edge coordinates of each edge under a radar coordinate system according to the calibration data; determining the vertex coordinates of each vertex in the calibration object under the radar coordinate system according to the edge coordinates, and determining the radar coordinates of the calibration point under the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object; and taking the camera coordinates of the calibration point in the camera coordinate system together with its radar coordinates as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on a plurality of groups of feature point pairs.

Optionally, screening the calibration data belonging to the calibration object from the point cloud data may include:

screening out plane data belonging to the same plane from the point cloud data, and screening out endpoint data from a plane set formed by the plane data; and comparing the first length of the diagonal formed by the endpoint data with the second length of the corresponding diagonal of the calibration object, and screening the calibration data belonging to the calibration object from the point cloud data according to the comparison result.
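The diagonal comparison above can be sketched in code. The sketch below is only illustrative, not taken from the patent; the function name and the tolerance parameter `tol_m` are assumptions:

```python
import numpy as np

def is_calibration_plane(endpoints, known_diagonal_m, tol_m=0.05):
    """Check whether a candidate plane's extreme points match the
    calibration object's known diagonal length (tolerance is illustrative)."""
    pts = np.asarray(endpoints, dtype=float)
    # first length: the longest pairwise distance among the extracted end points
    diffs = pts[:, None, :] - pts[None, :, :]
    first_length = np.linalg.norm(diffs, axis=-1).max()
    # compare against the second length (the calibration object's diagonal)
    return abs(first_length - known_diagonal_m) <= tol_m
```

A plane whose extreme points span a different diagonal (e.g. a wall behind the board) would fail this check and be rejected.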

Optionally, screening the calibration data belonging to the calibration object from the point cloud data according to the comparison result may include:

if it is determined according to the comparison result that the same plane is not the plane where the calibration object is located, removing the plane data from the point cloud data and updating the point cloud data according to the removal result; repeatedly executing the step of screening out plane data belonging to the same plane from the point cloud data until the same plane is the plane where the calibration object is located; and taking the plane data as the calibration data belonging to the calibration object.
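A minimal sketch of this remove-and-repeat loop, assuming caller-supplied `segment_plane` and `is_target_plane` callbacks (both hypothetical — the patent does not specify a plane-segmentation algorithm):

```python
import numpy as np

def find_calibration_plane(points, segment_plane, is_target_plane):
    """Repeatedly segment the dominant plane; if it is not the calibration
    object's plane, remove its inliers and try again.

    segment_plane(points)       -> boolean inlier mask of the dominant plane
    is_target_plane(plane_pts)  -> True if this is the calibration object
    """
    pts = np.asarray(points, dtype=float)
    while len(pts) > 0:
        mask = segment_plane(pts)
        plane_pts = pts[mask]
        if is_target_plane(plane_pts):
            return plane_pts          # calibration data
        pts = pts[~mask]              # remove wall/floor plane and retry
    return None
```

In practice `segment_plane` could be a RANSAC plane fit; the loop terminates because each iteration strictly shrinks the point set.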

Optionally, respectively determining the edge coordinates of each edge under the radar coordinate system according to the calibration data may include:

and acquiring a plurality of calibration lines formed by the calibration data, determining edge data of each calibration line in the scanning direction, and determining edge coordinates of each edge in a radar coordinate system according to the edge data.
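One plausible way to turn the edge data of several calibration lines into an edge's coordinates in the radar coordinate system is a least-squares 3-D line fit. The sketch below is illustrative, not the patent's stated method; it returns a point on the line and a unit direction via SVD of the centered coordinates:

```python
import numpy as np

def fit_edge_line(edge_points):
    """Fit a 3-D line to edge points gathered from several calibration lines.

    Needs >= 2 points; the line is the best-fit principal axis of the
    points, i.e. the first right singular vector of the centered data.
    Returns (point_on_line, unit_direction).
    """
    pts = np.asarray(edge_points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    return centroid, direction / np.linalg.norm(direction)
```

Two such edge lines intersected (or brought to closest approach) would then yield a vertex coordinate.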

Optionally, the calibration object includes a quadrilateral calibration object, a curved edge shape tangent to at least two edges is arranged in the quadrilateral calibration object, and the calibration point is a central point of the curved edge shape.

Optionally, the polygon comprises a square, the curved edge shape comprises a circle, and the circle is tangent to the two edges connected to the same vertex;

correspondingly, determining the radar coordinates of the calibration point in the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object may include:

each vertex comprises an upper vertex, a left vertex and a right vertex, and the radar coordinate P_top of the calibration point adjacent to the upper vertex in the radar coordinate system is determined through the following formula:

wherein C_top is the coordinate of the upper vertex in the radar coordinate system, C_left is the coordinate of the left vertex in the radar coordinate system, C_right is the coordinate of the right vertex in the radar coordinate system, w is the length of a first side among the sides, h is the length of a second side, perpendicular to the first side, among the sides, and r is the radius of the circle.
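The formula itself did not survive extraction. From the stated tangency geometry — the circle of radius r is tangent to the two sides, of lengths w and h, that meet at the upper vertex, so its center lies a distance r along each side away from that vertex — a plausible reconstruction is:

```latex
% Plausible reconstruction (the original formula image is missing from the
% source text): moving r along each of the two unit side directions from
% the upper vertex places the center at distance r from both sides.
P_{top} = C_{top}
        + \frac{r}{w}\,\bigl(C_{left} - C_{top}\bigr)
        + \frac{r}{h}\,\bigl(C_{right} - C_{top}\bigr)
```

This matches the variable list above term by term; the analogous formulas for the points near the left and right vertices would follow by symmetry.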

Optionally, calibrating the camera coordinate system and the radar coordinate system based on the plurality of sets of feature point pairs may include:

and determining the coordinate transformation relation from the camera coordinate system to the radar coordinate system based on the plurality of groups of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system according to the coordinate transformation relation.
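Given several matched feature point pairs, the transformation between the two coordinate systems is commonly estimated as a rigid transform (rotation matrix plus translation vector, matching the Background's description). The sketch below uses the SVD-based Kabsch method as one possible solver; the patent does not specify which solver it uses:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate R, t such that dst ≈ R @ src_i + t from matched 3-D point
    pairs (Kabsch method; needs >= 3 non-degenerate pairs)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```

With the camera coordinates as `src` and the radar coordinates as `dst` (or vice versa), the returned pair (R, t) is exactly the rotation matrix and translation vector the calibration seeks.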

In a second aspect, an embodiment of the present invention further provides a laser radar calibration apparatus, which may include:

the data acquisition module is used for respectively acquiring point cloud data and image data which are acquired at the same time and comprise a calibration object, wherein the calibration object is a polygon, each edge of the polygon is intersected with at least two point cloud lines, and the point cloud lines comprise a plurality of point cloud data;

the edge coordinate determination module is used for screening calibration data belonging to the calibration object from the point cloud data and respectively determining the edge coordinates of each edge under a radar coordinate system according to the calibration data;

the radar coordinate determination module is used for respectively determining the vertex coordinates of each vertex in the calibration object under the radar coordinate system according to the edge coordinates, and determining the radar coordinates of the calibration point under the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object;

and the laser radar calibration module is used for taking the camera coordinates of the calibration point in the camera coordinate system together with its radar coordinates as a group of feature point pairs and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.

In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device may include:

one or more processors;

a memory for storing one or more programs;

when the one or more programs are executed by the one or more processors, the one or more processors implement the lidar calibration method provided by any of the embodiments of the invention.

In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the laser radar calibration method provided in any embodiment of the present invention.

According to the technical scheme of the embodiments of the invention, point cloud data and image data that are acquired at the same time and contain a polygonal calibration object are respectively obtained, and the calibration data belonging to the calibration object are screened out from the point cloud data; because each edge of the polygon intersects at least two point cloud lines, the edge coordinates of each edge under the radar coordinate system can be respectively determined from the calibration data. Further, the vertex coordinates of each vertex in the calibration object under the radar coordinate system are determined from the edge coordinates, and the radar coordinates of the calibration point under the radar coordinate system are determined from the vertex coordinates and the relative position of the calibration point in the calibration object. The camera coordinates of the calibration point in the camera coordinate system together with its radar coordinates can therefore be used as a group of feature point pairs, and calibration between the camera coordinate system and the radar coordinate system can be carried out based on a plurality of groups of feature point pairs. Because the radar coordinates of the feature points can be determined from the edge coordinates of the polygon's edges, feature points can be extracted from the point cloud data whenever each edge of the polygon intersects at least two point cloud lines; the restrictions on the size and placement position of the calibration object are therefore small, which improves the convenience of manufacturing and using the calibration object, and hence the convenience of the calibration work.

Drawings

FIG. 1 is a schematic illustration of a prior art calibration object;

FIG. 2 is a schematic diagram of a calibration object in the laser radar calibration method in various embodiments of the present invention;

fig. 3 is a flowchart of a laser radar calibration method according to a first embodiment of the present invention;

fig. 4 is a schematic diagram illustrating placement of a calibration object in a laser radar calibration method according to a first embodiment of the present invention;

fig. 5 is a schematic diagram of point cloud data in a laser radar calibration method according to a first embodiment of the present invention;

fig. 6 is a flowchart of a laser radar calibration method in the second embodiment of the present invention;

fig. 7 is a flowchart of a laser radar calibration method in the third embodiment of the present invention;

fig. 8 is a block diagram of a lidar calibration apparatus according to a fourth embodiment of the present invention;

fig. 9 is a schematic structural diagram of an electronic device in a fifth embodiment of the present invention.

Detailed Description

The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.

Before describing the embodiments of the present invention, an application scenario is described by way of example. The calibration object involved in the currently common laser-radar-to-camera calibration work can be a flat plate with circular holes as shown in fig. 1. The calibration work fits 4 circles to the point cloud data whose depth changes obviously, and then obtains the coordinates of the 4 circle centers from the fitting results; each such coordinate can serve as one feature point in a feature point pair. In other words, the calibration work must ensure that each circle intersects at least two point cloud lines, since at least three non-collinear point cloud data are needed to fit a circle.
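The circle fitting mentioned above can be illustrated with a least-squares (Kåsa) fit, which needs at least three non-collinear points. This is a generic sketch, not the patent's implementation:

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit from >= 3 non-collinear 2-D points.

    Solves x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c), then recovers
    center (-a/2, -b/2) and radius sqrt(a^2/4 + b^2/4 - c).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = np.array([-a / 2.0, -b / 2.0])
    radius = np.sqrt(center @ center - c)
    return center, radius
```

This is why each hole must intersect at least two point cloud lines: a single line contributes only collinear points, which cannot determine a circle.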

It should be noted that a multi-line lidar, such as a 16-line lidar, scans the calibration object by emitting a plurality of scanning lines at different heights in a cone-beam manner, which means that the farther a target is from the lidar, the sparser the scanning lines become on it. To ensure that each circle still intersects at least two point cloud lines, one alternative is to place the calibration object close enough to the lidar; the size of the calibration object is then not specifically limited, but its placement position is. Another alternative is to make the calibration object large enough, so that it can be placed at any position; considering that the focus of the camera is usually at some distance, and that the camera and the lidar are placed at the same position, a farther placement is favorable for the camera to shoot clear image data of the calibration object, which is an important precondition for subsequently extracting feature points from the image data. Therefore, the existing schemes for extracting feature points from point cloud data place heavy restrictions on the size and placement position of the calibration object, and the calibration object needs to be specially customized, which greatly interferes with the convenience of the calibration work.
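The sparsity argument can be quantified: at range d, adjacent scan lines separated by a vertical angle Δθ land roughly d·tan(Δθ) apart on the target. The 2° step used below is a typical figure for a 16-line lidar, assumed here purely for illustration:

```python
import math

def scan_line_spacing(distance_m, angular_step_deg=2.0):
    """Approximate vertical gap between adjacent lidar scan lines at a range.

    A 16-line lidar commonly spaces its beams about 2 degrees apart, so the
    gap between adjacent lines on a target grows roughly linearly with
    distance: gap ≈ d * tan(Δθ).
    """
    return distance_m * math.tan(math.radians(angular_step_deg))
```

At 5 m this gives a gap of roughly 0.17 m, so a hole of, say, 0.3 m diameter would intersect at most one or two scan lines — which is exactly the restriction on size and placement described above.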

To solve the above problems, the lidar calibration method provided in the embodiments of the present invention only requires that the calibration object be a polygon each of whose edges intersects at least two point cloud lines. This means there is no specific limitation on the placement position or size of the calibration object: for example, it may be placed where the camera can focus clearly, its size does not need to be particularly large, and no holes need to be cut in it. Such a calibration object does not need to be specially customized, is highly convenient to manufacture and use, and can be well applied to a 16-line lidar. For example, the peripheral shape of the calibration object may be a triangle, a quadrilateral (e.g., a square or a rectangle), a pentagon, a hexagon, and so on. On this basis, optionally, a curved-edge shape, such as a circle or an ellipse, may be disposed within the polygon; the advantage is that a certain point of the curved-edge shape (e.g., its center point or a corner point) can be used as the calibration point of the calibration object, and the coordinates of the calibration point can then be used as a feature point in the calibration work. Taking the calibration object shown in fig. 2 as an example, its peripheral shape is a rectangle in which 4 circles are arranged; each circle is tangent to the two sides connected to the same vertex, and the center of each circle can be used as a calibration point of the calibration object.
The manufacturing process of the calibration object is simple: for example, 4 circles can be printed and adhered directly to the four corners of a square wooden board, or 4 circles can be drawn directly on a square wooden board; the process is not specifically limited herein.

Embodiment One

Fig. 3 is a flowchart of a laser radar calibration method according to a first embodiment of the present invention. This embodiment is applicable to laser radar calibration scenarios, and is particularly suitable for performing laser radar calibration without restrictions on the placement position and size of the calibration object. The method can be executed by the laser radar calibration apparatus provided by the embodiment of the invention; the apparatus can be implemented in software and/or hardware and integrated on an electronic device, and the electronic device can be any of various user terminals or a server.

Referring to fig. 3, the method of the embodiment of the present invention specifically includes the following steps:

s110, point cloud data and image data which are collected at the same time and contain a calibration object are respectively obtained, wherein the calibration object is a polygon, each edge of the polygon is intersected with at least two point cloud lines, and the point cloud lines comprise a plurality of point cloud data.

The calibration object is placed in the scanning direction of the lidar sensor, and the lidar sensor is controlled to scan it to obtain point cloud data containing the calibration object. The lidar sensor can be any sensor capable of scanning point cloud data, such as a multi-line lidar. The scanning result of the lidar sensor is a number of point cloud lines, each comprising a plurality of point cloud data; these may include point cloud data belonging to the calibration object as well as point cloud data belonging to objects other than the calibration object. Since two points determine a straight line, when each edge of the polygon intersects at least two point cloud lines, the coordinates of that edge under the radar coordinate system corresponding to the lidar sensor can be determined from the point cloud data on the edge; these coordinates may be called edge coordinates.

Correspondingly, the image data acquisition process is similar. Optionally, the calibration object is also placed in the shooting direction of a camera sensor, and the camera sensor is controlled to shoot the calibration object to obtain image data containing the calibration object. The camera sensor can be any sensor capable of shooting image data, such as a camera or a digital camera, and the coordinate system corresponding to the camera sensor is the camera coordinate system. It should be noted that each frame of image data corresponds one-to-one in time with a frame of point cloud data; that is, each frame of image data corresponds to the frame of point cloud data acquired at the same time, and vice versa.

Considering the application scenarios possibly involved in the embodiments of the present invention, optionally, taking fig. 4 as an example, the process of acquiring the point cloud data may be as follows. The calibration object is inclined by a certain angle and fixed; for example, it is inclined by 45 degrees and then supported by a thin-legged bracket or hung by a thin line, which avoids the situation where an edge of the calibration object is parallel to the point cloud lines and therefore cannot intersect them. Other objects that might interfere with extracting feature points from the point cloud data should, as far as possible, not appear around the calibration object. The multi-line lidar is then moved by small amounts, and point cloud data are collected from different directions. During the movement, each edge of the calibration object should, as far as possible, intersect at least two scanning lines emitted by the multi-line lidar, which is an important precondition for each edge to subsequently intersect at least two point cloud lines. On this basis, optionally, to ensure the accuracy of the subsequent edge-coordinate determination, the calibration object can be made larger so that each edge intersects more point cloud lines; this avoids the situation where a noisy point cloud datum degrades the accuracy of the edge coordinates. In addition, each vertex of the calibration object should, as far as possible, appear in every frame of point cloud data, and the coordinates of the calibration object should differ between frames.

For example, as shown in fig. 5, a frame of point cloud data scanned by the multi-line lidar at a certain time includes a plurality of point cloud lines; the point cloud data on a point cloud line of the same color are data scanned by the same scanning line, and each point cloud datum records both its coordinates and which scanning line it belongs to. The square area (i.e., the dotted area) formed by the point cloud lines in fig. 5 is the area where the calibration object is located; the remaining point cloud lines may be reflections of scanning lines that hit the wall beyond the calibration object. It should be noted that, in practical applications, the point cloud lines shown in fig. 5 may be colored lines, which have been converted to gray-scale lines here. The image data are acquired similarly to the point cloud data, for example by moving the camera sensor by small amounts and photographing the calibration object from different orientations during the movement.
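Since each point cloud datum records which scanning line it belongs to, the point cloud lines can be recovered by grouping on that id. The `(x, y, z, ring)` field layout below is an assumption for illustration; real lidar drivers vary in how they expose the ring index:

```python
from collections import defaultdict

def group_by_scan_line(points):
    """Group point cloud data into point cloud lines by the recorded
    scan-line (ring) id; each point is assumed to be (x, y, z, ring)."""
    lines = defaultdict(list)
    for x, y, z, ring in points:
        lines[ring].append((x, y, z))
    return dict(lines)
```

Each resulting group is one point cloud line, ready for the per-line edge-data extraction described in S120 below.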

S120, screening out calibration data belonging to the calibration object from the point cloud data, and determining the edge coordinates of each edge in the radar coordinate system according to the calibration data.

Some of the point cloud data are data reflected by scanning lines after hitting the calibration object, while other point cloud data are reflected by scanning lines hitting objects other than the calibration object, such as a wall arranged on one side of the calibration object. Therefore, in order to improve the accuracy of determining the feature points from the point cloud data, calibration data belonging to the calibration object, i.e., the data reflected back by the calibration object, may first be screened out from the point cloud data. Of course, in order to improve the determination accuracy of the calibration data, optionally, a target range of the calibration object in each frame of point cloud data may be determined according to the relative position between the calibration object and the laser radar sensor, and the calibration data may then be screened from the point cloud data falling within the target range.

Further, since the calibration data are the point cloud data lying on the calibration object, the edge coordinates of each edge in the radar coordinate system can be determined from the calibration data. For example, a plurality of calibration lines formed by the calibration data are obtained; the calibration lines are the point cloud lines described above, every point cloud datum on which is calibration data. Edge data of each calibration line in the scanning direction are then determined, the scanning direction being the direction in which the laser radar sensor scans, such as the horizontal or vertical direction, and the edge data being the calibration data at the extreme ends of the calibration line in the scanning direction. For example, when the scanning direction is horizontal, the edge data may be the leftmost and rightmost calibration data on the calibration line; when the scanning direction is vertical, the edge data may be the uppermost and lowermost calibration data on the calibration line.

Still further, the edge coordinates of each edge in the radar coordinate system are determined according to the edge data; for example, the edge coordinates of one edge are determined from the edge data belonging to that edge. Taking the dashed area where the calibration object is located in fig. 5 as an example, the edge data include the leftmost calibration data P_left and the rightmost calibration data P_right on each calibration line. From the set of P_left, a linear equation of the left long side (i.e., the side located at the upper left corner) is fitted based on the Random Sample Consensus (RANSAC) algorithm, and from the P_left edge data other than those on the left long side, a linear equation of the left short side (i.e., the side located at the lower left corner) is fitted. The processing of P_right is similar, yielding in turn the linear equations of the right long side and the right short side. These linear equations express the edge coordinates of each edge in the radar coordinate system.
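As a concrete illustration of the two steps above, the sketch below (field names such as `rings` for the scanning-line indices are assumptions for illustration, not names from the embodiments) groups the calibration data by scanning line, keeps the leftmost and rightmost point of each calibration line as edge data, and RANSAC-fits a 3D line to one set of edge points:

```python
import numpy as np

def edge_data_per_line(points, rings):
    """Group calibration data by scanning line and keep the extreme
    points of each calibration line in the horizontal scan direction."""
    left, right = [], []
    for r in np.unique(rings):
        line = points[rings == r]       # one calibration (point cloud) line
        order = np.argsort(line[:, 1])  # sort along the horizontal axis
        left.append(line[order[0]])     # leftmost calibration data
        right.append(line[order[-1]])   # rightmost calibration data
    return np.array(left), np.array(right)

def ransac_line_3d(pts, iters=200, tol=0.02, seed=0):
    """Fit a 3D line (point p0, unit direction d) to pts with RANSAC."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        d = pts[j] - pts[i]
        n = np.linalg.norm(d)
        if n < 1e-9:
            continue  # degenerate sample
        d = d / n
        # perpendicular distance of every point to the candidate line
        diff = pts - pts[i]
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    sel = pts[best_inliers]
    p0 = sel.mean(axis=0)
    # principal direction of the inliers is the fitted line direction
    _, _, vt = np.linalg.svd(sel - p0)
    return p0, vt[0]
```

Here `points` would be an (N, 3) array of calibration data in the radar frame and `rings` the scanning-line index recorded with each point; the fitted line equations play the role of the edge coordinates described above.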

S130, respectively determining the vertex coordinates of each vertex in the calibration object under the radar coordinate system according to the side coordinates, and determining the radar coordinates of the calibration points under the radar coordinate system according to the vertex coordinates and the relative positions of the calibration points in the calibration object.

Since the scanning lines are sparse, when the calibration object is scanned it may happen that a vertex of the calibration object (that is, a vertex of the polygon) is never scanned; in other words, the above-mentioned edge data are not necessarily point cloud data obtained by scanning a vertex, which means that the vertex coordinates of a vertex in the radar coordinate system may not be obtainable directly from the point cloud data. In that case, the edge coordinates may be fitted first, and the vertex coordinates in the calibration object then determined from the edge coordinates; for example, the coordinates of the intersection of two edges, determined from their edge coordinates, are taken as the corresponding vertex coordinates.
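Because two RANSAC-fitted 3D edge lines rarely intersect exactly under noise, one practical way to realize this intersection step (an illustrative sketch, not necessarily the implementation of the embodiments) is to take the midpoint of the shortest segment between the two lines:

```python
import numpy as np

def vertex_from_edges(p1, d1, p2, d2):
    """Approximate intersection of two 3D lines (p + t*d) as the midpoint
    of the shortest segment between them; used as a vertex coordinate."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimizing ||(p1 + t1*d1) - (p2 + t2*d2)||
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```

The linear system is singular only when the two edge directions are parallel, which does not occur for adjacent edges of the polygon.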

Further, the calibration object is provided with a calibration point, which is a preset feature point used for calibration, such as the center point of the calibration object or the center point of a curved-edge shape disposed in the calibration object. The relative position of the calibration point in the calibration object is therefore information that can be obtained in advance, and the radar coordinates of the calibration point in the radar coordinate system can be determined from the vertex coordinates and this relative position; these radar coordinates serve as one feature point of a feature point pair involved in the calibration work.

And S140, taking the camera coordinates and the radar coordinates of the calibration points in the camera coordinate system as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.

The camera coordinates are the coordinates of the calibration point in the camera coordinate system, and they may serve as the other feature point of the feature point pair involved in the calibration work. There are various implementations for determining the camera coordinates of the calibration point from the image data: for example, the image data are first undistorted according to the distortion parameters of the camera sensor; an edge feature map is then extracted from the image data by using edge extraction operators such as Canny or Sobel; finally, circles and lines in the edge feature map are detected by using the Hough transform, and the camera coordinates of the calibration point are determined according to the detection result. In practical applications, optionally, when the calibration object is manufactured, clearly contrasting colors can be applied to the circles and the polygon, which improves the detection accuracy of the Hough transform on the circles.

It should be noted that, in practical applications, optionally, the image data and the point cloud data acquired at the same time may be taken as a set of data, and the determining steps of the camera coordinates and the radar coordinates are performed in units of each set of data. Moreover, the determination processes of the two are not in strict sequence, and the two can be determined simultaneously or sequentially, which is not specifically limited herein. In addition, if the extraction of the radar coordinates and/or the camera coordinates in a certain set of data fails, the set of data can be skipped and the extraction can be continued from the next set of data.

Thus, the radar coordinates and the camera coordinates are taken as a set of feature point pairs; that is, the three-dimensional feature point of each calibration point in the radar coordinate system (i.e., the radar coordinate q_i) and its two-dimensional feature point in the camera coordinate system (i.e., the camera coordinate p_i) form one feature point pair (p_i, q_i). In practical applications, the number of calibration points on the calibration object may be at least three, that is, the number of feature point pairs in each set of data may be at least three. After a plurality of groups of feature point pairs are acquired, the camera coordinate system and the radar coordinate system can be calibrated according to the feature point pairs.

According to the technical scheme of this embodiment of the invention, point cloud data and image data containing the polygonal calibration object and acquired at the same time are obtained respectively, and the calibration data belonging to the calibration object are screened out of the point cloud data. Because each edge of the polygon intersects at least two point cloud lines, the edge coordinates of each edge in the radar coordinate system can be determined from the calibration data; further, the vertex coordinates of each vertex of the calibration object in the radar coordinate system can be determined from the edge coordinates, and the radar coordinates of the calibration point in the radar coordinate system can be determined from the vertex coordinates and the relative position of the calibration point in the calibration object. The camera coordinates and the radar coordinates of the calibration point can therefore be used as one group of feature point pairs, and calibration between the camera coordinate system and the radar coordinate system can be carried out based on a plurality of such groups. In this scheme the radar coordinates of the feature points are determined through the edge coordinates of the edges of the polygon, which means that the feature points can be extracted from the point cloud data whenever each edge of the polygon intersects at least two point cloud lines. The limitation on the size and placing position of the calibration object is therefore small, which improves the convenience of making and using the calibration object and, in turn, the convenience of the calibration work.

In an alternative embodiment, the calibration object may be a quadrilateral calibration object, such as a square, rectangular, or parallelogram-shaped calibration object. A curved-edge shape tangent to at least two edges is arranged inside the quadrilateral calibration object; for example, near each of the four vertices a curved-edge shape tangent to the two edges meeting at that vertex may be arranged, or a curved-edge shape tangent to three edges may be arranged inside a rectangular calibration object, or a curved-edge shape tangent to all four edges may be arranged inside a square calibration object. The curved-edge shape may be a circle, an ellipse, and the like, and the calibration point may be the center point of the curved-edge shape, such as the center of a circle.

On this basis, optionally, if the polygon is a square and the curved-edge shape is a circle tangent to the two edges meeting at a vertex, as shown in fig. 2, then determining the radar coordinates of the calibration point in the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object may specifically include: the vertices include an upper vertex, a left vertex, and a right vertex, and the radar coordinate P_top, in the radar coordinate system, of the calibration point adjacent to the upper vertex is determined by the following formula:

P_top = C_top + r·(C_left - C_top)/w + r·(C_right - C_top)/h

wherein C_top is the upper vertex coordinate of the upper vertex in the radar coordinate system, C_left is the left vertex coordinate of the left vertex in the radar coordinate system, C_right is the right vertex coordinate of the right vertex in the radar coordinate system, w is the length of the first side among the sides, h is the length of the second side perpendicular to the first side, and r is the radius of the circle. To better understand the formula, taking fig. 4 as an example, C_left - C_top indicates the direction of the side located at the upper left corner, (C_left - C_top)/w is the unit vector in this direction, and the meaning of (C_right - C_top)/h is similar; starting from C_top and moving a distance of one radius along each of the two unit vectors then gives the coordinate P_top of the circle center. Of course, the determination process for the radar coordinates of the circle centers adjacent to the other vertices is similar, and is not repeated here.
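A minimal numerical sketch of this formula (the example coordinates below are hypothetical, chosen only for illustration):

```python
import numpy as np

def circle_center_near_top(c_top, c_left, c_right, w, h, r):
    """Radar coordinate of the circle center adjacent to the upper vertex:
    move from C_top by one radius along each of the two edge directions."""
    u1 = (c_left - c_top) / w   # unit vector along the side toward the left vertex
    u2 = (c_right - c_top) / h  # unit vector along the side toward the right vertex
    return c_top + r * u1 + r * u2
```

For a unit square lying in a plane this places the center at distance r from both edges meeting at the upper vertex, which is exactly the tangency condition.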

It should be noted that the arrangement of the calibration object shown in fig. 2 is only one optional arrangement, not the only one. Extracting feature points from point cloud data obtained by scanning a calibration object arranged in this way, or from image data obtained by photographing it, is simple and accurate, and calibrating according to a plurality of such feature points can improve the calibration accuracy of the calibration work. In general, the number of circles provided in the quadrilateral may be three or more, because the calibration work can be performed based on at least three feature points.

Example two

Fig. 6 is a flowchart of a laser radar calibration method according to a second embodiment of the present invention. The present embodiment is optimized based on the above technical solutions. In this embodiment, optionally, the screening of calibration data belonging to the calibration object from the point cloud data may specifically include: screening out plane data belonging to the same plane from the point cloud data, and screening out end point data from the plane set formed by the plane data; and comparing the first length of the diagonal formed by the end point data with the second length of the diagonal corresponding to the calibration object, and screening the calibration data belonging to the calibration object out of the point cloud data according to the comparison result. Terms identical or corresponding to those in the above embodiments are not explained in detail herein.

Referring to fig. 6, the method of this embodiment may specifically include the following steps:

S210, point cloud data and image data which are collected at the same time and contain the calibration object are obtained respectively, wherein the calibration object is a polygon, each edge of the polygon intersects at least two point cloud lines, and each point cloud line comprises a plurality of point cloud data.

S220, screening out plane data belonging to the same plane from the point cloud data, and screening out end point data from the plane set formed by the plane data.

Not all of the point cloud data belong to the same plane: some may belong to the plane where the calibration object is located, some may belong to the plane of a wall on one side of the calibration object, and so on. Therefore, plane data belonging to the same plane can be screened out of the point cloud data, for example by extracting them with the RANSAC algorithm; the plane data may be point cloud data on the plane where the calibration object is located, or point cloud data on another plane, and the point cloud set formed by the plane data is called a plane set. In practical applications, optionally, some point cloud data in the plane set may be outliers, which can be regarded as noise points around the plane where the plane set lies; in that case the outliers can be removed based on algorithms such as statistical filtering, which improves the screening accuracy of the subsequent end point data.
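A minimal sketch of this plane-screening step (an illustrative RANSAC plane fit, not the exact implementation of the embodiments):

```python
import numpy as np

def ransac_plane(points, iters=300, tol=0.02, seed=0):
    """Return the inlier mask of the dominant plane in an (N, 3) cloud."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        # plane normal from the three sampled points
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n = n / norm
        dist = np.abs((points - sample[0]) @ n)  # point-to-plane distance
        inliers = dist < tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return best
```

The inliers of the dominant plane form the plane set; outliers left in the set could additionally be removed with statistical filtering as described above.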

Further, end point data are screened out of the plane set formed by the plane data; the end point data are the point cloud data located at the extreme positions of the plane set, e.g., the uppermost point f_top, the lowermost point f_bottom, the leftmost point f_left, and/or the rightmost point f_right. It should be noted that, because the scanning lines are sparse, when the calibration object is scanned a vertex of the calibration object may not be scanned at all; that is, the end point data are not necessarily point cloud data obtained by scanning a vertex.

S230, comparing the first length of the diagonal formed by the end point data with the second length of the diagonal corresponding to the calibration object, screening the calibration data belonging to the calibration object out of the point cloud data according to the comparison result, and determining the edge coordinates of each edge in the radar coordinate system according to the calibration data.

Because each point cloud datum includes its own coordinates, the first length of the diagonal formed by the end point data can be determined from the end point data. On this basis, taking the length of the diagonal of the calibration object as the second length, comparing the first length with the second length can determine whether the plane corresponding to the plane set is the plane where the calibration object is located; for example, when the difference between the first length and the second length is smaller than a preset length threshold, the plane is taken to be the plane where the calibration object is located, and the plane data can then be taken as the calibration data belonging to the calibration object. Illustratively, for a quadrilateral calibration object, let d_1 = ||f_left - f_right||, d_2 = ||f_top - f_bottom||, and d = sqrt(w² + h²); that is, d is the second length and d_1 and d_2 are both first lengths. If d - a < d_1 < d + b and d - a < d_2 < d + b, where a and b are parameters, the plane is considered to be the plane where the calibration object is located.
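A sketch of this diagonal check (the axis convention, with y horizontal and z vertical, and the parameter values a and b are placeholders for illustration):

```python
import numpy as np

def is_calibration_plane(plane_pts, w, h, a=0.1, b=0.1):
    """Compare the diagonals of the plane set against the expected
    diagonal d = sqrt(w^2 + h^2) of the calibration object."""
    f_left   = plane_pts[np.argmin(plane_pts[:, 1])]
    f_right  = plane_pts[np.argmax(plane_pts[:, 1])]
    f_top    = plane_pts[np.argmax(plane_pts[:, 2])]
    f_bottom = plane_pts[np.argmin(plane_pts[:, 2])]
    d1 = np.linalg.norm(f_left - f_right)  # first length (horizontal)
    d2 = np.linalg.norm(f_top - f_bottom)  # first length (vertical)
    d = np.hypot(w, h)                     # second length
    return (d - a < d1 < d + b) and (d - a < d2 < d + b)
```

For a square of side w = h = 1 inclined by 45 degrees, both diagonals measured this way equal sqrt(2), matching the second length.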

On this basis, optionally, if it is determined from the comparison result that the plane is not the plane where the calibration object is located, the plane data can be removed from the point cloud data, and the point cloud data updated according to the removal result, i.e., only the point cloud data other than the plane data are retained. The step of screening out plane data belonging to the same plane from the point cloud data is then repeated until the plane found is the plane where the calibration object is located, and those plane data are taken as the calibration data belonging to the calibration object. On this basis, optionally, if too few point cloud data remain, a prompt that calibration data extraction has failed can be given directly.

S240, respectively determining the vertex coordinates of each vertex in the calibration object under the radar coordinate system according to the side coordinates, and determining the radar coordinates of the calibration points under the radar coordinate system according to the vertex coordinates and the relative positions of the calibration points in the calibration object.

And S250, taking the camera coordinates and the radar coordinates of the calibration points in the camera coordinate system as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.

According to the technical scheme of this embodiment, end point data are screened from the plane set formed by the plane data that belong to the same plane and were screened out of the point cloud data; the first length of the diagonal formed by the end point data is then compared with the second length of the diagonal corresponding to the calibration object, so that the calibration data belonging to the calibration object can be screened out of the point cloud data. This diagonal-length comparison is simple and efficient, which improves both the extraction efficiency and the extraction precision of the calibration data.

Example three

Fig. 7 is a flowchart of a laser radar calibration method provided in the third embodiment of the present invention. The present embodiment is optimized based on the above technical solutions. In this embodiment, optionally, the calibrating the camera coordinate system and the radar coordinate system based on the plurality of sets of feature point pairs may specifically include: and determining a coordinate change relation from the camera coordinate system to the radar coordinate system based on the plurality of groups of characteristic point pairs, and calibrating the camera coordinate system and the radar coordinate system according to the coordinate change relation. The same or corresponding terms as those in the above embodiments are not explained in detail herein.

Referring to fig. 7, the method of this embodiment may specifically include the following steps:

S310, point cloud data and image data which are collected at the same time and contain the calibration object are obtained respectively, wherein the calibration object is a polygon, each edge of the polygon intersects at least two point cloud lines, and each point cloud line comprises a plurality of point cloud data.

S320, screening out calibration data belonging to the calibration object from the point cloud data, and determining the edge coordinates of each edge in the radar coordinate system according to the calibration data.

S330, respectively determining the vertex coordinates of each vertex in the calibration object under the radar coordinate system according to the side coordinates, and determining the radar coordinates of the calibration points under the radar coordinate system according to the vertex coordinates and the relative positions of the calibration points in the calibration object.

S340, taking the camera coordinates and the radar coordinates of the calibration points in the camera coordinate system as a group of feature point pairs, determining the coordinate change relationship from the camera coordinate system to the radar coordinate system based on the plurality of groups of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system according to the coordinate change relationship.

Wherein the three-dimensional feature point of each calibration point in the radar coordinate system (i.e., the radar coordinate q_i) and its two-dimensional feature point in the camera coordinate system (i.e., the camera coordinate p_i) form one feature point pair (p_i, q_i). In practical applications, the number of calibration points on the calibration object may be at least three, that is, the number of feature point pairs in each set of data may be at least three. The feature point pairs of each set of data are collected, the coordinate change relationship between the radar coordinate system and the camera coordinate system is determined according to the feature point pairs, and the camera coordinate system and the radar coordinate system are calibrated according to the coordinate change relationship.

It should be noted that determining the coordinate change relationship is a typical PnP (Perspective-n-Point) problem, which can be solved in various ways, such as P3P, Direct Linear Transformation (DLT), EPnP (Efficient PnP), UPnP, Bundle Adjustment, and so on; the coordinate change relationship obtained from the solution can be regarded as the calibration result.

Here, DLT is taken as an example:

for the feature point pair (p)i,qi) Recording the corresponding homogeneous coordinates as pi=[u,v,1]TAnd q isi=[x,y,z,1]T. Let the internal reference matrix of the camera sensor be

The projection relationship from the radar coordinate system to the camera coordinate system is then

Wherein, R is a three-dimensional rotation matrix, and t is a translation vector, which describe the coordinate change relationship from a radar coordinate system to a camera coordinate system. Unfolding R and t, the result is:

elimination to obtain

That is, for one feature point pair (p_i, q_i), we obtain the equation B_i·a = 0, wherein B_i is a matrix composed of (p_i, q_i) and K, and a is the vector consisting of the coefficients of the three-dimensional rotation matrix and the translation vector.

Summarizing all the feature point pairs, the constraint equation B·a = 0 can be obtained. This equation generally cannot be solved exactly, but a least-squares solution argmin ||B·a||² can be obtained under the constraint ||a|| = 1, as follows:
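This constrained least-squares problem is solved by the right singular vector of B associated with its smallest singular value; a quick numerical illustration:

```python
import numpy as np

def min_norm_unit_solution(B):
    """Unit vector a minimizing ||B @ a||: last right singular vector of B."""
    _, _, vt = np.linalg.svd(B)
    return vt[-1]
```

Constructing a matrix with a known unit null vector shows that the SVD recovers it (up to sign), which is the justification for steps 2 and 3 below.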

1. Obtaining a matrix B according to all the characteristic point pairs and camera internal parameters;

2. SVD decomposition is performed on B, where [U, Σ, V] = svd(B).

3. Take the last column of the V matrix and record it as â; take the first 9 components of â to form a three-dimensional matrix R̂, and the last 3 components to form a vector t̂.

4. Perform SVD decomposition on R̂: R̂ = U·Σ·V^T.

5. Calculate the scaling factor β. β satisfies β·R̂ ≈ R and is β = ±3/tr(Σ), the sign being chosen so that the calibration points have positive depth in the camera coordinate system.

6. The optimal rotation matrix R is R = ±U·V^T (taking the same sign as β), and the translation vector t is t = β·t̂.
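The steps above can be sketched end to end as follows (an illustrative NumPy implementation of the DLT recovery described above; fixing the global sign of a by requiring positive depth for the first point is one common convention, assumed here):

```python
import numpy as np

def dlt_calibrate(pts_radar, pts_pixel, K):
    """Recover (R, t) from >= 6 non-coplanar 3D-2D feature point pairs
    via Direct Linear Transformation, following steps 1-6 above."""
    K_inv = np.linalg.inv(K)
    rows = []
    for q, p in zip(pts_radar, pts_pixel):
        x, y, _ = K_inv @ np.array([p[0], p[1], 1.0])  # normalized image point
        qh = np.append(q, 1.0)                         # homogeneous q_i
        rows.append(np.concatenate([qh, np.zeros(4), -x * qh]))
        rows.append(np.concatenate([np.zeros(4), qh, -y * qh]))
    B = np.asarray(rows)                               # step 1
    _, _, Vt = np.linalg.svd(B)                        # step 2
    a = Vt[-1]                                         # step 3: last column of V
    M = a.reshape(3, 4)
    # fix the global sign of a so the first point has positive depth
    if M[2] @ np.append(pts_radar[0], 1.0) < 0:
        M = -M
    R_hat, t_hat = M[:, :3], M[:, 3]
    U, S, Vt2 = np.linalg.svd(R_hat)                   # step 4
    beta = 3.0 / S.sum()                               # step 5
    R = U @ Vt2                                        # step 6: optimal rotation
    t = beta * t_hat
    return R, t
```

With noise-free synthetic pairs, the recovered rotation and translation match the ground truth to numerical precision.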

According to the technical scheme of the embodiment of the invention, the coordinate change relationship from the camera coordinate system to the radar coordinate system is determined based on the plurality of groups of characteristic point pairs, and the camera coordinate system and the radar coordinate system are calibrated according to the coordinate change relationship, so that the effect of accurate calibration between the camera coordinate system and the radar coordinate system is realized.

Example four

Fig. 8 is a block diagram of a lidar calibration apparatus according to a fourth embodiment of the present invention, where the apparatus is configured to execute the lidar calibration method according to any of the embodiments. The device and the laser radar calibration method of each embodiment belong to the same inventive concept, and details which are not described in detail in the embodiment of the laser radar calibration device can refer to the embodiment of the laser radar calibration method. Referring to fig. 8, the apparatus may specifically include: data acquisition module 410, edge coordinate determination module 420, radar coordinate determination module 430, and lidar calibration module 440.

The data acquiring module 410 is configured to acquire point cloud data and image data, which are acquired at the same time and include a calibration object, respectively, where the calibration object is a polygon, each edge of the polygon intersects with at least two point cloud lines, and each point cloud line includes a plurality of point cloud data;

the edge coordinate determination module 420 is configured to screen out calibration data belonging to the calibration object from the point cloud data, and determine the edge coordinates of each edge in the radar coordinate system according to the calibration data;

a radar coordinate determination module 430, configured to determine, according to the side coordinates, vertex coordinates of each vertex in the calibration object in the radar coordinate system, and determine, according to the vertex coordinates and a relative position of a calibration point in the calibration object, radar coordinates of the calibration point in the radar coordinate system;

and the laser radar calibration module 440 is configured to use the camera coordinates and the radar coordinates of the calibration point in the camera coordinate system as a group of feature point pairs, and calibrate the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.

Optionally, the edge coordinate determining module 420 may specifically include:

the end point data screening unit is used for screening out plane data belonging to the same plane from the cloud data of each point and screening out end point data from a plane set formed by the plane data; and the calibration data determining unit is used for comparing the first length of the diagonal line formed by the end point data with the second length of the diagonal line corresponding to the calibration object, and screening out the calibration data belonging to the calibration object from the cloud data of each point according to the comparison result.

Optionally, the calibration data determining unit may specifically include:

the point cloud data updating subunit is used for removing the plane data from the point cloud data if the same plane is determined not to be the plane where the calibration object is located according to the comparison result, and updating the point cloud data according to the removal result; the repeated execution subunit is used for repeatedly executing the step of screening out plane data belonging to the same plane from the cloud data of each point until the same plane is the plane where the calibration object is located; and the calibration data determining subunit is used for taking the plane data as the calibration data belonging to the calibration object.

Optionally, the edge coordinate determining module 420 may specifically include: and the edge coordinate determining unit is used for acquiring a plurality of calibration lines formed by the calibration data, determining edge data of each calibration line in the scanning direction, and respectively determining the edge coordinate of each edge in the radar coordinate system according to the edge data.

Optionally, the calibration object includes a quadrilateral calibration object, a curved edge shape tangent to at least two edges is arranged in the quadrilateral calibration object, and the calibration point is a central point of the curved edge shape.

Optionally, the polygon comprises a square, the curved edge comprises a circle, and the circle is tangent to two edges connected with the vertex; accordingly, the radar coordinate determination module 430 may be specifically configured to:

each vertex comprises an upper vertex, a left vertex and a right vertex, and a radar coordinate P of a calibration point adjacent to the upper vertex in a radar coordinate system is determined through the following formulatop

Wherein, CtopIs the coordinates of the upper vertex point in the radar coordinate system, CleftIs the left vertex coordinate, C, of the left vertex under the radar coordinate systemrightIs the right vertex coordinate of the right vertex in the radar coordinate system, w is the length of the first side in each side, h is the length of the second side perpendicular to the first side in each side, and r is the radius of the circle.

Optionally, the laser radar calibration module 440 may specifically include:

and the laser radar calibration unit is used for determining the coordinate change relationship from the camera coordinate system to the radar coordinate system based on the plurality of groups of characteristic point pairs and calibrating the camera coordinate system and the radar coordinate system according to the coordinate change relationship.

In the lidar calibration device provided by the fourth embodiment of the invention, the data acquisition module and the edge coordinate determination module cooperate to respectively acquire point cloud data and image data containing the polygonal calibration object and collected at the same time, and to screen the calibration data belonging to the calibration object out of the point cloud data. Because each edge of the polygon intersects at least two point cloud lines, the edge coordinates of each edge in the radar coordinate system can be determined from the calibration data. Furthermore, the radar coordinate determination module can determine the vertex coordinates of each vertex of the calibration object in the radar coordinate system according to the edge coordinates, and determine the radar coordinates of the calibration point in the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object. The laser radar calibration module can then take the camera coordinates and the radar coordinates of the calibration point as a group of feature point pairs and calibrate the camera coordinate system and the radar coordinate system based on a plurality of such groups. With this device, the radar coordinates of the feature points are determined through the edge coordinates of the edges of the polygon, which means that the feature points can be extracted from the point cloud data whenever each edge of the polygon intersects at least two point cloud lines; the limitation on the size and placing position of the calibration object is small, which improves the convenience of making and using the calibration object and, in turn, the convenience of the calibration work.

The laser radar calibration device provided by the embodiment of the invention can execute the laser radar calibration method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method.

It should be noted that, in the embodiment of the lidar calibration apparatus, the included units and modules are divided only according to functional logic; the division is not limited thereto as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present invention.

EXAMPLE five

Fig. 9 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention, as shown in fig. 9, the electronic device includes a memory 510, a processor 520, an input device 530, and an output device 540. The number of the processors 520 in the electronic device may be one or more, and one processor 520 is taken as an example in fig. 9; the memory 510, processor 520, input device 530, and output device 540 in the electronic device may be connected by a bus or other means, such as by bus 550 in fig. 9.

Memory 510 serves as a computer-readable storage medium that may be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the lidar calibration method in the embodiments of the invention (e.g., data acquisition module 410, edge coordinate determination module 420, radar coordinate determination module 430, and lidar calibration module 440 in the lidar calibration apparatus). By executing the software programs, instructions, and modules stored in memory 510, processor 520 performs the various functional applications of the electronic device and implements the lidar calibration method described above.

The memory 510 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created according to the use of the electronic device, and the like. Further, the memory 510 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 510 may further include memory located remotely from processor 520, which may be connected to the electronic device through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.

The input device 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the device. The output device 540 may include a display device such as a display screen.

EXAMPLE six

An embodiment of the present invention provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a lidar calibration method, where the method includes:

respectively acquiring point cloud data and image data, collected at the same time, that contain a calibration object, wherein the calibration object is a polygon, each edge of the polygon intersects at least two point cloud lines, and each point cloud line comprises a plurality of point cloud data; screening out the calibration data belonging to the calibration object from the point cloud data, and respectively determining the edge coordinates of each edge in the radar coordinate system according to the calibration data; determining the vertex coordinates of each vertex of the calibration object in the radar coordinate system according to the edge coordinates, and determining the radar coordinates of the calibration point in the radar coordinate system according to the vertex coordinates and the relative position of the calibration point in the calibration object; and taking the camera coordinates of the calibration point in the camera coordinate system and the radar coordinates as a group of feature point pairs, and calibrating the camera coordinate system and the radar coordinate system based on the plurality of groups of feature point pairs.
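The step of locating the calibration point from the vertex coordinates and its relative position in the calibration object can be illustrated as a weighted combination of the vertices. This is a hypothetical sketch: the patent does not specify how the relative position is parameterized, and the barycentric-style weights below are an assumption for illustration.

```python
import numpy as np

def calibration_point_from_vertices(vertices, weights):
    """Radar coordinate of a calibration point given the vertex coordinates
    of the calibration object in the radar coordinate system and the point's
    relative position, expressed here as weights over the vertices
    (non-negative, summing to 1). For example, equal weights give the
    polygon's vertex centroid."""
    vertices = np.asarray(vertices, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return weights @ vertices
```

Because the relative position is fixed by the geometry of the calibration object, the same weights locate the calibration point in every frame once the vertices have been recovered in the radar coordinate system.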

Of course, in the storage medium containing computer-executable instructions provided by the embodiments of the present invention, the computer-executable instructions are not limited to the method operations described above, and may also perform related operations in the laser radar calibration method provided by any embodiment of the present invention.

From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. With this understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.

It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
