Method for correcting laser radar point cloud data motion distortion based on integrated navigation system

Document No.: 1427900 · Publication date: 2020-03-17

Reading note: This technique, "Method for correcting laser radar point cloud data motion distortion based on integrated navigation system", was created on 2019-12-03 by *** 叶扬青 王在满 何杰 方龙羽 梁展豪 何思禹 许�鹏 刘顺财. Its main content is as follows:

The invention discloses a method for correcting laser radar point cloud motion distortion based on an integrated navigation system. Using a time synchronizer, the longitude and latitude, elevation, attitude angle, velocity, acceleration, angular velocity, angular acceleration and timestamp of the integrated navigation system are fused with the laser radar point cloud data and its timestamp. By rotation and translation, one frame of laser radar point cloud data is corrected to the same time point, which corrects the motion distortion and stably and effectively reduces the point cloud errors caused by the motion of the unmanned ship. The method provides more accurate laser radar point cloud data for subsequent unmanned ship obstacle detection, obstacle avoidance and the like.

1. A method for correcting laser radar point cloud data motion distortion based on an integrated navigation system is characterized by comprising the following steps:

step S1, reading the combined navigation system data and the laser radar point cloud data at the same time by using a time synchronizer, and then entering step S2;

step S2, calculating the horizontal angle of a laser spot of a laser radar point cloud data frame head and the horizontal angle of a laser spot of a laser radar point cloud data frame tail, and then entering step S3;

step S3, calculating the time required from the current laser point of the laser radar point cloud data to the laser point at the end of the laser radar point cloud data frame, and then entering step S4;

step S4, judging whether integrated navigation system data and laser radar point cloud data exist at the previous moment; if so, entering step S5; if not, entering step S6;

step S5, utilizing the calculated time from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, and combining the navigation system data at the previous moment and the current moment to endow the laser radar point cloud data with motion information for correction;

step S6, using the calculated time from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, and the integrated navigation system data at the current moment, endowing the laser radar point cloud data with motion information for correction;

wherein the laser radar point cloud data comprises the point cloud coordinates and the timestamp of the frame.

2. The method for correcting the motion distortion of the lidar point cloud data based on the integrated navigation system of claim 1, wherein the integrated navigation system data of the step S1 comprises one or more of latitude and longitude, elevation, attitude angle, velocity, acceleration, angular velocity, angular acceleration and time stamp.

3. The method for correcting the laser radar point cloud data motion distortion based on the integrated navigation system as claimed in claim 1, wherein the time synchronizer of step S1 utilizes the timestamp in the laser radar point cloud data and the timestamp in the integrated navigation system data to perform the comparison and pairing, so as to obtain two sets of data at the same time.

4. The method for correcting the motion distortion of the lidar point cloud data based on the integrated navigation system of claim 1, wherein the horizontal angle of the laser point of the lidar point cloud data frame at step S2 is an angle calculated from an x value and a y value of a first laser point in the lidar point cloud data;

the formula adopted by the calculation is as follows:

θ = arctan(y / x)

wherein: θ is the horizontal angle of the laser point at the laser radar point cloud data frame head, and x and y are the x-axis and y-axis coordinate values of that laser point;

the horizontal angle of the laser point at the tail of the laser radar point cloud data frame is the horizontal angle of the frame head plus the horizontal angular resolution, given that the internal rotating motor of the laser radar rotates clockwise.

5. The method for correcting the motion distortion of the point cloud data of the laser radar based on the integrated navigation system of claim 1, wherein in step S3 the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point is calculated from the horizontal angle between the current laser point and the frame-tail laser point and from the rotation frequency of the rotating motor inside the laser radar, according to the following steps:

step S3.1, calculating the time required by the internal rotating motor of the laser radar to rotate by 1 degree, and calculating the formula:

Δt1 = 1 / (360 · f)

wherein: Δt1 is the time required for the internal rotating motor of the laser radar to rotate by 1 degree, and f is the rotation frequency of the motor; then S3.2 is carried out;

s3.2, calculating the horizontal angle of the current laser point of the laser radar point cloud data,

calculating the formula:

θ1 = arctan(y / x)

wherein: θ1 is the horizontal angle of the current laser point of the laser radar point cloud data, and x and y are the x-axis and y-axis coordinate values of the current laser point; then S3.3 is carried out;

s3.3, calculating an included angle between the current laser point horizontal angle of the laser radar point cloud data and the laser point horizontal angle at the tail of the laser radar point cloud data frame,

calculating the formula:

Δθ = 360° + θ1 − θ0

wherein: Δθ is the included angle between the horizontal angle of the current laser point of the laser radar point cloud data and the horizontal angle of the frame-tail laser point, θ1 is the horizontal angle of the current laser point, and θ0 is the horizontal angle of the frame-tail laser point; then S3.4 is carried out;

s3.4, calculating the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data,

calculating the formula:

Δt = Δt1 · Δθ

wherein: Δt is the time from the current laser point of the laser radar point cloud data to the frame-tail laser point, Δt1 is the time required for the rotating motor inside the laser radar to rotate by 1 degree, and Δθ is the included angle between the horizontal angle of the current laser point and the horizontal angle of the frame-tail laser point.

6. The method for correcting the laser radar point cloud data motion distortion based on the integrated navigation system as claimed in claim 1, wherein the step S5 is to apply the motion information to the laser radar point cloud data for correction by using the integrated navigation system data at the previous time and the current time, and the specific steps are as follows:

step S5.1, from the acceleration and angular acceleration of the integrated navigation system at the previous moment and the current moment, calculating the jerk and angular jerk over that time period,

calculating the formula:

j = (a1 − a0) / (t1 − t0)

wherein: j is the jerk or angular jerk of the unmanned ship, a1 is the acceleration or angular acceleration of the unmanned ship at the current moment, a0 is the acceleration or angular acceleration at the previous moment, t1 is the time of the current moment, and t0 is the time of the previous moment; then entering S5.2;

s5.2, taking the coordinate origin of the laser radar point cloud data frame tail laser point as the coordinate origin of a global coordinate system, and then turning to S5.3;

and S5.3, calculating the translation amount and the rotation amount from the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point, and converting the current laser point into the global coordinate system by using the calculated jerk and angular jerk of the unmanned ship together with its angular velocity, velocity, angular acceleration, acceleration and the corresponding time.

7. The method for correcting the laser radar point cloud data motion distortion based on the integrated navigation system as claimed in claim 6, wherein in step S5.3 the calculated jerk and angular jerk of the unmanned ship, together with its angular velocity, velocity, angular acceleration, acceleration and the corresponding time, are used to convert the laser radar point cloud data to point cloud data in the global coordinate system whose origin is the frame-tail laser point, and the specific steps are as follows:

step S5.3.1, calculating the translation of x-axis, y-axis and z-axis by using the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data and the speed, acceleration and jerk of the unmanned ship,

calculating the formula:

Δx = vx·Δt + (1/2)·ax·Δt² + (1/6)·jx·Δt³
Δy = vy·Δt + (1/2)·ay·Δt² + (1/6)·jy·Δt³
Δz = vz·Δt + (1/2)·az·Δt² + (1/6)·jz·Δt³

wherein: Δx, Δy and Δz are the translation amounts from the current laser point of the laser radar point cloud data to the frame-tail laser point; vx, vy, vz, ax, ay, az and jx, jy, jz are the velocity, acceleration and jerk of the unmanned ship along the x, y and z axes at the current moment; Δt is the time required from the current laser point to the frame-tail laser point;

step S5.3.2, calculating the unmanned ship attitude angle of the current laser point of the laser radar point cloud data by using the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data and the angular velocity, the angular acceleration and the angular jerk of the unmanned ship,

calculating the formula:

θ = θ1 − (ω·Δt + (1/2)·a·Δt² + (1/6)·j·Δt³)

wherein: θ is the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data, θ1 is the attitude angle of the unmanned ship at the frame-tail laser point, ω is the angular velocity of the unmanned ship, a is the angular acceleration of the unmanned ship, and j is the angular jerk of the unmanned ship; the attitude angle at the current laser point is calculated separately for the x axis, the y axis and the z axis using the corresponding angular velocity, angular acceleration and angular jerk, and then turning to S5.3.3;

step S5.3.3, calculating the rotation quantity of the unmanned ship in the x axis, y axis and z axis by using the attitude angle of the unmanned ship,

calculating the formula:

Ryaw =
[ cosθz  −sinθz  0 ]
[ sinθz   cosθz  0 ]
[   0       0    1 ]

Rpitch =
[  cosθy  0  sinθy ]
[    0    1    0   ]
[ −sinθy  0  cosθy ]

Rroll =
[ 1    0       0   ]
[ 0  cosθx  −sinθx ]
[ 0  sinθx   cosθx ]

wherein: Ryaw is the rotation amount of the unmanned ship about the z axis, Rpitch is the rotation amount about the y axis, Rroll is the rotation amount about the x axis, and θx, θy, θz are the rotation angles of the unmanned ship about the x axis, the y axis and the z axis, and then turning to S5.3.4;

step S5.3.4, rotating the laser radar point cloud data current laser point to the horizontal attitude by using the rotation amount of the unmanned ship x axis, y axis and z axis,

calculating the formula:

[x', y', z']^T = Ryaw · Rpitch · Rroll · [x, y, z]^T

wherein: Ryaw is the rotation amount of the unmanned ship about the z axis, Rpitch is the rotation amount about the y axis, Rroll is the rotation amount about the x axis, [x, y, z]^T is the current laser point of the laser radar point cloud data, and [x', y', z']^T is the laser point in the horizontal attitude;

step S5.3.5, using the translation of x-axis, y-axis and z-axis to translate the laser point from horizontal position to global coordinate system,

calculating the formula:

[xg, yg, zg]^T = [x', y', z']^T − [Δx, Δy, Δz]^T

wherein: [xg, yg, zg]^T is the laser point with motion distortion removed, [x', y', z']^T is the laser point in the horizontal attitude, and [Δx, Δy, Δz]^T are the translation amounts from the current laser point of the laser radar point cloud data to the frame-tail laser point.

8. The method for correcting the laser radar point cloud data motion distortion based on the integrated navigation system as claimed in claim 1, wherein step S6 is implemented by using the integrated navigation system data at the current time to give the motion information to the laser radar point cloud data for correction, and the specific steps are as follows:

s6.1, taking the coordinate origin of the laser radar point cloud data frame tail laser point as the coordinate origin of a global coordinate system, and then turning to S6.2;

and S6.2, calculating the translation amount and the rotation amount according to the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point, and converting the current laser point into the global coordinate system by utilizing the known angular velocity, velocity, angular acceleration and acceleration of the unmanned ship and the corresponding time.

9. The method for correcting the motion distortion of the laser radar point cloud data based on the integrated navigation system of claim 8, wherein in step S6.2 the angular velocity, velocity, angular acceleration, acceleration and corresponding time of the unmanned ship are used to convert the laser radar point cloud data to point cloud data in the global coordinate system whose origin is the frame-tail laser point, and the specific steps are as follows:

step S6.2.1, calculating the translation amount of the x-axis, the y-axis and the z-axis by using the time required by the laser radar point cloud data from the current laser point to the laser point at the frame tail of the laser radar point cloud data, the speed and the acceleration of the unmanned ship, and calculating the formula:

Δx = vx·Δt + (1/2)·ax·Δt²
Δy = vy·Δt + (1/2)·ay·Δt²
Δz = vz·Δt + (1/2)·az·Δt²

wherein: Δx, Δy and Δz are the translation amounts from the current laser point of the laser radar point cloud data to the frame-tail laser point; vx, vy, vz and ax, ay, az are the velocity and acceleration of the unmanned ship along the x, y and z axes at the current moment; Δt is the time required from the current laser point to the frame-tail laser point;

step S6.2.2, calculating the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data by using the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, the angular velocity and the angular acceleration of the unmanned ship,

calculating the formula:

θ = θ1 − (ω·Δt + (1/2)·a·Δt²)

wherein: θ is the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data, and θ1 is the attitude angle of the unmanned ship at the frame-tail laser point; the attitude angle at the current laser point is calculated separately for the x axis, the y axis and the z axis using the corresponding angular velocity and angular acceleration of the unmanned ship, and then turning to S6.2.3.

Step S6.2.3, calculating the rotation amount of the unmanned ship in the x axis, y axis and z axis by using the attitude angle of the unmanned ship, and calculating the formula:

Ryaw =
[ cosθz  −sinθz  0 ]
[ sinθz   cosθz  0 ]
[   0       0    1 ]

Rpitch =
[  cosθy  0  sinθy ]
[    0    1    0   ]
[ −sinθy  0  cosθy ]

Rroll =
[ 1    0       0   ]
[ 0  cosθx  −sinθx ]
[ 0  sinθx   cosθx ]

wherein: Ryaw is the rotation amount of the unmanned ship about the z axis, Rpitch is the rotation amount about the y axis, Rroll is the rotation amount about the x axis, and θx, θy, θz are the rotation angles of the unmanned ship about the x axis, the y axis and the z axis, and then turning to S6.2.4;

step S6.2.4, rotating the laser radar point cloud data current laser point to the horizontal attitude by using the rotation amount of the unmanned ship x axis, y axis and z axis,

calculating the formula:

[x', y', z']^T = Ryaw · Rpitch · Rroll · [x, y, z]^T

wherein: Ryaw is the rotation amount of the unmanned ship about the z axis, Rpitch is the rotation amount about the y axis, Rroll is the rotation amount about the x axis, [x, y, z]^T is the current laser point of the laser radar point cloud data, and [x', y', z']^T is the laser point in the horizontal attitude;

step S6.2.5, using the translation of x-axis, y-axis and z-axis to translate the laser point from horizontal position to global coordinate system,

calculating the formula:

[xg, yg, zg]^T = [x', y', z']^T − [Δx, Δy, Δz]^T

wherein: [xg, yg, zg]^T is the laser point with motion distortion removed, [x', y', z']^T is the laser point in the horizontal attitude, and [Δx, Δy, Δz]^T are the translation amounts from the current laser point of the laser radar point cloud data to the frame-tail laser point.

10. A system for correcting laser radar point cloud data motion distortion based on an integrated navigation system, comprising modules for implementing the method according to any one of claims 1 to 9.

Technical Field

The invention relates to the technical field of unmanned ship automatic driving, in particular to correction of laser radar point cloud data motion distortion, and more particularly relates to a method for correcting laser radar point cloud data motion distortion based on an integrated navigation system.

Background

There are many sensors for environment perception on unmanned ships, such as laser radar, infrared sensors, ultrasonic sensors, sonar, monocular cameras, binocular cameras, and the like. Laser radar excels at short-range obstacle detection, has high depth resolution and precision, is unaffected by external illumination, and performs stably and accurately in a variety of water environments. Laser radar has therefore attracted attention as a main means of obstacle detection in the field of unmanned ship autopilot. The laser radar emits laser toward an obstacle and uses the reflected signal to calculate distance and reflection intensity, generating a point cloud map.

However, the laser radar drives its laser transmitter with an internal rotating motor, so a frame of point cloud data is generated over a period of time; the laser points are not all captured at the same instant. For example, Chinese patent application CN109975792A discloses a method for correcting motion distortion of a multi-line lidar point cloud based on multi-sensor fusion: according to the rotation relationship between the global coordinate system and the coordinate system of a frame's initial point, the displacement distortion is transformed to the initial-point coordinate system, and the points in the lidar's local coordinate system are then transformed to the initial-point coordinate system, yielding all points of the frame in the initial-point coordinate system together with their displacement distortion. The point cloud data is corrected by compensating the displacement distortion on the three-dimensional point coordinates, providing more accurate multi-line lidar point cloud data for subsequent algorithms such as target tracking, path planning, map construction and object identification. Although that document describes a method for correcting multi-line lidar point cloud motion distortion, it is mainly directed at unmanned vehicles and requires calculating the speeds of the four wheels, among other quantities. On an unmanned ship, the laser radar is rigidly connected to the hull and therefore rotates and translates with the ship, so the data received by the laser radar are not acquired at the same position and attitude and cannot be unified in one coordinate system. At present, this problem is usually handled by not correcting the lidar point cloud data at all, or by assuming uniform motion. This has relatively little impact in the field of unmanned vehicle autopilot, but produces large measurement errors in the field of unmanned ship autopilot.
In the field of unmanned vehicle automatic driving, the roll and pitch angles of the vehicle are very small, whereas in the field of unmanned ship automatic driving the attitude angle of the unmanned ship varies over a wide range, which affects the measurement error of the laser radar.

Therefore, a method is needed that stably and effectively reduces the laser radar point cloud data errors caused by the motion of the unmanned ship, so as to provide more accurate laser radar point cloud data for subsequent unmanned ship obstacle detection, obstacle avoidance and the like.

Disclosure of Invention

Aiming at the technical problems in the prior art, the invention aims to provide a method for correcting laser radar point cloud data motion distortion based on an integrated navigation system, which reduces the laser radar point cloud data errors caused by the motion of the unmanned ship.

In order to achieve the purpose, the invention adopts the following technical scheme:

a method for correcting laser radar point cloud data motion distortion based on an integrated navigation system is disclosed, wherein the laser radar point cloud data comprises a point cloud coordinate and a time stamp of the frame, and the method is characterized by comprising the following steps:

and step S1, reading the combined navigation system data and the laser radar point cloud data at the same time by using the time synchronizer. Then proceeds to step S2;

and step S2, calculating the horizontal angle of the laser spot of the head of the laser radar point cloud data frame and the horizontal angle of the laser spot of the tail of the laser radar point cloud data frame. Then proceeds to step S3;

and step S3, calculating the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data. Then proceeds to step S4;

step S4, judging whether the combined navigation system data and the laser radar point cloud data exist at the last moment, if so, entering S5, and if not, entering step S6;

step S5, utilizing the calculated time from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, and combining the navigation system data at the previous moment and the current moment to endow the laser radar point cloud data with motion information for correction;

and step S6, utilizing the calculated time from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, and combining the navigation system data at the current moment to endow the laser radar point cloud data with motion information for correction.

Preferably, the combined navigation system data of step S1 includes latitude and longitude, elevation, attitude angle, velocity, acceleration, angular velocity, angular acceleration, and time stamp.

Preferably, the time synchronizer of step S1 is configured to obtain two sets of data at the same time by performing a comparison pairing using a timestamp in the lidar point cloud data and a timestamp in the combined navigation system data.
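As an illustration of such timestamp pairing, the following minimal Python sketch matches each lidar frame stamp to the nearest navigation stamp within a tolerance. The list-based layout, the function name and the 50 ms tolerance are assumptions for illustration, not part of the patent:

```python
# Illustrative time synchronizer: pair each lidar timestamp with the nearest
# integrated-navigation timestamp (both lists assumed sorted, in seconds).
from bisect import bisect_left

def pair_by_timestamp(lidar_stamps, nav_stamps, tol=0.05):
    """Return (lidar_index, nav_index) pairs whose stamps differ by less than tol."""
    pairs = []
    for i, t in enumerate(lidar_stamps):
        k = bisect_left(nav_stamps, t)
        # candidates: the nav sample just before t and the one just after it
        candidates = [j for j in (k - 1, k) if 0 <= j < len(nav_stamps)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(nav_stamps[j] - t))
        if abs(nav_stamps[best] - t) < tol:
            pairs.append((i, best))
    return pairs
```

For example, `pair_by_timestamp([0.10, 0.20], [0.00, 0.11, 0.19])` pairs frame 0 with nav sample 1 and frame 1 with nav sample 2.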

Preferably, the laser point horizontal angle of the laser radar point cloud data frame head of step S2 is the angle calculated from the x value and the y value of the first laser point in the laser radar point cloud data.

Calculating the formula:

θ = arctan(y / x)

wherein: θ is the horizontal angle of the laser point at the laser radar point cloud data frame head, and x and y are the x-axis and y-axis coordinate values of that laser point.

The horizontal angle of the laser point at the tail of the laser radar point cloud data frame is the horizontal angle of the frame head plus the horizontal angular resolution. The horizontal angular resolution is a fixed parameter of the laser radar, and the internal rotating motor of the laser radar rotates clockwise.

Preferably, step S3 calculates the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point. Because this time cannot be measured directly, it is calculated from the horizontal included angle between the current laser point and the frame-tail laser point and the rotation frequency of the rotating motor inside the laser radar. The calculation comprises the following steps:

step S3.1, calculating the time required by the internal rotating motor of the laser radar to rotate by 1 degree

Calculating the formula:

Δt1 = 1 / (360 · f)

wherein: Δt1 is the time required for the internal rotating motor of the laser radar to rotate by 1 degree, and f is the rotation frequency of the motor. Then step S3.2 is carried out;

and S3.2, calculating the horizontal angle of the current laser point of the laser radar point cloud data.

Calculating the formula:

θ1 = arctan(y / x)

wherein: θ1 is the horizontal angle of the current laser point of the laser radar point cloud data, and x and y are the x-axis and y-axis coordinate values of the current laser point. Then entering S3.3;

step S3.3, calculating the included angle between the current laser point horizontal angle of the laser radar point cloud data and the laser point horizontal angle at the tail of the laser radar point cloud data frame

Calculating the formula:

Δθ = 360° + θ1 − θ0

wherein: Δθ is the included angle between the horizontal angle of the current laser point of the laser radar point cloud data and the horizontal angle of the frame-tail laser point. θ1 is the horizontal angle of the current laser point. θ0 is the horizontal angle of the frame-tail laser point. Then S3.4 is entered;

s3.4, calculating the time required from the current laser point of the laser radar point cloud data to the laser point at the tail of the laser radar point cloud data frame

Calculating the formula:

Δt = Δt1 · Δθ

wherein: Δt is the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point. Δt1 is the time required for the internal rotating motor of the laser radar to rotate by 1 degree. Δθ is the included angle between the horizontal angle of the current laser point and the horizontal angle of the frame-tail laser point.
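Steps S3.1 through S3.4 can be condensed into a short Python sketch. This is illustrative only, not the patented implementation: `atan2` replaces the bare arctan(y/x) so the quadrant is preserved, and wrapping the included angle into [0°, 360°) is an added safety assumption.

```python
import math

def time_to_frame_tail(x, y, theta_tail_deg, rotation_hz):
    """Time from the laser point at (x, y) to the frame-tail laser point."""
    dt1 = 1.0 / (360.0 * rotation_hz)                    # S3.1: seconds per degree
    theta1 = math.degrees(math.atan2(y, x))              # S3.2: current horizontal angle
    dtheta = (360.0 + theta1 - theta_tail_deg) % 360.0   # S3.3: included angle
    return dt1 * dtheta                                  # S3.4: time to the frame tail
```

With a 10 Hz motor, a point at 0° and a frame tail at 90° gives 270° of remaining rotation, i.e. 0.075 s.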

Preferably, the method is characterized in that in step S5, the navigation system data is combined with the previous time and the current time to give the motion information to the lidar point cloud data for correction, and the method specifically comprises the following steps:

Step S5.1, calculating the jerk and angular jerk over the time period from the acceleration and angular acceleration of the integrated navigation system at the previous moment and the current moment.

Calculating the formula:

j = (a1 − a0) / (t1 − t0)

wherein: j is the jerk or angular jerk of the unmanned ship. a1 is the acceleration or angular acceleration of the unmanned ship at the current moment. a0 is the acceleration or angular acceleration at the previous moment. t1 is the time of the current moment. t0 is the time of the previous moment. Then step S5.2 is carried out;
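A minimal sketch of this finite-difference estimate (the function name is illustrative); it is applied per axis, to either linear or angular acceleration:

```python
def jerk(a1, a0, t1, t0):
    """Jerk (or angular jerk) from two consecutive nav samples: (a1 - a0) / (t1 - t0)."""
    return (a1 - a0) / (t1 - t0)
```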

and S5.2, taking the coordinate origin of the laser radar point cloud data frame tail laser point as the coordinate origin of the global coordinate system. Then go to step S5.3;

And S5.3, calculating the translation amount and the rotation amount from the time required from the current laser point of the laser radar point cloud data to the frame-tail laser point, and converting the current laser point into the global coordinate system by using the calculated jerk and angular jerk of the unmanned ship together with its angular velocity, velocity, angular acceleration, acceleration and the corresponding time.

Preferably, step S5.3 converts the laser radar point cloud data, by using the calculated jerk and angular jerk of the unmanned ship together with its angular velocity, velocity, angular acceleration, acceleration and the corresponding time, to point cloud data in the global coordinate system whose origin is the frame-tail laser point. The method comprises the following specific steps:

step S5.3.1, calculating the translation of x-axis, y-axis and z-axis according to the time required from the current laser point of the laser radar point cloud data to the laser point at the tail of the laser radar point cloud data frame and the speed, acceleration and jerk of the unmanned ship

Calculating the formula:

Δx = vx·Δt + (1/2)·ax·Δt² + (1/6)·jx·Δt³
Δy = vy·Δt + (1/2)·ay·Δt² + (1/6)·jy·Δt³
Δz = vz·Δt + (1/2)·az·Δt² + (1/6)·jz·Δt³

wherein: Δx, Δy and Δz are the translation amounts required from the current laser point of the laser radar point cloud data to the frame-tail laser point. vx, vy, vz are the velocities of the unmanned ship along the x, y and z axes at the current moment. ax, ay, az are the accelerations along the x, y and z axes at the current moment. jx, jy, jz are the jerks along the x, y and z axes at the current moment. Δt is the time required from the current laser point to the frame-tail laser point. Then go to step S5.3.2;
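The per-axis translation amount can be sketched as follows (illustrative names; the function is applied once per axis with that axis's velocity, acceleration and jerk):

```python
def translation(v, a, j, dt):
    """Displacement over dt under constant jerk: v*dt + a*dt**2/2 + j*dt**3/6."""
    return v * dt + 0.5 * a * dt ** 2 + (1.0 / 6.0) * j * dt ** 3
```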

and S5.3.2, calculating the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data by using the time required from the current laser point of the laser radar point cloud data to the laser point at the frame tail of the laser radar point cloud data, and the angular velocity, the angular acceleration and the angular jerk of the unmanned ship.

Calculating the formula:

θ = θ1 − (ω·Δt + (1/2)·a·Δt² + (1/6)·j·Δt³)

wherein: θ is the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data. θ1 is the attitude angle of the unmanned ship at the frame-tail laser point. ω is the angular velocity of the unmanned ship. a is the angular acceleration of the unmanned ship. j is the angular jerk of the unmanned ship. The attitude angle at the current laser point is calculated separately for the x axis, the y axis and the z axis using the corresponding angular velocity, angular acceleration and angular jerk. Then go to step S5.3.3;
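A sketch of this attitude-angle propagation; the sign convention (the current laser point precedes the frame tail by dt, so the rotation accumulated over dt is subtracted from the frame-tail attitude) is an assumption of this illustration:

```python
def attitude_at_point(theta_tail, omega, alpha, j, dt):
    """Attitude angle at the current laser point, propagated back from the
    frame-tail attitude theta_tail over dt (constant-jerk model, per axis)."""
    return theta_tail - (omega * dt + 0.5 * alpha * dt ** 2 + (1.0 / 6.0) * j * dt ** 3)
```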

step S5.3.3, calculating the rotation amount of the unmanned ship in x-axis, y-axis and z-axis by using the attitude angle of the unmanned ship

Calculating the formula:

Ryaw =
[ cosθz  −sinθz  0 ]
[ sinθz   cosθz  0 ]
[   0       0    1 ]

Rpitch =
[  cosθy  0  sinθy ]
[    0    1    0   ]
[ −sinθy  0  cosθy ]

Rroll =
[ 1    0       0   ]
[ 0  cosθx  −sinθx ]
[ 0  sinθx   cosθx ]

wherein: Ryaw is the rotation amount of the unmanned ship about the z axis. Rpitch is the rotation amount about the y axis. Rroll is the rotation amount about the x axis. θx, θy, θz are the rotation angles of the unmanned ship about the x axis, the y axis and the z axis. Then go to step S5.3.4;

Step S5.3.4, rotating the current laser point of the laser radar point cloud data to the horizontal attitude using the rotation amounts of the unmanned ship about the x-, y- and z-axes.

The calculation formula is:

(x′, y′, z′)ᵀ = R_yaw · R_pitch · R_roll · (x, y, z)ᵀ

where R_yaw, R_pitch and R_roll are the rotation amounts of the unmanned ship about the z-, y- and x-axes respectively; (x, y, z)ᵀ is the current laser point of the laser radar point cloud data and (x′, y′, z′)ᵀ is the laser point in the horizontal attitude. Go to step S5.3.5;
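The rotation in this step can be sketched as a chain of matrix-vector products (a sketch assuming a z-y-x Euler convention; `mat_vec` and `to_horizontal` are illustrative names, and the matrices are passed in as 3×3 nested lists):

```python
def mat_vec(m, v):
    # multiply a 3x3 matrix (nested lists) by a 3-vector
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def to_horizontal(point, r_yaw, r_pitch, r_roll):
    # apply R_yaw · R_pitch · R_roll to the current laser point
    return mat_vec(r_yaw, mat_vec(r_pitch, mat_vec(r_roll, point)))
```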

Step S5.3.5, translating the laser point from the horizontal attitude into the global coordinate system using the translations along the x-, y- and z-axes.

The calculation formula is:

(x″, y″, z″)ᵀ = (x′, y′, z′)ᵀ + (Δx, Δy, Δz)ᵀ

where (x″, y″, z″)ᵀ is the laser point of the laser radar point cloud data with the motion distortion removed; (x′, y′, z′)ᵀ is the laser point in the horizontal attitude; (Δx, Δy, Δz)ᵀ is the translation from the current laser point of the laser radar point cloud data to the end-of-frame laser point along the x-, y- and z-axes.
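Putting steps S5.3.1 through S5.3.5 together, a per-point correction could look like the sketch below (all names are illustrative; the sign conventions and the z-y-x rotation order are assumptions, since the equation figures of the patent are not reproduced here):

```python
import math

def _rot(axis, t):
    # right-handed rotation matrix about the given axis
    c, s = math.cos(t), math.sin(t)
    if axis == "z":
        return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    if axis == "y":
        return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def _mv(m, v):
    # 3x3 matrix times 3-vector
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def correct_point(p, dt, vel, acc, jerk, omega, alpha, ang_jerk):
    """Correct one laser point to the end-of-frame time (steps S5.3.1-S5.3.5).
    Per-axis inputs are (x, y, z) tuples of velocities, accelerations, jerks,
    angular velocities, angular accelerations and angular jerks."""
    # S5.3.1: translation accumulated over dt (third order, with jerk)
    trans = [v * dt + 0.5 * a * dt ** 2 + j * dt ** 3 / 6.0
             for v, a, j in zip(vel, acc, jerk)]
    # S5.3.2: rotation of the ship between the current point and the frame end
    th = [w * dt + 0.5 * al * dt ** 2 + jj * dt ** 3 / 6.0
          for w, al, jj in zip(omega, alpha, ang_jerk)]
    # S5.3.3-S5.3.4: rotate the point to the horizontal attitude
    p_h = _mv(_rot("z", th[2]), _mv(_rot("y", th[1]), _mv(_rot("x", th[0]), p)))
    # S5.3.5: translate into the global coordinate system
    return [q + t for q, t in zip(p_h, trans)]
```

With all motion inputs zero, the point is returned unchanged; with a pure x-velocity, it is shifted by v_x·Δt along the x-axis.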

Preferably, in step S6 the integrated navigation system data at the current time is combined and motion information is assigned to the laser radar point cloud data for correction, specifically comprising the following steps:

Step S6.1, taking the end-of-frame laser point of the laser radar point cloud data as the coordinate origin of the global coordinate system. Then go to S6.2;

Step S6.2, calculating the translation and rotation amounts from the time required from the current laser point of the laser radar point cloud data to the end-of-frame laser point, and converting the current laser point into the global coordinate system using the known angular velocity, velocity, angular acceleration, acceleration and corresponding time of the unmanned ship.

Preferably, step S6.2 converts the laser radar point cloud data, using the angular velocity, angular acceleration, velocity, acceleration and corresponding time of the unmanned ship, into the global coordinate system whose origin is the end-of-frame laser point of the laser radar point cloud data frame. The specific steps are as follows:

Step S6.2.1, calculating the translations along the x-, y- and z-axes from the time required from the current laser point of the laser radar point cloud data to the end-of-frame laser point and the velocity and acceleration of the unmanned ship.

The calculation formula is:

(Δx, Δy, Δz)ᵀ = (v_xΔt + (1/2)a_xΔt², v_yΔt + (1/2)a_yΔt², v_zΔt + (1/2)a_zΔt²)ᵀ

where (Δx, Δy, Δz)ᵀ is the translation required from the current laser point of the laser radar point cloud data to the end-of-frame laser point; v_x, v_y and v_z are the velocities of the unmanned ship along the x-, y- and z-axes at the current moment; a_x, a_y and a_z are the corresponding accelerations; Δt is the time required from the current laser point to the end-of-frame laser point. Then go to step S6.2.2;
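This second-order translation (without the jerk term of step S5.3.1) can be sketched in one line per axis (illustrative names):

```python
def translation(vel, acc, dt):
    # dx = v*dt + a*dt^2/2 for each of the x, y and z axes
    return tuple(v * dt + 0.5 * a * dt ** 2 for v, a in zip(vel, acc))
```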

Step S6.2.2, calculating the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data from the time required from the current laser point to the end-of-frame laser point and the angular velocity and angular acceleration of the unmanned ship.

The calculation formula is:

θ = θ₁ − (ωΔt + (1/2)aΔt²)

where θ is the attitude angle of the unmanned ship at the current laser point of the laser radar point cloud data; θ₁ is the attitude angle of the unmanned ship at the end-of-frame laser point; ω is the angular velocity and a the angular acceleration of the unmanned ship. The attitude angles about the x-, y- and z-axes are calculated with the corresponding angular velocities and angular accelerations. Then go to S6.2.3.

Step S6.2.3, calculating the rotation amounts of the unmanned ship about the x-, y- and z-axes from the attitude angles of the unmanned ship.

The calculation formulas are:

R_yaw = [ cosθ_z  −sinθ_z  0 ; sinθ_z  cosθ_z  0 ; 0  0  1 ]

R_pitch = [ cosθ_y  0  sinθ_y ; 0  1  0 ; −sinθ_y  0  cosθ_y ]

R_roll = [ 1  0  0 ; 0  cosθ_x  −sinθ_x ; 0  sinθ_x  cosθ_x ]

where R_yaw is the rotation amount of the unmanned ship about the z-axis, R_pitch the rotation amount about the y-axis and R_roll the rotation amount about the x-axis; θ_x, θ_y and θ_z are the rotation angles of the unmanned ship about the x-, y- and z-axes. Then go to S6.2.4;

Step S6.2.4, rotating the current laser point of the laser radar point cloud data to the horizontal attitude using the rotation amounts of the unmanned ship about the x-, y- and z-axes.

The calculation formula is:

(x′, y′, z′)ᵀ = R_yaw · R_pitch · R_roll · (x, y, z)ᵀ

where R_yaw, R_pitch and R_roll are the rotation amounts of the unmanned ship about the z-, y- and x-axes respectively; (x, y, z)ᵀ is the current laser point of the laser radar point cloud data and (x′, y′, z′)ᵀ is the laser point in the horizontal attitude. Then go to S6.2.5;

Step S6.2.5, translating the laser point from the horizontal attitude into the global coordinate system using the translations along the x-, y- and z-axes.

The calculation formula is:

(x″, y″, z″)ᵀ = (x′, y′, z′)ᵀ + (Δx, Δy, Δz)ᵀ

where (x″, y″, z″)ᵀ is the laser point of the laser radar point cloud data with the motion distortion removed; (x′, y′, z′)ᵀ is the laser point in the horizontal attitude; (Δx, Δy, Δz)ᵀ is the translation from the current laser point of the laser radar point cloud data to the end-of-frame laser point along the x-, y- and z-axes.

In addition, the invention provides a system for correcting laser radar point cloud data motion distortion based on the integrated navigation system, comprising modules that implement the above method; it can be realized, for example, as a computer program, so as to achieve automatic computation.

Overall, the advantages of the invention are: the integrated navigation system data and the laser radar point cloud data are fused, and point cloud data recorded at different times are corrected and unified into a global coordinate system using velocity and acceleration information, yielding more accurate and effective laser radar point cloud data.

Drawings

FIG. 1 is a flow chart of a method for correcting laser radar point cloud data motion distortion based on a combined navigation system.

Fig. 2 is a flow chart for calculating the time required from the current laser point to the end-of-frame laser point of the laser radar point cloud data.

FIG. 3 is a flow chart of the correction process in which motion information is assigned to the laser radar point cloud data, where FIG. 3-1 shows the correction process when combined navigation system data and laser radar point cloud data were present at the previous time, and FIG. 3-2 shows the correction process when they were not present at the previous time.

FIG. 4 is a flowchart of converting the current laser point of the laser radar point cloud data into the global coordinate system, where FIG. 4-1 corresponds to FIG. 3-1 and FIG. 4-2 corresponds to FIG. 3-2.

Fig. 5 is a picture of the lake scene from which the laser radar point cloud data were acquired.

Fig. 6 is raw lidar point cloud data.

FIG. 7 is point cloud data after laser radar point cloud motion distortion is corrected based on the combined navigation system.

Detailed Description

The present invention will be described in further detail with reference to specific embodiments.
