Relative positioning method and system based on a multi-sensor fusion unmanned vehicle, and vehicle

Document No.: 1935217    Publication date: 2021-12-07

Reading note: This invention, "Relative positioning method and system based on a multi-sensor fusion unmanned vehicle, and vehicle", was designed by 闫耀威, 王宽, 林鑫余 and 彭祥军 on 2021-08-05. Abstract: The invention discloses a relative positioning method and system based on a multi-sensor fusion unmanned vehicle, and a vehicle, the method comprising the following steps: (1) data access; (2) positioning initialization; (3) IMU integration; (4) map database matching; (5) fusion positioning modeling; (6) fusion positioning solving. The invention reduces the influence of accumulated error on the positioning result and effectively reduces mismatches in the matching process.

1. A relative positioning method based on a multi-sensor fusion unmanned vehicle is characterized by comprising the following steps:

step 1, data access;

receiving the data output by a GPS, an IMU, a wheel speed sensor and a lidar in real time, attaching a timestamp for the current moment to each data frame, and storing each frame into its corresponding buffer;

step 2, positioning initialization;

popping two frames of GPS data from the GPS buffer, judging whether the vehicle has moved based on the two frames, and if so, calculating the position P, velocity V and heading angle of the current vehicle;

step 3, IMU integral calculation;

obtaining the IMU state (R, p, V, ba, bg) at each IMU moment by integrating the linear acceleration and angular velocity of the IMU according to the IMU dynamics equations, wherein R denotes the attitude rotation, p the translation, ba the accelerometer bias and bg the gyroscope bias;

distortion correction is carried out on the real-time lidar point cloud according to the historical positioning pose and the pose (R, p) obtained by IMU integration;

step 4, map database matching;

according to the translation p obtained by IMU integration, searching the map database with KNN for the map frame nearest to p, calculating the relative pose (R, p) between the real-time radar frame and the retrieved map frame with the NDT (Normal Distributions Transform) matching algorithm, and adding this pose (R, p) as a unary factor to the factor graph model computation graph;

step 5, fusing, positioning and modeling;

adding the received GPS signal to the factor graph model computation graph as a unary factor;

adding the received wheel speed signal to the factor graph model computation graph as a unary factor;

adding the IMU state (R, p, V, ba, bg) to the factor graph model computation graph as a binary factor;

step 6, fusion positioning solving;

and sending all of the above factors to the iSAM2 module to obtain the final lateral and longitudinal coordinates and heading angle of the vehicle in the map.

2. The relative positioning method based on a multi-sensor fusion unmanned vehicle according to claim 1, characterized in that: two frames of GPS data separated by more than 0.1 m are used; the displacement between the two frames divided by the time difference gives the velocity, and the heading angle is obtained from the velocity components in the X and Y directions; the computed velocity is used as the initialization velocity of the IMU, and the GPS coordinate is sent to the NDT database to search for the bin file of the NDT map frame closest to that coordinate; after the search, the current frame is registered against the map to obtain the position P, velocity V and heading angle of the current vehicle; the computed position P and heading angle are used as the initialization pose of the IMU, and the velocity V is used as an initialization parameter of the main-thread IMU.

3. The multi-sensor fusion unmanned-vehicle-based relative positioning method according to claim 2, characterized in that: solving the IMU state vector at each moment, specifically:

firstly, the IMU data between two moments are retrieved and assembled into a 6 × 1 vector containing the acceleration and angular velocity; this vector, together with the time difference between the two frames of data, is fed into the imu_preintegrated function of the factor graph model for integration, and the IMU state (R, p, V, ba, bg) is predicted from each incoming frame of IMU data.

4. The multi-sensor fusion unmanned-vehicle-based relative positioning method according to claim 2, characterized in that: the step 4 specifically comprises the following steps:

the map building process: the input is a trajectory, i.e. the positioning results {R_i, p_i}, i = 1, ..., N, computed by the IMU factor (imufactor), where R_i denotes a rotation vector and p_i a translation vector, together with the radar point cloud frames at the corresponding moments, {L_i}, i = 1, ..., N, where L_i denotes the pose information of each lidar frame in the local coordinate system; the point cloud of each frame is converted into the world coordinate system, i.e. L_i^W = R_i · L_i + p_i, where L_i^W denotes each radar frame converted from the local coordinate system into the world coordinate system and p_i denotes the translation applied during the coordinate conversion, finally yielding the point cloud map of the lidar;

map matching: NDT is used for matching, i.e. the space occupied by the reference point cloud is divided, centered on the current point, into grids or voxels of a specified size; the lidar scan points falling into each cell are counted, and when a cell contains no fewer than 4 laser points, the multi-dimensional normal distribution parameters of that cell are computed, i.e. the mean and covariance of the points in the cell; the odometry obtained from the IMU is then used to assign the initial transformation parameters, and the point cloud to be registered is converted into the grid of the reference point cloud by the transformation matrix T, i.e. L_map = T · L_new, where L_map denotes the registered point cloud coordinates converted into the map frame and L_new denotes the unconverted lidar point cloud coordinates in the local coordinate system; the normal distribution PDF of the points in each cell is computed as p(k) = exp[-0.5 (k - q)^T Σ^{-1} (k - q)], where k denotes a scan point in the cell, q denotes the mean of the normal distribution and Σ its covariance; the probability densities computed for all cells are summed to obtain the NDT registration score, and the objective function (the score) is optimized with a Newton optimization algorithm, i.e. the transformation parameters that make the score optimal are sought, giving the final NDT positioning result; after the coordinate information of the first GPS frame is received, the map frame closest to that coordinate is searched for and, once found, stored in a container; the current lidar data, after removal of point cloud distortion and coordinate conversion, is then registered against this map frame to obtain the relative pose after registration, and the obtained pose is passed to the factor graph model computation graph as a unary factor.

5. A relative positioning system based on a multi-sensor fusion unmanned vehicle, comprising a memory and a controller connected to the memory, characterized in that: the memory stores a computer readable program, and the controller, when calling the program, is capable of performing the steps of the relative positioning method based on a multi-sensor fusion unmanned vehicle according to any one of claims 1 to 4.

6. A vehicle, characterized in that it adopts the relative positioning system based on a multi-sensor fusion unmanned vehicle according to claim 5.

Technical Field

The invention belongs to the technical field of positioning, and particularly relates to a relative positioning method and system based on a multi-sensor fusion unmanned vehicle and a vehicle.

Background

One of the core capabilities required of an autonomous mobile unmanned vehicle is the ability to sense the surrounding environment with sensors. Real-time positioning within the environment has long been a popular research topic for unmanned vehicles performing autonomous navigation. The traditional positioning method computes the current pose of the unmanned vehicle in real time from data measured by encoders mounted on the vehicle's wheels. This method is simple to use but suffers from inherent cumulative error: the robot inevitably slips when moving on the ground, and errors introduced during system design and manufacturing accumulate as the robot moves, so that after running for a period of time the computed pose can no longer be corrected or relied upon; the method therefore has corresponding limitations.

In the course of research on robot positioning, lidar scan matching has gradually been adopted to correct the drift produced by odometry-based pose estimation. Scan matching registers the lidar scan at the current moment against the scan at the previous moment; the matching yields the pose change between the two sets of laser point clouds and thus determines the pose of the robot body in the environment. This compensates well for the accumulated error of odometry positioning, and when the radar is of high precision the positioning result is highly accurate. However, laser scan matching has its own limitations: the scanned environment must contain enough landmark features, and those features must not be highly similar to one another, otherwise mismatching occurs.

Therefore, there is a need to develop a new relative positioning method, system and vehicle based on multi-sensor fusion unmanned vehicle.

Disclosure of Invention

The invention aims to provide a relative positioning method and system based on a multi-sensor fusion unmanned vehicle and a vehicle, which can reduce the influence of accumulated errors on positioning results and effectively reduce mismatching results in the matching process.

In a first aspect, the relative positioning method based on the multi-sensor fusion unmanned vehicle comprises the following steps:

step 1, data access;

receiving the data output by a GPS, an IMU, a wheel speed sensor and a lidar in real time, attaching a timestamp for the current moment to each data frame, and storing each frame into its corresponding buffer;

step 2, positioning initialization;

popping two frames of GPS data from the GPS buffer, judging whether the vehicle has moved based on the two frames, and if so, calculating the position P, velocity V and heading angle of the current vehicle;

step 3, IMU integral calculation;

obtaining the IMU state (R, p, V, ba, bg) at each IMU moment by integrating the linear acceleration and angular velocity of the IMU according to the IMU dynamics equations, wherein R denotes the attitude rotation, p the translation, ba the accelerometer bias and bg the gyroscope bias;

distortion correction is carried out on the real-time lidar point cloud according to the historical positioning pose and the pose (R, p) obtained by IMU integration;

step 4, map database matching;

according to the translation p obtained by IMU integration, searching the map database with KNN for the map frame nearest to p, calculating the relative pose (R, p) between the real-time radar frame and the retrieved map frame with the NDT (Normal Distributions Transform) matching algorithm, and adding this pose (R, p) as a unary factor to the factor graph model computation graph;

step 5, fusing, positioning and modeling;

adding the received GPS signal to the factor graph model computation graph as a unary factor;

adding the received wheel speed signal to the factor graph model computation graph as a unary factor;

adding the IMU state (R, p, V, ba, bg) to the factor graph model computation graph as a binary factor;

step 6, fusion positioning solving;

and sending all of the above factors to the iSAM2 module to obtain the final lateral and longitudinal coordinates and heading angle of the vehicle in the map.

Optionally, two frames of GPS data separated by more than 0.1 m are used; the displacement between the two frames divided by the time difference gives the velocity, and the heading angle is obtained from the velocity components in the X and Y directions; the computed velocity is used as the initialization velocity of the IMU, and the GPS coordinate is sent to the NDT database to search for the bin file of the NDT map frame closest to that coordinate; after the search, the current frame is registered against the map to obtain the position P, velocity V and heading angle of the current vehicle; the computed position P and heading angle are used as the initialization pose of the IMU, and the velocity V is used as an initialization parameter of the main-thread IMU.
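As an illustration of the initialization computation described above (the symbols below are ours, not the patent's), the velocity and heading angle obtained from two GPS fixes (x_1, y_1) at time t_1 and (x_2, y_2) at time t_2 can be written as:

```latex
% Velocity and heading from two GPS fixes (illustrative notation)
V_x = \frac{x_2 - x_1}{t_2 - t_1}, \qquad
V_y = \frac{y_2 - y_1}{t_2 - t_1}, \qquad
V   = \sqrt{V_x^2 + V_y^2}, \qquad
\psi = \operatorname{atan2}\!\left(V_y,\, V_x\right)
```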

Optionally, solving an IMU state vector at each time, specifically:

firstly, the IMU data between two moments are retrieved and assembled into a 6 × 1 vector containing the acceleration and angular velocity; this vector, together with the time difference between the two frames of data, is fed into the imu_preintegrated function of the factor graph model for integration, and the IMU state (R, p, V, ba, bg) is predicted from each incoming frame of IMU data.

Optionally, the step 4 specifically includes:

the map building process: the input is a trajectory, i.e. the positioning results {R_i, p_i}, i = 1, ..., N, computed by the IMU factor (imufactor), where R_i denotes a rotation vector and p_i a translation vector, together with the radar point cloud frames at the corresponding moments, {L_i}, i = 1, ..., N, where L_i denotes the pose information of each lidar frame in the local coordinate system; the point cloud of each frame is converted into the world coordinate system, i.e. L_i^W = R_i · L_i + p_i, where L_i^W denotes each radar frame converted from the local coordinate system into the world coordinate system and p_i denotes the translation applied during the coordinate conversion, finally yielding the point cloud map of the lidar;

map matching: NDT is used for matching, i.e. the space occupied by the reference point cloud is divided, centered on the current point, into grids or voxels of a specified size; the lidar scan points falling into each cell are counted, and when a cell contains no fewer than 4 laser points, the multi-dimensional normal distribution parameters of that cell are computed, i.e. the mean and covariance of the points in the cell; the odometry obtained from the IMU is then used to assign the initial transformation parameters, and the point cloud to be registered is converted into the grid of the reference point cloud by the transformation matrix T, i.e. L_map = T · L_new, where L_map denotes the registered point cloud coordinates converted into the map frame and L_new denotes the unconverted lidar point cloud coordinates in the local coordinate system; the normal distribution PDF of the points in each cell is computed as p(k) = exp[-0.5 (k - q)^T Σ^{-1} (k - q)], where k denotes a scan point in the cell, q denotes the mean of the normal distribution and Σ its covariance; the probability densities computed for all cells are summed to obtain the NDT registration score, and the objective function (the score) is optimized with a Newton optimization algorithm, i.e. the transformation parameters that make the score optimal are sought, giving the final NDT positioning result; after the coordinate information of the first GPS frame is received, the map frame closest to that coordinate is searched for and, once found, stored in a container; the current lidar data, after removal of point cloud distortion and coordinate conversion, is then registered against this map frame to obtain the relative pose after registration, and the obtained pose is passed to the factor graph model computation graph as a unary factor.

In a second aspect, the relative positioning system based on the multi-sensor fusion unmanned vehicle comprises a memory and a controller, wherein the controller is connected with the memory, a computer readable program is stored in the memory, and when the controller calls the computer readable program, the steps of the relative positioning method based on the multi-sensor fusion unmanned vehicle can be executed.

In a third aspect, the invention provides a vehicle, which adopts the relative positioning system based on the multi-sensor fusion unmanned vehicle.

The invention has the following advantages: the method overcomes the inherent constraints of conventional odometry-based positioning and reduces the influence of accumulated error on the positioning result; by combining laser point cloud scan matching with IMU, GPS and odometry positioning, mismatches during matching can be effectively reduced even in large-scale environments or environments with highly similar landmark features.

Drawings

FIG. 1 is a diagram illustrating the relationship between sensors and the processing flow in this embodiment;

FIG. 2 is a diagram illustrating how sensor data in the algorithm module is processed in the factor graph model according to this embodiment;

FIG. 3 is a point cloud map constructed using the NDT algorithm based on the GPS trajectory in this embodiment;

FIG. 4 is a comparison between the positioning track and the high-precision GPS track in this embodiment (the two tracks overlap completely);

fig. 5 is a graph comparing the relative positioning result of the conventional single laser radar with the high-precision GPS track (with a large error).

Detailed Description

The invention will be further explained with reference to the drawings.

As shown in fig. 1, in this embodiment, a relative positioning method based on a multi-sensor fusion unmanned vehicle includes the following steps:

(1) data access;

The data output by the GPS, IMU, wheel speed sensor and lidar is received respectively, a timestamp for the current moment is attached to each data frame, and each frame is stored into its corresponding buffer.
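A minimal sketch of this buffering step, assuming a callback-style interface; the buffer sizes and sensor keys are illustrative assumptions, not taken from the patent:

```python
import time
from collections import deque

# One FIFO buffer per sensor; the maxlen values are illustrative assumptions.
buffers = {
    "gps":   deque(maxlen=200),
    "imu":   deque(maxlen=2000),
    "wheel": deque(maxlen=1000),
    "lidar": deque(maxlen=20),
}

def on_sensor_frame(sensor: str, frame) -> None:
    """Attach the current timestamp to an incoming frame and store it
    in the buffer belonging to its sensor."""
    stamped = {"stamp": time.time(), "data": frame}
    buffers[sensor].append(stamped)
```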

(2) Positioning initialization;

the IMU is initialized and the vehicle is first stationary for more than 50 seconds. And (3) initializing the GPS, wherein two frames of GPS data larger than 0.1 meter are used for initialization, the displacement of the former frame minus the latter frame is divided by time to calculate the speed, and the course angle yaw can be calculated through speed vectors in the X direction and the Y direction. If no wheel speed is introduced, GPS speed is used instead of wheel speed addition. The velocity calculated at the same time is used as the initial velocity of the IMU and the GPS coordinates are sent to an NDT (normal distribution transform mapping) database for searching bin files that are located one frame of NDT data closest. After the search, the current frame and the map are used for registration, so that the position P (namely the transverse and longitudinal positioning results x and y), the speed V and the course angle of the current vehicle are obtained, meanwhile, the calculated position P and the calculated course angle are used as the initialization pose of the IMU, and meanwhile, the speed V calculated by the front GPS is used as the initialization parameter of the IMU of the main line.

(3) IMU integral calculation;

after the IMU has obtained the initialization parameters, it begins to request the IMU state vector at each time. Because the accuracy of the IMU in the instant state is higher, but the integral error is very large, the IMU posture calculation method firstly searches IMU data between two moments, creates a 6 x 1 matrix comprising acceleration and angular velocity, sends the time difference of two frames of data into an IMU _ pre-integrated function in a factor graph model for integration, predicts the state (R, p, V, ba and bg) of the IMU according to the IMU data of each incoming frame, simultaneously needs other sensors as observed quantities for continuous calibration, and calculates the IMU accumulated value by using the imufactor function in the factor graph model to obtain odometer information. And (8) performing distortion correction on the real-time laser radar point cloud according to the historical positioning attitude and the pose (R, p) obtained by IMU integration.

(4) Map database matching;

Map building process: the input is a trajectory, i.e. the positioning results {R_i, p_i}, i = 1, ..., N, computed by the IMU factor (imufactor), where R_i denotes a rotation vector and p_i a translation vector, together with the radar point cloud frames at the corresponding moments, {L_i}, i = 1, ..., N, where L_i denotes the pose information of each lidar frame in the local coordinate system; the point cloud of each frame is converted into the world coordinate system, i.e. L_i^W = R_i · L_i + p_i, where L_i^W denotes each radar frame converted from the local coordinate system into the world coordinate system and p_i denotes the translation applied during the coordinate conversion, finally yielding the point cloud map of the lidar.
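A minimal NumPy sketch of the map-building transform L_i^W = R_i · L_i + p_i described above (array shapes and the function name are our own assumptions):

```python
import numpy as np

def build_point_cloud_map(rotations, translations, lidar_frames):
    """Convert each lidar frame from its local frame into the world frame
    and stack the results into a single point cloud map.

    rotations    : list of (3, 3) rotation matrices R_i.
    translations : list of (3,) translation vectors p_i.
    lidar_frames : list of (N_i, 3) point arrays L_i in local coordinates.
    """
    world_frames = []
    for R, p, L in zip(rotations, translations, lidar_frames):
        world_frames.append(L @ R.T + p)   # L_i^W = R_i * L_i + p_i (row vectors)
    return np.vstack(world_frames)
```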

Map matching: matching using NDT (normalized difference transform) which is a normal distribution transformation algorithm; the general idea of NDT is to divide the space occupied by reference point cloud into grids or voxels with specified size by taking the current point as the center; for example, (10cm) × (10cm) × (10cm) × (10cm) small grids, the scanning points of the laser radar are counted in the small grids, as long as the collected laser points of each grid are not less than 4, the multidimensional normal distribution parameter of each grid is calculated, and the mean value and the covariance of the corresponding grid are calculated. The transformation parameters are then assigned using the odometer data obtained above by the IMU, and for the point cloud to be registered, it is converted into the grid of the reference point cloud by means of the transformation matrix T, i.e.Wherein the content of the first and second substances,indicating that the registered coordinates are converted to point cloud coordinates under the map,lidar point cloud coordinates representing unconverted coordinates, LnewAnd the laser radar point cloud coordinates under the local coordinate system are represented. MeterCalculating the normal distribution PDF parameters of the point cloud in the grid: p (k) exp [ -0.5(k-q)t-1(k-q)]Where k represents all scan points in a grid and q represents the mean of a normal distribution. Each PDF can be considered an approximation of a local plane describing features such as plane position, orientation, shape, and smoothness. And adding the probability densities calculated by each grid to obtain a score (score) of NDT registration, and optimizing the objective function score according to a Newton optimization algorithm, namely searching a transformation parameter to ensure that the score value is optimal, so as to obtain a final positioning NDT positioning result. After receiving the coordinate information of the first frame of the GPS, starting to search a map frame closest to the frame coordinate, storing the map frame in a container after finding, then registering the current laser radar data converted by removing the point cloud distortion coordinate with the map frame, and solving the Gaussian distribution relation in each grid in the current frame and the NDT map. And obtaining the relative pose obtained after registration. And transmitting the obtained pose as a unitary factor to a factor graph model to serve as a unitary factor for adjusting IMU data.

(5) Fusing, positioning and modeling;

the data received by the GPS is also used as position observed quantity of the IMU and added into a factor graph model calculation graph, meanwhile, a unitary factor is self-established for data of a wheel speed meter, information of the wheel speed meter is mainly used as observed quantity to deduce position information of the vehicle, and the information is fused with GPS information and used as a parameter for continuously correcting the position information of the IMU.

(6) Fusion positioning solving;

all data factors are sent to the iSAM2 module (long-term sliding window), and the final horizontal and vertical coordinates and heading angle of the vehicle in the map are obtained.

As shown in fig. 3, a point cloud map is created by using the NDT algorithm based on the GPS track in this embodiment.

As shown in fig. 4, which compares the positioning track with the high-precision GPS track in this embodiment, the two tracks already overlap completely, showing that positioning with this method is accurate and stable.

As shown in fig. 5, it is a comparison graph of the relative positioning result of the conventional single lidar and the high-precision GPS track, and it can be seen from the comparison graph that the error of the relative positioning result of the single lidar is large.

In this embodiment, a relative positioning system based on a multi-sensor fusion unmanned vehicle includes a memory and a controller, where the controller is connected to the memory, and a computer-readable program is stored in the memory, and when the controller calls the computer-readable program, the steps of the relative positioning method based on a multi-sensor fusion unmanned vehicle as described in this embodiment may be executed.

In this embodiment, a vehicle adopts the relative positioning system based on the multi-sensor fusion unmanned vehicle as described in this embodiment.
