Method for estimating relative pose and inertia of space complete non-cooperative target

文档序号:1462902 发布日期:2020-02-21 浏览:10次 中文

阅读说明:本技术 一种空间完全非合作目标相对位姿和惯量估计方法 (Method for estimating relative pose and inertia of space complete non-cooperative target ) 是由 冯乾 侯晓磊 杨家男 潘泉 刘勇 于 2019-10-18 设计创作,主要内容包括:本发明公开了一种空间完全非合作目标相对位姿和惯量估计方法,步骤一、由两个左右间隔设置在追踪航天器上、且参数相同的工业相机实时获取非合作目标的图像信息,计算各特征点在相机坐标系下的3D位置和速度;步骤二、根据刚体运动模型,由至少三个特征点的3D位置和速度计算非合作目标的相对角速度;估计非合作目标在任意时刻的相对姿态;步骤三、将3D位置和速度以及刚体的相对姿态和相对角速度,估计非合作目标的质心位置、质心速度及特征点的相对位置;步骤四、估计非合作目标的转动惯量参数。在无需先验已知完全非合作目标的几何形状及特征点位置的前提下,可解算出完全非合作目标的质心位置、质心速度、相对姿态、相对角速度及惯量参数。(The invention discloses a method for estimating relative pose and inertia of a space complete non-cooperative target, which comprises the steps of firstly, acquiring image information of the non-cooperative target in real time by two industrial cameras which are arranged on a tracked spacecraft at intervals and have the same parameters, and calculating the 3D position and speed of each feature point under a camera coordinate system; step two, calculating the relative angular velocity of the non-cooperative target according to the rigid motion model and the 3D positions and the velocities of at least three characteristic points; estimating the relative attitude of the non-cooperative target at any moment; estimating the centroid position, the centroid speed and the relative position of the characteristic point of the non-cooperative target by using the 3D position, the 3D speed and the relative posture and the relative angular speed of the rigid body; and step four, estimating the rotational inertia parameters of the non-cooperative target. Under the premise of not knowing the geometric shape and the position of the characteristic point of the completely non-cooperative target a priori, the centroid position, the centroid speed, the relative attitude, the relative angular speed and the inertia parameter of the completely non-cooperative target can be calculated.)

1. A method for estimating relative pose and inertia of a space complete non-cooperative target is characterized by comprising the following steps:

step one, two industrial cameras which are arranged on a tracking spacecraft at intervals at left and right and have the same parameters are used for acquiring image information of a non-cooperative target in real time, image positions and image speeds of a plurality of feature points on the non-cooperative target on the left camera and the right camera are obtained through calculation, and then the 3D positions and speeds of the feature points under a camera coordinate system are calculated;

estimating the relative angular velocity of the non-cooperative target according to the rigid motion model and the 3D positions and the velocities of the at least three characteristic points in the step one; estimating the relative posture of the non-cooperative target at any time by combining the 3D positions of the characteristic points at the front and rear times;

thirdly, estimating the centroid position, the centroid speed and the relative position of the characteristic point of the non-cooperative target by using the 3D position and the speed in the first step and the relative posture and the relative angular speed of the rigid body in the second step and combining a centroid relative motion constraint equation of the non-cooperative target;

and step four, estimating the rotational inertia parameters of the non-cooperative target.

2. The method for estimating the relative pose and inertia of the completely non-cooperative target in the space according to claim 1 is characterized in that the method is applicable to the following conditions: the distance between the non-cooperative target and the tracking spacecraft is less than 100 meters, and the motion trail of the tracking spacecraft is a circular or near-circular orbit.

3. The method for estimating the relative pose and inertia of the completely non-cooperative target in space according to claim 2, wherein in the first step, the characteristic point P can be obtained by a pinhole modeliThe image positions on the left and right cameras are:

Figure FDA0002239476410000021

wherein:

ηi=[uiRviRuiLviL]Tis the image coordinates of the ith feature point on the left and right cameras,

ρi=[ρixρiyρiz]Tthe coordinates of the ith feature point in the camera coordinate system are 1,2, …, N;

f is the focal length of the camera;

b is the baseline width between the two cameras;

when considering the image noise in the actual measurement, the image position measurement model is:

Figure FDA0002239476410000022

wherein:

measuring image coordinates of an ith characteristic point containing noise on left and right cameras;

εiis modeled as a zero mean with a covariance of

Figure FDA0002239476410000024

and (3) obtaining the 3D position estimation of the feature point in the camera coordinate system according to the formula (1) and the noise model (A):

Figure FDA0002239476410000031

wherein:

Figure FDA0002239476410000032

the image velocity is derived from equation (1):

Figure FDA0002239476410000033

and (3) according to the formula (4), and considering the image noise, obtaining a speed estimation value of the ith characteristic point in the camera coordinate system as follows:

Figure FDA0002239476410000034

wherein:

Figure FDA0002239476410000035

Figure FDA0002239476410000036

4. The method for estimating the relative pose and inertia of the space complete non-cooperative target according to the claim 2 or 3, wherein the specific process of the second step is as follows:

at any time t, any characteristic point P on the non-cooperative targetiSatisfies the following geometrical positional relationship:

Figure FDA0002239476410000037

the speed satisfies the following relationship:

wherein:

ρ0(t) is the position of the non-cooperative target centroid relative to the camera coordinate system at time t;

Figure FDA0002239476410000041

Figure FDA0002239476410000042

the angular velocity of the non-cooperative target relative to the tracked spacecraft at the time t is coordinated under a camera coordinate system;

ricoordinates of the feature points on the non-cooperative target relative to the position of the center of mass under the non-cooperative target body;

taking any characteristic point on the non-cooperative target as a reference point, wherein the characteristic point is PNDefine δ ρi(t)=ρi(t)-ρN(t),δri=ri-rNCombining formula (6) and formula (7) gives:

Figure FDA0002239476410000044

Figure FDA0002239476410000045

by elimination of formulae (8) and (9)

Figure FDA0002239476410000046

Figure FDA0002239476410000047

wherein: [ δ ρ [ ]i(t)×]Representing the vector δ ρi(t) the corresponding cross-multiplication matrix;

in combination of formulas (2) and (5), the following can be obtained:

Figure FDA0002239476410000048

the relative angular velocity estimate for the non-cooperative target is given by the following equation (12):

Figure FDA0002239476410000049

wherein:

the minimum value of N is 3;

setting an initial time t0And an arbitrary time tkWherein, tk=t0+ k Δ t, k being a positive integer, Δ t being the time interval between two shots of the non-cooperative target image, according to equation (8):

defining pose variance

Figure FDA0002239476410000052

Figure FDA0002239476410000053

Figure FDA0002239476410000054

calculating the relative attitude estimation value of the non-cooperative target at any moment by the formula (15)

Figure FDA0002239476410000055

5. The method for estimating the relative pose and inertia of the space complete non-cooperative target according to the claim 2 or 3, wherein the specific process of the third step is as follows: the relative position of the non-cooperative target is described by a centroid relative motion constraint equation, which is a CW equation, then:

wherein the content of the first and second substances,

Figure FDA0002239476410000057

performing a second order taylor discretization on equation (19) and ignoring the higher order terms and the noise terms, we can obtain:

xp(tk)=F1xp(tk-Δt) (20);

wherein:

Δ t is the time interval between two times of shooting of non-cooperative target images;

xpa vector containing the position and the speed of the mass center of the non-cooperative target is obtained;

F1=I6+Δt·F+1/2Δt2·F2

is provided withA vector containing the relative position of the feature point with respect to the centroid of the non-cooperative target, and the position and velocity of the centroid with respect to the camera coordinate system; from equation (20) we can derive:

X1(tk)=G·X1(tk-Δt) (21);

wherein:

Figure FDA0002239476410000061

according to the formula (21), j is a positive integer for the interval j.DELTA.t, and satisfies the condition that j.DELTA.t is a positive integer for the specified time interval c.DELTA.t

X1(tk-j·Δt)=G-jX1(tk),k-c≤j<k (22);

Wherein: c is a positive integer less than k;

from equations (6) and (7), it can be found that:

C(tk)X1(tk)=Y(tk) (23);

wherein:

Figure FDA0002239476410000063

estimated values calculated according to equations (2), (5), (12) and (15)

Figure FDA0002239476410000064

Figure FDA0002239476410000066

wherein:

Figure FDA0002239476410000071

6. the method for estimating the relative pose and inertia of the space complete non-cooperative target according to the claim 2 or 3, wherein the process of the fourth step is as follows: angular momentum h of the non-cooperative target in an inertial coordinate systemIComprises the following steps:

Figure FDA0002239476410000072

wherein:

Figure FDA0002239476410000073

Figure FDA0002239476410000074

Figure FDA0002239476410000075

Figure FDA0002239476410000076

Figure FDA0002239476410000077

Figure FDA0002239476410000078

definition of

Figure FDA0002239476410000079

AxI=0 (26);

wherein:

Figure FDA00022394764100000711

Figure FDA00022394764100000712

Figure FDA0002239476410000081

|| ||2A modulus representing a vector;

definition B ═ ATA; the condition for minimizing the convex quadratic function f (x) according to equation (27) is:

BxI=0 (28);

for homogeneous equation (28), given xIIs 1, i.e. the first component of

Figure FDA0002239476410000082

Figure FDA0002239476410000083

wherein: b11Is a positive real number; then homogeneous equation (28) is written as:

Brxr=-b1(30);

the inertia tensor of the non-cooperative target satisfies its own physical constraints, namely:

Figure FDA0002239476410000084

substitution into

Figure FDA0002239476410000085

then equation (30) is a quadratic equation that satisfies constraint (32) by optimizing a convex quadratic function

Figure FDA0002239476410000092

[ technical field ] A method for producing a semiconductor device

The invention belongs to the technical field of navigation, and particularly relates to a method for estimating relative pose and inertia of a space complete non-cooperative target.

[ background of the invention ]

In recent years, with the increasing frequency of human space activities, the number of space debris is increased sharply, and by 1 month in 2013, the number of large space debris (the size is not less than 10cm) cataloged by a space monitoring network (SSN) under the U.S. strategic commander is nearly 15000, and the space debris (including small space debris with the size less than 1mm and dangerous debris between the large space debris and the small space debris) which is not cataloged and has the size less than 10cm is difficult to estimate, so that the space debris seriously threatens the normal operation of the in-orbit spacecraft, and the research on clearing the space debris is urgently needed. Compared with cooperative targets in a traditional rendezvous and docking task, non-cooperative targets such as a fault satellite, a failed spacecraft, space debris and the like have obvious 'three-free' characteristics, namely, no cooperative measurement beacon, no interactive communication and no model parameters, and the characteristics bring great challenges to relative navigation and near-field operation of the non-cooperative targets.

Regarding the estimation of the relative pose and the state of the non-cooperative target, the following methods are adopted at present, firstly, the relative pose parameters of the non-cooperative target are solved by an iterative algorithm based on monocular vision, the method assumes that the shape and the geometric dimension of the target are known, actually, the coordinates of a plurality of reference points on the target spacecraft are under a body coordinate system of the target spacecraft, and the relative position and the attitude of the current moment are solved by the iterative method. And the other method is to provide a non-cooperative target relative pose estimation method based on gyroscope, stereo vision and accelerometer measurement under the condition that the positions and inertia tensors of a plurality of characteristic points on the non-cooperative target are known in advance, so that accurate autonomous relative navigation is realized under the condition that the non-cooperative target spacecraft dynamic data is lacked. And the other method is to use a model matching (such as ICP) based method to solve the relative pose parameters of the non-cooperative target on the premise of a CAD structure of the known non-cooperative target in advance. And on the basis of binocular stereo vision, a SLAM (simultaneous Localization and mapping) method is adopted to solve the mass center position, the linear speed, the relative attitude, the relative angular speed and the inertia parameters of the space rotation non-cooperative target.

Compared with a cooperative target in a traditional rendezvous and docking task, a complete non-cooperative target such as a fault satellite, a failed spacecraft, a space debris and the like lacks cooperative information which is convenient for navigation, such as a measurement beacon, interactive communication, model parameters and the like, so that the traditional navigation algorithm suitable for the cooperative target fails. The existing navigation algorithm applied to the non-cooperative target depends on a model of the target, or the calculation amount is too large to realize online calculation. The above method relies either on a priori known shapes and geometries, or on a priori knowledge of the locations of feature points on the object and the moment of inertia of the object, or on a priori known CAD models of the object, which are not strictly speaking completely uncooperative objects. The SLAM method is used for calculating the centroid position, the linear speed, the relative attitude, the relative angular speed and the inertia parameters of the space rotation non-cooperative target, the calculation amount is large, the time consumption is long, and only offline calculation can be performed.

[ summary of the invention ]

The invention aims to provide a method for estimating the relative pose and inertia of a space complete non-cooperative target, which can solve the centroid position, the centroid speed, the relative attitude, the relative angular speed and the inertia parameter of the complete non-cooperative target on the premise of not knowing the geometric shape and the characteristic point position of the complete non-cooperative target a priori, and provide effective navigation information for the near-field operation of the non-cooperative target in the next stage.

The invention adopts the following technical scheme: a method for estimating relative pose and inertia of a space complete non-cooperative target comprises the following steps:

step one, two industrial cameras which are arranged on a tracking spacecraft at intervals at left and right and have the same parameters are used for acquiring image information of a non-cooperative target in real time, image positions and image speeds of a plurality of feature points on the non-cooperative target on the left camera and the right camera are obtained through calculation, and then the 3D positions and speeds of the feature points under a camera coordinate system are calculated;

estimating the relative angular velocity of the non-cooperative target according to the rigid motion model and the 3D positions and the velocities of the at least three characteristic points in the step one; estimating the relative posture of the non-cooperative target at any time by combining the 3D positions of the characteristic points at the front and rear times;

thirdly, estimating the centroid position, the centroid speed and the relative position of the characteristic point of the non-cooperative target by using the 3D position and the speed in the first step and the relative posture and the relative angular speed of the rigid body in the second step and combining a centroid relative motion constraint equation of the non-cooperative target;

and step four, estimating the rotational inertia parameters of the non-cooperative target.

Further, the method is applicable to the following conditions: the distance between the non-cooperative target and the tracking spacecraft is less than 100 meters, and the motion track of the tracking spacecraft is a circular or near-circular orbit.

Further, in step one, the characteristic point P can be obtained from the pinhole modeliThe image positions on the left and right cameras are:

Figure BDA0002239476420000031

wherein:

ηi=[uiRviRuiLviL]Tis the image coordinates of the ith feature point on the left and right cameras,

ρi=[ρixρiyρiz]Tthe coordinates of the ith feature point in the camera coordinate system are 1,2, …, N;

f is the focal length of the camera;

b is the baseline width between the two cameras;

when considering the image noise in the actual measurement, the image position measurement model is:

Figure BDA0002239476420000041

wherein:

Figure BDA0002239476420000042

measuring image coordinates of an ith characteristic point containing noise on left and right cameras;

εiis modeled as a zero mean with a covariance of

Figure BDA0002239476420000043

White Gaussian noise of (1)4Representing a 4 x 4 identity matrix.

And (3) obtaining the 3D position estimation of the feature point in the camera coordinate system according to the formula (1) and the noise model (A):

wherein:

Figure BDA0002239476420000045

the image velocity is derived from equation (1):

Figure BDA0002239476420000046

and (3) according to the formula (4), and considering the image noise, obtaining a speed estimation value of the ith characteristic point in the camera coordinate system as follows:

Figure BDA0002239476420000047

wherein:

Figure BDA0002239476420000051

Figure BDA0002239476420000052

representing an estimate of the corresponding quantity.

Further, the specific process of the second step is as follows:

at any time t, any characteristic point P on the non-cooperative targetiSatisfies the following geometrical position relationship:

Figure BDA00022394764200000511

The speed satisfies the following relationship:

Figure BDA0002239476420000053

wherein:

ρ0(t) is the position of the non-cooperative target centroid relative to the camera coordinate system at time t;

Figure BDA0002239476420000054

is the speed of the non-cooperative target centroid relative to the camera coordinate system at time t;

Figure BDA0002239476420000055

is a posture rotation matrix from a non-cooperative target system to a camera coordinate system at the time t;

the angular velocity of the non-cooperative target relative to the tracked spacecraft at the time t is coordinated under a camera coordinate system;

ricoordinates of the feature points on the non-cooperative target relative to the position of the center of mass under the non-cooperative target body;

taking any characteristic point on the non-cooperative target as a reference point, wherein the characteristic point is PNDefinition of

δρi(t)=ρi(t)-ρN(t),δri=ri-rNCombining formula (6) and formula (7) gives:

Figure BDA0002239476420000057

Figure BDA0002239476420000058

by elimination of formulae (8) and (9)

Figure BDA0002239476420000059

The following can be obtained:

Figure BDA00022394764200000510

wherein: [ δ ρ [ ]i(t)×]Representing the vector δ ρi(t) the corresponding cross-multiplication matrix;

in combination of formulas (2) and (5), the following can be obtained:

Figure BDA0002239476420000061

the estimate of the relative angular velocity of the non-cooperative target is given by equation (12) as follows:

Figure BDA0002239476420000062

wherein:

Figure BDA0002239476420000063

the minimum value of N is 3;

setting an initial time t0And an arbitrary time tkWherein, tk=t0+ k Δ t, k being a positive integer, Δ t being the time interval between two shots of the non-cooperative target image, according to equation (8):

Figure BDA0002239476420000064

defining pose variance

Figure BDA0002239476420000065

And eliminate r in formula (13)iObtaining:

Figure BDA0002239476420000066

Figure BDA0002239476420000067

calculating the relative attitude estimation value of the non-cooperative target at any moment by the formula (15)

Figure BDA0002239476420000068

Further, the specific process of the third step is as follows: the relative position of the non-cooperative target is described by a centroid relative motion constraint equation, wherein the centroid relative motion constraint equation is a CW equation, and then:

Figure BDA0002239476420000069

wherein the content of the first and second substances,

Figure BDA00022394764200000611

acceleration noise generated for including spatial disturbance forces; n is the average orbital angular velocity of the tracked spacecraft;

performing a second order taylor discretization on equation (19) and ignoring the higher order terms and the noise terms, we can obtain:

xp(tk)=F1xp(tk-Δt) (20);

wherein:

Δ t is the time interval between two times of shooting of non-cooperative target images;

xpa vector containing the position and the speed of the mass center of the non-cooperative target is obtained;

F1=I6+Δt·F+1/2Δt2·F2

is provided with

Figure BDA0002239476420000071

Including the relative position, and mass, of the feature point with respect to the centroid of the non-cooperative objectA vector including the position and velocity of the heart relative to the camera coordinate system; from equation (20) we can derive:

X1(tk)=G·X1(tk-Δt) (21);

wherein:

Figure BDA0002239476420000072

I3is a 3 × 3 identity matrix;

according to the formula (21), j is a positive integer for the interval j.DELTA.t, and satisfies the condition that j.DELTA.t is a positive integer for the specified time interval c.DELTA.t

X1(tk-j·Δt)=G-jX1(tk),k-c≤j<k (22);

Wherein: c is a positive integer less than k;

from equations (6) and (7), it can be found that:

C(tk)X1(tk)=Y(tk) (23);

wherein:

Figure BDA0002239476420000073

Figure BDA0002239476420000081

estimated values calculated according to equations (2), (5), (12) and (15)

Figure BDA0002239476420000082

And

Figure BDA0002239476420000083

x can be obtained by combining the formulas (22) and (23)1(tk) The least squares estimation of (d) is:

Figure BDA0002239476420000084

wherein:

Figure BDA0002239476420000085

further, the process of the fourth step is as follows: angular momentum h of non-cooperative target under inertial coordinate systemIComprises the following steps:

Figure BDA0002239476420000086

wherein:

Figure BDA0002239476420000087

then:

Figure BDA0002239476420000089

to track the angular velocity of the spacecraft;

Figure BDA00022394764200000810

an angular velocity of a non-cooperative target;

Figure BDA00022394764200000811

is an attitude rotation matrix from a non-cooperative target system to an inertial system;

Figure BDA00022394764200000812

is a pose rotation matrix from a tracking spacecraft camera coordinate system to an inertial system;

definition of

Figure BDA00022394764200000813

Wherein I*=[ItxxItxyItxzItyyItyzItzz]TAre components of the moment of inertia of non-cooperative targets,

Figure BDA0002239476420000091

each component of the non-cooperative target angular momentum under an inertial system; from equation (25) we can obtain:

AxI=0 (26);

wherein:

wherein the content of the first and second substances,

Figure BDA0002239476420000093

is an estimate of the angular velocity of a non-cooperative target

Figure BDA0002239476420000094

Each component of (a);

solving equation (26) is equivalent to minimizing equation (27):

Figure BDA0002239476420000095

|| ||2a modulus representing a vector;

definition B ═ ATA; the condition for minimizing the convex quadratic function f (x) according to equation (27) is:

BxI=0 (28);

for homogeneous equation (28), given xIIs 1, i.e. the first component of

Figure BDA0002239476420000096

The block matrix of matrix B is then represented as follows:

Figure BDA0002239476420000097

wherein: b11Is a positive real number; then homogeneous equation (28) is written as:

Brxr=-b1(30);

the inertia tensor of the non-cooperative target satisfies its own physical constraints, namely:

Figure BDA0002239476420000101

substitution intoThen:

Figure BDA0002239476420000103

then equation (30) is a quadratic equation that satisfies constraint (32) by optimizing a convex quadratic function

Figure BDA0002239476420000104

Solving for xr

The invention has the beneficial effects that: 1. for a completely non-cooperative target, the 3D position and speed of the feature points in the camera coordinate system are calculated according to the image positions and image speeds of a plurality of feature points on the target acquired by the industrial camera.

2. Solving and converting the inertia parameters of the completely non-cooperative target into a quadratic optimization problem with constraint.

3. Least square, q-method and quadratic optimization are mainly adopted, the calculated amount is small, and online estimation can be realized.

4. When the inertia parameters are estimated, the constraint among components of the inertia tensor is considered, and the estimation result is more reliable.

[ description of the drawings ]

FIG. 1 is a geometric relationship diagram of feature points;

FIG. 2 is a diagram of the relative error of the centroid positions of non-cooperative targets;

FIG. 3 is a graph of the non-cooperative target centroid velocity relative error;

FIG. 4 is a graph of non-cooperative target relative attitude error;

FIG. 5 is a graph of the relative angular velocity error of the non-cooperative target;

FIG. 6 is a graph of non-cooperative target x-axis rotational inertia relative error;

FIG. 7 is a graph of non-cooperative target y-axis rotational inertia relative error;

FIG. 8 is a graph of non-cooperative target z-axis rotational inertia relative error;

fig. 9 shows the relative error of a feature point on a non-cooperative target with respect to the centroid position of the non-cooperative target.

[ detailed description ] embodiments

The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.

The invention discloses a method for estimating relative pose and inertia of a space complete non-cooperative target, which comprises the following steps:

step one, two industrial cameras which are arranged on a tracking spacecraft at intervals at left and right and have the same parameters are used for acquiring image information of a non-cooperative target in real time, image positions and image speeds of a plurality of feature points on the non-cooperative target on the left camera and the right camera are obtained through calculation, and then the 3D positions and speeds of the feature points under a camera coordinate system are calculated;

estimating the relative angular velocity of the non-cooperative target according to the rigid motion model and the 3D positions and the velocities of the at least three characteristic points in the step one; estimating the relative posture of the non-cooperative target at any time by combining the 3D positions of the characteristic points at the front and rear times; as shown in fig. 1.

Thirdly, estimating the centroid position, the centroid speed and the relative position of the characteristic point of the non-cooperative target by using the 3D position and the speed in the first step and the relative posture and the relative angular speed of the rigid body in the second step and combining a centroid relative motion constraint equation of the non-cooperative target;

and step four, estimating the rotational inertia parameters of the non-cooperative target.

The method is applicable to the following conditions: the distance between the non-cooperative target and the tracking spacecraft is less than 100 meters, and the motion track of the tracking spacecraft is a circular or near-circular orbit.

In the first step, the characteristic point P can be obtained from the pinhole modeliThe image positions on the left and right cameras are:

Figure BDA0002239476420000121

wherein:

ηi=[uiRviRuiLviL]Tis the image coordinates of the ith feature point on the left and right cameras,

ρi=[ρixρiyρiz]Tthe coordinates of the ith feature point in the camera coordinate system are 1,2, …, N; n is the number of the characteristic points;

f is the focal length of the camera;

b is the baseline width between the two cameras;

when considering the image noise in the actual measurement, the image position measurement model is:

Figure BDA0002239476420000122

wherein:

Figure BDA0002239476420000131

measuring image coordinates of an ith characteristic point containing noise on left and right cameras;

εiis modeled as a zero mean with a covariance ofWhite Gaussian noise of (1)4Representing a 4 x 4 identity matrix.

And (3) obtaining the 3D position estimation of the feature point in the camera coordinate system according to the formula (1) and the noise model (A):

wherein:

Figure BDA0002239476420000134

the image velocity is derived from equation (1):

Figure BDA0002239476420000135

and (3) according to the formula (4), and considering the image noise, obtaining a speed estimation value of the ith characteristic point in the camera coordinate system as follows:

wherein:

representing an estimate of the corresponding quantity.

The specific process of the second step is as follows:

as shown in FIG. 1, at any time t, any feature point P on the non-cooperative targetiSatisfies the following geometrical positional relationship:

Figure BDA0002239476420000141

the speed satisfies the following relationship:

Figure BDA0002239476420000142

wherein:

ρ0(t) is the position of the non-cooperative target centroid relative to the camera coordinate system at time t;

Figure BDA0002239476420000143

is the speed of the non-cooperative target centroid relative to the camera coordinate system at time t;

Figure BDA0002239476420000144

is a posture rotation matrix from a non-cooperative target system to a camera coordinate system at the time t;

Figure BDA0002239476420000145

the angular velocity of the non-cooperative target relative to the tracked spacecraft at the time t is coordinated under a camera coordinate system;

ricoordinates of the feature points on the non-cooperative target relative to the position of the center of mass under the non-cooperative target body;

taking any characteristic point on the non-cooperative target as a reference point, wherein the characteristic point is PNDefinition of

δρi(t)=ρi(t)-ρN(t),δri=ri-rNCombining formula (6) and formula (7) gives:

Figure BDA0002239476420000146

Figure BDA0002239476420000147

by elimination of formulae (8) and (9)

Figure BDA0002239476420000148

The following can be obtained:

Figure BDA0002239476420000149

wherein: [ δ ρ [ ]i(t)×]Representing the vector δ ρi(t) the corresponding cross-multiplication matrix;

combining equations (2) and (5), only the estimated values of the feature point position and velocity can be obtained, and then the following equation (10) can be obtained:

Figure BDA00022394764200001410

the estimate of the relative angular velocity of the non-cooperative target is given by the following equation (12):

Figure BDA00022394764200001411

wherein:

Figure BDA0002239476420000151

due to the fact thatThe determinant is 0, namely the rank of the matrix is 2, and the minimum value of the number N of the characteristic points is required to be 3 in order to solve the three-dimensional relative angular velocity of the non-cooperative target.

Setting an initial time t0And an arbitrary time tkWherein, tk=t0+ k Δ t, k being a positive integer, Δ t being the time interval between two shots of the non-cooperative target image, according to equation (8):

Figure BDA0002239476420000153

defining pose variance

Figure BDA0002239476420000154

And eliminate r in formula (13)iObtaining:

Figure BDA0002239476420000155

since equation (2) can only obtain an estimate of the location of the feature points, it is obtained from equation (14):

Figure BDA0002239476420000156

calculating the relative attitude estimation value of the non-cooperative target at any moment by the formula (15)

Figure BDA0002239476420000157

The above formula (15) is transformed into the classical Wahba problem and solved using q-method. Selection weight { a }i1,2, N-1, and defines the following matrix:

Figure BDA0002239476420000158

Figure BDA0002239476420000159

wherein the content of the first and second substances,

Figure BDA00022394764200001510

the unit feature vector corresponding to the largest feature root of L (B) is the attitude variation

Figure BDA00022394764200001511

Corresponding quaternion

Figure BDA00022394764200001512

Here, the quaternion q ═ q1q2q3q4]TThe corresponding attitude matrix is:

Figure BDA0002239476420000161

given an initial relative pose of a non-cooperative target

Figure BDA0002239476420000162

Can be arbitrarily given, and then combines the relative attitude variation

Figure BDA0002239476420000163

Calculating the relative attitude of the non-cooperative target at any time from equation (14)

Figure BDA0002239476420000164

The concrete process of the third step is as follows: the relative position of the non-cooperative objects is described by a centroid relative motion constraint equation, the centroid relative motion constraint equation is a CW equation, CW is a reference of Clohessy-Wiltshire, then:

Figure BDA0002239476420000165

wherein:

Figure BDA00022394764200001611

for acceleration noise including spatial disturbance force

Figure BDA0002239476420000167

Then formula (19) is obtained:

wherein the content of the first and second substances,

Figure BDA0002239476420000169

Figure BDA00022394764200001610

acceleration noise generated for including spatial disturbance forces; n is the average orbital angular velocity of the tracked spacecraft;

performing a second order taylor discretization on equation (19) and ignoring the higher order terms and the noise terms, we can obtain:

xp(tk)=F1xp(tk-Δt) (20);

wherein:

Δ t is the time interval between two times of shooting of non-cooperative target images;

xpa vector containing the position and the speed of the mass center of the non-cooperative target is obtained;

F1=I6+Δt·F+1/2Δt2·F2

is provided with

Figure BDA0002239476420000171

A vector containing the relative position of the feature point with respect to the centroid of the non-cooperative target, and the position and velocity of the centroid with respect to the camera coordinate system; from equation (20) we can derive:

X1(tk)=G·X1(tk-Δt) (21);

wherein:

Figure BDA0002239476420000172

I3is a 3 × 3 identity matrix;

according to the formula (21), j is a positive integer for the interval j.DELTA.t, and satisfies the condition that j.DELTA.t is a positive integer for the specified time interval c.DELTA.t

X1(tk-j·Δt)=G-jX1(tk),k-c≤j<k (22);

Wherein: c is a positive integer less than k;

from equations (6) and (7), and taking into account the feature point positions and velocities, the relative angular velocities and attitudes of the non-cooperative targets are both estimated values, which can be derived:

C(tk)X1(tk)=Y(tk) (23);

wherein:

Figure BDA0002239476420000173

Figure BDA0002239476420000174

estimated values calculated according to equations (2), (5), (12) and (15)

Figure BDA0002239476420000175

And

Figure BDA0002239476420000176

x can be obtained by combining the formulas (22) and (23)1(tk) The least squares estimation of (d) is:

Figure BDA0002239476420000181

wherein:

Figure BDA0002239476420000182

the process of the fourth step is as follows: for totally non-cooperative targets such as fault satellites, failure spacecrafts, space debris and the like, the active moment effect is not generated in outer space, so that the angular momentum of the targets is conserved in an inertial system, and the angular momentum h of the non-cooperative targets is under the inertial coordinate systemIComprises the following steps:

Figure BDA0002239476420000183

wherein:

Figure BDA0002239476420000184

the angular velocity and attitude of the above-mentioned tracked spacecraft can be obtained by the measurement equipment of the spacecraft itself, in known quantities, i.e.

Figure BDA0002239476420000185

And

Figure BDA0002239476420000186

in the known manner, it is known that,

Figure BDA0002239476420000187

and

Figure BDA0002239476420000188

estimated from equations (12) and (15), then:

Figure BDA0002239476420000189

Figure BDA00022394764200001810

to track the angular velocity of the spacecraft;

Figure BDA00022394764200001811

an angular velocity of a non-cooperative target;

Figure BDA00022394764200001812

is an attitude rotation matrix from a non-cooperative target system to an inertial system;

is a pose rotation matrix from a tracking spacecraft camera coordinate system to an inertial system;

(12) and (15) estimating the result of the estimation,

definition of

Figure BDA00022394764200001814

Wherein I*=[ItxxItxyItxzItyyItyzItzz]TAre components of the moment of inertia of non-cooperative targets,

Figure BDA00022394764200001815

each component of the non-cooperative target angular momentum under an inertial system; from equation (25) we can obtain:

AxI=0 (26);

wherein:

Figure BDA0002239476420000191

Figure BDA0002239476420000192

is an estimate of the angular velocity of a non-cooperative target

Figure BDA0002239476420000193

Each component of (a); solving equation (26) is equivalent to minimizing equation (27):

Figure BDA0002239476420000194

|| ||2a modulus representing a vector;

definition B ═ ATA; according to equation (27), the condition for obtaining the minimum of the quadratic function f (x) is:

BxI=0 (28);

for homogeneous equation (28), given xIIs 1, i.e. the first component of

Figure BDA0002239476420000195

The block matrix of matrix B is then represented as follows:

Figure BDA0002239476420000196

wherein: b11Is a positive real number; then homogeneous equation (28) is written as:

Brxr=-b1(30);

according to B being a positive definite matrix, an

Figure BDA0002239476420000197

In the knowledge that BrIs also positive; the inertia tensor of the non-cooperative target satisfies its own physical constraints, namely:

Figure BDA0002239476420000201

substitution into

Figure BDA0002239476420000202

Then:

Figure BDA0002239476420000203

then equation (30) is a quadratic equation that satisfies constraint (32) by optimizing a convex quadratic function

Figure BDA0002239476420000204

Solving for xr

And (3) experimental verification:

in order to verify the performance of the algorithm in the invention, a non-cooperative target with a certain size of 3m × 3m × 3m in space is selected as an experimental object. The simulation parameters in the experiment are designed as follows:

the number of the feature points: 4;

relative position of feature points with respect to the non-cooperative target centroid: taking a random number in the interval [ -1.51.5] m;

initial angular velocity of non-cooperative target:

Figure BDA0002239476420000211

initial value of non-cooperative target centroid position: ρ (t)0)=[10 25 30]Tm;

Initial value of non-cooperative target centroid velocity:

initial relative attitude of non-cooperative target: q. q.sct(t0)=[0 0 0 1]T

Non-cooperative target acceleration noise of

Simulation duration: 2500 s;

time interval of two times of shooting non-cooperative target images: Δ t ═ 1 s;

the time interval c is 50.

In the simulation experiment, assuming that the image extraction and matching work is finished, the image position and speed with measurement noise can be directly obtained, wherein the noise modeling is that the mean value is 0, and the standard deviation amplitude is 2 multiplied by 10-5White gaussian noise of rad.

To measure the estimated performance of the designed method, the following relative estimation errors are now defined:

Figure BDA0002239476420000214

Figure BDA0002239476420000215

Figure BDA0002239476420000216

representing the corresponding quantity estimator, | | | | luminance2Represents the norm of the vector, | | represents the absolute value, D represents the non-cooperative target size, here 3, since here the feature point is in the interval [ -1.51.5 []m is a random number, only the error of one characteristic point is taken as a representative, and the inertia parameter error is only the main moment of inertia, namely Ixx,IyyAnd IzzThe relative error is taken as a representative.

The non-cooperative target relative attitude error is defined as:

eθ=2cos-1(qe4)

wherein q ise4Is a quaternion q of the attitude erroreScalar part of qeByAnd (4) obtaining.

From the above simulation results, as shown in fig. 2, 3, 4, 5, 6, 7, 8 and 9, in the range where the distance from the non-cooperative target to the tracked spacecraft is less than 100m, the centroid position estimation error of the completely non-cooperative target is less than 0.1%, the centroid speed estimation error is less than 2%, the attitude estimation error of the non-cooperative target is less than 0.035rad, i.e. 2 °, the relative angular speed estimation relative error is less than 3%, the main moment of inertia relative error of the non-cooperative target is less than 0.15%, and the position estimation relative error of the non-cooperative target feature point relative to the centroid thereof is less than 1.5%, which are within the allowable range. By verification, the method can effectively estimate the relative state of the non-cooperative target and provide required navigation information for the space near-field operation of the next stage.

27页详细技术资料下载
上一篇:一种医用注射器针头装配设备
下一篇:一种无人机相对导航信息融合方法

网友询问留言

已有0条留言

还没有人留言评论。精彩留言会获得点赞!

精彩留言,会给你点赞!