Binocular vision-based near space vehicle relative pose measurement method and device

Document No.: 1240931    Publication date: 2020-08-18

Note: this technology, "Binocular vision-based near space vehicle relative pose measurement method and device", was designed and created by Huang Lei on 2020-06-04. Its main content is as follows: the invention establishes an air bag coordinate system, a pod coordinate system and a measurement system coordinate system and the relations among them; images are acquired by a binocular camera and transmitted to an embedded processor for processing, the image center position of each cooperation marker is extracted, the coordinates of each cooperation marker in the binocular stereo vision measurement coordinate system are calculated and converted into the pod coordinate system, and the relative pose of the air bag coordinate system with respect to the pod coordinate system is obtained. Based on illuminated retro-reflective cooperation markers, a binocular camera and an airborne embedded processor, the method and device effectively realize relative pose measurement between the air bag and the pod of a near space vehicle during flight, meet light-weight and low-power requirements, and can extract the cooperation markers from the air bag surface under complex imaging conditions, which is of great significance for guaranteeing flight safety and controlling flight motion.

1. The binocular vision-based near space aircraft relative pose measuring method is characterized by comprising the following steps of:

a. before starting flight measurement, a plurality of cooperation markers are reasonably distributed and adhered to the bottom surface of the air bag, the binocular camera, the illumination light source and the embedded processor are fixed in the pod, and the binocular camera and the illumination light source are connected with the embedded processor;

b. establishing an air bag coordinate system and a pod coordinate system, establishing a binocular stereo vision measurement system coordinate system with the aid of a calibration target, unifying the coordinate systems, and importing the parameters obtained by calibration into the embedded processor for measurement;

c. after the measurement starts, the embedded processor controls the illumination light source to stay on, the binocular camera continuously acquires images and transmits them to the embedded processor, and the embedded processor processes the acquired images and extracts the image center position of each cooperation marker;

d. calculating the coordinates of each cooperative identifier under a coordinate system of a binocular stereoscopic vision measurement system, and converting the coordinates into a coordinate system of a pod to obtain the position posture of an air bag coordinate system relative to the coordinate system of the pod;

e. and the embedded processor transmits the obtained result to the back-end controller in real time.

2. The binocular vision-based near space vehicle relative pose measurement method according to claim 1, wherein in the step a, black light-absorbing cloth and bright-silver chemical-fiber reflective cloth are cut and assembled into X-shaped corner features serving as cooperation markers, and nine cooperation markers are selected as features and adhered to the bottom surface of the air bag along the air bag framework.

3. The binocular vision-based near space vehicle relative pose measurement method according to claim 1, wherein in the step a, three windows are designed on the upper surface of the pod shell and covered with coated glass to isolate the inside of the pod from the outside; the binocular camera and the illumination light source are installed at the three corresponding windows inside the pod, facing upward, and their viewing angles are adjusted so that the camera and illumination fields of view cover the motion interval of the cooperation markers; the binocular camera is fixed in place so that its position does not change during flight motion.

4. The binocular vision-based near space vehicle relative pose measurement method according to claim 1, wherein in the step b, the steps of establishing and calibrating the air bag coordinate system, the pod coordinate system and the binocular stereo vision measurement system coordinate system are as follows:

(1) before installing the binocular cameras, the intrinsic parameters of the cameras need to be calibrated in a laboratory environment;

(2) using a total station, measuring the features that define the air bag coordinate system at air bag design time, and establishing the air bag coordinate system;

(3) setting the air bag coordinate system as the built-in measurement coordinate system of the total station, measuring the coordinates of the adhered cooperation markers to obtain the coordinates of all cooperation markers in the air bag coordinate system, and numbering the markers according to a rule;

(4) using the total station, measuring the features that define the pod coordinate system at pod design time, and establishing the pod coordinate system;

(5) calibrating the binocular structure parameters of the fixed binocular cameras by using a connecting-rod target: during calibration, the connecting-rod target continuously changes its pose while the two cameras respectively photograph the lithographic targets at the two ends of the connecting rod, so that the rotation and translation relation between the binocular cameras is calibrated;

(6) while calibrating the binocular structure parameters with the connecting-rod target, setting the pod coordinate system as the built-in measurement coordinate system of the total station and synchronously measuring the feature points on the connecting-rod target to obtain their coordinates in the pod coordinate system; since the relation between the lithographic targets and the feature points on the connecting-rod target is known, the rotation and translation relation between the pod coordinate system and the binocular stereo vision measurement system coordinate system can be calibrated;

(7) importing the calibration result into the embedded processor.

5. The binocular vision-based near space vehicle relative pose measurement method according to claim 1, wherein in the step c, the embedded processor performs image processing as follows:

(1) during measurement, the embedded processor directly controls the illumination light source to stay on, and simultaneously issues a synchronous trigger signal to the binocular camera at a frequency of 100 Hz; the binocular camera continuously acquires images and transmits them to the embedded processor over USB;

(2) under the irradiation of the illumination light source, the reflective material in the cooperation marker appears highly bright in the image while the surrounding light-absorbing material appears dark, so the position of the cooperation marker is determined;

(3) sub-pixel extraction of the X-shaped corner features is performed on the preliminarily screened image using the Hessian matrix; since the air bag surface and the sky background lack X-shaped corner features other than the cooperation markers, the sub-pixel corner centers of all required cooperation markers in the image can be obtained robustly.

6. The binocular vision-based near space vehicle relative pose measurement method according to claim 5, wherein performing sub-pixel extraction of the X-shaped corner features on the preliminarily screened image using the Hessian matrix comprises the following steps:

The Hessian matrix of the corner pixel $(x_0, y_0)$ is written as:

$$H(x_0, y_0) = \begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}$$

The second-order gradient value in the normal direction at this point, and the normal direction $(n_x, n_y)$ itself, are given by the eigenvalue of the Hessian matrix with the largest absolute value and its corresponding eigenvector;

let the sub-pixel corner coordinate be $(x_0+s, y_0+t)$ with $(s,t) \in [-0.5, 0.5] \times [-0.5, 0.5]$, i.e. the first-order zero crossing of the edge lies within the current pixel; expanding the gray value at the sub-pixel corner about the corner pixel $(x_0, y_0)$ with a second-order Taylor formula gives:

$$r(x_0+s,\,y_0+t) = r(x_0,y_0) + s\,r_x + t\,r_y + \tfrac{1}{2}s^2 r_{xx} + s\,t\,r_{xy} + \tfrac{1}{2}t^2 r_{yy}$$

from which the calculation yields:

$$\begin{bmatrix} s \\ t \end{bmatrix} = -\begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}^{-1} \begin{bmatrix} r_x \\ r_y \end{bmatrix}$$

where $r_{xx}$, $r_{xy}$, $r_{yy}$ denote the second-order gradient of the image at $(x_0, y_0)$ in the x direction, the first-order gradient in the y direction of the first-order gradient in the x direction, and the second-order gradient in the y direction, respectively; $r_x$, $r_y$ denote the first-order gradients of the image at $(x_0, y_0)$ in the x and y directions.

7. The binocular vision-based near space vehicle relative pose measurement method according to claim 1, wherein the step d specifically comprises the following steps:

(1) matching the cooperation markers extracted from the images synchronously acquired by the binocular camera through epipolar constraint and image sequence consistency constraint to obtain the number of each cooperation marker;

(2) performing binocular three-dimensional reconstruction on each matched feature point to obtain the spatial coordinates of all cooperation markers in the binocular stereo vision measurement system coordinate system;

(3) transforming the cooperation marker coordinates from the binocular stereo vision measurement system coordinate system to the pod coordinate system;

(4) with the cooperation marker coordinates known in both the pod coordinate system and the air bag coordinate system, obtaining the pose transformation of the air bag coordinate system relative to the pod coordinate system by common point transfer calculation.
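The binocular three-dimensional reconstruction in step (2) could, for example, be a linear (DLT) triangulation. This is a minimal sketch under assumed projection matrices; the patent does not state which triangulation algorithm is used:

```python
import numpy as np

def triangulate_linear(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one matched cooperation-marker
    center from a calibrated binocular pair.

    P_left, P_right : 3x4 projection matrices (intrinsics @ extrinsics)
    uv_left, uv_right : (u, v) sub-pixel image coordinates of the same
    marker in the synchronously captured image pair.
    Returns the 3-D point in the measurement (left-camera) frame.
    """
    def rows(P, uv):
        u, v = uv
        # each view contributes two linear constraints on X
        return np.array([u * P[2] - P[0], v * P[2] - P[1]])

    A = np.vstack([rows(P_left, uv_left), rows(P_right, uv_right)])
    # homogeneous least-squares solution: last right singular vector
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With noise-free projections the reconstruction is exact; with real sub-pixel detections the SVD gives the algebraic least-squares point.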

8. The binocular vision-based near space vehicle relative pose measurement device comprises an air bag and a pod, and is characterized in that a plurality of cooperation markers are arranged on the bottom surface of the air bag, a binocular camera, an illumination light source and an embedded processor are fixed in the pod, the binocular camera and the illumination light source are connected with the embedded processor, and the embedded processor is connected with a back-end controller for data transmission.

Technical Field

The invention belongs to the technical field of near space vehicles, and particularly relates to a binocular vision-based method and device for measuring the relative pose of a near space vehicle.

Background

Near space is the airspace at altitudes of 20–100 km. Compared with traditional aviation airspace it can provide more information; as a new domain from which the space above and the sea, land and air below can be controlled, it has become a hotspot of future research and application. A near space vehicle mainly comprises an air bag filled with nitrogen and a pod carrying equipment, connected by cables, and is used for space and ground observation. Real-time measurement of the relative pose between the air bag and the pod in flight is therefore of great significance for guaranteeing flight safety and controlling flight motion.

At present, no measurement means for the relative pose of the air bag and pod of a near space vehicle has been reported. Existing pose measurement methods can only measure the position and attitude of a single carrier (the air bag or the pod) independently, mainly by inertial navigation and the Global Positioning System (GPS), and therefore cannot obtain the relative pose between the air bag and the pod. Vision measurement, with its large measurement range and non-contact measurement process, holds an irreplaceable position in the measurement field; vision-based pose measurement is therefore a key technology for solving this problem.

The measurement of the relative pose of a near space vehicle aims at high-precision real-time measurement of the position and attitude of the air bag relative to the pod during flight. The near space vehicle is large, so the camera measurement field of view is large and faces upward, and calibrating the relations among the measurement systems is an important problem. Moreover, the measurement system faces complex imaging conditions: because it observes the sky, it is easily disturbed by the sun; imaging conditions change drastically from the ground to the maximum altitude, with ground-reflected light and air refraction near the ground, uniform sky brightness, and thinning air with altitude, so front-lit and back-lit imaging differ enormously; and the measured air bag surface lacks obvious features, which are difficult to extract for calculation under such complex imaging conditions. In addition, the measurement system must be installed in the pod as airborne equipment, which places high demands on its light weight and low power consumption.

Disclosure of Invention

In order to solve the technical problems in the prior art, the invention aims to provide a method and a device for measuring the relative pose of a near space aircraft based on binocular vision.

In order to achieve the above purpose and technical effect, the invention adopts the following technical scheme:

the binocular vision-based method for measuring the relative pose of the adjacent space aircraft comprises the following steps:

a. before starting flight measurement, a plurality of cooperation markers are reasonably distributed and adhered to the bottom surface of the air bag, the binocular camera, the illumination light source and the embedded processor are fixed in the pod, and the binocular camera and the illumination light source are connected with the embedded processor;

b. establishing an air bag coordinate system and a pod coordinate system, establishing a binocular stereo vision measurement system coordinate system with the aid of a calibration target, unifying the coordinate systems, and importing the parameters obtained by calibration into the embedded processor for measurement;

c. after the measurement starts, the embedded processor controls the illumination light source to stay on, the binocular camera continuously acquires images and transmits them to the embedded processor, and the embedded processor processes the acquired images and extracts the image center position of each cooperation marker;

d. calculating the coordinates of each cooperative identifier under a coordinate system of a binocular stereoscopic vision measurement system, and converting the coordinates into a coordinate system of a pod to obtain the position posture of an air bag coordinate system relative to the coordinate system of the pod;

e. and the embedded processor transmits the obtained result to the back-end controller in real time.

Further, in the step a, black light-absorbing cloth and bright-silver chemical-fiber reflective cloth are cut and assembled into X-shaped corner features serving as cooperation markers, and nine cooperation markers are selected as features and adhered to the bottom surface of the air bag along the air bag framework.

Further, in the step a, three windows are designed on the upper surface of the pod shell and covered with coated glass to isolate the inside of the pod from the outside; the binocular camera and the illumination light source are installed at the three corresponding windows inside the pod, facing upward, and their viewing angles are adjusted so that the camera and illumination fields of view cover the motion interval of the features or cooperation markers; the binocular camera is fixed in place so that its position does not change during flight motion.

Further, in the step b, the steps of establishing and calibrating the air bag coordinate system, the pod coordinate system and the binocular stereo vision measurement system coordinate system are as follows:

(1) before installing the binocular cameras, the intrinsic parameters of the cameras need to be calibrated in a laboratory environment;

(2) using a total station, measuring the features that define the air bag coordinate system at air bag design time, and establishing the air bag coordinate system;

(3) setting the air bag coordinate system as the built-in measurement coordinate system of the total station, measuring the coordinates of the adhered cooperation markers to obtain the coordinates of all cooperation markers in the air bag coordinate system, and numbering the markers according to a rule;

(4) using the total station, measuring the features that define the pod coordinate system at pod design time, and establishing the pod coordinate system;

(5) calibrating the binocular structure parameters of the fixed binocular cameras by using a connecting-rod target: during calibration, the connecting-rod target continuously changes its pose while the two cameras respectively photograph the lithographic targets at the two ends of the connecting rod, so that the rotation and translation relation between the two cameras is calibrated;

(6) while calibrating the binocular structure parameters with the connecting-rod target, setting the pod coordinate system as the built-in measurement coordinate system of the total station and synchronously measuring the feature points on the connecting-rod target to obtain their coordinates in the pod coordinate system; since the relation between the lithographic targets and the feature points on the connecting-rod target is known, the rotation and translation relation between the pod coordinate system and the binocular stereo vision measurement system coordinate system can be calibrated;

(7) importing the calibration result into the embedded processor.

Further, in step c, the embedded processor performs image processing as follows:

(1) during measurement, the embedded processor directly controls the illumination light source to stay on, and simultaneously issues a synchronous trigger signal to the binocular camera at a frequency of 100 Hz; the binocular camera continuously acquires images and transmits them to the embedded processor over USB;

(2) under the irradiation of the illumination light source, the reflective material in the cooperation marker appears highly bright in the image while the surrounding light-absorbing material appears dark, so the position of the cooperation marker is determined;

(3) sub-pixel extraction of the X-shaped corner features is performed on the preliminarily screened image using the Hessian matrix; since the air bag surface and the sky background lack X-shaped corner features other than the cooperation markers, the sub-pixel corner centers of all cooperation markers in the image can be obtained robustly;

The Hessian matrix of the corner pixel $(x_0, y_0)$ is written as:

$$H(x_0, y_0) = \begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}$$

The second-order gradient value in the normal direction at this point, and the normal direction $(n_x, n_y)$ itself, are given by the eigenvalue of the Hessian matrix with the largest absolute value and its corresponding eigenvector;

let the sub-pixel corner coordinate be $(x_0+s, y_0+t)$ with $(s,t) \in [-0.5, 0.5] \times [-0.5, 0.5]$, i.e. the first-order zero crossing of the edge lies within the current pixel; expanding the gray value at the sub-pixel corner about the corner pixel $(x_0, y_0)$ with a second-order Taylor formula gives:

$$r(x_0+s,\,y_0+t) = r(x_0,y_0) + s\,r_x + t\,r_y + \tfrac{1}{2}s^2 r_{xx} + s\,t\,r_{xy} + \tfrac{1}{2}t^2 r_{yy}$$

from which the calculation yields:

$$\begin{bmatrix} s \\ t \end{bmatrix} = -\begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}^{-1} \begin{bmatrix} r_x \\ r_y \end{bmatrix}$$

where $r_{xx}$, $r_{xy}$, $r_{yy}$ denote the second-order gradient of the image at $(x_0, y_0)$ in the x direction, the first-order gradient in the y direction of the first-order gradient in the x direction, and the second-order gradient in the y direction, respectively; $r_x$, $r_y$ denote the first-order gradients of the image at $(x_0, y_0)$ in the x and y directions.

Further, in step d, the specific steps are as follows:

(1) matching the cooperation markers extracted from the images synchronously acquired by the binocular camera through epipolar constraint and image sequence consistency constraint to obtain the number of each cooperation marker;

(2) performing binocular three-dimensional reconstruction on each matched feature point to obtain the spatial coordinates of all cooperation markers in the binocular stereo vision measurement system coordinate system;

(3) transforming the cooperation marker coordinates from the binocular stereo vision measurement system coordinate system to the pod coordinate system;

(4) with the cooperation marker coordinates known in both the pod coordinate system and the air bag coordinate system, obtaining the pose transformation of the air bag coordinate system relative to the pod coordinate system by common point transfer calculation.
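A common way to realize the common point transfer of step (4) — solving the rigid transform between two point sets matched by marker number — is a least-squares SVD (Kabsch) fit. The array names `P_bag`/`P_pod` are hypothetical and this is not claimed to be the patent's exact algorithm:

```python
import numpy as np

def rigid_transform_from_points(P_bag, P_pod):
    """Estimate the rotation R and translation T mapping cooperation-
    marker coordinates from the air bag frame to the pod frame,
    P_pod ≈ R @ P_bag + T (the 'common point transfer' step).

    P_bag, P_pod : (N, 3) arrays of the same markers, N >= 3,
    matched by marker number. SVD-based (Kabsch) least-squares fit.
    """
    cb = P_bag.mean(axis=0)
    cp = P_pod.mean(axis=0)
    H = (P_bag - cb).T @ (P_pod - cp)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = cp - R @ cb
    return R, T
```

With the patent's nine markers (N = 9) the fit is over-determined, which averages out detection noise in the individual marker centers.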

The binocular vision-based near space vehicle relative pose measurement device comprises an air bag and a pod; a plurality of cooperation markers are arranged on the bottom surface of the air bag, a binocular camera, an illumination light source and an embedded processor are fixed in the pod, the binocular camera and the illumination light source are connected with the embedded processor, and the embedded processor is connected with a back-end controller for data transmission.

Compared with the prior art, the invention has the beneficial effects that:

the invention discloses a binocular vision-based method and a device for measuring relative pose of a near space aircraft, which comprises the steps of firstly establishing an air bag coordinate system, a pod coordinate system and a binocular stereoscopic vision measurement system coordinate system, establishing a relation among the coordinate systems, continuously acquiring images through a binocular camera and transmitting the images to an embedded processor, processing the acquired images by the embedded processor, extracting the image center position of a cooperative identifier, calculating the coordinate of each cooperative identifier under the binocular stereoscopic vision measurement system coordinate system, converting the coordinate into the pod coordinate system, and finally obtaining the position pose of the air bag coordinate system relative to the pod coordinate system through common point transfer; and the result obtained by the calculation of the embedded processor is transmitted to the rear-end controller in real time to be used as motion feedback. The binocular vision-based near space aircraft relative pose measurement method and device provided by the invention are based on the illumination and reflection characteristic cooperation mark, the binocular camera and the airborne embedded processor, so that the relative pose measurement of the air bag and the nacelle of the near space aircraft in the flight process is effectively realized, the requirements of light weight and low power consumption are met, the cooperation mark can be extracted from the surface of the air bag under the complex imaging condition for calculation, and the method and device have important significance for ensuring the flight safety and controlling the flight motion of the near space aircraft.

Drawings

FIG. 1 is a block diagram of the steps of the present invention;

FIG. 2 is a layout view of the cooperation markers of the present invention on the air bag;

FIG. 3 is a block diagram of the circuit connections of the present invention;

FIG. 4 is a measurement schematic of the present invention;

fig. 5 is a position relationship diagram of the binocular camera and the view plane according to the present invention.

Detailed Description

The embodiments of the present invention will be described in detail with reference to the accompanying drawings, so that the advantages and features of the invention can be more easily understood by those skilled in the art and the scope of the invention is clearly defined.

As shown in FIGS. 1-5, the binocular vision-based near space vehicle relative pose measurement device comprises an air bag 2 and a pod; a plurality of cooperation markers 1 are arranged on the bottom surface of the air bag 2, a binocular camera, an illumination light source and an embedded processor are fixed in the pod, the binocular camera and the illumination light source are connected with the embedded processor, and the embedded processor is connected with a back-end controller for data transmission.

As shown in fig. 1-5, the binocular vision-based method for measuring the relative pose of the adjacent space vehicle comprises the following steps:

a. Before the flight measurement starts, the cooperation markers 1 are reasonably distributed and adhered to the bottom surface of the air bag 2; the binocular camera, the illumination light source and the embedded processing system are installed and fixed in the pod, with the binocular camera and the illumination light source mounted at the pod windows facing upward and their viewing angles reasonably adjusted. This specifically comprises the following steps:

(1) black light-absorbing cloth and bright-silver chemical-fiber reflective cloth are cut and assembled into X-shaped corner features (the light-absorbing and reflective cloth alternate in 90° sectors) to serve as cooperation markers; as shown in FIG. 2, nine cooperation markers 1 are selected as features and adhered to the lower surface of the air bag 2 along the framework of the air bag 2;

(2) three windows are designed on the upper surface of the pod shell and covered with coated glass to isolate the inside of the pod from the outside; the binocular camera (a left camera and a right camera) and the illumination light source are installed under the three corresponding windows in the pod, and their viewing angles are adjusted so that the camera and illumination fields of view cover the feature motion interval; the binocular camera is fixed in place so that its position does not change during flight motion;

(3) the embedded processor is fixed in the pod; as shown in FIG. 3, it is connected to the left and right cameras through USB and a synchronous trigger line, a control line runs between the embedded processor and the illumination light source, and result data is transmitted between the embedded processor and the back-end controller over an RS-485 bus.

b. After all modules of the measurement system are installed and fixed, the air bag coordinate system and the pod coordinate system are established with a total station, the binocular stereo vision measurement system coordinate system is established with the aid of a calibration target, all coordinate systems are unified, and the parameters obtained by calibration are imported into the embedded processor; measurement is then ready to start. The specific steps are as follows:

(1) As shown in FIG. 4, to realize measurement the whole system needs to establish the air bag coordinate system (O_B-X_B Y_B Z_B), the pod coordinate system (O_S-X_S Y_S Z_S) and the measurement camera coordinate system (O_C-X_C Y_C Z_C), where the measurement camera coordinate system is established on the left camera of the binocular measurement system. To realize measurement, calibration must provide the coordinates (P_t1, P_t2, …, P_t9) of all cooperation markers in the air bag coordinate system, the rotation-translation relation between the camera coordinate system and the pod coordinate system, i.e. the rotation matrix R_CS and the translation vector T_CS, and the structural parameters between the binocular cameras. Before the cameras are installed, the camera intrinsic parameters need to be calibrated in a laboratory environment;

(2) using the total station, measuring the features that define the air bag coordinate system at air bag design time, and establishing the air bag coordinate system;

(3) setting the air bag coordinate system as the built-in measurement coordinate system of the total station, measuring the coordinates of the adhered cooperation markers to obtain the coordinates of all cooperation markers in the air bag coordinate system, and numbering the markers according to a rule;

(4) using the total station, measuring the features that define the pod coordinate system at pod design time, and establishing the pod coordinate system;

(5) calibrating the binocular structure parameters of the fixed binocular cameras by using a connecting-rod target (comprising a connecting rod and lithographic targets fixed at its two ends): during calibration, the connecting-rod target continuously changes its pose while the two cameras respectively photograph the lithographic targets at the two ends, so that the rotation and translation relation between the two cameras is calibrated;

(6) while calibrating the binocular structure parameters with the connecting-rod target, setting the pod coordinate system as the built-in measurement coordinate system of the total station and synchronously measuring the feature points on the connecting-rod target to obtain their coordinates in the pod coordinate system; since the relation between the lithographic targets and the feature points on the connecting-rod target is known, the rotation matrix R_CS and the translation vector T_CS between the pod coordinate system and the binocular camera coordinate system, i.e. the binocular stereo vision measurement system coordinate system, can be calibrated.

(7) importing the calibration result into the embedded processor.
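Once R_CS and T_CS are imported, a point reconstructed in the camera frame can be mapped into the pod frame. The sketch below packs such relations into 4×4 homogeneous transforms for convenient chaining; the composition convention P_S = R_CS·P_C + T_CS is an assumption, as the text does not fix one:

```python
import numpy as np

def to_homogeneous(R, T):
    """Pack a rotation matrix and translation vector into a 4x4
    homogeneous transform, so frame changes can be chained by
    matrix multiplication."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M

def transform_point(M, P):
    """Apply a 4x4 homogeneous transform to a 3-D point."""
    Ph = np.append(np.asarray(P, dtype=float), 1.0)
    return (M @ Ph)[:3]
```

For example, with the assumed convention, `transform_point(to_homogeneous(R_CS, T_CS), P_C)` would give a marker's coordinates in the pod frame.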

c. After the measurement starts, the embedded processor keeps the illumination light source on, issues synchronous trigger signals at a 100 Hz frame rate through the trigger line, and controls the two cameras to continuously acquire images and transmit them to the embedded processor; the embedded processor processes the acquired images and extracts the image center position of each cooperation marker. The specific steps are as follows:

(1) the embedded processor is a control and computation unit built around an ARM + GPU core, which keeps the processor small and low-power; it is connected to the two cameras via two USB links, to the cameras' hard trigger inputs via two control signal lines, and to the control switch of the illumination source via one control signal line;

(2) during measurement, the embedded processor directly keeps the illumination source on while issuing synchronous trigger signals to the cameras at 100 Hz; the cameras then continuously acquire images and transmit them to the embedded processor over USB;

(3) under the illumination source, the light-reflecting material of cooperation mark 1 appears bright in the image while the surrounding light-absorbing material appears dark, so the approximate position of cooperation mark 1 can be determined by adaptive binarization and morphological operations;

(4) performing sub-pixel extraction of the X-type corner features on the preliminarily screened image using the Hessian matrix; because, apart from the cooperation marks, the air bag surface and the aerial background lack X-type corner features, the sub-pixel corner centers of all required cooperation marks in the image can be extracted robustly. Image processing is accelerated on the GPU so that images are processed quickly.
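Steps (3) and (4) above can be sketched as follows: each pixel is first thresholded against its local mean to locate the bright reflective regions, and an X-type corner is then refined to sub-pixel accuracy by modelling the intensity as a local quadratic and solving for its saddle point from the gradient and Hessian. This is a simplified NumPy illustration of the general technique (the function names are illustrative, and the morphological opening of step (3) is omitted), not the patent's actual code:

```python
import numpy as np

def adaptive_binarize(img, block=15, offset=10):
    """Mark pixels brighter than their local mean over a block x block
    neighbourhood; box sums are computed with an integral image."""
    img = img.astype(np.float64)
    pad = block // 2
    padded = np.pad(img, pad, mode="edge")
    ii = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))      # zero row/col for clean sums
    h, w = img.shape
    s = (ii[block:block + h, block:block + w]
         - ii[:h, block:block + w]
         - ii[block:block + h, :w]
         + ii[:h, :w])
    local_mean = s / (block * block)
    return (img > local_mean + offset).astype(np.uint8)

def subpixel_saddle(img, x, y):
    """Refine an integer X-corner estimate (x, y): central differences
    give the gradient g and Hessian H of the intensity surface, and the
    saddle of the local quadratic model lies at (x, y) - H^{-1} g."""
    i = img.astype(np.float64)
    gx = (i[y, x + 1] - i[y, x - 1]) / 2.0
    gy = (i[y + 1, x] - i[y - 1, x]) / 2.0
    gxx = i[y, x + 1] - 2 * i[y, x] + i[y, x - 1]
    gyy = i[y + 1, x] - 2 * i[y, x] + i[y - 1, x]
    gxy = (i[y + 1, x + 1] - i[y + 1, x - 1]
           - i[y - 1, x + 1] + i[y - 1, x - 1]) / 4.0
    H = np.array([[gxx, gxy], [gxy, gyy]])
    g = np.array([gx, gy])
    if np.linalg.det(H) >= 0:              # X-corners are saddles: det < 0
        return float(x), float(y)
    dx, dy = -np.linalg.solve(H, g)
    return x + dx, y + dy
```

The saddle test (det(H) < 0) is what makes the extraction robust: bright blobs and edges on the air bag surface do not produce saddle-shaped intensity profiles and are rejected.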

d. Based on the image center position of each cooperation mark 1 obtained by image processing, the coordinates of each cooperation mark in the coordinate system of the binocular stereo vision measuring system are computed by the binocular stereo vision principle and then transformed into the pod coordinate system; finally, the pose of the air bag coordinate system relative to the pod coordinate system is obtained by common point transfer. The specific steps are as follows:

(1) matching the cooperation marks 1 extracted from the images synchronously acquired by the two cameras, using the epipolar constraint and the image sequence consistency constraint, to obtain the serial number of each cooperation mark 1;

(2) performing binocular three-dimensional reconstruction on each matched feature point to obtain the spatial coordinates of all cooperation marks in the measuring camera coordinate system;

(3) transforming the cooperation mark coordinates from the measuring camera coordinate system to the pod coordinate system through a spatial rotation and translation;

(4) with the cooperation mark coordinates in the pod coordinate system computed as above and the cooperation mark coordinates in the air bag coordinate system obtained by calibration, the real-time rotation matrix R and translation vector T of the air bag coordinate system relative to the pod coordinate system can be computed by common point transfer; R and T are the target parameters of the measuring system;

(5) the results computed by the embedded processor are transmitted in real time to the back-end controller over a 485 bus and used as motion feedback.
