Target tracking system and method based on two-dimensional label

Document No.: 1954756    Publication date: 2021-12-10

Abstract: This invention, "Target tracking system and method based on two-dimensional label", was created by 姜晓栋, 张晋桥 and 赵新 on 2021-09-13. The invention provides a target tracking system and method based on a two-dimensional tag, used for tracking a moving target object provided with the two-dimensional tag. The system comprises: an aircraft fitted with a camera, used for capturing consecutive frames of target images of the moving target object in real time during flight; and a target tracking module, used for identifying each frame of target image in turn, processing the target images that contain the two-dimensional tag to obtain relative position information between the aircraft and the moving target object, and processing the relative position information into a tracking signal that controls the aircraft to follow the moving target object in flight so that the two-dimensional tag stays at the center of the captured target images. The beneficial effect is that the aircraft can recognize and autonomously track a specific moving target object, reducing the manpower required during tracking, improving the efficiency and stability of tracking, and lowering system risk.

1. A target tracking system based on a two-dimensional tag, characterized in that it is used for tracking a moving target object, wherein a two-dimensional tag is arranged on the moving target object;

the target tracking system includes:

an aircraft, wherein the aircraft is provided with a camera and is used for shooting consecutive frames of target images of the moving target object in real time during the flight of the aircraft and outputting the target images;

a target tracking module connected to the camera and to the aircraft respectively, the target tracking module comprising:

the image processing submodule is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain relative position information between the aircraft and the moving target object;

and a tracking control sub-module connected to the image processing sub-module and used for processing the relative position information to obtain a tracking signal, so as to control the aircraft to follow the moving target object in flight such that the two-dimensional tag is positioned at the center of each captured target image.

2. The object tracking system of claim 1, wherein the two-dimensional tag is an AprilTag tag.

3. The target tracking system of claim 2, wherein the image processing sub-module comprises:

the first processing unit is used for processing each frame of target image in sequence to obtain the gradient direction and amplitude of each pixel in the target image, and performing cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments contained in the target image;

the second processing unit is connected with the first processing unit and used for traversing each line segment to identify a quadrangle and outputting an identification result indicating that the target image of the current frame contains the two-dimensional label when the quadrangle is identified for the first time;

and the third processing unit is connected with the second processing unit and used for starting a tracking mode according to the identification result, then sequentially processing the target image of the current frame and each frame of target image after the current frame respectively and continuously outputting the relative position information between the aircraft and the moving target object obtained through processing.

4. The object tracking system of claim 3, wherein the second processing unit performs quadrilateral recognition by traversing each of the line segments using a recursive depth-first search algorithm with a depth of 4.

5. The target tracking system of claim 3, wherein the third processing unit comprises:

the first processing subunit is configured to obtain a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system according to a pre-acquired focal length of the camera, a size of the two-dimensional tag, and the target image, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;

and the second processing subunit is connected with the first processing subunit and is used for processing the internal reference matrix of the camera obtained by calibration in advance and the homography matrix to obtain the position information of the two-dimensional label in the image coordinate system as the relative position information between the aircraft and the moving target object.

6. The object tracking system of claim 5, wherein the first processing subunit processes the homography matrix using a direct linear transformation algorithm.

7. The object tracking system of claim 5, wherein the second processing subunit processes the position information using the following formula:

s·H = P · [R00 R01 Tx
           R10 R11 Ty
           R20 R21 Tz
           0   0   1 ]

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional tag in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system;

the position information includes Tx and Ty of the distance components, wherein Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction of the tag coordinate system, Ty represents a second relative distance between the aircraft and the moving target object in the y-axis direction of the tag coordinate system, and the relative position information includes the first relative distance and the second relative distance.

8. The object tracking system of claim 7, wherein the image processing sub-module further comprises a position correction unit coupled to the third processing unit, the position correction unit comprising:

the first correction subunit is used for acquiring an Euler angle and a flight altitude of the aircraft in real time, and respectively processing the Euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in a label coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the label coordinate system;

the second correcting subunit is connected with the first correcting subunit and is used for correcting the first relative distance and the second relative distance respectively according to the first position deviation and the second position deviation to obtain corrected relative position information;

and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to track the moving target object to fly.

9. The object tracking system of claim 8, wherein the first correction subunit processes the first and second position deviations using the following equations:

L=h*tanθ

wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; and when θ represents the yaw angle of the aircraft, L represents the second position deviation.

10. A target tracking method based on two-dimensional tags, which is applied to the target tracking system according to any one of claims 1 to 9, and comprises the following steps:

step S1, the target tracking system controls a camera arranged on an aircraft to shoot consecutive frames of target images of a moving target object provided with a two-dimensional tag in real time during the flight of the aircraft;

step S2, the target tracking system receives the target images, sequentially identifies each frame of target image, and processes the target images containing the two-dimensional tag to obtain the relative position information between the aircraft and the moving target object;

step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to follow the moving target object in flight, so that the two-dimensional tag is located at the center of each captured target image.

Technical Field

The invention relates to the technical field of image processing, in particular to a target tracking system and method based on a two-dimensional label.

Background

Technology for recognizing and tracking a specific target in motion is of great significance in fields such as industrial scene monitoring, intelligent traffic system management, power maintenance and even military applications. The rapid development of machine vision technology has further pushed forward automated tracking technology. However, existing systems of this kind suffer from technical drawbacks such as bulky platforms, high cost and heavy manpower requirements.

AprilTag is a visual positioning method developed in recent years that performs positioning based on two-dimensional code landmarks; it can compute the precise three-dimensional position, orientation and tag ID of a two-dimensional code tag relative to a camera. AprilTag already plays an important role in multi-agent cooperation and indoor positioning. How to combine unmanned aerial vehicles with machine vision technology so that a UAV can recognize and autonomously track a specific target has become a technical problem that urgently needs to be solved.

Disclosure of Invention

Aiming at the problems in the prior art, the invention provides a target tracking system based on a two-dimensional label, which is used for tracking a moving target object, wherein the moving target object is provided with the two-dimensional label;

the target tracking system includes:

an aircraft, wherein the aircraft is provided with a camera and is used for shooting consecutive frames of target images of the moving target object in real time during the flight of the aircraft and outputting the target images;

a target tracking module connected to the camera and to the aircraft respectively, the target tracking module comprising:

the image processing submodule is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain relative position information between the aircraft and the moving target object;

and a tracking control sub-module connected to the image processing sub-module and used for processing the relative position information to obtain a tracking signal, so as to control the aircraft to follow the moving target object in flight such that the two-dimensional tag is positioned at the center of each captured target image.

Preferably, the two-dimensional tag is an AprilTag tag.

Preferably, the image processing sub-module includes:

the first processing unit is used for processing each frame of target image in sequence to obtain the gradient direction and amplitude of each pixel in the target image, and performing cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments contained in the target image;

the second processing unit is connected with the first processing unit and used for traversing each line segment to identify a quadrangle and outputting an identification result indicating that the target image of the current frame contains the two-dimensional label when the quadrangle is identified for the first time;

and the third processing unit is connected with the second processing unit and used for starting a tracking mode according to the identification result, then sequentially processing the target image of the current frame and each frame of target image after the current frame respectively and continuously outputting the relative position information between the aircraft and the moving target object obtained through processing.

Preferably, the second processing unit performs quadrilateral recognition by traversing each line segment by using a recursive depth-first search algorithm with a depth of 4.

Preferably, the third processing unit includes:

the first processing subunit is configured to obtain a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system according to a pre-acquired focal length of the camera, a size of the two-dimensional tag, and the target image, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;

and the second processing subunit is connected with the first processing subunit and is used for processing the internal reference matrix of the camera obtained by calibration in advance and the homography matrix to obtain the position information of the two-dimensional label in the image coordinate system as the relative position information between the aircraft and the moving target object.

Preferably, the first processing subunit processes the homography matrix by using a direct linear transformation algorithm.

Preferably, the second processing subunit obtains the position information by processing according to the following formula:

s·H = P · [R00 R01 Tx
           R10 R11 Ty
           R20 R21 Tz
           0   0   1 ]

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional tag in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system;

the position information includes Tx and Ty of the distance components, where Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction of the tag coordinate system, Ty represents a second relative distance between the aircraft and the moving target object in the y-axis direction of the tag coordinate system, and the relative position information includes the first relative distance and the second relative distance.

Preferably, the image processing sub-module further comprises a position correction unit connected to the third processing unit, the position correction unit comprising:

the first correction subunit is used for acquiring an Euler angle and a flight altitude of the aircraft in real time, and respectively processing the Euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in a label coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the label coordinate system;

the second correcting subunit is connected with the first correcting subunit and is used for correcting the first relative distance and the second relative distance respectively according to the first position deviation and the second position deviation to obtain corrected relative position information;

and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to track the moving target object to fly.

Preferably, the first correcting subunit obtains the first position deviation and the second position deviation by processing according to the following formulas:

L=h*tanθ

wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; and when θ represents the yaw angle of the aircraft, L represents the second position deviation.

The invention also provides a target tracking method based on the two-dimensional label, which is applied to the target tracking system, and the target tracking method comprises the following steps:

step S1, the target tracking system controls a camera arranged on an aircraft to shoot consecutive frames of target images of a moving target object provided with a two-dimensional tag in real time during the flight of the aircraft;

step S2, the target tracking system receives the target images, sequentially identifies each frame of target image, and processes the target images containing the two-dimensional tag to obtain the relative position information between the aircraft and the moving target object;

step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to follow the moving target object in flight, so that the two-dimensional tag is located at the center of each captured target image.

The technical scheme has the following advantages or beneficial effects: the aircraft can recognize and autonomously track a specific moving target object, the manpower input in the tracking process is reduced, the tracking efficiency and stability are improved, and the system risk is reduced.

Drawings

Fig. 1 is a schematic structural diagram of a target tracking system based on two-dimensional code tags according to a preferred embodiment of the present invention;

FIG. 2 is a schematic diagram of a two-dimensional label in accordance with a preferred embodiment of the present invention;

FIG. 3 is a schematic diagram illustrating a position correction principle according to a preferred embodiment of the present invention;

fig. 4 is a flowchart illustrating a target tracking method based on two-dimensional tags according to a preferred embodiment of the present invention.

Detailed Description

The invention is described in detail below with reference to the figures and specific embodiments. The present invention is not limited to the embodiment, and other embodiments may be included in the scope of the present invention as long as the gist of the present invention is satisfied.

In a preferred embodiment of the present invention, based on the above problems in the prior art, a target tracking system based on two-dimensional tags is provided, which is used for tracking a moving target object 1, wherein a two-dimensional tag is arranged on the moving target object 1;

as shown in fig. 1, the target tracking system includes:

an aircraft 2, wherein the aircraft 2 is provided with a camera 21 and is used for shooting and outputting consecutive frames of target images of the moving target object 1 in real time during the flight of the aircraft 2;

a target tracking module 3 connected to the camera 21 and to the aircraft 2 respectively, the target tracking module 3 comprising:

the image processing submodule 31 is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain the relative position information between the aircraft and the moving target object;

and a tracking control sub-module 32 connected to the image processing sub-module 31 and used for processing the relative position information to obtain a tracking signal, so as to control the aircraft to follow the moving target object in flight such that the two-dimensional tag is located at the center of each captured target image.

Specifically, in the present embodiment, the two-dimensional tag is an AprilTag tag, including but not limited to TAG36H11-0, as shown in fig. 2. Before target tracking is performed, the generated AprilTag label may be printed and then attached to the surface of the moving target object 1, preferably to its upper surface, so that the camera 21 of the aircraft 2 flying above the moving target object 1 can accurately capture target images containing the two-dimensional tag during the target tracking process.

In the actual tracking process, because the moving target object 1 and the aircraft 2 both move rapidly, in order to keep the initial tracking state stable and to ensure the success rate of tracking, the tracking mode is started with the moving target object 1 kept as close as possible to the center of the field of view of the aircraft 2. During operation of the system, the attitude of the aircraft is adjusted so that the moving target object 1 is always kept at the center of the camera's field of view, which is how the tracking is realized. More preferably, the camera may be arranged pointing vertically downward, such that the positive Z-axis direction of the camera coordinate system is opposite to the positive Z-axis direction of the body coordinate system.

In the target tracking process, the aircraft 2 flies above the moving target object 1 and controls the camera 21 to capture target images in real time; the target images are then sent to the target tracking module 3 for processing. Preferably, the target tracking module 3 may be a local host computer, a remote server, or a control chip integrated in the aircraft 2. When the target tracking module 3 is integrated in the control chip of the aircraft 2, the processed position information is preferably sent to the controller of the aircraft 2 through a serial port; a 64-byte circular queue is preferably pre-configured in the controller, and the position information is moved into the circular queue after being received over the serial port. Meanwhile, to ensure the correctness of the position information, a check field is set for each frame of position information data; it is located at the end of each frame and equals the sum of all the data values. After a frame of position information data is received, its data are summed again, and the received position information is accepted as correct only after the check field is verified, thereby guaranteeing safety.
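A minimal sketch of this serial-link handshake is given below. The byte layout, helper names, and the int16 encoding of Tx/Ty are illustrative assumptions; the patent only specifies the 64-byte circular queue and a trailing check field equal to the sum of the frame's data (interpreted here as a one-byte modulo-256 sum).

```python
from collections import deque
import struct

QUEUE_SIZE = 64  # 64-byte circular queue, as described above

def build_frame(payload: bytes) -> bytes:
    """Append a 1-byte checksum equal to the sum of all data bytes (mod 256)."""
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])

def verify_frame(frame: bytes) -> bool:
    """Re-sum the data bytes and compare against the trailing check byte."""
    if len(frame) < 2:
        return False
    data, check = frame[:-1], frame[-1]
    return (sum(data) & 0xFF) == check

# A bounded deque stands in for the controller's circular queue.
rx_queue: deque = deque(maxlen=QUEUE_SIZE)

def on_serial_receive(frame: bytes) -> None:
    """Move verified position data into the circular queue; drop corrupted frames."""
    if verify_frame(frame):
        rx_queue.extend(frame[:-1])

# Example: a hypothetical frame carrying Tx, Ty as little-endian int16 values.
payload = struct.pack("<hh", 35, -12)   # Tx = 35, Ty = -12 (illustrative units)
on_serial_receive(build_frame(payload))
```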

After the target image is received, image recognition is first performed on it. When the two-dimensional tag is recognized, the corresponding moving target object is taken as the tracking target; the target image can then be processed to obtain the relative position information between the aircraft and the moving target object, and the aircraft is controlled according to the relative position information to follow the moving target object in flight, so that the two-dimensional tag is located at the center of the captured target images. In other words, the aircraft 2 flies directly above the moving target object, realizing autonomous tracking of the moving target object.

In a preferred embodiment of the present invention, the image processing sub-module 31 includes:

the first processing unit 311 is configured to sequentially obtain, for each frame of target image, a gradient direction and an amplitude of each pixel in the target image, and perform cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments included in the target image;

the second processing unit 312, connected to the first processing unit 311, is configured to traverse each line segment to perform quadrilateral recognition, and output a recognition result indicating that the current frame target image includes a two-dimensional tag when a quadrilateral is recognized for the first time;

the third processing unit 313 is connected to the second processing unit 312, and is configured to start the tracking mode according to the recognition result, sequentially process the current frame target image and each frame target image after the current frame target image, and continuously output the processed relative position information between the aircraft and the moving target object.

Specifically, in this embodiment, before the tracking mode is started the two-dimensional tag of the moving target object may not yet be within the field of view of the camera, so a search process needs to be performed: multiple frames of target images are continuously captured and identified during flight, and one or more of the initially captured target images may not contain the two-dimensional tag.

When image recognition is performed on a target image, the line segments in the target image are identified first, as follows: the gradient direction and magnitude of each pixel in the target image are obtained, and pixels with similar gradient direction and magnitude components are clustered into a line segment. The cluster analysis algorithm used by the first processing unit 311 is similar to the graph-based method of Felzenszwalb, specifically: each point in the target image captured by the camera 21 represents a pixel, and edges are added between adjacent pixels, with the weight of an edge equal to the difference in gradient direction between the adjacent pixels. The pixels are then sorted according to the edge weights, and it is judged whether they should be grouped into one class (line segment). More specifically, a pixel component is denoted by n, which is a vector value; the gradient direction of a pixel is represented by a function D(n), a scalar value giving the direction in which the pixel changes fastest; the magnitude of a pixel is represented by a function M(n), giving the difference between the maximum and minimum values of the pixel's change. Based on this, when two components n and m satisfy the following two conditions, they are connected together to form a line segment:

in the above formula: min (D (n), D (m)) and min (M (n), M (m)) respectively representing the gradient direction and the smaller pixel amplitude, KDAnd KMFor indicating regulating parameters, preferably KD=100、KM=1200。

When actually sorting the pixels, a linear-time counting sort is preferably used, and the upper and lower bounds of the gradient direction and magnitude are stored while sorting. This gradient-based clustering method is sensitive to noise in the image: even a small amount of noise can change local gradient directions. This problem is addressed by low-pass filtering the image. At the same time, the AprilTag label used here has edges that are large-scale features, so the useful information is not blurred away by low-pass filtering, unlike in other problem domains; in this design a filter with σ = 0.8 is selected. After the clustering operation is finished, the traditional least-squares method can be used to fit and connect the line segments, and the line segments are classified according to the brightness of the image on either side of each segment, which facilitates the quadrilateral extraction in the next processing stage. This is also the slowest stage of the detection scheme; in practical development the resolution of the target image is preferably reduced to half of the original, and experiments show that this improves the recognition speed by a factor of four.
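As a rough illustration of the preprocessing and merge test described above, the sketch below blurs the image with σ = 0.8, halves the resolution, computes per-pixel gradient direction and magnitude, and encodes the merge condition with KD = 100 and KM = 1200. The use of OpenCV/NumPy is an assumption, and D(·)/M(·) are interpreted as the per-component spread (max − min) implied by the upper/lower bounds stored during sorting; the union-find bookkeeping itself is omitted.

```python
import cv2
import numpy as np

K_D, K_M = 100.0, 1200.0  # tuning parameters from the description above

def preprocess_and_gradients(gray: np.ndarray):
    """Low-pass filter (sigma = 0.8), halve resolution, return gradient direction and magnitude."""
    blurred = cv2.GaussianBlur(gray, (0, 0), sigmaX=0.8)
    small = cv2.resize(blurred, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)
    gx = cv2.Sobel(small, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(small, cv2.CV_32F, 0, 1)
    direction = np.arctan2(gy, gx)   # D: direction of fastest change per pixel
    magnitude = np.hypot(gx, gy)     # M: gradient magnitude per pixel
    return direction, magnitude

def should_merge(d_range_n, d_range_m, m_range_n, m_range_m, size_union: int) -> bool:
    """Merge test for two components n and m, following the conditions above.

    Each *_range argument is a (min, max) pair of the component's gradient
    direction or magnitude values; the merged component's spread must stay
    within the smaller component's spread plus K / |n union m|.
    """
    def spread(r):
        return r[1] - r[0]

    d_union = (min(d_range_n[0], d_range_m[0]), max(d_range_n[1], d_range_m[1]))
    m_union = (min(m_range_n[0], m_range_m[0]), max(m_range_n[1], m_range_m[1]))
    ok_d = spread(d_union) <= min(spread(d_range_n), spread(d_range_m)) + K_D / size_union
    ok_m = spread(m_union) <= min(spread(m_range_n), spread(m_range_m)) + K_M / size_union
    return ok_d and ok_m
```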

After the line segments in the target image have been identified, the quadrangles in the target image are identified. The previous stage yields a set of directed line segments, which simplifies the task of finding sequences of line segments that form a quadrangle, i.e., a rectangle. The system uses a recursive depth-first search with a depth of 4 as the rectangle recognition scheme, each depth level contributing one edge of the quadrangle. All line segments are examined at the first depth level, and each can be the starting segment of a rectangle. Starting from a first-level segment, adjacent line segments are searched until a closed quadrangle is obtained, the whole search following a counterclockwise winding order. At the same time, a suitable threshold can be chosen to judge whether line segments belong to the same quadrangle, which increases recognition accuracy and the success rate under occlusion. Searching all line segments during detection is a heavy workload and consumes considerable MCU resources, so a two-dimensional lookup table is preferably used in this design to accelerate the query. With this optimization and the counterclockwise search order, the number of times each line is examined is limited, greatly reducing the running time of quadrilateral detection. A sketch of the search is given after this paragraph.
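The sketch below illustrates the depth-4 recursive search only. The segment representation, adjacency test, and threshold value are placeholders; the real detector additionally uses the two-dimensional lookup table mentioned above to prune the candidate segments examined at each level.

```python
from dataclasses import dataclass
from typing import List
import math

@dataclass
class Segment:
    # Directed line segment from (x0, y0) to (x1, y1), as produced by the clustering stage.
    x0: float
    y0: float
    x1: float
    y1: float

ADJACENCY_THRESHOLD = 8.0  # max gap (pixels) between one segment's end and the next one's start

def _close(ax, ay, bx, by, thresh=ADJACENCY_THRESHOLD):
    return math.hypot(ax - bx, ay - by) <= thresh

def find_quads(segments: List[Segment]) -> List[List[Segment]]:
    """Depth-4 recursive DFS: each depth level contributes one edge of a quadrilateral."""
    quads = []

    def search(path: List[Segment], depth: int):
        if depth == 4:
            # Closed loop: the last segment must end where the first one begins.
            first, last = path[0], path[-1]
            if _close(last.x1, last.y1, first.x0, first.y0):
                quads.append(list(path))
            return
        tail = path[-1]
        for seg in segments:          # a 2-D lookup table would prune this scan
            if seg in path:
                continue
            if _close(tail.x1, tail.y1, seg.x0, seg.y0):
                search(path + [seg], depth + 1)

    for seg in segments:              # every segment may start a quadrilateral
        search([seg], 1)
    return quads
```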

In a preferred embodiment of the present invention, the third processing unit 313 includes:

a first processing subunit 3131, configured to obtain, according to a focal length of the camera, a size of the two-dimensional tag, and a target image, a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;

and the second processing subunit 3132, connected to the first processing subunit 3131, is configured to process, according to the internal reference matrix and the homography matrix of the camera obtained through calibration in advance, to obtain position information of the two-dimensional tag in the image coordinate system as relative position information between the aircraft and the moving target object.

Specifically, in this embodiment, according to the imaging principle of the camera, once the focal length of the camera and the size of the two-dimensional tag are acquired, the three-dimensional coordinates of the two-dimensional tag in the camera coordinate system can be computed, and the homography matrix is then obtained by a direct linear transformation algorithm in combination with the target image.
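The direct linear transformation step can be sketched as follows. The four tag-corner coordinates (in the tag coordinate system, scaled by the known tag size) and their detected image positions are assumed inputs, and the numeric values are purely illustrative; the patent does not prescribe this exact formulation.

```python
import numpy as np

def homography_dlt(tag_corners: np.ndarray, image_corners: np.ndarray) -> np.ndarray:
    """Estimate the 3x3 homography H mapping tag-plane points to image points with
    the direct linear transformation (DLT): stack two equations per correspondence
    and take the null vector of the resulting 8x9 system."""
    A = []
    for (X, Y), (u, v) in zip(tag_corners, image_corners):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]                 # fix the arbitrary scale

# Illustrative only: a 0.16 m tag centred at the origin of the tag coordinate system.
tag_corners = 0.08 * np.array([[-1, -1], [1, -1], [1, 1], [-1, 1]], dtype=float)
image_corners = np.array([[310, 250], [410, 255], [405, 355], [305, 350]], dtype=float)
H = homography_dlt(tag_corners, image_corners)
```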

In a preferred embodiment of the present invention, the second processing subunit 3132 obtains the position information by using the following formula:

s·H = P · [R00 R01 Tx
           R10 R11 Ty
           R20 R21 Tz
           0   0   1 ]

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional tag in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system;

the position information includes Tx and Ty of the distance components, where Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction of the tag coordinate system, Ty represents a second relative distance between the aircraft and the moving target object in the y-axis direction of the tag coordinate system, and the relative position information includes the first relative distance and the second relative distance.

Specifically, in this embodiment, H is a 3 × 3 matrix and P is a 3 × 4 matrix, specifically represented as:

P = [fx 0  0 0
     0  fy 0 0
     0  0  1 0]

wherein fx and fy are the focal lengths of the camera. Substituting P into the formula above converts it into the following set of equivalent equations:

s·H00 = fx·R00,  s·H01 = fx·R01,  s·H02 = fx·Tx
s·H10 = fy·R10,  s·H11 = fy·R11,  s·H12 = fy·Ty
s·H20 = R20,     s·H21 = R21,     s·H22 = Tz

Solving this system of equations yields Tx and Ty of the distance components, i.e., the position information of the two-dimensional tag in the image coordinate system.

The position information is referenced to the image coordinate system: the center of the target image is chosen as the origin, with the positive X axis pointing right and the positive Y axis pointing up in the image plane, so the output position information is the deviation from the image center. With this design, the position information can be used directly as the relative position information between the aircraft and the moving target object; the aircraft can then follow the moving target object by adjusting its attitude according to the relative position information, so that the moving target object is always at the center of the target image and the tracking effect is realized.
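The patent leaves the exact control law of the tracking control sub-module open. As one hedged illustration, a simple proportional controller could convert the (Tx, Ty) offsets into velocity set-points that drive the tag back toward the image center; the gains, limits, and function name below are invented for the example.

```python
KP = 0.8      # proportional gain (assumed, 1/s)
V_MAX = 2.0   # velocity command limit (assumed, m/s)

def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))

def tracking_signal(tx: float, ty: float):
    """Map the relative offsets (deviation of the tag from the image center,
    expressed along the tag coordinate axes) to forward/lateral velocity commands."""
    vx = clamp(KP * tx, -V_MAX, V_MAX)
    vy = clamp(KP * ty, -V_MAX, V_MAX)
    return vx, vy   # zero offset -> hover directly above the tag
```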

Further, since the aircraft is in dynamic flight and the camera is fixed relative to the aircraft, the camera's viewing angle changes with the attitude of the body. During these changes the camera's line of sight cannot be kept perpendicular to the ground at every moment, and the images captured in flight are also distorted. Because the position information is computed entirely from image pixels, it must be corrected to remove this attitude-induced interference and ensure the correctness of the position data. Based on this, the image processing sub-module 31 further includes a position correction unit 314 connected to the third processing unit 313, and the position correction unit 314 includes:

the first correction subunit 3141 is configured to obtain an euler angle and a flight altitude of the aircraft in real time, and respectively process the euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in the tag coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the tag coordinate system;

a second correcting subunit 3142, connected to the first correcting subunit 3141, and configured to correct the first relative distance and the second relative distance according to the first position deviation and the second position deviation, respectively, to obtain corrected relative position information;

and the tracking control sub-module 32 processes the corrected relative position information to obtain a tracking signal so as to control the aircraft to track the moving target object to fly.

Specifically, in the present embodiment, the principle and method of position correction are described taking the roll angle Roll of the aircraft as an example. As shown in fig. 3, straight line L1 represents the ground plane on which the moving target object 1 is located; point O is the aircraft; straight line L2 is the plane of the aircraft body; straight line L3 is the line of sight of the camera carried by the aircraft; and straight line L4 is the line through point O parallel to the horizon. The dotted line passes through the moving target object 1 perpendicular to the horizon and meets straight line L4 perpendicularly at point N; the roll angle of the body is θ. At this moment the aircraft is not directly above the moving target object 1 because the body is inclined, yet when the moving target object 1 is at the center of the image captured by the camera, i.e., judged from the target position data alone, the tracking appears unbiased. The position data therefore needs to be corrected. In the triangle formed by the moving target object 1, the airframe and point N, the dotted side is the flight altitude of the aircraft acquired in advance, and from the trigonometric relation:

tan(π/2-θ)=h/L

the actual deviation is L ═ h × tan θ.

Wherein h represents the flight altitude; when θ represents the roll angle of the aircraft, L represents the first position deviation; and when θ represents the yaw angle of the aircraft, L represents the second position deviation. The calculation of the second position deviation follows by analogy and is not repeated here.
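The attitude correction can be expressed compactly as below. Following the text above, the roll angle corrects the x-axis distance and the yaw angle corrects the y-axis distance; the angles are assumed to be in radians, and the function names and sign convention are illustrative rather than taken from the patent.

```python
import math

def attitude_offset(altitude_m: float, angle_rad: float) -> float:
    """Position deviation caused by body tilt: L = h * tan(theta)."""
    return altitude_m * math.tan(angle_rad)

def correct_position(tx: float, ty: float, altitude_m: float,
                     roll_rad: float, yaw_rad: float):
    """Remove the tilt-induced deviations from the measured relative distances.
    The roll/yaw mapping follows the description above; the subtraction sign is
    an assumption and would be fixed against the real airframe conventions."""
    tx_corr = tx - attitude_offset(altitude_m, roll_rad)
    ty_corr = ty - attitude_offset(altitude_m, yaw_rad)
    return tx_corr, ty_corr
```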

The present invention further provides a target tracking method based on a two-dimensional tag, which is applied to the target tracking system described above, as shown in fig. 4, the target tracking method includes:

step S1, the target tracking system controls a camera arranged on an aircraft to shoot consecutive frames of target images of a moving target object provided with a two-dimensional tag in real time during the flight of the aircraft;

step S2, the target tracking system receives the target images, sequentially identifies each frame of target image, and processes the target images containing the two-dimensional tag to obtain the relative position information between the aircraft and the moving target object;

and step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to follow the moving target object in flight, so that the two-dimensional tag is located at the center of each captured target image.

While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.
