Target tracking method and device, storage medium and electronic device

Document No.: 1155269    Publication date: 2020-09-15

Reading note: this technology, "Target tracking method and device, storage medium and electronic device" (目标跟踪方法和装置、存储介质及电子装置), was designed and created by 王林源, 马子昂 and 卢维 on 2020-06-02. Its main content includes: The invention discloses a target tracking method and device, a storage medium and an electronic device. The method includes: acquiring a target height of a target object, wherein the target object is the tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object; determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing the current actual distance between the robot and the target object; determining a target tracking distance at which the robot tracks the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height; and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object. By adopting this technical solution, the problem in the related art that the following of an arbitrarily given target at an arbitrarily specified distance cannot be flexibly realized is solved.

1. A target tracking method is applied to a robot and comprises the following steps:

acquiring a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object;

determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing a current actual distance between the robot and the target object;

determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;

and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.

2. The method of claim 1, wherein obtaining the target height of the target object comprises:

under the condition that the target height of the target object is known, acquiring the target height stored in advance; or

acquiring a previous frame image and a current frame image containing the target object, and acquiring a first moving distance of the robot within a first time interval between generation of the previous frame image and the current frame image, under the condition that the target height of the target object is unknown;

and determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image and the focal length of the image acquisition equipment of the robot.

3. The method of claim 2, wherein determining the target height of the target object according to the first movement distance, a first pixel height of the target object in the previous frame image, a second pixel height of the target object in the current frame image, and a focal length of an image capturing device of the robot comprises:

determining a product of the first movement distance, the first pixel height and the second pixel height as a first value, and a product of the focal length and a difference between the second pixel height and the first pixel height as a second value, and determining a ratio between the first value and the second value as the target height.

4. The method of claim 2, wherein said determining a current distance between the robot and the target object as a function of the target height comprises:

determining a ratio between the product of the target height and the focal length and the second pixel height as the current distance.

5. The method of claim 2, further comprising:

determining a third value as a product of the first movement distance and the first pixel height, and determining a ratio of the third value to a difference between the second pixel height and the first pixel height as the current distance.

6. The method of claim 2, wherein said determining a target angle between the robot and the target object based on the target height comprises:

acquiring a third pixel height of the target object in a next frame image in the case that the distance between the robot and the target object changes and the target object moves laterally within a second time interval between the next frame image and the current frame image;

determining a ratio of a product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and determining a ratio of the product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, wherein the first distance is a distance between the robot and the target object in a first shooting direction for acquiring the next frame image, and the second distance is a distance between the robot and the target object in a second shooting direction for acquiring the current frame image;

and determining the target included angle according to the first distance, the second pixel height and a target proportion coefficient, wherein the target proportion coefficient is a proportion coefficient between a pixel coordinate of the target object in the image and a real distance of the target object.

7. The method of claim 1, wherein determining a target tracking distance at which the robot tracks the target object based on the current distance comprises:

when the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is smaller than the target tracking distance, reducing the moving speed of the robot so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value; or

under the condition that the difference value between the current distance and the target tracking distance is greater than the preset threshold value and the current distance is greater than the target tracking distance, increasing the moving speed of the robot so that the difference value between the current distance and the target tracking distance becomes smaller than or equal to the preset threshold value.

8. The method of claim 1, wherein tracking the target object according to the target tracking distance and the target angle comprises:

in the process that the robot tracks the target object, keeping the distance between the robot and the target object as the target tracking distance and keeping the target included angle smaller than or equal to a first preset angle.

9. The method of any one of claims 2 to 6, further comprising:

acquiring an i-1 frame image of the target object, the first pixel height and the first pixel width of a sampling frame for sampling the target object, and initial position coordinates of the target object in the i-1 frame image; carrying out image feature extraction on the i-1 frame image to obtain a first feature vector of the i-1 frame image; training classifier parameters according to the first feature vector, the first pixel height, the first pixel width and the initial position coordinate, wherein the initial position coordinate is a central pixel coordinate of the target object in the i-1 frame image, the previous frame image comprises the i-1 frame image, and i is a natural number;

sampling the target object at the initial position coordinate by using the sampling frame corresponding to the first pixel height and the first pixel width to obtain an ith frame image, performing a log-polar transformation on the i-1 th frame image and the ith frame image, and determining a scale difference and a rotation angle of the target object between the i-1 th frame image and the ith frame image, wherein the scale difference is used for representing the change of the size of the target object between the i-1 th frame image and the ith frame image, the rotation angle is used for representing the angle change of the target object between the i-1 th frame image and the ith frame image relative to the same reference direction, and the current frame image comprises the ith frame image;

acquiring the current position coordinate of the target object in the ith frame image; performing image feature extraction on the ith frame image to obtain a second feature vector of the ith frame image, and determining the second pixel height and the second pixel width of the target object in the ith frame image according to the first pixel height, the first pixel width, the scale difference and the rotation angle, wherein the current position coordinate is the central pixel coordinate of the target object in the ith frame image;

training and updating the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height and the second pixel width, so that the image acquisition equipment samples the (i +1) th frame of image according to the current position coordinate, the second pixel height and the second pixel width, wherein the (i +1) th frame of image comprises the next frame of image.

10. The method of claim 9, wherein after determining the scale difference and the rotation angle of the target object in the i-1 th frame image and the i-th frame image, the method further comprises:

and stopping the tracking of the robot on the target object when the rotation angle is greater than or equal to a second preset angle in the process that the robot tracks the target object.

11. An object tracking apparatus, characterized in that,

the apparatus comprises: a first acquiring unit, configured to acquire a target height of a target object, wherein the target object is a tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object;

a first determining unit, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used to represent a current actual distance between the robot and the target object;

a second determining unit, configured to determine a target tracking distance at which the robot tracks the target object according to the current distance, and determine a target included angle between the robot and the target object according to the target height;

and a tracking unit, configured to track the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.

12. A computer-readable storage medium comprising a stored program, wherein the program when executed performs the method of any of claims 1 to 10.

13. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to execute the method of any of claims 1 to 10 by means of the computer program.

Technical Field

The invention relates to the field of computers, in particular to a target tracking method and device, a storage medium and an electronic device.

Background

At present, as robots are increasingly widely applied in industries such as warehouse logistics and factory safety inspection, the demand for intelligent robot platforms grows increasingly significant. Business scenarios such as robot platforms traveling in formation and robots tracking moving targets require the robot to have the capability of following a target.

Because the followed target can be given arbitrarily, its movement is random, and platform computing power is limited, the following algorithm deployed on the robot platform must execute efficiently enough; in the related art, the following of an arbitrarily given target at an arbitrarily specified distance cannot be flexibly realized.

Therefore, no effective technical solution has yet been proposed for the problem in the related art that the following of an arbitrarily given target at an arbitrarily specified distance cannot be flexibly realized.

Disclosure of Invention

The embodiments of the invention provide a target tracking method and device, a storage medium and an electronic device, which at least solve the technical problem in the related art that the following of an arbitrarily given target at an arbitrarily specified distance cannot be flexibly realized.

According to an aspect of an embodiment of the present invention, there is provided a target tracking method, including: acquiring a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object; determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing a current actual distance between the robot and the target object; determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height; and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is the preset tracking distance between the robot and the target object.

According to another aspect of the embodiments of the present invention, there is also provided a target tracking apparatus, including: a first acquiring unit, configured to acquire a target height of a target object, where the target object is the tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object; a first determining unit, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used for representing the current actual distance between the robot and the target object; a second determining unit, configured to determine a target tracking distance at which the robot tracks the target object according to the current distance, and determine a target included angle between the robot and the target object according to the target height; and a tracking unit, configured to track the target object according to the target tracking distance and the target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object.

According to yet another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to execute the above-mentioned object tracking method when running.

According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the above-mentioned target tracking method through the computer program.

In the embodiment of the invention, a target object currently tracked by the robot is first obtained together with a target height representing the actual height of the target object; the current distance between the robot and the target object is determined according to the target height; then the target tracking distance at which the robot tracks the target object is determined according to the current distance, and a target included angle between the robot and the target object is determined according to the target height, so that the robot tracks the target object according to the target tracking distance and the target included angle. This achieves the technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle, and solves the technical problem in the related art that a robot cannot flexibly follow an arbitrarily given target at an arbitrarily specified distance.

Drawings

The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:

FIG. 1 is a schematic diagram of an application environment of a target tracking method according to an embodiment of the invention;

FIG. 2 is a schematic flow diagram of an alternative target tracking method according to an embodiment of the invention;

FIG. 3 is a schematic illustration of an alternative target height according to an embodiment of the present invention;

FIG. 4 is a schematic illustration of an alternative target angle according to an embodiment of the present invention;

FIG. 5 is a schematic flow diagram of an alternative target tracking method according to an embodiment of the invention;

FIG. 6 is a schematic flow diagram of an alternative target tracking algorithm according to an embodiment of the present invention;

FIG. 7 is a schematic diagram of an alternative target tracking device according to an embodiment of the present invention;

fig. 8 is a schematic structural diagram of an alternative electronic device according to an embodiment of the invention.

Detailed Description

In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.

According to an aspect of an embodiment of the present invention, there is provided a target tracking method. Optionally, the target tracking method may be applied, but is not limited, to the application environment shown in fig. 1. As shown in fig. 1, the terminal device 102 obtains a target height of a target object, where the target object is the tracking object currently tracked by the robot and the target height is used to represent the actual height of the target object, and sends the target height to the server 104 through a network. After receiving the target height, the server 104 determines a current distance between the robot and the target object according to the target height, where the current distance is used to represent the actual distance between the robot and the target object; it then determines a target tracking distance at which the robot tracks the target object according to the current distance, and determines a target included angle between the robot and the target object according to the target height. The server 104 sends the target tracking distance and the target included angle to the terminal device 102 through the network, and after receiving them, the terminal device 102 tracks the target object according to the target tracking distance and the target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object. The above is merely an example, and the embodiments of the present application are not limited herein.

Alternatively, the terminal device 102 may itself acquire the target height of the target object, where the target object is the tracking object currently tracked by the robot and the target height is used to represent the actual height of the target object; determine the current distance between the robot and the target object according to the target height, where the current distance is used to represent the actual distance between the robot and the target object; determine the target tracking distance at which the robot tracks the target object according to the current distance, and determine the target included angle between the robot and the target object according to the target height; and track the target object according to the target tracking distance and the target included angle, where the target tracking distance is the preset tracking distance between the robot and the target object. The above is merely an example, and the present embodiment is not limited thereto.

Optionally, in this embodiment, the terminal device may include, but is not limited to, at least one of the following: mobile phones (such as Android phones, iOS phones, etc.), notebook computers, tablet computers, palm computers, MID (Mobile internet devices), PAD, desktop computers, etc. Such networks may include, but are not limited to: a wired network, a wireless network, wherein the wired network comprises: a local area network, a metropolitan area network, and a wide area network, the wireless network comprising: bluetooth, WIFI, and other networks that enable wireless communication. The server may be a single server or a server cluster composed of a plurality of servers. The above is only an example, and the present embodiment is not limited to this.

Optionally, in this embodiment, the method may be executed by a server, by a terminal device, or jointly by the server and the terminal device; the description here takes execution by the terminal device (for example, the terminal device 102) as an example. As shown in fig. 2, the flow of the target tracking method may include the following steps:

step S202, acquiring a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object;

step S204, determining the current distance between the robot and the target object according to the target height, wherein the current distance is used for representing the actual distance between the robot and the target object;

step S206, determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;

and S208, tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is the preset tracking distance between the robot and the target object.

Optionally, the target tracking method may be applied, but is not limited, to robots in industrial scenarios such as warehouse logistics and factory safety inspection.

According to this embodiment, a target object currently tracked by the robot is first obtained together with a target height representing the actual height of the target object; the current distance between the robot and the target object is determined according to the target height; then the target tracking distance for tracking the target object is determined according to the current distance, and a target included angle between the robot and the target object is determined according to the target height, so that the robot tracks the target object according to the target tracking distance and the target included angle. This achieves the technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle, and solves the technical problem in the related art that a robot cannot flexibly follow an arbitrarily given target at an arbitrarily specified distance.

In an alternative embodiment, obtaining the target height of the target object includes: under the condition that the target height of the target object is known, acquiring a pre-stored target height; or under the condition that the target height of the target object is unknown, acquiring a previous frame image and a current frame image containing the target object, and acquiring a first moving distance of the robot in a first time interval for generating the previous frame image and the current frame image; and determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image and the focal length of the image acquisition equipment of the robot.

Optionally, if the target height of the target object is known and pre-stored in the robot system, the target height may be obtained directly; or

if the target height of the target object is unknown to the robot, a previous frame image and a current frame image containing the target object can be acquired, along with the first moving distance the robot moves within the first time interval between the moments at which the two frames are generated; the target height of the target object is then determined according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of an image acquisition device (such as a monocular camera) mounted on the robot.

In an alternative embodiment, determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of the image capturing device of the robot includes: determining a product of the first movement distance, the first pixel height and the second pixel height as a first value, and determining a product of the focal length and a difference between the second pixel height and the first pixel height as a second value, and determining a ratio between the first value and the second value as a target height.

Optionally, assume that the first moving distance of the robot between the two frame images is $l$, the first pixel height of the target object in the previous frame image is $v_0$, and the second pixel height of the target object in the current frame image is $v_t$. With the focal length $f$ of the camera mounted on the robot known, the starting distance $x_0$ between the target object and the robot platform camera, the distance $x_t$ between the target and the robot platform during following, and the followed target height $h$ are calculated as follows:

As shown in fig. 3, from the similar-triangle proportional relationship between the image and the real environment and the distance the robot moves between the two frames, the following formula can be obtained:

$$\frac{hf}{v_0} - \frac{hf}{v_t} = l$$

Further, the target height can be obtained as

$$h = \frac{l\,v_0 v_t}{(v_t - v_0)f}$$

where $l v_0 v_t$ is the above first value and $(v_t - v_0)f$ is the second value.

In an alternative embodiment, determining the current distance between the robot and the target object according to the target height comprises: and determining the ratio of the product of the target height and the focal length to the second pixel height as the current distance.

Optionally, in the case that the target height $h$ is known in advance, the current distance between the robot and the target may also be calculated as:

$$x_t = \frac{hf}{v_t}$$

in an optional embodiment, the method further comprises: and determining a third value by multiplying the first moving distance by the first pixel height, and determining the current distance by the ratio of the third value to the difference between the second pixel height and the first pixel height.

Optionally, the distance between the current-frame target object and the robot platform (i.e. the current distance) is $x_t = \frac{l\,v_0}{v_t - v_0}$, and the distance between the starting target position and the robot platform is $x_0 = \frac{l\,v_t}{v_t - v_0}$, where $l v_0$ is the above third value.
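For illustration, the three estimates above can be collected into a short Python sketch. The function name and the zero-division guard are illustrative additions; only the formulas themselves come from the description:

```python
def estimate_height_and_distances(l, v0, vt, f):
    """Monocular estimates from two frames via the similar-triangle relations.

    l  -- robot displacement between the two frames (metres)
    v0 -- pixel height of the target in the previous frame
    vt -- pixel height of the target in the current frame
    f  -- camera focal length (pixels)
    Returns (target height h, current distance x_t, starting distance x_0).
    """
    dv = vt - v0
    if dv == 0:  # no apparent size change: depth is unobservable from these frames
        raise ValueError("pixel height unchanged; move the robot and re-sample")
    h = l * v0 * vt / (dv * f)   # first value / second value
    x_t = l * v0 / dv            # third value / (v_t - v_0); equals h*f/v_t
    x_0 = l * vt / dv            # starting distance
    return h, x_t, x_0

# Example: the robot advances 0.5 m, the target grows from 120 px to 150 px, f = 600 px.
# Result: h = 0.5 m, x_t = 2.0 m, x_0 = 2.5 m, consistent with x_t = h*f/v_t.
h, x_t, x_0 = estimate_height_and_distances(0.5, 120.0, 150.0, 600.0)
```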

In an alternative embodiment, determining a target included angle between the robot and the target object according to the target height includes: acquiring a third pixel height of the target object in the next frame image in the case that the distance between the robot and the target object changes and the target object moves laterally within a second time interval between the next frame image and the current frame image; determining the ratio of the product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and determining the ratio of the product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, where the first distance is the distance between the robot and the target object in a first shooting direction for acquiring the next frame image, and the second distance is the distance between the robot and the target object in a second shooting direction for acquiring the current frame image; and determining the target included angle according to the first distance, the second pixel height and a target proportion coefficient, where the target proportion coefficient is the proportion coefficient between the pixel coordinate of the target object in the image and the real distance of the target object.

Optionally, when the target object moves laterally in the image plane, the robot platform should rotate through a corresponding angle to keep following it. Assuming that the distance between the robot and the target object changes while the target object also moves laterally, the angular relationship between the robot platform and the target follows the pinhole imaging principle of the pinhole camera, as shown in fig. 4.

The target height $h$ can be calculated by the above steps. Using the pixel heights $v_t$ and $v_{t+1}$ of the target in the image ($v_{t+1}$ corresponding to the third pixel height above), the distances $l_t$ (where $l_t$ has the same meaning as $x_t$ above) and $l_{t+1}$ between the target object and the robot platform can be estimated from the relationship:

$$l_t = \frac{hf}{v_t}, \qquad l_{t+1} = \frac{hf}{v_{t+1}}$$

Mapping the target object of frame $t+1$ into the frame-$t$ image, the imaging characteristics of the pinhole camera in fig. 4 should satisfy the proportional relation:

$$u'_{t+1} = u_{t+1}\,\frac{l_{t+1}}{l_t}$$

where $l_{t+1}$ is the first distance, $l_t$ is the second distance, $u_t$ and $u_{t+1}$ are the lateral pixel coordinates of the target object, and $k$ is the proportionality coefficient between pixel coordinates and real distance (corresponding to the target proportion coefficient above), which satisfies

$$k = \frac{h}{v_t} = \frac{l_t}{f}$$

Thus, the change of the included angle between the robot and the target object is:

$$\alpha = \arctan\frac{k\,\big(u'_{t+1} - u_t\big)}{l_{t+1}}$$

where $\alpha$ is the target included angle.
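Under the reconstructed relations above (the frame-$t$ mapping of $u_{t+1}$ and $k = h/v_t$), the bearing computation can be sketched as follows; the variable names are illustrative and the formulas should be read as an interpretation of fig. 4 rather than the patent's verbatim equations:

```python
import math

def included_angle(h, f, vt, vt1, ut, ut1):
    """Estimate the included angle alpha between the robot heading and the target.

    h        -- real target height
    f        -- focal length (pixels)
    vt, vt1  -- target pixel heights in frames t and t+1
    ut, ut1  -- lateral pixel coordinates of the target (optical centre = 0)
    """
    l_t = h * f / vt             # second distance
    l_t1 = h * f / vt1           # first distance
    k = h / vt                   # metres per pixel at frame t (= l_t / f)
    u_mapped = ut1 * l_t1 / l_t  # frame t+1 target mapped into frame t
    lateral = k * (u_mapped - ut)      # real lateral displacement
    return math.atan2(lateral, l_t1)   # included angle alpha, in radians
```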

In an alternative embodiment, determining a target tracking distance at which the robot tracks the target object according to the current distance includes: when the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is smaller than the target tracking distance, reducing the moving speed of the robot so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value; or under the condition that the difference value between the current distance and the target tracking distance is larger than the preset threshold value and the current distance is larger than the target tracking distance, the moving speed of the robot is increased, so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value.
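As a sketch, the speed-adjustment rule just described can be written as a simple feedback step; the step size and the speed limits are invented for illustration:

```python
def adjust_speed(speed, current_dist, target_dist, threshold, step=0.05, v_max=1.5):
    """Nudge the robot speed until |current_dist - target_dist| <= threshold."""
    error = current_dist - target_dist
    if abs(error) <= threshold:
        return speed                      # within tolerance: keep the speed
    if error < 0:                         # closer than desired: slow down
        return max(0.0, speed - step)
    return min(v_max, speed + step)       # farther than desired: speed up
```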

In an alternative embodiment, tracking the target object according to the target tracking distance and the target included angle includes: in the process that the robot tracks the target object, the distance between the robot and the target object is kept as a target tracking distance, and the target included angle is kept smaller than or equal to a first preset angle.

Optionally, the real pose relationship between the robot and the followed target object can be obtained through the above steps. During tracking, the robot can keep the included angle between the target object and the robot platform (corresponding to the target included angle) at zero, and keep the distance between the robot and the target object at a fixed distance (such as the target tracking distance), where the target tracking distance may be a pre-specified following distance.

In an optional embodiment, the method further includes: acquiring the i-1 th frame image of the target object, the first pixel height and the first pixel width of a sampling frame for sampling the target object, and the initial position coordinate of the target object in the i-1 th frame image; performing image feature extraction on the i-1 th frame image to obtain a first feature vector of the i-1 th frame image; training classifier parameters according to the first feature vector, the first pixel height, the first pixel width and the initial position coordinate, where the initial position coordinate is the central pixel coordinate of the target object in the i-1 th frame image, the previous frame image includes the i-1 th frame image, and i is a natural number; sampling the target object at the initial position coordinate with the sampling frame corresponding to the first pixel height and the first pixel width to obtain the ith frame image, performing a log-polar transformation on the i-1 th frame image and the ith frame image, and determining the scale difference and the rotation angle of the target object between the i-1 th frame image and the ith frame image, where the scale difference represents the change of the size of the target object between the two frames, the rotation angle represents the angle change of the target object between the two frames relative to the same reference direction, and the current frame image includes the ith frame image; acquiring the current position coordinate of the target object in the ith frame image; performing image feature extraction on the ith frame image to obtain a second feature vector of the ith frame image, and determining the second pixel height and the second pixel width of the target object in the ith frame image according to the first pixel height, the first pixel width, the scale difference and the rotation angle, where the current position coordinate is the central pixel coordinate of the target object in the ith frame image; and training and updating the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height and the second pixel width, so that the image acquisition device samples the i+1 th frame image according to the current position coordinate, the second pixel height and the second pixel width, where the i+1 th frame image includes the next frame image.

Optionally, when the target object moves, the robot platform may acquire image information with the camera and perform pose estimation of the target object on the image using a target tracking algorithm, including pixel position estimation and target inclination angle (corresponding to the rotation angle) estimation; after the position information of the target object on the image is obtained, the current distance and the target included angle between the target and the robot platform are obtained through the above steps.

It should be noted that the target object can be tracked by training the classifier in advance; the classifier can be trained and its parameters updated with the acquired ith frame image, and when the i+1 th frame image is acquired, training continues with that frame, so that the target object is tracked by continuously updating the classifier online.

By continuously updating the classifier in this way, the position and the rotation angle of the target object can be updated in real time, improving the accuracy of tracking the target object.
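The per-frame cycle described above can be summarized as a loop. The callables below (`extract`, `train`, `locate`, `est_scale_rot`) are hypothetical placeholders for the embodiment's steps, injected as parameters because the patent specifies the procedure rather than an API:

```python
def track_online(frames, init_box, extract, train, locate, est_scale_rot):
    """Skeleton of the online update cycle: locate, re-estimate scale/angle,
    then retrain the classifier so frame i+1 is sampled at the new pose."""
    box = init_box                                  # (cx, cy, w, h)
    prev = next(frames)
    model = train(None, extract(prev, box), box)    # initial classifier training
    for frame in frames:
        cx, cy = locate(model, frame, box)              # pixel position estimate
        scale, angle = est_scale_rot(prev, frame, box)  # log-polar scale/rotation
        box = (cx, cy, box[2] * scale, box[3] * scale)
        model = train(model, extract(frame, box), box)  # online classifier update
        prev = frame
        yield box, angle                            # pose for the control loop
```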

In an optional embodiment, after determining the scale difference and the rotation angle of the target object in the i-1 th frame image and the ith frame image, the method further includes: stopping the robot's tracking of the target object when the rotation angle is greater than or equal to a second preset angle while the robot is tracking the target object.

Optionally, the safety state of the followed target object is judged based on the target inclination angle (corresponding to the above rotation angle). By specifying a safe angle range, when the inclination angle of the tracked target in the image plane exceeds the limit, the robot platform stops following. In addition, the robot platform also stops when tracking of the target object fails, the target moves completely out of the image, or the robot platform encounters an obstacle.

The flow of the target tracking method is described below with reference to an alternative example. As shown in fig. 5, the method may include the following steps:

step S501, calibrating a camera installed on the robot, determining a camera internal reference focal length f and an optical center pixel position, and ensuring that the optical center of the camera is adjusted to an image center in the subsequent image processing process. And correcting the odometer or the accelerometer of the robot platform, and estimating the displacement distance of the robot platform in an initialization stage.

Step S502, determine the following target (corresponding to the target object) by a detection algorithm or by manual designation, and initialize it, which includes determining the tracking target frame and calculating the target height. Specify the expected distance between the robot platform and the following target, and determine the correspondence between the target and the target pixels in the image in combination with the height of the following target.

Optionally, assume that the first moving distance of the robot between the two frames (the previous frame image and the current frame image) is $l$, the pixel height of the following target in the previous frame image (corresponding to the first pixel height) is $v_0$, the second pixel height of the following target in the current frame is $v_t$, and the focal length $f$ of the camera is known. The starting distance $x_0$ between the target and the robot platform camera, the distance $x_t$ between the target and the robot platform during following, and the followed target height $h$ are calculated as follows:

As shown in fig. 3, from the similar-triangle proportional relationship between the image and the real environment and the distance the robot moves between the two frames:

$$\frac{hf}{v_0} - \frac{hf}{v_t} = l$$

Further, the target height can be obtained as $h = \frac{l\,v_0 v_t}{(v_t - v_0)f}$, the distance between the current target object and the robot platform as $x_t = \frac{l\,v_0}{v_t - v_0}$, and the distance between the starting target position and the robot platform as $x_0 = \frac{l\,v_t}{v_t - v_0}$.

In the case where the target height $h$ is known in advance, the current distance between the robot and the target may also be calculated as:

$$x_t = \frac{hf}{v_t}$$

After the transformation between camera pixels and the actual environment is determined, the angle through which the robot platform should rotate when the target object moves laterally in the image plane can be inferred. It is generally assumed that the image acquisition frequency is high enough that the distance between the robot and the object hardly changes within one interval, leaving only lateral movement. If the target moves laterally while the distance also changes, the angular relationship between the robot platform and the target follows the pinhole imaging principle of the pinhole camera, as shown in fig. 4.

The target height $h$ can be calculated by the above steps. Using the pixel heights $v_t$ and $v_{t+1}$ (the latter corresponding to the third pixel height above), the distances $l_t$ and $l_{t+1}$ between the target object and the robot platform can be estimated:

$$l_t = \frac{hf}{v_t}, \qquad l_{t+1} = \frac{hf}{v_{t+1}}$$

Mapping the target object of frame $t+1$ into the frame-$t$ image, the imaging characteristics of the pinhole camera in fig. 4 should satisfy the proportional relation $u'_{t+1} = u_{t+1}\frac{l_{t+1}}{l_t}$, where $l_{t+1}$ is the first distance and $l_t$ is the second distance.

Thus, the change of the included angle between the robot and the target object is $\alpha = \arctan\frac{k\,(u'_{t+1} - u_t)}{l_{t+1}}$, where $\alpha$ is the above target included angle.

The real pose relationship between the robot and the followed target object can be obtained through the above steps. During tracking, the robot can keep the included angle between the target object and the robot platform (corresponding to the target included angle) at zero, and keep the distance between the robot and the target object at a fixed distance (such as the target tracking distance), where the target tracking distance may be a pre-specified following distance.

Step S503, when the target object moves, the robot platform may acquire image information with the camera and perform pose estimation of the target object on the image using a target tracking algorithm (such as the tracking algorithm KSCFrot), including pixel position estimation and target inclination angle (corresponding to the above rotation angle) estimation; after the position information of the target object on the image is obtained, the current distance and the target included angle between the target and the robot platform are obtained through the above steps.

In step S504, the safety state of the following target object is determined based on the target inclination angle (corresponding to the above rotation angle) obtained in step S503. By specifying a safe angle range, when the inclination angle of the tracked target in the image plane exceeds the limit, the robot platform stops following. In addition, the robot platform also stops when tracking of the target object fails, the target moves completely out of the image, or the robot platform encounters an obstacle. Otherwise, the process proceeds to step S505.

Step S505, judge whether the robot platform has reached the designated position; if not, adjust the robot platform so that the distance and the included angle between the robot and the target object match the expected values. After the new position is reached, step S503 is executed again.
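Steps S503 to S505 form a closed loop. One plausible arrangement is sketched below; all callables (`grab`, `track`, `pose_from_box`, `move`, `stop`) are hypothetical interfaces to the camera, the tracker, and the platform, and the tolerances are illustrative:

```python
import math

def follow_loop(grab, track, pose_from_box, move, stop,
                desired_dist=2.0, dist_tol=0.1,
                safe_angle=math.radians(45), ang_tol=math.radians(2)):
    """S503-S505: estimate the pose, judge safety, adjust the platform, repeat.

    grab() -> frame; track(frame) -> (box, tilt, ok);
    pose_from_box(box) -> (distance, included_angle);
    move(distance_error, angle); stop().
    """
    while True:
        frame = grab()                          # S503: acquire image information
        box, tilt, ok = track(frame)            # pixel pose + inclination angle
        if not ok or abs(tilt) >= safe_angle:   # S504: safety judgement
            stop()                              # tracking failed, target out of
            return                              # view, or tilted past the limit
        dist, angle = pose_from_box(box)        # current distance + included angle
        err = dist - desired_dist
        if abs(err) > dist_tol or abs(angle) > ang_tol:
            move(err, angle)                    # S505: adjust toward expected pose
```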

Optionally, the principle and flow of the target tracking algorithm in step S503 (e.g. the tracking algorithm KSCFrot) are as follows: first, the tracking classifier is used to estimate the pixel-coordinate position of the tracked target; then the scale and angle are estimated; finally, the tracking classifier is updated according to the new position and size. As shown in fig. 6, the specific process is as follows:

step 1, KSCFrot location estimation process:

given a picture containing a tracked target object, the target object's width, height, and initial position coordinates. Taking the center of the target object as an origin, sampling by 2.5 times of the width and the height of the target object to obtain a base sample, and forming a group of initial training samples xi by circularly shifting base sample image blocks.

The training samples are used to train a two-class classifier separating the tracked target object from the background, and they first need to be processed. Generally, feature extraction is performed on the sample image block; mapping the pixel information into a nonlinear space facilitates classification of the target object by a support vector machine. Define the operation that maps an image into the nonlinear space as $\varphi(\cdot)$; the mapping may be HOG feature extraction, statistical color features, or features extracted by a neural network. It is understood that the above is only an example, and the present embodiment is not limited thereto.

The labeling process for the support vector machine classifier's training samples first establishes a two-dimensional Gaussian response $p_i$ with amplitude 1 according to the size of the sampled image block, and then divides the labels into three classes using two labeling thresholds: positions where the Gaussian response is greater than the upper threshold $\theta_u$ are labeled 1, positions where it is smaller than the lower threshold $\theta_l$ are labeled -1, and all other positions are labeled 0. The classifier is trained to minimize the error between the output result $y$ and the labels.

The classifier $w$ is defined by the dual parameters $\alpha$ and the feature vectors, $w = \sum_i \alpha_i\,\varphi(x_i)$. Combining the feature vectors in the classifier with the image feature vectors yields the kernel matrix:

$$K_{ij} = \varphi(x_i)^{\top}\varphi(x_j)$$

The kernel matrix is circulant, which simplifies the computation: through the kernel matrix, solving the classifier is transferred into the dual space, where the dual parameters are solved. Meanwhile, the convolution operation in the time domain can be moved to the frequency domain as an element-wise multiplication; using the fast Fourier transform, the processing is faster, and the following can be obtained:

the optimization objective function of the classifier is:

$$\min_{w,\,e}\ \frac{1}{2}\|w\|^{2} + C\sum_{i} e_{i}, \qquad \text{s.t.}\quad e \ge 0 \tag{3}$$

where C is an adjustment coefficient and the slack variables $e_i$ absorb the labeling errors.

The classifier parameters satisfy:

$$w = \sum_{i} \alpha_i\,\varphi(x_i)$$

after each sample xi generated by the circulant matrix and the classifier { w, b } or { alpha, b } are subjected to correlation calculation, a corresponding response result pi can be obtained. And after the result is obtained, the result is converted back to a time domain through inverse Fourier transform, so that the result can be displayed more visually. And arranging the response results in a cyclic sequence of the generated samples, searching the maximum response position and comparing the maximum response position with the standard Gaussian response maximum position so as to judge the movement distance of the tracking target. The width and height of the object on the image remains unchanged.

max(pi)→u,v (6)
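The frequency-domain shortcut referred to above can be illustrated with a linear-kernel correlation in NumPy: the response to every cyclic shift is obtained with one FFT product, and the peak offset gives the displacement. This is a generic illustration of the technique, not the patent's exact KSCFrot solver:

```python
import numpy as np

def correlation_response(template_weights, features):
    """Response of a linear template to every cyclic shift of `features`,
    computed as element-wise multiplication in the frequency domain."""
    F = np.fft.fft2(features)
    W = np.fft.fft2(template_weights)
    resp = np.real(np.fft.ifft2(np.conj(W) * F))    # back to the time domain
    dy, dx = np.unravel_index(np.argmax(resp), resp.shape)
    h, w = resp.shape
    if dy > h // 2: dy -= h     # wrap large shifts to signed displacements,
    if dx > w // 2: dx -= w     # measured against the Gaussian peak position
    return resp, (dx, dy)
```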

Step 2, scale and angle estimation process:

after the preliminary position estimation is obtained, the target scale and angle changes are estimated. The process is to sample the same width and height of the base sample of one frame or initial frame above the initial target position. And circularly generating a group of scale and angle training samples for the base sample, and converting the Cartesian coordinates of the samples into polar logarithmic coordinates. By transforming logpolar (-) by pole pair, the angle and scale can be changed linearly. Assume that the template is I0 and the current frame target image is I1. Two image blocks x center pixel (u)0,v0) As an origin, the Cartesian coordinates u/v are converted to polar angular coordinates s/θ.

Wherein the content of the first and second substances,r is the reference direction in cartesian coordinates.

Figure BDA0002521298730000154
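A minimal NumPy sketch of the coordinate change, resampling an image block onto an $(s, \theta)$ grid so that scale and rotation become translations; the grid resolution and nearest-neighbour lookup are illustrative choices:

```python
import numpy as np

def to_log_polar(block, n_s=32, n_theta=64):
    """Resample a square image block onto a log-polar (s, theta) grid
    centred on the block, via nearest-neighbour lookup."""
    h, w = block.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    s = np.linspace(0.0, np.log(r_max), n_s)           # log-radius axis
    theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    rr = np.exp(s)[:, None]                            # radii, shape (n_s, 1)
    u = np.clip(np.round(cx + rr * np.cos(theta)).astype(int), 0, w - 1)
    v = np.clip(np.round(cy + rr * np.sin(theta)).astype(int), 0, h - 1)
    return block[v, u]                                 # shape (n_s, n_theta)
```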

Feature extraction $\Phi(\cdot)$ is performed on the training sample, and phase-correlation estimation between its feature vector and the tracking template $h_\rho$ is carried out in the nonlinear feature space. The results are arranged in the cyclic order in which the samples were generated, and the coordinate of the maximum response is taken. Because the axes of the log-polar coordinate system represent length and angle, the coordinate of the maximum response determines the difference in scale and angle between the tracked target and the template. The image of the tracked target object obtained in step 1 is rescaled accordingly at its initial Cartesian position coordinate, and the obtained angle differences are accumulated to determine the angle change between the tracked target and the initial state. Finally, the Cartesian position coordinate and the inclination-angle estimate of the tracked target are obtained.

$$\max(h_\rho \star \Phi) \to (s', \theta') \tag{10}$$
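For completeness, a sketch of the phase-correlation step on the log-polar grid: the peak location gives the shift along $s$ (scale) and along $\theta$ (rotation). `log_r_max` must match the value used when the grid was built, and the sign conventions are illustrative:

```python
import numpy as np

def scale_angle_shift(template_lp, sample_lp, log_r_max):
    """Phase correlation between two log-polar blocks; returns (scale, angle)."""
    n_s, n_theta = template_lp.shape
    T, S = np.fft.fft2(template_lp), np.fft.fft2(sample_lp)
    cross = S * np.conj(T)
    cross /= np.abs(cross) + 1e-8                   # keep only the phase
    resp = np.real(np.fft.ifft2(cross))
    ds, dth = np.unravel_index(np.argmax(resp), resp.shape)
    if ds > n_s // 2: ds -= n_s                     # wrap to signed shifts
    if dth > n_theta // 2: dth -= n_theta
    scale = np.exp(ds * log_r_max / n_s)            # shift along s -> scale factor
    angle = dth * 2 * np.pi / n_theta               # shift along theta -> rotation
    return scale, angle
```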

Step 3, template updating:

at the new position, the training sample collection is repeated with the new size using the method in step 1, and the classifier parameters are calculated. And sampling the next image according to the position and the size obtained by the current frame, and performing a new round of target tracking calculation.

According to this embodiment, first, a camera (such as a monocular industrial camera) and a target tracking algorithm are used to estimate the pose of the target, including the target's image position and its in-plane rotation angle; second, the following distance and orientation of the robot platform are calculated from the camera intrinsics and the target size, combined with the transformation from the image plane to the real environment; finally, the pose of the robot is adjusted so that the followed target stays at a fixed distance in front of the camera, while the estimated rotation angle safeguards the operation of the robot platform. Taking the support-vector-machine tracking algorithm KSCF as a basis, a new tracking algorithm KSCFrot is designed by introducing log-polar coordinates for estimating the scale and angle of the tracked target, so that the rotation of the tracked target can be estimated. The robot platform can thus follow any given target at a specified distance relying only on a monocular vision sensor, and can safely judge abnormal states in which the followed target tilts or rolls over, which improves the flexibility of the robot in tracking the target object and, through the rotation-angle estimate, its safety.

It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.

According to still another aspect of the embodiments of the present invention, there is also provided an object tracking apparatus, as shown in fig. 7, including:

a first obtaining unit 702, configured to obtain a target height of a target object, where the target object is a tracked object currently tracked by the robot, and the target height is used to represent an actual height of the target object;

a first determining unit 704, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used to represent a current actual distance between the robot and the target object;

a second determining unit 706, configured to determine a target tracking distance at which the robot tracks the target object according to the current distance, and determine a target included angle between the robot and the target object according to the target height;

the tracking unit 708 is configured to track the target object according to a target tracking distance and a target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object.

Alternatively, the first acquiring unit 702 may be configured to execute step S202, the first determining unit 704 may be configured to execute step S204, the second determining unit 706 may be configured to execute step S206, and the tracking unit 708 may be configured to execute step S208.

According to this embodiment, a target object currently tracked by the robot is first obtained together with a target height representing the actual height of the target object; the current distance between the robot and the target object is determined according to the target height; then the target tracking distance for tracking the target object is determined according to the current distance, and a target included angle between the robot and the target object is determined according to the target height, so that the robot tracks the target object according to the target tracking distance and the target included angle. This achieves the technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle, and solves the technical problem in the related art that a robot cannot flexibly follow an arbitrarily given target at an arbitrarily specified distance.

As an optional technical solution, the first acquiring unit includes: a first acquiring module, configured to acquire a pre-stored target height under the condition that the target height of the target object is known; or a second acquiring module, configured to acquire a previous frame image and a current frame image containing the target object under the condition that the target height of the target object is unknown, and acquire a first moving distance of the robot within a first time interval between generation of the previous frame image and the current frame image; and a first determining module, configured to determine the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of the image acquisition device of the robot.

As an optional technical solution, the first determining module is further configured to determine the product of the first moving distance, the first pixel height and the second pixel height as a first numerical value, determine the product of the focal length and the difference between the second pixel height and the first pixel height as a second numerical value, and determine the ratio between the first numerical value and the second numerical value as the target height.

As an optional technical solution, the first determining unit is further configured to determine, as the current distance, the ratio of the product of the target height and the focal length to the second pixel height.

As an optional technical solution, the apparatus further includes: a third determining unit, configured to determine a third numerical value as the product of the first moving distance and the first pixel height, and determine the ratio of the third numerical value to the difference between the second pixel height and the first pixel height as the current distance.
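
Taken together, the three formulas above follow from a pinhole camera model: the pixel height of the target scales inversely with its distance. The following is a minimal sketch of that arithmetic, assuming the two frames are taken while the robot moves straight toward the target; the function and variable names are illustrative, not from the patent.

```python
import math

def estimate_height_and_distance(move_dist, h1_px, h2_px, focal_px):
    """Pinhole model: pixel_height = focal_px * real_height / distance.
    Two frames taken move_dist apart while approaching (h2_px > h1_px)
    determine both the real target height and the current distance."""
    if h2_px <= h1_px:
        raise ValueError("target should appear larger in the current frame")
    first_value = move_dist * h1_px * h2_px      # first numerical value
    second_value = focal_px * (h2_px - h1_px)    # second numerical value
    target_height = first_value / second_value   # ratio -> target height
    # Current distance from the recovered height and the focal length...
    current_distance = target_height * focal_px / h2_px
    # ...which agrees with the direct two-frame form: the third numerical
    # value over the pixel-height difference.
    assert math.isclose(current_distance, move_dist * h1_px / (h2_px - h1_px))
    return target_height, current_distance
```

For example, with focal_px = 1000, h1_px = 100, h2_px = 125 and move_dist = 3.4 m, the sketch returns a target height of 1.7 m at a current distance of 13.6 m.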

As an optional technical solution, the second determining unit includes: a third obtaining module, configured to obtain a third pixel height of the target object in a next frame image when the target tracking distance between the robot and the target object changes and the target object moves laterally within a second time interval between the next frame image and the current frame image; a second determining module, configured to determine the ratio of the product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and determine the ratio of the product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, where the first distance is the distance between the robot and the target object in a first shooting direction in which the next frame image is acquired, and the second distance is the distance between the robot and the target object in a second shooting direction in which the current frame image is acquired; and a third determining module, configured to determine the target included angle according to the first distance, the second pixel height, and a target proportion coefficient, where the target proportion coefficient is the proportion coefficient between the pixel coordinates of the target object in the image and the real distance of the target object.
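
The paragraph above leaves the trigonometry implicit. One plausible reading, sketched below, is that the target proportion coefficient converts the target's pixel offset from the image centre into a lateral real-world offset, and the included angle between the two shooting directions is then the arctangent of that offset over the forward distance. Both `pixel_offset` and this geometry are assumptions for illustration, not the patent's definition.

```python
import math

def estimate_included_angle(first_distance, pixel_offset, proportion_coeff):
    # Assumed geometry: pixels -> metres via the target proportion
    # coefficient, then the angle from lateral and forward components.
    lateral_offset = pixel_offset * proportion_coeff
    return math.atan2(lateral_offset, first_distance)  # radians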

As an optional technical solution, the second determining unit includes: a first processing module, configured to reduce the moving speed of the robot when the difference between the current distance and the target tracking distance is greater than a preset threshold and the current distance is less than the target tracking distance, so that the difference between the current distance and the target tracking distance becomes less than or equal to the preset threshold; or a second processing module, configured to increase the moving speed of the robot when the difference between the current distance and the target tracking distance is greater than the preset threshold and the current distance is greater than the target tracking distance, so that the difference between the current distance and the target tracking distance becomes less than or equal to the preset threshold.
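
This pair of modules amounts to deadband control on the distance error: do nothing inside the threshold, otherwise adjust the speed toward closing the gap. A minimal sketch, where `step` is an assumed tuning parameter the patent does not specify:

```python
def adjust_speed(current_dist, track_dist, speed, threshold, step=0.1):
    """Return the robot's new moving speed given the distance error."""
    error = current_dist - track_dist
    if abs(error) <= threshold:
        return speed                   # within tolerance: keep the speed
    if error < 0:
        return max(0.0, speed - step)  # too close: slow down
    return speed + step                # falling behind: speed up
```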

As an optional technical solution, the tracking unit is further configured to, while the robot tracks the target object, keep the distance between the robot and the target object at the target tracking distance and keep the target included angle less than or equal to a first preset angle.

As an optional technical solution, the apparatus further includes: a first processing unit, configured to acquire an (i-1)-th frame image of the target object, a first pixel height and a first pixel width of a sampling frame used to sample the target object, and an initial position coordinate of the target object in the (i-1)-th frame image; perform image feature extraction on the (i-1)-th frame image to obtain a first feature vector of the (i-1)-th frame image; and train classifier parameters according to the first feature vector, the first pixel height, the first pixel width, and the initial position coordinate, where the initial position coordinate is the central pixel coordinate of the target object in the (i-1)-th frame image, the previous frame image includes the (i-1)-th frame image, and i is a natural number; a second processing unit, configured to sample the target object at the initial position coordinate using a sampling frame corresponding to the first pixel height and the first pixel width to obtain an i-th frame image, perform a log-polar transformation on the (i-1)-th frame image and the i-th frame image, and determine a scale difference and a rotation angle of the target object between the (i-1)-th frame image and the i-th frame image, where the scale difference represents the change in the size of the target object between the (i-1)-th and i-th frame images, the rotation angle represents the change in the angle of the target object between the (i-1)-th and i-th frame images relative to the same reference direction, and the current frame image includes the i-th frame image; a third processing unit, configured to acquire the current position coordinate of the target object in the i-th frame image, perform image feature extraction on the i-th frame image to obtain a second feature vector of the i-th frame image, and determine a second pixel height and a second pixel width of the target object in the i-th frame image according to the first pixel height, the first pixel width, the scale difference, and the rotation angle, where the current position coordinate is the central pixel coordinate of the target object in the i-th frame image; and a fourth processing unit, configured to train and update the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height, and the second pixel width, so that the image acquisition equipment samples an (i+1)-th frame image according to the current position coordinate, the second pixel height, and the second pixel width, where the (i+1)-th frame image includes the next frame image.
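
The log-polar transformation is the standard device for this step: after resampling the patches into log-polar coordinates, a scale change becomes a shift along the log-radius axis and a rotation becomes a shift along the angle axis, and both shifts can be read off by phase correlation. A minimal sketch assuming OpenCV and equally sized single-channel grayscale patches; it illustrates the technique, not the patent's exact pipeline:

```python
import cv2
import numpy as np

def scale_and_rotation(prev_patch, curr_patch):
    """Estimate the scale difference and rotation angle between two
    same-size grayscale target patches via a log-polar transform."""
    h, w = prev_patch.shape[:2]
    center = (w / 2.0, h / 2.0)
    max_radius = min(center)
    flags = cv2.INTER_LINEAR + cv2.WARP_POLAR_LOG
    lp_prev = cv2.warpPolar(prev_patch, (w, h), center, max_radius, flags)
    lp_curr = cv2.warpPolar(curr_patch, (w, h), center, max_radius, flags)
    # Translation between the log-polar images <-> (scale, rotation).
    (dx, dy), _ = cv2.phaseCorrelate(np.float32(lp_prev), np.float32(lp_curr))
    scale = np.exp(dx * np.log(max_radius) / w)  # shift along log-radius axis
    rotation_deg = dy * 360.0 / h                # shift along angle axis
    return scale, rotation_deg
```

A rotation estimate of this kind is also what the fifth processing unit below compares against the second preset angle; the sign conventions of phaseCorrelate depend on argument order, so in practice the mapping is validated on a synthetic rotation first.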

As an optional technical solution, the apparatus further includes: a fifth processing unit, configured to, after the scale difference and the rotation angle of the target object between the (i-1)-th and i-th frame images have been determined, stop the robot's tracking of the target object when the rotation angle is greater than or equal to a second preset angle while the robot is tracking the target object.

According to a further aspect of the embodiments of the present invention, there is also provided a storage medium in which a computer program is stored, where the computer program is arranged to perform the steps in any one of the above method embodiments when executed.

Optionally, in this embodiment, the storage medium may be configured to store a computer program for executing the following steps:

S1, acquiring the target height of the target object, where the target object is the object currently tracked by the robot, and the target height represents the actual height of the target object;

S2, determining the current distance between the robot and the target object according to the target height, where the current distance represents the current actual distance between the robot and the target object;

S3, determining the target tracking distance at which the robot tracks the target object according to the current distance, and determining the target included angle between the robot and the target object according to the target height;

S4, tracking the target object according to the target tracking distance and the target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object.
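
Read as a control loop, steps S1 to S4 repeat once per frame. The following high-level sketch ties them together; the `robot` object and all of its methods are illustrative assumptions standing in for the units described above:

```python
def tracking_step(robot, known_height=None):
    # S1: obtain the target height (pre-stored, or estimated from two frames).
    if known_height is not None:
        target_height = known_height
    else:
        target_height = robot.estimate_target_height()
    # S2: current actual distance via the monocular height formula.
    current_distance = robot.current_distance(target_height)
    # S3: tracking distance and target included angle.
    tracking_distance = robot.target_tracking_distance(current_distance)
    included_angle = robot.included_angle(target_height)
    # S4: follow at the preset tracking distance and included angle.
    robot.track(tracking_distance, included_angle)
```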

Optionally, in this embodiment, those skilled in the art will understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing the relevant hardware of a terminal device, and the program may be stored in a computer-readable storage medium, which may include a flash disk, a ROM (Read-Only Memory), a RAM (Random Access Memory), a magnetic disk, an optical disk, and the like.

According to yet another aspect of the embodiments of the present invention, there is also provided an electronic device for implementing the above target tracking method. As shown in Fig. 8, the electronic device includes a memory 802 and a processor 804; the memory 802 stores a computer program, and the processor 804 is configured to execute the steps in any one of the above method embodiments through the computer program.

Optionally, in this embodiment, the electronic device may be located in at least one of a plurality of network devices of a computer network.

Optionally, in this embodiment, the processor may be configured to execute the following steps through a computer program:

S1, acquiring the target height of the target object, where the target object is the object currently tracked by the robot, and the target height represents the actual height of the target object;

S2, determining the current distance between the robot and the target object according to the target height, where the current distance represents the current actual distance between the robot and the target object;

S3, determining the target tracking distance at which the robot tracks the target object according to the current distance, and determining the target included angle between the robot and the target object according to the target height;

S4, tracking the target object according to the target tracking distance and the target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object.

Optionally, those skilled in the art will understand that the structure shown in Fig. 8 is only illustrative, and the electronic device may also be a terminal device such as a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), or a PAD. Fig. 8 does not limit the structure of the electronic device. For example, the electronic device may include more or fewer components (e.g., a network interface) than shown in Fig. 8, or have a configuration different from that shown in Fig. 8.

The memory 802 may be used to store software programs and modules, such as the program instructions/modules corresponding to the target tracking method and apparatus in the embodiments of the present invention; the processor 804 executes various functional applications and data processing by running the software programs and modules stored in the memory 802, thereby implementing the above target tracking method. The memory 802 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, a flash memory, or other non-volatile solid-state memory. In some examples, the memory 802 may further include memory located remotely from the processor 804, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 802 may be specifically configured to, but is not limited to, store information such as the target height of the target object. As an example, as shown in Fig. 8, the memory 802 may include, but is not limited to, the first obtaining unit 702, the first determining unit 704, the second determining unit 706, and the tracking unit 708 of the above target tracking apparatus. The memory 802 may also include, but is not limited to, other module units of the target tracking apparatus, which are not described again in this example.

Optionally, the transmission device 806 is configured to receive or transmit data via a network. Examples of the network may include wired and wireless networks. In one example, the transmission device 806 includes a Network Interface Card (NIC), which can be connected to a router and other network devices via a network cable so as to communicate with the Internet or a local area network. In another example, the transmission device 806 is a Radio Frequency (RF) module, which is used to communicate with the Internet in a wireless manner.

In addition, the electronic device further includes: a display 808, and a connection bus 810 for connecting the various module components of the electronic device described above.

In other embodiments, the terminal or the server may be a node in a distributed system. The distributed system may be a blockchain system, that is, a distributed system formed by a plurality of nodes connected through network communication. The nodes may form a peer-to-peer (P2P) network, and a computing device of any type, such as a server, a terminal, or another electronic device, may become a node in the blockchain system by joining the peer-to-peer network.

The serial numbers of the above embodiments of the present invention are merely for description and do not represent the relative merits of the embodiments.

If the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods in the embodiments of the present invention.

In the above embodiments of the present invention, the description of each embodiment has its own emphasis. For a part that is not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.

In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the division of units is merely a division by logical function, and there may be other divisions in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be in electrical or other forms.

The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.

The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements shall also fall within the protection scope of the present invention.
