Data processing method and device, unmanned aerial vehicle and flight control system

Document No.: 74746  Publication date: 2021-10-01

Note: This technology, "Data processing method and device, unmanned aerial vehicle and flight control system", was designed and created by 杨小虎 (Yang Xiaohu) and 赵丛 (Zhao Cong) on 2019-12-26. Its main content is as follows: a data processing method, a data processing device, an unmanned aerial vehicle and a flight control system are provided. The method is applied to a movable platform that carries a shooting device and comprises: acquiring, through the shooting device, an image of the environment where a target object is located (S11); sending the image to a terminal device so that the terminal device determines first area information of the target object in the image and returns the first area information to the movable platform (S12); and determining, according to the first area information, orientation information of the target object relative to the movable platform, the orientation information being used to control the movable platform to follow the target object (S13). The method saves computing power on the movable platform and provides an accurate and fast tracking scheme.

A data processing method is applied to a movable platform carrying a shooting device, and comprises the following steps:

acquiring, through the shooting device, an image of the environment where a target object is located;

sending the image to a terminal device so that the terminal device determines first area information of the target object in the image and returns the first area information to the movable platform;

and determining orientation information of the target object relative to the movable platform according to the first area information, wherein the orientation information is used for controlling the movable platform to follow the target object.
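For orientation, the following is a minimal sketch (not from the patent) of what the movable-platform side of this method could look like in Python; `camera`, `link`, `flight_ctrl` and `compute_orientation` are hypothetical placeholders for the shooting device, the wireless link to the terminal device, the flight controller and the geometry step.

```python
# Sketch of the movable-platform loop described above (hypothetical helper names).
def follow_loop(camera, link, flight_ctrl, compute_orientation):
    """Per captured frame: send the image to the terminal device, then use any
    returned first area information to steer the platform toward the target."""
    frame_id = 0
    while True:
        image = camera.capture()                 # image of the environment of the target object
        link.send_image(frame_id, image)         # image goes to the terminal device
        reply = link.poll_first_area()           # first area information returned by the terminal, or None
        if reply is not None:
            orientation = compute_orientation(reply)   # target orientation relative to the platform
            flight_ctrl.follow(orientation)            # orientation information drives the following
        frame_id += 1
```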

The data processing method of claim 1, wherein the determining orientation information of the target object relative to the movable platform according to the first area information comprises:

determining second area information of the target object in the image;

generating target area information according to the first area information and the second area information;

and determining the orientation information of the target object relative to the movable platform according to the target area information.
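The conversion from target area information (a box in the image) to orientation information can be done with basic pinhole-camera geometry. The sketch below assumes the box is given as a pixel centre and size and uses illustrative field-of-view values; none of the numbers or names come from the patent.

```python
import math

def box_to_orientation(box, image_w, image_h, hfov_deg=84.0, vfov_deg=53.0):
    """Convert a target box (cx, cy, w, h) in pixels into orientation information:
    horizontal and vertical angles of the target relative to the optical axis,
    plus the apparent size as a rough range cue.  FOV values are illustrative."""
    cx, cy, w, h = box
    # Normalised offset of the box centre from the image centre, in [-1, 1].
    nx = (cx - image_w / 2.0) / (image_w / 2.0)
    ny = (cy - image_h / 2.0) / (image_h / 2.0)
    # Pinhole model: angle = atan(normalised offset * tan(half FOV)).
    yaw = math.degrees(math.atan(nx * math.tan(math.radians(hfov_deg / 2.0))))
    pitch = -math.degrees(math.atan(ny * math.tan(math.radians(vfov_deg / 2.0))))
    area_ratio = (w * h) / (image_w * image_h)   # larger box = closer target
    return yaw, pitch, area_ratio
```

For example, for a 1920x1080 image, a box centred at (1200, 500) yields a positive yaw (target right of centre) and a small positive pitch (target slightly above centre).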

The data processing method according to claim 2, wherein the first area information and the second area information respectively include a position and/or a size of an area corresponding to the target object in the image.

The data processing method of claim 2, wherein the first area information includes a frame number, and the generating target area information according to the first area information and the second area information includes:

and generating, according to the frame number in the first area information, the target area information from the first area information and the second area information corresponding to the same frame.

The data processing method according to claim 4, wherein the generating the target area information from the first area information and the second area information corresponding to the same frame includes:

and taking, as the target area information, whichever of the first area information and the second area information corresponding to the same frame has the higher confidence.
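One way to implement the pairing and selection described in the two preceding claims is sketched below: results are keyed by frame number, and for frames that have both a terminal-side (first) and an onboard (second) box, the one with the higher confidence wins. The dictionary-based bookkeeping and the `conf` field name are assumptions of this sketch.

```python
def fuse_same_frame(first_by_frame, second_by_frame):
    """first_by_frame / second_by_frame: dicts mapping frame number to a box
    record such as {"x": ..., "y": ..., "w": ..., "h": ..., "conf": ...}.
    Returns the target area information chosen per frame."""
    target_by_frame = {}
    for frame_id, first in first_by_frame.items():
        second = second_by_frame.get(frame_id)       # pair by frame number
        if second is None:
            target_by_frame[frame_id] = first        # only the terminal's result exists
        elif first["conf"] >= second["conf"]:
            target_by_frame[frame_id] = first        # terminal result is more confident
        else:
            target_by_frame[frame_id] = second       # onboard result is more confident
    return target_by_frame
```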

The data processing method according to claim 4, wherein the frame number of the first area information currently returned by the terminal device is N, the frame number of the image currently acquired by the shooting device is M, and M is greater than N; and the generating target area information according to the first area information and the second area information includes:

determining reference area information according to the first area information and the second area information corresponding to the Nth frame;

and determining the target area information according to the reference area information and the second area information in the images after the Nth frame.

The data processing method of claim 6, wherein the determining the target area information according to the reference area information and the second area information in the images after the Nth frame comprises:

and determining the target area information according to the reference area information and the average value of the second area information in the images after the Nth frame.
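The two preceding claims cover transmission latency: the terminal's result for frame N arrives while the shooting device is already at frame M > N. The sketch below is one possible reading, assuming boxes are (x, y, w, h) tuples: the onboard (second) boxes observed after frame N are averaged, and the reference box fused at frame N is shifted by how far that average has drifted; this displacement-based update is an assumption, not spelled out in the claims.

```python
def compensate_latency(reference_box, second_box_at_n, second_boxes_after_n):
    """reference_box: box fused from the first and second area information of frame N.
    second_box_at_n: onboard box at frame N.
    second_boxes_after_n: onboard boxes for frames N+1 .. M.
    Returns the target area information for the current frame."""
    if not second_boxes_after_n:
        return reference_box
    n = len(second_boxes_after_n)
    avg = [sum(b[i] for b in second_boxes_after_n) / n for i in range(4)]  # mean onboard box after N
    dx = avg[0] - second_box_at_n[0]      # how far the onboard track drifted since frame N
    dy = avg[1] - second_box_at_n[1]
    # Keep the fused box's size, translate it by the average drift.
    return (reference_box[0] + dx, reference_box[1] + dy, reference_box[2], reference_box[3])
```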

The data processing method of claim 6, wherein the method further comprises:

and determining the second area information in the images after the Mth frame according to the target area information.
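In practice this usually means re-seeding whatever onboard tracker produces the second area information, so that local tracking for frames after M continues from the corrected box; a minimal sketch under that assumption, with a hypothetical `tracker.init` interface:

```python
def reseed_onboard_tracker(tracker, current_image, target_box):
    # Re-initialise the onboard tracker from the fused target area information,
    # so the second area information for frames after M starts from the corrected box.
    tracker.init(current_image, target_box)   # hypothetical tracker interface
```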

The data processing method of claim 1, wherein the movable platform is an unmanned aerial vehicle, and the terminal device is a remote controller or a central control platform of the unmanned aerial vehicle.

A data processing method is applied to a terminal device, wherein the terminal device communicates with a movable platform carrying a shooting device, and the method comprises the following steps:

receiving an image of the environment where a target object is located, wherein the image is acquired by the movable platform through the shooting device and sent by the movable platform;

determining first area information of the target object in the image, wherein the first area information comprises an area position and an area size of the target object in the image;

and sending the first area information to the movable platform so that the movable platform determines orientation information of the target object relative to the movable platform according to the first area information, wherein the orientation information is used for controlling the movable platform to follow the target object.

The data processing method of claim 10, further comprising:

and sending, to the movable platform, the first area information together with the frame number of the image corresponding to the first area information.
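On the terminal-device side, the two claims above amount to: run a detector on each received image, and send the resulting box back tagged with the frame number it was computed on. A minimal sketch, where `link` and `detector` are hypothetical stand-ins for the wireless connection and whatever detection model the terminal runs:

```python
def terminal_loop(link, detector):
    """Receive images from the movable platform, determine the first area
    information (position and size of the target in the image), and return it
    together with the frame number of the image it belongs to."""
    while True:
        frame_id, image = link.receive_image()
        box = detector.detect(image)             # (x, y, w, h) of the target object, or None
        if box is None:
            continue                             # no target in this frame; nothing to return
        link.send_first_area({
            "frame_id": frame_id,                # lets the platform pair this result with its own frame
            "x": box[0], "y": box[1],
            "w": box[2], "h": box[3],
        })
```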

The data processing method according to claim 10 or 11, wherein the movable platform is an unmanned aerial vehicle, and the terminal device is a remote controller or a central control platform of the unmanned aerial vehicle.

An unmanned aerial vehicle, comprising:

a flight structure;

a shooting device;

a storage device;

a processor for executing code stored by the storage device to control the flight of the flight structure, the code configured to:

acquiring, through the shooting device, an image of the environment where a target object is located;

sending the image to a terminal device, so that the terminal device determines first area information of the target object in the image and returns the first area information to the unmanned aerial vehicle, wherein the first area information comprises an area position and an area size of the target object in the image;

and determining orientation information of the target object relative to the unmanned aerial vehicle according to the first area information, wherein the orientation information is used for controlling the flight structure to follow the target object.
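As an illustration of how orientation information could be turned into commands for the flight structure during following, here is a plain proportional control sketch; the gains, the desired apparent size and the command interface are all assumptions of this example, not part of the claims.

```python
def follow_command(yaw_deg, pitch_deg, area_ratio,
                   desired_area_ratio=0.05, k_yaw=0.8, k_climb=0.5, k_fwd=10.0):
    """Map orientation information (angles to the target and apparent box size)
    to flight-structure commands with a simple proportional law."""
    yaw_rate = k_yaw * yaw_deg                                 # turn toward the target
    climb_rate = k_climb * pitch_deg                           # keep the target vertically centred
    forward_speed = k_fwd * (desired_area_ratio - area_ratio)  # keep the apparent size roughly constant
    return yaw_rate, climb_rate, forward_speed
```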

The unmanned aerial vehicle of claim 13, wherein the determining orientation information of the target object relative to the unmanned aerial vehicle according to the first area information comprises:

determining second area information of the target object in the image;

generating target area information according to the first area information and the second area information;

and determining the orientation information of the target object relative to the unmanned aerial vehicle according to the target area information.

The unmanned aerial vehicle of claim 14, wherein the first area information and the second area information respectively include a position and/or a size of an area corresponding to the target object in the image.

The unmanned aerial vehicle of claim 14, wherein the first area information comprises a frame number, and wherein generating target area information according to the first area information and the second area information comprises:

and generating, according to the frame number in the first area information, the target area information from the first area information and the second area information corresponding to the same frame.

The unmanned aerial vehicle of claim 16, wherein the generating the target area information from the first area information and the second area information corresponding to the same frame comprises:

and taking, as the target area information, whichever of the first area information and the second area information corresponding to the same frame has the higher confidence.

The unmanned aerial vehicle of claim 16, wherein the frame number of the first area information currently returned by the terminal device is N, the frame number of the image currently acquired by the shooting device is M, and M is greater than N; and the generating target area information according to the first area information and the second area information includes:

determining reference area information according to the first area information and the second area information corresponding to the Nth frame;

and determining the target area information according to the reference area information and the second area information in the images after the Nth frame.

The unmanned aerial vehicle of claim 18, wherein the determining the target area information according to the reference area information and the second area information in the images after the Nth frame comprises:

and determining the target area information according to the reference area information and the average value of the second area information in the images after the Nth frame.

The unmanned aerial vehicle of claim 18, wherein the code is further configured to:

and determining the second area information in the images after the Mth frame according to the target area information.

The unmanned aerial vehicle of claim 13, wherein the terminal device is a remote controller or a central control platform of the unmanned aerial vehicle.

A terminal device, comprising:

a storage device for storing code;

a processor for executing code stored by the storage device to communicate with a movable platform, the code configured to:

receiving an image of the environment where a target object is located, wherein the image is acquired and sent by the movable platform;

determining first area information of the target object in the image;

and sending the first area information to the movable platform so that the movable platform determines orientation information of the target object relative to the movable platform according to the first area information, wherein the orientation information is used for controlling the movable platform to follow the target object.

The terminal device of claim 22, wherein the code is further configured to send, to the movable platform, the first area information together with the frame number of the image corresponding to the first area information.

The terminal device according to claim 22 or 23, wherein the movable platform is an unmanned aerial vehicle, and the terminal device is a remote controller or a central control platform of the unmanned aerial vehicle.

A flight control system, comprising:

an unmanned aerial vehicle provided with a shooting device;

a flight remote controller in wireless communication with the unmanned aerial vehicle, for executing the following code:

acquiring, through the unmanned aerial vehicle, an image of the environment where a target object is located;

determining first area information of the target object in the image, wherein the first area information comprises an area position and an area size of the target object in the image;

and synchronizing the first area information to the unmanned aerial vehicle so that the unmanned aerial vehicle determines orientation information of the target object relative to the unmanned aerial vehicle according to the first area information, wherein the orientation information is used for controlling the unmanned aerial vehicle to follow the target object.

The flight control system of claim 25, wherein the code is further configured to send, to the unmanned aerial vehicle, the first area information together with the frame number of the image corresponding to the first area information.
