Control method, apparatus, device and storage medium for a movable platform

Document No.: 1189232 | Publication date: 2020-09-22

Note: This technology, "Control method, apparatus, device and storage medium for a movable platform", was designed and created by 周游, 刘洁 and 陆正茂 on 2019-07-01. Its main content is as follows: An embodiment of the invention provides a control method, apparatus, device, and storage medium for a movable platform. A first image captured by the camera in a target area at the current moment is acquired, and a second image matching the first image is determined from a plurality of historical images. From the first image and the second image, the position information of the first position point, where the movable platform is located at the current moment, relative to the second position point, where the movable platform was located when the camera captured the second image, is determined. From this relative position information, the position of the first position point relative to the historical trajectory along which the movable platform moved in the target area during a historical time can be determined; the unmanned aerial vehicle can then be positioned against that trajectory, improving the positioning accuracy of the movable platform.

1. A method of controlling a movable platform, the movable platform including a camera, the method comprising:

acquiring a first image captured by the camera in a target area at the current moment;

determining a second image matching the first image among a plurality of historical images captured by the camera while the movable platform is moving in the target area over a historical time;

determining, according to the first image and the second image, position information of a first position point where the movable platform is located at the current moment relative to a second position point where the movable platform was located when the camera captured the second image;

and controlling the movable platform to move in the target area according to the position information of the first position point where the movable platform is located at the current moment relative to the second position point where the movable platform was located when the camera captured the second image.

2. The method of claim 1, wherein prior to the obtaining the first image captured by the camera in the target area at the current time, the method further comprises:

and determining the historical track of the movable platform moving in the target area in the historical time according to the position information and/or the posture information of the movable platform in the moving process of the movable platform in the target area in the historical time.

3. The method of claim 2, wherein after determining the historical trajectory of movement of the movable platform in the target region over the historical time, the method further comprises:

determining a plurality of points of interest in the historical track.

4. The method of claim 3, wherein the movable platform rotates one revolution in place at the point of interest; or

the movable platform vibrates at the point of interest; or

when the movable platform is located at the point of interest, a preset button on the movable platform is triggered.

5. The method of claim 3 or 4, wherein the plurality of historical images comprises images captured by the camera while the movable platform is rotating in place for one revolution at the point of interest.

6. The method according to any one of claims 3-5, wherein the obtaining a first image taken by the camera in the target area at the current time comprises:

controlling the movable platform to take off from one of the plurality of points of interest at the current moment;

and acquiring a first image shot by the shooting device when the movable platform rotates one revolution in place at the point of interest.

7. The method according to any one of claims 2-6, wherein controlling the movable platform to move in the target area according to the position information of the first position point where the movable platform is located at the current time relative to the second position point where the movable platform is located when the photographing device photographs the second image comprises:

determining the position information of a first position point in a preset coordinate system according to the position information of the first position point of the movable platform at the current moment relative to a second position point of the movable platform when the shooting device shoots the second image, wherein the preset coordinate system is the coordinate system of the historical track;

and controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system.

8. The method of claim 7, wherein the determining the position information of the first position point in the preset coordinate system according to the position information of the first position point where the movable platform is located at the current moment relative to the second position point where the movable platform is located when the shooting device shoots the second image comprises:

and determining the position information of the first position point in the preset coordinate system according to the position information of the first position point of the movable platform relative to the second position point of the movable platform when the shooting device shoots the second image at the current moment and the position information of the second position point in the preset coordinate system.

9. The method according to claim 7 or 8, wherein the controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system comprises:

and controlling, according to the position information of the first position point in the preset coordinate system and at least one target interest point selected by a user from a plurality of interest points, the movable platform to move from the first position point and pass through the at least one target interest point.

10. The method according to claim 7 or 8, wherein the controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system comprises:

determining a track point which is closest to the first position point in the historical track according to the position information of the first position point in the preset coordinate system;

controlling the movable platform to move from the first position point to the track point;

when the movable platform is located at the track point, the movable platform is controlled to move according to at least one target interest point selected from a plurality of interest points by a user, so that the movable platform passes through the at least one target interest point.

11. The method according to claim 7 or 8, wherein the controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system comprises:

controlling the movable platform to move from the first position point to the second position point according to the position information of the first position point in the preset coordinate system;

when the movable platform is located at the second position point, controlling the movable platform to move according to at least one target interest point selected from the plurality of interest points by a user so that the movable platform passes through the at least one target interest point.

12. The method of claim 10 or 11, wherein said controlling the movement of the movable platform to cause the movable platform to pass the at least one target point of interest comprises:

and controlling the movable platform to move according to at least a partial track of the historical track, wherein the partial track comprises the at least one target interest point, so that the movable platform passes through the at least one target interest point.

13. The method of any of claims 1-12, wherein determining a second image of the plurality of historical images that matches the first image comprises:

and determining a second image matched with the first image in the plurality of historical images according to the feature point of each historical image in the plurality of historical images and the feature point of the first image, wherein the feature point of the first image is matched with the feature point of the second image.

14. The method of any one of claims 1-13, wherein the movable platform comprises at least one of:

a mobile robot, an unmanned aerial vehicle.
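The nearest-track-point step in claim 10 is a standard nearest-neighbour search over the recorded trajectory. A minimal sketch, with positions simplified to 2-D and all names illustrative rather than taken from the disclosure:

```python
import math

def closest_track_point(first_pt, historical_track):
    """Return the trajectory point nearest the first position point.
    (Hypothetical helper; the patent only specifies the selection criterion.)"""
    return min(historical_track, key=lambda p: math.dist(p, first_pt))

# A short recorded trajectory and the platform's current position:
track = [(0.0, 0.0), (2.0, 0.0), (2.0, 3.0)]
nearest = closest_track_point((1.8, 0.5), track)
# nearest == (2.0, 0.0)
```

The platform would then be commanded to move to `nearest` before following the historical trajectory through the selected points of interest.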

15. A method of controlling a movable platform, the method comprising:

acquiring a selection operation of a user on at least one control displayed by a control terminal, wherein each of the at least one control is used for controlling a movable platform to complete a task;

generating a control instruction stream according to the selection sequence of the user for the at least one control;

and sending the control instruction stream to the movable platform so that the movable platform executes the task corresponding to the at least one control according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area in historical time.

16. The method according to claim 15, wherein before the obtaining of the user selection operation on the at least one control displayed by the control terminal, the method further comprises:

controlling the movable platform to move in the target area within the historical time so that the movable platform acquires the plurality of historical images captured by the capturing device mounted on the movable platform.

17. The method of claim 16, wherein a historical trajectory of movement of the movable platform in the target region over the historical time includes a plurality of points of interest;

the at least one control is to control the movable platform to move to at least one of the plurality of points of interest.

18. A control device for a movable platform, comprising: a memory and a processor;

the memory is used for storing program code;

the processor, invoking the program code, when executed, is configured to:

acquiring a first image shot in a target area by a shooting device carried on the movable platform at the current moment;

determining a second image matching the first image among a plurality of historical images captured by the camera while the movable platform is moving in the target area over a historical time;

according to the first image and the second image, determining position information of a first position point where the movable platform is located at the current moment relative to a second position point where the movable platform is located when the shooting device shoots the second image;

and controlling the movable platform to move in the target area according to the position information of the first position point of the movable platform at the current moment relative to the second position point of the movable platform when the shooting device shoots the second image.

19. The control device of claim 18, wherein the processor, prior to acquiring the first image captured by the capturing device in the target area at the current time, is further configured to:

and determining the historical track of the movable platform moving in the target area in the historical time according to the position information and/or the posture information of the movable platform in the moving process of the movable platform in the target area in the historical time.

20. The control apparatus of claim 19, wherein the processor, after determining the historical trajectory of movement of the movable platform in the target area over the historical time, is further configured to:

determine a plurality of points of interest in the historical track.

21. The control device of claim 20, wherein the movable platform rotates one revolution in place at the point of interest; or

the movable platform vibrates at the point of interest; or

when the movable platform is located at the point of interest, a preset button on the movable platform is triggered.

22. The control device of claim 20 or 21, wherein the plurality of historical images comprises images captured by the camera while the movable platform is rotating in place for one revolution at the point of interest.

23. The control device according to any one of claims 20 to 22, wherein the processor, when acquiring the first image captured by the capturing device in the target area at the current time, is specifically configured to:

controlling the movable platform to take off from one of the plurality of points of interest at the current moment;

and acquiring a first image shot by the shooting device when the movable platform rotates one revolution in place at the point of interest.

24. The control device according to any one of claims 19 to 23, wherein the processor is configured to control the movable platform to move in the target area based on the position information of the first position point at which the movable platform is located at the current time relative to the second position point at which the movable platform is located at the time when the capturing device captures the second image, and is specifically configured to:

determining the position information of a first position point in a preset coordinate system according to the position information of the first position point of the movable platform at the current moment relative to a second position point of the movable platform when the shooting device shoots the second image, wherein the preset coordinate system is the coordinate system of the historical track;

and controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system.

25. The control device according to claim 24, wherein the processor is configured to, when determining the position information of the first position point in the preset coordinate system according to the position information of the first position point where the movable platform is located at the current time relative to the second position point where the movable platform is located when the shooting device is shooting the second image, specifically:

and determining the position information of the first position point in the preset coordinate system according to the position information of the first position point of the movable platform relative to the second position point of the movable platform when the shooting device shoots the second image at the current moment and the position information of the second position point in the preset coordinate system.

26. The control device according to claim 24 or 25, wherein the processor is configured to control the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, and is specifically configured to:

and controlling, according to the position information of the first position point in the preset coordinate system and at least one target interest point selected by a user from a plurality of interest points, the movable platform to move from the first position point and pass through the at least one target interest point.

27. The control device according to claim 24 or 25, wherein the processor is configured to control the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, and is specifically configured to:

determining a track point which is closest to the first position point in the historical track according to the position information of the first position point in the preset coordinate system;

controlling the movable platform to move from the first position point to the track point;

when the movable platform is located at the track point, the movable platform is controlled to move according to at least one target interest point selected from a plurality of interest points by a user, so that the movable platform passes through the at least one target interest point.

28. The control device according to claim 24 or 25, wherein the processor is configured to control the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, and is specifically configured to:

controlling the movable platform to move from the first position point to the second position point according to the position information of the first position point in the preset coordinate system;

when the movable platform is located at the second position point, controlling the movable platform to move according to at least one target interest point selected from the plurality of interest points by a user so that the movable platform passes through the at least one target interest point.

29. The control device according to claim 27 or 28, wherein the processor controls the movement of the movable platform such that the movable platform passes the at least one target point of interest, in particular:

and controlling the movable platform to move according to at least a partial track of the historical track, wherein the partial track comprises the at least one target interest point, so that the movable platform passes through the at least one target interest point.

30. The control device according to any one of claims 18 to 29, wherein the processor, when determining a second image of the plurality of history images that matches the first image, is configured to:

and determining a second image matched with the first image in the plurality of historical images according to the feature point of each historical image in the plurality of historical images and the feature point of the first image, wherein the feature point of the first image is matched with the feature point of the second image.

31. A movable platform, comprising:

a body;

a power system, arranged on the body and used for providing power;

a shooting device, arranged on the body and used for shooting images; and

a control device as claimed in any one of claims 18 to 30.

32. The movable platform of claim 31, wherein the movable platform comprises at least one of:

a mobile robot, an unmanned aerial vehicle.

33. A control terminal, comprising: the display device comprises a display component, a memory, a processor and a communication interface;

wherein the display component is used for displaying a control;

the memory is used for storing program code;

the processor, invoking the program code, when executed, is configured to:

acquiring selection operation of a user on at least one control displayed by the display assembly, wherein each control in the at least one control is used for controlling the movable platform to complete a task;

generating a control instruction stream according to the selection sequence of the user for at least one control;

and sending the control instruction stream to the movable platform through the communication interface so that the movable platform moves in a target area according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area in historical time.

34. The control terminal of claim 33, wherein the processor, prior to obtaining user selection of the at least one control displayed by the display component, is further configured to:

controlling the movable platform to move in the target area within the historical time so that the movable platform acquires the plurality of historical images captured by the capturing device mounted on the movable platform.

35. The control terminal of claim 34, wherein the historical trajectory of the movement of the movable platform in the target area over the historical time includes a plurality of points of interest; the at least one control is to control the movable platform to move to at least one of the plurality of points of interest.

36. A computer-readable storage medium, having stored thereon a computer program for execution by a processor to perform the method of any one of claims 1-17.

Technical Field

The embodiment of the invention relates to the technical field of control, in particular to a method, a device, equipment and a storage medium for controlling a movable platform.

Background

Disclosure of Invention

The embodiment of the invention provides a control method, a control device, control equipment and a storage medium of a movable platform, so as to realize navigation independent of a GPS.

A first aspect of an embodiment of the present invention provides a method for controlling a movable platform, including:

acquiring a first image shot by the shooting device in a target area at the current moment;

determining a second image matching the first image among a plurality of historical images captured by the camera while the movable platform is moving in the target area over a historical time;

according to the first image and the second image, determining position information of a first position point where the movable platform is located at the current moment relative to a second position point where the movable platform is located when the shooting device shoots the second image;

and controlling the movable platform to move in the target area according to the position information of the first position point of the movable platform at the current moment relative to the second position point of the movable platform when the shooting device shoots the second image.

A second aspect of an embodiment of the present invention is to provide a method for controlling a movable platform, including:

acquiring a selection operation of a user on at least one control, wherein each of the at least one control is used for controlling a movable platform to complete a task;

generating a control instruction stream according to the selection sequence of the user for the at least one control displayed by the control terminal;

and sending the control instruction stream to the movable platform so that the movable platform executes the task corresponding to the at least one control according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area in historical time.
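The second aspect's steps, reading the user's control selections and emitting an ordered instruction stream, can be sketched as follows. All names here (`Control`, `build_instruction_stream`, the `MOVE_TO:` instruction format) are hypothetical illustrations, not part of the disclosed terminal:

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str          # label shown on the control terminal (hypothetical)
    instruction: str   # instruction this control maps to (hypothetical)

def build_instruction_stream(selected):
    """Order the instruction stream by the user's selection sequence."""
    return [c.instruction for c in selected]

# The user taps two controls in sequence; the terminal emits one ordered
# stream to be sent to the movable platform.
selected = [Control("go_living_room", "MOVE_TO:living_room"),
            Control("go_kitchen", "MOVE_TO:kitchen")]
stream = build_instruction_stream(selected)
# stream == ["MOVE_TO:living_room", "MOVE_TO:kitchen"]
```

On the platform side, each instruction in the stream would be resolved against the historical images to execute the corresponding task.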

A third aspect of embodiments of the present invention provides a control apparatus for a movable platform, including: a memory and a processor;

the memory is used for storing program code;

the processor, invoking the program code, when executed, is configured to:

acquiring a first image shot by the shooting device in a target area at the current moment;

determining a second image matching the first image among a plurality of historical images captured by the camera while the movable platform is moving in the target area over a historical time;

according to the first image and the second image, determining position information of a first position point where the movable platform is located at the current moment relative to a second position point where the movable platform is located when the shooting device shoots the second image;

and controlling the movable platform to move in the target area according to the position information of the first position point of the movable platform at the current moment relative to the second position point of the movable platform when the shooting device shoots the second image.

A fourth aspect of an embodiment of the present invention provides a movable platform, including:

a body;

a power system, arranged on the body and used for providing power;

a shooting device, arranged on the body and used for shooting images; and

the control device according to the third aspect.

A fifth aspect of an embodiment of the present invention provides a control terminal, including: the display device comprises a display component, a memory, a processor and a communication interface;

wherein the display component is used for displaying a control;

the memory is used for storing program code;

the processor, invoking the program code, when executed, is configured to:

acquiring selection operation of a user on at least one control displayed by the display assembly, wherein each control in the at least one control is used for controlling the movable platform to complete a task;

generating a control instruction stream according to the selection sequence of the user for at least one control;

and sending the control instruction stream to the movable platform through the communication interface so that the movable platform executes a task corresponding to the at least one control according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area in historical time.

A sixth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program for execution by a processor to implement the method according to the first or second aspect.

According to the method, apparatus, device, and storage medium for controlling a movable platform provided by the embodiments, a first image captured by the camera in the target area at the current moment is acquired, and a second image matching the first image is determined from a plurality of historical images. From the first image and the second image, the position information of the first position point, where the movable platform is located at the current moment, relative to the second position point, where the movable platform was located when the camera captured the second image, is determined. From this relative position information, the position of the first position point relative to the historical trajectory along which the movable platform moved in the target area can be determined, and the movable platform (for example, an unmanned aerial vehicle) can be positioned against that trajectory. This improves the positioning accuracy of the movable platform and enables navigation that does not depend on GPS.
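The positioning chain summarized above (claim 8 spells out the same step) reduces to adding the image-derived relative offset to the second point's known coordinates in the historical-trajectory frame. A minimal sketch, with poses simplified to 2-D positions and all names hypothetical:

```python
import numpy as np

def locate_in_trajectory_frame(rel_pos, second_pt):
    """Position of the first point in the historical-trajectory coordinate
    system: the second point's known coordinates plus the relative offset
    recovered from matching the first and second images."""
    return np.asarray(second_pt, dtype=float) + np.asarray(rel_pos, dtype=float)

# Suppose image matching established that the second image was captured at
# trajectory coordinates (2.0, 1.0), and the first point sits at offset
# (0.5, -0.2) relative to it.
p1 = locate_in_trajectory_frame([0.5, -0.2], [2.0, 1.0])
# p1 == array([2.5, 0.8])
```

In the full method the relative pose would be a 3-D translation (and rotation) estimated from matched feature points, but the coordinate bookkeeping is the same.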

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below illustrate some embodiments of the present invention; other drawings can be obtained by those skilled in the art from these drawings without inventive effort.

Fig. 1 is a schematic diagram of an unmanned aerial vehicle according to an embodiment of the present invention;

FIG. 2 is a flowchart of a method for controlling a movable platform according to an embodiment of the present invention;

FIG. 3 is a diagram illustrating a history track according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of another historical track provided by an embodiment of the invention;

FIG. 5 is a schematic diagram of three-dimensional points and feature points provided by an embodiment of the invention;

FIG. 6 is a flowchart of a method for controlling a movable platform according to another embodiment of the present invention;

fig. 7 is a schematic view of a flight trajectory of an unmanned aerial vehicle according to an embodiment of the present invention;

fig. 8 is a schematic view of another flight trajectory of the unmanned aerial vehicle according to the embodiment of the present invention;

FIG. 9 is a flowchart of a method for controlling a movable platform according to another embodiment of the present invention;

fig. 10 is a schematic diagram of a programming implementation of the drone according to an embodiment of the present invention;

FIG. 11 is a block diagram of a control apparatus for a movable platform according to an embodiment of the present invention;

fig. 12 is a structural diagram of a control terminal according to another embodiment of the present invention.

Reference numerals:

11: a main camera; 12: a forward looking binocular system; 50: a target object;

51: a first image; 52: a second image; 110: a control device;

111: a memory; 112: a processor; 120: a control terminal;

121: a display component; 122: a memory; 123: a processor;

124: a communication interface.

Detailed Description

The technical solutions in the embodiments of the present invention will be described clearly below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When a component is referred to as being "connected" to another component, it can be directly connected to the other component or intervening components may also be present.

Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.

The embodiment of the invention provides a control method for a movable platform. Optionally, the movable platform includes at least one of: a mobile robot, an unmanned aerial vehicle. This embodiment uses an unmanned aerial vehicle as an example. As shown in fig. 1, the drone includes a main camera 11 and a forward-looking binocular system 12. The camera on the drone may specifically be the main camera 11 shown in fig. 1. It is to be understood that this is for illustrative purposes only and does not limit the number of cameras carried by the drone or the types of other sensing systems the drone may carry.

Fig. 2 is a flowchart of a method for controlling a movable platform according to an embodiment of the present invention. The method of this embodiment may be executed by a control device in the movable platform, that is, a device that controls the movement of the movable platform. In this embodiment, the movable platform includes a shooting device, which may be a camera, a video camera, or the like. In other embodiments, the method may also be executed by a control terminal corresponding to the movable platform, such as a remote controller, a tablet computer, or a smartphone.

As shown in fig. 2, the method in this embodiment may include:

step S201, acquiring a first image shot by the shooting device in the target area at the current moment.

In this embodiment, the drone may fly in a target area, which may be, for example, a home or a park; a home is used here as an illustrative example.

Optionally, before acquiring the first image captured by the shooting device in the target area at the current time, the method may include: determining, according to position information and/or attitude information of the movable platform while it moves in the target area during the historical time, the historical track along which the movable platform moved in the target area during that time.

As shown in fig. 3, the target area includes a living room, a kitchen, a secondary bedroom, a master bedroom, a balcony, a study, a toilet, and the like. During the historical time, the user may carry the drone by hand through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence; alternatively, the user may control the drone through its remote controller to fly in the target area, passing through those rooms in the same order. While the drone passes through the rooms in sequence, its position information can be recorded in real time by a Global Positioning System (GPS), a Visual Odometer (VO), or a Visual-Inertial Odometer (VIO) on the drone. It can be understood that, when there is no GPS signal in the target area, the position information of the drone can be recorded in real time by the VO or VIO.

In addition, the drone may also be provided with an Inertial Measurement Unit (IMU), which can be used to detect the attitude of the drone. According to the real-time position information and/or attitude of the drone, the flight path along which the drone passes through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence can be determined; this flight path is recorded here as the historical track.

Optionally, after determining the historical track along which the movable platform moved in the target area during the historical time, the method further includes: determining a plurality of points of interest on the historical track. Optionally, the movable platform rotates in place for one full circle at a point of interest; or the movable platform vibrates at a point of interest; or a preset button on the movable platform is triggered while the movable platform is located at a point of interest.

For example, while the drone passes through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence, the user may further mark points of interest on the historical track. As shown in fig. 3, point A, point B, point C, point D, point E, point F, and point G are the interest points on the historical track.

As one possible approach, the drone may rotate in place for one full revolution, i.e., 360 degrees, at a point of interest; that is, when the drone rotates in place for one revolution at a certain point, the drone records that point as a point of interest. Optionally, while the drone rotates in place at the point of interest, the main camera on the drone may also photograph the surrounding environment. Optionally, the plurality of historical images include images captured by the camera while the movable platform rotates in place for one revolution at a point of interest. In other words, the plurality of historical images taken by the main camera include images taken at each point of interest; however, the main camera is not limited to photographing the surroundings only at the interest points, and may also do so when the drone is not at a point of interest. Therefore, the plurality of historical images may include not only images taken at the interest points but also images taken at other track points on the historical track.
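The full-revolution check described above can be sketched as follows. This is a minimal illustration, not part of the patent's disclosure; the function name and the assumption that IMU yaw readings arrive as heading angles in degrees are hypothetical:

```python
def detects_full_rotation(yaw_readings_deg, threshold_deg=360.0):
    """Return True once the cumulative in-place yaw change reaches a full
    revolution. yaw_readings_deg is a sequence of heading angles in degrees
    (e.g. sampled from the IMU); wrap-around at +/-180 degrees is handled."""
    total = 0.0
    for prev, curr in zip(yaw_readings_deg, yaw_readings_deg[1:]):
        delta = curr - prev
        # unwrap the jump across the +/-180 degree boundary
        if delta > 180.0:
            delta -= 360.0
        elif delta < -180.0:
            delta += 360.0
        total += delta
        if abs(total) >= threshold_deg:
            return True
    return False
```

In practice the check would run continuously, so that a point is only recorded as a point of interest while the drone hovers at it.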

As another possible way, if the user carries the drone by hand through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence, the user can gently shake the drone when it passes a point of interest, causing the drone to vibrate there. That is, when the drone detects a vibration at a certain point, it records that point as a point of interest.

As another possible way, when the drone passes a point of interest, the user may also press a preset button on the drone, or a preset button on the remote controller of the drone. That is, when the preset button is triggered, for example pressed, the drone's current location is recorded as a point of interest.

In addition, while the drone passes through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence, the main camera on the drone can capture images periodically, recording the environment around and below the drone. The images captured by the main camera during this traversal are recorded as historical images. The forward-looking binocular system on the drone may be used to calculate depth information for each historical image as it is captured by the main camera. Meanwhile, the GPS, VO, or VIO on the drone can record the position information of the drone at the moment the main camera captures each historical image. That is to say, each historical image captured by the main camera corresponds to its depth information and to the position information of the drone at the moment of capture.

Due to long-term accumulation, the position information of the drone output by the VO or VIO may drift, i.e., deviate more and more from the true value; the historical track of the drone therefore needs to be corrected. Since the drone actually passes through the same place, namely a room door, when entering or exiting a room, the historical track can be corrected by a loop-closure detection (Loop) algorithm; the corrected historical track is shown in fig. 4. In addition, in this embodiment, a three-dimensional coordinate system is established with a certain point of the target area as the coordinate origin. For example, as shown in fig. 4, the lower left corner of the target area is taken as the coordinate origin; the direction pointing north from the origin is the X-axis of the three-dimensional coordinate system, the direction pointing east is the Y-axis, and the direction through the origin perpendicular to both the X-axis and the Y-axis is the Z-axis (not shown). This three-dimensional coordinate system is taken as the preset coordinate system. It can be understood that each point on the historical track corresponds to a three-dimensional coordinate in this coordinate system; since fig. 4 is a top view, height information is not shown, but this embodiment does not limit the height of the points on the historical track. For example, the point of interest D in the master bedroom is 5 meters north (X-axis) and 8 meters east (Y-axis) of the origin.
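The loop-closure correction can be illustrated with a minimal sketch that linearly distributes the accumulated drift along the trajectory once a revisit of the same place (e.g. the room door) has been recognized. Real loop-closure detection involves place recognition and pose-graph optimization and is considerably more involved; the function name and inputs here are hypothetical:

```python
import numpy as np

def correct_drift(track, loop_start_idx, loop_end_idx):
    """Linearly distribute the loop-closure error along a trajectory segment.

    track: (N, 3) array of VO/VIO positions. The drone is known to revisit
    at loop_end_idx the same physical spot it occupied at loop_start_idx,
    so any difference between the two recorded positions is accumulated
    drift. The correction ramps from zero at the start of the loop to the
    full error at its end."""
    track = np.asarray(track, dtype=float).copy()
    error = track[loop_end_idx] - track[loop_start_idx]
    n = loop_end_idx - loop_start_idx
    for i in range(loop_start_idx, loop_end_idx + 1):
        alpha = (i - loop_start_idx) / n
        track[i] -= alpha * error
    # points recorded after the loop are shifted by the full error
    track[loop_end_idx + 1:] -= error
    return track
```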

Optionally, the acquiring a first image shot by the shooting device in the target area at the current time includes: controlling the movable platform to take off from one of the plurality of points of interest at a current moment; and acquiring a first image shot by the shooting device when the movable platform rotates for one circle in situ at the point of interest.

After the historical track and the interest points of the drone have been recorded, the next time the drone is powered on for use, the user can place it anywhere in the target area; specifically, the user can place it at one of the interest points in the target area, such as point A. The drone takes off from point A at the current moment; after takeoff, it can photograph the surrounding environment through the main camera, and the image captured by the main camera at the current moment is recorded as the first image.

Step S202, determining a second image matched with the first image in a plurality of historical images, wherein the plurality of historical images are shot by the shooting device when the movable platform moves in the target area in historical time.

Because the attitude of the drone, or of the main camera on the drone, differs at different moments, the images captured by the main camera at different moments may differ. For example, a historical image taken by the main camera at the point of interest A while the drone traversed the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet during the historical time may differ from the first image taken by the main camera at point A at the current moment. Therefore, the second image that best matches the first image can be determined from the plurality of historical images taken by the main camera. Additionally, in other embodiments, the takeoff point of the drone at the current moment need not be exactly the point of interest A; for example, the takeoff point may be near or around point A.

Optionally, the determining a second image in the plurality of history images that matches the first image includes: and determining a second image matched with the first image in the plurality of historical images according to the feature point of each historical image in the plurality of historical images and the feature point of the first image, wherein the feature point of the first image is matched with the feature point of the second image.

For example, feature points of each historical image and of the first image are detected using a feature extraction algorithm, such as the Scale-Invariant Feature Transform (SIFT) algorithm, the Speeded-Up Robust Features (SURF) algorithm, or the ORB (Oriented FAST and Rotated BRIEF) algorithm. Further, the feature points of the first image are matched against the feature points of each historical image, and the second image that best matches the first image is determined from the plurality of historical images. It is understood that the matching degree between the feature points of the first image and those of the second image is the maximum.
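The selection of the best-matching historical image can be sketched as follows, assuming binary descriptors (such as ORB produces) have already been extracted for each image. Hamming-distance matching with Lowe's ratio test is a standard technique, but the function names and the brute-force loop here are illustrative only:

```python
import numpy as np

def count_matches(desc_a, desc_b, ratio=0.8):
    """Count ratio-test matches between two sets of binary descriptors
    (uint8 arrays of shape (n, 32), e.g. 256-bit ORB descriptors)."""
    if len(desc_a) == 0 or len(desc_b) < 2:
        return 0
    matches = 0
    for d in desc_a:
        # Hamming distance from d to every descriptor in desc_b
        dists = np.unpackbits(np.bitwise_xor(desc_b, d), axis=1).sum(axis=1)
        best, second = np.partition(dists, 1)[:2]
        if best < ratio * second:  # Lowe's ratio test
            matches += 1
    return matches

def best_matching_image(first_desc, history_descs):
    """Index of the historical image whose descriptors best match those
    of the first image (most ratio-test matches)."""
    scores = [count_matches(first_desc, h) for h in history_descs]
    return int(np.argmax(scores))
```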

Step S203, determining, according to the first image and the second image, position information of a first position point where the movable platform is located at the current time relative to a second position point where the movable platform is located when the photographing device photographs the second image.

As shown in fig. 5, 50 denotes a target object in the target area, 51 denotes the first image captured by the main camera of the drone at the current moment, and 52 denotes the second image, among the plurality of historical images captured by the main camera, that matches the first image. Points H, I, and J are three-dimensional points on the target object 50, and the three-dimensional points on the target object 50 can be mapped into the first image 51 and the second image 52. For example, points H1, J1, and I1 represent feature points in the first image 51, with point H1 corresponding to point H, point J1 to point J, and point I1 to point I. Points H2, J2, and I2 represent feature points in the second image 52, with point H2 corresponding to point H, point J2 to point J, and point I2 to point I. It is understood that the mapping points of the same three-dimensional point on the target object 50 may occupy different positions in different images; for example, point H1, the mapping of point H in the first image 51, and point H2, the mapping of point H in the second image 52, are at different positions within their respective images.

According to the conversion relationship between the world coordinate system and the pixel plane coordinate system, the relationship between the three-dimensional coordinates (x_w, y_w, z_w) of a three-dimensional point on the target object 50 in the world coordinate system and the position information of the mapping point of that three-dimensional point in the second image 52, such as its pixel coordinates (μ, ν), can be obtained, as shown in the following formula (1):

z_c · (μ, ν, 1)^T = K · (R · (x_w, y_w, z_w)^T + T)    (1)

where z_c represents the coordinate of the three-dimensional point on the Z axis of the camera coordinate system; here, the camera coordinate system is that of the main camera when it captured the second image 52, i.e., z_c represents the depth information of the second image 52. K represents the intrinsic parameters of the main camera, R represents the rotation matrix of the camera coordinate system relative to the world coordinate system, and T represents the translation matrix of the camera coordinate system relative to the world coordinate system. In this embodiment, the intrinsic matrix K of the main camera is a known quantity. Optionally, the world coordinate system may be the preset coordinate system. The camera coordinate system of the main camera when capturing the second image 52 can be determined from the pose of the main camera at that moment and its three-dimensional coordinates in the preset coordinate system. Further, from the camera coordinate system and the preset coordinate system, the rotation matrix R of the camera coordinate system relative to the preset coordinate system and the translation matrix T of the camera coordinate system relative to the preset coordinate system can be calculated. Then, from K, (μ, ν), z_c, R, and T, the three-dimensional coordinates (x_w, y_w, z_w) of the three-dimensional point on the target object 50 in the world coordinate system can be calculated.
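Formula (1), and its inversion given known depth, can be sketched as follows; the function names are hypothetical, and column-vector pinhole-camera conventions are assumed:

```python
import numpy as np

def project(K, R, T, p_world):
    """Formula (1): map a 3-D world point to pixel coordinates.
    Returns (u, v) and the depth z_c along the camera Z axis."""
    p_cam = R @ p_world + T          # world frame -> camera frame
    z_c = p_cam[2]
    uvw = K @ p_cam                  # apply intrinsics
    return uvw[:2] / z_c, z_c

def back_project(K, R, T, uv, z_c):
    """Invert formula (1): recover the world point from pixel
    coordinates (u, v) and known depth z_c (e.g. from the
    forward-looking binocular system)."""
    p_cam = z_c * (np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0]))
    return R.T @ (p_cam - T)         # camera frame -> world frame
```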

It is understood that the three-dimensional coordinates of the main camera in the preset coordinate system when it took the first image 51 may differ from those when it took the second image 52, and its pose at the two moments may also differ; therefore, the camera coordinate system of the main camera when taking the first image 51 may differ from that when taking the second image 52.

In this embodiment, it may be assumed that the target object 50 is fixed in the preset coordinate system. Then, based on the depth information of the first image 51, the position information (such as pixel coordinates) of the mapping points of the three-dimensional points on the target object 50 in the first image 51, the intrinsic matrix K of the main camera, and the three-dimensional coordinates (x_w, y_w, z_w) of those three-dimensional points in the world coordinate system, the rotation matrix and the translation matrix of the camera coordinate system of the main camera at the time of taking the first image 51 relative to the preset coordinate system can be calculated using the same principle as in formula (1). Further, the translation matrix of the camera coordinate system at the time of the first image 51 relative to the camera coordinate system at the time of the second image 52 can be determined from the translation matrices of each of those two camera coordinate systems relative to the preset coordinate system.
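The displacement between the two camera poses can be sketched from the per-view (R, T) pairs via the camera centers, using the standard identity that the camera center in world coordinates is -R^T · T. The function names are hypothetical:

```python
import numpy as np

def camera_center(R, T):
    """Position of the camera in the preset (world) coordinate system,
    given the rotation R and translation T of that camera's coordinate
    system relative to the preset one (p_cam = R @ p_world + T)."""
    return -R.T @ T

def relative_position(R1, T1, R2, T2):
    """Displacement of the first camera pose (first image) relative to
    the second camera pose (matched historical image), expressed in the
    preset coordinate system."""
    return camera_center(R1, T1) - camera_center(R2, T2)
```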

Assuming that, at any given moment, the position of the main camera in the preset coordinate system coincides with that of the drone, the position information of the first position point, where the drone is located at the current moment (when the main camera takes the first image 51), relative to the second position point, where the drone was located when the main camera took the second image 52, can be determined from the translation matrix of the camera coordinate system at the time of the first image 51 relative to the camera coordinate system at the time of the second image 52.

Alternatively, when the installation position of the main camera on the drone is fixed, the position of the drone in the preset coordinate system can be determined from the position of the main camera at the same moment. From the translation matrix of the camera coordinate system at the time of the first image 51 relative to that at the time of the second image 52, the displacement of the main camera's current position relative to its position when taking the second image 52 can be determined; from this displacement, the position information of the first position point relative to the second position point can then be determined.

Step S204, controlling the movable platform to move in the target area according to the position information of the first position point, where the movable platform is located at the current moment, relative to the second position point, where the movable platform was located when the shooting device captured the second image.

In the present embodiment, since the second position point, where the drone was located when the main camera took the second image 52, is a track point on the historical track, the three-dimensional coordinates of the drone in the preset coordinate system at that moment are known. Therefore, from the position information of the first position point relative to the second position point and the known coordinates of the second position point, the three-dimensional coordinates of the drone in the preset coordinate system when the main camera takes the first image 51 can be determined; that is, the position of the takeoff point in the preset coordinate system at the current moment is determined, and the drone is thereby positioned. Further, this position information of the takeoff point can be used to control the drone to fly in the target area. For example, as shown in fig. 4, the position information of each interest point in the preset coordinate system is known; therefore, according to the position of the takeoff point and the positions of the interest points, the drone can be controlled to fly in the target area so that it passes through at least one of the plurality of interest points.

According to this embodiment, the first image captured by the shooting device in the target area at the current moment is acquired, and the second image matching the first image is determined from the plurality of historical images. From the first image and the second image, the position information of the first position point, where the movable platform is located at the current moment, relative to the second position point, where it was located when the shooting device captured the second image, is determined. From this relative position information, the position of the first position point relative to the historical track along which the movable platform moved in the target area during the historical time can be determined, and the movable platform can be positioned according to the historical track, thereby improving its positioning accuracy.

The embodiment of the invention provides a control method of a movable platform. Fig. 6 is a flowchart of a method for controlling a movable platform according to another embodiment of the present invention. As shown in fig. 6, on the basis of the above embodiment, the controlling the movable platform to move in the target area according to the position information of the first position point where the movable platform is located at the current time relative to the second position point where the movable platform is located when the shooting device shoots the second image may include:

step S601, determining position information of the first position point in a preset coordinate system according to position information of the first position point where the movable platform is located at the current time relative to a second position point where the movable platform is located when the shooting device shoots the second image, where the preset coordinate system is a coordinate system where the historical track is located.

Optionally, the determining, according to the position information of the first position point where the movable platform is located at the current time relative to the second position point where the movable platform is located when the shooting device shoots the second image, the position information of the first position point in a preset coordinate system includes: and determining the position information of the first position point in the preset coordinate system according to the position information of the first position point of the movable platform relative to the second position point of the movable platform when the shooting device shoots the second image at the current moment and the position information of the second position point in the preset coordinate system.

Since the second location point where the drone is located when the main camera takes the second image 52 is a track point on the historical track, the three-dimensional coordinates of the drone in the preset coordinate system when the main camera takes the second image 52 are known. Therefore, according to the current time, that is, the position information of the first position point where the unmanned aerial vehicle is located when the main camera shoots the first image 51, relative to the second position point where the unmanned aerial vehicle is located when the main camera shoots the second image 52, and the three-dimensional coordinates of the unmanned aerial vehicle in the preset coordinate system when the main camera shoots the second image 52, the three-dimensional coordinates of the first position point where the unmanned aerial vehicle is located when the main camera shoots the first image 51 in the preset coordinate system can be determined.

Step S602, controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system.

In one possible manner, controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system includes: controlling the movable platform to move from the first position point and pass through at least one target interest point selected by the user from the plurality of interest points, according to the position information of the first position point in the preset coordinate system and the at least one target interest point.

As shown in fig. 4, after the drone passes through the living room, kitchen, secondary bedroom, master bedroom, balcony, study, and toilet in sequence, each point of interest can be labeled, for example as ABCDEFG in sequence. In addition, the user may also name each point of interest, for example naming point A "living room", point B "kitchen", and so on. When the drone is powered on again for its next use, the user can select a subset of the interest points; the interest points selected by the user are recorded as target interest points. For example, the user may select 4 points of interest, such as ADGE, from the 7 points of interest ABCDEFG. When the drone takes off from interest point A, it can fly autonomously from point A and pass through interest points D, G, and E; during autonomous flight, the drone can avoid obstacles through the detection equipment it carries.

In another possible manner, the controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system includes: determining a track point which is closest to the first position point in the historical track according to the position information of the first position point in the preset coordinate system; controlling the movable platform to move from the first position point to the track point; when the movable platform is located at the track point, the movable platform is controlled to move according to at least one target interest point selected from the interest points by a user, so that the movable platform passes through the at least one target interest point.

For example, the drone takes off from the vicinity of the point of interest A; the takeoff point is referred to here as the first position point. However, the first position point may not lie on the historical track; that is, there is a certain deviation between the takeoff point and the historical track. In this case, according to the position information of the first position point in the preset coordinate system and the position information of each track point of the historical track in the preset coordinate system, the track point of the historical track closest to the first position point can be determined, and the drone can be controlled to fly from the first position point to that closest track point. Once the drone is at the closest track point, it can be controlled to fly along the historical track so that it passes through the interest points selected by the user.
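Finding the closest track point is a straightforward nearest-neighbor query over the recorded trajectory; a minimal sketch with hypothetical names:

```python
import numpy as np

def nearest_track_point(track, position):
    """Index of the track point on the historical trajectory closest
    (in Euclidean distance) to the given takeoff position."""
    track = np.asarray(track, dtype=float)
    dists = np.linalg.norm(track - np.asarray(position, dtype=float), axis=1)
    return int(np.argmin(dists))
```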

Optionally, the controlling the movable platform to move so that the movable platform passes through the at least one target point of interest includes: and controlling the movable platform to move according to at least a partial track of the historical track, wherein the partial track comprises the at least one target interest point, so that the movable platform passes through the at least one target interest point.

As shown in fig. 7, the user may select 4 points of interest, for example ADGE, from the 7 points of interest ABCDEFG. One way to control the drone to pass through ADGE after takeoff is: the drone flies from the takeoff point to the track point on the historical track closest to the takeoff point, and then flies along the historical track from that track point, so that it passes through interest points ABCDEFG in sequence and hence through interest points ADGE. That is, controlling the drone to pass through all of ABCDEFG in sequence realizes controlling it to pass through ADGE.

In other embodiments, another way to control the drone to pass through ADGE after takeoff is the following: as shown in fig. 8, the drone flies from the takeoff point to the track point on the historical track closest to the takeoff point, flies along the historical track through the interest point A to the doorway of the kitchen, then from the doorway of the kitchen to the doorway of the secondary bedroom, and from there passes through interest point D, interest point G, and interest point E in sequence along the historical track. Optionally, the drone may fly from interest point E to the doorway of the study, then to the doorway of the toilet, and finally return from the doorway of the toilet to a certain point in the living room, for example the track point closest to the takeoff point.
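One way to sketch this strategy is to follow the historical track from the start index until the last selected interest point has been passed. The function name, the index-based representation of the track, and the label-to-index mapping are all illustrative assumptions, not part of the patent's disclosure:

```python
def path_through_targets(interest_idx, targets, start_idx):
    """Sequence of track indices to fly: from the track point nearest
    the takeoff point, forward along the historical trajectory until
    every selected target interest point has been passed.

    interest_idx: dict mapping interest-point labels (e.g. 'A'..'G')
                  to their indices on the historical track
    targets: labels of the interest points selected by the user
    start_idx: index of the track point nearest the takeoff point."""
    last = max(interest_idx[t] for t in targets)
    if last < start_idx:
        raise ValueError("selected targets lie behind the start point")
    return list(range(start_idx, last + 1))
```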

In yet another possible manner, the controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system includes: controlling the movable platform to move from the first position point to the second position point according to the position information of the first position point in the preset coordinate system; when the movable platform is located at the second position point, controlling the movable platform to move according to at least one target interest point selected from the plurality of interest points by a user so that the movable platform passes through the at least one target interest point.

For example, the drone takes off from the vicinity of the point of interest A, and the takeoff point is referred to herein as the first position point. However, the first position point may not be on the historical track; that is, there is a certain deviation between the takeoff point of the drone and the historical track. In this case, the drone can be controlled to fly from the first position point to the second position point according to the position information of the first position point in the preset coordinate system, where the second position point is the position point at which the drone was located when the shooting device captured the second image, and the second position point is a track point on the historical track. When the drone is located at the second position point, the drone is controlled to fly according to some points of interest selected by the user, so that the drone passes through those points of interest. The method for controlling the drone to pass through the selected points of interest is similar to the methods shown in fig. 7 and fig. 8 and is not described here again.
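
The position computation described above can be sketched as follows (a simple 2D sketch; representing the relative position information as a plain (x, y) offset in the track frame is an illustrative assumption):

```python
def locate_in_track_frame(second_point_in_frame, relative_offset):
    """Position of the first position point in the historical-track
    (preset) coordinate system, given the second position point's
    coordinates in that frame and the first point's offset relative
    to the second point (both as (x, y) tuples)."""
    sx, sy = second_point_in_frame
    dx, dy = relative_offset
    return (sx + dx, sy + dy)

def vector_to_track(first_point_in_frame, second_point_in_frame):
    """Displacement the platform must fly to reach the second position
    point (a track point) from its takeoff point."""
    fx, fy = first_point_in_frame
    sx, sy = second_point_in_frame
    return (sx - fx, sy - fy)

second = (2.0, 0.0)              # track point, known in the track frame
first = locate_in_track_frame(second, relative_offset=(-0.3, 0.4))
move = vector_to_track(first, second)   # fly this to rejoin the track
```

Once the takeoff point is expressed in the track's coordinate system, flying to the track point reduces to commanding the displacement between the two points.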

Optionally, the control method further includes: receiving a control instruction stream sent by a control terminal, wherein the control instruction stream is generated by the control terminal through obtaining the selection operation of a user on at least one control displayed by the control terminal, each control in the at least one control is used for controlling a movable platform to complete one task, and executing the task corresponding to the at least one control according to the control instruction stream and the plurality of historical images. In this way, the movable platform performs navigation based on the historical images to perform the user-directed task. For a detailed explanation, please refer to a detailed description portion of the control method provided in fig. 9, which is not repeated herein.

In this embodiment, when there is a deviation between the takeoff point of the drone and the historical track, the drone is controlled to fly from the takeoff point to a track point on the historical track and to begin flying along the historical track from that track point, so that the drone can pass through some of the points of interest, which improves the flexibility of control over the drone.

The embodiment of the invention provides a control method of a movable platform. Fig. 9 is a flowchart of a method for controlling a movable platform according to another embodiment of the present invention. As shown in fig. 9, the method in this embodiment may include:

step S901, obtaining a selection operation of a user on at least one control displayed by the control terminal, where each control in the at least one control is used to control the movable platform to complete a task.

In this embodiment, a control terminal corresponding to the unmanned aerial vehicle, for example a remote controller, a tablet computer, a smart phone, or another device, may display at least one control, and each control is used to control the unmanned aerial vehicle to complete one task; for example, taking off, recording a video, going to the master bedroom, taking a picture, going to the balcony, hovering, and panorama shooting are each different tasks.

The user may perform a selection operation on some or all of the at least one control, and the specific selection operation is not limited herein, and may be, for example, dragging, clicking, frame selecting, and the like. Taking a remote controller as an example, the remote controller may include a display component, where the display component is configured to display a plurality of different controls, and in addition, the display component may specifically be a touch screen, and the touch screen may sense an operation of a user on the touch screen, so that the remote controller may determine at least one control selected by the user according to a selection operation of the user, and determine a sequence in which the user selects the at least one control.

And S902, generating a control instruction stream according to the selection sequence of the user on the at least one control.

As shown in fig. 10, the remote controller may generate a control instruction stream according to the selection sequence of at least one control by the user.
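
The stream generation of steps S901 and S902 might look like the following (the control names and the dictionary encoding of instructions are illustrative assumptions):

```python
# Each on-screen control maps to one task instruction (names assumed).
CONTROL_TO_INSTRUCTION = {
    "takeoff": {"task": "takeoff"},
    "go_to_master_bedroom": {"task": "goto", "target": "master_bedroom"},
    "record_video": {"task": "record"},
    "hover": {"task": "hover"},
}

def build_instruction_stream(selected_controls):
    """Build the control instruction stream in the order in which the
    user selected the controls; unknown controls are rejected."""
    stream = []
    for name in selected_controls:
        if name not in CONTROL_TO_INSTRUCTION:
            raise ValueError(f"unknown control: {name}")
        stream.append(CONTROL_TO_INSTRUCTION[name])
    return stream

stream = build_instruction_stream(
    ["takeoff", "go_to_master_bedroom", "record_video"])
```

The key point is that the stream preserves the user's selection order, so the movable platform executes the tasks in exactly that sequence.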

Step S903, sending the control instruction stream to the movable platform, so that the movable platform executes a task corresponding to the at least one control according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area within historical time.

Specifically, the remote controller may send the control instruction stream to the unmanned aerial vehicle, so that the unmanned aerial vehicle executes a task corresponding to the at least one control according to the control instruction stream and the plurality of historical images.

The target area is again taken to be the home described above. Within the historical time, the remote controller can control the unmanned aerial vehicle to pass in sequence through the living room, the kitchen, the secondary bedroom, the master bedroom, the balcony, the study, the toilet, and other places in the home. Alternatively, the user can carry the unmanned aerial vehicle by hand and move around the home, so that the unmanned aerial vehicle passes in sequence through the living room, the kitchen, the secondary bedroom, the master bedroom, the balcony, the study, the toilet, and other places. During this historical time, as the unmanned aerial vehicle passes in sequence through these places, the shooting device carried by the unmanned aerial vehicle can capture a plurality of historical images.

Because each control is used to control the unmanned aerial vehicle to complete one task (for example, taking off, recording video, going to the master bedroom, taking pictures, going to the balcony, hovering, and panoramic shooting are each different tasks), one control may correspond to one control instruction, and the control instruction stream may thus be a set of at least one control instruction corresponding to the at least one control selected by the user. It will be appreciated that the at least one control includes a control for controlling movement of the movable platform to a target position in the target area. The target position may be the kitchen, the balcony, the secondary bedroom, or the like, as described above. The at least one control also includes a control for controlling the movable platform to move to perform photographing.

When the unmanned aerial vehicle receives the control instruction stream, it executes the task corresponding to the at least one control according to the control instruction stream and the plurality of historical images. Specifically, when executing a task corresponding to the at least one control, the unmanned aerial vehicle can determine, from the plurality of historical images, a historical image that matches the current image captured by the shooting device at the current moment, and determine the current position of the unmanned aerial vehicle according to the historical position of the unmanned aerial vehicle when the shooting device captured that historical image. The unmanned aerial vehicle then executes the task corresponding to the at least one control according to the current position.
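
That localization step could be sketched as follows (reducing feature descriptors to integer sets and passing the relative-offset estimate in directly are illustrative assumptions standing in for real visual matching and pose estimation):

```python
def best_matching_history(current_features, history):
    """Pick the historical image whose feature set overlaps most with the
    current image's features.  `history` is a list of dicts holding
    'features' (a set) and 'position' (coordinates in the track frame)."""
    return max(history, key=lambda h: len(h["features"] & current_features))

def localize(current_features, history, relative_offset):
    """Current position = matched historical position + the offset of the
    current viewpoint relative to that historical viewpoint."""
    match = best_matching_history(current_features, history)
    hx, hy = match["position"]
    dx, dy = relative_offset
    return (hx + dx, hy + dy)

history = [
    {"features": {1, 2, 3, 4}, "position": (0.0, 0.0)},
    {"features": {4, 5, 6, 7}, "position": (3.0, 1.0)},
]
pos = localize({5, 6, 7, 9}, history, relative_offset=(0.5, -0.25))
```

Because every historical image carries the position at which it was taken, matching the current image against the history anchors the platform to the historical track even without external positioning signals.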

Optionally, before the obtaining of the selection operation of the user on the at least one control displayed by the control terminal, the method further includes: controlling the movable platform to move in the target area within the historical time so that the movable platform acquires the plurality of historical images captured by the capturing device mounted on the movable platform. For example, before the user selects a control displayed by the remote controller, the remote controller may, within the historical time, control the unmanned aerial vehicle to fly in sequence through the living room, the kitchen, the secondary bedroom, the master bedroom, the balcony, the study, and the toilet; during this flight, the unmanned aerial vehicle may acquire a plurality of historical images captured by its camera.

Optionally, the historical trajectory of the movement of the movable platform in the target region over the historical time includes a plurality of points of interest, and the at least one control is configured to control the movement of the movable platform to at least one of the plurality of points of interest.

For example, within the historical time, the unmanned aerial vehicle passes in sequence through the living room, the kitchen, the secondary bedroom, the master bedroom, the balcony, the study, and the toilet to form a historical track, and the user may mark points of interest in the historical track. As shown in fig. 3, point A, point B, point C, point D, point E, point F, and point G are the points of interest in the historical track; the specific marking process is consistent with the marking process described in the above embodiment and is not described here again.

A control can be selected by the user at the current moment to determine which points of interest the unmanned aerial vehicle needs to pass through after taking off from the takeoff point. For example, the user may select only the controls corresponding to points of interest A, C, and E; the remote controller then generates a control instruction stream according to the user's selection operations on the controls corresponding to points of interest A, C, and E, and sends the control instruction stream to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies in sequence through points of interest A, C, and E according to the control instruction stream and the plurality of historical images.

In this embodiment, the selection operation of the user on at least one control is obtained, a control instruction stream is generated according to the order in which the user selected the at least one control, and the control instruction stream is sent to the movable platform, so that the movable platform executes the task corresponding to the at least one control according to the control instruction stream and the plurality of historical images, thereby realizing programmed control of the movable platform. During programmed control, the movable platform can determine, from the plurality of historical images, a historical image that matches the current image captured at the current moment, and determine its current position according to the historical position of the movable platform when the shooting device on the movable platform captured that historical image, thereby realizing accurate positioning of the movable platform and enabling the movable platform to move automatically in a more complex environment.

The embodiment of the invention provides a control device of a movable platform. Fig. 11 is a structural diagram of a control device of a movable platform according to an embodiment of the present invention, and as shown in fig. 11, the control device 110 includes: a memory 111 and a processor 112; the memory 111 is used for storing program codes; the processor 112, invoking the program code, is configured to perform the following when the program code is executed: acquiring a first image shot by the shooting device in a target area at the current moment; determining a second image matching the first image among a plurality of historical images captured by the camera while the movable platform is moving in the target area over a historical time; according to the first image and the second image, determining position information of a first position point where the movable platform is located at the current moment relative to a second position point where the movable platform is located when the shooting device shoots the second image; and controlling the movable platform to move in the target area according to the position information of the first position point of the movable platform at the current moment relative to the second position point of the movable platform when the shooting device shoots the second image.

Optionally, before acquiring the first image captured in the target area by the capturing device at the current time, the processor 112 is further configured to: and determining the historical track of the movable platform moving in the target area in the historical time according to the position information and/or the posture information of the movable platform in the moving process of the movable platform in the target area in the historical time.

Optionally, after the processor 112 determines the historical trajectory of the movement of the movable platform in the target area within the historical time, the processor is further configured to: a plurality of points of interest in the historical track is determined.

Optionally, the movable platform rotates in place for one circle at the point of interest; or the movable platform vibrates at the point of interest; or, when the movable platform is located at the point of interest, a preset button on the movable platform is triggered.

Optionally, the plurality of historical images include images captured by the camera while the movable platform is rotating in place for one revolution at the point of interest.

Optionally, when acquiring the first image captured by the shooting device in the target area at the current moment, the processor 112 is specifically configured to: control the movable platform to take off from one of the plurality of points of interest at the current moment; and acquire a first image captured by the shooting device when the movable platform rotates in place for one circle at the point of interest.

Optionally, the processor 112 controls the movable platform to move in the target area according to the position information of the first position point where the movable platform is located at the current time relative to the second position point where the movable platform is located when the shooting device shoots the second image, and is specifically configured to: determining the position information of a first position point in a preset coordinate system according to the position information of the first position point of the movable platform at the current moment relative to a second position point of the movable platform when the shooting device shoots the second image, wherein the preset coordinate system is the coordinate system of the historical track; and controlling the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system.

Optionally, when determining the position information of the first position point in the preset coordinate system according to the position information of the first position point where the movable platform is located at the current moment relative to the second position point where the movable platform is located when the shooting device shoots the second image, the processor 112 is specifically configured to: and determining the position information of the first position point in the preset coordinate system according to the position information of the first position point of the movable platform relative to the second position point of the movable platform when the shooting device shoots the second image at the current moment and the position information of the second position point in the preset coordinate system.

Optionally, when the processor 112 controls the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, specifically, the processor is configured to: and controlling the movable platform to move from the first position point and pass through at least one target interest point selected from the plurality of interest points by the user according to the position information of the first position point in the preset coordinate system and the at least one target interest point.

Optionally, when the processor 112 controls the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, specifically, the processor is configured to: determining a track point which is closest to the first position point in the historical track according to the position information of the first position point in the preset coordinate system; controlling the movable platform to move from the first position point to the track point; when the movable platform is located at the track point, the movable platform is controlled to move according to at least one target interest point selected from the interest points by a user, so that the movable platform passes through the at least one target interest point.

Optionally, when the processor 112 controls the movable platform to move in the target area according to the position information of the first position point in the preset coordinate system, specifically, the processor is configured to: controlling the movable platform to move from the first position point to the second position point according to the position information of the first position point in the preset coordinate system; when the movable platform is located at the second position point, controlling the movable platform to move according to at least one target interest point selected from the plurality of interest points by a user so that the movable platform passes through the at least one target interest point.

Optionally, the processor 112 controls the movable platform to move, so that when the movable platform passes through the at least one target interest point, the processor is specifically configured to: and controlling the movable platform to move according to at least a partial track of the historical track, wherein the partial track comprises the at least one target interest point, so that the movable platform passes through the at least one target interest point.

Optionally, when the processor 112 determines a second image matching the first image in the plurality of history images, the processor is specifically configured to: and determining a second image matched with the first image in the plurality of historical images according to the feature point of each historical image in the plurality of historical images and the feature point of the first image, wherein the feature point of the first image is matched with the feature point of the second image.
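
A minimal sketch of that selection step (storing binary feature descriptors as Python ints and matching them brute-force by Hamming distance are illustrative assumptions; a real system would use a descriptor such as ORB with a proper matcher):

```python
def hamming(a, b):
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def count_matches(desc_a, desc_b, max_dist=2):
    """Number of descriptors in desc_a with a close match in desc_b."""
    return sum(1 for d in desc_a
               if any(hamming(d, e) <= max_dist for e in desc_b))

def select_second_image(first_desc, history_descs):
    """Index of the historical image whose feature points best match
    those of the first image."""
    return max(range(len(history_descs)),
               key=lambda i: count_matches(first_desc, history_descs[i]))

first = [0b10110010, 0b01011100, 0b11110000]
history = [
    [0b00001111, 0b00110011],              # poor match
    [0b10110011, 0b01011000, 0b11110001],  # close to all three descriptors
]
idx = select_second_image(first, history)
```

The historical image with the highest match count is taken as the second image; its associated capture position then serves as the reference point for the relative-position computation described above.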

The specific principle and implementation of the control device provided by the embodiment of the present invention are similar to those of the above embodiments, and are not described herein again.

According to this embodiment, a first image captured by the shooting device in the target area at the current moment is acquired, a second image matching the first image is determined from the plurality of historical images, and the position information of the first position point where the movable platform is located at the current moment relative to the second position point where the movable platform was located when the shooting device captured the second image is determined according to the first image and the second image. From the position information of the first position point relative to the second position point, the position information of the first position point relative to the historical track along which the movable platform moved in the target area in the historical time can be determined, and the movable platform can be positioned according to the historical track, so that the positioning accuracy of the movable platform can be improved.

The embodiment of the invention provides a movable platform. The movable platform comprises: a body, a power system, a shooting device, and the control device according to the above embodiment. The power system is arranged on the body and used for providing power; the shooting device is arranged on the body and used for capturing images; the principle and implementation of the control device are the same as those described in the above embodiments and are not described here again. Optionally, the movable platform comprises at least one of: a mobile robot, an unmanned aerial vehicle.

The embodiment of the invention also provides a control terminal, which can be a remote controller, a tablet personal computer or a smart phone corresponding to the movable platform, and can be used for controlling the movable platform. Fig. 12 is a block diagram of a control terminal according to another embodiment of the present invention; as shown in fig. 12, the control terminal 120 includes: a display component 121, a memory 122, a processor 123, and a communication interface 124; wherein the display component 121 is configured to display a control; the memory 122 is used for storing program codes; the processor 123, invoking the program code, is configured to perform the following when the program code is executed: acquiring selection operation of a user on at least one control displayed by the display component 121, wherein each control in the at least one control is used for controlling the movable platform to complete a task; generating a control instruction stream according to the selection sequence of the user for at least one control; and sending the control instruction stream to the movable platform through a communication interface 124 so that the movable platform moves in the target area according to the control instruction stream and a plurality of historical images, wherein the plurality of historical images are shot by a shooting device carried on the movable platform when the movable platform moves in the target area in historical time.

Optionally, before acquiring the selection operation of the user on the at least one control displayed by the display component, the processor 123 is further configured to: controlling the movable platform to move in the target area within the historical time so that the movable platform acquires the plurality of historical images captured by the capturing device mounted on the movable platform.

Optionally, the historical track of the movement of the movable platform in the target region within the historical time includes a plurality of interest points; the at least one control is to control the movable platform to move to at least one of the plurality of points of interest.

The specific principle and implementation manner of the control terminal provided by the embodiment of the present invention are similar to those of the above embodiments, and are not described herein again.

In this embodiment, the selection operation of the user on at least one control is obtained, a control instruction stream is generated according to the order in which the user selected the at least one control, and the control instruction stream is sent to the movable platform, so that the movable platform executes the task corresponding to the at least one control according to the control instruction stream and the plurality of historical images, thereby realizing programmed control of the movable platform. During programmed control, the movable platform can determine, from the plurality of historical images, a historical image that matches the current image captured at the current moment, and determine its current position according to the historical position of the movable platform when the shooting device on the movable platform captured that historical image, thereby realizing accurate positioning of the movable platform and enabling the movable platform to move automatically in a more complex environment.

In addition, the present embodiment also provides a computer-readable storage medium on which a computer program is stored, the computer program being executed by a processor to implement the control method of the movable platform described in the above embodiment.

In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.

The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.

Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
