Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle

Document No.: 1539421    Publication date: 2020-02-14

Note: This technology, "Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle" (基于无人机的测距方法、装置及无人机), was designed and created by 翁超 and 熊川樘 on 2018-09-28. Its main content includes: a distance measurement method and device based on an unmanned aerial vehicle, and an unmanned aerial vehicle, comprising: acquiring a first image of a target captured by a shooting device (120) at a first position; controlling the shooting device to move from the first position to a second position according to the position information of the first position; acquiring a second image of the target captured by the shooting device (120) at the second position; and determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image. The first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position. A single shooting device (120) thus accomplishes what the prior art achieved with images captured by two shooting devices at two shooting positions, realizing the same effect as binocular distance measurement, saving ranging cost, and allowing application on small unmanned aerial vehicles.

1. An unmanned aerial vehicle-based ranging method, characterized in that the unmanned aerial vehicle carries a shooting device, and the method comprises:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.
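By way of illustration only (this is not part of the claims), the geometry recited in claim 1 is that of a standard stereo pair: once the shooting device has captured images at two positions separated by a known baseline, the depth of the target follows the relation Z = f·B/d, where B is the baseline, f is the focal length in pixels, and d is the disparity between the two images. A minimal sketch, with hypothetical numeric values:

```python
def depth_from_two_views(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Depth (in metres) of a point observed in two images taken at the same
    height, with the baseline parallel to the shooting plane (claim 1 geometry)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: 0.5 m baseline, 1200 px focal length, 24 px disparity.
print(depth_from_two_views(0.5, 1200.0, 24.0))  # -> 25.0 m
```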

2. The method of claim 1, wherein the first position is a position preset by a user; alternatively,

the first position is the current position of the drone.

3. The method of claim 1, wherein the position information of the first position comprises:

geographic position information detected based on GPS; and

altitude information detected based on a vision module or a barometer.

4. The method of claim 3, wherein the shooting device is mounted on the unmanned aerial vehicle via a gimbal, and the GPS is provided on the unmanned aerial vehicle, the gimbal, or the shooting device; and/or

the vision module or the barometer is provided on the unmanned aerial vehicle, the gimbal, or the shooting device.

5. The method of claim 1, wherein the distance between the first position and the second position is less than or equal to a preset distance threshold.

6. The method of claim 5, wherein the second position is a position preset by a user; alternatively,

the second position is determined according to the position information of the first position and the preset distance threshold.

7. The method according to claim 6, wherein after controlling the shooting device to move from the first position to the second position according to the position information of the first position, the method further comprises:

acquiring actual position information of the shooting device at the second position;

determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.

8. The method according to claim 6, wherein after controlling the shooting device to move from the first position to the second position according to the position information of the first position, the method further comprises:

acquiring actual position information of the shooting device at the second position;

determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.
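Claims 7 and 8 verify that the position actually reached keeps the intended baseline and height. As a hypothetical illustration only (the claims do not prescribe any particular formula), the horizontal distance between two GPS fixes can be approximated with a local equirectangular projection and the height difference taken from the altitude readings:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, in metres

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Approximate horizontal distance between two GPS fixes given in degrees,
    using a local equirectangular projection; adequate for short baselines."""
    lat1r, lat2r = math.radians(lat1), math.radians(lat2)
    dlat = lat2r - lat1r
    dlon = math.radians(lon2 - lon1) * math.cos((lat1r + lat2r) / 2.0)
    return EARTH_RADIUS_M * math.hypot(dlat, dlon)

def position_check(pos1, pos2, max_baseline_m=0.5, height_tol_m=0.05):
    """pos = (lat_deg, lon_deg, alt_m); the threshold values are hypothetical.
    Returns flags: (translate back toward the first position, adjust the height)."""
    too_far = horizontal_distance_m(pos1[0], pos1[1], pos2[0], pos2[1]) > max_baseline_m
    height_off = abs(pos2[2] - pos1[2]) > height_tol_m
    return too_far, height_off
```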

9. The method of claim 1 or 2, wherein the second position is located to the left or right of the first position.

10. The method of claim 1, wherein the controlling the shooting device to move from the first position to a second position according to the position information of the first position comprises:

controlling the unmanned aerial vehicle to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

11. The method of claim 10, wherein controlling the drone to move according to the position information of the first position comprises:

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the current movement speed in the flight direction for the flight duration;

alternatively,

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight duration according to the current movement speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the current movement speed for the flight duration;

alternatively,

determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the preset speed in the flight direction for the flight duration;

alternatively,

determining a flight duration according to a preset speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the preset speed for the flight duration;

alternatively,

controlling the unmanned aerial vehicle to translate relative to the first position at a preset speed for a preset duration.
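Each alternative in claim 11 reduces to translating the aircraft by the desired baseline, so the flight duration is simply the distance to travel divided by the speed used. A trivial, hypothetical illustration:

```python
def flight_duration_s(baseline_m: float, speed_m_s: float) -> float:
    """Time needed to translate by the desired baseline at a given speed."""
    if speed_m_s <= 0:
        raise ValueError("speed must be positive")
    return baseline_m / speed_m_s

# Hypothetical values: a 0.5 m baseline flown at 0.25 m/s takes 2.0 s.
print(flight_duration_s(0.5, 0.25))
```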

12. The method of claim 1, wherein the shooting device is mounted on the drone through a gimbal;

the controlling the shooting device to move from the first position to the second position according to the position information of the first position comprises:

when the unmanned aerial vehicle is in a stationary state, controlling the gimbal to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

13. The method of claim 12, wherein the gimbal is carried on the drone by a power device movable in the yaw direction;

the controlling the gimbal to move comprises:

controlling the power device to move so as to drive the gimbal to translate.

14. The method of claim 13, wherein controlling the gimbal to move according to the position information of the first position comprises:

determining a movement direction and a movement duration of the power device according to a preset speed and the position information of the first position and the second position;

controlling the power device to translate at the preset speed in the movement direction for the movement duration;

alternatively,

determining a movement duration of the power device according to a preset speed and a preset distance threshold;

controlling the power device to translate to the left or to the right of the first position at the preset speed for the movement duration;

alternatively,

controlling the power device to translate relative to the first position at a preset speed for a preset duration.

15. The method according to claim 10 or 12, wherein a sensing unit is provided on the drone; the method further comprises the following steps:

acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;

and adjusting the depth information of the target according to the distance information.

16. The method of claim 15, wherein before the adjusting the depth information of the target according to the distance information, the method further comprises:

and determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.

17. The method of claim 15, wherein after the acquiring the distance information from the target to the drone based on the sensing unit, the method further comprises:

and when the difference between the distance information and the depth information of the target is determined to be larger than a preset difference threshold value, determining the depth information of the target to be invalid information.

18. The method of claim 15, wherein the adjusting the depth information of the target according to the distance information comprises:

and carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.
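Claims 16 to 18 compare the stereo depth against the independently sensed distance and, when the two agree, fuse them. The claims do not specify the fusion rule; the following sketch, with a validity gate and a fixed-weight average, is only a hypothetical illustration:

```python
def fuse_depth(stereo_depth_m, sensor_distance_m, diff_threshold_m=1.0, sensor_weight=0.7):
    """Return fused depth, or None when the two measurements disagree by more
    than the threshold (the stereo depth is then treated as invalid, as in claim 17).
    The threshold and weight values here are illustrative assumptions."""
    if abs(stereo_depth_m - sensor_distance_m) > diff_threshold_m:
        return None  # depth information deemed invalid
    return sensor_weight * sensor_distance_m + (1.0 - sensor_weight) * stereo_depth_m
```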

19. The method of claim 15, wherein the sensing unit comprises a laser ranging sensor.

20. The method of claim 1, wherein the determining the depth information of the target according to the distance between the first position and the second position, the first image, and the second image comprises:

acquiring the focal length of the shooting device;

determining a disparity between the first image and the second image from the first image and the second image;

and determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device, and the disparity.

21. The method of claim 20, wherein the obtaining the focal length of the camera comprises:

calibrating the shooting device to determine the focal length of the shooting device.

22. The method of claim 20, wherein determining the disparity between the first image and the second image from the first image and the second image comprises:

performing binocular matching on the first image and the second image to obtain the disparity between the first image and the second image.

23. The method of claim 22, wherein before performing the binocular matching on the first image and the second image to obtain the disparity between the first image and the second image, the method further comprises:

calibrating the shooting device to obtain intrinsic parameter data of the shooting device;

and performing binocular rectification on the first image and the second image according to the intrinsic parameter data.
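Claims 20 to 23 together describe the conventional stereo pipeline: calibrate the single shooting device once to obtain its intrinsic parameters and focal length, rectify the two captures as if they were a left/right pair separated by the known baseline, match them to obtain a disparity map, and convert disparity into depth. The sketch below uses OpenCV for rectification and semi-global matching; the function choices, parameter values, and the assumption of a pure sideways translation are illustrative and are not taken from the patent:

```python
import cv2
import numpy as np

def stereo_depth_from_motion(img1, img2, K, dist, baseline_m, num_disp=64, block=7):
    """img1/img2: BGR frames captured at the first and second positions (same
    height, baseline parallel to the shooting plane). K, dist: intrinsic matrix
    and distortion coefficients from a prior calibration of the single shooting
    device. Returns a dense depth map in metres (np.inf where no disparity)."""
    h, w = img1.shape[:2]
    # Ideal claim-1 geometry assumed: pure sideways translation, no rotation.
    R = np.eye(3)
    T = np.array([[-baseline_m], [0.0], [0.0]])
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K, dist, K, dist, (w, h), R, T)
    map1x, map1y = cv2.initUndistortRectifyMap(K, dist, R1, P1, (w, h), cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(K, dist, R2, P2, (w, h), cv2.CV_32FC1)
    rect1 = cv2.remap(img1, map1x, map1y, cv2.INTER_LINEAR)
    rect2 = cv2.remap(img2, map2x, map2y, cv2.INTER_LINEAR)

    # Semi-global block matching on the rectified pair; the disparity is
    # returned as 16x fixed point, hence the division by 16.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=num_disp, blockSize=block)
    disp = matcher.compute(cv2.cvtColor(rect1, cv2.COLOR_BGR2GRAY),
                           cv2.cvtColor(rect2, cv2.COLOR_BGR2GRAY)).astype(np.float32) / 16.0

    fx = P1[0, 0]  # rectified focal length in pixels
    depth = np.full(disp.shape, np.inf, dtype=np.float32)
    valid = disp > 0
    depth[valid] = fx * baseline_m / disp[valid]
    return depth
```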

24. An unmanned aerial vehicle-based ranging device, characterized in that it comprises a shooting device and a processor, the shooting device is mounted on the unmanned aerial vehicle, the processor is communicatively connected with the shooting device, and the processor is configured to:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.

25. The apparatus of claim 24, wherein the first position is a position preset by a user; alternatively,

the first position is the current position of the drone.

26. The apparatus of claim 24, wherein the position information of the first position comprises:

geographic position information detected based on GPS; and

altitude information detected based on a vision module or a barometer.

27. The device of claim 26, wherein the shooting device is mounted on the unmanned aerial vehicle via a gimbal, and the GPS is provided on the unmanned aerial vehicle, the gimbal, or the shooting device; and/or

the vision module or the barometer is provided on the unmanned aerial vehicle, the gimbal, or the shooting device.

28. The apparatus of claim 24, wherein the distance between the first position and the second position is less than or equal to a preset distance threshold.

29. The apparatus of claim 28, wherein the second position is a position preset by a user; alternatively,

the second position is determined according to the position information of the first position and the preset distance threshold.

30. The apparatus of claim 29, wherein the processor is further configured to, after controlling the shooting device to move from the first position to the second position according to the position information of the first position:

acquiring actual position information of the shooting device at the second position;

determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.

31. The apparatus of claim 29, wherein the processor is further configured to, after controlling the shooting device to move from the first position to the second position according to the position information of the first position:

acquiring actual position information of the shooting device at the second position;

determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.

32. The device of claim 24 or 25, wherein the second position is located to the left or right of the first position.

33. The apparatus of claim 24, wherein the processor is specifically configured to:

controlling the unmanned aerial vehicle to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

34. The apparatus of claim 33, wherein the processor is specifically configured to:

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the current movement speed in the flight direction for the flight duration;

alternatively,

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight duration according to the current movement speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the current movement speed for the flight duration;

alternatively,

determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the preset speed in the flight direction for the flight duration;

alternatively,

determining a flight duration according to a preset speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the preset speed for the flight duration;

alternatively,

controlling the unmanned aerial vehicle to translate relative to the first position at a preset speed for a preset duration.

35. The apparatus of claim 24, wherein the shooting device is mounted on the drone through a gimbal;

the processor is specifically configured to:

when the unmanned aerial vehicle is in a stationary state, controlling the gimbal to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

36. The apparatus of claim 35, wherein the gimbal is carried on the drone by a power device movable in the yaw direction;

the processor is specifically configured to:

controlling the power device to move so as to drive the gimbal to translate.

37. The apparatus of claim 36, wherein the processor is specifically configured to:

determining a movement direction and a movement duration of the power device according to a preset speed and the position information of the first position and the second position;

controlling the power device to translate at the preset speed in the movement direction for the movement duration;

alternatively,

determining a movement duration of the power device according to a preset speed and a preset distance threshold;

controlling the power device to translate to the left or to the right of the first position at the preset speed for the movement duration;

alternatively,

controlling the power device to translate relative to the first position at a preset speed for a preset duration.

38. The device of claim 33 or 35, wherein the drone is provided with a sensing unit; the processor is further configured to:

acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;

and adjusting the depth information of the target according to the distance information.

39. The apparatus of claim 38, wherein the processor, prior to adjusting the depth information of the target based on the distance information, is further configured to:

and determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.

40. The apparatus of claim 38, wherein the processor, after obtaining the distance information from the target to the drone based on the sensing unit, is further configured to:

and when the difference between the distance information and the depth information of the target is determined to be larger than a preset difference threshold value, determining the depth information of the target to be invalid information.

41. The apparatus of claim 38, wherein the processor is specifically configured to:

and carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.

42. The apparatus of claim 38, wherein the sensing unit comprises a laser ranging sensor.

43. The apparatus of claim 24, wherein the processor is specifically configured to:

acquiring the focal length of the shooting device;

determining a disparity between the first image and the second image from the first image and the second image;

and determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device, and the disparity.

44. The apparatus of claim 43, wherein the processor is specifically configured to:

calibrating the shooting device to determine the focal length of the shooting device.

45. The apparatus of claim 43, wherein the processor is specifically configured to:

performing binocular matching on the first image and the second image to obtain the disparity between the first image and the second image.

46. The apparatus of claim 45, wherein the processor, before performing the binocular matching on the first image and the second image to obtain the disparity between the first image and the second image, is further configured to:

calibrating the shooting device to obtain intrinsic parameter data of the shooting device;

and performing binocular rectification on the first image and the second image according to the intrinsic parameter data.

47. An unmanned aerial vehicle, comprising:

a body;

a shooting device mounted on the body; and

a processor communicatively connected to the shooting device, the processor being configured to:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.

48. A drone as claimed in claim 47, wherein the first position is a position preset by a user; alternatively,

the first position is the current position of the drone.

49. A drone as claimed in claim 47, wherein the position information of the first position comprises:

geographic position information detected based on GPS; and

altitude information detected based on a vision module or a barometer.

50. The unmanned aerial vehicle of claim 49, wherein the shooting device is mounted on the unmanned aerial vehicle via a gimbal, and the GPS is provided on the unmanned aerial vehicle, the gimbal, or the shooting device; and/or

the vision module or the barometer is provided on the unmanned aerial vehicle, the gimbal, or the shooting device.

51. A drone according to claim 47, wherein the distance between the first and second positions is less than or equal to a preset distance threshold.

52. A drone as claimed in claim 51, wherein the second position is a position preset by a user; alternatively,

the second position is determined according to the position information of the first position and the preset distance threshold.

53. A drone as claimed in claim 52, wherein the processor is further configured to, after controlling the shooting device to move from the first position to the second position based on the position information of the first position:

acquiring actual position information of the shooting device at the second position;

determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.

54. A drone as claimed in claim 52, wherein the processor is further configured to, after controlling the shooting device to move from the first position to the second position based on the position information of the first position:

acquiring actual position information of the shooting device at the second position;

determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;

when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.

55. A drone as claimed in claim 47 or 48, wherein the second location is to the left or right of the first location.

56. A drone as claimed in claim 47, wherein the processor is specifically configured to:

controlling the unmanned aerial vehicle to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

57. A drone as claimed in claim 56, wherein the processor is specifically configured to:

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the current movement speed in the flight direction for the flight duration;

alternatively,

acquiring the current movement speed of the unmanned aerial vehicle;

determining a flight duration according to the current movement speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the current movement speed for the flight duration;

alternatively,

determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;

controlling the unmanned aerial vehicle to translate at the preset speed in the flight direction for the flight duration;

alternatively,

determining a flight duration according to a preset speed and a preset distance threshold;

controlling the unmanned aerial vehicle to translate to the left or to the right of the first position at the preset speed for the flight duration;

alternatively,

controlling the unmanned aerial vehicle to translate relative to the first position at a preset speed for a preset duration.

58. The drone of claim 47, wherein the shooting device is mounted on the drone by a gimbal;

the processor is specifically configured to:

when the unmanned aerial vehicle is in a stationary state, controlling the gimbal to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.

59. A drone according to claim 58, wherein the gimbal is carried on the drone by a power device movable in the yaw direction;

the processor is specifically configured to:

controlling the power device to move so as to drive the gimbal to translate.

60. A drone as claimed in claim 59, wherein the processor is specifically configured to:

determining a movement direction and a movement duration of the power device according to a preset speed and the position information of the first position and the second position;

controlling the power device to translate at the preset speed in the movement direction for the movement duration;

alternatively,

determining a movement duration of the power device according to a preset speed and a preset distance threshold;

controlling the power device to translate to the left or to the right of the first position at the preset speed for the movement duration;

alternatively,

controlling the power device to translate relative to the first position at a preset speed for a preset duration.

61. A drone according to claim 56 or 58, characterised in that the drone is provided with a sensing unit; the processor is further configured to:

acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;

and adjusting the depth information of the target according to the distance information.

62. A drone of claim 61, wherein the processor, prior to adjusting the depth information of the target based on the distance information, is further configured to:

and determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.

63. A drone according to claim 61, wherein the processor, after obtaining the distance information of the target to the drone based on the sensing unit, is further configured to:

and when the difference between the distance information and the depth information of the target is determined to be larger than a preset difference threshold value, determining the depth information of the target to be invalid information.

64. A drone as claimed in claim 61, wherein the processor is specifically configured to:

and carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.

65. A drone according to claim 61, wherein the sensing unit includes a laser ranging sensor.

66. A drone as claimed in claim 47, wherein the processor is specifically configured to:

acquiring the focal length of the shooting device;

determining a disparity between the first image and the second image from the first image and the second image;

and determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device, and the disparity.

67. A drone as claimed in claim 66, wherein the processor is specifically configured to:

calibrating the shooting device to determine the focal length of the shooting device.

68. A drone as claimed in claim 66, wherein the processor is specifically configured to:

performing binocular matching on the first image and the second image to obtain the disparity between the first image and the second image.

69. A drone as claimed in claim 68, wherein the processor, before performing the binocular matching on the first image and the second image to obtain the disparity between the first image and the second image, is further configured to:

calibrating the shooting device to obtain intrinsic parameter data of the shooting device;

and performing binocular rectification on the first image and the second image according to the intrinsic parameter data.

Technical Field

The invention relates to the field of distance measurement, in particular to a distance measurement method and device based on an unmanned aerial vehicle and the unmanned aerial vehicle.

Background

Disclosure of Invention

The invention provides a distance measuring method and device based on an unmanned aerial vehicle and the unmanned aerial vehicle.

According to a first aspect of the present invention, there is provided a ranging method based on an unmanned aerial vehicle having a shooting device mounted thereon, the method comprising:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.

According to a second aspect of the present invention, there is provided a ranging apparatus based on a drone, including a shooting device mounted on the drone and a processor communicatively connected to the shooting device, the processor being configured to:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.

According to a third aspect of the invention, there is provided a drone comprising:

a body;

a shooting device mounted on the body; and

a processor communicatively connected to the shooting device, the processor being configured to:

acquiring a first image of a target captured by the shooting device at a first position;

controlling the shooting device to move from the first position to a second position according to the position information of the first position;

acquiring a second image of the target captured by the shooting device at the second position;

determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image;

wherein the first position and the second position are at the same height, and the line connecting the first position and the second position is parallel to the shooting plane of the shooting device when it photographs the target at the first position.

According to the technical solution provided by the embodiments of the present invention, a single shooting device is controlled to move to two shooting positions and capture an image at each position, and the depth information of the target is calculated from the two images. What the prior art achieves with images captured by two shooting devices at two shooting positions is thus accomplished with a single shooting device, achieving the same effect as binocular distance measurement while saving ranging cost. The ranging method provided by the embodiments of the present invention can therefore be applied to small unmanned aerial vehicles and meets the requirements of most application scenarios.

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.

Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention;

fig. 2 is a flowchart of an unmanned aerial vehicle-based ranging method according to an embodiment of the present invention;

fig. 3 is a flowchart of a specific implementation of the unmanned aerial vehicle-based ranging method according to an embodiment of the present invention;

fig. 4 is a schematic diagram of a position relationship between a target and a drone provided by an embodiment of the present invention;

fig. 5 is a flowchart of another specific implementation of the unmanned aerial vehicle-based ranging method according to an embodiment of the present invention;

fig. 6 is a schematic diagram of a position relationship between a target and a gimbal according to an embodiment of the present invention;

fig. 7 is a block diagram of a ranging apparatus based on an unmanned aerial vehicle according to an embodiment of the present invention;

fig. 8 is a block diagram of an unmanned aerial vehicle according to an embodiment of the present invention.

Detailed Description

The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

The following describes in detail a ranging method and apparatus based on an unmanned aerial vehicle, and an unmanned aerial vehicle according to the present invention, with reference to the accompanying drawings. The features of the following examples and embodiments may be combined with each other without conflict.

Fig. 1 is a schematic view of an unmanned aerial vehicle according to an embodiment of the present invention. The drone 100 may include a carrier 110 and a load 120. In some embodiments, the load 120 may be located directly on the drone 100 without the carrier 110. In this embodiment, the carrier 110 is a gimbal, such as a two-axis gimbal or a three-axis gimbal. The load 120 may be an image capturing device or camera device (e.g., a camera, a camcorder, an infrared camera device, an ultraviolet camera device, or the like), and the load 120 may provide static sensing data (e.g., pictures) or dynamic sensing data (e.g., videos). The load 120 is mounted on the carrier 110, so that the rotation of the load 120 is controlled by the carrier 110.

Further, the drone 100 may include a power mechanism 130, a sensing system 140, and a communication system 150. The power mechanism 130 may include one or more rotors, propellers, blades, motors, electronic speed controllers, and the like. For example, the rotor of the power mechanism may be a self-tightening rotor, a rotor assembly, or another rotor power unit. The drone 100 may have one or more power mechanisms. All power mechanisms may be of the same type; alternatively, one or more of the power mechanisms may be of a different type. The power mechanism 130 may be mounted on the drone by suitable means, such as by a support element (e.g., a drive shaft), and may be mounted at any suitable location on the drone 100, such as the top, bottom, front, back, sides, or any combination thereof. The flight of the drone 100 is controlled by controlling one or more of the power mechanisms 130.

The sensing system 140 may include one or more sensors to sense spatial orientation, velocity, and/or acceleration (e.g., rotation and translation with respect to up to three degrees of freedom) of the drone 100. The one or more sensors may include a GPS sensor, a motion sensor, an inertial sensor, a proximity sensor, or an image sensor. The sensed data provided by the sensing system 140 may be used to track the spatial orientation, velocity and/or acceleration of the target (using a suitable processing unit and/or control unit, as described below). Optionally, the sensing system 140 may be used to collect environmental data of the drone, such as climate conditions, potential obstacles to approach, location of geographic features, location of man-made structures, and the like.

The communication system 150 is capable of communicating, via wireless signals, with a terminal 160 that has a communication module. The communication system 150 and the communication module may each include any number of transmitters, receivers, and/or transceivers for wireless communication. The communication may be one-way, such that data is transmitted in only one direction. For example, one-way communication may involve only the drone 100 transmitting data to the terminal 160, or vice versa; one or more transmitters of the communication system 150 may transmit data to one or more receivers of the communication module, or vice versa. Alternatively, the communication may be two-way, such that data is transmitted in both directions between the drone 100 and the terminal 160; in two-way communication, one or more transmitters of the communication system 150 may transmit data to one or more receivers of the communication module, and vice versa.

In some embodiments, the terminal 160 may provide control data to one or more of the drone 100, the carrier 110, and the load 120, and receive information from one or more of the drone 100, the carrier 110, and the load 120 (e.g., position and/or motion information of the drone, the carrier, or the load, or data sensed by the load, such as image data captured by a camera).

In some embodiments, the drone 100 may communicate with other remote devices than the terminal 160, and the terminal 160 may also communicate with other remote devices than the drone 100. For example, the drone and/or the terminal 160 may communicate with another drone or a bearer or load of another drone. The additional remote device may be a second terminal or other computing device (such as a computer, desktop, tablet, smartphone, or other mobile device) when desired. The remote device may transmit data to the drone 100, receive data from the drone 100, transmit data to the terminal 160, and/or receive data from the terminal 160. Alternatively, the remote device may be connected to the internet or other telecommunications network to enable data received from the drone 100 and/or the terminal 160 to be uploaded to a website or server.

In some embodiments, the movement of the drone 100, the movement of the carrier 110, and the movement of the load 120 relative to a fixed reference (e.g., an external environment), and/or each other, may be controlled by the terminal 160. The terminal 160 may be a remote control terminal located remotely from the drone, carrier and/or load. The terminal 160 may be located on or affixed to a support platform. Alternatively, the terminal 160 may be hand-held or wearable. For example, the terminal 160 may include a smartphone, a tablet, a desktop, a computer, glasses, gloves, a helmet, a microphone, or any combination thereof. The terminal 160 may include a user interface such as a keyboard, mouse, joystick, touch screen, or display. Any suitable user input may interact with terminal 160 such as manual input commands, voice control, gesture control, or position control (e.g., through movement, position, or tilt of terminal 160).

In the following embodiments, the unmanned aerial vehicle-based ranging method and apparatus and the unmanned aerial vehicle are described separately by taking the load 120 including the shooting apparatus as an example.
