Flight assistance method, device, chip, system and medium for unmanned aerial vehicle

Document No.: 1966919  Publication date: 2021-12-14

Reading note: this technology, "Flight assistance method, device, chip, system and medium for unmanned aerial vehicle", was designed and created by 吕熙敏, 李翔 and 段武阳 on 2020-08-17. Abstract: A flight assistance method, apparatus, chip (800), system (100, 900) and medium for an unmanned aerial vehicle (110, 600, 901), the flight assistance method comprising: acquiring flight state information of the unmanned aerial vehicle (110, 600, 901), wherein the flight state information comprises a flight speed vector of the unmanned aerial vehicle (110, 600, 901) (S201); acquiring obstacle information of a flight scene where the unmanned aerial vehicle (110, 600, 901) is located (S202); and sending the flight state information and the obstacle information to a display device (130, 700, 902) for display, the display of the flight state information and the obstacle information being used for assisting a user in flying the unmanned aerial vehicle (110, 600, 901) (S203).

1. A flight assistance method for an unmanned aerial vehicle, applied to the unmanned aerial vehicle, characterized by comprising the following steps:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

2. The method of claim 1, wherein the obstacle information comprises a relative position between an obstacle in the flight scenario and the UAV.

3. The method of claim 2, wherein the relative position between the obstacle in the flight scenario and the UAV comprises distances between the UAV and the obstacle in three axis directions.
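
As an illustrative aside (not part of the claims), the per-axis distances of claim 3 could be sketched as follows; the shared coordinate frame and the tuple layout are assumptions of this sketch:

```python
def axis_distances(uav_pos, obstacle_pos):
    """Per-axis distances between the UAV and an obstacle.

    uav_pos, obstacle_pos: (x, y, z) tuples expressed in a common
    frame (e.g. the UAV body frame); the frame choice is an
    assumption, the claims do not fix one.
    """
    return tuple(o - u for u, o in zip(uav_pos, obstacle_pos))

# e.g. axis_distances((0.0, 0.0, 10.0), (3.0, -4.0, 10.0))
# -> (3.0, -4.0, 0.0)
```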

4. The method according to claim 2 or 3, wherein the obstacle information further comprises at least one of a type, a shape, and a size of an obstacle in the flight scenario.

5. The method according to any one of claims 1-4, further comprising:

acquiring a scene image of the flight scene;

the transmitting the flight status information and the obstacle information to a display device includes:

the flight state information, the obstacle information and the scene image are sent to the display device, the display device can display the scene image and display the flight state information and the obstacle information on the displayed scene image, and the display of the flight state information and the obstacle information on the scene image is used for displaying the flight state of the unmanned aerial vehicle and the relative position between the obstacle in the flight scene and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

6. The method according to any one of claims 1-4, wherein after obtaining the obstacle information of the flight scene in which the UAV is located, the method further comprises:

establishing a virtual model of the obstacle in the flight scene according to the obstacle information;

the transmitting the flight status information and the obstacle information to a display device includes:

the flight state information, the obstacle information and the virtual model of the obstacle are sent to the display device, the display device can display the flight state information, the obstacle information and the virtual model of the obstacle, and the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

7. The method of claim 6, further comprising:

acquiring a scene image of the flight scene;

the sending the flight status information, the obstacle information, and the virtual model of the obstacle to the display device includes:

transmitting the flight state information, the obstacle information, the virtual model of the obstacle, and the scene image to the display device, wherein the display device is capable of displaying the scene image in a first preset display area and displaying the flight state information and the obstacle information on the displayed scene image, the display of the scene image, the flight state information and the obstacle information in the first preset display area being used for presenting the flight scene, and the flight state of the UAV, the obstacle, and the relative position between the obstacle and the UAV in the flight scene, to the user; and the display device is capable of displaying the virtual model of the obstacle in a second preset display area, the display of the virtual model in the second preset display area being used for presenting the obstacle in the flight scene to the user.

8. The method according to any one of claims 1-7, further comprising:

generating a flight prediction track according to the flight speed vector of the unmanned aerial vehicle;

and sending the flight prediction track to the display device, wherein the display device is capable of displaying the flight prediction track, and the display of the flight prediction track is used for assisting the user in flying the unmanned aerial vehicle.
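
As a hedged sketch (the claims do not specify a prediction model), one simple way to generate a flight prediction track from the flight speed vector is constant-velocity extrapolation; the horizon and step values below are assumptions:

```python
def predict_trajectory(position, velocity, horizon_s=3.0, step_s=0.1):
    """Extrapolate future positions assuming the current flight
    speed vector is held constant (a simplifying assumption).

    position, velocity: (x, y, z) tuples; horizon_s: look-ahead
    time in seconds; step_s: sampling interval.
    Returns a list of predicted (x, y, z) waypoints.
    """
    n = int(horizon_s / step_s)
    return [
        tuple(p + v * step_s * k for p, v in zip(position, velocity))
        for k in range(1, n + 1)
    ]
```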

9. The method of claim 8, further comprising:

and determining, according to the flight prediction track, whether the unmanned aerial vehicle is at risk of colliding with an obstacle in the flight scene, and if so, sending a collision reminding message to the display device and/or a control terminal of the unmanned aerial vehicle.
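
One possible form of the risk determination in claim 9, sketched under the assumption that obstacles are reduced to center points with a safety radius (the claims only require a risk determination from the predicted track):

```python
import math

def collision_risk(trajectory, obstacles, safety_radius=1.0):
    """Return True if any predicted waypoint comes within
    safety_radius of an obstacle center.

    trajectory: list of (x, y, z) predicted waypoints;
    obstacles: list of (x, y, z) obstacle centers;
    safety_radius: assumed clearance in the same units.
    """
    for point in trajectory:
        for obs in obstacles:
            if math.dist(point, obs) <= safety_radius:
                return True
    return False
```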

10. The method of claim 9, further comprising:

if the unmanned aerial vehicle is at risk of colliding with the obstacle, generating a flight obstacle avoidance track according to the relative position between the unmanned aerial vehicle and the obstacle;

and sending the flight obstacle avoidance track to the display device, wherein the display device is capable of displaying the flight obstacle avoidance track, and the display of the flight obstacle avoidance track is used for assisting the user in obstacle avoidance flight of the unmanned aerial vehicle.

11. The method of claim 10, further comprising:

and controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

12. The method of claim 10, further comprising:

determining a target stick amount according to the flight obstacle avoidance track;

acquiring a current stick amount of the control terminal;

determining a stick amount increment according to the current stick amount and the target stick amount;

and sending the stick amount increment to the display device, wherein the display device is capable of displaying the stick amount increment, and the display of the stick amount increment is used for assisting the user in controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

13. The method according to any one of claims 9-12, further comprising:

if the unmanned aerial vehicle is at risk of colliding with the obstacle, acquiring a predicted flight time for the unmanned aerial vehicle to reach the position of collision with the obstacle;

and if the predicted flight time is less than or equal to a preset time threshold value, controlling the unmanned aerial vehicle to brake.
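
The braking condition of claim 13 can be illustrated with a straight-line time-to-collision estimate; the distance-over-speed model and the default threshold value are assumptions of this sketch:

```python
import math

def should_brake(uav_pos, collision_pos, speed, time_threshold_s=2.0):
    """Brake if the predicted time to reach the collision position
    is at or below a threshold.

    uav_pos, collision_pos: (x, y, z) tuples; speed: current speed
    magnitude (m/s); time_threshold_s: assumed preset threshold.
    """
    if speed <= 0.0:
        return False  # hovering or moving away; no braking needed
    time_to_collision = math.dist(uav_pos, collision_pos) / speed
    return time_to_collision <= time_threshold_s
```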

14. The method according to any one of claims 1-13, further comprising:

if it is detected that the unmanned aerial vehicle crosses an obstacle in the flight scene, capturing an obstacle crossing image and/or an obstacle crossing video of the unmanned aerial vehicle through a shooting device on a gimbal of the unmanned aerial vehicle;

the obstacle crossing image and/or the obstacle crossing video is stored locally or sent to a server or a control terminal of the unmanned aerial vehicle, and the server and the control terminal are capable of storing the obstacle crossing image and/or the obstacle crossing video.

15. The method according to any one of claims 1-14, further comprising:

if a change in the direction of the flight speed vector of the unmanned aerial vehicle is detected, controlling the shooting direction of a shooting device on a gimbal of the unmanned aerial vehicle to turn to the direction of the flight speed vector;

sending the scene image shot by the shooting device to the display device, wherein the display device is capable of displaying the scene image shot by the shooting device, the image center of the displayed scene image corresponds to the direction of the flight speed vector of the unmanned aerial vehicle, and the image center is used for assisting the user in determining the direction of the flight speed vector of the unmanned aerial vehicle.

16. A flight assistance method for an unmanned aerial vehicle, applied to a display device of the unmanned aerial vehicle, characterized by comprising the following steps:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and displaying the flight state information and the obstacle information, wherein the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

17. The method of claim 16, wherein the obstacle information comprises a relative position between an obstacle in the flight scenario and the UAV.

18. The method of claim 17, wherein the relative position between the obstacle in the flight scenario and the UAV comprises distances between the UAV and the obstacle in three axis directions.

19. The method according to claim 17 or 18, wherein the obstacle information further comprises at least one of a type, a shape, and a size of an obstacle in the flight scenario.

20. The method according to any one of claims 16-19, further comprising:

acquiring a scene image of the flight scene;

the displaying the flight state information and the obstacle information includes:

and displaying the scene image, and displaying the flight state information and the obstacle information on the displayed scene image, wherein the display of the flight state information and the obstacle information on the scene image is used for showing the flight state of the unmanned aerial vehicle and the relative position between the obstacle in the flight scene and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

21. The method according to any one of claims 16-19, wherein after obtaining the obstacle information of the flight scenario in which the UAV is located, the method further comprises:

acquiring a virtual model of an obstacle in the flight scene;

the displaying the flight state information and the obstacle information includes:

displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

22. The method according to any one of claims 16-19, wherein after obtaining the obstacle information of the flight scenario in which the UAV is located, the method further comprises:

establishing a virtual model of the obstacle in the flight scene according to the obstacle information;

the displaying the flight state information and the obstacle information includes:

displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

23. The method according to claim 21 or 22, further comprising:

acquiring a scene image of the flight scene;

the displaying the flight state information, the obstacle information, and the virtual model includes:

displaying the scene image in a first preset display area, and displaying the flight state information and the obstacle information on the displayed scene image, wherein the scene image, the flight state information and the obstacle information in the first preset display area are used for displaying the flight scene, the flight state of the unmanned aerial vehicle in the flight scene, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user;

displaying the virtual model of the obstacle in a second preset display area, wherein the display of the virtual model in the second preset display area is used for displaying the obstacle in the flight scene to the user.

24. The method according to any one of claims 16-23, wherein displaying the flight status information comprises:

acquiring the projection of the flight speed vector on a preset coordinate plane;

displaying the projection, wherein the display of the projection is used for assisting the user in determining the direction of the flight speed vector.

25. The method of claim 24, wherein the coordinate plane comprises a YOZ plane of a body coordinate system of the UAV or a YOZ plane of a geodetic coordinate system.

26. The method of claim 24 or 25, wherein said displaying the projection comprises:

and displaying the projection of the flight speed vector on the coordinate plane through a preset vector marker, wherein the projection is a line connecting the center of the vector marker and the origin of coordinates on the coordinate plane, and the center of the vector marker is used for assisting the user in determining the direction of the flight speed vector.
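
In a body coordinate system whose X axis points forward, the YOZ-plane projection described in claims 24-26 amounts to dropping the X component; this sketch (not part of the claims) assumes that frame convention:

```python
def yoz_projection(velocity):
    """Project the flight speed vector onto the YOZ plane of the
    chosen coordinate system by dropping the X component.

    velocity: (vx, vy, vz) tuple. The on-screen indicator line
    then runs from the coordinate origin to this projected point
    (the vector-marker center).
    """
    _, vy, vz = velocity
    return (vy, vz)
```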

27. The method of claim 26, wherein the vector marker is located at the origin of coordinates when the magnitude of the flight speed vector is less than or equal to a preset first speed threshold.

28. The method of claim 26 or 27, wherein the size of the vector marker decreases as the magnitude of the flight speed vector increases.

29. The method according to any of claims 26-28, wherein the size of the vector marker is a preset maximum size in a case that the magnitude of the flight speed vector is less than or equal to a preset second speed threshold; and the size of the vector marker is a preset minimum size in a case that the magnitude of the flight speed vector is greater than or equal to a preset third speed threshold.
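
Claims 27-29 constrain the marker size at the speed thresholds but not in between; the sketch below (not part of the claims) assumes a linear interpolation between the thresholds, and all numeric defaults are made-up values:

```python
def marker_size(speed, v2=2.0, v3=10.0, max_size=40.0, min_size=10.0):
    """Vector-marker size as a function of flight speed magnitude.

    Clamped to max_size at or below the second speed threshold v2
    and to min_size at or above the third threshold v3; the linear
    ramp in between is an assumption of this sketch.
    """
    if speed <= v2:
        return max_size
    if speed >= v3:
        return min_size
    frac = (speed - v2) / (v3 - v2)  # 0.0 at v2, 1.0 at v3
    return max_size - frac * (max_size - min_size)
```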

30. The method of any of claims 26-29, wherein the vector marker is a vector sphere or a vector triangle.

31. The method according to any one of claims 16-30, further comprising:

and acquiring and displaying a flight prediction track of the unmanned aerial vehicle, wherein the display of the flight prediction track is used for assisting the user in flying the unmanned aerial vehicle.

32. The method according to any one of claims 16-31, further comprising:

and acquiring and displaying a flight obstacle avoidance track of the unmanned aerial vehicle, wherein the flight obstacle avoidance track is used for assisting the user in carrying out obstacle avoidance flight of the unmanned aerial vehicle.

33. The method of claim 32, further comprising:

acquiring a stick amount increment corresponding to the flight obstacle avoidance track;

and displaying the stick amount increment, wherein the display of the stick amount increment is used for assisting the user in controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

34. An unmanned aerial vehicle, comprising: a processor and a memory, wherein the memory is configured to store instructions, and the processor is configured to invoke the instructions stored in the memory to perform the following operations:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

35. The UAV of claim 34 wherein the obstacle information includes a relative position between an obstacle in the flight scene and the UAV.

36. The UAV of claim 35 wherein the relative position between an obstacle in the flight scenario and the UAV comprises a distance between three-axis directions of the UAV and the obstacle.

37. The UAV according to claim 35 or 36 wherein the obstacle information further comprises at least one of a type, a shape, and a size of an obstacle in the flight scenario.

38. The unmanned aerial vehicle of any one of claims 34-37, wherein the unmanned aerial vehicle further comprises a shooting device; the shooting device is configured to:

acquiring a scene image of the flight scene;

the processor is specifically configured to:

the flight state information, the obstacle information and the scene image are sent to the display device, the display device can display the scene image and display the flight state information and the obstacle information on the displayed scene image, and the display of the flight state information and the obstacle information on the scene image is used for displaying the flight state of the unmanned aerial vehicle and the relative position between the obstacle in the flight scene and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

39. The UAV of any of claims 34-37 wherein the processor, prior to sending the flight status information and the obstacle information to a display device, is further configured to:

establishing a virtual model of the obstacle in the flight scene according to the obstacle information;

when the processor sends the flight status information and the obstacle information to a display device, the processor is specifically configured to:

the flight state information, the obstacle information and the virtual model of the obstacle are sent to the display device, the display device can display the flight state information, the obstacle information and the virtual model of the obstacle, and the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

40. The UAV of claim 39, further comprising a shooting device; the shooting device is configured to:

acquiring a scene image of the flight scene;

the processor is specifically configured to:

transmitting the flight state information, the obstacle information, the virtual model of the obstacle, and the scene image to the display device, wherein the display device is capable of displaying the scene image in a first preset display area and displaying the flight state information and the obstacle information on the displayed scene image, the display of the scene image, the flight state information and the obstacle information in the first preset display area being used for presenting the flight scene, and the flight state of the UAV, the obstacle, and the relative position between the obstacle and the UAV in the flight scene, to the user; and the display device is capable of displaying the virtual model of the obstacle in a second preset display area, the display of the virtual model in the second preset display area being used for presenting the obstacle in the flight scene to the user.

41. The UAV of any one of claims 34-40 wherein the processor is further configured to:

generating a flight prediction track according to the flight speed vector of the unmanned aerial vehicle;

and sending the flight prediction track to the display device, wherein the display device is capable of displaying the flight prediction track, and the display of the flight prediction track is used for assisting the user in flying the unmanned aerial vehicle.

42. The UAV of claim 41, wherein the processor is further configured to:

and determining, according to the flight prediction track, whether the unmanned aerial vehicle is at risk of colliding with an obstacle in the flight scene, and if so, sending a collision reminding message to the display device and/or a control terminal of the unmanned aerial vehicle.

43. The UAV of claim 42 wherein the processor is further configured to:

if the unmanned aerial vehicle is at risk of colliding with the obstacle, generating a flight obstacle avoidance track according to the relative position between the unmanned aerial vehicle and the obstacle;

and sending the flight obstacle avoidance track to the display device, wherein the display device is capable of displaying the flight obstacle avoidance track, and the display of the flight obstacle avoidance track is used for assisting the user in obstacle avoidance flight of the unmanned aerial vehicle.

44. The UAV of claim 43, wherein the processor is further configured to:

and controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

45. The UAV of claim 43, wherein the processor is further configured to:

determining a target stick amount according to the flight obstacle avoidance track;

acquiring a current stick amount of the control terminal;

determining a stick amount increment according to the current stick amount and the target stick amount;

and sending the stick amount increment to the display device, wherein the display device is capable of displaying the stick amount increment, and the display of the stick amount increment is used for assisting the user in obstacle avoidance flight of the unmanned aerial vehicle.

46. The UAV of any one of claims 42-45 wherein the processor is further configured to:

if the unmanned aerial vehicle is at risk of colliding with the obstacle, acquiring a predicted flight time for the unmanned aerial vehicle to reach the position of collision with the obstacle;

and if the predicted flight time is less than or equal to a preset time threshold value, controlling the unmanned aerial vehicle to brake.

47. The UAV according to any one of claims 34-46, wherein the UAV further comprises a shooting device; the processor is further configured to:

if it is detected that the unmanned aerial vehicle crosses an obstacle in the flight scene, sending a shooting instruction to the shooting device;

the shooting device is used for:

capturing an obstacle crossing image and/or an obstacle crossing video of the unmanned aerial vehicle in response to a capturing instruction of the processor; wherein the obstacle crossing image and/or the obstacle crossing video are stored locally or in a server or in a control terminal of the unmanned aerial vehicle.

48. The UAV of any one of claims 34-47 wherein the processor is further configured to:

if a change in the direction of the flight speed vector of the unmanned aerial vehicle is detected, controlling the shooting direction of a shooting device on a gimbal of the unmanned aerial vehicle to turn to the direction of the flight speed vector;

sending the scene image shot by the shooting device to the display device, wherein the display device is capable of displaying the scene image shot by the shooting device, the image center of the displayed scene image corresponds to the direction of the flight speed vector of the unmanned aerial vehicle, and the image center is used for assisting the user in determining the direction of the flight speed vector of the unmanned aerial vehicle.

49. A display device, comprising: a processor and a display;

the processor is configured to acquire flight state information of an unmanned aerial vehicle and acquire obstacle information of a flight scene where the unmanned aerial vehicle is located, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

the display is configured to display the flight state information and the obstacle information, wherein the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

50. The apparatus of claim 49, wherein the obstacle information comprises a relative position between an obstacle in the flight scenario and the UAV.

51. The apparatus of claim 50, wherein the relative position between the obstacle in the flight scenario and the UAV comprises a distance between three-axis directions of the UAV and the obstacle.

52. The apparatus of claim 50 or 51, wherein the obstacle information further comprises at least one of a type, a shape, and a size of an obstacle in the flight scenario.

53. The device of any one of claims 49-52, wherein the processor is further configured to:

acquiring a scene image of the flight scene;

the display device is specifically configured to:

and displaying the scene image, and displaying the flight state information and the obstacle information on the displayed scene image, wherein the display of the flight state information and the obstacle information on the scene image is used for showing the flight state of the unmanned aerial vehicle and the relative position between the obstacle in the flight scene and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

54. The device of any one of claims 49-52, wherein the processor is further configured to:

acquiring a virtual model of an obstacle in the flight scene;

the display device is specifically configured to:

displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

55. The device of any one of claims 49-52, wherein the processor is further configured to:

establishing a virtual model of the obstacle in the flight scene according to the obstacle information;

the display device is specifically configured to:

displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

56. The apparatus according to claim 54 or 55, wherein the processor is further configured to:

acquiring a scene image of the flight scene;

the display device is specifically configured to:

displaying the scene image in a first preset display area, and displaying the flight state information and the obstacle information on the displayed scene image, wherein the scene image, the flight state information and the obstacle information in the first preset display area are used for displaying the flight scene, the flight state of the unmanned aerial vehicle in the flight scene, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to the user;

displaying the virtual model of the obstacle in a second preset display area, wherein the display of the virtual model in the second preset display area is used for displaying the obstacle in the flight scene to the user.

57. The device of any one of claims 49-56, wherein the processor is further configured to:

acquiring the projection of the flight speed vector on a preset coordinate plane;

the display device is specifically configured to:

displaying the projection, wherein the display of the projection is used for assisting the user in determining the direction of the flight speed vector.

58. The apparatus of claim 57, wherein the coordinate plane comprises a YOZ plane of a body coordinate system of the UAV or a YOZ plane of a geodetic coordinate system.

59. The apparatus according to claim 57 or 58, wherein the display device is specifically configured to:

and displaying the projection of the flight speed vector on the coordinate plane through a preset vector marker, wherein the projection is a line connecting the center of the vector marker and the origin of coordinates on the coordinate plane, and the center of the vector marker is used for assisting the user in determining the direction of the flight speed vector.

60. The apparatus as claimed in claim 59, wherein the vector marker is located at the origin of coordinates in the event that the magnitude of the flight speed vector is less than or equal to a preset first speed threshold.

61. The apparatus of claim 59 or 60, wherein the size of the vector marker decreases as the magnitude of the flight speed vector increases.

62. The apparatus of any of claims 59-61, wherein the size of the vector marker is a preset maximum size in the event that the magnitude of the flight speed vector is less than or equal to a preset second speed threshold; and the size of the vector marker is a preset minimum size in the event that the magnitude of the flight speed vector is greater than or equal to a preset third speed threshold.

63. The apparatus of any of claims 59-62, wherein the vector marker is a vector sphere or a vector triangle.

64. The apparatus of any one of claims 49-63, wherein the processor is further configured to:

acquiring a flight prediction track of the unmanned aerial vehicle;

the display device is further configured to:

and displaying the flight prediction track, wherein the display of the flight prediction track is used for assisting the user to fly the unmanned aerial vehicle.

65. The device of any one of claims 49-64, wherein the processor is further configured to:

acquiring a flight obstacle avoidance track of the unmanned aerial vehicle;

the display device is further configured to:

and displaying the flight obstacle avoidance track, wherein the flight obstacle avoidance track is used for assisting the user in carrying out obstacle avoidance flight of the unmanned aerial vehicle.

66. The apparatus according to claim 65, wherein the processor is further configured to:

acquire a stick input increment corresponding to the obstacle avoidance flight trajectory;

the display device is further configured to:

display the stick input increment, wherein the display of the stick input increment is used for assisting the user in controlling the unmanned aerial vehicle to fly along the obstacle avoidance flight trajectory.

67. A chip, comprising: a transceiver, a memory, and a processor;

the transceiver is configured to transmit and receive data;

the memory is configured to store program instructions;

the processor is configured to invoke the program instructions in the memory and execute the method of any one of claims 1-15 or any one of claims 16-33 based on data received by the transceiver.

68. An unmanned aerial vehicle system, comprising: the unmanned aerial vehicle of any of claims 34-48, the display device of any of claims 49-66, and a control terminal for controlling flight of the unmanned aerial vehicle.

69. A computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method of any of claims 1-15 or any of claims 16-33.

Technical Field

The present application relates to the field of unmanned aerial vehicle technologies, and in particular, to a flight assistance method, device, chip, system, and medium for an unmanned aerial vehicle.

Background

An unmanned aerial vehicle typically carries a photographing device, which may be fitted with, for example, a wide-angle lens. By observing on a display device the field-of-view image captured through the wide-angle lens, the user can fly the unmanned aerial vehicle from a first-person perspective.

When flying the unmanned aerial vehicle from a first-person perspective, the user usually has to control the unmanned aerial vehicle to avoid obstacles by relying on personal perception and proficiency. For a user without flight experience, however, controlling the unmanned aerial vehicle to avoid obstacles in this way is very difficult.

Disclosure of Invention

Embodiments of the present application provide a flight assistance method, device, chip, system and medium for an unmanned aerial vehicle, which are used to assist a user in flying the unmanned aerial vehicle and to reduce the difficulty of operating the unmanned aerial vehicle.

In a first aspect, an embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle, where the flight assistance method is applied to the unmanned aerial vehicle, and the method includes:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In a second aspect, an embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle, which is applied to a display device of the unmanned aerial vehicle, and the method includes:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and displaying the flight state information and the obstacle information, wherein the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In a third aspect, an embodiment of the present application provides an unmanned aerial vehicle, including: a processor and a memory, the memory to store instructions, the processor to invoke the memory-stored instructions to perform the following:

acquiring flight state information of an unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located;

and sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In a fourth aspect, an embodiment of the present application provides a display device, including: a processor and a display device;

the processor is used for acquiring flight state information of the unmanned aerial vehicle and acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle;

the display device is used for displaying the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In a fifth aspect, an embodiment of the present application provides a chip, including: a transceiver, a memory, and a processor;

the transceiver is configured to transmit and receive data;

the memory is configured to store program instructions;

the processor is configured to invoke the program instructions in the memory and, based on data received by the transceiver, execute the flight assistance method for the unmanned aerial vehicle according to the first aspect or the flight assistance method for the unmanned aerial vehicle according to the second aspect.

In a sixth aspect, an embodiment of the present application provides an unmanned aerial vehicle system, including the unmanned aerial vehicle according to the embodiment of the present application in the third aspect, the display device according to the embodiment of the fourth aspect, and a control terminal, where the control terminal is configured to control flight of the unmanned aerial vehicle.

In a seventh aspect, the present application provides a computer-readable storage medium, including instructions that, when executed on a computer, cause the computer to perform the method for flight assistance for an unmanned aerial vehicle according to the first aspect or the method for flight assistance for an unmanned aerial vehicle according to the second aspect.

In an eighth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform a method of flight assistance for an unmanned aerial vehicle as described in the first aspect above or a method of flight assistance for an unmanned aerial vehicle as described in the second aspect above.

According to the flight assistance method, device, chip, system and medium for an unmanned aerial vehicle provided by the embodiments of the present application, the unmanned aerial vehicle acquires its flight state information and the obstacle information of the flight scene in which it is located, and sends the flight state information and the obstacle information to the display device for display. The flight state information includes a flight speed vector, which indicates the magnitude and direction of the flight speed of the unmanned aerial vehicle, and the display of the flight state information and the obstacle information is used to assist the user in flying the unmanned aerial vehicle. With the assistance of the displayed flight state information and obstacle information, the user can grasp the flight state of the unmanned aerial vehicle more accurately, in particular its flight speed, while also learning the obstacle situation of the flight scene, and can therefore operate the unmanned aerial vehicle to avoid obstacles, reducing the difficulty of operating the unmanned aerial vehicle.

Drawings

Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;

fig. 2 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to an embodiment of the present application;

fig. 3a is a schematic view of a regional display of a scene image, flight state information, obstacle information, and a virtual model of an obstacle in a flight assistance method for an unmanned aerial vehicle according to an embodiment of the present application;

fig. 3b is a schematic view of a regional display of a scene image, flight status information, obstacle information, and a virtual model of an obstacle in a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application;

fig. 4 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application;

fig. 5a is a schematic diagram of a flight speed vector displayed by a vector sphere and its projection in a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application;

fig. 5b is a schematic diagram of a flight speed vector displayed by a vector sphere and its projection in a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application;

fig. 5c is a schematic diagram of a flight speed vector displayed by a vector sphere and its projection in a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application;

fig. 6 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present application;

Fig. 7 is a schematic structural diagram of a display device according to an embodiment of the present application;

fig. 8 is a schematic structural diagram of a chip according to an embodiment of the present application;

fig. 9 is a schematic structural diagram of an unmanned aerial vehicle system according to an embodiment of the present application.

Detailed Description

The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.

It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When a component is referred to as being "connected" to another component, it can be directly connected to the other component or intervening components may also be present.

Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

The flight assistance method, device, chip, system and medium for an unmanned aerial vehicle provided by the embodiments of the present application can be applied during the flight of an unmanned aerial vehicle, where the unmanned aerial vehicle may fly with the aid of an external force or under its own power, such as a traversing machine (FPV racing drone). Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 1, fig. 1 illustrates an unmanned aerial vehicle system 100, where the unmanned aerial vehicle system 100 includes an unmanned aerial vehicle 110, a display device 130, and a control terminal 140, and the unmanned aerial vehicle 110 may wirelessly communicate with the display device 130 and the control terminal 140.

Unmanned aerial vehicle 110 includes, among other things, a power system 150, a flight control system 160, a frame, and a pan/tilt head 120 carried on the frame.

The power system 150 may include one or more electronic speed controllers (abbreviated as ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein each motor 152 is connected between an electronic speed controller 151 and a propeller 153, and the motors 152 and the propellers 153 are disposed on the arms of the unmanned aerial vehicle 110. The electronic speed controller 151 is configured to receive a drive signal generated by the flight control system 160 and provide a drive current to the motor 152 based on the drive signal to control the rotational speed of the motor 152. The motor 152 is used to drive the propeller to rotate, thereby providing power for the flight of the unmanned aerial vehicle 110 and enabling the unmanned aerial vehicle 110 to achieve one or more degrees of freedom of motion. In certain embodiments, the unmanned aerial vehicle 110 may rotate about one or more axes of rotation, which may include, for example, a roll axis, a yaw axis, and a pitch axis. It should be understood that the motor 152 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor.

The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 is used to measure attitude information of the unmanned aerial vehicle 110, i.e., position information and state information of the unmanned aerial vehicle 110 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity. The sensing system 162 may include, for example, at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer. For example, the global navigation satellite system may be the Global Positioning System (GPS). The flight controller 161 is used to control the flight of the unmanned aerial vehicle 110; for example, the flight may be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the unmanned aerial vehicle 110 according to preset program instructions, or may control the unmanned aerial vehicle 110 in response to one or more control signals from the control terminal 140.

The pan/tilt head 120 may include a motor 122. The pan/tilt head 120 is used to carry a load, which may be, for example, the photographing device 123. The flight controller 161 may control the movement of the pan/tilt head 120 via the motor 122. Optionally, as another embodiment, the pan/tilt head 120 may further include a controller for controlling the movement of the pan/tilt head 120 by controlling the motor 122. It should be understood that the pan/tilt head 120 may be independent of the unmanned aerial vehicle 110, or may be part of the unmanned aerial vehicle 110. It should be understood that the motor 122 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor. It should also be understood that the pan/tilt head 120 may be located at the top of the unmanned aerial vehicle 110 or at the bottom of the unmanned aerial vehicle 110.

The photographing device 123 may be, for example, a device for capturing an image such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller 161 and perform photographing under the control of the flight controller 161. The image capturing Device 123 of this embodiment at least includes a photosensitive element, such as a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge-coupled Device (CCD) sensor. It is understood that the camera 123 may be directly fixed to the unmanned aerial vehicle 110, and thus the pan/tilt head 120 may be omitted.

The display device 130 is located at the ground end of the unmanned aerial vehicle system 100, can communicate with the unmanned aerial vehicle 110 wirelessly, and can be used to display attitude information of the unmanned aerial vehicle 110. In addition, images captured by the photographing device 123 may also be displayed on the display device 130. It should be understood that the display device 130 may be a stand-alone device (e.g., a head-mounted display device) or may be integrated into the control terminal 140.

Control terminal 140 is located at the ground end of unmanned aerial vehicle system 100 and may wirelessly communicate with unmanned aerial vehicle 110 for remote maneuvering of unmanned aerial vehicle 110.

It should be understood that the above-mentioned nomenclature for the components of the unmanned aerial vehicle system is for identification purposes only, and should not be construed as limiting the embodiments of the present application.

During flight, the unmanned aerial vehicle flies according to control instructions received from the control terminal and, at the same time, sends the images captured by the photographing device to the display device. The display device outputs a scene image from a first-person perspective with the unmanned aerial vehicle as the observer, which is usually a three-dimensional live view of the flight scene in which the unmanned aerial vehicle is located, and the user controls the unmanned aerial vehicle to avoid obstacles based on this first-person scene image and personal experience. For users of unmanned aerial vehicles, and especially for users without experience flying traversing machines, relying only on the first-person scene image makes the unmanned aerial vehicle difficult both to get started with and to operate.

The flight assistance method, device, chip, system and medium for an unmanned aerial vehicle provided by the embodiments of the present application acquire the flight state information of the unmanned aerial vehicle and the obstacle information of the flight scene in which the unmanned aerial vehicle is located, and have the display device display the flight state information and the obstacle information, wherein the flight state information includes a flight speed vector and the display of the flight state information and the obstacle information is used to assist the user in flying the unmanned aerial vehicle. Displaying the flight state information and the obstacle information helps the user grasp the flight state of the unmanned aerial vehicle and the obstacle situation of the flight scene, so that the user can more accurately control the unmanned aerial vehicle to avoid obstacles (including obstacle-crossing flight). This effectively reduces the difficulty of getting started with and operating the unmanned aerial vehicle, without making its flight lose its challenge.

Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.

Fig. 2 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to an embodiment of the present application. As shown in fig. 2, the method of this embodiment may be applied to an unmanned aerial vehicle and may include:

s201, acquiring flight state information of the unmanned aerial vehicle, wherein the flight state information of the unmanned aerial vehicle comprises a flight speed vector of the unmanned aerial vehicle.

In this embodiment, when it is detected that the unmanned aerial vehicle is in a flight state, or when a flight assistance request of a user is received, or when it is detected that a preset flight assistance mode of the unmanned aerial vehicle is turned on, the flight state information of the unmanned aerial vehicle may be acquired, for example, through a sensing system (e.g., the sensing system 162 in fig. 1) on the unmanned aerial vehicle. Note that the flight state information is the real-time flight state information of the unmanned aerial vehicle.

The flight state information of the unmanned aerial vehicle comprises a flight speed vector of the unmanned aerial vehicle, and the flight speed vector is used for indicating the magnitude and the direction of the flight speed of the unmanned aerial vehicle.

Optionally, the flight speed of the unmanned aerial vehicle may include the linear velocity at which the unmanned aerial vehicle is flying.

Optionally, the flight speed of the unmanned aerial vehicle may include an angular velocity and/or an acceleration of the unmanned aerial vehicle flight in addition to the linear velocity of the unmanned aerial vehicle flight.

Optionally, in addition to the flight velocity vector of the unmanned aerial vehicle, the flight state information of the unmanned aerial vehicle may further include a flight altitude, a real-time position, and the like of the unmanned aerial vehicle.
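As a concrete illustration, the flight state information described above could be bundled as follows; the `FlightState` structure, its field names, and the sample values are hypothetical and not taken from the source:

```python
import math
from dataclasses import dataclass
from typing import Tuple

@dataclass
class FlightState:
    # Hypothetical container for the flight state information:
    # the flight speed vector plus the optional altitude and real-time position.
    velocity: Tuple[float, float, float]  # flight speed vector (m/s)
    altitude: float                       # flight altitude (m)
    position: Tuple[float, float, float]  # real-time position

    def speed(self) -> float:
        # The flight speed vector encodes both magnitude and direction;
        # the magnitude is the Euclidean norm of the velocity components.
        return math.sqrt(sum(c * c for c in self.velocity))

state = FlightState(velocity=(3.0, 4.0, 0.0), altitude=50.0,
                    position=(0.0, 0.0, 50.0))
print(state.speed())  # → 5.0
```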

The flight assistance mode may be turned on or off manually by the user; for example, the user may choose to turn the flight assistance mode on or off according to the difficulty of flying the unmanned aerial vehicle in the current flight scenario. The flight assistance mode may also be turned on or off by a flight control system of the unmanned aerial vehicle (e.g., the flight control system 160 in fig. 1); for example, the flight control system controls the mode according to the user's flight record information (e.g., flight duration, flight score), turning the flight assistance mode on when the flight duration is less than a preset flight duration threshold or the flight score is less than a preset flight score threshold, and off otherwise.
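The threshold logic in the last sentence can be sketched as follows; the function name and the concrete threshold values are illustrative assumptions, not from the source:

```python
def should_enable_assistance(flight_duration_h: float, flight_score: float,
                             duration_threshold_h: float = 10.0,
                             score_threshold: float = 60.0) -> bool:
    # Enable the flight assistance mode when the user's flight duration is
    # below the preset duration threshold OR the flight score is below the
    # preset score threshold; disable it otherwise.
    return (flight_duration_h < duration_threshold_h
            or flight_score < score_threshold)

print(should_enable_assistance(2.0, 90.0))   # inexperienced pilot → True
print(should_enable_assistance(50.0, 95.0))  # experienced pilot → False
```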

S202, obtaining obstacle information of a flight scene where the unmanned aerial vehicle is located.

In this embodiment, when it is detected that the unmanned aerial vehicle is in a flight state, or when a flight assistance request of a user is received, or when it is detected that the flight assistance mode of the unmanned aerial vehicle is turned on, the obstacle information of the flight scene in which the unmanned aerial vehicle is located may be acquired through the sensing system on the unmanned aerial vehicle. For example, a radar device in the sensing system may measure whether an obstacle exists near the unmanned aerial vehicle and, if so, further measure the orientation of the obstacle relative to the unmanned aerial vehicle. As another example, a scene image of the flight scene may be captured by a photographing device (e.g., the photographing device 123 in fig. 1), and whether an obstacle exists near the unmanned aerial vehicle may be determined by performing obstacle recognition on the scene image. The flight scene in which the unmanned aerial vehicle is located may refer to a three-dimensional space range of a preset size centered on the unmanned aerial vehicle, for example, a sphere whose radius is a preset distance and whose center is the unmanned aerial vehicle. The flight scene may also refer to the spatial range in the current speed direction of the unmanned aerial vehicle; in other words, the obstacle information of the flight scene may be obstacle information in the current speed direction of the unmanned aerial vehicle. An obstacle may be a dynamic object or a static object, such as a mountain, a bridge, or a flying bird.
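A minimal sketch of the two scene definitions mentioned above, i.e., a sphere of preset radius around the vehicle, optionally restricted to the current speed direction; the radius and angular tolerance are assumed parameters chosen for illustration:

```python
import math

def obstacles_in_scene(uav_pos, velocity, obstacles,
                       radius=30.0, half_angle_deg=45.0):
    """Return (obstacle, distance) pairs for obstacles inside a sphere of
    `radius` around the UAV that also lie within `half_angle_deg` of the
    current speed direction (both thresholds are illustrative)."""
    vx, vy, vz = velocity
    v_norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    kept = []
    for obs in obstacles:
        dx, dy, dz = (obs[i] - uav_pos[i] for i in range(3))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if dist == 0.0 or dist > radius:
            continue  # outside the preset spherical range
        if v_norm > 0.0:
            # Angle between the speed direction and the direction to the obstacle.
            cos_a = (dx * vx + dy * vy + dz * vz) / (dist * v_norm)
            angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
            if angle > half_angle_deg:
                continue  # not in the current speed direction
        kept.append((obs, dist))
    return kept

print(obstacles_in_scene((0, 0, 0), (1, 0, 0),
                         [(10, 0, 0), (-10, 0, 0), (100, 0, 0)]))
# → [((10, 0, 0), 10.0)]
```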

It should be noted that the execution order of the operations S201 and S202 described above may be interchanged.

S203, sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In this embodiment, the acquired flight state information and obstacle information may be sent to a display device (for example, the display device 130 in fig. 1), and the display device displays them after receiving them. The user can then view the flight state information and the obstacle information on the display device and, with their assistance, adjust the flight attitude of the unmanned aerial vehicle (such as the flight direction, flight speed, and flight altitude), so as to perform obstacle avoidance flight with greater confidence. To ensure the flight assistance effect, the flight state information and the obstacle information can be acquired synchronously and sent together to the display device for display.
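To keep the two pieces of information synchronized, they might be sampled at the same moment and bundled into one message before transmission; the message format below is purely illustrative and not specified by the source:

```python
import json
import time

def build_assist_message(flight_state: dict, obstacle_info: list) -> bytes:
    # A single timestamp ties the synchronously acquired flight state and
    # obstacle information together, so the display device always renders a
    # consistent snapshot of both.
    payload = {
        "timestamp": time.time(),
        "flight_state": flight_state,
        "obstacles": obstacle_info,
    }
    return json.dumps(payload).encode("utf-8")

msg = build_assist_message(
    {"velocity": [3.0, 4.0, 0.0], "altitude": 50.0},
    [{"id": "A", "distance_m": 10.0}],
)
print(json.loads(msg)["obstacles"][0]["distance_m"])  # → 10.0
```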

In this embodiment, the display of the flight state information by the display device includes the display of the flight speed vector of the unmanned aerial vehicle. The user can therefore grasp the magnitude and direction of the flight speed of the unmanned aerial vehicle intuitively and accurately from the displayed flight speed vector, instead of judging the flight speed only by visual impression, and can then adjust the speed and direction of the unmanned aerial vehicle according to the displayed flight speed vector and the obstacle information, thereby achieving obstacle avoidance flight.

According to the flight assistance method for an unmanned aerial vehicle provided by this embodiment, the flight state information and the obstacle information of the flight scene in which the unmanned aerial vehicle is located are acquired and sent to the display device, so that the display device can display them, and the displayed flight state information (in particular the flight speed vector) and obstacle information can effectively assist the user in flying the unmanned aerial vehicle. A user without flight experience can become familiar with flying the unmanned aerial vehicle more quickly with this assistance, and a user with flight experience can likewise improve his or her flying ability more quickly.

In some embodiments, the obstacle information of the flight scene in which the unmanned aerial vehicle is located includes the relative position between an obstacle in the flight scene and the unmanned aerial vehicle.

Optionally, the relative position between the obstacle and the unmanned aerial vehicle may be the distance between the obstacle and the origin of the body coordinate system of the unmanned aerial vehicle.

Optionally, the relative position between the obstacle and the unmanned aerial vehicle may be the distance between the obstacle and one or more predetermined key positions on the unmanned aerial vehicle; for example, the one or more key positions may include one or more of the nose, the propellers, and the tail of the unmanned aerial vehicle.

Optionally, a cuboid model corresponding to the unmanned aerial vehicle may be pre-constructed, and the relative position between the obstacle and the unmanned aerial vehicle may include the distance between the obstacle and each side or each face of the cuboid model.

Optionally, the relative position between the obstacle and the unmanned aerial vehicle may include the distances between the obstacle and the three axes of the unmanned aerial vehicle (i.e., the roll axis, the yaw axis, and the pitch axis).
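For the cuboid model variant mentioned above, the distance from an obstacle point to the faces of an axis-aligned cuboid bounding the vehicle can be computed with a standard point-to-box distance; treating the cuboid as axis-aligned is an assumption made for this sketch:

```python
import math

def distance_to_cuboid(point, box_min, box_max):
    # Per-axis gap between the obstacle point and the cuboid extent; the gap
    # is zero on any axis where the point lies within the box on that axis.
    gaps = [max(box_min[i] - point[i], 0.0, point[i] - box_max[i])
            for i in range(3)]
    return math.sqrt(sum(g * g for g in gaps))

# Cuboid model of the UAV: a 2 m x 2 m x 2 m box centered at the origin.
print(distance_to_cuboid((5.0, 0.0, 0.0), (-1, -1, -1), (1, 1, 1)))  # → 4.0
print(distance_to_cuboid((0.0, 0.0, 0.0), (-1, -1, -1), (1, 1, 1)))  # → 0.0
```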

When the obstacle information includes the relative position between the obstacle and the unmanned aerial vehicle, one possible implementation of S203 is: sending the flight speed vector of the unmanned aerial vehicle and the relative position between the obstacle and the unmanned aerial vehicle to the display device, so that the display device can display both. In this way, this embodiment displays the flight speed vector of the unmanned aerial vehicle together with the relative position between the obstacle and the unmanned aerial vehicle, providing the user with more intuitive flight state information and obstacle information and improving the flight assistance effect.

In some embodiments, a scene image of the flight scene may also be acquired, for example through a photographing device on the unmanned aerial vehicle. The scene image includes a scene image from a first-person perspective with the unmanned aerial vehicle as the observer, and the range of the scene image is related to the field of view of the photographing device, which is not limited herein.

After the scene image of the flight scene is acquired, one possible implementation of S203 is: sending the flight state information of the unmanned aerial vehicle, the obstacle information of the flight scene, and the scene image to the display device, so that the display device can display the scene image and overlay the flight state information and the obstacle information on it. Displaying the flight state information and the obstacle information on the scene image shows the user the flight state of the unmanned aerial vehicle (in particular its flight speed) and the relative position between the obstacles in the flight scene and the unmanned aerial vehicle, so as to assist the user in obstacle avoidance flight.

In this embodiment, after the display device receives the flight state information of the unmanned aerial vehicle, the obstacle information of the flight scene where the unmanned aerial vehicle is located, and the scene image of the flight scene, the flight state information and the relative position between the obstacle and the unmanned aerial vehicle are displayed on the displayed scene image. The display device may display the flight status information at a designated position on the scene image (e.g., a screen center of the scene image so that the user notices the flight status information), and may display a relative position between the obstacle and the unmanned aerial vehicle near the corresponding obstacle on the scene image. For example, when the distance between the obstacle a and the nose of the unmanned aerial vehicle is 10 meters, the distance may be displayed on the obstacle a on the scene image. For another example, when the unmanned aerial vehicle is traversing a traversable passage in the obstacle B (e.g., the obstacle B is a bridge body, and the traversable passage of the obstacle B is a bridge opening), a distance between each face of the traversable passage of the obstacle B and the unmanned aerial vehicle may be displayed on each face of the scene image (e.g., a distance between two side faces and a top face of the bridge opening, respectively, and the unmanned aerial vehicle is displayed). Therefore, the flight state information and the obstacle information of the unmanned aerial vehicle are more vividly and more intuitively displayed for the user, and the flight auxiliary effect of the unmanned aerial vehicle is improved.
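As an illustrative sketch only (not part of the patent), placing a distance label near the corresponding obstacle on the scene image can be done by projecting the obstacle's position in the camera frame into pixel coordinates; the pinhole intrinsics `fx`, `fy`, `cx`, `cy` and all names below are hypothetical:

```python
import math

def project_to_image(p_cam, fx, fy, cx, cy):
    """Project a 3D point in the camera frame (x right, y down, z forward)
    to pixel coordinates with a simple pinhole model."""
    x, y, z = p_cam
    if z <= 0:
        return None  # behind the camera: no on-screen label
    return (fx * x / z + cx, fy * y / z + cy)

def distance_label(p_cam):
    """Distance from the vehicle (camera origin) to the obstacle point,
    formatted for display next to the obstacle on the scene image."""
    return f"{math.hypot(*p_cam):.1f} m"
```

For an obstacle 10 meters straight ahead, the label lands at the image center, matching the example of displaying the 10-meter distance on obstacle A.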

In this embodiment, when the flight state information includes the flight speed vector, the user may adjust the flight speed vector displayed on the display device by adjusting the flight speed of the unmanned aerial vehicle until the flight speed vector displayed on the display device meets the requirements of the user. For example, during crossing flight, the flight speed of the unmanned aerial vehicle can be adjusted by a user, so that the direction of the flight speed vector displayed on the display device is aligned with the traversable passage of the obstacle in the scene image, the alignment of the direction of the flight speed vector with the traversable passage of the obstacle indicates that the unmanned aerial vehicle flies towards the traversable passage of the obstacle, and the alignment operation is more accurate as the distance between the unmanned aerial vehicle and the traversable passage is closer, so that the unmanned aerial vehicle can pass through the traversable passage of the obstacle, and the operation difficulty of crossing flight is effectively reduced.

In some embodiments, the obstacle information of the flight scene in which the unmanned aerial vehicle is located includes at least one of a type, a shape, and a size of the obstacle in the flight scene, in addition to a relative position between the obstacle and the unmanned aerial vehicle.

When the obstacle information of the flight scene where the unmanned aerial vehicle is located further includes at least one of the type, the shape, and the size of the obstacle in the flight scene, a virtual model of the obstacle may be established according to at least one of the type, the shape, and the size of the obstacle, and one possible implementation manner of S203 is: the flight state information, the obstacle information and the virtual model of the obstacle are sent to a display device, the display device can display the flight state information, the obstacle information and the virtual model of the obstacle, and the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state (particularly the flight speed) of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In this embodiment, a virtual model of the obstacle may be established according to at least one of the type, shape, and size of the obstacle in the obstacle information. Specifically, if the obstacle information includes only the type of obstacle in addition to the relative position between the obstacle and the unmanned aerial vehicle, a virtual model conforming to the type of obstacle may be constructed, for example, when the type of obstacle is a bird, a virtual model of the obstacle may be constructed according to the usual shape and size of the bird. If the obstacle information includes only the shape or size of the obstacle in addition to the relative position between the obstacle and the unmanned aerial vehicle, a virtual model that conforms to the shape or size of the obstacle may be constructed, and the specific construction manner of the virtual model is not limited herein.

In this embodiment, after the virtual model of the obstacle is constructed, the flight state information, the relative position between the obstacle and the unmanned aerial vehicle, and the virtual model of the obstacle may be sent to the display device. After receiving them, the display device displays the flight state information, the relative position between the obstacle and the unmanned aerial vehicle, and the virtual model of the obstacle; the virtual model is thus shown to the user in addition to the flight state information and the relative position. From the virtual model of the obstacle, the user grasps more information about the obstacle and can more accurately control the unmanned aerial vehicle to avoid the obstacle or to pass through it.

On the basis of obtaining a scene image of the flight scene and a virtual model of the obstacle, one possible implementation manner of S203 is: sending the flight state information, the obstacle information, the virtual model of the obstacle, and the scene image to the display device, where the display device can display the scene image in a first preset display area and display the flight state information and the obstacle information on the displayed scene image; the scene image, the flight state information, and the obstacle information in the first preset display area are used for showing the user the flight scene, the flight state of the unmanned aerial vehicle in the flight scene, and the relative position between the obstacle and the unmanned aerial vehicle. The display device can display the virtual model of the obstacle in a second preset display area, and the virtual model in the second preset display area is used for showing the user the obstacle in the flight scene.

In this embodiment, a display area of the display device may be divided into two parts: one part is a first preset display area, and the other part is a second preset display area. The first preset display area is used for displaying the scene image, the flight state information, and the relative position between the obstacle and the unmanned aerial vehicle in the obstacle information, and the second preset display area is used for displaying the virtual model of the obstacle in the flight scene. For example, as shown in fig. 3a, the first preset display area and the second preset display area occupy different display areas of the display device, and by viewing the different display areas, the user can view the virtual model of the obstacle in the flight scene while viewing the scene image, the flight state information, and the relative position between the obstacle and the unmanned aerial vehicle. This prevents the content displayed on the display device from becoming cluttered and makes the displayed content clearer and easier to read.

Optionally, the area of the first preset display area is larger than the area of the second preset display area. Therefore, the user can view the scene image, the flight state information, and the relative position between the obstacle and the unmanned aerial vehicle in the larger first preset display area, and can view the virtual model of the obstacle in the smaller second preset display area, so that the display effect of the scene image is ensured and the user's first-person-perspective sense of flying the unmanned aerial vehicle is preserved.

In another possible implementation manner, the second preset display area is a partial display area in the first preset display area. For example, as shown in fig. 3b, the first preset display area may be the entire display area of the display device, and the second preset display area is, for example, a partial display area at the lower right corner, where a virtual model corresponding to an obstacle is superimposed on a partial area at the lower right corner of the scene image, where the virtual model of the obstacle may block a partial image in the scene image.

Optionally, in addition to displaying the virtual model of the obstacle in the second preset display area, a real three-dimensional image of the obstacle may also be displayed in the second preset display area, so that the user can view the real situation of the obstacle at each angle. For example, the current geographic position of the unmanned aerial vehicle may be acquired, images related to the current geographic position may be acquired from the internet, and the obstacle may be identified in the images, so as to obtain a three-dimensional image of the obstacle; for another example, cameras may be arranged at a plurality of fixed positions in the flight scene (e.g., preset positions on a building) in advance, and three-dimensional images of the obstacles may be acquired by the cameras, or three-dimensional images of the obstacles may be acquired by other unmanned aerial vehicles flying together.

In some embodiments, on the basis that the flight state information of the unmanned aerial vehicle includes the flight speed vector of the unmanned aerial vehicle, a predicted flight trajectory may further be generated according to the flight speed vector, and the predicted flight trajectory may be sent to the display device, where the display device can display the predicted flight trajectory, and the display of the predicted flight trajectory is used to assist the user in flying the unmanned aerial vehicle.

In this embodiment, it may be assumed that the unmanned aerial vehicle flies at a constant speed within a preset time period, and according to the flight speed vector of the unmanned aerial vehicle, a predicted flight trajectory of the unmanned aerial vehicle within the preset time period may be calculated. And sending the predicted flight track to a display device, and displaying the predicted flight track by the display device after receiving the predicted flight track so that a user can master the predicted flight track of the unmanned aerial vehicle and adjust the flight direction of the unmanned aerial vehicle based on the predicted flight track. The user views the predicted flight trajectory on the display device, can predict the direction, area and the like to which the unmanned aerial vehicle flies in advance, and can adjust the flight of the unmanned aerial vehicle if the user does not want the unmanned aerial vehicle to fly according to the predicted flight trajectory.
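The constant-speed prediction described above can be sketched as follows; the sampling step and function name are illustrative assumptions:

```python
def predict_trajectory(position, velocity, horizon_s, dt):
    """Predict future positions over the preset time period horizon_s,
    assuming the unmanned aerial vehicle holds its current velocity,
    sampled every dt seconds."""
    steps = int(horizon_s / dt)
    return [tuple(p + v * (k + 1) * dt for p, v in zip(position, velocity))
            for k in range(steps)]
```

The resulting list of points is what would be sent to the display device for rendering as the predicted flight trajectory.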

Further, whether the unmanned aerial vehicle is at risk of colliding with an obstacle in the flight scene can be determined according to the generated predicted flight trajectory; if so, a collision reminding message is sent to the display device and/or the control terminal of the unmanned aerial vehicle (i.e., the control terminal 140 in fig. 1) to remind the user to adjust the flight speed of the unmanned aerial vehicle, thereby preventing the unmanned aerial vehicle from colliding with the obstacle. Whether the unmanned aerial vehicle is at risk of colliding with the obstacle can be determined according to whether the predicted flight trajectory intersects the obstacle, or whether the shortest distance between the predicted flight trajectory and the obstacle is smaller than a preset distance threshold.
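The shortest-distance criterion can be sketched like this, under the simplifying assumption that the obstacle is modelled as a sphere (the patent does not fix an obstacle model):

```python
import math

def collision_risk(trajectory, obstacle_center, obstacle_radius, distance_threshold):
    """Return True if any predicted trajectory point comes closer to the
    obstacle surface than the preset distance threshold."""
    for point in trajectory:
        surface_distance = math.dist(point, obstacle_center) - obstacle_radius
        if surface_distance < distance_threshold:
            return True
    return False
```

When this returns True, the collision reminding message described above would be issued.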

Further, if the unmanned aerial vehicle has a risk of colliding with the obstacle, a flight obstacle avoidance track is generated according to the relative position between the unmanned aerial vehicle and the obstacle, the flight obstacle avoidance track is sent to the display device, and the display device can display the flight obstacle avoidance track. The display of the flight obstacle avoidance track is used for assisting a user in carrying out obstacle avoidance flight of the unmanned aerial vehicle. Therefore, under the condition that the unmanned aerial vehicle has the risk of colliding with the obstacle, the flight obstacle avoidance track is provided for the user, so that the user can control the unmanned aerial vehicle to fly according to the prompt of the flight obstacle avoidance track, and the collision between the unmanned aerial vehicle and the obstacle is avoided. The flight obstacle avoidance track can be generated according to the flight speed vector of the unmanned aerial vehicle and the relative position between the unmanned aerial vehicle and the obstacle, and the specific process is not limited.

Further, after the flight obstacle avoidance trajectory is generated, a flight controller (for example, the flight controller 161 in fig. 1) in the unmanned aerial vehicle may control the unmanned aerial vehicle to fly according to the flight obstacle avoidance trajectory, so as to implement automatic obstacle avoidance flight of the unmanned aerial vehicle.

Further, after the flight obstacle avoidance track is generated, a target stick amount can be determined according to the flight obstacle avoidance track, the current stick amount of the control terminal is obtained, a stick amount increment is determined according to the current stick amount and the target stick amount, and the stick amount increment is sent to the display device, where the display device can display the stick amount increment, and the display of the stick amount increment is used to assist the user in controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

In this embodiment, the flight speed vector of the unmanned aerial vehicle corresponding to the flight obstacle avoidance trajectory may be determined, and the target stick amount may then be determined according to a preset conversion relationship between the flight speed vector and the stick amount, where the conversion relationship is not limited herein. The current stick amount may be obtained from a received control instruction of the control terminal. The stick amount increment is obtained from the difference between the current stick amount and the target stick amount. The display device displays the stick amount increment after receiving it. Therefore, the user can perform the corresponding control operation on the control terminal according to the displayed stick amount increment so that the unmanned aerial vehicle flies according to the flight obstacle avoidance track.
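The increment computation can be sketched as below, assuming a hypothetical linear conversion velocity = gain × stick amount per axis (the patent explicitly leaves the actual conversion relationship open):

```python
def stick_increment(target_velocity, current_stick, gain=0.5):
    """Stick amount increment per control axis: the target stick amount is
    derived from the obstacle-avoidance velocity via the (assumed linear)
    conversion, and the increment is target minus current."""
    target_stick = tuple(v / gain for v in target_velocity)
    return tuple(t - c for t, c in zip(target_stick, current_stick))
```

The returned per-axis increments are what the display device would show, telling the user how much further to push each stick.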

Further, if the unmanned aerial vehicle is at risk of colliding with the obstacle, the predicted flight time for the unmanned aerial vehicle to reach the position of collision with the obstacle is obtained, and if the predicted flight time is smaller than or equal to a preset time threshold, the unmanned aerial vehicle is controlled to brake, so that when the unmanned aerial vehicle cannot be controlled to fly according to the flight obstacle avoidance track, it can be braked and stopped in time, preventing a collision with the obstacle. The position where the predicted flight trajectory of the unmanned aerial vehicle intersects the obstacle is the position where the unmanned aerial vehicle would collide with the obstacle, and the predicted flight time for the unmanned aerial vehicle to reach this position can be determined according to the flight speed vector of the unmanned aerial vehicle and the distance between the unmanned aerial vehicle and this position.
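The braking rule can be sketched as follows; treating the vehicle as a point moving at its current speed toward the predicted collision position is an illustrative simplification:

```python
import math

def time_to_collision(position, speed, collision_point):
    """Predicted flight time to the position where the predicted
    trajectory intersects the obstacle, at the current speed."""
    if speed <= 0:
        return math.inf
    return math.dist(position, collision_point) / speed

def should_brake(position, speed, collision_point, time_threshold_s):
    """Brake when the predicted flight time is at or below the
    preset time threshold."""
    return time_to_collision(position, speed, collision_point) <= time_threshold_s
```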

In some embodiments, during the flight of the unmanned aerial vehicle, if the unmanned aerial vehicle is detected to be crossing an obstacle in the flight scene, an obstacle crossing image and/or an obstacle crossing video of the unmanned aerial vehicle is shot through a shooting device on a gimbal of the unmanned aerial vehicle (i.e., gimbal 120 in fig. 1), and the obstacle crossing image and/or the obstacle crossing video is stored locally on the unmanned aerial vehicle, or sent to a server for storage, or sent to the control terminal of the unmanned aerial vehicle for storage, so as to be viewed by the user. Whether the unmanned aerial vehicle is crossing an obstacle in the flight scene can be detected through a sensing system on the unmanned aerial vehicle. For example, when the unmanned aerial vehicle passes through a bridge opening, whether the unmanned aerial vehicle is about to enter the bridge opening can be detected through a radar device, a light sensing device, and the like on the unmanned aerial vehicle, and when it is determined that the unmanned aerial vehicle is entering the bridge opening, the shooting device starts to shoot the obstacle crossing image and/or the obstacle crossing video.

In some embodiments, on the basis that the flight state information of the unmanned aerial vehicle includes the flight speed vector of the unmanned aerial vehicle, if a change in the direction of the flight speed vector is detected, the shooting direction of the shooting device on the gimbal of the unmanned aerial vehicle is controlled to turn to the direction of the flight speed vector, and the display device is controlled to display the scene image shot by the shooting device. The image center of the scene image displayed by the display device then points in the direction of the flight speed vector of the unmanned aerial vehicle, and can therefore be used to assist the user in determining the direction of the flight speed vector.

In this embodiment, the gimbal of the unmanned aerial vehicle can be rotated so that the shooting direction of the shooting device on the gimbal remains consistent with the direction of the flight speed vector of the unmanned aerial vehicle. Since the scene image shot by the shooting device is a three-dimensional image, when the shooting direction of the shooting device is consistent with the direction of the flight speed vector, the image center of the scene image shot by the shooting device points in the direction of the flight speed vector. For example, under the influence of wind, the flight speed vector of the unmanned aerial vehicle shifts to the right, and the gimbal rotates to the right so that the shooting direction of the shooting device on the gimbal is consistent with the direction of the flight speed vector, and the user can clearly perceive that the scene image displayed by the display device (or the display picture containing the scene image) moves. For another example, when crossing an obstacle, the user may adjust the flight speed vector of the unmanned aerial vehicle at the remote control terminal so as to align the image center of the scene image displayed by the display device with the traversable passage of the obstacle; as the distance between the unmanned aerial vehicle and the traversable passage decreases, the user's adjustment becomes more accurate, and finally the unmanned aerial vehicle can fly through the traversable passage.

Fig. 4 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to another embodiment of the present application. As shown in fig. 4, the method is applied to a display device of an unmanned aerial vehicle, and may include:

S401, acquiring flight state information of the unmanned aerial vehicle, wherein the flight state information of the unmanned aerial vehicle comprises a flight speed vector of the unmanned aerial vehicle.

Wherein the flight velocity vector is used to indicate the magnitude and direction of the flight velocity of the unmanned aerial vehicle.

In this embodiment, flight status information from the unmanned aerial vehicle may be received. For example, the unmanned aerial vehicle may transmit its flight status information directly to the display device, or may transmit its flight status information to the display device via a signal relay device (e.g., a repeater). The detailed content of the flight status information may refer to the related description of S201, and is not described again.

S402, obtaining obstacle information of a flight scene where the unmanned aerial vehicle is located.

In this embodiment, obstacle information of a flight scene in which the unmanned aerial vehicle is located from the unmanned aerial vehicle may be received. For example, the unmanned aerial vehicle may directly transmit the obstacle information of the flight scene where the unmanned aerial vehicle is located to the display device, or may transmit the obstacle information of the flight scene where the unmanned aerial vehicle is located to the display device via a signal forwarding device (e.g., a repeater). The details of the obstacle information may be referred to in the related description of S202, and are not repeated.

It should be noted that the execution order of the operations S401 and S402 may be interchanged.

S403, displaying the flight state information and the obstacle information, wherein the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In this embodiment, after receiving the flight state information and the obstacle information of the unmanned aerial vehicle, the display device displays them; accordingly, the user can view the flight state information and the obstacle information through the display device and, with their assistance, adjust the flight attitude of the unmanned aerial vehicle, so that obstacle avoidance flight can be performed more safely. To ensure the flight assistance effect, flight state information and obstacle information obtained at the same time can be displayed synchronously.

In this embodiment, when the display device displays the flight state information, it displays the flight speed vector of the unmanned aerial vehicle. Therefore, the user can intuitively and accurately grasp the flight speed and direction of the unmanned aerial vehicle according to the displayed flight speed vector, rather than judging the flight speed only by his or her own visual perception, and can further adjust the flight speed and direction of the unmanned aerial vehicle according to the displayed flight speed vector and the obstacle information, thereby achieving obstacle avoidance flight of the unmanned aerial vehicle.

According to the flight assistance method of the unmanned aerial vehicle, the displayed flight state information and obstacle information effectively assist the user in flying the unmanned aerial vehicle. A user without flight experience can become familiar with flying the unmanned aerial vehicle more quickly with the aid of the displayed flight state information (particularly the flight speed vector) and the obstacle information, and an experienced user can likewise improve his or her ability to fly the unmanned aerial vehicle more quickly with this assistance.

In some embodiments, the obstacle information of the flight scene where the unmanned aerial vehicle is located includes the relative position between the obstacle and the unmanned aerial vehicle in the flight scene. Optionally, the relative position between the obstacle and the unmanned aerial vehicle may be the distance between the obstacle and the origin of the body coordinate system of the unmanned aerial vehicle; alternatively, it may be the distance between the obstacle and one or more preset key positions on the unmanned aerial vehicle, where the one or more key positions may include one or more of the nose, the propellers, and the tail of the unmanned aerial vehicle; or a cuboid model corresponding to the unmanned aerial vehicle may be constructed in advance, and the relative position may include the distance between the obstacle and each edge or each face of the cuboid model; alternatively, the relative position may include the distance between the obstacle and each of the three axes of the unmanned aerial vehicle (i.e., the roll, yaw, and pitch axes).
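The key-position variant can be sketched as computing, in body coordinates, the distance from the obstacle to each preset key point; the point names and coordinates below are hypothetical:

```python
import math

def key_point_distances(obstacle_body, key_points):
    """Distance from an obstacle point (in body coordinates) to each
    preset key position on the vehicle, e.g. nose, propellers, tail."""
    return {name: math.dist(obstacle_body, p) for name, p in key_points.items()}
```

Each named distance could then be shown near the corresponding part of the vehicle on the display device.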

In some embodiments, a scene image of the flight scene where the unmanned aerial vehicle is located is acquired. For example, the scene image of the flight scene sent by the unmanned aerial vehicle may be received.

On the basis of obtaining the scene image, the flight state information, and the obstacle information, one possible implementation manner of the above S403 is: the method comprises the steps of displaying a scene image, and displaying flight state information and obstacle information on the displayed scene image, wherein the display of the flight state information and the obstacle information on the scene image is used for showing the flight state of the unmanned aerial vehicle and the relative position between an obstacle in the flight scene and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, a virtual model of an obstacle in the flight scene where the unmanned aerial vehicle is located is obtained. For example, the virtual model of the obstacle transmitted by the unmanned aerial vehicle may be received.

On the basis of obtaining the virtual model of the obstacle, the flight state information, and the obstacle information, one possible implementation manner of the above S403 is: displaying the flight state information, the obstacle information, and the virtual model of the obstacle, where this display is used for showing the user the flight state of the unmanned aerial vehicle, the obstacle, and the relative position between the obstacle and the unmanned aerial vehicle, so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

Optionally, the acquired obstacle information may further include at least one of a type, a shape, and a size of an obstacle in a flight scene in which the unmanned aerial vehicle is located. Therefore, the display device may also construct a virtual model of the obstacle according to at least one of the type, shape, and size of the obstacle.

On the basis of obtaining the scene image of the flight scene, the virtual model of the obstacle, the flight state information, and the obstacle information, one possible implementation manner of S403 is as follows: displaying the scene image in a first preset display area, and displaying the flight state information and the obstacle information on the displayed scene image, where the scene image, the flight state information, and the obstacle information in the first preset display area are used for showing the user the flight scene, the flight state of the unmanned aerial vehicle in the flight scene, the obstacle, and the relative position between the obstacle and the unmanned aerial vehicle; and displaying the virtual model of the obstacle in a second preset display area, where the display of the virtual model in the second preset display area is used for showing the user the obstacle in the flight scene. For the first preset display area and the second preset display area, reference may be made to the related content in the corresponding embodiments above, which is not described in detail.

In some embodiments, the flight speed vector represents the magnitude and direction of the flight speed of the unmanned aerial vehicle in three-dimensional space. To improve the display effect of the flight speed vector on the display device, the flight speed vector may be projected onto a two-dimensional coordinate plane to obtain a projection of the flight speed vector on the coordinate plane, and the projection is displayed. The displayed projection is used to assist the user in determining the direction of the flight speed vector; for example, when the projection points to the upper right of the coordinate plane, it means that the unmanned aerial vehicle is flying forward and to the right. A change in the length of the projection reflects a change in the magnitude of the flight speed vector, and a change in the direction of the projection reflects a change in the direction of the flight speed vector.
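A minimal sketch of the projection, under the assumption of a body frame whose X axis points forward so the YOZ plane carries the lateral and vertical components for 2D display (the axis convention is an assumption, not fixed by the patent):

```python
import math

def project_velocity(v_body):
    """Project a 3D flight speed vector onto the YOZ plane of the body
    coordinate system: drop the forward (X) component and keep the
    lateral (Y) and vertical (Z) components for on-screen display."""
    _, vy, vz = v_body
    return (vy, vz), math.hypot(vy, vz)
```

The returned pair gives the projection's direction on the coordinate plane and its length, whose changes mirror the direction and magnitude changes of the flight speed vector.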

Optionally, the coordinate plane includes a YOZ plane of a body coordinate system of the unmanned aerial vehicle or a YOZ plane of a geodetic coordinate system, so as to fit the direction judgment habit of the user.

In a possible implementation manner, during the display of the projection, the projection of the flight speed vector on the coordinate plane may be displayed through a preset vector mark, where the projection is the line connecting the center of the vector mark and the coordinate origin on the coordinate plane, and the center of the vector mark is used to assist the user in determining the direction of the flight speed vector.

In this embodiment, the vector mark is a preset pattern. When the projection changes along with the flight speed vector of the unmanned aerial vehicle, the position of the vector mark on the coordinate plane changes, reflecting both the change in the length of the projection and the change in its direction. The user can grasp the change in the flight speed vector from the change in the position of the vector mark or of its center. Therefore, the vector mark makes the change of the flight speed vector more obvious and the display of the flight speed vector more intuitive.

Optionally, when the flight speed vector is small, the relative change of the flight speed vector is large (for example, the acceleration is large) and the flight speed vector is unstable, so the vector mark would change frequently on the coordinate plane, which affects the user's viewing experience and does not help assist the user in flying the unmanned aerial vehicle. Therefore, in the case that the flight speed vector is smaller than or equal to a preset first speed threshold, the vector mark may remain unchanged at the coordinate origin.

Optionally, the size of the vector mark decreases as the flight speed vector increases, so that the user can clearly perceive the change of the flight speed of the unmanned aerial vehicle from the change in size of the displayed vector mark. Further, the size of the vector mark is a preset maximum size when the flight speed vector is smaller than or equal to a preset second speed threshold, and a preset minimum size when the flight speed vector is larger than or equal to a preset third speed threshold, so that the vector mark neither occupies too large a display area nor becomes too small for the user to observe.
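The clamped size behaviour can be sketched as below; linear interpolation between the second and third speed thresholds is an assumption, since the patent only fixes the maximum and minimum sizes at the thresholds:

```python
def vector_marker_size(speed, v_second, v_third, size_max, size_min):
    """Marker size shrinks as speed grows: clamped to the preset maximum
    size at or below the second speed threshold, to the preset minimum
    size at or above the third threshold, and interpolated in between."""
    if speed <= v_second:
        return size_max
    if speed >= v_third:
        return size_min
    t = (speed - v_second) / (v_third - v_second)
    return size_max + t * (size_min - size_max)
```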

Optionally, the vector mark is a vector ball or a vector triangle, where the vector ball is a vector mark with a spherical shape and the vector triangle is a vector mark with a triangular shape.

The vector mark may also take other shapes, such as the circle shown in figs. 5a, 5b, and 5c, in which case the vector mark may be referred to as a vector circle. In figs. 5a, 5b, and 5c, the two dotted lines respectively represent the X and Y coordinate axes of the coordinate plane, and the line connecting the center of the vector circle to the coordinate origin is the projection of the flight speed vector of the unmanned aerial vehicle. When the flight speed vector is less than or equal to the first speed threshold, as shown in fig. 5a, the vector circle is located at the coordinate origin; when the flight speed vector is greater than the first speed threshold, as shown in figs. 5b and 5c, the vector circle varies with the flight speed vector: the longer the projection, the smaller the vector circle, and the shorter the projection, the larger the vector circle. That is, the larger the flight speed vector, the smaller the vector circle, and the smaller the flight speed vector, the larger the vector circle.
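To make the projection concrete, here is a minimal sketch (an assumption for illustration, with a hypothetical function name and a body-frame axis convention of X forward, Y right, Z down) of projecting the body-frame velocity onto the YOZ plane and snapping the mark to the origin below the first speed threshold:

```python
import math

def project_velocity(v_body, first_speed_threshold):
    """Project the body-frame flight velocity (vx, vy, vz) onto the
    YOZ plane and return the 2-D endpoint where the vector mark's
    center is drawn; hold the mark at the coordinate origin while the
    speed is at or below the first threshold."""
    vx, vy, vz = v_body
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    if speed <= first_speed_threshold:
        return (0.0, 0.0)   # mark held at the coordinate origin
    return (vy, vz)         # projection: drop the X (forward) component
```

In practice the returned plane coordinates would still be scaled to screen pixels before drawing.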

In this embodiment, while flying the unmanned aerial vehicle, the user may adjust its flight speed at the control terminal so that the center of the vector mark is aligned with a passable passage through the obstacle (for example, the center of the vector circle in figs. 5a, 5b, and 5c is aligned with the bridge opening), and the alignment becomes more accurate as the unmanned aerial vehicle approaches the passage. When aligned, the direction of the flight speed vector points toward the passable passage, indicating that the unmanned aerial vehicle is flying toward it. Thus, based on the displayed flight speed vector and obstacle information, the difficulty of flying the unmanned aerial vehicle through the passage is reduced, effectively assisting the user in flying the unmanned aerial vehicle.

In some embodiments, a predicted flight trajectory of the unmanned aerial vehicle may also be obtained and displayed, wherein the display of the predicted flight trajectory of the unmanned aerial vehicle is used to assist a user in flying the unmanned aerial vehicle.

For example, the display device may display the predicted flight trajectory as one or more curves or lines; or it may select a corresponding region on the screen (such as a rectangular or arc-shaped region) by deepening the picture color, adding a dashed outline, or the like, and represent the predicted flight trajectory through the selected region; or the predicted flight trajectory may be displayed as a series of nodes, which are connected to form the trajectory. The user can then observe the predicted flight trajectory on the display device and, if the unmanned aerial vehicle should not fly along it, adjust the flight of the unmanned aerial vehicle accordingly.
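One simple way to generate such trajectory nodes (a sketch under the assumption of constant-velocity extrapolation, which the disclosure does not mandate; the function name and parameters are hypothetical) is:

```python
def predict_trajectory(position, velocity, horizon_s, n_nodes):
    """Constant-velocity extrapolation: sample n_nodes points along the
    straight line the vehicle would follow over horizon_s seconds.
    Connecting the nodes yields the displayed predicted flight trajectory."""
    px, py, pz = position
    vx, vy, vz = velocity
    dt = horizon_s / n_nodes
    return [(px + vx * dt * i, py + vy * dt * i, pz + vz * dt * i)
            for i in range(1, n_nodes + 1)]
```

A real implementation might instead integrate the current acceleration or a dynamics model, but the display logic is the same: draw the nodes and connect them.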

In some embodiments, a flight obstacle avoidance trajectory of the unmanned aerial vehicle can be further acquired and displayed, wherein the flight obstacle avoidance trajectory of the unmanned aerial vehicle is used for assisting a user in performing obstacle avoidance flight of the unmanned aerial vehicle. The display mode of the flight obstacle avoidance trajectory may refer to the display mode of the flight prediction trajectory, and is not described herein again.

In some embodiments, a stick amount increment corresponding to the flight obstacle avoidance trajectory may also be acquired and displayed. The display of the stick amount increment is used to assist the user in controlling the unmanned aerial vehicle to fly along the flight obstacle avoidance trajectory. For example, the display device may display the stick amount increment as text or as an image. In a specific implementation, the stick amount increment and the stick direction may be displayed on the image to assist the user in controlling the flight of the unmanned aerial vehicle.
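The increment itself is the per-channel difference between the stick amount the avoidance trajectory requires and the stick amount currently applied. A minimal sketch (channel names and the dict-based interface are hypothetical, for illustration only):

```python
def stick_increment(target_stick, current_stick):
    """Per-channel difference between the stick amount required by the
    obstacle avoidance trajectory and the stick amount currently applied
    at the control terminal; displaying this increment (and its sign,
    i.e. stick direction) tells the user how much further to deflect
    each stick."""
    return {ch: target_stick[ch] - current_stick.get(ch, 0.0)
            for ch in target_stick}
```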

For details of the embodiment of the present application provided in fig. 4, reference may be made to contents related to the display device in fig. 2, which are not described herein again.

It should be noted that any of the above embodiments may be implemented alone, or at least two of the above embodiments may be implemented in any combination, which is not limited to this.

The embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium stores instructions which when run on a computer cause the computer to execute part or all of the steps of the flight assistance method of the unmanned aerial vehicle in any corresponding embodiment.

The embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to execute some or all of the steps of the flight assistance method of the unmanned aerial vehicle in any corresponding embodiment above.

Fig. 6 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present application. As shown in fig. 6, the unmanned aerial vehicle 600 of this embodiment may include: a processor 601 and a memory 602. Optionally, the unmanned aerial vehicle 600 further includes a camera 603. The processor 601, the memory 602, and the camera 603 are connected, for example, via a communication bus.

The memory 602 is used for storing instructions, and the processor 601 calls the instructions stored by the memory 602 to execute the following operations: acquiring flight state information of the unmanned aerial vehicle, wherein the flight state information comprises a flight speed vector which is used for indicating the magnitude and direction of the flight speed of the unmanned aerial vehicle; acquiring obstacle information of a flight scene where the unmanned aerial vehicle is located; and sending the flight state information and the obstacle information to a display device, wherein the display device can display the flight state information and the obstacle information, and the display of the flight state information and the obstacle information is used for assisting a user in flying the unmanned aerial vehicle.

In some embodiments, the obstacle information includes the relative position between an obstacle in the flight scene and the unmanned aerial vehicle.

In some embodiments, the relative position between the obstacle in the flight scene and the unmanned aerial vehicle includes the distances between the unmanned aerial vehicle and the obstacle along three axis directions.

In some embodiments, the obstacle information further includes at least one of a type, a shape, a size of an obstacle in the flight scene.

In some embodiments, the camera 603 is configured to:

acquiring a scene image of a flight scene;

the processor 601 is specifically configured to:

the flight state information, the obstacle information and the scene image are sent to a display device, the display device can display the scene image and display the flight state information and the obstacle information on the displayed scene image, and the display of the flight state information and the obstacle information on the scene image is used for displaying the flight state of the unmanned aerial vehicle and the relative position between the obstacle in the flight scene and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 601, before sending the flight status information and the obstacle information to the display device, is further configured to:

establishing a virtual model of the barrier in the flight scene according to the barrier information;

when the processor 601 sends the flight status information and the obstacle information to the display device, the processor is specifically configured to:

the flight state information, the obstacle information and the virtual model of the obstacle are sent to a display device, the display device can display the flight state information, the obstacle information and the virtual model of the obstacle, and the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the camera 603 is configured to:

acquiring a scene image of a flight scene;

the processor 601 is specifically configured to:

the method comprises the steps that flight state information, obstacle information, virtual models of obstacles and scene images are sent to display equipment, the display equipment can display the scene images in a first preset display area and display the flight state information and the obstacle information on the displayed scene images, the flight state information and the obstacle information in the first preset display area are used for displaying a flight scene and the flight state of the unmanned aerial vehicle in the flight scene and the relative positions of the obstacles and the unmanned aerial vehicle to a user, the display equipment can display the virtual models of the obstacles in a second preset display area, and the virtual models in the second preset display area are used for displaying the obstacles in the flight scene to the user.

In some embodiments, the processor 601 is further configured to:

generating a flight prediction track according to the flight speed vector of the unmanned aerial vehicle; and sending the flight prediction track to a display device, wherein the display device can display the flight prediction track, and the display of the flight prediction track is used for assisting a user in flying the unmanned aerial vehicle.

In some embodiments, the processor 601 is further configured to:

determine, according to the predicted flight trajectory, whether the unmanned aerial vehicle is at risk of colliding with an obstacle in the flight scene, and if so, send a collision reminder message to the display device and/or the control terminal of the unmanned aerial vehicle.
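A minimal sketch of such a risk check (an illustrative assumption, not the claimed algorithm; it simply tests each predicted trajectory node against obstacle positions within a hypothetical safety radius):

```python
def collision_risk(trajectory, obstacles, safety_radius):
    """Return True if any predicted trajectory node comes within
    safety_radius of an obstacle position (both expressed in the
    same coordinate frame)."""
    for px, py, pz in trajectory:
        for ox, oy, oz in obstacles:
            d2 = (px - ox) ** 2 + (py - oy) ** 2 + (pz - oz) ** 2
            if d2 <= safety_radius ** 2:
                return True
    return False
```

A production system would typically test against obstacle extents (bounding boxes or occupancy grids) rather than point positions, but the trigger logic for the reminder message is the same.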

In some embodiments, the processor 601 is further configured to:

if the unmanned aerial vehicle has the risk of collision with the obstacle, generating a flight obstacle avoidance track according to the relative position between the unmanned aerial vehicle and the obstacle; and sending the flight obstacle avoidance track to a display device, wherein the display device can display the flight obstacle avoidance track, and the display of the flight obstacle avoidance track is used for assisting a user in carrying out obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 601 is further configured to:

and controlling the unmanned aerial vehicle to fly according to the flight obstacle avoidance track.

In some embodiments, the processor 601 is further configured to:

determine a target stick amount according to the flight obstacle avoidance trajectory; acquire the current stick amount of the control terminal; determine the stick amount increment from the current stick amount and the target stick amount; and send the stick amount increment to the display device, where the display device can display the stick amount increment, and the display of the stick amount increment is used to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 601 is further configured to:

if the unmanned aerial vehicle is at risk of colliding with an obstacle, acquire the predicted flight time for the unmanned aerial vehicle to reach the predicted collision position; and if the predicted flight time is less than or equal to a preset time threshold, control the unmanned aerial vehicle to brake.
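Estimating the predicted flight time and comparing it to the threshold can be sketched as follows (a simplified assumption: distance divided by current closing speed; the function name and threshold are hypothetical):

```python
def should_brake(distance_to_collision, speed, time_threshold):
    """Estimate the flight time to the predicted collision point and
    trigger braking when it falls at or below the preset time threshold."""
    if speed <= 0.0:
        return False  # not closing on the obstacle, no brake needed
    return distance_to_collision / speed <= time_threshold
```

The actual braking distance also depends on the vehicle's deceleration capability, so a real threshold would normally be chosen with that margin in mind.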

In some embodiments, the processor 601 is further configured to:

if it is detected that the unmanned aerial vehicle passes through an obstacle in the flight scene, send a shooting instruction to the camera 603;

the camera 603 is further configured to:

in response to the shooting instruction from the processor 601, capture an obstacle-crossing image and/or an obstacle-crossing video of the unmanned aerial vehicle.

Wherein the obstacle crossing image and/or the obstacle crossing video may be stored locally or at a server or at a control terminal of the unmanned aerial vehicle.

In some embodiments, the processor 601 is further configured to:

if a change in the direction of the flight speed vector of the unmanned aerial vehicle is detected, control the shooting direction of the camera 603 on the gimbal of the unmanned aerial vehicle to turn toward the direction of the flight speed vector; and send the scene image captured by the camera 603 to the display device, where the display device can display the scene image, the image center of the displayed scene image corresponds to the direction of the flight speed vector of the unmanned aerial vehicle, and the image center is used to assist the user in determining the direction of the flight speed vector.

Optionally, the unmanned aerial vehicle 600 further includes a communication device configured to communicate with the display device and the control terminal, for example: to send the flight state information, the obstacle information, the obstacle model, the stick amount increment, and the like to the display device, and to receive control instructions from the control terminal.

Optionally, the unmanned aerial vehicle 600 further includes a sensing device, which is used to acquire flight status information of the unmanned aerial vehicle, and the sensing device is, for example, the sensing system 162 in fig. 1.

The apparatus of this embodiment may be used to implement the technical solution of the method embodiment shown in fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.

Fig. 7 is a schematic structural diagram of a display device 700 according to an embodiment of the present application. As shown in fig. 7, the display device 700 of this embodiment includes: a processor 701 and a display 702. The processor 701 and the display 702 are connected, for example, via a communication bus.

The processor 701 is configured to acquire flight state information of the unmanned aerial vehicle, and acquire obstacle information of a flight scene where the unmanned aerial vehicle is located, where the flight state information includes a flight speed vector, and the flight speed vector is used to indicate the magnitude and direction of the flight speed of the unmanned aerial vehicle;

The display 702 is configured to display the flight state information and the obstacle information, where the display of the flight state information and the obstacle information is used to assist the user in flying the unmanned aerial vehicle.

In some embodiments, the obstacle information includes the relative position between an obstacle in the flight scene and the unmanned aerial vehicle.

In some embodiments, the relative position between the obstacle in the flight scene and the unmanned aerial vehicle includes the distances between the unmanned aerial vehicle and the obstacle along three axis directions.

In some embodiments, the obstacle information further includes at least one of a type, a shape, a size of an obstacle in the flight scene.

In some embodiments, the processor 701 is further configured to:

acquiring a scene image of a flight scene;

the display device 702 is specifically configured to:

the method comprises the steps of displaying a scene image, and displaying flight state information and obstacle information on the displayed scene image, wherein the display of the flight state information and the obstacle information on the scene image is used for showing the flight state of the unmanned aerial vehicle and the relative position between an obstacle in the flight scene and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 701 is further configured to:

acquiring a virtual model of an obstacle in a flight scene;

the display device 702 is specifically configured to:

and displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 701 is further configured to:

establishing a virtual model of the barrier in the flight scene according to the barrier information;

the display device 702 is specifically configured to:

and displaying the flight state information, the obstacle information and the virtual model of the obstacle, wherein the display of the flight state information, the obstacle information and the virtual model of the obstacle is used for displaying the flight state of the unmanned aerial vehicle, the obstacle and the relative position between the obstacle and the unmanned aerial vehicle to a user so as to assist the user in obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 701 is further configured to:

acquiring a scene image of a flight scene;

The display device 702 is specifically configured to:

displaying a scene image in a first preset display area, and displaying flight state information and obstacle information on the displayed scene image, wherein the scene image, the flight state information and the obstacle information in the first preset display area are used for displaying a flight scene, a flight state of the unmanned aerial vehicle in the flight scene, obstacles and relative positions between the obstacles and the unmanned aerial vehicle to a user;

and displaying the virtual model of the obstacle in a second preset display area, wherein the display of the virtual model in the second preset display area is used for displaying the obstacle in the flight scene to a user.

In some embodiments, the processor 701 is further configured to:

acquiring the projection of the flight speed vector on a preset coordinate plane;

the display device 702 is specifically configured to:

the projection is displayed, and the projected display is used for assisting a user to determine the direction of the flight speed vector.

In some embodiments, the coordinate plane comprises a YOZ plane of a body coordinate system of the unmanned aerial vehicle, or a YOZ plane of a geodetic coordinate system.

In some embodiments, the display device 702 is specifically configured to:

and displaying the projection of the flying speed vector on the coordinate plane through a preset vector mark, wherein the projection is a connecting line between the center of the vector mark and the origin of coordinates on the coordinate plane, and the center of the vector mark is used for assisting a user to determine the direction of the flying speed vector.

In some embodiments, the vector mark is located at the coordinate origin when the flight speed vector is less than or equal to a preset first speed threshold.

In some embodiments, the size of the vector mark decreases as the flight speed vector increases.

In some embodiments, the size of the vector mark is a preset maximum size when the flight speed vector is less than or equal to a preset second speed threshold, and a preset minimum size when the flight speed vector is greater than or equal to a preset third speed threshold.

In some embodiments, the vector mark is a vector ball or a vector triangle.

In some embodiments, the processor 701 is further configured to:

acquiring a flight prediction track of the unmanned aerial vehicle;

display device 702, further configured to:

and displaying the predicted flight track, wherein the display of the predicted flight track is used for assisting the user to fly the unmanned aerial vehicle.

In some embodiments, the processor 701 is further configured to:

acquiring a flight obstacle avoidance track of the unmanned aerial vehicle;

display device 702, further configured to:

and displaying a flight obstacle avoidance track, wherein the flight obstacle avoidance track is used for assisting a user in carrying out obstacle avoidance flight of the unmanned aerial vehicle.

In some embodiments, the processor 701 is further configured to:

acquiring a stick amount increment corresponding to the flight obstacle avoidance trajectory;

display device 702, further configured to:

and displaying the stick amount increment, where the display of the stick amount increment is used to assist the user in controlling the unmanned aerial vehicle to fly along the flight obstacle avoidance trajectory.

Optionally, the display device 700 further includes a communication device configured to communicate with the unmanned aerial vehicle, for example, to receive the flight state information, the obstacle model, the stick amount increment, and the like sent by the unmanned aerial vehicle.

Optionally, the display device 700 further includes a memory for storing a computer program, and the processor 701 may call the computer program from the memory to implement some or all of the steps related to the display device in the embodiments.

The apparatus of this embodiment may be used to implement the technical solution of the method embodiment shown in fig. 4, and the implementation principle and the technical effect are similar, which are not described herein again.

Fig. 8 is a schematic structural diagram of a chip 800 according to an embodiment of the present disclosure, and as shown in fig. 8, the chip 800 may include: a transceiver 801, a memory 802 and a processor 803, wherein the transceiver 801, the memory 802 and the processor 803 are connected, for example, by a bus.

The transceiver 801 is used for transmitting and receiving data. The memory 802 is used for storing program instructions. The processor 803 is configured to invoke the program instructions in the memory 802 and, according to the data received by the transceiver 801, execute part or all of the steps of the flight assistance method of the unmanned aerial vehicle in any corresponding embodiment above; the implementation principle and technical effect are similar and are not described again here.

Fig. 9 is a schematic structural diagram of an unmanned aerial vehicle system 900 according to an embodiment of the present application, and as shown in fig. 9, the unmanned aerial vehicle system 900 includes: unmanned aerial vehicle 901, display device 902, and control terminal 903. The unmanned aerial vehicle 901 is connected with the display device 902 and the control terminal 903 respectively, and the display device 902 may be an independent device or may be integrated on the control terminal 903. The unmanned aerial vehicle 901 may perform operations related to the unmanned aerial vehicle in the apparatus embodiment shown in fig. 6, and the display device 902 may perform operations related to the display device in the apparatus embodiment shown in fig. 7, which are similar in implementation principle and technical effect and are not described herein again.

In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website site, computer, server, or data center to another website site, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.

Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
