Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle
Abstract: This technology, "Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle", was created by 田瑜 (Tian Yu) on 2018-07-04. The invention discloses a visual positioning system, an unmanned aerial vehicle (drone), and a method for self-detecting the position of the drone. The drone includes a body, a lens module, a matching module, and a translation module. The body has a lower side. The lens module is disposed on the lower side of the drone. The lens module has a wide field of view and captures a series of images of the area under the drone over time during flight. The matching module compares and contrasts features in each image of the series of images to derive first data. The translation module translates the first data into positioning data. The visual positioning system is used in the drone. The invention may use a visual positioning system in the drone to allow the drone to have additional/alternative positioning information.
1. A visual positioning system (709) for a drone, characterized in that it comprises:
a lens module (702), the lens module (702) for capturing a series of fisheye view images (I) over time during flight;
a matching module (703), the matching module (703) for deriving first data (D1) by comparing and contrasting at least one feature from the fisheye view images (I);
a sensor module (705), the sensor module (705) for collecting second data (D2), and the sensor module (705) electrically coupled to the lens module (702); and
a translation module (704), the translation module (704) being electrically coupled to the sensor module (705) to generate positioning data by processing first data (D1) taking into account second data (D2);
wherein the sensor module (705) includes one or more of a LiDAR sensor, an inertial measurement unit, a GPS receiver, and a radar.
2. The visual positioning system of claim 1, wherein the lens module includes a lens unit (1021) and an image sensor unit (1022), and the image sensor unit is configured to sense incident optical signals from the lens unit to generate the series of fisheye view images.
3. The visual positioning system of claim 2, wherein the lens unit comprises a fisheye lens having a field of view of at least 180 degrees.
4. The visual positioning system of claim 2, wherein the image sensor unit generates the series of fisheye view images at a series of times during flight.
5. The visual positioning system of claim 2, wherein the image sensor unit generates the series of fisheye view images during flight when the second data matches a predetermined factor.
6. The visual positioning system of claim 1, wherein the second data includes one or more of altitude data, a GPS map, GPS coordinates, and a position relative to at least one surrounding physical object.
7. The visual positioning system of claim 6, wherein the sensor module (705) includes one or more of a LiDAR sensor and a GPS receiver.
8. A drone (100), characterized in that it comprises:
a body (101), the body (101) having a lower side (1011);
a fisheye lens module (102), the fisheye lens module (102) being disposed on the underside (1011);
wherein the fisheye lens module (102) has a wide field of view and captures a series of images (I) of an area under the drone (100) over time during flight;
a matching module (103), the matching module (103) for comparing and contrasting features in each image of the series of images (I) to derive first data (D1); and
a translation module (104), the translation module (104) for translating the first data (D1) into positioning data.
9. The drone of claim 8, wherein the fisheye lens module comprises a lens unit (1021) and an image sensor unit (1022), and the image sensor unit is arranged to sense incident optical signals from the lens unit to generate the series of images (I).
10. The drone of claim 9, wherein the image sensor unit generates the series of images at a series of times during flight.
11. The drone of claim 8, further comprising a verification module that compares the positioning data to a GPS map.
12. The drone of claim 8, further comprising a gimbal to connect the fisheye lens module to the body, wherein the gimbal has at most 2 axes.
13. The drone of claim 8, wherein the series of images taken over time are taken at a speed of at least 60 images per second.
14. The drone of claim 8, further comprising a sensor module (705), the sensor module (705) to collect second data (D2), and the translation module to consider the second data to derive the positioning data;
wherein the sensor module includes one or more of a LiDAR sensor, an inertial measurement unit, a GPS receiver, and a radar.
15. The drone of claim 14, wherein the fisheye lens module generates the series of images during flight when the second data matches a predetermined factor.
16. A method of self-detecting the position of an unmanned aerial vehicle when flying in an area with insufficient GPS signals, the method comprising:
obtaining initial positioning data comprising altitude data (901);
taking a series of fisheye view images during flight at a rate of over 60 images per second (902);
comparing relative changes between the series of fisheye view images (903); and
calculating a position of the drone from the initial positioning data and the relative change (904).
17. The method of claim 16, further comprising using data from an inertial measurement unit.
18. The method of claim 16, further comprising comparing the relative change to a stored copy of a GPS map.
19. The method of claim 16, wherein taking the series of fisheye view images during flight at a rate of over 60 images per second comprises:
collecting data; and
during flight, the series of fisheye view images is generated when the data matches a predetermined factor.
20. The method of claim 16, wherein taking the series of fisheye view images during flight at a rate of over 60 images per second comprises:
during flight, the series of fisheye view images is generated at a series of times.
Technical Field
The invention relates to a visual positioning system, an unmanned aerial vehicle and a method for self-detecting the position of the unmanned aerial vehicle.
Background
Unmanned Aerial Vehicles (UAVs) are remotely piloted or autonomous aircraft carrying cameras, sensors, communication devices, or other payloads. With the rapid development of society and industry, drone aerial photography has been applied in more and more fields, such as film and television shooting, fire patrol, and traffic monitoring.
However, there is still a need for new methods to improve the ability of drones to detect their own position. Many conventional drones rely on Global Positioning System (GPS) location information to determine flight routes and to maneuver around buildings and other objects in the sky. However, GPS location information may be inaccurate due to factors such as rain, wind, signal distortion or dropout between high-rise buildings, or flying indoors where GPS reception is limited.
Disclosure of Invention
The technical problem to be solved by this disclosure is to provide a positioning system beyond GPS for a drone, in order to overcome the shortcoming that GPS positioning information may be inaccurate under many different circumstances.
The present disclosure solves the above technical problems by the following technical solutions.
In one or more embodiments according to the present disclosure, a visual positioning system for a drone is provided. The system comprises a lens module, a matching module, a sensor module, and a translation module. The lens module captures a series of fisheye view images over time during flight. The matching module derives first data by comparing and contrasting at least one feature from the fisheye view images. The sensor module collects second data, and the sensor module is electrically coupled to the lens module. The translation module is electrically coupled to the sensor module to generate positioning data by processing the first data in view of the second data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
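As an aid to reading, the dataflow described above (fisheye images I and second data D2 in, first data D1 as an intermediate, positioning data out) can be pictured as a small pipeline of cooperating modules. The Python sketch below is only an illustration of that wiring under assumed class and method names; it is not part of the claimed system.

```python
# Illustrative sketch of the dataflow only; class and method names are assumptions,
# not part of the claims. Images I -> first data D1; sensors -> second data D2;
# D1 processed in view of D2 -> positioning data.
from dataclasses import dataclass

@dataclass
class FirstData:          # D1: relative change observed between fisheye images
    pixel_shift: tuple

@dataclass
class SecondData:         # D2: e.g. altitude, GPS coordinates, IMU readings
    altitude_m: float
    gps_fix_ok: bool

@dataclass
class PositioningData:    # output of the translation module
    east_m: float
    north_m: float

class VisualPositioningSystem:
    def __init__(self, lens, matcher, sensors, translator):
        # The lens module and sensor module are coupled; the translation
        # module consumes both D1 and D2.
        self.lens, self.matcher = lens, matcher
        self.sensors, self.translator = sensors, translator

    def update(self) -> PositioningData:
        images = self.lens.capture()                 # series of fisheye view images I
        d1: FirstData = self.matcher.derive(images)  # compare and contrast features
        d2: SecondData = self.sensors.read()         # LiDAR / IMU / GPS / radar data
        return self.translator.translate(d1, d2)     # D1 processed in view of D2
```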
In one or more embodiments according to the present disclosure, the lens module includes a fisheye lens having a field of view of at least 180 degrees.
In one or more embodiments according to the present disclosure, the second data comprises one or more of altitude data, a GPS map, GPS coordinates, and a position relative to at least one surrounding physical object. In one or more embodiments according to the present disclosure, the sensor module includes one or more of a LiDAR sensor and a GPS receiver.
In one or more embodiments according to the present disclosure, an Unmanned Aerial Vehicle (UAV) is provided. The drone includes a body, a fisheye lens module, a matching module, and a translation module. The body has a lower side. The fisheye lens module is disposed on the lower side of the drone. The fisheye lens module has a wide field of view and, over time, takes a series of images of the area under the drone during flight. The matching module compares and contrasts features of each image in the series of images to derive first data. The translation module translates the first data into positioning data.
In one or more embodiments according to the present disclosure, the drone further includes a verification module that compares the positioning data to a GPS map.
In one or more embodiments of the invention, the drone further comprises a gimbal connecting the fisheye lens module to the body, the gimbal having at most 2 axes.
In one or more embodiments according to the present disclosure, the series of images taken over time are taken at a rate of at least 60 images per second.
In one or more embodiments of the invention, the drone further comprises a sensor module for collecting second data, and the translation module is for considering the second data to derive the positioning data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
In one or more embodiments of the invention, a method is provided for self-detecting the position of a drone itself when flying in areas with insufficient GPS signals. The method comprises the following steps: obtaining initial positioning data comprising altitude data; taking a series of fisheye view images during flight at a rate of over 60 images per second; comparing relative changes between the series of fisheye view images; and calculating the position of the unmanned aerial vehicle according to the initial positioning data and the relative change.
In one or more embodiments according to the present disclosure, the method further includes using data from the IMU.
In one or more embodiments according to the present disclosure, the method further comprises comparing the relative change to a GPS map.
According to the present disclosure, a visual positioning system may be used in a drone to allow the drone to have additional/alternative positioning information. Compared with a conventional drone that uses only GPS positioning information, the drone provided by the present disclosure adopts a visual positioning system as a supplement to its existing positioning system, so that more accurate positioning information can be obtained.
Drawings
The various aspects of the invention are best understood from the following detailed description when read with the accompanying drawing figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
Fig. 1 is a side view of a UAV according to some embodiments of the present disclosure.
Fig. 2 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 3 is a side view of a lens module according to some embodiments of the present disclosure.
Fig. 4 illustrates a diffuse two-dimensional map of various points shown in a lens module according to some embodiments of the present disclosure.
Fig. 5 illustrates four images taken by a lens module representing four views of the ground over time, according to some embodiments of the present disclosure.
Fig. 6 is a perspective view of a lens module on a 2-axis gimbal according to some embodiments of the present disclosure.
Fig. 7 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 8 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 9 is a method of self-detecting the location of a drone itself, in accordance with some embodiments of the present disclosure.
Detailed Description
Various embodiments may now be better understood by turning to the following description. These embodiments are shown in the illustrated examples.
Many variations and modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the embodiments. Accordingly, it must be understood that the illustrated embodiments have been set forth only for the purposes of example, and that they should not be taken as limiting.
The words used in this specification to describe embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification structure, material or acts beyond the scope of the commonly defined meanings.
The definitions of the words or elements of the following claims, therefore, include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements disclosed or that a single element may be substituted for two or more elements. Although elements may be described herein as acting in certain combinations, it is to be expressly understood that one or more elements from a disclosed combination can in some cases be excised from the combination, and that the combination may be directed to a subcombination or variation of a subcombination.
Further, terms such as "below," "lower," "above," "upper," "left," and "right" may be used herein to facilitate describing one element or feature's relationship to another element or feature, as illustrated in the figures. Spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present.
The present disclosure describes a new way to improve the completeness and effectiveness of a drone's position detection by using a positioning system. The positioning system may be a visual positioning system. According to some embodiments, the visual positioning system may generate a visual position, e.g., a position of the drone relative to a reference point or block of points. The visual position may be different from a Global Positioning System (GPS) defined position. According to some embodiments of the present disclosure, there is also provided a new method of self-detecting the relative position of a drone with respect to a reference point or block of points when flying through an area where GPS signals are insufficient.
In one or more embodiments in accordance with the present disclosure, a visual positioning system may be used in an Unmanned Aerial Vehicle (UAV) to obtain additional/alternative positioning information. When GPS signals are poor, additional/alternative positioning information may be used to locate the UAV. In one or more embodiments according to the present disclosure, a visual positioning system may take images (photos and/or videos) of the ground during flight and process the images (photos and/or videos) in real-time to obtain positioning information. According to some embodiments, the visual positioning system may select at least one reference object in the image, and analyze the image to calculate positioning information for the UAV based on the reference object.
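As a concrete illustration of the matching step described above, the following Python sketch (an assumption for illustration only, relying on the OpenCV and NumPy libraries; it is not the patented implementation) matches features between two consecutive downward-facing frames and reports their average pixel displacement, the kind of relative-motion measurement a matching module could derive as first data.

```python
# Minimal sketch: estimate the average pixel shift of ground features between
# two consecutive downward-facing frames using ORB feature matching.
# Assumes OpenCV (cv2) and NumPy; the frames themselves are hypothetical inputs.
import cv2
import numpy as np

def average_pixel_shift(prev_frame, curr_frame, max_matches=100):
    """Return the mean (dx, dy) of matched features in pixels, or None if matching fails."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(curr_frame, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:max_matches]
    if not matches:
        return None
    shifts = np.array([np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches])
    return shifts.mean(axis=0)   # average feature motion (dx, dy) between the two frames

# Hypothetical usage with two grayscale frames loaded from disk:
# prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
# curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)
# print(average_pixel_shift(prev, curr))
```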
In one or more embodiments of the present disclosure, the visual positioning system may be used as a supplement to existing positioning systems, so that the drone may acquire more accurate positioning information than a conventional drone that uses only GPS positioning information. Furthermore, a drone having a visual positioning system according to one or more embodiments of the present disclosure can obtain accurate positioning information even when flying in places where GPS signals are limited or poorly received, such as between high-rise buildings and structures.
Referring to fig. 1 and 2, fig. 1 is a side view of an Unmanned Aerial Vehicle (UAV) 100 according to some embodiments of the present disclosure. Fig. 2 is a block diagram of UAV100 in accordance with some embodiments of the present disclosure. In one or more embodiments, UAV100, or drone, is a multi-rotor aircraft. UAV100 may be a triple-rotor, quad-rotor, six-rotor, eight-rotor, or other multi-rotor aircraft. In one or more embodiments, UAV100 includes a body 101, a fisheye lens module 102, a matching module 103, and a translation module 104.
In one or more embodiments, the
In one or more embodiments,
In one or more embodiments, the
Referring to fig. 1 and 3, fig. 3 is a side view of a
Referring again to fig. 3, optionally, the
On the other hand, referring again to fig. 1, UAV100 may tilt during flight, causing
Referring again to fig. 2, in one or more embodiments, the matching module 103 is electrically coupled with the
Referring again to FIG. 2, in one or more embodiments, translation module 104 is electrically coupled with matching module 103. In one or more embodiments, translation module 104 may be incorporated into a processor, such as a central processing unit (CPU), a Digital Signal Processor (DSP), a programmable controller, an Application-Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), or other similar device, or a combination of such devices. In one or more embodiments, translation module 104 may be incorporated in the same processor as matching module 103. Alternatively, translation module 104 may be incorporated in a different processor than matching module 103.
In one or more embodiments, the translation module 104 translates the first data D1 into positioning data. In one or more embodiments, translation module 104 may receive first data D1 from matching module 103, and translation module 104 may calculate a relative change (e.g., direction/altitude/speed) of UAV100 based on first data D1 to generate positioning data. In one or more embodiments, the calculation may take into account many other known factors, such as time of flight, the initial position of the UAV, or other useful factors.
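One way to picture this translation step, under an assumed pinhole approximation of the downward camera and with altitude and frame interval taken as known inputs, is the short sketch below. It is an illustration only: the simple scaling (ground shift ≈ altitude × pixel shift / focal length in pixels) ignores fisheye distortion, attitude changes, and the sign convention between feature motion and vehicle motion.

```python
# Sketch: translate a pixel displacement (first data D1) into ground displacement,
# heading, and speed, using altitude and a known frame interval. All names and the
# pinhole scaling are illustrative assumptions, not the patented algorithm.
import math

def translate(pixel_shift_xy, altitude_m, focal_length_px, frame_interval_s):
    dx_px, dy_px = pixel_shift_xy
    # Pinhole approximation: motion on the ground plane scales with altitude.
    east_m = altitude_m * dx_px / focal_length_px
    north_m = altitude_m * dy_px / focal_length_px
    distance_m = math.hypot(east_m, north_m)
    heading_deg = math.degrees(math.atan2(east_m, north_m)) % 360.0
    speed_mps = distance_m / frame_interval_s
    return {"east_m": east_m, "north_m": north_m,
            "heading_deg": heading_deg, "speed_mps": speed_mps}

# Example: a (12.0, -3.0) pixel shift at 30 m altitude, an 800 px focal length,
# and a 1/60 s frame interval (all hypothetical values).
# print(translate((12.0, -3.0), 30.0, 800.0, 1.0 / 60.0))
```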
Referring to fig. 2 and 5, fig. 5 illustrates four images 501-504 taken by the
In one or more embodiments, translation module 104 may receive first data D1 and calculate a relative change (e.g., direction/altitude/speed) of UAV100 based on first data D1 to generate relative positioning data for
In one or more embodiments according to the present disclosure, UAV100 with a visual positioning system may acquire additional/alternative positioning information in addition to traditional positioning information, e.g., GPS signals. In one or more embodiments in accordance with the present disclosure,
In one or more embodiments of the invention, UAV100, using a visual positioning system as a supplement to existing positioning systems, may have more accurate positioning information than a traditional drone using only GPS positioning information. Furthermore, UAV100 in accordance with one or more embodiments of the present disclosure, having a visual positioning system, can obtain accurate positioning information even when UAV100 is flying around locations where reception of GPS signals is limited or poor, such as when flying between high-rise buildings and structures.
Referring to fig. 6, fig. 6 is a perspective view of the
Referring to fig. 7, fig. 7 is a block diagram of a UAV700 according to some embodiments of the present disclosure. In one or more embodiments, UAV700 includes a
In one or more embodiments, the
In contrast to UAV100 in fig. 2,
In one or more embodiments, the
In one or more embodiments, the lens module 702 (or the image sensor unit 1022) may begin taking images (photos and/or videos) I when the second data D2 received by the UAV700 matches a predetermined factor, for example, when the signal strength of the second data D2 (e.g., GPS data) falls below a predetermined level.
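Such a trigger can be expressed as a small guard condition. The sketch below is a hedged illustration; the threshold values, the four-satellite minimum, and the function names are assumptions, not values taken from the disclosure.

```python
# Sketch: start visual capture only when the second data (here, GPS quality)
# matches a predetermined factor. Thresholds and names are illustrative assumptions.
def should_start_visual_capture(gps_signal_strength_db, satellites_in_view,
                                min_strength_db=-140.0, min_satellites=4):
    weak_signal = gps_signal_strength_db < min_strength_db
    too_few_satellites = satellites_in_view < min_satellites
    return weak_signal or too_few_satellites

# Hypothetical usage: weak signal between high-rise buildings triggers fisheye capture.
# if should_start_visual_capture(-150.0, 3):
#     lens_module.start_capture(fps=60)   # lens_module and start_capture are assumed names
```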
In one or more embodiments, the
It is noted that the functions of the
In one or more embodiments of the invention, UAV700, using a visual positioning system as a supplement to an existing positioning system, may have more accurate positioning information than a traditional drone that uses only GPS positioning information. Furthermore, UAV700 in accordance with one or more embodiments of the present invention, having a visual positioning system, may obtain accurate positioning information even when UAV700 is flying in a location where GPS signals are limited or poorly received, such as between high-rise buildings and structures.
Referring to fig. 8, fig. 8 is a block diagram of a
In one or more embodiments, matching
In contrast to UAV100 or 700,
In one or more embodiments, the
It is noted that the functionality of the
Referring to fig. 9, fig. 9 is a
In
In
In
In one or more embodiments, the operation of
In one or more embodiments according to the present disclosure, a UAV/drone with a visual positioning system may acquire additional/alternative positioning information in addition to traditional positioning information. In one or more embodiments according to the present disclosure, the UAV/drone may take images (photos and/or videos) of the ground during flight, and the UAV/drone may process the images (photos and/or videos) in real-time to obtain positioning information of the UAV/drone.
In one or more embodiments of the invention, a drone that uses a visual positioning system as a supplement on top of an existing positioning system may have more accurate positioning information than a traditional drone that uses only GPS positioning information. Furthermore, drones according to one or more embodiments of the present disclosure with a visual positioning system can obtain accurate positioning information even when the UAV/drone is flying around places where GPS reception is limited or poor, such as when flying between high-rise buildings and structures.
According to some embodiments, a visual positioning system for a drone is provided. The visual positioning system comprises a lens module, a matching module, a sensor module, and a translation module. The lens module captures a series of fisheye view images over time during flight. The matching module derives first data by comparing and contrasting at least one feature from the fisheye view images. The sensor module collects second data, and the sensor module is electrically coupled to the lens module. The translation module is electrically coupled to the sensor module to generate positioning data by processing the first data in view of the second data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
According to some embodiments, an Unmanned Aerial Vehicle (UAV) is provided. The UAV comprises a body, a fisheye lens module, a matching module, and a translation module. The body has a lower side. The fisheye lens module is disposed on the lower side. The fisheye lens module has a wide field of view and, over time, takes a series of images of the area under the drone during flight. The matching module compares and contrasts features in each of the series of images to derive first data. The translation module translates the first data into positioning data.
According to some embodiments, a method of self-detecting the position of a drone when flying through an area with insufficient GPS signals is provided. The method comprises the following steps: (1) obtaining initial positioning data comprising altitude data; (2) taking a series of fisheye view images during flight at a rate of over 60 images per second; (3) comparing relative changes between the series of fisheye view images; and (4) calculating the position of the drone from the initial positioning data and the relative changes.
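Read end to end, the four steps form a simple loop: seed a position estimate with the initial positioning data, capture downward fisheye frames, measure the relative change between consecutive frames, and integrate those changes into an updated position. The sketch below strings together the illustrative helpers average_pixel_shift() and translate() from earlier in this description; it is one assumed way of wiring the steps, not the claimed implementation.

```python
# Sketch of the method's four steps, reusing the illustrative helpers
# average_pixel_shift() and translate() defined above. All names are assumptions.
def self_detect_position(frames, initial_position, altitude_m,
                         focal_length_px, fps=60):
    """frames: iterable of downward fisheye images; returns (east, north) in metres."""
    east, north = initial_position                   # step (1): initial positioning data
    prev = None
    for frame in frames:                             # step (2): images at `fps` per second
        if prev is not None:
            shift = average_pixel_shift(prev, frame) # step (3): relative change between frames
            if shift is not None:
                delta = translate(shift, altitude_m, focal_length_px, 1.0 / fps)
                east += delta["east_m"]              # step (4): integrate into a position
                north += delta["north_m"]
        prev = frame
    return east, north
```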
Thus, specific embodiments and applications of the visual positioning system are disclosed. It will be apparent, however, to one skilled in the art that many more modifications besides those already described are possible without departing from the concepts herein disclosed. Insubstantial changes from the disclosure as viewed by a person with ordinary skill in the art are expressly contemplated as being equivalent within the scope of the disclosure. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the present disclosure. The inventive subject matter disclosed is therefore to be understood as embracing each and every matter specifically illustrated and described above, conceptually equivalent, as well as what can be obviously substituted and also essentially combined with the basic idea of the embodiment.
The foregoing has outlined features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions and alterations herein without departing from the spirit and scope of the present disclosure.