Self-driven unmanned mower

Document No.: 1652112    Publication date: 2019-12-24

Note: This technology, "Self-driven unmanned mower," was designed and created by Yang Liye and Chen Jionglin on 2018-06-20. Its main content is as follows: A self-driven unmanned mower (1000) comprises a mower body (1), a cutting module (2), a wheel module (3), a camera module (4) and a central processing unit (5). The cutting module (2) is mounted on the mower body (1) and used for mowing; the wheel module (3) is mounted on the mower body (1) and used for moving the mower body (1). The camera module (4) is mounted on the mower body (1) and used for capturing images of the environment around the mower body (1). The central processing unit (5) is electrically connected to the cutting module (2), the wheel module (3) and the camera module (4). Based on the images captured by the camera module (4) and a control signal sent by a handheld electronic device, the central processing unit (5) controls the cutting module (2) and the wheel module (3) to mow within an area; or, based on the images captured by the camera module (4) alone, the central processing unit (5) controls the cutting module (2) and the wheel module (3) to mow within that area.

1. An unmanned lawnmower, comprising:

a mower body;

a cutting module mounted on the mower body for mowing;

a wheel module mounted on the mower body for moving the mower body;

a camera module mounted on the mower body for capturing images of the environment around the mower body; and

a Central Processing Unit (CPU) mounted on the mower body and electrically connected to the cutting module, the wheel module and the camera module;

wherein the central processing unit controls the cutting module and the wheel module to mow within an area according to the images captured by the camera module and a control signal sent by a handheld electronic device, or controls the cutting module and the wheel module to mow within the area according to the images captured by the camera module alone.

2. The unmanned lawnmower of claim 1, wherein a boundary of the mowing area is determined by control signals sent by the handheld electronic device in cooperation with the image captured by the camera module, the unmanned lawnmower mowing within the boundary.

3. The unmanned lawnmower of claim 2, wherein the central processing unit determines a plurality of image features on the boundary from the image captured by the camera module.

5. The unmanned lawnmower of claim 3, wherein the camera module is a stereo camera, each of the image features comprising depth information.

5. The unmanned lawnmower of claim 2, wherein the central processing unit calculates a mowing trajectory within the boundary based on a contour of the boundary.

6. The unmanned lawnmower of claim 1, wherein a mowing path within the area is determined by control signals sent by the handheld electronic device in cooperation with the images captured by the camera module, the unmanned lawnmower mowing along the mowing path.

7. The unmanned lawnmower of claim 6, wherein the central processing unit determines a plurality of image features on a plurality of mowing paths from the image captured by the camera module.

8. The unmanned lawnmower of claim 7, wherein the camera module is a stereo camera, each of the image features comprising depth information.

9. The unmanned lawnmower of claim 1, further comprising:

a wireless signal positioning module electrically connected to the central processing unit for positioning the mower body by establishing a connection with at least one wireless positioning terminal; wherein the control signal sent by the handheld electronic device, the images captured by the camera module, and the wireless positioning signal sent by the at least one wireless positioning terminal jointly determine a boundary or a path, and the unmanned lawnmower mows within the boundary or along the path.

10. The unmanned lawnmower of claim 9, further comprising:

a dead reckoning module electrically connected to the central processing unit for positioning the mower body; wherein the boundary or the path is further determined by the dead reckoning module.

11. The unmanned lawnmower of claim 10, wherein the wireless signal positioning module comprises at least one of a GPS sub-module, a WiFi signal receiving sub-module, and a bluetooth signal receiving sub-module, and wherein the dead reckoning module comprises a gyroscope and/or an accelerometer.

12. The unmanned lawnmower of claim 1, further comprising:

a distance sensor module electrically connected to the central processing unit for detecting objects around the mower body; wherein, when the distance between an object and the mower body falls within a preset range, the distance sensor module sends a distance alarm signal.

13. The unmanned lawnmower of claim 1, further comprising:

a remote device communication module electrically connected to the central processing unit for establishing a connection with the handheld electronic device;

wherein the handheld electronic device sends a control signal to the remote device communication module, and the central processing unit:

controls the wheel module to move based on the control signal, and

controls the camera module to capture images while the mower moves;

wherein the central processing unit further controls the remote device communication module to transmit the images to the handheld electronic device.

14. The unmanned lawnmower of claim 1, further comprising:

a memory module, electrically connected to the central processing unit, for storing at least one registered identification image;

wherein the central processing unit determines whether an initial user image of a user captured by the camera module matches the at least one registered identification image; when the initial user image matches the at least one registered identification image, the central processing unit controls the wheel module to follow the user as the user moves, according to the images captured by the camera module, so as to determine the boundary of the mowing area, and the unmanned lawnmower mows within the boundary.

15. The unmanned lawnmower of claim 14, wherein the central processing unit determines a plurality of image features on the boundary from the image captured by the camera module.

16. The unmanned lawnmower of claim 15, wherein the camera module is a stereo camera, each of the image features comprising depth information.

17. The unmanned lawnmower of claim 14, wherein the central processing unit calculates a mowing trajectory within the boundary based on a contour of the boundary.

Technical Field

The invention relates to a mower, in particular to a self-driven unmanned mower.

Background

Generally, conventional robotic lawn mowers require a perimeter line to be laid on the lawn to define a boundary and thereby confine mowing to the area enclosed by the perimeter line. The user must set up this perimeter line before starting the mower for it to work properly. This is inconvenient for the user and leaves the lawn mower without any real artificial intelligence.

Disclosure of Invention

To overcome the above disadvantages, the present invention provides a self-propelled unmanned lawn mower.

To achieve the above object, the present application provides an unmanned lawn mower including a mower body, a cutting module, a wheel module, a camera module, and a central processing unit (CPU for short). The cutting module is installed on the mower body and used for mowing. The wheel module is installed on the mower body and used for moving the mower body. The camera module is installed on the mower body and used for collecting images of the surrounding environment of the mower body. The central processing unit is arranged in the mower body and is electrically connected with the cutting module, the wheel module and the camera module. According to the image acquired by the camera module and a control signal sent by a handheld electronic device, the central processing unit controls the cutting module and the wheel module to cut grass in an area; or the central processing unit controls the cutting module and the wheel module to cut grass in the area according to the image acquired by the camera module.

Preferably, the boundary of the mowing area is determined by the control signal sent by the handheld electronic device in cooperation with the images captured by the camera module, and the unmanned lawnmower mows within the boundary.

Preferably, the central processing unit determines a plurality of image features on the boundary from the image captured by the camera module.

Preferably, the camera module is a stereo camera, and each image feature includes depth information.

Preferably, the central processing unit calculates a mowing trajectory within the boundary based on an outline of the boundary.

Preferably, a mowing path in the area is determined by the control signal sent by the handheld electronic device in cooperation with the image acquired by the camera module, and the unmanned mowing machine mows the grass along the mowing path.

Preferably, the unmanned lawnmower further comprises a wireless signal positioning module electrically connected to the central processing unit for positioning the mower body by establishing communication with at least one wireless positioning terminal; the control signal sent by the handheld electronic device, the images captured by the camera module, and the wireless positioning signal sent by the at least one wireless positioning terminal jointly determine a boundary or a path, and the unmanned lawnmower mows within the boundary or along the path.

Preferably, the unmanned mower further comprises a dead reckoning module electrically connected to the central processing unit for positioning the mower body; wherein the boundary or the path is further determined by the dead reckoning module.

Preferably, the wireless signal positioning module at least comprises one of a GPS sub-module, a WiFi signal receiving sub-module and a bluetooth signal receiving sub-module, and the dead reckoning module comprises a gyroscope and/or an accelerometer.

Preferably, the unmanned mower further comprises a distance sensor module electrically connected to the central processing unit for detecting objects around the mower body. When the distance between the object and the mower body is within a preset range, the distance sensor module sends a distance alarm signal.

Preferably, the unmanned lawn mower further comprises a remote device communication module electrically connected to the central processing unit for establishing connection with the handheld electronic device; wherein the handheld electronic device transmits a control signal to the remote device communication module; the central processing unit controls the wheel module to move based on the control signal; when the mower moves, the camera module collects images; the central processing unit controls the remote device communication module to transmit the image to the handheld electronic device.

In summary, the unmanned lawnmower of the present invention uses the camera module to capture images of the surroundings of the mower body, and can determine the boundary of the mowing area or a mowing path through image processing of those images. The unmanned lawnmower is therefore convenient to use and exhibits a degree of artificial intelligence.

Drawings

FIG. 1 is a perspective view of an unmanned lawnmower according to an embodiment of the present invention.

Fig. 2 is a partially exploded schematic view of an unmanned lawnmower according to an embodiment of the invention.

Fig. 3 is a schematic view of a camera module and a driving mechanism in an unfolded state according to an embodiment of the present invention.

Fig. 4 is a schematic diagram of the camera module and the driving mechanism in a contracted state according to an embodiment of the present invention.

FIG. 5 is a schematic view of internal components of an unmanned lawnmower in accordance with an embodiment of the present invention.

FIG. 6 is a functional block diagram of an unmanned lawn mower according to an embodiment of the present invention.

FIG. 7 is a flow chart of a method for determining a mowing boundary of an unmanned lawnmower according to an embodiment of the invention.

FIG. 8 is a schematic view of an embodiment of an unmanned lawnmower cutting grass in a yard.

FIG. 9 is a top view of the scene shown in FIG. 8 according to an embodiment of the present invention.

FIG. 10 is a schematic view of a handheld electronic device with a user interface showing a scene captured by the unmanned lawn mower in the first position of FIG. 9.

FIG. 11 is a schematic view of a handheld electronic device with a user interface showing a scene captured by the unmanned lawn mower in the second position of FIG. 9.

FIG. 12 is a flow chart of a method for determining a mowing path of an unmanned lawnmower according to another embodiment of the invention.

FIG. 13 is a top view of the scene shown in FIG. 8 in accordance with another embodiment of the present invention.

FIG. 14 is a flow chart of a method for determining a mowing boundary of an unmanned lawnmower by following user movement according to another embodiment of the present invention.

FIG. 15 is a schematic diagram of a user ID image and a user image model according to another embodiment of the present invention.

FIG. 16 is a top view of the scene shown in FIG. 8 in accordance with another embodiment of the present invention.

FIG. 17 is a flow chart of a method for obstacle avoidance and shutdown (if the obstacle is a living creature) of a lawnmower according to another embodiment of the present invention.

FIG. 18 is a schematic view of an unmanned lawnmower avoiding obstacles according to an embodiment of the present invention.

FIG. 19 is a schematic diagram of a safety shutdown of an unmanned lawn mower in accordance with an embodiment of the present invention.

Detailed Description

Specific embodiments of the present invention will be understood by reference to the following detailed description in conjunction with the accompanying drawings. Directional terms, such as "top," "bottom," and the like, are used with reference to the orientation of the drawings. Because components of the present invention can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration only and is in no way intended to be limiting. The drawings are schematic, and the size of components may be exaggerated for clarity of illustration. Other embodiments or structural changes that do not conflict with the present disclosure should also be understood to fall within its scope. The phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. Phrases such as "having," "containing," or "including," and variations thereof, encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, "connected" and "installed" and variations thereof are used broadly and include direct and indirect connection and installation. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.

As shown in figs. 1, 5 and 6, a self-propelled unmanned lawnmower 1000 is used to mow an area, such as the yard of a family home. The unmanned lawnmower 1000 includes a mower body 1, a cutting module 2, a wheel module 3, a camera module 4, and a central processing unit 5 (CPU for short). The cutting module 2 is mounted on the mower body 1 for mowing. The wheel module 3 is mounted on the mower body 1 for moving the mower body 1. The camera module 4 is mounted on the mower body 1 for capturing images of the environment around the mower body 1. The central processing unit 5 is mounted on the mower body 1 and is electrically connected to the cutting module 2, the wheel module 3, and the camera module 4.

In this embodiment, the cutting module 2 may include a blade motor 20 and a blade assembly 21. Blade assembly 21 is used to cut grass and blade motor 20 is used to drive blade assembly 21 to cut grass. Further, the blade motor 20 is electrically connected to the central processing unit 5 and the blade assembly 21. In this way, the central processing unit 5 can control the blade assembly 21 to be turned on or off according to actual conditions.

In the present embodiment, the wheel module 3 may include a wheel control unit 30, a wheel rotating motor 31, a rotation speed sensor 32, a front wheel bracket 33, and a rear wheel bracket 34. The wheel rotating motor 31 is connected to the rear wheel bracket 34 for driving the mower body 1 forward or backward. The rotation speed sensor 32 is provided near the rear wheel bracket 34 for detecting the rotation speed of the rear wheels. The front wheel bracket 33 is mounted on the mower body 1 and changes the moving direction of the mower body 1. The wheel control unit 30 is electrically connected to the central processing unit 5, the wheel rotating motor 31, and the rotation speed sensor 32; in practice, the wheel control unit 30 may serve as the main board circuit of the unmanned lawnmower 1000. In this way, the central processing unit 5 controls the movement of the mower body 1 through the wheel control unit 30, the wheel rotating motor 31, the rotation speed sensor 32, the front wheel bracket 33, and the rear wheel bracket 34.
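
As a rough illustration of how the central processing unit might regulate wheel speed through the wheel control unit 30 using feedback from the rotation speed sensor 32, the following sketch shows a simple proportional-integral loop. It is not taken from the patent; the gains, the normalized motor command range, and the toy plant model used in the demo are all assumptions.

```python
# Minimal sketch of closed-loop wheel speed control, assuming the rotation
# speed sensor reports rear-wheel speed in m/s and the wheel control unit
# accepts a normalized motor command in [-1.0, 1.0]. Names are illustrative.

class SpeedController:
    def __init__(self, kp=0.8, ki=0.2):
        self.kp = kp          # proportional gain
        self.ki = ki          # integral gain
        self.integral = 0.0   # accumulated speed error

    def update(self, target_speed, measured_speed, dt):
        """Return a motor command that drives measured_speed toward target_speed."""
        error = target_speed - measured_speed
        self.integral += error * dt
        command = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, command))  # clamp to the motor's input range


if __name__ == "__main__":
    ctrl = SpeedController()
    speed = 0.0
    for _ in range(50):                      # 50 control ticks of 20 ms
        cmd = ctrl.update(target_speed=0.5, measured_speed=speed, dt=0.02)
        speed += 0.4 * cmd                   # crude plant model for the demo
    print(f"settled speed ~ {speed:.2f} m/s")
```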

As shown in fig. 1, 5, and 6, the unmanned lawn mower 1000 further includes a blade stop module B, a battery module C, a power distribution module D, and a lighting module E. Battery module C is used to provide electrical power to the unmanned lawn mower 1000. The power distribution module D is electrically connected to the battery module C and the central processing unit 5 for distributing the electric power provided by the battery module C to other modules of the unmanned mower 1000, such as the cutting module 2, the wheel module 3, and the camera module 4. The illumination module E is electrically connected to the central processing unit 5 and is used for providing a light source for the camera module 4 in dim light.

The blade stop module B is electrically connected to the central processing unit 5 for sensing tilting or overturning of the mower body 1. For example, when the unmanned lawnmower 1000 is operating and the cutting module 2 is active, and the mower body 1 is tilted or overturned by an external force, the blade stop module B senses the change of posture of the mower body 1 and sends a posture alarm signal to the central processing unit 5. After the central processing unit 5 receives the posture alarm signal sent by the blade stop module B, it turns off the cutting module 2 for safety.
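
A minimal sketch of the kind of tilt check the blade stop module B could perform is shown below, assuming a 3-axis accelerometer reading and a 30-degree alarm threshold; both the sensor source and the threshold are illustrative, not specified by the patent.

```python
import math

# Illustrative tilt check for the blade stop module: estimate the tilt angle
# from a 3-axis accelerometer reading (m/s^2) and raise a posture alarm when
# the mower body leans past a threshold. The threshold value is an assumption.

TILT_ALARM_DEG = 30.0

def tilt_angle_deg(ax, ay, az):
    """Angle between the measured gravity vector and the body's vertical axis."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return 0.0
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def posture_alarm(ax, ay, az):
    return tilt_angle_deg(ax, ay, az) > TILT_ALARM_DEG


if __name__ == "__main__":
    # A strongly tilted reading (about 32 degrees) triggers the alarm, after
    # which the CPU would switch the cutting module off.
    print(posture_alarm(5.0, 0.0, 8.0))
```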

As shown in fig. 1, 5 and 6, the unmanned lawn mower 1000 further comprises a remote device communication module 7, a wireless signal positioning module 8, a dead reckoning module 9 and a distance sensor module a. The remote device communication module 7 is electrically connected to the central processing unit 5 for establishing a connection with a handheld electronic device 6. In the present embodiment, the handheld electronic device 6 is exemplified by a smartphone, but the present invention is not limited thereto. For example, the handheld electronic device 6 may be a tablet computer or a wristwatch, etc. The wireless signal positioning module 8 is electrically connected to the central processing unit 5, and is connected to at least one wireless positioning terminal (not shown) to position the mower body 1. In this embodiment, the wireless signal positioning module 8 at least includes one of a GPS sub-module 80, a WiFi signal receiving sub-module 81 and a bluetooth signal receiving sub-module 82. The GPS sub-module 80 is used for receiving satellite signals, so that the wireless signal positioning module 8 can position the mower body 1 outdoors. The WiFi signal receiving sub-module 81 can establish a connection with a WiFi hotspot, for example, at least one wireless positioning terminal is a WiFi hotspot, and thus the wireless signal positioning module 8 can position the mower body 1 indoors. The bluetooth signal receiving sub-module 82 establishes a connection with an electronic device having a bluetooth access function, for example, at least one wireless positioning terminal is an electronic device having a bluetooth access function, so that the wireless signal positioning module 8 can position the lawn mower body 1 indoors.

The dead reckoning module 9 is electrically connected to the central processing unit 5 for positioning the mower body 1. In this embodiment, the dead reckoning module 9 may include a gyroscope 90 and/or an accelerometer 91. The gyroscope 90 detects the direction of the mower body 1 during movement of the mower body 1, and the accelerometer 91 detects the current speed of the mower body 1. The combination of the gyroscope 90 and the accelerometer 91 enables the mower body 1 to be positioned without satellite signals, WiFi signals or bluetooth signals.
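
The following sketch illustrates the dead reckoning idea described above: heading is obtained by integrating the gyroscope's yaw rate, and position is advanced along that heading using a speed estimate from the accelerometer or the wheel speed sensor. The flat-ground assumption, the update rate, and all names are illustrative.

```python
import math

# Minimal dead reckoning sketch: integrate gyroscope yaw rate for heading and
# a speed estimate for displacement. Names and units are illustrative.

class DeadReckoner:
    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def update(self, yaw_rate, speed, dt):
        """yaw_rate in rad/s, speed in m/s, dt in seconds."""
        self.heading += yaw_rate * dt
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt
        return self.x, self.y, self.heading


if __name__ == "__main__":
    dr = DeadReckoner()
    # Drive straight for 2 s, then turn at 0.5 rad/s while moving.
    for _ in range(100):
        dr.update(yaw_rate=0.0, speed=0.3, dt=0.02)
    for _ in range(100):
        dr.update(yaw_rate=0.5, speed=0.3, dt=0.02)
    print(f"position ~ ({dr.x:.2f}, {dr.y:.2f}) m, heading ~ {dr.heading:.2f} rad")
```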

The distance sensor module A is electrically connected to the central processing unit 5 for detecting objects around the mower body 1, such as obstacles, dogs, and infants. When the distance between an object and the mower body 1 falls within a preset range, the distance sensor module A sends a distance alarm signal, wherein the preset range depends on the type of the distance sensor module A. In an embodiment of the present invention, the distance sensor module A may be one or more of a sonar sensor module, an infrared sensor module, a light detection and ranging (lidar) module, and a radar module.

Referring to figs. 2, 3 and 4, the unmanned lawnmower 1000 further includes a driving mechanism F. The mower body 1 has a housing 10, and a groove 11 is formed in the housing 10. The driving mechanism F is mounted in the groove 11 and includes a first shaft F0, a second shaft F1, an actuating member F2, and a linkage member F3. The linkage member F3 has a first lever part F4 and a second lever part F5 connected to the first lever part F4. The second shaft F1 is disposed at the junction of the first lever part F4 and the second lever part F5 and pivotally connects the linkage member F3 to the housing 10. The end of the first lever part F4 remote from the junction is pivotally connected to the camera module 4 through the first shaft F0. The end of the second lever part F5 remote from the junction is pivotally connected to the actuating member F2, so that the actuating member F2 can push that end of the second lever part F5 in a first driving direction D1 or pull it in a second driving direction D2.

When the actuating member F2 pushes the end of the second lever part F5 in the first driving direction D1, the linkage member F3 rotates relative to the housing 10 in a first rotating direction R1 about the second shaft F1, so that the camera module 4 is adjusted from the retracted state shown in fig. 4 to the extended state shown in fig. 3. In this manner, the camera module 4 can be deployed to capture images, as shown in fig. 1. Conversely, when the actuating member F2 pulls the end of the second lever part F5 in the second driving direction D2, the linkage member F3 rotates relative to the housing 10 in a second rotating direction R2 about the second shaft F1, so that the camera module 4 is adjusted from the extended state shown in fig. 3 back to the retracted state shown in fig. 4, which houses and protects the camera module 4.

Referring to fig. 7, a method for determining a mowing boundary of the unmanned mowing machine 1000 according to the embodiment of the invention includes the following steps:

step S100: the handheld electronic device 6 generates a user instruction, controls the unmanned mower 1000 to start moving from an initial position in a mowing area, and controls the camera module 4 to acquire images of the environment around the unmanned mower 1000;

step S101: transmitting the images captured by the camera module 4 to the handheld electronic device 6 to assist the unmanned lawn mower 1000 in moving within the area;

step S102: guiding the unmanned mower 1000 to return to the starting position according to the acquired image and a control signal in the user instruction to determine a boundary;

step S103: calculating a mowing track in the boundary according to the contour of the boundary; and

step S104: the unmanned lawnmower 1000 is controlled to mow grass along a mowing trajectory within the boundary.

Please refer to figs. 6 to 11. As shown in fig. 8, a user U uses the unmanned lawnmower 1000 to mow a garden in which an area 200 is to be mowed. First, the user U generates a user instruction with the handheld electronic device 6, controls the unmanned lawnmower 1000 to move from the starting position within the mowing area 200 (the first position P1 shown in fig. 9), and controls the camera module 4 to capture images of the surroundings of the unmanned lawnmower 1000 (step S100). Meanwhile, the central processing unit 5 controls the remote device communication module 7 to transmit the images captured by the camera module 4 to the handheld electronic device 6 to assist the unmanned lawnmower 1000 in moving within the area (step S101). In other words, while the user operates the unmanned lawnmower 1000 through the handheld electronic device 6, the central processing unit 5 can simultaneously control the camera module 4 to capture images of the environment around the mower body 1 and control the remote device communication module 7 to transmit those images back to the handheld electronic device 6.

For example, when the unmanned lawnmower 1000 is at the starting position (the first position P1 shown in fig. 9), the remote device communication module 7 transmits the image captured by the camera module 4 back to the handheld electronic device 6, so that the real-time display area 61 of the user interface 60 of the handheld electronic device 6 displays the content of the image captured at the starting position (shown in fig. 10). When the unmanned lawnmower 1000 is at the second position P2 shown in fig. 9, the remote device communication module 7 transmits the image captured by the camera module 4 back to the handheld electronic device 6, so that the real-time display area 61 displays the content of the image captured at the second position P2 (shown in fig. 11).

In addition to the real-time display area 61, the user interface 60 of the handheld electronic device 6 also has a control area 62, which includes a direction button 620, an area map 621, a forward button 622, and a stop button 623. The direction button 620, the forward button 622, and the stop button 623 are used to generate user instructions, so that the user U can control the unmanned lawnmower 1000 by operating them.

Then, the central processing unit 5 guides the unmanned lawnmower 1000 back to the starting position based on the images and the control signals of the user instructions, so as to determine the boundary 100 (step S102). In other words, driven by the user instructions sent from the handheld electronic device 6, the unmanned lawnmower 1000 leaves the starting position (e.g., the first position P1 shown in fig. 9) and eventually returns to it, at which point the closed-loop boundary 100 is determined. That is, the control signals sent by the handheld electronic device 6, in cooperation with the images captured by the camera module 4, determine the mowing boundary 100 of the area 200, and the unmanned lawnmower 1000 mows within that boundary 100.
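
One plausible way to implement the boundary determination of step S102 is sketched below: the estimated poses of the mower are recorded while the user drives it, and the boundary 100 is closed when the mower comes back near the starting position. The 0.5 m closing threshold and the pose source are assumptions, not details given in the patent.

```python
import math

# Illustrative boundary-recording loop: each estimated position is appended to
# a list while the user drives the mower; when the mower comes back near the
# starting position, the list is closed into the boundary polygon.

CLOSE_DIST_M = 0.5   # assumed distance for "back at the start"
MIN_POINTS = 20      # avoid closing immediately after departure

def record_boundary(pose_stream):
    """pose_stream yields (x, y) positions; returns the closed boundary."""
    boundary = []
    for x, y in pose_stream:
        boundary.append((x, y))
        if len(boundary) >= MIN_POINTS:
            dx = x - boundary[0][0]
            dy = y - boundary[0][1]
            if math.hypot(dx, dy) < CLOSE_DIST_M:   # back at the start position
                boundary.append(boundary[0])        # close the loop
                return boundary
    return boundary                                  # loop never closed


if __name__ == "__main__":
    # Synthetic circular drive around the yard as a stand-in for teleoperation.
    poses = [(5 * math.cos(t / 10), 5 * math.sin(t / 10)) for t in range(70)]
    loop = record_boundary(poses)
    print(f"boundary recorded with {len(loop)} points, closed: {loop[0] == loop[-1]}")
```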

It is noted that, as the unmanned lawnmower 1000 travels around the area and back to the starting position, the central processing unit 5 determines a plurality of image features on the boundary 100 from the images captured by the camera module 4. For example, when the camera module 4 captures an image containing the first geographic feature GF1 shown in fig. 9, the central processing unit 5 registers the first geographic feature GF1 as one of the image features of the boundary 100. Here the first geographic feature GF1 is exemplified by a swimming pool, to which the invention is not limited. The user U, seeing such an image feature, can also control the unmanned lawnmower 1000 to detour around it. The same operations may be performed when the unmanned lawnmower 1000 recognizes the second geographic feature GF2 in fig. 9 as a house, and are not described in detail here.

In this embodiment, the camera module 4 may be a stereo camera, and each image feature includes depth information. For example, the images are processed through the binocular views produced by the stereo camera, so that each image feature includes the distance between the mower body 1 and the corresponding geographic feature. The boundary 100 may then be generated from the depth information of the surrounding environment and represented on the area map 621. Preferably, the central processing unit 5 also uses the distance information detected by the distance sensor module A when generating the area map 621. The type of the camera module 4 is not limited to that mentioned in the present embodiment; for example, the camera module 4 may be a depth camera, a monocular camera, or the like, according to actual requirements.
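
To illustrate how an image feature plus its depth value could be placed on the area map 621, the sketch below converts a feature's horizontal pixel position and depth into yard coordinates relative to the mower pose. The focal length, image width, and bearing convention are assumed calibration details, not values from the patent.

```python
import math

# Sketch: project one detected image feature (pixel column + depth reading)
# into 2-D yard coordinates for the area map. The horizontal pixel offset
# gives a bearing relative to the camera axis; the depth gives the range.

FOCAL_PX = 700.0      # assumed focal length in pixels
IMAGE_WIDTH = 1280    # assumed image width in pixels

def feature_to_map_point(pixel_x, depth_m, mower_x, mower_y, mower_heading):
    """Convert one image feature into a point in yard coordinates."""
    bearing = math.atan2(pixel_x - IMAGE_WIDTH / 2.0, FOCAL_PX)  # rad off the camera axis
    world_angle = mower_heading + bearing
    return (mower_x + depth_m * math.cos(world_angle),
            mower_y + depth_m * math.sin(world_angle))


if __name__ == "__main__":
    # A feature seen 200 px off centre at 4.2 m depth, mower at the origin.
    print(feature_to_map_point(840, 4.2, 0.0, 0.0, 0.0))
```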

After determining the boundary 100, the central processing unit 5 calculates the mowing trajectory 300 within the boundary 100 from the contour of the boundary 100 (step S103). In actual operation, the central processing unit 5 may calculate the mowing trajectory 300 using various algorithms, such as the artificial potential field method, the grid method, fuzzy control, or neural-network path planning. The central processing unit 5 then controls the unmanned lawnmower 1000 to mow along the mowing trajectory 300 within the area 200 (step S104).
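
As a concrete example of the grid method mentioned above, the sketch below rasterizes the boundary polygon into cells of one cutting width and visits the interior cells in a back-and-forth (boustrophedon) order. The cutting width and the ray-casting inside test are illustrative assumptions; a production planner would also handle obstacles and the turning radius.

```python
# Minimal grid-based coverage sketch: visit every cell inside the boundary
# polygon row by row, alternating direction on each row.

CUT_WIDTH_M = 0.3    # assumed cutting width, i.e. the grid cell size

def point_in_polygon(px, py, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def boustrophedon_trajectory(boundary):
    xs = [p[0] for p in boundary]
    ys = [p[1] for p in boundary]
    trajectory = []
    row = 0
    y = min(ys)
    while y <= max(ys):
        x_values = []
        x = min(xs)
        while x <= max(xs):
            if point_in_polygon(x, y, boundary):
                x_values.append(x)
            x += CUT_WIDTH_M
        if row % 2 == 1:
            x_values.reverse()          # alternate direction every row
        trajectory.extend((x, y) for x in x_values)
        y += CUT_WIDTH_M
        row += 1
    return trajectory


if __name__ == "__main__":
    square_yard = [(0, 0), (4, 0), (4, 3), (0, 3)]
    path = boustrophedon_trajectory(square_yard)
    print(f"{len(path)} waypoints, first {path[:2]}, last {path[-2:]}")
```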

Referring to fig. 12, a method for determining a mowing path of an unmanned mowing machine 1000 according to another embodiment of the invention comprises:

step S200: the handheld electronic device 6 generates a user instruction to control the unmanned mower 1000 to move from the starting position of the mowing area, and controls the camera module 4 to acquire images of the environment around the unmanned mower 1000;

step S201: transmitting the image acquired by the camera module 4 to the handheld electronic device 6 to assist the unmanned lawn mower 1000 in moving within the area;

step S202: determining a path from the starting position to an end position according to the images and the control signals of the user instructions sent by the handheld electronic device 6; and

step S203: the unmanned lawnmower 1000 is controlled to mow grass along the path.

The method of this embodiment differs from that of the previous embodiment mainly in that a mowing path 400 within the area 200 is determined by the control signals sent by the handheld electronic device 6 in cooperation with the images captured by the camera module 4, and the unmanned lawnmower 1000 mows along the path 400. In other words, the handheld electronic device 6 determines, from the images, a mowing path 400 from a starting position (the first position P1 shown in fig. 13) to an end position (the second position P2 shown in fig. 13). More specifically, the path 400 is generated from the control signals of the user instructions issued by the handheld electronic device 6. The information stored for each point of the path 400 includes the positioning information provided by the wireless signal positioning module 8, the distance information of the surrounding environment provided by the distance sensor module A, and the depth information provided by the camera module 4. The generated path 400 is stored in the memory module G, and the unmanned lawnmower 1000 recalls the path 400 each time it mows.
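
The per-point record described above might look like the following sketch, which bundles a positioning fix, the distance-sensor reading, and the camera depth value for each waypoint and stores the taught path for later recall. The field names and the JSON file format are assumptions; the patent only specifies what information each point contains.

```python
import json
from dataclasses import dataclass, asdict

# Sketch of one waypoint of the taught path: wireless-positioning fix,
# distance-sensor reading, and camera depth value, persisted for reuse.

@dataclass
class PathPoint:
    lat: float               # positioning information (e.g. a GPS fix)
    lon: float
    obstacle_dist_m: float   # nearest-object distance from the distance sensor
    scene_depth_m: float     # depth information from the stereo camera

def save_path(points, filename="mowing_path.json"):
    with open(filename, "w") as f:
        json.dump([asdict(p) for p in points], f)

def load_path(filename="mowing_path.json"):
    with open(filename) as f:
        return [PathPoint(**d) for d in json.load(f)]


if __name__ == "__main__":
    taught = [PathPoint(24.7941, 120.9925, 3.1, 5.8),
              PathPoint(24.7942, 120.9926, 2.7, 6.0)]
    save_path(taught)
    print(load_path()[0])
```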

Since the unmanned lawnmower 1000 is equipped with the wireless signal positioning module 8 and/or the dead reckoning module 9, the boundary 100 or the path 400 may also be determined, in addition to the control signals sent by the handheld electronic device and the images captured by the camera module, by the wireless positioning signals sent from at least one wireless positioning terminal and/or by the dead reckoning module 9, and the unmanned lawnmower 1000 mows within the boundary 100 or along the path 400.

Referring to figs. 6 and 14, the unmanned lawnmower 1000 may further include a memory module G electrically connected to the central processing unit 5. The memory module G is used for storing at least one registered identification image, but the invention is not limited thereto. For example, the memory module G can also store the information mentioned above, including one or more of the boundary 100, the images captured by the camera module 4, the positioning information obtained by the wireless signal positioning module 8, and the distance information obtained by the distance sensor module A. According to another embodiment of the invention, a method of determining the mowing boundary 100 of the unmanned lawnmower 1000 by following the movement of a user U comprises the following steps:

step S300: registering, through an image processing step, at least one identification image associated with at least one user;

step S301: acquiring an initial user image of a user;

step S302: determining whether the initial user image matches the registered identification image; if so, executing step S304; if not, executing step S303;

step S303: the unmanned mower is in an idle state;

step S304: performing image processing on the moving images of the user captured by the camera module, so that the mower follows the user as the user moves;

step S305: following the movement of the user, the unmanned mower starts to move from the initial position in the mowing area;

step S306: guiding the unmanned lawn mower back to the starting position by following the movement of the user to determine the boundary;

step S307: calculating a mowing trajectory within the boundary based on the boundary contour; and

step S308: controlling the unmanned lawnmower to mow along the mowing trajectory within the boundary.

As shown in figs. 6 and 14-16, another way for the unmanned lawnmower 1000 of the present invention to determine a boundary or path is to follow a user who moves along the boundary or path. Here the unmanned lawnmower 1000 is described, by way of example only, as following a user along a boundary. First, the user U registers his or her identification image through an image processing step (step S300): the camera module 4 captures the identification image of the user U, and the central processing unit 5 registers the identification image in the memory module G. It should be noted that the registration procedure for the identification image is not limited thereto. For example, depending on actual needs, the unmanned lawnmower 1000 may also include an image processing unit, e.g., a graphics processing unit (GPU), for registering the identification image. In the present embodiment, the identification image includes pose estimation information (i.e., an identification image model including skeletal features), clothing color information, and the like.

As shown in fig. 15, when the unmanned lawnmower 1000 is required to mow, the camera module 4 of the unmanned lawnmower 1000 first captures an initial user image 500 of the user U (step S301). The central processing unit 5 then converts the initial user image 500 into an initial image model 600, which likewise contains pose estimation information (i.e., an image model including skeletal features), clothing color information, and the like. Once the initial image model 600 is established for the user U, the central processing unit 5 determines whether the identification image matches the initial user image 500 by comparing the initial image model 600 with the information of the identification image (i.e., the pose estimation information, clothing color information, and the like).
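
The check of step S302 could, for instance, reduce both the registered identification image and the initial user image 500 to a simple feature vector (skeletal proportions plus a dominant clothing color) and compare them against a similarity threshold, as sketched below. The feature layout and the 0.9 threshold are assumptions; a real system would rely on a trained pose and appearance model.

```python
import math

# Illustrative identity check: compare two feature vectors with cosine
# similarity and accept the user when the similarity clears a threshold.

MATCH_THRESHOLD = 0.9   # assumed acceptance threshold

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def images_match(registered_features, initial_features):
    return cosine_similarity(registered_features, initial_features) >= MATCH_THRESHOLD


if __name__ == "__main__":
    registered = [0.52, 0.31, 0.17, 0.80, 0.10, 0.10]  # bone ratios + clothing colour
    captured   = [0.50, 0.33, 0.17, 0.78, 0.12, 0.10]
    print("follow user" if images_match(registered, captured) else "stay idle")
```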

When the initial user image 500 does not match the identification image, the user U fails the check, and the unmanned lawnmower 1000 remains in the idle state (step S303). When the initial user image 500 matches the identification image, the user U passes the check, and the central processing unit 5 controls the mower body 1 to follow the user U, based on image processing of the moving images of the user captured by the camera module 4, so as to determine the boundary or the path (step S304). Steps S305 to S308 are similar to those in fig. 7 and are not described again here.

Referring to fig. 17, a method for avoiding obstacles, and for shutting down when the obstacle is a living being, includes the following steps:

step S400: mowing along a mowing track within the boundary or along the path;

step S401: while mowing along the mowing trajectory within the boundary or along the path, determining whether a detected object is located within the warning range; if so, executing step S402; if not, returning to step S400;

step S402: determining whether the detected object is a living being; if so, executing step S403; if not, executing step S404;

step S403: turning off the unmanned mower; and

step S404: controlling the unmanned lawnmower to avoid the object.

It should be noted that emergency situations may occur during mowing and must be responded to. When the unmanned lawnmower 1000 mows along the mowing trajectory 300 within the boundary 100 or along the path 400, the distance sensor module A detects objects on the mowing trajectory 300 or the path 400 (step S400). In the following, the unmanned lawnmower 1000 is described as mowing along the mowing trajectory 300, with the camera module 4 being a stereo camera.

As shown in fig. 17 to 19, when the unmanned lawnmower 1000 mows along the mowing track 300 and there is an object O on the mowing track 300, the camera module 4 (i.e., the stereo camera) can respectively capture a right view image 800 and a left view image 900 related to the object O. In fact, there is a disparity between the right view image 800 and the left view image 900, which can be used to calculate the distance 700 between the object O and the unmanned lawn mower 1000. After calculating the distance 700 between the object O and the unmanned lawnmower 1000, the central processing unit 5 further determines whether the detected object O (or the distance 700) is within the warning range (step S401).
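
The disparity-based distance estimate can be written down directly: depth = focal length × baseline / disparity. The sketch below applies this formula to a feature of the object O seen in both views; the focal length (in pixels) and the stereo baseline are assumed calibration values, not numbers from the patent.

```python
# Minimal sketch of stereo ranging: for a feature of the object O seen at
# pixel column x_left in the left image and x_right in the right image,
# depth = focal_length * baseline / disparity.

FOCAL_PX = 700.0     # assumed focal length in pixels
BASELINE_M = 0.12    # assumed distance between the two camera lenses

def distance_from_disparity(x_left, x_right):
    disparity = x_left - x_right          # pixels; larger for closer objects
    if disparity <= 0:
        return float("inf")               # object effectively at infinity
    return FOCAL_PX * BASELINE_M / disparity


if __name__ == "__main__":
    # The same corner of object O appears at column 652 in the left image and
    # 610 in the right image: disparity 42 px -> about 2 m away.
    print(f"{distance_from_disparity(652, 610):.2f} m")
```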

When the detected object O (or the distance 700) is not within the warning range, the unmanned lawnmower 1000 continues to mow along the mowing trajectory 300 (step S400). When the detected object O (or the distance 700) is within the warning range, the central processing unit 5 further determines whether the detected object O is a living being (step S402); this recognition can be achieved by comparing the object O with the skeletal analysis maps stored in the memory module G. When the detected object O is not a living being, the central processing unit 5 controls the unmanned lawnmower 1000 to avoid the object O (step S404). When the detected object is a living being, for example the living beings LC1 and LC2 shown in fig. 19, which are an infant and a pet respectively, the central processing unit 5 shuts the unmanned lawnmower 1000 down for safety (step S403).
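
The decision of steps S401-S404 reduces to a small piece of logic, sketched below. The 1.5 m warning range and the is_living_being classifier are assumptions; the patent describes the classification as a comparison against the skeletal analysis maps stored in the memory module G.

```python
# Sketch of the reaction logic: an object inside the warning range either
# triggers a safety shutdown (living being) or an avoidance manoeuvre.

WARNING_RANGE_M = 1.5   # assumed warning range

def react_to_object(distance_m, is_living_being):
    if distance_m > WARNING_RANGE_M:
        return "continue mowing"           # step S400
    if is_living_being:
        return "shut down cutting module"  # step S403 (safety stop)
    return "avoid obstacle"                # step S404


if __name__ == "__main__":
    print(react_to_object(2.8, False))   # far away -> keep mowing
    print(react_to_object(1.0, False))   # close rock -> detour
    print(react_to_object(0.9, True))    # infant or pet -> stop for safety
```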

Compared with the prior art, the unmanned lawnmower of the present invention is provided with a camera module for capturing images around the mower body, and the boundary of the mowing area or the mowing path can be determined from those images through image processing. The unmanned lawnmower is therefore convenient to use and exhibits a degree of artificial intelligence.

The above description covers only preferred embodiments of the present invention and is not intended to limit its scope; all equivalent structural changes made using the contents of the description and drawings of the present invention fall within the scope of the present invention.
