Mobile analysis processing device

Document No.: 751226  Publication date: 2021-04-02

Reading note: This technology, "Mobile analysis processing device" (移动式分析处理装置), was designed and created by S. Hussmann, F. J. Knoll and V. Czymmek on 2019-08-22. Its main content is as follows: The invention relates to an agricultural mobile analysis processing device (14) for soil treatment and/or animal and plant population analysis operations. The device (14) comprises at least one sensor (62), a tool unit (52) having at least one motorized tool (58), an actuator (48) for actuating at least the tool (58) of the tool unit (52), a motor (50) for driving the tool unit (52) and/or the actuator (48), a database (78), a first communication unit (60) having an interface (70a), and a first computer (46) for controlling the sensor (62), the tool unit (52) and the actuator (48) in accordance with generated control commands; for generating control signals for the actuator (48), the tool unit (52) and/or the motor (50), data acquired by the sensor (62) are continuously compared with data stored in the database (78). The device is mobile and flexible: the device (14) accordingly constitutes a unit by means of which all data can be processed in real time, control signals for the actuator (48), the tool unit (52) and/or the motor (50) being generated and acted on immediately. This opens up the possibility of different combinations, for example with various transport carriers (12), and of moving the device (14) in the field as required.

1. Agricultural mobile analysis processing device (14) for soil treatment and/or animal and plant population analysis operations, characterized by at least one sensor (62), a tool unit (52) with at least one motorized tool (58), an actuator (48) for actuating at least the tool (58) of the tool unit (52), a motor (50) for driving the tool unit (52) and/or the actuator (48), a database (78), a first communication unit (60) with an interface (70a), and a first computer (46) which controls the sensor (62), the tool unit (52) and the actuator (48) in accordance with generated control instructions, wherein, for generating the control signals for the actuator (48), the tool unit (52) and/or the motor (50), the data acquired by the sensor (62) are continuously compared with data stored in the database (78).

2. The analytical processing device according to claim 1, characterised in that the data acquired by the sensor (62) are compared in real time with the data of the database (78), in particular the data acquired by the sensor (62) are validated and classified.

3. The analytical processing device according to claim 1 or 2, wherein the sensor is a visual detection unit (62) with a camera (64).

4. The analytical processing device according to one of the preceding claims, wherein a connecting device (82a, 82b) is provided for connecting, if necessary, to a transport carrier (12) for moving the analytical processing device (14).

5. The analytical processing device according to one of the preceding claims, characterised in that it is designed in two parts: in a first unit (14a) there are arranged the sensor (62), the tool unit (52), the motor (50) for driving the tool (58), the tool unit (52) and/or the actuator (48), the actuator (48), a first communication unit (60) with an interface (70a) and the first computer (46); in a second unit (14b) there are arranged a second communication unit (74) with an interface (36b, 70b), a second computer (76) and the database (78); the first unit (14a) and the second unit (14b) can be connected to each other via the interfaces (70a, 70b) for data exchange.

6. The analytical processing device according to claim 5, wherein the first unit (14a) is provided with a first housing (40) and the second unit (14b) is provided with a second housing (42).

7. The analytical processing device according to claim 6, wherein the first and second housings (40, 42) are detachably connected to each other by a plug connection (44, 80).

8. The analytical processing device according to claim 6 or 7, characterised in that the first and second housings (40, 42), as connecting means, are connected to a receiving device (14c) of the transport carrier (12), the device (14) being gripped and moved by the transport carrier (12) when corresponding gripping means (82a, 82b) are provided on the transport carrier (12).

9. The analytical processing device according to one of claims 6 to 8, characterised in that the first and second housings (40, 42), as connecting means, are connected if necessary to connecting means (26b, 28, 36b, 38) of the transport carrier (12), the device (14) being connectable to the transport carrier (12) and movable when the transport carrier (12) is provided with the respective connecting means (26b, 28, 36b, 38).

10. The analytical processing device according to one of the preceding claims, wherein the tool unit (52) comprises at least one feed unit (54) and one rotation unit (56) interacting with the motor (50).

11. The analytical processing device according to claim 10, characterised in that the distal end of the rotation unit (56) is provided with at least one tool (58), in particular a rotary plow (58) or a knife unit.

12. Analytical processing device according to one of the preceding claims, characterised in that a power supply interface (26b) is provided for external supply of power.

13. The analysis processing apparatus according to claim 12 and one of claims 4 to 11, characterized in that the power supply interface (26b) is provided in the first unit (14a), the first unit (14a) and the second unit (14b) being supplied with power via the power supply interface (26b).

14. The analytical processing device according to one of claims 4 to 13, characterised in that a further communication interface (36b) is provided for the transport carrier (12).

15. The analysis processing device according to claim 14, characterized in that the further communication interface (36b) is provided in the second unit (14b).

16. Method for the real-time control of soil treatment and/or animal and plant population analysis operations using the device defined in one of the preceding claims, comprising the following steps:

a. receiving, by the sensor, temporally successive data-technically defined voxels and/or pixels and/or images;

b. transmitting the received data to the database (78);

c. storing the received data in the database (78);

d. qualitatively comparing said received data with data stored in said database (78), preferably by segmentation, data reduction and/or verification of said received data by the computer (46, 76);

e. evaluating the matching data against existing defined data records by means of the classifier (68) and the computer (46, 76) in conjunction with the database (78);

f. processing and converting the evaluation, by means of the computer (46, 76), into technical data for adjusting and/or controlling the motor, the actuator, the tool unit and/or the assigned transport carrier.

17. Method according to claim 16, characterized in that, after the technical data for adjustment and/or control have been obtained, the motor (50), the actuator (48), the tool unit (52) and/or the assigned transport carrier (12) are activated for the soil treatment and/or animal and plant population analysis operation.

18. Method according to claim 16 or 17, characterized in that the evaluation is carried out in one of the computers (46, 76) interacting with the classifier (68), in particular in the second computer (76), while the processing and conversion of the evaluation into technical data for adjustment and/or control are carried out in the other computer (46, 76), in particular in the first computer (46), for which purpose the evaluation is transferred from one of the computers (46, 76) to the other computer (46, 76).

19. Method according to claim 16 or 17, characterized in that the storing, the qualitative comparison of the received data with data stored in the database (78) and/or the evaluation by the classifier (68) are supported by artificial intelligence.

Technical Field

The present invention relates to an agricultural mobile analysis processing apparatus for soil treatment and/or animal and plant population analysis operation according to claim 1, and a real-time control method for soil treatment and/or animal and plant population analysis operation by the apparatus according to claim 15.

Background

Weed control in agriculture is a very labour-intensive task, especially in organic farming where the use of chemicals is prohibited or limited. Depending on the crop being planted, it may be necessary to control weeds at locations close to the commercial crop. This control is generally carried out at the beginning of growth, when both the commercial crop and the weeds are still small and grow in close proximity to each other. In order to avoid damage to the commercial crop, purposeful selective methods should be employed. In organic farming, for example in carrot growing, this is done by a labour-intensive, physically taxing manual task called "weed fly": seasonal workers lie on a platform to remove the weeds.

For special crops with a larger plant spacing, such as beet or asparagus lettuce, existing tractor-mounted implements are suitable for identifying individual commercial crop plants and controlling the corresponding tools so that the area in which the commercial crop is located is not treated. Selectivity is not necessary in this task; that is, the system does not detect the working area but instead controls the tool "blindly" on the basis of the known position of the commercial crop. In general, the distance to the commercial crop determines the accuracy requirement.

DE 4039797 A1 discloses a weed removal device in which the actuator for removing weeds operates continuously and is only temporarily interrupted when a crop plant is detected by the sensor. The transport carrier in this solution is designed as a vehicle.

The weed control device disclosed in DE 102015209879 A1 comprises a treatment tool for removing weeds. Furthermore, a classification unit is provided which either contains the position data of the weeds or identifies the weeds and determines their position data. The relative position between the treatment tool and the weeds is determined by a positioning unit. An operating unit designed as a vehicle positions the treatment tool accordingly on the basis of the determined relative position.

DE 102015209891 A1 discloses a corresponding device with a pressure delivery unit and a liquid release unit. In this embodiment, the weeds are removed by a pressurized spray. The transport carrier in this solution is designed as a vehicle.

DE 102015209888 A1 discloses a method of removing weeds by applying a liquid in pulses to the weeds. The transport carrier in this solution is also designed as a vehicle.

DE 102013222776 A1 discloses a stamp for installation in a vehicle, the stamp being arranged in a guide which guides it. In this case, the stamp is placed over the weeds and pressure is applied. The weeds are removed by the impact of the pressure applied via the stamp.

At present, agricultural robots and harvesters that provide automated support for agriculture and are equipped with telematics systems are opening up new approaches. In many cases, technical principles and research results from aerospace, remote sensing and robotics are available for solving agricultural problems, but they have to be adapted to the specific agricultural task, and new devices and methods are needed.

For example, existing autonomous agricultural robots are designed to remove plants row by row, one plant at a time. The work is carried out within the plant population and can only be performed serially. Verification is usually carried out afterwards by an inspection, for example by a qualified person.

Another disadvantage of the known devices is that the transport carriers are designed as special vehicles, which work through the crop row by row and are relatively inflexible.

Disclosure of Invention

The object of the present invention is to provide an agricultural mobile analysis processing device for soil treatment and/or animal and plant population analysis operations, and a method for using the device, which enables real-time controlled, qualified removal of detected plants and/or animals, and simultaneous analysis of plants and animals. In particular, the device can be connected to different transport carriers, so that the device can be moved into and over the working area.

The object is achieved by the features of the mobile analysis processing device according to claim 1, and the method for controlling the device is achieved by the features of claim 15.

The dependent claims form further developments of the invention.

The invention is based on the finding that the flexibility of use and the resulting possibilities are significantly increased by producing a mobile device which contains all the units for analysis and processing functions and which is independent of the transport vehicle.

The present invention therefore relates to an agricultural mobile analysis processing apparatus for soil treatment and/or animal and plant population analysis operations. The device comprises at least one sensor, a tool unit with at least one motorized tool, an actuator which can actuate at least a tool of the tool unit, a motor for driving the tool unit and/or the actuator, a database, a first communication unit with an interface, and a first computer which controls the sensor, the tool unit and/or the actuator according to generated control commands. In order to generate the corresponding control signals for the actuator, the tool unit and/or the motor, the data acquired by the sensor are continuously compared with data stored in the database. The device is mobile and flexible and accordingly constitutes a unit by means of which all data can be processed in real time; control signals for the actuator, the tool unit and/or the motor are generated and acted on immediately. This opens up the possibility of different combinations, for example with various transport carriers, and of moving the device in the field as required.

The data acquired by the sensor is preferably compared in real time with the data of the database, in particular the data acquired by the sensor is validated and classified. This can improve the reaction capacity of the apparatus.

According to one embodiment of the invention, the sensor is a visual detection unit with a camera. Therefore, the data to be processed is image data, and can be compared with the database at any time.

In order to enable the device to be connected, when required, to a transport carrier for moving the device, corresponding connecting means are provided.

In order to be able to easily replace individual components and thus reduce set-up times, the device is made in two parts. In the first unit, the sensor, the tool unit, the motor for driving the tool of the tool unit and/or the actuator, the actuator, the first computer and a first communication unit with an interface are provided. In the second unit, the database, a second computer and a second communication unit with an interface are provided. The first unit and the second unit can be connected to each other via the interfaces for data exchange. Furthermore, the two-part design also makes it possible to arrange the two units spatially separated from one another. This is advantageous, for example, when the moving parts of the device should be as light as possible. In this case, the second unit may be installed centrally in a fixed location, while the first unit is moved over the field.

It is advantageous here if the first unit comprises a first housing and the second unit comprises a second housing, so that the components arranged in the units are protected from external influences.

The first housing and the second housing may be detachably connected to each other by a plug connection. In this way, two units can be connected in a modular manner, or only one unit can be replaced when the unit fails.

According to one embodiment of the invention, the first and second housings are connected, if necessary, as connecting devices to the receiving devices of the transport carriers, and the devices are gripped and moved by the transport carriers by providing corresponding gripping devices on the transport carriers. Alternatively or additionally, the first and second housings can also be connected, if necessary as connecting devices, to the connecting means of the transport carrier, with the devices being connected to the transport carrier and being moved by providing corresponding connecting means on the transport carrier. The simplest and most rapid connection to the transport carrier is achieved here in order to complete the transport of the device.

Preferably, the tool unit comprises at least one feed unit and one rotation unit interacting with the motor. This enlarges the working range of the tool in a simple manner without the device itself having to be moved.

Preferably, the distal end of the rotation unit is provided with at least one tool, in particular a rotary plow or a knife unit. When rotating, the tool can, for example, selectively remove pests and weeds.

In order to further reduce the weight of the device, a power supply interface for an external power supply is also provided. The power supply interface may be arranged on the first unit. In the assembled state of the first unit and the second unit, power can be supplied to both the first unit and the second unit via this power supply interface. The power source of the transport carrier is preferably used for this purpose.

For the data exchange between the transport carrier and the device, a further communication interface is provided for the transport carrier on the device.

The further communication interface may be provided in the first unit or in the second unit. Preferably in the second unit.

The above task is also achieved by a method for real-time control of soil treatment and/or animal and plant population analysis operations by a device of the type described above, comprising the following steps:

-receiving the data-technically defined voxels and/or pixels and/or images by the sensor continuously in time;

-transmitting the received data to a database;

-storing the received data in a database;

-qualitatively comparing the received data with data stored in the database, preferably by segmentation, data reduction and/or verification of the received data by the computer;

-evaluating the matching data together with existing defined data records in the database by means of a classifier and a computer;

-processing and converting, by means of the computer, the evaluation into technical data for adjusting and/or controlling the motor, the actuator, the tool unit and/or the assigned transport carrier.

Preferably, after the technical data for adjustment and/or control have been obtained, the motor, the actuator, the tool unit and/or the assigned transport carrier are activated for soil treatment and/or animal and plant population analysis.

According to a preferred method of the invention, the evaluation is carried out in one of the computers interacting with the classifier, in particular in the second computer, while the processing and conversion of the evaluation into technical data for adjustment and/or control are carried out in the other computer, in particular in the first computer; for this purpose, the evaluation is transferred from one computer to the other. This can reduce the computation time, because the computers can operate simultaneously. Furthermore, the two computers do not have to be arranged next to each other; for example, a second unit with the second computer may be located remotely from a first unit with the first computer.

The storing, the qualitative comparison of the received data with data stored in the database and/or the evaluation by the classifier are preferably supported by artificial intelligence. An almost autonomous system can thereby be created.

The device, in particular the first unit and the second unit, may be of modular design so that they can be connected to each other and also to other units of the overall system.

Real-time is understood as the possibility of being able to carry out the analysis and processing processes in situ in one process step.

A voxel is understood in the context of the present invention to be a spatial data record generated discretely or continuously in time by a sensor, i.e. a visual detection unit, in an imaging method.

Preferably, the actuator comprises a mechanical structure, in particular a rotary unit, which is located in a clamping structure of the housing of the first unit.

Drawings

Further advantages, features and applications of the above invention may be derived from the following description in connection with the illustrated examples.

In the description, in the claims and in the drawings, the terms and associated reference signs listed in the list of reference numerals below are used. In the drawings:

FIG. 1 is a schematic view of a transport carrier system having spatially separated housings for mobile devices according to a first embodiment of the present invention;

FIG. 2 is a schematic view of a transport carrier system according to a second embodiment of the present invention, the spatially separated housings of the mobile device of the transport carrier system being interconnected by a plug connection;

fig. 3 is a side view of a transport carrier system according to a first embodiment of the present invention, with a mobile device connected to a transport carrier drone.

FIG. 4 is a flow chart illustrating steps in a method of using the delivery vehicle system;

FIG. 5 is a flow chart illustrating steps of a method of determining necessary actions;

FIG. 6 shows an image acquired by a visual detection unit;

FIG. 7 is a schematic diagram of a convolutional neural network, using the image of FIG. 6 as an example;

FIG. 8 is a flow chart of a method of operation of the segmentation and data reduction unit;

FIG. 9 shows an intermediate image created by the segmentation and data reduction unit;

FIG. 10 shows a partial fragment of an intermediate image providing three different cases for a classifier;

FIG. 11 shows two basic diagrams of further pixel fields for evaluation by the classifier;

FIG. 12 shows two basic diagrams of a further field of pixels for evaluation by a classifier;

FIG. 13 shows an image created and evaluated by the classifier, and

Fig. 14 is a basic diagram illustrating the operation method of the classifier.

Detailed Description

Fig. 1 shows a schematic representation of a transport vehicle system 10, which consists of a transport vehicle in the form of a drone 12 and an agricultural mobile analysis and processing device 14 for soil treatment and animal and plant population analysis operations. Referring to fig. 3, the drone 12 includes a drive 16, the drive 16 including four motors 18 and propellers 20 driven thereby. Further, the drone 12 is provided with four feet 22 below the motor 18.

According to a first embodiment of the invention, the drone 12 includes a power source in the form of a battery 24 that powers the drive 16 and other components of the drone 12 as well as the mobile device 14. For this purpose, the drone 12 is provided with a power supply interface 26a and the mobile device 14 with a corresponding power supply interface 26b, which are connected to each other by a detachable plug connection 28. Furthermore, a communication unit 30 with an antenna 32 and a GPS unit 34 are provided; the GPS unit 34 continuously determines the position of the drone 12, and the position data of the drone 12 are transmitted via the communication unit 30 to the mobile device 14, for sorting the data acquired by the mobile device 14, and to a remote central processing unit, which is not detailed here. Telemetry can be accomplished by means of the GPS unit 34, the communication unit 30 and the mobile device 14. Furthermore, a control device 12b is provided which controls the drive 16.

In addition to the antenna 32, the communication unit 30 of the drone 12 also includes a further interface 36a, to which a correspondingly assigned interface 36b of the mobile device 14 is connected by a detachable plug connection 38 for data exchange.

The mobile device 14 is made up of two units 14a, 14b, a first unit 14a having a first housing 40 and a second unit 14b having a second housing 42. The first housing 40 and the second housing 42 are detachably connected to each other by a plug connection 44, so that the mobile device 14 forms one unit. A set of different first units 14a on one side and a set of different second units 14b on the other side can thus be configured separately and adapted to different needs by simply plugging them together.

In the first housing 40, there are provided a first computer 46, an actuator in the form of a movable, motor-driven robot arm 48, a motor 50 which interacts with the robot arm 48, and a tool unit 52 which is arranged on the robot arm 48, the tool unit 52 comprising a feed unit 54 and a rotation unit 56. At the distal end of the rotation unit 56, a rotary plow 58 is provided as the tool. The motor 50 drives not only the robot arm 48 but also the feed unit 54 and the rotation unit 56 and thus also the rotary plow 58. The robot arm 48 may be multi-part and have various joints, which are not described in detail since the kinematics of such motor-driven arms are known. By means of the robot arm 48, the tool unit 52 is moved to its area of use relative to the drone 12, and the tool unit 52 with the feed unit 54 and the rotation unit 56 can be used, by way of the rotary plow 58, to treat plants, for example to remove weeds, and/or to treat the soil.

Further, a communication unit 60 and a visual detection unit 62 are provided in the first unit 14a. The visual detection unit 62 includes a camera 64 for capturing images, a segmentation and data reduction device 66, and a classifier 68 for classifying a plurality of pixel fields consisting of pixels based on intermediate images or intermediate data generated by the segmentation and data reduction device 66, as will be described in further detail below. The visual detection unit 62 is connected to the communication unit 60.

The first unit 14a includes an interface 70a that mates with the interface 70b of the second unit 14b. The communication unit 60 is connected to the communication unit 74 of the second unit 14b via the interface 70a, the communication connection 72 and the interface 70b. Via the interface 36b and the plug connection 38, the communication unit 74 of the second unit 14b is connected to the interface 36a and thus to the communication unit 30 of the drone 12.

A second computer 76 and a database 78 are also provided in the second unit 14b.

Fig. 2 shows another embodiment of the transport carrier system 10, in which the configuration of the drone 12 is the same as in the first embodiment. The only difference is that there is a plug connection 80 between the first unit 14a and the second unit 14b of the mobile device 14. In addition, the plug connection 80 also detachably connects the communication unit 60 of the first unit 14a with the communication unit 74. By simple plugging, different first units 14a can be combined with different second units 14b into one mobile device 14.

Fig. 3 shows a side view of the drone 12, in which only two of the four motors 18 with their corresponding propellers 20 are visible. A foot 22 is provided below each motor 18. Two clamping arms 82a, 82b are provided between the feet 22 for gripping, lifting, lowering and releasing the mobile device 14 as required. The mobile device 14 is composed of the two units 14a and 14b, which are detachably connected to each other by the plug connection 80. In the first unit 14a, the camera 64 of the visual detection unit 62 and the rotary plow 58 at the distal end of the rotation unit 56 can be seen.

The mobile device 14 may also be provided with a plurality of different tool units 52, which share a common robot arm 48 and a tool turret that brings the desired tool unit 52 into the active position. It is also possible to provide different tool units with their own actuators.

FIG. 4 is a flow chart showing the sequential steps of agricultural soil treatment and flora and fauna analysis using the transport carrier system 10.

In a first step 84, the transport carrier system 10 is first used to determine the necessary measures to be implemented for a given agricultural area. For example, the transport carrier system 10 may be brought to the agricultural area to be treated, such as a field, or may fly there directly from a central location. The drone 12 with the mobile device 14 takes off and flies over the field. The transport carrier system 10 obtains the necessary data about the field to be examined from a stationary central computing unit, which may also be a smartphone. The field is captured in the form of images by the visual detection unit 62 of the mobile device 14 with its camera 64. The images are evaluated and compared with the database 78 in order to finally determine the necessary measures to be taken on the field.

In a next step 86, depending on the measures determined for the field or a partial area of the field, a mobile device 14 suited to the necessary measures is assembled from a set of first units 14a and a set of different second units 14b, and the two units 14a, 14b are connected to each other.

In a next step 88, the mobile device 14 is gripped from both sides by the clamping arms 82a and 82b of the drone 12 and moved upwards by the clamping arms in the direction of the drone 12 into the receptacle 12a of the drone 12. As a result, the power supply interfaces 26a, 26b are connected to each other by the plug connection 28, and the interfaces 36a, 36b are connected to each other by the plug connection 38. In this way, the battery 24 of the drone 12 supplies power to the mobile device 14, and data can be exchanged, via the antenna 32 of the communication unit 30 of the drone 12, between the communication units 60 and 74 of the mobile device 14 on the one hand and the central computing unit on the other hand. As mentioned above, the central computing unit, which is independent of the transport carrier system 10, may also be a smartphone.

In a next step 90, the transport carrier system 10 carries out the determined measures in the field. For example, the drone 12 flies to an area of the field that needs to be treated. The tool unit 52 is advanced by the robot arm 48 to the weed to be removed. The feed unit 54 moves the rotary plow 58 over the weed, which is removed when the rotation unit 56 is activated.

In a fifth step 92, the drone 12 flies back and exchanges the mobile device 14 for another mobile device 14 that is better suited to the next measure, for example one with a spraying device for insecticide or fertilizer.

Alternatively, steps 86 and 88 may also be omitted if the drone 12 has been outfitted for the action to be performed.

Fig. 5 shows the sequence for determining the necessary measures by the transport carrier system 10, in particular by the mobile device 14.

In a first step 94, data-technically defined voxel and/or pixel and/or image data are continuously received by the visual detection unit 62 of the mobile device 14. In a second step 96, the received data formed by the voxels, pixels and images are continuously transferred to the database 78.

The storage of the received data is completed in a third step 98.

In a fourth step 100, the received data are qualitatively compared with data stored in the database 78. Here, the received data are segmented and reduced by the segmentation and data reduction device 66. In particular, the received data may also be verified by the second computer 76.

In a fifth step 102, the evaluation is performed by the classifier 68 and the second computer 76 together with the assistance of artificial intelligence, as will be described in detail below.

In a sixth step 104, the evaluation is processed by the first computer 46 and converted into technical data for the adjustment and/or control of the motor 50, the robot arm 48, the tool unit 52 and the drone 12.

In a seventh step 106, the motor 50, the robotic arm 48 and the tool unit 52 are activated to begin soil treatment or animal and plant analysis operations.
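Steps 94 to 106 can be summarized as a schematic control loop. The following is a minimal sketch; the helper objects (camera, database, segmenter, classifier, controller) are hypothetical placeholders, not components named in the patent, and only indicate where each step would be carried out.

```python
# Schematic sketch of the real-time loop of FIG. 5 (steps 94-106).
# All helper objects are hypothetical placeholders and not part of the patent.
def control_loop(camera, database, segmenter, classifier, controller):
    while True:
        image = camera.capture()                      # step 94: receive pixels/voxels/images
        database.store(image)                         # steps 96/98: transmit and store
        mask, coords = segmenter.segment(image)       # step 100: segment, reduce, compare
        labels = classifier.classify(image, coords)   # step 102: evaluate with classifier 68
        commands = controller.to_commands(labels)     # step 104: convert into control data
        controller.activate(commands)                 # step 106: start motor, arm, tool unit
```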

In this application, whenever artificial intelligence is mentioned, it refers to the use of a classical convolutional neural network (CNN), which consists of one or more convolutional layers followed by a pooling layer. In principle, this sequence of convolutional and pooling layers can be repeated as often as desired. The input is usually a two- or three-dimensional matrix, for example the pixels of a grayscale or color image. The neurons are arranged accordingly in the convolutional layers.

The activity of each neuron is calculated by a discrete convolution (convolutional layer). Intuitively, a relatively small convolution matrix (filter kernel) is moved stepwise over the input. The input of a neuron in the convolutional layer is computed as the inner product of the filter kernel and the currently underlying image portion. Accordingly, adjacent neurons in the convolutional layer respond to overlapping regions.

The neurons of this layer respond only to stimuli from a local environment of the previous layer. This follows the biological model of the receptive field. Furthermore, the weights are the same for all neurons of a convolutional layer (shared weights). As a result, for example, each neuron in the first convolutional layer encodes the edge strength in a certain local region of the input. Edge detection as the first step of image recognition has high biological plausibility. It follows directly from the shared weights that translation invariance is an inherent property of convolutional neural networks.

The input of each neuron determined by the discrete convolution is converted into an output by an activation function, in convolutional neural networks usually the rectified linear unit ReLU, f(x) = max(0, x), which models the relative firing frequency of a real neuron. Since the backpropagation method requires the calculation of gradients, a differentiable approximation of ReLU is used in practice: f(x) = ln(1 + e^x). As in the visual cortex, both the size of the receptive fields and the complexity of the recognized features increase in deeper convolutional layers.

In the next step, pooling, excess information is discarded. For object recognition in an image, for example, the exact position of an edge within the image is of negligible interest; the approximate location of a feature is sufficient. There are several types of pooling. By far the most common is max pooling, in which, from each 2×2 square of neurons in the convolutional layer, only the activity of the most active ("max") neuron is retained for the further computation steps; the activity of the remaining neurons is discarded. Although this reduces the amount of data (by 75% in the example), the performance of the network is generally not degraded by pooling.
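To illustrate the architecture just described (convolution, ReLU activation, max pooling and a dense layer that classifies the centre pixel of an image portion), a minimal PyTorch sketch follows. The patch size, channel counts and number of classes are illustrative assumptions and do not represent the network actually used.

```python
import torch
import torch.nn as nn

# Minimal sketch of a patch classifier in the style described above
# (convolution -> ReLU -> max pooling, repeated, then a dense layer).
# Layer sizes are illustrative assumptions, not the patent's network.
class PatchClassifier(nn.Module):
    def __init__(self, n_classes=3, patch=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, padding=2),  # feature convolution 112
            nn.ReLU(),                                    # f(x) = max(0, x)
            nn.MaxPool2d(2),                              # 2x2 max pooling 114
            nn.Conv2d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # dense layer 116 summarising the pooled feature maps
        self.dense = nn.Linear(32 * (patch // 4) * (patch // 4), n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.dense(x.flatten(1))  # class scores for the patch's centre pixel

# Example: classify one 32x32 RGB image portion around a plant pixel.
logits = PatchClassifier()(torch.randn(1, 3, 32, 32))
```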

The use of a convolutional neural network and the use of the segmentation and data reduction means 66 will be explained in more detail below with reference to fig. 6 to 14.

There are different ways in which all objects in an image can be classified by the classifier 68. Many approaches first locate each object in the image and then classify it. However, this is not always possible, as the classification of plants 110 in a field illustrates. FIG. 6 shows an example image 108.

In fig. 6, various plants 110 are depicted, which are to be classified in real time by the classifier 68. Real time here means 10 frames per second. Since in this example the exact position of the individual plants 110 cannot easily be determined, another approach has to be taken, because the available computation time is not sufficient to first separate the plants 110 themselves and then classify them.

The image 108 shown in fig. 6 is composed of pixels, each of which logically belongs to exactly one class. One conceivable approach is therefore to classify the entire image 108 pixel by pixel, i.e. each pixel is assigned to a class in turn.

However, since a single pixel does not contain the information necessary to decide on its class assignment, the classification must take the surrounding region into account. This region can be classified using a convolutional neural network (CNN) as described above. The network has the structure shown in fig. 7.

The input image 108 here is the image of fig. 6, to which the elements of the convolutional neural network are now applied. In this example, features are computed by convolution 112, then pooled 114, further features are convolved and pooled again, and the result is summarized in a dense layer 116. The network output then gives the class attribution of the central pixel of the input image portion, i.e. of one pixel of the image 108 of fig. 6.

Thereafter, a new image portion, typically shifted by one pixel, is selected and classified again using the convolutional neural network. This procedure means that the calculation of the convolutional neural network must be repeated once for every pixel to be classified, which is time-consuming. The resolution of the image 108 of fig. 6 is 2000 × 1000 pixels, so the convolutional neural network would have to be evaluated 2 million times. The initial problem, however, was only the classification of the plants 110 themselves. On average, such an image contains about 5% plant pixels, i.e. only about 100,000 pixels.
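For illustration, the naive pixel-by-pixel procedure described above can be sketched as follows, reusing the hypothetical patch classifier from the sketch above; it is meant only to show why roughly 2 million CNN evaluations would be required.

```python
import torch
import torch.nn.functional as F

# Naive sliding-window classification (sketch): the CNN is evaluated once per
# pixel of the 2000 x 1000 image, i.e. about 2 million times, which motivates
# the optimizations described below. "model" is a patch classifier as above.
def classify_every_pixel(image, model, patch=32):
    h, w = image.shape[1:]                      # image: (3, H, W) tensor
    half = patch // 2
    padded = F.pad(image, (half, half, half, half))
    labels = torch.zeros(h, w, dtype=torch.long)
    with torch.no_grad():
        for y in range(h):
            for x in range(w):
                window = padded[:, y:y + patch, x:x + patch].unsqueeze(0)
                labels[y, x] = model(window).argmax(dim=1)
    return labels
```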

A simple segmentation and data reduction by the segmentation and data reduction unit 66 can determine whether a pixel belongs to a plant 110 or to the background 118. This segmentation is computationally much less complex than a convolutional neural network and is therefore faster. The segmentation and data reduction are performed by the segmentation and data reduction unit 66 as shown in fig. 8, which illustrates the individual steps.

In a first step 120, each image transferred to the database 78, consisting of a plurality of pixels, is converted into the RGB (red, green, blue) color model.

In a next step 122, each pixel of the RGB representation is converted into the HSV (hue, saturation, value) color model.

In a next step 124, the HSV image is evaluated.

The saturation of each pixel in the HSV color model is evaluated against a threshold value: the pixel is set to the binary value 1 if its saturation exceeds the threshold, and to the binary value 0 if its saturation is below the threshold.

At the same time, the hue of each pixel in the HSV color model is evaluated against a predetermined range: the pixel is set to the binary value 1 if its hue lies within the predetermined range, and to the binary value 0 if its hue lies outside the predetermined range.

In a next step 126, an intermediate image is generated from the binary hue and saturation information; this intermediate image contains significantly less data than the image 108 generated by the camera (a sketch of these steps is given below).
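The following is a minimal sketch of steps 120 to 126, assuming an 8-bit RGB input image and using OpenCV for the color conversion; the saturation threshold and the hue range are illustrative assumptions, since the patent does not specify concrete values.

```python
import cv2
import numpy as np

# Minimal sketch of steps 120-126 of FIG. 8. The concrete threshold and hue
# range are assumptions for illustration; the patent does not specify values.
def segment_plants(rgb_image, sat_threshold=60, hue_range=(35, 85)):
    hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)        # step 122: RGB -> HSV
    hue, sat = hsv[..., 0], hsv[..., 1]
    sat_ok = sat > sat_threshold                             # step 124: binary saturation value
    hue_ok = (hue >= hue_range[0]) & (hue <= hue_range[1])   # step 124: binary hue value
    mask = (sat_ok & hue_ok).astype(np.uint8)                # step 126: intermediate image (1 = plant)
    coords = np.argwhere(mask == 1)                          # coordinate list of plant pixels
    return mask, coords
```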

Equation (1) below summarizes the segmentation result of fig. 8 and must be applied to each pixel. The segmented image S(x, y) is obtained from the RGB image Ψ(x, y), which is split into its red, green and blue components. A pixel of the segmented image is set to 1 if the minimum of the red, green and blue values divided by the green value is less than or equal to a threshold TH_S, where the threshold for an 8-bit image is scaled to the value range of 255. If the condition is not met, the pixel of the segmented image is set to 0 according to equation (1):

S(x, y) = 1 if min(Ψ_R(x, y), Ψ_G(x, y), Ψ_B(x, y)) / Ψ_G(x, y) ≤ TH_S, otherwise S(x, y) = 0.   (1)

This results in a first optimization: the segmentation according to fig. 8 is performed before the entire image 108 is decomposed into roughly 2 million image portions. That is, the entire image 108 is examined and the plant pixels are determined using the formula given above. First, the image 108 is segmented, i.e. the background 118 is set to black (0), as shown in fig. 9. Second, if a pixel is a plant pixel, its coordinates are written into a list. Then only the coordinates contained in this list are fed into the convolutional neural network; unnecessary pixels of the ground, i.e. of the background 118, are omitted. The number of convolutional neural network evaluations is thereby reduced by a factor of about 20.

By the segmentation, the value of the background 118 is set to 0. The image elements considered by the convolutional neural network are therefore segmented image portions. Normally, the feature calculation of a convolutional layer is applied to every pixel of the image element. Fig. 10 shows three cases 128, 130, 132 of this calculation, each with a feature 134 of size 5×5.

The red case 128 shows a feature calculation in which the feature lies entirely over the background 118. Here every element is multiplied by 0, so that the result of the entire calculation is 0, i.e. only the offset value. The result of this calculation is therefore known before it is carried out. Even if the background 118 were non-zero, i.e. soil were visible, the calculation would not contain any information about the plant 110, so the result would simply be a constant, meaningless value.

In the yellow case 130, the centre of the feature does not lie on the plant 110. This means that some of the multiplications are again by 0. In this case, the plant lies in the border region of the feature, and this value appears in the feature map.

In the blue case 132, at least the center pixel of the feature is on the plant.

Considering the three cases 128, 130, 132, only the two cases 130, 132, yellow and blue, actually need to be calculated; both are characterized by at least one input value that is not zero. For all other feature calculations the result is known beforehand: it is zero or merely the offset value. The coordinates at which the blue case 132 occurs are known; these are the coordinates stored during segmentation. For the yellow case 130, it would first have to be determined whether a coordinate is affected, which would require checking every plant pixel found during segmentation. This is omitted, since the check is too laborious and the yellow case 130 only occurs in the border regions of the plant 110.

Thus, the computation can be optimized by applying the feature computation and all other elements of the convolutional neural network only to the plant pixels that have been found.

The difference between two adjacent plant pixels 136, 138 is illustrated in fig. 11. To the left is plant pixel 136 and to the right is adjacent plant pixel 138. The blue/purple regions 140, 142 may be different plants that need to be classified. The red/purple regions 144, 142 represent image elements being observed by the convolutional neural network to classify the orange pixels 146.

On closer inspection it can be seen that the observed regions (red/purple) 144, 142 overlap to a large extent. This in turn means that the two image elements 136, 138 contain largely the same values. If the convolutional neural network now computes the features of the convolutional layer, the feature calculation will also yield the same values.

The feature 148 with a size of 5×5 pixels is depicted in green in fig. 12. This feature 148 is located at the same coordinates within the entire image, but it is shifted within the image elements (red/purple) 144, 142 that the convolutional neural network observes. However, since its position in the entire image is the same, the calculation of the central black box 150 yields the same values in the left image element 136 and the right image element 138. This finding can be applied to all elements of the convolutional neural network. A single feature calculation can therefore first be performed on the entire image, with the border region omitted. In theory, the decomposition of the input image 108 only plays a decisive role for the dense layer 116. The dense layer 116 can, however, also be calculated in the same manner as the convolution 112; in this case the size of the feature results from the interaction of the size of the input image element with the pooling layers present in the network. This allows the classification to be optimized further: the elements of the convolutional neural network are now applied only to the plant pixels that have been found. The feature map calculated by the last convolution then represents the result of the classification, as shown in fig. 13, where all carrot plants 152 are classified as green pixels and all weeds 154 as red pixels.
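The remark that the dense layer 116 "may also be calculated in the same manner as the convolution 112" can be illustrated with a small PyTorch sketch. The layer sizes follow the illustrative patch classifier above and are assumptions, not values from the patent.

```python
import torch
import torch.nn as nn

# A dense layer over an 8x8 feature map with 32 channels is replaced by an
# equivalent convolution whose kernel covers the whole patch-sized map. The
# convolutional form can then slide over the feature map of the entire
# (segmented) image, yielding one class score per position (cf. FIG. 13).
dense = nn.Linear(32 * 8 * 8, 3)
as_conv = nn.Conv2d(32, 3, kernel_size=8)
as_conv.weight.data = dense.weight.data.view(3, 32, 8, 8)  # same weights, reshaped
as_conv.bias.data = dense.bias.data

feature_map = torch.randn(1, 32, 8, 8)
same = torch.allclose(dense(feature_map.flatten(1)),
                      as_conv(feature_map).flatten(1), atol=1e-5)
print(same)  # True: both forms give identical class scores for one patch
```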

However, these optimizations can also lead to changes in the classification results. The pooling layer here is most affected. Each time pooling occurs, the information is removed from the network. Pooling also loses local reference since existing picture elements are no longer viewed individually. Figure 14 illustrates this problem visually.

In fig. 14, one image element 156 at a time is shown with a red border 158. Before the optimization, each image element is classified individually by the convolutional neural network via its central pixel. The right image element 160 is shifted one pixel to the right. The four colors violet, blue, yellow and green each represent one application of pooling. As shown, these can yield different results, because pooling always starts from the edge and then advances by one pooling element (here two pixels). Two adjacent image elements 156, 160 therefore lead to two different pooling results. If this is to be taken into account in the optimization, every pooling operation produces two new branches for the further calculation. Since the pooling must be applied once to the entire image, one starting point is the upper left corner, and the starting point of the second pooling is the upper left corner shifted by one pixel. In the further calculation, the two pooling results must be processed separately. A further, second pooling then leads to two new paths each, so that four results have to be calculated separately. The final result is then composed from the four results in pixel-by-pixel rotation. If only one path were followed after the pooling operations, the output image after two poolings would be smaller: its length and width would each be only a quarter of the input image. Following all paths yields an output image approximately the size of the input image.
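The effect of the shifted pooling grids described above can be illustrated with a small NumPy sketch; the concrete recombination of the partial results (the pixel-by-pixel interleaving) is only indicated and is an assumption about the scheme, not code from the patent.

```python
import numpy as np

# Sketch of the pooling problem of FIG. 14: the same image max-pooled with a
# 2x2 grid anchored at the upper left corner and with a grid shifted by one
# pixel gives different results, so both "branches" have to be kept and
# recombined pixel by pixel.
def max_pool_2x2(x, offset=(0, 0)):
    r0, c0 = offset
    x = x[r0:, c0:]
    h2, w2 = x.shape[0] // 2, x.shape[1] // 2
    return x[:h2 * 2, :w2 * 2].reshape(h2, 2, w2, 2).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
branch_a = max_pool_2x2(img, offset=(0, 0))  # grid anchored at the upper left corner
branch_b = max_pool_2x2(img, offset=(1, 1))  # grid shifted by one pixel
print(branch_a)
print(branch_b)  # adjacent image elements end up in different pooling cells
```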

Another difference concerns the border region of the plant. Since the feature is not applied to all elements that have any overlap with the plant, there is a computational difference here. This can also change the classification result compared with the conventional calculation method.

Omitting the calculation of the feature values outside the plant likewise leads to different values, since the result there is assumed to be zero, i.e. the offset value.

Although these three factors affect the result, the convolutional neural network is robust enough that the results still achieve a high accuracy.

The next step is to train the network directly with these corrections so that the network can better adapt to the new calculations, thus directly counteracting possible deviations in the calculations.

The segmentation and data reduction device provides location coordinates for the pixels associated with the weeds 154.

List of reference numerals: 10 transport carrier system; 12 drone; 12a receiving device, receiving area of the drone 12; 12b control device of the drone; 14 mobile device; 14a first unit; 14b second unit; 14c holding device for the clamping arms on the mobile device 14; 16 drive; 18 motor; 20 propeller; 22 feet; 24 battery; 26a power supply interface of the drone 12; 26b power supply interface of the mobile device 14; 28 plug connection; 30 communication unit; 32 antenna; 34 GPS unit; 36a interface of the drone 12; 36b interface of the mobile device 14; 38 plug connection; 40 first housing of the first unit 14a; 42 second housing of the second unit 14b; 44 plug connection; 46 first computer; 48 robot arm as actuator; 50 motor; 52 tool unit; 54 feed unit; 56 rotation unit; 58 rotary plow; 60 first communication unit of the first unit 14a; 62 visual detection unit; 64 camera; 66 segmentation and data reduction device; 68 classifier; 70a interface of the first unit 14a; 70b interface of the second unit 14b; 72 communication connection; 74 second communication unit of the second unit 14b; 76 second computer; 78 database; 80 plug connection; 82a left clamping arm; 82b right clamping arm; 84 first step: determining the necessary measures; 86 second step: selecting a mobile device 14 from the available mobile devices 14; 88 third step: connecting the drone 12 to the device; 90 fourth step: executing the determined measures; 92 fifth step: replacing the mobile device 14 with another mobile device 14 and performing the next action; 94 first step: continuous receiving; 96 second step: transmitting the data; 98 third step: storing the data; 100 fourth step: comparing the data; 102 fifth step: evaluation by the classifier 68; 104 sixth step: converting into technical data for adjustment and control; 106 seventh step: activating the units; 108 example image, input image; 110 plants; 112 convolution; 114 pooling; 116 summarizing in a dense layer; 118 background; 120 first step: conversion into the RGB (red, green, blue) color model; 122 second step: conversion into the HSV (hue, saturation, value) color model; 124 third step: evaluating the HSV image; 126 fourth step: generating an intermediate image; 128 first case, red; 130 second case, yellow; 132 third case, blue; 134 feature; 136 left plant pixel; 138 right plant pixel; 140 blue region; 142 purple region; 144 red region; 146 orange region; 148 feature, green; 150 central black frame; 152 carrot plants; 154 weeds; 156 left image element; 158 red border; 160 right image element
