Combine harvester, method for generating field agricultural operation map, program for generating field agricultural operation map, and recording medium having recorded program for generating field agricultural operation map

Document No.: 1722267  Publication date: 2019-12-17

Reading note: This technique, "Combine harvester, method for generating field agricultural operation map, program for generating field agricultural operation map, and recording medium having recorded program for generating field agricultural operation map", was devised by 北原麻央 and 高原一浩 on 2018-06-22. Main content: The present invention aims to make the detection of fallen grain stalks usable in information management technology for future agricultural operation planning. A combine harvester is provided with: a body position calculation unit (66) that calculates the body position, which is the map coordinates of the machine body; an imaging unit (70) that images the field during the harvesting operation; an image recognition module (5) that receives image data of a captured image obtained by the imaging unit (70), estimates a fallen straw region in the captured image, and outputs recognition output data indicating the estimated fallen straw region; an evaluation module (4A) that outputs a crop evaluation value per unit travel obtained by evaluating the crops harvested in sequence; a lodging straw position information generating unit (51) that generates lodging straw position information representing the position on the map of the lodging straw region from the body position and the recognition output data; and a harvest information generation unit (4B) that generates harvest information from the body position at the time point when the crop is harvested and the crop evaluation value.

1. A combine harvester for harvesting crops while traveling in a field, comprising:

a body position calculation unit that calculates a body position, which is a map coordinate of the machine body, based on positioning data from a satellite positioning module;

an imaging unit provided in the machine body and configured to image the field during the harvesting operation;

an image recognition module that receives image data of captured images sequentially obtained by the imaging unit in a time-series manner, estimates a fallen straw region in the captured images, and outputs recognition output data indicating the estimated fallen straw region;

an evaluation module that outputs a crop evaluation value per unit travel obtained by evaluating the crops harvested in sequence;

a lodging straw position information generating unit that generates lodging straw position information indicating a position on a map of the lodging straw region based on the body position at the time point when the captured image is acquired and the recognition output data; and

a harvest information generating unit that generates harvest information based on the body position at the time point when the crop is harvested and the crop evaluation value.

2. The combine harvester according to claim 1, wherein

the crop evaluation value includes yield, taste, or both.

3. The combine harvester according to claim 1 or 2, further comprising

a field agricultural operation map generation unit configured to generate a field agricultural operation map by integrating the lodging straw position information and the harvest information on a map, the field agricultural operation map generation unit being built into a control system in the machine body or into a cloud computer system.

4. A method for generating a field agricultural operation map, comprising the following steps:

a step of outputting recognition output data indicating a fallen straw region estimated based on a captured image obtained by an imaging unit provided in a combine harvester;

a step of generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data;

a step of outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field;

a step of generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and

a step of integrating the lodging straw position information and the harvest information on a map to generate the field agricultural operation map.

5. A field agricultural operation map generation program that causes a computer to realize:

a function of outputting recognition output data representing a fallen straw region estimated based on a captured image obtained by an imaging unit provided in a combine harvester;

a function of generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data;

a function of outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field;

a function of generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and

a function of integrating the lodging straw position information and the harvest information on a map to generate a field agricultural operation map.

6. A recording medium having recorded thereon a field agricultural operation map generation program that causes a computer to realize:

a function of outputting recognition output data representing a fallen straw region estimated based on a captured image obtained by an imaging unit provided in a combine harvester;

a function of generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data;

a function of outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field;

a function of generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and

a function of integrating the lodging straw position information and the harvest information on a map to generate a field agricultural operation map.

Technical Field

The present invention relates to a combine harvester capable of harvesting crops while traveling in a field and of providing agricultural operation support based on a captured image obtained by an imaging unit, and to a method for generating a field agricultural operation map using information obtained by the combine harvester.

Background

In the harvesting operation of a combine harvester, the planted grain stalks at harvest time include not only upright stalks but also lodged (fallen) stalks. Harvesting planted stalks in such a fallen state requires control different from that for harvesting upright stalks. For example, the combine harvester of patent document 1 includes a television camera that captures an image of the grain stalks in front of the harvesting unit, and an image processing device. The image processing device compares the image from the television camera with pre-stored images showing the various standing states of grain stalks, and detects the standing state of the stalks. When it detects that part of the stalks in front of the cutting unit has fallen, the reel is tilted so that the fallen side of the stalks is lower. This is intended to improve the harvesting performance for fallen stalks.

In the combine harvester of patent document 2, the degree of lodging of the planted stalks is evaluated before they are harvested, based on the power spectrum distribution computed from images captured by an electronic camera during the harvesting run. The vehicle speed and the like are adjusted in a timely manner according to the degree of lodging to regulate the threshing load, thereby realizing smooth threshing work.

Disclosure of Invention

Problems to be solved by the invention

In the combine harvesters of patent documents 1 and 2, fallen straw is detected during the harvesting operation, and the work travel control is adjusted based on the detection result. Although crop harvesting is repeated every year, the information on fallen straw detected during one harvesting operation is used only for that operation. However, if the fallen straw is caused by locally excessive fertilizer (excess nitrogen) or by sunshine conditions, information indicating the position of the fallen straw could be used in the agricultural operation plan, that is, the next plan for the cultivation and harvesting of crops, to reduce lodging.

In view of such circumstances, it is desirable to be able to utilize the detection of fallen straws in information management technology for future agricultural operation planning.

Means for solving the problems

The present invention provides a combine harvester for harvesting crops while traveling in a field, comprising: a body position calculation unit that calculates a body position, which is a map coordinate of the machine body, based on positioning data from a satellite positioning module; an imaging unit provided in the machine body and configured to image the field during the harvesting operation; an image recognition module that receives image data of captured images sequentially obtained by the imaging unit in a time-series manner, estimates a fallen straw region in the captured images, and outputs recognition output data indicating the estimated fallen straw region; an evaluation module that outputs a crop evaluation value per unit travel obtained by evaluating the crops harvested in sequence; a lodging straw position information generating unit that generates lodging straw position information indicating a position on a map of the lodging straw region based on the body position at the time point when the captured image is acquired and the recognition output data; and a harvest information generation unit that generates harvest information from the body position at the time point when the crop is harvested and the crop evaluation value.

In the present invention, when fallen straw appears in a captured image, the image recognition module estimates the fallen straw region from the image data of that captured image. Since the body position, expressed in map coordinates at the time point when the captured image was acquired, is calculated by the body position calculation unit, lodging straw position information indicating the position of the fallen straw on the map is generated from the body position and the recognition output data indicating the fallen straw region. At the same time, a crop evaluation value per unit travel is obtained by evaluating the crops harvested in sequence, and harvest information is generated from the body position at the time point when the crop is harvested and the crop evaluation value. As a result, the distribution of fallen straw regions on the map can be confirmed from the lodging straw position information, and the distribution of crop evaluation values on the map can be confirmed from the harvest information. By comparing the distribution of lodging straw regions in the field with the distribution of crop evaluation values, the fertilizing amount in the lodging straw regions can be reduced, or the planting amount can be adjusted, in the next crop cultivation. By using the combine harvester of the present invention, it is possible to obtain information that supports the next agricultural operation plan while performing harvest operation support control that takes into account the position of the fallen straw on the map (the distance between the fallen straw and the combine harvester).
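For illustration only (not part of the patent disclosure), the conversion from a fallen straw region detected relative to the machine body into a position on the map can be sketched as follows. A flat-ground projection is assumed, the camera calibration is assumed already done, and all function and variable names are assumptions:

```python
import math

def to_map_position(body_pos, heading_rad, forward_m, lateral_m):
    """Project a point given in the machine-body frame (forward, lateral)
    onto field/map coordinates using the body position and heading.
    Positive lateral_m is to the left of the travel direction."""
    bx, by = body_pos
    mx = bx + forward_m * math.cos(heading_rad) - lateral_m * math.sin(heading_rad)
    my = by + forward_m * math.sin(heading_rad) + lateral_m * math.cos(heading_rad)
    return (mx, my)

# A fallen straw region estimated 8 m ahead and 2 m to the right of the body,
# while the body is at field coordinates (100, 50) with heading 0 rad:
pos = to_map_position((100.0, 50.0), 0.0, 8.0, -2.0)
```

Stamping each estimated region with the body position at the moment its image was acquired, as the text describes, then reduces lodging straw position information to a list of such map points.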

In crop harvesting, the harvested amount, that is, the yield, and the taste of the harvested crop are important evaluation variables. If the crop is wheat or rice, the amount of grain put into the grain tank per travel distance (per unit time) and the moisture or protein content of the grain harvested per travel distance (per unit time) can be measured sequentially. Therefore, in a preferred embodiment of the present invention, the crop evaluation value includes yield, taste, or both. This makes it possible to grasp the position of the fallen straw regions in the field, as well as the yield and taste as a function of field position.

In a preferred embodiment of the present invention, a field agricultural operation map generation unit for generating the field agricultural operation map by integrating the lodging straw position information and the harvest information on a map is built into a control system in the machine body, a cloud computer system, a server installed at a remote location, or the like. Such a field agricultural operation map can be generated by combining a lodging straw map, which represents the distribution of lodging straw regions over the division units of the field, with a harvest map, which represents the distribution of yield and taste over the division units of the field, in such a manner that their map coordinates or field coordinates coincide. When the field agricultural operation map is generated by the combine harvester or by a communication terminal (a liquid crystal display, a tablet computer, a smartphone, or the like) attached to the combine harvester, it can be used anytime and anywhere by uploading it to the cloud computer system. By using the field agricultural operation map, the crop evaluation values in the lodging straw regions can be analyzed per division unit. For example, if the crop evaluation value is yield, the yield difference between lodging straw regions and non-lodging straw regions can be grasped clearly and easily from such a map, and can be referred to later in an agricultural operation plan such as a fertilization plan.
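The per-division-unit integration described above can be sketched as follows (an illustrative sketch, not part of the disclosure; cell ids, record layout, and the toy figures are assumptions):

```python
def build_field_map(lodging_cells, harvest_cells):
    """Merge per-cell lodging flags and per-cell harvest records into one
    field agricultural operation map keyed by division-unit (cell) id."""
    field_map = {}
    for cell, lodged in lodging_cells.items():
        rec = harvest_cells.get(cell, {})
        field_map[cell] = {"lodged": lodged,
                           "yield": rec.get("yield"),
                           "taste": rec.get("taste")}
    return field_map

def mean_yield(field_map, lodged):
    """Average yield over cells with the given lodging state."""
    vals = [c["yield"] for c in field_map.values()
            if c["lodged"] == lodged and c["yield"] is not None]
    return sum(vals) / len(vals) if vals else None

fm = build_field_map(
    {"A1": True, "A2": False, "A3": False},
    {"A1": {"yield": 4.1, "taste": 78}, "A2": {"yield": 5.6, "taste": 81},
     "A3": {"yield": 5.2, "taste": 80}})
# Yield gap between non-lodged and lodged cells, as the text suggests analyzing:
diff = mean_yield(fm, False) - mean_yield(fm, True)
```

The same merged structure could equally be built on the cloud computer system side after uploading both maps.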

Further, the lodging straw position information and the harvest information generated by the combine harvester can be uploaded to the cloud computer system, and the field agricultural operation map can be generated on the cloud computer system side. The cloud computer system here is a generic term for systems that aggregate or distribute various information services over a computer network and provide them to users, and it includes existing server/client systems, personal information exchange systems, and the like.

A method of generating the field agricultural operation map described above is also an object of the present invention. The method for generating a field agricultural operation map according to the present invention comprises the following steps: outputting recognition output data indicating a fallen straw region estimated based on a captured image obtained by an imaging unit provided in the combine harvester; generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data; outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field; generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and integrating the lodging straw position information and the harvest information on a map to generate the field agricultural operation map.

According to the present invention, the lodging straw position information and the harvest information are generated while the combine harvester performs work travel, and the field agricultural operation map is then generated simply by integrating them on a map. Further, if the lodging straw position information and the harvest information are generated using map data that serves as a common basis, there is no need for a separate step of integrating them on a map.

Further, a field agricultural operation map generation program according to the present invention causes a computer to realize: a function of outputting recognition output data representing a fallen straw region estimated based on a captured image obtained by an imaging unit provided in the combine harvester; a function of generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data; a function of outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field; a function of generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and a function of integrating the lodging straw position information and the harvest information on a map to generate a field agricultural operation map.

Further, the present invention provides a recording medium on which a field agricultural operation map generation program is recorded, the program causing a computer to realize: a function of outputting recognition output data representing a fallen straw region estimated based on a captured image obtained by an imaging unit provided in the combine harvester; a function of generating lodging straw position information indicating a position on a map of the lodging straw region from the body position at the time point when the captured image is acquired and the recognition output data; a function of outputting a crop evaluation value per unit travel obtained by evaluating crops sequentially harvested while the combine harvester performs work travel on a field; a function of generating harvest information from the body position at the time point when the crop is harvested and the crop evaluation value; and a function of integrating the lodging straw position information and the harvest information on a map to generate a field agricultural operation map.

Drawings

Fig. 1 is an overall side view of a combine harvester.

Fig. 2 is a schematic diagram illustrating a process of measuring taste and yield and generating harvest information indicating taste and yield per unit travel.

Fig. 3 is a functional block diagram showing the functions of a control system of the combine harvester.

Fig. 4 is an explanatory diagram schematically showing a flow of generation of recognition output data by the image recognition module.

Fig. 5 is a data flow chart showing a flow from image capturing to data generation of a field agricultural operation map.

Fig. 6 is a schematic diagram showing an example of a field agricultural operation map.

Detailed Description

Hereinafter, an embodiment of a combine harvester as an example of the harvester of the present invention will be described based on the drawings. In this embodiment, when the front-rear direction of the machine body 1 is defined, the definition is made along the machine body traveling direction in the working state. The direction indicated by the reference symbol (F) in fig. 1 is the front side of the machine body, and the direction indicated by the reference symbol (B) in fig. 1 is the rear side of the machine body. When the left-right direction of the body 1 is defined, the left and right are defined in a state viewed in the body traveling direction.

As shown in fig. 1, in a combine harvester, a harvesting unit 2 is coupled to a front portion of a machine body 1 including a pair of right and left crawler travel devices 10 so as to be capable of ascending and descending around a transverse axis X. The rear part of the machine body 1 is provided with a threshing device 11 and a grain tank 12 for storing grains in a state of being arranged along the transverse width direction of the machine body. A cab 14 covering a cab is provided in a front right portion of the machine body 1, and an engine 15 for driving is provided below the cab 14.

As shown in fig. 1, the threshing device 11 receives in its interior the harvested grain stalks that are harvested by the harvesting unit 2 and conveyed rearward, and threshes the ear-tip side in the threshing cylinder 113 while holding the roots of the conveyed stalks between the threshing feed chain 111 and the clamp rail 112. Grain sorting of the threshed material is then performed by a sorting unit provided below the threshing cylinder 113. The grains sorted in the sorting unit are transferred to the grain tank 12 and stored there. Although not described in detail, a grain discharging device 13 is provided to discharge the grains stored in the grain tank 12 to the outside.

The harvesting section 2 includes a plurality of raising devices 21, a pusher-type cutting device 22, a straw conveying device 23, and the like. The standing device 21 stands the lodging standing grain stalks. The cutting device 22 cuts the roots of the standing grain stalks. The grain straw conveying device 23 gradually changes the vertical posture of the cut grain straw into a horizontal posture, and conveys the grain straw to the starting end of the threshing feed chain 111 of the threshing device 11 located on the rear side of the machine body.

The straw conveying device 23 includes a converging conveyor 231, a root gripping conveyor 232, a spike locking conveyor 233, a supply conveyor 234, and the like. The converging conveyor 231 conveys the plurality of harvested straws harvested by the cutting device 22 while collecting them toward the center in the harvesting width direction. The root holding and conveying device 232 holds the collected roots of the harvested straws and conveys the roots to the rear. The ear tip retaining and conveying device 233 retains and conveys the ear tip side of the harvested grain stalks. The feeding conveyor 234 guides the root of the harvested straw to the threshing feed chain 111 from the terminal end of the root holding conveyor 232.

An imaging unit 70 having a color camera is provided at the front end of the ceiling portion of the cab 14. The imaging field of view of the imaging unit 70 extends in the front-rear direction from the front end region of the harvesting unit 2 approximately to the horizon, and in the width direction from about 10 meters to several tens of meters. The captured image obtained by the imaging unit 70 is converted into image data and sent to the control system of the combine harvester.

The imaging unit 70 images the field during the harvesting operation. The control system of the combine harvester has a function of recognizing fallen straw as a recognition object based on the image data transmitted from the imaging unit 70. In fig. 1, the normally standing straw group is denoted by the symbol Z0, and the lodging straw group is denoted by the symbol Z2.

A satellite positioning module 80 is also provided on the ceiling portion of the cab 14. The satellite positioning module 80 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To supplement the satellite navigation of the satellite positioning module 80, an inertial navigation unit incorporating a gyro acceleration sensor or a magnetic orientation sensor is incorporated into the satellite positioning module 80. Of course, the inertial navigation unit can also be disposed at a different location. In fig. 1, for convenience of illustration, the satellite positioning module 80 is drawn at the rear of the ceiling portion of the cab 14. However, the satellite positioning module 80 is preferably disposed at the front end portion of the ceiling, closer to the center of the machine body, so as to be as close as possible to a position directly above the center portion of the cutting device 22.

The combine harvester has a function of calculating and outputting a grain yield and a grain taste as a crop evaluation value per unit travel obtained by evaluating crops harvested in sequence. Specifically, as shown in fig. 2, the amount of grain (i.e., yield) and the taste (moisture, protein, etc.) of the grain supplied from the threshing device 11 to the grain tank 12 over time are measured, and based on the measurement results, the evaluation module 4A calculates and outputs the yield and taste as the evaluation value of the crop.

In this embodiment, a yield measuring unit 120 for measuring yield and a taste measuring unit 125 for measuring taste (here, the measurement of moisture and protein components) are provided in the grain tank 12. The yield measuring unit 120 is provided in the end region of a supply pipe 130 that connects the threshing device 11 and the grain tank 12. The portion of the supply pipe 130 inside the grain tank is equipped with a screw conveyor 131 rotating around the axis PX. The end of the casing 132 of the screw conveyor 131 serves as the casing of the yield measuring unit 120, and an opening that functions as the discharge port 122 of the yield measuring unit 120 is formed in it. The yield measuring unit 120 includes a discharging rotating body 121 that rotates around the axis PX to discharge the grains conveyed by the screw conveyor 131, and a load cell structure 123 that detects the load generated when the grains are discharged. The load applied to the load cell structure 123 by the grains discharged from the discharge port 122 by the discharging rotating body 121 is correlated with the discharge amount (i.e., the yield) of grain per rotation of the discharging rotating body 121.

In the yield estimation processing performed in the evaluation module 4A, the yield per unit time is estimated from the rotation speed signal of the discharging rotating body 121 and the load cell signal of the load cell structure 123. Further, based on the yield per unit time and the travel speed, the yield per unit travel is estimated and output as a crop evaluation value.
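The two-stage estimate above (per-unit-time yield from rotor speed and load cell, then per-unit-travel yield from travel speed) can be sketched as follows. This is an illustrative sketch, not the patented implementation; the per-revolution mass, names, and units are assumptions:

```python
def yield_per_unit_travel(discharge_per_rev_kg, rotor_rpm, travel_speed_m_s):
    """Estimate yield per metre travelled.
    discharge_per_rev_kg: grain mass per rotor revolution, inferred from
    the load cell signal (calibration assumed).
    rotor_rpm: rotation speed of the discharging rotating body.
    travel_speed_m_s: machine travel speed."""
    yield_per_s = discharge_per_rev_kg * rotor_rpm / 60.0   # kg per second
    return yield_per_s / travel_speed_m_s                   # kg per metre

# 0.05 kg per revolution at 120 rpm while travelling 1 m/s:
q = yield_per_unit_travel(0.05, 120.0, 1.0)
```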

The taste measuring unit 125 obtains measurement values related to moisture and protein components by irradiating the grain with a light beam and spectrally analyzing the returned light. In this embodiment, the taste measuring unit 125 includes a cylindrical container 129 having a receiving port 126 for receiving at least a part of the grain discharged by the yield measuring unit 120, and a discharge port 127 for discharging the received grain. Further, a shutter 128 is provided in the cylindrical container 129. By opening and closing, the shutter 128 can either temporarily store the grains received through the receiving port 126 or pass them through to the discharge port 127.

In the taste estimation process executed by the evaluation module 4A, when the shutter 128 is set to the storage (closed) posture and a predetermined amount of grain has accumulated in the cylindrical container 129, taste measurement by the spectral measurement method is started, and a taste value is estimated from the measurement values and output as a crop evaluation value. When the taste measurement is completed, the shutter 128 is set to the discharge (open) posture, and the stored grains are discharged. The shutter 128 is then returned to the storage posture, the next taste estimation is performed, and the estimated taste values are output in sequence as crop evaluation values.
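The store-measure-discharge cycle of the shutter 128 can be sketched as a small state machine (an illustrative sketch, not the disclosed control logic; the sample threshold and the stubbed spectral measurement are assumptions):

```python
class TasteSampler:
    """Accumulate grain behind the closed shutter until a sample threshold
    is reached, take one measurement, then discharge and start over."""
    def __init__(self, sample_g):
        self.sample_g = sample_g   # grain mass needed for one measurement
        self.stored_g = 0.0
        self.readings = []

    def receive(self, grams, measure):
        self.stored_g += grams
        if self.stored_g >= self.sample_g:   # enough grain accumulated
            self.readings.append(measure())  # spectral measurement stub
            self.stored_g = 0.0              # open shutter, discharge

s = TasteSampler(sample_g=50.0)
for _ in range(6):   # grain arrives in 20 g increments
    s.receive(20.0, measure=lambda: {"moisture": 14.2, "protein": 7.8})
```

Each appended reading corresponds to one taste value output as a crop evaluation value.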

The yield per unit travel estimated in the yield estimation process is associated, in the harvest information generation unit 4B, with the travel locus of the machine body 1 obtained from the body position calculated by the body position calculation unit 66. The yield is thereby recorded in succession as the combine harvester carries out the harvesting operation.

The taste value estimated in the taste estimation process is likewise associated with the travel locus of the machine body 1 obtained by the harvest information generation unit 4B from the body position calculated by the body position calculation unit 66. The taste values are thus recorded in succession as the combine harvester carries out the harvesting operation.

As a result, the yield and the taste value per unit travel are associated, as harvest information, with each unit travel distance in the field (indicated by a subscripted P in fig. 2). The harvesting position is calculated from the body position, which is in turn calculated based on the positioning data from the satellite positioning module 80, and is therefore a position on the map that can be represented by an absolute position expressed in latitude and longitude or by a coordinate position in field coordinates. Therefore, a yield map and a taste map representing the distribution of yield and taste per unit travel distance (and consequently over each micro-segment of the field) can be generated based on the harvest information.
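Binning the sequentially recorded values into unit travel distances, as described above, can be sketched as follows (illustrative only; segment length, sample layout, and averaging are assumptions):

```python
def segment_records(samples, segment_m):
    """Group (distance_travelled, value) samples into unit-travel segments
    and average the values within each segment."""
    segments = {}
    for dist, value in samples:
        idx = int(dist // segment_m)               # which segment P_idx
        segments.setdefault(idx, []).append(value)
    return {idx: sum(v) / len(v) for idx, v in segments.items()}

# Yield samples along the travel locus, binned into 2 m segments:
recs = segment_records([(0.5, 4.0), (1.5, 6.0), (2.5, 5.0), (3.5, 7.0)], 2.0)
```

Running the same grouping over taste samples would give the per-segment data for a taste map.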

In order to obtain the harvest position from the body position estimated based on the positioning data from the satellite positioning module 80, the distance between the antenna of the satellite positioning module 80 and the harvesting unit 2, and the time delay from the harvesting of the grain stalks until the grain yield and taste are measured, are set in advance.
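Applying the preset antenna offset and measurement delay can be sketched as follows (an illustrative sketch, not the disclosed method; the exact-time track lookup and all names are assumptions):

```python
import math

def harvest_position(track, t_measured, delay_s, antenna_to_header_m):
    """Look up the body position at (measurement time - delay) on the
    recorded track and shift it forward by the antenna-to-header distance.
    `track` maps time (s) to (x, y, heading_rad); exact-hit lookup is
    assumed for simplicity (a real system would interpolate)."""
    x, y, heading = track[t_measured - delay_s]
    return (x + antenna_to_header_m * math.cos(heading),
            y + antenna_to_header_m * math.sin(heading))

# Grain measured at t=10 s was cut ~5 s earlier; the header is 3 m ahead
# of the antenna:
track = {0.0: (0.0, 0.0, 0.0), 5.0: (5.0, 0.0, 0.0), 10.0: (10.0, 0.0, 0.0)}
pos = harvest_position(track, t_measured=10.0, delay_s=5.0, antenna_to_header_m=3.0)
```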

Fig. 3 shows a functional block diagram of the control system built into the body 1 of the combine harvester. The control system of this embodiment is composed of a number of electronic control units called ECUs, various operating devices, sensor groups, switch groups, and a wiring network such as an in-vehicle LAN for transmitting data between them. The notification device 91 notifies the driver or the like of the work traveling state and various warnings, and is a buzzer, a lamp, a speaker, a display screen, or the like. The communication unit 92 is used for data exchange between the control system of the combine harvester and the cloud computer system 100 provided at a remote location or the mobile communication terminal 200. Here, the mobile communication terminal 200 is a tablet computer operated by a monitor (including the driver) at the work travel site. The control unit 6 is the core element of the control system and is represented as an aggregate of a plurality of ECUs. The positioning data from the satellite positioning module 80 and the image data from the imaging unit 70 are input to the control unit 6 through the wiring network.

The control unit 6 includes an output processing unit 6B and an input processing unit 6A as input/output interfaces. The output processing unit 6B is connected to the vehicle travel device group 7A and the work equipment device group 7B. The vehicle travel device group 7A includes control devices (e.g., an engine control device, a gear shift control device, a brake control device, a steering control device, etc.) related to vehicle travel. The working device equipment group 7B includes: a cutting part 2, a threshing device 11, a grain discharge device 13, a power control device of a grain stalk conveying device 23, and the like.

The input processing unit 6A is connected to a travel system detection sensor group 8A, a work system detection sensor group 8B, and the like. The travel system detection sensor group 8A includes sensors that detect the states of an engine speed adjusting tool, an accelerator pedal, a brake pedal, a shift operating tool, and the like. The work system detection sensor group 8B includes sensors that detect the device states of the harvesting unit 2, the threshing device 11, the grain discharge device 13, and the grain stalk conveying device 23, as well as the states of the grain stalks and the grain.

The control unit 6 includes the work travel control module 60, the image recognition module 5, the data processing module 50, the body position calculation unit 66, the notification unit 67, and the evaluation module 4A and harvest information generation unit 4B described with reference to fig. 2.

The notification unit 67 generates notification data based on instructions from the functional units of the control unit 6 and the like, and supplies the notification data to the notification device 91. The body position calculation unit 66 calculates the body position, i.e., the map coordinates (or field coordinates) of the body 1, based on the positioning data sequentially transmitted from the satellite positioning module 80.

The combine harvester according to the embodiment can travel by both automatic travel (automatic steering) and manual travel (manual steering). The work travel control module 60 includes not only the travel control unit 61 and the work control unit 62 but also an automatic work travel command unit 63 and a travel route setting unit 64. A travel mode switch (not shown) for selecting either an automatic travel mode for traveling with automatic steering or a manual steering mode for traveling with manual steering is provided in the cab 14. By operating the travel mode switch, it is possible to switch from the manual steering travel to the automatic steering travel or from the automatic steering travel to the manual steering travel.

The travel control unit 61 has an engine control function, a steering control function, a vehicle speed control function, and the like, and supplies travel control signals to the vehicle travel device group 7A. The work control unit 62 supplies work control signals to the work equipment device group 7B in order to control the operations of the harvesting unit 2, the threshing device 11, the grain discharge device 13, the grain stalk conveying device 23, and the like.

When the manual steering mode is selected, the travel control unit 61 generates a control signal based on the operation of the driver, and controls the vehicle travel device group 7A. When the automatic steering mode is selected, the travel control unit 61 controls the vehicle travel device group 7A related to steering and the vehicle travel device group 7A related to vehicle speed based on the automatic travel command provided from the automatic work travel command unit 63.

The travel route setting unit 64 loads into memory a travel route for automatic travel created by any one of the control unit 6, the mobile communication terminal 200, the cloud computer system 100, and the like. The travel route loaded into memory is used sequentially as the target travel route during automatic travel. Even during manual travel, this travel route can be used for guidance to make the combine harvester travel along it.

More specifically, the automatic work travel command unit 63 generates an automatic steering command and a vehicle speed command, and supplies them to the travel control unit 61. The automatic steering command is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 64 and the body position calculated by the body position calculation unit 66. The vehicle speed command is generated based on a preset vehicle speed value. The automatic work travel command unit 63 also provides the work control unit 62 with work device operation commands according to the body position and the traveling state of the body.
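A steering command that drives both deviations toward zero can be sketched as a simple proportional law. The gains, the clamp limit, and the sign convention are assumptions for illustration; the embodiment does not specify the control law.

```python
def steering_command(lateral_err_m, heading_err_rad,
                     k_lat=0.4, k_head=1.2, max_steer_rad=0.6):
    """Steering angle command that reduces both the positional deviation
    (lateral offset from the target travel route, in metres) and the azimuth
    deviation (heading error, in radians). Gains are illustrative only."""
    cmd = -(k_lat * lateral_err_m + k_head * heading_err_rad)
    # clamp to the mechanical steering limit
    return max(-max_steer_rad, min(max_steer_rad, cmd))
```

When both deviations are zero the command is zero, and a positive (e.g., leftward) offset produces a corrective steer in the opposite direction, saturating at the mechanical limit.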

The image data of the captured images sequentially obtained by the imaging unit 70 is input to the image recognition module 5. The image recognition module 5 estimates an existence region in the captured image where a recognition target object exists, and outputs, as a recognition result, recognition output data including the existence region and the estimated probability with which the existence region was estimated. The image recognition module 5 is constructed using a neural network technique employing deep learning.

The flow of the generation of the recognition output data by the image recognition module 5 is shown in figs. 4 and 5. Pixel values of RGB image data are input to the image recognition module 5 as input values. In this embodiment, the recognition target object to be estimated is a region where lodged grain stalks exist (hereinafter referred to as a lodging stalk region). Therefore, the recognition output data serving as the recognition result includes the lodging stalk region represented by a rectangle and the estimated probability with which the lodging stalk region was estimated.

In fig. 4, the estimation result is shown schematically, and the lodging stalk region is represented by a rectangular frame marked with the symbol F2. Each lodging stalk region is defined by 4 corner points, and the coordinate positions of the 4 corner points of each rectangle on the captured image are also included in the estimation result. Of course, if no lodged grain stalks, i.e., no recognition target object, are estimated, no lodging stalk region is output and the estimated probability is zero.
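The recognition output data described above can be represented by a small data structure like the following. The class and field names are assumptions for this sketch; only the content (4 rectangle corners plus an estimated probability, with zero probability when nothing is detected) comes from the text.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Corner = Tuple[float, float]  # (u, v) coordinate on the captured image


@dataclass
class RecognitionOutput:
    """One estimation result from the image recognition module 5: the 4
    corner points of the rectangular lodging stalk region in image
    coordinates, and the estimated probability of that region."""
    corners: Optional[Tuple[Corner, Corner, Corner, Corner]]  # None if no region
    probability: float  # estimated probability; 0.0 when nothing is estimated

    def detected(self) -> bool:
        """True when a lodging stalk region was actually output."""
        return self.corners is not None and self.probability > 0.0
```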

In this embodiment, the image recognition module 5 sets its internal parameters such that the farther the recognition target object (lodged grain stalks) is located from the imaging unit 70 in the captured image, the lower its estimated probability becomes. This makes recognition of a recognition target object stricter in imaging regions whose resolution is lowered by distance from the imaging unit 70, thereby reducing erroneous recognition.

The data processing module 50 processes the recognition output data output from the image recognition module 5. As shown in fig. 3 and 5, the data processing module 50 of this embodiment includes a lodging stalk position information generating unit 51 and a statistical processing unit 52.

The lodging stalk position information generating unit 51 generates lodging stalk position information indicating the position of the recognition target object on the map, from the body position at the time point when the captured image was acquired and from the recognition output data. The position on the map of the lodged grain stalks included in the recognition output data is obtained by converting the coordinate positions on the captured image (camera coordinate positions) of the 4 corner points of the rectangle representing the lodged grain stalks into coordinates on the map.
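The final step of that conversion, from coordinates relative to the machine body to map coordinates, can be sketched as a planar rigid transform. The camera-to-ground projection of the 4 corner points (a flat-field assumption) is assumed to have been done beforehand; the function names and the forward/left axis convention are illustrative.

```python
import math


def corner_to_map(u_m, v_m, body_x, body_y, heading_rad):
    """Converts one rectangle corner, already projected onto the ground as
    (forward u, left v) metres relative to the body, into map coordinates
    using the body position at the moment the image was captured."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    # rotate by the body heading, then translate by the body position
    return (body_x + c * u_m - s * v_m, body_y + s * u_m + c * v_m)


def rectangle_to_map(corners_uv, body_x, body_y, heading_rad):
    """Applies the same transform to all corner points of the rectangle."""
    return [corner_to_map(u, v, body_x, body_y, heading_rad)
            for u, v in corners_uv]
```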

The imaging unit 70 acquires captured images at predetermined time intervals (for example, at 0.5 second intervals) and inputs the image data to the image recognition module 5, so the image recognition module 5 also outputs recognition output data at the same time intervals. Therefore, when lodged grain stalks enter the imaging field of view of the imaging unit 70, a plurality of pieces of recognition output data usually include existence regions for the same lodged grain stalks. As a result, a plurality of pieces of lodging stalk position information are obtained for the same lodged grain stalks. At this time, the estimated probabilities included in the individual pieces of recognition output data serving as the raw data (i.e., the estimated probabilities of the existence regions of the lodged grain stalks included in the lodging stalk position information) often differ in value depending on the positional relationship between the imaging unit 70 and the lodged grain stalks.

Therefore, in this embodiment, such a plurality of pieces of lodging stalk position information are stored, and the estimated probabilities included in the stored pieces of lodging stalk position information are processed statistically. A representative value of the group of estimated probabilities is obtained by a statistical operation on the estimated probabilities of the plurality of pieces of recognition target position information. Using this representative value, the plurality of pieces of recognition target position information can be corrected into one optimal piece of recognition target position information. One example of such correction is to take the arithmetic mean, a weighted mean, or the median of the estimated probabilities as a reference value (representative value), take the logical sum of the existence regions whose estimated probabilities are equal to or higher than the reference value, and generate corrected lodging stalk position information having this logical sum as the optimal existence region. Of course, one piece of highly reliable lodging stalk position information may also be generated using a statistical calculation other than the above.
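The arithmetic-mean variant of this correction can be sketched as follows. Representing each existence region as a set of map grid cells is an assumption made so that the logical sum becomes a set union; the function name is also illustrative.

```python
def merge_lodging_records(records):
    """records: list of (estimated_probability, region) pairs for the same
    lodged grain stalks, where each region is a set of map grid cells.
    Uses the arithmetic mean of the probabilities as the reference value
    and takes the logical sum (union) of every region at or above it."""
    ref = sum(p for p, _ in records) / len(records)   # reference value
    merged = set()
    for p, region in records:
        if p >= ref:
            merged |= region                           # logical sum
    return ref, merged
```

Low-probability observations (e.g., ones taken while the stalks were still far from the imaging unit) thus fall below the reference value and do not dilute the corrected existence region.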

By using the lodging stalk position information thus obtained, which indicates the position of the lodging stalk region on the map, preset work travel control and warning notification are performed when lodged grain stalks are recognized.

As described above, the evaluation module 4A estimates the taste value of the grain (a crop evaluation value) by the taste estimation process, and estimates the yield of the grain (a crop evaluation value) by the yield estimation process. The taste values and yields sequentially output from the evaluation module 4A as the work travel proceeds are supplied to the harvest information generation unit 4B. The harvest information generation unit 4B generates harvest information by recording the sequentially supplied taste values and yields in association with the travel locus of the body 1.

In this embodiment, the harvest information generated by the harvest information generation unit 4B and the lodging stalk position information generated by the lodging stalk position information generating unit 51 are uploaded to the cloud computer system 100 through the communication unit 92. The cloud computer system 100 is provided with a field agricultural operation map generation unit 101 that generates a field agricultural operation map by integrating the lodging stalk position information and the harvest information on a map.

Fig. 6 schematically shows an example of a field agricultural operation map. The field agricultural operation map includes a lodging stalk map in which the existence regions of lodged grain stalks (indicated by oblique lines) are assigned to micro compartments set in the field, a yield map in which yields (indicated by q11, ...) are assigned to the same micro compartments, and a taste map in which taste values (indicated by w11, ...) are assigned to the same micro compartments. The field agricultural operation map also includes a fertilization plan map in which the next fertilizer amount (indicated by f11, ...) for each micro compartment is recorded. In the example of fig. 6, micro compartments of the same size are used in all maps, but micro compartments of different sizes may also be used.
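Aggregating the uploaded records into per-compartment map layers can be sketched as follows. The 5 m compartment size, the record format, and the per-layer aggregation rules (summed yield, averaged taste) are assumptions for illustration; fig. 6 specifies only that each layer is keyed by the same micro compartments.

```python
from collections import defaultdict

CELL_SIZE_M = 5.0  # assumed micro-compartment edge length


def micro_cell(x, y, cell=CELL_SIZE_M):
    """Map coordinate -> micro-compartment index."""
    return (int(x // cell), int(y // cell))


def build_field_map(harvest_records, lodging_cells):
    """harvest_records: (x, y, yield_q, taste_w) tuples from the harvest
    information; lodging_cells: micro-compartment indices covered by the
    lodging stalk position information. Returns one dict per map layer."""
    yields = defaultdict(float)
    tastes = defaultdict(list)
    for x, y, q, w in harvest_records:
        cell = micro_cell(x, y)
        yields[cell] += q              # yield map: total per compartment
        tastes[cell].append(w)
    taste_map = {c: sum(ws) / len(ws) for c, ws in tastes.items()}  # mean taste
    lodging_map = {c: (c in lodging_cells) for c in yields}         # lodging flag
    return dict(yields), taste_map, lodging_map
```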

When estimating the type and amount of fertilizer to be applied to each micro compartment, as required for generating the fertilization plan map, the yield and taste values both in the regions where lodged grain stalks are present and in the regions where they are absent are referred to. Since lodging of grain stalks is caused mainly by an excess of fertilizer, the fertilizer amount is set so as to suppress the growth of the grain stalks to some extent.
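One simple heuristic in this spirit might look like the following. The adjustment factors and the yield-ratio rule are entirely assumed for illustration; the text only states that lodged regions, being a sign of excess fertilizer, should receive an amount that suppresses growth somewhat.

```python
def next_fertilizer_amount(base_kg, lodged, yield_ratio):
    """Heuristic sketch for the next fertilizer amount in one micro
    compartment. lodged: whether lodged stalks were recognized there;
    yield_ratio: compartment yield relative to the field average.
    Factors are illustrative, not values from the embodiment."""
    if lodged:
        return base_kg * 0.7            # excess fertilizer: cut the next dose
    if yield_ratio < 0.9:               # low yield without lodging: modest increase
        return base_kg * 1.1
    return base_kg                      # otherwise keep the current amount
```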

The estimation of the fertilizer type and the fertilizer amount may be performed automatically by computer software, or by an agricultural operator who examines the field agricultural operation map. Alternatively, a semi-automatic method may be adopted in which the agricultural operator corrects the fertilizer amount estimated by the computer software.

Note that the configurations disclosed in the above-described embodiment (including the other embodiments below; the same applies hereinafter) can be combined with the configurations disclosed in the other embodiments as long as no contradiction arises. The embodiments disclosed in this specification are examples; embodiments of the present invention are not limited to them and can be changed as appropriate within a range not departing from the object of the present invention.

[Other embodiments]

(1) In the above-described embodiment, lodged grain stalks are set as the recognition target object recognized by the image recognition module 5, but other recognition target objects (for example, a group of weeds growing taller than the planted grain stalks, or obstacles such as people) may additionally be set. In that case, the work travel control module 60 is configured to perform the necessary control in response to the recognition of a weed group or an obstacle.

(2) In the above-described embodiment, the image recognition module 5 is constructed using a neural network technique of a deep learning type. Alternatively, an image recognition module 5 constructed using other machine learning techniques may be employed.

(3) In the above-described embodiment, the image recognition module 5, the data processing module 50, the evaluation module 4A, and the harvest information generation unit 4B are integrated in the control unit 6 of the combine harvester, but some or all of them may be configured in a control unit (for example, the mobile communication terminal 200 or the like) independent of the combine harvester.

(4) The functional sections shown in fig. 3 are distinguished primarily for illustrative purposes. In fact, each functional unit may be integrated with another functional unit, or may be further divided into a plurality of functional units.

Industrial applicability

The harvester of the present invention, having the functions of imaging a field and calculating the position of the machine body, can be applied not only to combine harvesters for harvesting rice, wheat, and the like, but also to combine harvesters for harvesting other crops such as corn, and to harvesters for harvesting carrots and the like.

Description of the symbols

1: machine body

2: cutting part

4A: evaluation module

4B: harvesting information generating part

5: image recognition module

50: data processing module

51: lodging straw position information generating part

52: statistical processing unit

57: body position calculating section

6: control unit

6A: input processing unit

6B: output processing unit

60: operation driving control module

61: running control unit

62: work control unit

63: automatic work travel command unit

64: travel route setting unit

66: body position calculating section

70: image pickup unit

80: satellite positioning module

91: notification device

120: yield measuring container

125: food flavor measuring container

100: cloud computer system

101: agricultural operation map generation part for field

200: mobile communication terminal
