Forward looking sensing and mechanical control during crop harvesting operations

Document No. 91201 · Publication date: 2021-10-12

Reading note: This technique, "Forward looking sensing and mechanical control during crop harvesting operations", was designed and created by Ryan R. White and Ashraf Kadir on 2021-03-16. Its main content comprises: a computer-implemented method, comprising: acquiring harvest data indicative of prior harvest operations on a field; identifying, based on the harvest data, regions of the field in which crop plants have been harvested; detecting a characteristic of crop plants in the field on a path of an agricultural harvester along a direction of travel; generating, based on the detected characteristic, a height metric representing a height of the crop plants over a particular region of the field; and generating a control signal based on the height metric and the identified regions to control the agricultural harvester.

1. A computer-implemented method (500), comprising:

acquiring (514) harvest data indicative of prior harvest operations on a field;

identifying (530) regions in the field where crop plants have been harvested based on the harvest data;

detecting (534) a characteristic of a crop plant in a field on a path of the agricultural harvester (100) along a direction of travel;

generating (546) a height metric based on the detected characteristic, the height metric representing a height of the crop plants over a particular region of the field; and

generating a control signal (604) based on the height metric and the identified regions to control the agricultural harvester.

2. The computer-implemented method of claim 1, further comprising:

identifying the crop plants on the particular region of the field as fallen crop based on the harvest data and a determination that the height metric indicative of crop height is below a threshold; and

generating the control signal to control the agricultural harvester based on the identification of fallen crop on the particular region of the field.

3. The computer-implemented method of claim 1, wherein generating a control signal comprises controlling a controllable subsystem of the agricultural harvester.

4. The computer-implemented method of claim 3, further comprising generating a recommendation to alter operation of the agricultural harvester.

5. The computer-implemented method of claim 4, wherein the recommendation is generated based on a selected crop throughput.

6. The computer-implemented method of claim 4, wherein the recommendation comprises at least one of:

a change in ground speed of the agricultural harvester; or

a change in harvesting function on the agricultural harvester.

7. The computer-implemented method of claim 4, wherein generating a control signal comprises: controlling a display device to provide an indication of the recommendation.

8. The computer-implemented method of claim 1, wherein the height metric comprises an average height of the crop plants over the particular region.

9. The computer-implemented method of claim 1, further comprising:

generating a representation of the amount of crop plant material on the path of the agricultural harvester; and

generating the control signal based on the amount of crop plant material.

10. The computer-implemented method of claim 9, wherein the representation of the amount of crop plant material comprises an indication of biomass.

11. The computer-implemented method of claim 1, wherein detecting the characteristic of the crop plant comprises:

receiving image data of crop plants in the particular region;

applying a geometric classifier to the image data; and

determining the height metric based on the geometric classifier.

12. The computer-implemented method of claim 1, further comprising:

identifying a plurality of regions of the field; and

for each given region of the field:

generating a corresponding height metric representing a height of crop plants above a field surface over the given region;

comparing the corresponding height metric to a threshold;

classifying the given region based on the comparison; and

generating a control signal corresponding to the given region based on the classification.

13. The computer-implemented method of claim 12, wherein classifying the given region comprises:

classifying the given region as standing crop if the corresponding height metric value is above the threshold; and

classifying the given region as non-standing crop if the corresponding height metric value is below the threshold, including:

classifying the given region as crop stubble based on a determination that the given region was harvested during the prior harvest operation; and

classifying the given region as fallen crop based on a determination that the given region was not harvested during the prior harvest operation.

14. An agricultural harvester (100), comprising:

a controllable subsystem (222);

a forward looking crop sensor (121), said forward looking crop sensor (121) configured to generate a sensor signal indicative of a detected characteristic of crop plants in a field on a path of an agricultural harvester along a direction of travel;

a crop state determination logic system (288), the crop state determination logic system (288) configured to:

obtaining harvest data indicative of prior harvest operations on the field;

identifying, based on the harvest data, regions in the field where crop plants have been harvested;

generating a height metric based on the detected characteristic, the height metric representing a height of crop plants on a particular region of the field; and

a control system (202), the control system (202) configured to control the controllable subsystem based on the height metric and the identified regions.

15. A control system (202) for an agricultural harvester (100), the control system comprising:

a harvest data processing logic system (290), the harvest data processing logic system (290) configured to receive harvest data indicative of prior harvest operations on a field, and the harvest data processing logic system (290) identifying a region of the field where crop plants have been harvested based on the harvest data;

a crop state classification logic system (288), the crop state classification logic system (288) configured to:

detecting a characteristic of a crop plant in a field on a path of an agricultural harvester along a direction of travel; and

generating a height metric based on the detected characteristic, the height metric representing a height of crop plants on a particular region of the field; and

a control logic system (234, 236, 238, 240), the control logic system (234, 236, 238, 240) configured to generate a control signal to control the agricultural harvester based on the height metric and the identified region.

Technical Field

The present description relates generally to agricultural machinery. More particularly, but not by way of limitation, the present description relates to controlling an agricultural harvester using a geometric image classifier and a harvest map.

Background

An agricultural harvester, such as a combine or a windrower, travels through a field of crops to harvest the crops. In one common arrangement, an agricultural harvesting header extends forward from the agricultural harvester to engage the grain-laden stalks, sever the grain-laden stalks, and then transport the severed crops into the main body of the agricultural harvester itself for processing.

In agricultural harvesters, throughput (the rate at which the crop moves through the machine) depends on the forward ground speed of the harvester and the density of the crop being harvested. Machine settings may be set based on an assumed throughput, and the machine speed may then be changed, as the operator observes differences in crop density, to maintain the desired throughput.

Some current systems automatically adjust the forward ground speed of the harvester in an attempt to maintain the desired crop throughput. For example, some systems have attempted to use a priori data (e.g., aerial images of a field) to generate a predicted yield map that predicts yield at different geographic locations in a field being harvested. This can be done by attempting to identify crop density based on an image classifier that classifies images of the field as the harvester passes.

The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.

Disclosure of Invention

A computer-implemented method, comprising: acquiring harvest data indicative of prior harvest operations on a field; identifying, based on the harvest data, regions in the field where crop plants have been harvested; detecting a characteristic of a crop plant in a field on a path of an agricultural harvester along a direction of travel; generating a height metric based on the detected characteristic, the height metric representing a height of crop plants on a particular region of the field; and generating a control signal based on the height metric and the identified regions to control the agricultural harvester.

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.

Drawings

Fig. 1 is a partial perspective schematic view of one example of an agricultural harvester.

Fig. 2 shows an example of an agricultural architecture including the agricultural harvester shown in fig. 1.

Fig. 3 is a block diagram illustrating one example of a biomass system.

FIG. 4 is a block diagram illustrating one example of a crop status determination logic system.

Fig. 5A and 5B (collectively fig. 5) are flowcharts illustrating exemplary operations for determining a crop state and generating control signals based on a predicted amount of crop material.

FIG. 6 is a flow chart illustrating exemplary operations for determining a crop status.

Fig. 7-1 shows an exemplary field harvested by a harvester.

Fig. 7-2 shows an exemplary field harvested by a harvester.

FIG. 8 is a block diagram illustrating one example of the architecture shown in FIG. 3 deployed in a remote server architecture.

Fig. 9-11 illustrate examples of mobile devices that may be used in the architecture shown in the previous figures.

FIG. 12 is a block diagram illustrating one example of a computing environment that may be used in the architecture shown in the previous figures.

Detailed Description

As noted above, some current harvester systems attempt to use a priori data (e.g., aerial images) in order to generate a prediction map that can be used to control the harvester. For example, much work has been done in attempting to generate a predicted yield map of a field based on vegetation index values generated from aerial images. Such predictive production maps attempt to predict production at different locations within a field. The system attempts to control the combine (or other harvester) based on the predicted yield.

Further, some systems attempt to use a forward looking sensing system, which may involve acquiring an optical image of the field in front of the harvester in the direction of travel. Based on these images, the yield of the area directly in front of the harvester can be predicted. This is another source of a priori data that can be used to generate a predicted yield map.

Difficulties can exist with all of these types of systems.

For example, in a system that utilizes image processing to predict the yield of a field through which a harvester passes, it may be difficult to distinguish between standing crops and areas having crop stubble (i.e., areas that have already been harvested). Some systems attempt to address this problem using image classifiers trained to distinguish such regions. However, such image classifiers typically must be extensively trained, which requires extensive training data and, in turn, significant processing bandwidth and time. Even so, such classifiers may be inaccurate.

In contrast, the present description relates to controlling an agricultural harvester using a geometric image classifier and a harvest map. In the described examples, the system utilizes a geometric classifier to distinguish field areas with standing crops from field areas with non-standing crops (fallen crops or crop stubble). This information is fused with a harvest map to determine whether an area with non-standing crop is an area that has already been harvested (and is thus an area of crop stubble) or an area of fallen crop. The geometric classifier is configured to measure the height of the crop relative to the ground, which can be used to predict the yield (or mass flow) on the path of the harvester to control various subsystems on the harvester (e.g., to achieve a desired yield).
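The fusion just described can be illustrated with a short sketch; the function name, labels, and threshold value below are hypothetical, not part of the patent disclosure:

```python
def classify_region(height_metric_m, previously_harvested, threshold_m=0.5):
    """Fuse the geometric height metric with the harvest map.

    A region with a height metric at or above the threshold is standing
    crop. A short region is crop stubble if the harvest map shows it was
    already harvested, and fallen (down) crop otherwise.
    """
    if height_metric_m >= threshold_m:
        return "standing crop"
    return "crop stubble" if previously_harvested else "fallen crop"


# Example: short crop in a region the harvest map shows as unharvested.
state = classify_region(0.2, previously_harvested=False)  # → "fallen crop"
```

The key design point is that height alone cannot distinguish stubble from fallen crop; only the prior-harvest record resolves that ambiguity.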

Fig. 1 is a partial perspective schematic view of a combine harvester 100 (also referred to as a "harvester" or "combine"). As can be seen in fig. 1, combine harvester 100 illustratively includes an operator cab 101, which may have a variety of different operator interface mechanisms for controlling combine harvester 100, as discussed in more detail below. In one example, combine 100 is completely autonomous and may not have an operator cab. The combine may include a set of front end equipment, which may include a header 102 and cutters, generally indicated at 104. The combine may also include a feed mechanism 106, a feed accelerator 108, and a thresher, generally indicated at 110. The thresher 110 illustratively includes a threshing rotor 112 and a set of concave plates 114. Furthermore, the combine harvester 100 may comprise a separator 116, which comprises a separator rotor. The combine harvester 100 may include a cleaning subsystem (or cleaning chamber) 118, which may itself include a cleaning fan 120, a chaffer screen 122, and a screen 124. The material handling subsystems in combine 100 may include (in addition to feed mechanism 106 and feed accelerator 108): a discharge beater 126, a tailings elevator 128, a clean grain elevator 130 (which moves the clean grain into a clean grain bin 132), a discharge auger 134, and a nozzle 136. The combine 100 may also include a residue subsystem 138, which may include a shredder 140 and a spreader 142. Combine 100 may also have a propulsion subsystem that includes an engine that drives ground engaging wheels 144, tracks, or the like. It should be noted that the combine 100 may also have more than one of any of the above subsystems (e.g., left and right cleaning chambers, separators, etc.).

In operation, and as an overview, combine 100 illustratively moves through the field in the direction indicated by arrow 146. A forward looking sensor 121 is mounted at the front of combine harvester 100 and senses characteristics of the crop in front of combine harvester 100. In one example, sensor 121 is an image capture sensor that captures images of an area in front of header 102 (e.g., a video feed, a series of still images, etc.). The video feed or one or more images may be used to show a view (e.g., on a display device in operator cab 101) in front of operator cab 101, e.g., showing header 102 and/or the crop in front of header 102. The images may be used to identify the volume (or amount) of crop to be engaged by header 102. This can be used to automatically increase or decrease the ground speed of combine 100 to maintain a desired crop throughput. This will be described in more detail below.

As the combine moves, header 102 engages the crop to be harvested and gathers the crop toward cutters 104. After the crop is cut, the crop moves through a conveyor in the feed mechanism 106 toward the feed accelerator 108, which accelerates the crop into the thresher 110. The crop is threshed by the rotor 112 rotating the crop against the concave plates 114. The threshed crop moves through a separator rotor in the separator 116, where some of the residue is discharged by the discharge beater 126 toward the residue subsystem 138. The residue may be shredded by the residue shredder 140 and spread on the field by the spreader 142. In other embodiments, the residue simply falls into a windrow, rather than being shredded and scattered.

The grain falls into a cleaning chamber (or cleaning subsystem) 118. Chaffer screen 122 separates some of the larger material from the grain, while screen 124 separates some of the finer material from the clean grain. The clean grain falls onto an auger in a clean grain elevator 130, which moves the clean grain upward and deposits the clean grain in a clean grain bin 132. The residue can be removed from the cleaning chamber 118 by the airflow generated by the cleaning fan 120. The residue may also move back in the combine 100 toward the residue handling subsystem 138.

The tailings can be moved by the tailings elevator 128 back to the thresher 110 where they can be threshed again. Alternatively, the tailings can also be transferred to a separate re-threshing mechanism (also using a tailings elevator or another transport mechanism) where they can also be re-threshed.

Fig. 1 also shows that in one example, combine harvester 100 may include a ground speed sensor 147, one or more separator loss sensors 148, a clean grain camera 150, one or more cleaning chamber loss sensors 152, a forward-looking camera 154, a rear-looking camera 156, a tailings elevator camera 158, and various other cameras or image/video capture devices. The ground speed sensor 147 illustratively senses the speed of travel of combine 100 over the ground. This may be accomplished by sensing the rotational speed of the wheels, drive shaft, axle, or other components. The speed of travel may also be sensed by a positioning system, such as a Global Positioning System (GPS), a dead reckoning system, a LORAN system, or various other systems or sensors that provide an indication of the speed of travel. In one example, an optical sensor captures images, and optical flow is used to determine the relative motion between two (or more) images taken at a given time interval.

The cleaning chamber loss sensor 152 illustratively provides an output signal indicative of the amount of grain lost. In one example, this includes signals indicative of the amount of grain lost by the right and left sides of the cleaning chamber 118. In one example, the sensor 152 is an impact sensor that counts impacts of grain per unit of time (or per unit of distance traveled) to provide an indication of grain loss in the cleaning chamber. Impact sensors for the right and left sides of the cleaning chamber may provide respective signals, and may also provide a combined or aggregate signal. In one example, a count of grain impacts and a spatial distribution associated with the count may be acquired using sound-based sensors across the area of the cleaning chamber and/or rotor. It should be noted that the sensors 152 may comprise only a single sensor, rather than separate sensors for each cleaning chamber.
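The impact-counting scheme above can be illustrated as follows; the function name, units, and output format are assumptions for illustration only:

```python
def cleaning_loss_signals(left_impacts, right_impacts, interval_s):
    """Convert left/right grain-impact counts accumulated over a sensing
    interval into per-side and combined loss rates (impacts per second)."""
    left_rate = left_impacts / interval_s
    right_rate = right_impacts / interval_s
    return {"left": left_rate, "right": right_rate,
            "combined": left_rate + right_rate}


# Example: 10 impacts on the left and 6 on the right in a 2-second window.
signals = cleaning_loss_signals(10, 6, 2.0)
```

Dividing instead by distance traveled (speed × interval) would give loss per meter, matching the per-unit-distance variant mentioned above.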

The separator loss sensor 148 provides a signal indicative of grain loss in the left and right separators. Sensors associated with the left and right separators may provide individual grain loss signals or combined or aggregate signals. This can also be done using a number of different types of sensors. It should be noted that the separator loss sensor 148 may also include only a single sensor, rather than separate left and right sensors.

Cameras 150, 156, and 158 illustratively capture video or still images that may be transmitted to and displayed in near real time on a display in operator cab 101 or on a remote device (shown in more detail below). For example, clean grain camera 150 generates a video showing grain entering clean grain bin 132 (or passing through clean grain elevator 130). Cameras 156 and 158 illustratively generate video feeds showing the discharge beater 126 and the area of the field behind combine 100, and the tailings in the tailings elevator 128, respectively. Alternatively, or in addition to the video feed, the captured images may be enhanced and presented to the operator, for example, in a manner intended to reduce the cognitive burden on the operator. These are merely examples; other or different cameras may be used, and/or they may be devices that capture still images or other visual data.

It should also be understood that the sensors and measuring mechanisms (in addition to the sensors already described) may include other sensors on the combine harvester 100. For example, the sensors and measurement mechanisms may include a residue setting sensor configured to sense whether the combine harvester 100 is configured to shred residue, drop residue into windrows, and the like. The sensors and measurement mechanisms may include a cleaning chamber fan speed sensor, which may be arranged adjacent the fan 120 to sense the speed of the fan. The sensors and measurement mechanisms may include a threshing gap sensor that senses the gap between the rotor 112 and the concave plates 114. The sensors and measurement mechanisms may include a threshing rotor speed sensor that senses the rotor speed of the rotor 112. The sensors and measurement mechanisms may include a chaffer gap sensor that senses the size of the openings in the chaffer screen 122. The sensors and measurement mechanisms may include a screen gap sensor that senses the size of the openings in screen 124. The sensors and measurement mechanisms may include a material-other-than-grain (MOG) moisture sensor that may be configured to sense the moisture level of material other than grain passing through the combine 100. The sensors and measurement mechanisms may include machine setting sensors configured to sense various configurable settings on the combine 100. The sensors and measurement mechanisms may also include machine orientation sensors, which may be any of a number of different types of sensors that sense the orientation or attitude of the combine 100. A crop attribute sensor may sense a variety of different types of crop attributes, such as crop type, crop moisture, and other crop attributes. The crop attribute sensor may also be configured to sense characteristics of the crop while the combine harvester 100 is processing the crop.
For example, the crop attribute sensor may sense the grain feed rate as the grain travels through the clean grain elevator 130. The crop attribute sensor may sense yield associated with the location at which the grain was harvested (as indicated by position sensor 157) based on the mass flow rate of grain through the elevator 130, or provide other output signals indicative of other sensed variables. Some other examples of the types of sensors that may be used are described below.

In one example, various machine settings may be set and/or controlled to achieve desired performance. These settings may include the concave clearance, the rotor speed, the screen and chaffer settings, the cleaning fan speed, and so forth. These settings may illustratively be set or controlled based on a desired crop throughput (e.g., the amount of crop processed per unit time by the combine 100). Thus, if the mass of the crop varies spatially in the field and the ground speed of the combine 100 remains constant, the throughput will vary with the amount of crop. As described below, the forward looking sensor 121 may be used to estimate the height of the crop in a given area and, further, to estimate the volume of crop to be processed. The volume may be converted to a biomass metric indicative of the biomass of the crop to be engaged. The machine speed can then be controlled based on the estimated biomass to maintain the desired throughput.
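The height-to-volume-to-biomass conversion described above might be sketched as follows; the bulk-density constant and all names are purely illustrative assumptions, not values from the patent:

```python
def biomass_metric_kg(mean_crop_height_m, header_width_m, patch_depth_m,
                      bulk_density_kg_m3=2.0):
    """Estimate the biomass of the crop about to be engaged: the height
    metric over a patch ahead of the header gives a crop volume, and an
    assumed bulk density converts that volume to mass."""
    crop_volume_m3 = mean_crop_height_m * header_width_m * patch_depth_m
    return crop_volume_m3 * bulk_density_kg_m3


# Example: 1 m tall crop across a 6 m header over a 0.5 m deep patch.
mass = biomass_metric_kg(1.0, 6.0, 0.5)  # → 6.0 with the assumed density
```

In practice the density would be calibrated per crop type and moisture level rather than fixed.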

Fig. 2 is a block diagram illustrating one example of an agricultural architecture 200 including the agricultural harvester 100 shown in fig. 1, the agricultural harvester 100 harvesting a crop 201. Some of the items shown in FIG. 2 are similar to those shown in FIG. 1, and they are numbered similarly.

Combine 100 includes a processing and control system 202 (also referred to as control system 202) configured to control the other components and systems of architecture 200, one or more processors or servers 203, and data storage 205, and combine 100 may also include other items 207. The data storage 205 is configured to store data, such as field data, for use by the combine harvester 100. Examples include, but are not limited to, field location data identifying the location of the field on which the combine harvester 100 is to operate, field shape and topography data defining the shape and topography of the field, crop location data indicative of the location of crops in the field (e.g., the location of crop rows), or any other data.

One example of a control system 202 is shown in FIG. 3, and FIG. 3 will be described in conjunction with FIG. 2. Control system 202 includes a communication controller 204, with communication controller 204 configured to control a communication system 206 to communicate between components of combine 100 and/or between combine 100 and other machines or systems (e.g., remote computing system 208 and/or machine 210), either directly or through a network 212. In addition, the combine 100 may also communicate with other agricultural machines 214. The agricultural machine 214 may be a similar type of machine as the combine 100, or it may be a different type of machine. The network 212 may be any of a number of different types of networks, such as the internet, a cellular network, a local area network, a near field communication network, or any of a number of other networks, or a combination of networks or communication systems.

Remote user 216 is shown interacting with remote computing system 208. The remote computing system 208 may be any of a number of different types of systems. For example, the remote computing system 208 may be a remote server environment or a remote computing system used by the remote user 216, such as a mobile device, a remote network, or a variety of other remote systems. The remote computing system 208 may include one or more processors or servers and data storage, and the remote computing system 208 may include other items as well.

Communication system 206 may include wired and/or wireless communication logic systems, and communication system 206 may be substantially any communication system that the systems and components of combine 100 can use to communicate information to other items (e.g., between control system 202, sensors 220, controllable subsystem 222, image capture system 224, and image analysis system 226). In one example, the communication system 206 communicates over a controller area network (CAN) bus (or another network, such as Ethernet, etc.) to pass information between these items. The information may include various sensor signals and output signals generated based on the sensed variables.

Control system 202 includes various control logic systems configured to control subsystems 222 or other systems and components in architecture 200. For example, the control system 202 includes a user interface component 228, the user interface component 228 configured to control an interface, such as an operator interface 230, the operator interface 230 including an input mechanism configured to receive input from an operator 232 and an output mechanism that provides output to the operator 232. User input mechanisms may include mechanisms such as steering wheels, pedals, levers, joysticks, hardware buttons, dials, links, switches, keyboards, etc., as well as virtual mechanisms or actuators (e.g., actuators displayed on a touch-sensitive display screen or a virtual keyboard). The output mechanism may include a speaker and/or a display device (e.g., a screen) that displays user-actuatable elements such as icons, links, buttons, and the like. In the case where the display device is a touch-sensitive display, those user-actuatable items may be actuated by touch gestures. Similarly, where the mechanism includes a speech processing mechanism, the operator 232 may provide input and receive output through a microphone and a speaker, respectively. Operator interface 230 may include any of a variety of other audio, visual, or tactile mechanisms.

The control system also includes feed speed control logic 234, setup control logic 236, path control logic 238, ground speed control logic 240, and may include other items 242. Some examples of controllable subsystems 222 have been discussed above and may include propulsion subsystem 244, steering subsystem 246, user interface mechanism 248, threshing subsystem 110, separator subsystem 116, cleaning subsystem 118, residue subsystem 138, material handling subsystem, header subsystem, and may include a variety of other systems 250, some of which have been described in fig. 1.

Feed speed control logic system 234 illustratively controls propulsion subsystem 244 and/or any other controllable subsystems 222 to maintain a relatively constant feed speed based on the yield of the geographic location that combine 100 will encounter, or on other characteristics predicted by the predictive model. As an example, if the predictive model indicates that the predicted yield ahead of the combine 100 (in the direction of travel) will decrease, the feed speed control logic 234 may control propulsion subsystem 244 to increase the forward speed of the combine 100 in order to keep the feed speed relatively constant. On the other hand, if the predictive model indicates that the yield ahead of the combine 100 will be relatively high, the feed speed control logic 234 may control propulsion subsystem 244 to decelerate, again maintaining the feed speed at a relatively constant level.
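One way to realize the inverse speed/yield behavior described above is a simple proportional controller; this is a hedged sketch, with the gain, speed limits, and names all assumed for illustration:

```python
def adjust_ground_speed(current_speed_m_s, predicted_feed_kg_s,
                        target_feed_kg_s, gain=0.5,
                        min_speed_m_s=0.3, max_speed_m_s=3.5):
    """Slow the machine when the predicted feed rate ahead exceeds the
    target and speed it up when the prediction falls below the target,
    clamped to a safe operating range."""
    error = target_feed_kg_s - predicted_feed_kg_s
    new_speed = current_speed_m_s + gain * error
    return max(min_speed_m_s, min(max_speed_m_s, new_speed))


# Example: heavier crop ahead (10 kg/s predicted vs. an 8 kg/s target)
# slows the machine; lighter crop ahead speeds it up.
slower = adjust_ground_speed(2.0, 10.0, 8.0)  # → 1.0
faster = adjust_ground_speed(2.0, 6.0, 8.0)   # → 3.0
```

A fielded controller would typically add rate limiting and integral action, but the sign convention (speed moves opposite to predicted feed) is the essential point.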

The setting control logic 236 may generate control signals to adjust machine settings or configurations. For example, the logic system 236 may control actuators to change machine settings based on predicted characteristics of the field being harvested (e.g., based on predicted yield or other predicted characteristics). As an example, the setting control logic 236 may actuate actuators that change the concave clearance, thresher rotor speed, conveyor speed, auger speed, screen and chaffer settings, cleaning fan speed, etc., based on, for example, a predicted yield or biomass that the combine 100 will encounter.

Path control logic 238 may generate and apply control signals to steering subsystem 246 to control steering of combine 100.

The sensors 220 may include any of a number of different types of sensors. In the example shown, the sensors 220 include a forward looking sensor 121, a speed sensor 147, a position sensor 157, an environmental sensor 252, a production sensor 254, and may also include other types of sensors 256.

The forward looking sensor 121 may be any of a variety of different sensors including, but not limited to, a camera, a stereo camera, a laser-based sensor, a lidar sensor, a radar sensor, a sonar sensor, an ultrasonic-based sensor, a light emitting diode (LED)-based lidar sensor, or any other sensor capable of measuring crop height. In one example, the forward looking sensor 121 is a laser system or a stereo camera system, and the forward looking sensor 121 determines the average crop height throughout an area of interest. The area of interest is illustratively known and located a known distance in front of the combine 100. For example, the area of interest may be centered at a known distance in front of the combine 100, as wide as the harvester header and one quarter meter to one meter deep.
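Determining the average crop height over such an area of interest could look like the following sketch; the ground reference, noise floor, and names are illustrative assumptions rather than the patent's method:

```python
def mean_crop_height_m(surface_heights_m, ground_height_m=0.0,
                       noise_floor_m=0.05):
    """Average the measured heights above ground over the region of
    interest, ignoring returns at or near ground level (e.g., bare soil
    visible between plants)."""
    crop_heights = [h - ground_height_m for h in surface_heights_m
                    if h - ground_height_m > noise_floor_m]
    if not crop_heights:
        return 0.0  # no crop returns in the patch
    return sum(crop_heights) / len(crop_heights)


# Example: three crop returns and one near-ground return.
height = mean_crop_height_m([1.0, 1.2, 0.02, 0.8])
```

With a laser or stereo sensor, `surface_heights_m` would come from the range measurements projected onto the known area of interest ahead of the header.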

The location sensor 157 is configured to determine the geographic location of the combine 100 on the field and may include, but is not limited to, a Global Navigation Satellite System (GNSS) receiver that receives signals from GNSS satellite transmitters. The location sensor 157 may also include a real-time kinematic (RTK) component configured to enhance the accuracy of position data derived from the GNSS signals. The speed sensor 147 is configured to determine the speed at which the combine harvester 100 travels over the field during the harvesting operation. This may include sensors that sense the movement of ground engaging elements (e.g., wheels or tracks) and/or may utilize signals received from other sources, such as the location sensor 157.

As shown in fig. 2, the image capture system 260 includes image capture components configured to capture one or more images of the area of interest (i.e., a portion of the field in which the combine 100 is to operate) and image processing components configured to process those images. The captured images represent spectral responses captured by the image capture system 260, which are provided to the image analysis system 262 and/or stored in the data store 264. The spectral imaging system illustratively includes a camera that acquires a spectral image of the field under analysis. For example, the camera may be a multispectral camera or a hyperspectral camera, or a wide variety of other devices for capturing spectral images. The camera may detect visible light, infrared radiation, or other wavelengths.

In one example, the image capture component includes a stereo camera configured to capture still images, time series of images, and/or video of a field. An exemplary stereo camera captures high definition video at thirty Frames Per Second (FPS) and has a wide field of view of 110 degrees. Of course, this is for illustrative purposes only.

Illustratively, a stereo camera includes two or more lenses, where each lens has a separate image sensor. Stereoscopic images (e.g., stereophotographs) captured by stereoscopic cameras allow for computer stereoscopy to extract three-dimensional information from digital images. In another example, a single lens camera may be utilized to obtain an image (referred to as a "single" image).
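Extracting three-dimensional information from a stereo pair rests on the standard pinhole relation between pixel disparity and depth, Z = f·B/d. The following sketch assumes rectified images; the names and numeric values are illustrative only.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth of a matched point from a rectified stereo pair:
    Z = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px
```

With a 1000-pixel focal length and a 0.5 m baseline, for example, a 250-pixel disparity corresponds to a point 2 m from the cameras; crop points closer to the machine produce larger disparities.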

Image capture system 260 may include one or more of an aerial image capture system 266, an onboard image capture system 268, and/or other image capture systems 270. Examples of the aerial image capture system 266 include cameras or other imaging components carried on unmanned aerial vehicles (UAVs) or drones (e.g., block 210). Examples of the onboard image capture system 268 include a camera or other imaging component (e.g., sensor 121) mounted on, or otherwise carried by, combine harvester 100 (or 214). An example of another image capture system 270 is a satellite imaging system. The system 260 also includes a positioning system 272, and the system 260 may also include other items 274. The positioning system 272 is configured to generate a signal indicative of a geographic location associated with a captured image. For example, the positioning system 272 may output GPS coordinates associated with the captured image to obtain a geo-referenced image 276, which is provided to the image analysis system 262.

The image analysis system 262 illustratively includes one or more processors 278, a communication system 280, a data store 282, a target field data identification logic system 284, a trigger event detection logic system 286, a crop status determination logic system 288, a harvest data (e.g., map) processing logic system 290, and may also include other items 292.

In one example, communication system 280 is substantially similar to communication system 206 described above. The target field data identification logic system 284 is configured to identify the target or subject field under analysis for which the image 276 is being analyzed. Further, the target field data identification logic system 284 is configured to acquire or otherwise identify field data for the target field, such as, but not limited to, topographical data identifying field topology, harvest data indicating previous harvesting operations (i.e., which regions of the field have been harvested) during the current growing season, and so forth.

The trigger event detection logic system 286 is configured to detect trigger criteria that trigger the image analysis. For example, in response to detecting the trigger criteria, the logic system 286 may transmit instructions to the image capture system 260 to capture images of the target field. These images are then processed by image analysis system 262, and the results of the image analysis are utilized by crop status determination logic system 288 to determine the status of the crop represented in the captured images (e.g., standing crop, fallen crop, crop stubble). For example, as described below, logic system 288 may geometrically classify (e.g., based on height) the crop plants in the images. As discussed in further detail below, logic system 288 is configured to determine that the height of crop plants represented in the captured images is below a threshold and, thus, that those plants are classified as non-standing crop (i.e., fallen crop or crop stubble). As used herein, the term "crop plant" refers to crops in both growing and non-growing states, including after harvesting (in which case the crop is referred to as stubble).

The harvest data processing logic 290 is configured to acquire harvest data indicative of the area in the field that has been harvested. In one example, the harvest data includes harvest yield data from a particular region of the field based on prior harvest experiences of the combine harvester 100 and/or other harvesting machines. Using this data, system 262 generates and outputs image/crop analysis results 294.

The results 294 illustratively identify areas of standing crop, fallen crop, and crop stubble (i.e., already harvested crop) in the field. In one example, the results 294 are used to generate a representation of the amount of plant material on the path of the combine harvester 100. This may include any desired characteristic, such as, but not limited to, the mass, volume, weight, etc., of the crop to be harvested by the combine harvester 100. In one example, the representation identifies one or more of a predicted biomass of a crop to be engaged by the header and threshed in the combine harvester 100, a predicted crop grain yield, a material other than grain (MOG), and the like.

In one example, the results 294 are provided to a biomass system 296 configured to estimate a volume of crop to be processed and convert the volume into a biomass metric indicative of the biomass of the crop to be joined.

Crop biomass may indicate a predicted amount of biomass in a given portion of the field. For example, biomass can be expressed as mass per unit area, such as kilograms (kg) per square meter (m²). Crop grain yield may indicate a predicted amount of yield in a given portion of the field. For example, crop grain yield may be expressed as yield per unit area, such as bushels per acre.
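The two representations above amount to simple per-area normalizations. A minimal sketch (the acre conversion constant is standard; the helper functions themselves are illustrative, not from the specification):

```python
SQ_METERS_PER_ACRE = 4046.86  # standard survey conversion

def biomass_density_kg_per_m2(total_mass_kg, area_m2):
    """Biomass expressed as kilograms per square meter."""
    return total_mass_kg / area_m2

def grain_yield_bu_per_acre(total_bushels, area_m2):
    """Grain yield expressed as bushels per acre."""
    return total_bushels / (area_m2 / SQ_METERS_PER_ACRE)
```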

In any case, this information can be used to control the combine harvester 100. Examples of which are discussed in further detail below. Briefly, in one example, machine speed may be controlled based on estimated biomass and/or crop yield to maintain a desired production capacity.

Note that as indicated by the dashed box in fig. 2, the control system 202 may include some or all of the image capture system 260, the image analysis system 262, and/or the biomass system 296.

The biomass system 296 includes a sensed region generator logic system 302, which determines the region sensed by the forward looking sensor 121. The volume generator logic system 304 uses the sensed region to determine a crop volume or characteristic in front of the combine harvester 100. The volume-to-biomass conversion logic system 306 receives the crop volume or characteristic in front of the combine 100 and estimates the biomass of the crop in the sensed volume. In one example, this is based on a conversion factor derived from a sensed variable indicative of actual biomass (e.g., from a sensor in the combine harvester 100, such as a rotor pressure sensor). The data store interaction logic system 308 stores data in data store 310 and retrieves information from data store 310. The biomass system 296 also includes a recommendation logic system 312, one or more processors 314, and may include other items 316. The recommendation logic system 312 is configured to generate recommendations, based on the estimated biomass, to maintain a desired production capacity. The recommendations may be carried out automatically by control system 202 and/or manually by operator 232. Some recommendations that may be generated include changing the ground speed of the combine 100 and/or changing machine settings, such as the concave setting, sieve and chaffer settings, cleaning fan speed, threshing rotor speed, conveyor/feed speed, or cutter speed. Of course, other settings may also be changed.
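The volume-to-biomass conversion logic system 306 can be sketched as a running conversion factor that is recalibrated against a sensed variable indicative of actual biomass, such as mass inferred from rotor pressure. The class below is a hypothetical simplification, not the patented implementation.

```python
class VolumeToBiomass:
    """Convert a sensed crop volume ahead of the machine into a
    biomass estimate using a calibrated kg-per-cubic-meter factor."""

    def __init__(self, kg_per_m3=1.0):
        self.kg_per_m3 = kg_per_m3  # initial (assumed) density

    def calibrate(self, sensed_volume_m3, sensed_biomass_kg):
        """Update the factor from in-machine sensing, e.g. biomass
        inferred from a rotor pressure sensor."""
        if sensed_volume_m3 > 0:
            self.kg_per_m3 = sensed_biomass_kg / sensed_volume_m3

    def estimate(self, volume_m3):
        """Biomass estimate for a newly sensed crop volume."""
        return volume_m3 * self.kg_per_m3
```

Recalibrating as the machine harvests lets the factor track changing crop density across the field.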

FIG. 4 is a block diagram illustrating one example of the crop status determination logic system 288. As shown, the logic system 288 includes a geometric classifier 402 and is configured to receive images 404 and field data 406. As described above, the images 404 may be acquired by image capture system 260, which may include an aerial image capture system 266, an onboard image capture system 268, and the like. For example, while the combine harvester 100 is harvesting a field, images of an area of the field ahead of the combine harvester 100 in the direction of travel may be acquired by the sensor 121.

The field data 406 represents features or conditions of the field such as, but not limited to, harvest data, terrain data, and the like. This may include a priori data and/or in-field data and may be obtained from a variety of different sources. For example, it may be obtained from a UAV or drone, a satellite system, the sensors 220 on the combine 100, other farm machinery 214, or other devices. In one example, the harvest data includes a harvest map that identifies regions of the field that have already been harvested (e.g., by the combine 100 and/or other machines 214). The topographical data may include a topographical map representing the topography of the field (e.g., surface elevation, grade, etc.).

Using the images 404 and field data 406, the crop status determination logic system 288 outputs image/crop analysis results (e.g., results 294) that represent the status of the crop on the path of the combine 100. For example, an area of the field may be identified as having standing crop. In this case, logic system 288 outputs a standing crop classification (block 408), as well as a height metric representing the height of the standing crop plants above the field surface. The logic system 288 may also output a fallen crop classification (block 410) indicating that the crop in a particular area is fallen, that is, the crop has not been harvested but is not standing (i.e., has a height below a threshold). Additionally, logic system 288 may output a crop residue classification (block 412) indicating that a particular area has already been harvested. Exemplary operation of logic system 288 is discussed below.

Fig. 5 is a flow diagram 500 illustrating example operations for determining a crop state in a field and generating corresponding control signals to control an agricultural machine. For purposes of illustration, and not limitation, FIG. 5 will be described in the context of architecture 200 shown in FIG. 2.

At block 502, the logic system 286 detects a trigger event to actuate a crop status determination. In one example, the triggering event includes detecting that the combine harvester 100 is to perform a harvesting operation. This is represented by block 504. The detection of the trigger event may also include detecting a manual input (block 506), or may occur automatically (block 508). Of course, the trigger event may be detected in other ways. This is indicated by block 510.

At block 512, a target field to be harvested is identified. In one example, this may include input from the operator 232 identifying the target field for the control system 202. In another example, the target field may be automatically identified based on geographic location information (e.g., using signals from sensors 157) identifying the location of combine harvester 100.

At block 514, field data for the target field is identified. As described above, in one example, the field data 406 is acquired by the crop status determination logic system 288 and may include a priori data (block 516), field data (block 518), or other data (block 520). In any case, the data may be obtained by combine 100 (block 522) or other machine or system (e.g., remote computing system 208, machine 210, machine 214, etc.). This is indicated by block 524.

The field data may include topographical data 526, such as a topographical map of the target field. The field data may also include harvest data 528 indicative of harvesting operations during the current growing season. For example, a harvest map may be obtained that indicates yield data from the target field.

Based on the harvest data, a region of the field where the crop has been harvested is identified. This is indicated by block 530. For example, a stubble signal is generated at block 532 that identifies regions in the field that have been harvested and are to be classified as crop stubble regions.

At block 534, characteristics of crop plants in a particular region of the field (on the path of the combine harvester 100) are detected. Illustratively, the characteristic represents the height of the crop plant above the field surface. The characteristics may be detected based on data acquired by harvester 100 (block 536) and/or data acquired by other machines (such as those discussed above with respect to block 524). This is indicated by block 538.

In one example of block 534, images of the field are acquired at block 540 and geometric classification is performed at block 542 to acquire crop height data. In one example, at block 544, a 3D point cloud is generated from the stereo image data.
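One way to turn a 3D point cloud into crop height data, sketched below, is to grid the points and take the tallest return in each cell above an estimated ground plane. The gridding scheme, cell size, and use of the per-cell maximum are assumptions for illustration, not the patented geometric classification.

```python
def crop_heights_from_points(points, ground_z=0.0, cell_m=0.5):
    """Bin 3-D points (x, y, z) into square grid cells and report, for
    each cell, the highest return minus the ground plane elevation."""
    top_z = {}
    for x, y, z in points:
        key = (int(x // cell_m), int(y // cell_m))
        top_z[key] = max(top_z.get(key, float("-inf")), z)
    return {key: z - ground_z for key, z in top_z.items()}
```

Averaging the per-cell heights over a region would then yield the height metric generated at block 546.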

At block 546, a height metric is generated that represents the height of the crop plants in the particular region. In one example, the height metric generated at block 546 represents the average height of the crop plants above the field surface. This is indicated by block 548.

Note that the particular regions that detect the crop plant characteristics at block 534 and that generate the height metric at block 546 may be identified in any of a variety of ways. For example, the area may be a predefined area in front of the combine harvester 100. For illustration, the predefined area may be defined as the width of the header, between 0 and 20 feet in front of the header.

Further, the region may be selected arbitrarily and/or based on an identified boundary identified from the image data. For example, based on the image data, portions of the field having different crop heights (exceeding a height difference threshold) may be identified. In this way, the selected area generally follows the boundary between standing crops, fallen crops and/or crop stubble.

At block 550, a crop status in the particular area is determined based on the height metric generated at block 546 and the already-harvested regions identified at block 530. One example of block 550 is shown in fig. 6.

In the example of fig. 6, at block 552, a height metric for a particular region is generated. The height metric represents the height of the crop plants in a particular area. For example, the height metric may represent the average height of the crop plants in a particular area. This may be generated in any of a variety of ways. In one example, the ground plane (representing the surface of the field) is estimated based on sensor data from sensors 220 and/or based on remotely received data, such as a topographical map from system 208.

At block 554, a height threshold is selected or otherwise obtained. The height threshold may be user selected (represented at block 556). In another example, the height threshold is automatically selected. This is indicated by block 558. The height threshold may be based on any of a variety of factors. For example, the height threshold may be based on the type of crop and/or the condition of the crop being harvested in the field. For instance, conditions such as maturity, moisture content, etc. can be utilized to determine the expected height of a crop plant standing in the field. This is indicated by block 560. Further, environmental conditions may be used to select the height threshold. This is indicated by block 562. For example, the environmental conditions may indicate current or prior wind speed, precipitation, etc., to name a few. Of course, the height threshold may be selected in any of a variety of other ways and based on any of a variety of other considerations. This is indicated by block 564.
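A threshold selection along these lines might scale an expected mature height by crop type, maturity, and recent wind, as in the sketch below. The crop table and all scaling factors are invented for illustration only.

```python
# Hypothetical nominal standing heights by crop type (meters).
NOMINAL_HEIGHT_M = {"corn": 2.4, "wheat": 0.9, "soybean": 0.8}

def select_height_threshold(crop_type, maturity_frac=1.0,
                            wind_speed_m_s=0.0, standing_frac=0.5):
    """Pick the standing/non-standing cutoff as a fraction of the
    expected plant height, relaxed slightly after strong wind."""
    expected = NOMINAL_HEIGHT_M[crop_type] * maturity_frac
    if wind_speed_m_s > 10.0:
        standing_frac *= 0.9  # tolerate some lean after high wind
    return expected * standing_frac
```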

At block 566, logic system 288 determines whether the height metric is above the threshold obtained at block 554. If so, the particular region of the field is classified as standing crop. This is indicated by block 568. In this case, the standing crop category result is output along with a height metric representing the height of the standing crop in the particular area (e.g., block 408). This is indicated by block 570.

If the height metric is not above the threshold, block 572 determines whether the area is within an already harvested zone in the field. If so, the region is classified as a crop stubble region and a crop stubble category is output at block 574.

If the area is not in an already-harvested zone, block 576 classifies the area as fallen crop and outputs a fallen crop category. In this regard, it is worth noting that in this example, the distinction between crop stubble and fallen crop can be made at blocks 574 and 576, respectively, without the need for a trained image classifier to classify images of non-standing crops. As described above, utilizing such an image classifier to distinguish crop stubble from fallen crop requires extensive training using a large amount of training data, which in turn requires a large amount of processing bandwidth and time. Further, such classifiers may be inaccurate, which can lead to incorrect predictions of biomass or yield.
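The decision structure of blocks 566, 572, 574, and 576 reduces to a short rule: height decides standing versus non-standing, and the harvest map decides stubble versus fallen. A minimal sketch (the function name and return labels are illustrative):

```python
def classify_region(height_m, height_threshold_m, in_harvested_zone):
    """Standing if tall enough; otherwise stubble when the harvest map
    says the zone was already cut; otherwise fallen (downed) crop."""
    if height_m > height_threshold_m:
        return "standing"
    return "stubble" if in_harvested_zone else "fallen"
```

No trained image classifier is involved: the only inputs are the height metric and the binary harvested-zone signal derived from the harvest data.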

Returning again to FIG. 5, at block 578, a representation of the predicted amount of crop plant material on the path of the machine is generated. The representation may take any of a variety of forms. For example, block 578 may predict the biomass that the combine harvester 100 will encounter as it moves through the field. This is indicated by block 580. Alternatively or additionally, block 578 may predict the yield from the field. This is indicated by block 582. Of course, other measurements may be utilized. This is indicated by block 584.

In one example, the representation is generated at block 578 based on the crop classification (i.e., the crop status determined at block 550). This is indicated by block 586. For example, the predicted amount of crop plant material is based on whether the area is standing crop, fallen crop, and/or crop stubble. In addition, the predicted amount (mass flow) of crop plant material may be based on the height of the crop. This is indicated by block 588. In this way, the mass flow through the combine 100 can be determined based on the classified zones in front of the combine 100 and the crop heights sensed in those zones. The settings (e.g., speed, etc.) of the machine may be controlled based on the determined mass flow.
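Combining the classification zones with sensed heights might look like the following sketch, which sums material over classified regions with a per-class weight. The weights (standing contributes fully, fallen partially, stubble nothing) are illustrative assumptions, not values from the specification.

```python
# Hypothetical per-class contribution weights.
CLASS_WEIGHT = {"standing": 1.0, "fallen": 0.8, "stubble": 0.0}

def predicted_material_kg(regions):
    """Sum predicted plant material over classified regions, each
    given as (classification, area_m2, height_m, density_kg_per_m3)."""
    return sum(CLASS_WEIGHT[cls] * area * height * density
               for cls, area, height, density in regions)
```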

Alternatively or additionally, the predicted amount of crop plant material may be based on other crop conditions. This is indicated by block 590. For example, and without limitation, the prediction may be based on a Normalized Difference Vegetation Index (NDVI) image indicating growth stage, water stress (drought), plant stalk diameter, the presence of weeds, etc. Of course, other considerations may be taken into account when generating the representation of the predicted amount of crop plant material. This is indicated by block 592.

At block 594, recommendations are generated to alter the operation of the combine 100 based on the predicted amount of crop plant material generated at block 578. For example, recommendations may be generated based on a desired production capacity of the combine harvester 100. This is represented by block 596. In other examples, the recommendation may be based on a target performance of the combine harvester 100, such as grain quality, productivity, fuel consumption, or any other performance metric.

The recommended changes may include changes to the ground speed of the combine 100. This is indicated by block 598. Alternatively or additionally, the recommended changes may include changes to the harvesting function of the combine 100. This is indicated by block 600. For example, changes to the harvesting function may include changes to the header, threshing subsystem 110, separator subsystem 116, cleaning subsystem 118, residue subsystem 138, or any other controllable subsystem of the combine 100. Of course, other recommendations may also be generated. This is indicated by block 602.

At block 604, control signals are generated by control system 202 (e.g., using user interface component 228, logic system 234, logic system 236, logic system 238, logic system 240, etc.) to control one or more controllable subsystems 222. For example, the user interface mechanisms 248 may be controlled by the user interface component 228 to output an indication of the predicted amount of crop plant material on the path of the machine (represented by block 606) and/or to output the recommendations generated at block 594. This is indicated by block 608. For example, the output at blocks 606 and/or 608 may be provided through a display device, a speaker, or the like.

Alternatively or additionally, the control signal generated at block 604 may control an actuator to adjust the settings of the controllable subsystem 222. This is indicated by block 610. For example, for the combine harvester 100 shown in fig. 1, the position of the header 102 may be adjusted, the rotor pressure of the separator 116 may be adjusted, and the operation of the cleaning subsystem 118 may be adjusted. Of course, these are for illustration purposes only.

Further, control signals may be generated to control the propulsion system 244 to control the ground speed of the combine harvester 100. This is indicated by block 612. Alternatively or additionally, control signals may be generated to control the steering subsystem 246, and thus the steering of the combine harvester 100. This is indicated by block 614. Of course, combine 100 (or other machines and systems in architecture 200) may be controlled in other ways. This is indicated by block 616. At block 618, the operation determines whether there are more areas of the field to be harvested. If so, operation may return to block 534 to process a subsequent region of the field on the path of the combine harvester 100.

For purposes of illustration and not limitation, the operation shown in fig. 5 will be described in the context of fig. 7-1, fig. 7-1 showing an exemplary field 700 upon which the combine harvester 100 operates to harvest crop plants 702. As shown, combine harvester 100 travels along path 704 in a forward direction as indicated by arrow 706. As combine harvester 100 travels along path 704, header 102 operates to harvest the crop plants. The field data acquired at block 514 includes harvest data identifying a region 708 in the field that has been harvested (e.g., by the combine harvester 100 or another harvester operating in the field 700).

A forward looking sensor 121 (illustratively, a stereo camera) captures images of crop plants 702 on the path 704 in front of the combine harvester 100. Block 534 identifies the characteristics of the crop plants on the path 704 by performing geometric classification (block 542) based on the images acquired by the sensor 121. Block 546 generates a first height metric representing the height of the crop plant material in a first area (represented by dashed box 710) and a second height metric representing the height of the crop plants in a second area (represented by dashed box 712). The first height metric indicates that the crop plants 702 in area 710 have an average height above a selected threshold. Thus, area 710 is classified as standing crop. The second height metric indicates that the crop plants 702 in area 712 have an average height below the selected threshold. Thus, area 712 is classified as non-standing crop.

Additionally, based on identifying region 708 as having been harvested, region 714 is removed from non-standing crop region 712 and classified as crop stubble. The remaining portion of region 712 (i.e., region 716) is identified as fallen crop. In other words, region 716 is identified as an unharvested region having a height metric below the threshold. Using information from the classified regions 710, 714, and 716, the biomass system 296 can generate a representation of the predicted amount of crop material (e.g., biomass) on the path 704.

Fig. 7-2 illustrates one example of calculating mass flow for a combine harvester 100 harvesting a field 700. Region 720 is identified as a crop stubble based on the prior harvest data, and region 722 is identified as a standing crop based on image processing that determines that the height of the crop within region 722 is above a standing threshold. In this example, region 720 is processed as a binary signal (i.e., region 720 is or is not crop stubble), and is therefore ignored for purposes of mass flow determination. Instead, region 722 is analyzed to determine the contribution to mass flow. In one example, system 204 determines the amount of crop material based on a width 724 of area 722 located within a path 726 of header 102 and a crop height (e.g., a height metric indicating an average crop height in area 722). Based on this determination, the speed (or other setting) of the combine harvester 100 is controlled.
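The mass-flow determination described for fig. 7-2 can be sketched as a swept-volume calculation: only the standing-crop width inside the header path contributes, and that strip sweeps width × height × ground speed of material per second. The density constant is an assumed placeholder, not a value from the specification.

```python
def mass_flow_kg_per_s(standing_width_m, crop_height_m,
                       ground_speed_m_s, density_kg_per_m3=0.6):
    """Mass flow into the header from the standing-crop strip; the
    stubble region is treated as a binary signal and contributes zero."""
    return (standing_width_m * crop_height_m
            * ground_speed_m_s * density_kg_per_m3)
```

If the computed flow exceeds a target capacity, the ground speed can be reduced proportionally, which corresponds to the propulsion control described at block 612.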

It can thus be seen that the present system provides a number of advantages. For example, and without limitation, the present system generates predictions of crop plant material for controlling an agricultural harvester, which improves operation of the machine, for example, by maintaining a desired production capacity. Further, the present system uses geometric classification, together with the harvest data, to determine whether an area of the field is standing crop, fallen crop, or crop stubble. The classification between fallen crop and crop stubble is done using the harvest data, without the need for a dedicated image classifier to perform the classification task.

It should be noted that the above discussion has described various different systems, components, and/or logic systems. It should be understood that such systems, components, and/or logic systems may be comprised of hardware items (e.g., processors and associated memory, or other processing components, some of which are described below) that perform the functions associated with those systems, components, and logic systems. In addition, as described below, the system, components and/or logic system may be comprised of software that is loaded into memory and then executed by a processor or server or other computing component. The system, components, and/or logic system may also be comprised of different combinations of hardware, software, firmware, etc., some examples of which are described below. These are merely a few examples of different structures that may be used to form the above described systems, components, and/or logic systems. Other configurations may also be used.

The present discussion has referred to processors, processing systems, controllers, and servers. In one example, these include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong, are activated by those systems, and facilitate the functionality of the other components or items in those systems.

In addition, a number of user interface displays have been discussed. The user interface displays may take a variety of different forms and may have a variety of different user-actuatable input mechanisms disposed thereon. For example, the user-actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, and the like. The input mechanisms may also be actuated in a number of different ways. For example, a pointing and clicking device (e.g., a trackball or mouse) may be used to actuate the input mechanisms. The input mechanisms may be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, or the like. A virtual keyboard or other virtual actuators may also be used to actuate the input mechanisms. Further, where the screen on which the input mechanisms are displayed is a touch sensitive screen, the input mechanisms may be actuated using touch gestures. In addition, where the device displaying the input mechanisms has a speech recognition component, voice commands may be used to actuate the input mechanisms.

A number of data stores have also been discussed. It should be noted that each of these data stores may be divided into multiple data stores. The data stores may all be local to the systems accessing them, may all be remote, or some may be local while others are remote. All of these configurations are contemplated herein.

Further, the figures show a number of blocks, each of which is assigned a certain function. It should be noted that fewer blocks may be used such that the functions are performed by fewer components. In addition, more blocks may be used, with the functions being distributed among more components.

Fig. 8 is a block diagram of one example of the architecture shown in fig. 2, in which combine harvester 100 communicates with elements in a remote server architecture 800. In an example, the remote server architecture 800 may provide computing, software, data access, and storage services that do not require end users to know the physical location or configuration of the system delivering the services. In examples, the remote server may deliver the service over a wide area network (e.g., the internet) using an appropriate protocol. For example, the remote server may deliver the application over a wide area network, and may be accessed through a web browser or any other computing component. The software or components shown in fig. 2 and the corresponding data may be stored on a server located at a remote location. The computing resources in the remote server environment may be centralized at a remote data center location, or the computing resources may be distributed. Remote server architectures can deliver services through a shared data center even though they appear to users as a single point of access. Thus, the components and functionality described herein may be provided from a remote server located at a remote location using a remote server architecture. Alternatively, the components and functionality may be provided from a conventional server, or may be installed directly on the client device, or otherwise provided.

In the example shown in fig. 8, some items are similar to those shown in fig. 2, and they are similarly numbered. Fig. 8 specifically illustrates that the system 260, system 262, system 296, and/or data store 205 can be located at a remote server location 802. Accordingly, combine 100, machine 210, machine 214, and/or system 208 access those systems through remote server location 802.

Fig. 8 also depicts another example of a remote server architecture. Fig. 8 shows that it is also contemplated that some elements of fig. 2 are disposed at a remote server location 802, while other elements are not so disposed. For example, the data store 205 may be disposed at a location separate from the location 802 and may be accessed through a remote server located at the location 802. Alternatively or additionally, one or more of the systems 260, 262, and 296 may be disposed at a location separate from the location 802 and may be accessed through a remote server located at the location 802.

Wherever these components are located, agricultural harvester 100 may access them directly over a network (a wide area network or a local area network), the components may be hosted at a remote site by a service, or they may be provided as a service or accessed through a connection service that resides at a remote location. Further, the data may be stored at substantially any location and intermittently accessed by, or forwarded to, interested parties. For example, physical carriers may be used instead of, or in addition to, electromagnetic wave carriers. In such an example, where cellular coverage is poor or nonexistent, another mobile machine (such as a fuel truck) may have an automatic information collection system. As the agricultural machine comes near the fuel truck for fueling, the system automatically collects information from, or transfers information to, the machine using any type of temporary wireless connection. The collected information can then be forwarded to the main network when the fuel truck reaches a location where there is cellular coverage (or other wireless coverage). For example, the fuel truck may enter a covered location when traveling to fuel other machines or when at a main fuel storage location. All of these architectures are contemplated herein. Further, the information may be stored on the agricultural machine until the agricultural machine enters a covered location. The agricultural machine itself may then send information to, and receive information from, the main network.
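The intermittent store-and-forward scheme described above can be sketched as follows. This is a minimal illustration only; the class and method names are assumptions and do not correspond to any component described in the figures.

```python
from collections import deque

# Minimal store-and-forward sketch: records queue up on the machine while no
# network link is available, and are flushed whenever a link (cellular, or a
# temporary connection to a fuel truck) becomes available. All names here are
# illustrative assumptions.

class StoreAndForward:
    def __init__(self):
        self._pending = deque()

    def record(self, message):
        """Buffer a message while no network link is available."""
        self._pending.append(message)

    def flush(self, send):
        """Forward all buffered messages through `send` once a link exists.

        Returns the number of messages delivered.
        """
        delivered = 0
        while self._pending:
            send(self._pending.popleft())
            delivered += 1
        return delivered
```

In use, `flush` would be called with the transmit function of whichever link is currently available, so the same buffer serves both the temporary fuel-truck connection and the main network.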

It should also be noted that the elements or portions of elements in fig. 2 may be arranged on a variety of different devices. Some of these devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, and the like.

FIG. 9 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that may be used as the user's or customer's handheld device 16, in which the present system (or a portion thereof) may be deployed. For example, the mobile device may be deployed in an operator cab of the agricultural machine 100 or as the remote computing system 208. Figs. 10 and 11 are examples of handheld or mobile devices.

Fig. 9 provides a general block diagram of the components of client device 16 that may operate some of the components shown in fig. 2, interact with them, or both. In device 16, a communication link 13 is provided, which communication link 13 allows the handheld device to communicate with other computing devices, and in some embodiments, provides a channel for receiving information automatically, such as by scanning. Examples of communication links 13 include protocols that allow communication via one or more communication protocols, such as wireless services for providing cellular access to a network, and protocols that provide local wireless connectivity to a network.

In other examples, the application may be received on a removable Secure Digital (SD) card connected to interface 15. The interface 15 and the communication link 13 communicate with a processor 17 (which may also be embodied as a processor or server in the previous figures) via a bus 19, which bus 19 is further connected to a memory 21 and input/output (I/O) components 23 as well as a clock 25 and a positioning system 27.

In one example, I/O components 23 are provided to facilitate input and output operations. I/O components 23 for various embodiments of device 16 may include input components (e.g., buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers, orientation sensors) and output components (e.g., a display device, speakers, and/or printer port). Other I/O components 23 may also be used.

Clock 25 illustratively includes a real-time clock component that outputs a time and date. Clock 25 illustratively may also provide timing functions for processor 17.

Positioning system 27 illustratively includes components that output the current geographic location of device 16. This may include, for example, a Global Positioning System (GPS) receiver, LORAN system, dead reckoning system, cellular triangulation system, or other positioning system. The positioning system 27 may also include, for example, mapping software or navigation software that generates desired maps, navigation routes, and other geographic functions.

Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. The memory 21 may include all types of tangible volatile and non-volatile computer-readable storage devices. Memory 21 may also include computer storage media (described below). The memory 21 stores computer readable instructions that, when executed by the processor 17, cause the processor to perform computer-implemented steps or functions in accordance with the instructions. The processor 17 may also be activated by other components to facilitate its functions.

Fig. 10 shows an example in which the device 16 is a tablet computer 850. In fig. 10, computer 850 is shown with a user interface display screen 852. Screen 852 may be a touch screen or a pen-enabled interface that receives input from a pen or stylus. Screen 852 may also use an on-screen virtual keyboard. Of course, computer 850 may also be attached to a keyboard or other user input device through a suitable attachment mechanism (e.g., a wireless link or a USB port). The computer 850 may also illustratively receive voice input.

Fig. 11 shows that the device may be a smartphone 71. The smartphone 71 has a touch-sensitive display 73, the touch-sensitive display 73 displaying icons or tiles or other user input mechanisms 75. The mechanisms 75 may be used by a user to run applications, make calls, perform data transfer operations, and the like. Typically, the smartphone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.

Note that other forms of the device 16 are possible.

Fig. 12 is an example of a computing environment in which the elements of fig. 2, or some of them, for example, may be deployed. With reference to FIG. 12, an exemplary system for implementing some embodiments includes a computing device in the form of a computer 910. The components of computer 910 may include, but are not limited to: a processing unit 920 (which may include a processor or server from previous figures), a system memory 930, and a system bus 921 that couples various system components including the system memory to the processing unit 920. The system bus 921 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The memory and programs described with reference to fig. 2 may be deployed in corresponding portions of fig. 12.

Computer 910 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 910 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is distinct from, and does not include, a modulated data signal or carrier wave. Computer storage media includes hardware storage media, including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 910. Communication media may embody computer readable instructions, data structures, program modules, or other data in a transport mechanism and include any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.

The system memory 930 includes computer storage media in the form of volatile and/or nonvolatile memory such as Read Only Memory (ROM) 931 and Random Access Memory (RAM) 932. A basic input/output system 933 (BIOS), containing the basic routines that help to transfer information between elements within computer 910, such as during start-up, is typically stored in ROM 931. RAM 932 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 920. By way of example, and not limitation, FIG. 12 illustrates operating system 934, application programs 935, other program modules 936, and program data 937.

The computer 910 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 12 illustrates a hard disk drive 941 that reads from or writes to non-removable, nonvolatile magnetic media, and an optical disk drive 955 that reads from or writes to a removable, nonvolatile optical disk 956. The hard disk drive 941 is typically connected to the system bus 921 through a non-removable memory interface such as interface 940, and optical disk drive 955 is typically connected to the system bus 921 by a removable memory interface, such as interface 950.

Alternatively, or in addition, the functions described herein may be performed, at least in part, by one or more hardware logic components. By way of example, and not limitation, illustrative types of hardware logic components that may be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

The drives and their associated computer storage media discussed above and illustrated in FIG. 12, provide storage of computer readable instructions, data structures, program modules and other data for the computer 910. In FIG. 12, for example, hard disk drive 941 is illustrated as storing operating system 944, application programs 945, other program modules 946, and program data 947. Note that these components can either be the same as or different from operating system 934, application programs 935, other program modules 936, and program data 937.

A user may enter commands and information into the computer 910 through input devices such as a keyboard 962, a microphone 963, and a pointing device 961, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, handle, satellite dish, scanner, or the like. These input devices and other input devices are often connected to the processing unit 920 through a user input interface 960 that is coupled to the system bus, but may be connected by other interface and bus structures. A visual display 991 or other type of display device is also connected to the system bus 921 via an interface, such as a video interface 990. In addition to the monitor, computers may also include other peripheral output devices such as speakers 997 and printer 996, which may be connected through an output peripheral interface 995.

The computer 910 is operated in a networked environment using logical connections (e.g., a local area network (LAN), a wide area network (WAN), or a controller area network (CAN)) to one or more remote computers (e.g., a remote computer 980).

When used in a LAN networking environment, the computer 910 is connected to the LAN 971 through a network interface or adapter 970. When used in a WAN networking environment, the computer 910 typically includes a modem 972 or other means for establishing communications over the WAN 973, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device. Fig. 12 illustrates, for example, that remote application programs 985 can be located on remote computer 980.

It should also be noted that the different examples described herein may be combined in different ways. That is, portions of one or more examples may be combined with portions of one or more other examples. All of these aspects are contemplated herein.

Example 1 is a computer-implemented method, comprising:

acquiring harvest data indicative of prior harvest operations on a field;

identifying regions in the field where crop plants have been harvested based on the harvest data;

detecting a characteristic of a crop plant in a field on a path of an agricultural harvester along a direction of travel;

generating a height metric based on the detected features, the height metric representing a height of crop plants on a particular region of the field; and

generating a control signal based on the height metric and the identified zone to control the agricultural harvester.
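The steps of Example 1, together with the ground-speed change of Example 6, can be sketched as follows. This is an illustrative assumption only: the function name, the speed values, and the 0.5 m threshold are hypothetical and are not claimed values.

```python
# Hypothetical sketch of the control flow in Examples 1 and 6: the harvester
# slows down over fallen (downed) crop so the header can gather it. All names
# and numbers here are illustrative assumptions.

NOMINAL_SPEED_KPH = 8.0
FALLEN_CROP_SPEED_KPH = 4.0

def ground_speed_command(height_metric_m, region_already_harvested,
                         height_threshold_m=0.5):
    """Map the height metric and the prior-harvest flag to a speed setpoint."""
    if height_metric_m < height_threshold_m and not region_already_harvested:
        # Short crop in an unharvested zone suggests fallen crop (Example 2):
        # slow the machine to improve gathering.
        return FALLEN_CROP_SPEED_KPH
    # Standing crop, or short crop that is merely stubble from the prior
    # harvest operation: run at nominal speed.
    return NOMINAL_SPEED_KPH
```

Note how the prior-harvest data changes the decision: the same low height metric yields a slowdown over an unharvested region but no slowdown over stubble.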

Example 2 is a computer-implemented method according to any or all of the previous examples, further comprising:

identifying crop plants on the particular region of the field as fallen crops based on the harvesting data and a determination that the height metric indicative of crop height is below a threshold; and

generating the control signal to control the agricultural harvester based on identification of fallen crop on the particular region of the field.

Example 3 is a computer-implemented method according to any or all of the preceding examples, wherein generating a control signal comprises controlling a controllable subsystem of the agricultural harvester.

Example 4 is a computer-implemented method according to any or all of the preceding examples, further comprising generating a recommendation to alter operation of the agricultural harvester.

Example 5 is a computer-implemented method according to any or all of the preceding examples, wherein the recommendation is generated based on the selected crop yield.

Example 6 is a computer-implemented method according to any or all of the preceding examples, wherein the recommendation comprises at least one of:

a change in ground speed of the agricultural harvester; or

A change in harvesting function on the agricultural harvester.

Example 7 is a computer-implemented method according to any or all of the previous examples, wherein generating the control signal comprises: controlling a display device to provide an indication of the recommendation.

Example 8 is a computer-implemented method according to any or all of the preceding examples, wherein the height metric comprises an average height of the crop plants over the particular area.

Example 9 is a computer-implemented method according to any or all of the previous examples, further comprising:

generating a representation of the amount of crop plant material on the path of the agricultural harvester; and

generating the control signal based on the amount of crop plant material.

Example 10 is a computer-implemented method according to any or all of the preceding examples, wherein the representation of the amount of crop plant material comprises an indication of biomass.

Example 11 is a computer-implemented method according to any or all of the preceding examples, wherein detecting a characteristic of the crop plant comprises:

receiving image data of crop plants in the particular region;

applying a geometric classifier to the image data; and

determining the height metric based on the geometric classifier.
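One way a geometric classifier might recover a height metric from image data, as in Example 11, is through simple pinhole-camera geometry. The following sketch is an assumption, not the claimed method: the camera parameters, the detection of the crop-top pixel row, and the separately measured forward distance (e.g., from stereo or lidar) are all hypothetical inputs.

```python
import math

# Minimal pinhole-camera sketch of turning image data into a height metric.
# All parameters are illustrative assumptions; a real system would calibrate
# the camera and detect the crop-top row in the image.

def crop_height_from_pixel(camera_height_m, camera_pitch_rad,
                           focal_px, top_row_offset_px, distance_m):
    """Estimate crop-top height above the ground for a detected crop-top pixel.

    top_row_offset_px: vertical pixel offset of the crop top from the
    optical axis (positive = above center).
    distance_m: forward distance to the crop plants.
    """
    # Angle of the crop-top ray relative to the optical axis.
    ray_angle = math.atan2(top_row_offset_px, focal_px)
    # Elevation of that ray relative to horizontal (camera pitched down by
    # camera_pitch_rad).
    elevation = ray_angle - camera_pitch_rad
    # Height of the crop top above the camera mount, then above the ground.
    return camera_height_m + distance_m * math.tan(elevation)
```

Averaging such per-pixel estimates over a region would then give the average-height metric of Example 8.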

Example 12 is a computer-implemented method according to any or all of the previous examples, further comprising:

identifying a plurality of regions of the field; and

for each given region of said field,

generating a corresponding height metric representing a height of a crop plant above a field surface over the given area;

comparing the corresponding height metric to a threshold;

classifying the given region based on the comparison; and

generating a control signal corresponding to the given region based on the classification.

Example 13 is a computer-implemented method according to any or all of the preceding examples, wherein classifying the given region comprises:

classifying the given area as a standing crop if the corresponding height metric value is above the threshold; and

classifying the given area as a non-standing crop if the corresponding height metric value is below the threshold.

Example 14 is a computer-implemented method according to any or all of the preceding examples, wherein classifying the given area as a non-standing crop comprises:

classifying the given region as crop stubble based on a determination that the given region was harvested during the prior harvest operation; and

classifying the given region as a fallen crop based on a determination that the given region was not harvested during the prior harvest operation.
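The three-way classification of Examples 12 through 14 can be sketched as follows; the function name and the threshold value are illustrative assumptions.

```python
# Sketch of the per-region classification in Examples 12-14: standing crop if
# tall enough, otherwise stubble or fallen crop depending on whether the
# region was harvested in the prior operation. Names and the default
# threshold are illustrative assumptions.

def classify_region(height_metric_m, was_harvested, threshold_m=0.5):
    """Return 'standing_crop', 'crop_stubble', or 'fallen_crop'."""
    if height_metric_m >= threshold_m:
        return "standing_crop"
    # Below the threshold: stubble if this region was harvested during the
    # prior harvest operation, otherwise fallen (downed) crop.
    return "crop_stubble" if was_harvested else "fallen_crop"
```

Running this over each identified region of the field, and generating a control signal per region from the resulting label, mirrors the loop described in Example 12.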

Example 15 is an agricultural harvester, comprising:

a controllable subsystem;

a forward looking crop sensor configured to generate a sensor signal indicative of a detected characteristic of a crop plant in a field on a path of an agricultural harvester along a direction of travel;

a crop state determination logic system configured to:

obtaining harvest data indicative of prior harvest operations on the field;

identifying, based on the harvest data, regions in the field where crop plants have been harvested;

generating a height metric based on the detected features, the height metric representing a height of crop plants on a particular region of the field; and

a control system configured to control the controllable subsystem based on the height metric and the identified zone.

Example 16 is an agricultural harvester according to any or all of the previous examples, wherein the control system is configured to:

generating a representation of an amount of crop plant material on a path of the agricultural harvester based on the height metric and the identified region; and

generating a recommendation based on the representation of the amount of crop plant material to alter the operation of the controllable subsystem.

Example 17 is an agricultural harvester according to any or all of the preceding examples, wherein the crop status determination logic system is configured to:

identifying a plurality of regions of the field,

for each given region of the field,

receiving image data of crop plants in the given region;

applying a geometric classifier to the image data; and

generating a corresponding height metric representing a height of a crop plant above the field surface over the given area;

comparing the corresponding height metric to a threshold; and

classifying the given region based on the comparison; and

the control system is configured to generate a control signal corresponding to the given region based on the classification.

Example 18 is an agricultural harvester according to any or all of the preceding examples, wherein the crop status determination logic system is configured to:

classifying the given area as a standing crop based on a determination that the corresponding height metric is above the threshold;

classifying the given region as crop stubble based on a determination that the corresponding height metric is below the threshold and that the given region is located within the identified zone; and

classifying the given area as a fallen crop based on a determination that the corresponding height metric is below the threshold and that the given area is not located within the identified zone.

Example 19 is a control system for an agricultural harvester, the control system comprising:

a harvest data processing logic system configured to receive harvest data indicative of prior harvest operations on a field, and to identify a region of the field in which crop plants have been harvested based on the harvest data;

a crop state classification logic system configured to:

detecting a characteristic of a crop plant in a field on a path of the agricultural harvester along a direction of travel; and

generating a height metric based on the detected features, the height metric representing a height of crop plants on a particular region of the field; and

a control logic system configured to generate a control signal to control the agricultural harvester based on the height metric and the identified zone.

Example 20 is a control system for an agricultural machine according to any or all of the preceding examples, wherein,

the crop state classification logic system is configured to:

identifying a plurality of regions of the field; and

for each given region of said field,

receiving image data of crop plants in the given region;

applying a geometric classifier to the image data; and

generating a corresponding height metric representing a height of a crop plant above a field surface over the given area;

comparing the corresponding height metric to a threshold;

classifying the given area as a standing crop based on a determination that the corresponding height metric is above the threshold;

classifying the given region as crop stubble based on a determination that the corresponding height metric is below the threshold and that the given region is located within the identified zone; and classifying the given area as a fallen crop based on a determination that the corresponding height metric is below the threshold and that the given area is not located within the identified zone; and

the control logic system is configured to generate a control signal corresponding to each given region based on the classification of the given region.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
