Agricultural system

Document No.: 1652079    Publication date: 2019-12-24

Note: This technology, "Agricultural system", was devised by L. Ferrari and J. H. Posselius on 2018-05-09. Abstract: A system comprising a beamforming sensor (210), the beamforming sensor (210) configured to acquire sensor data (212) representative of a swath (104) in an agricultural field (102); the system also includes a controller (214) configured to determine swath property data (216) based on the sensor data (212).

1. A system, comprising:

a beamforming sensor configured to acquire sensor data representing a swath in an agricultural field; and

a controller configured to determine swath property data based on the sensor data.

2. The system of claim 1, wherein the beamforming sensor comprises a beamforming radar sensor.

3. The system of claim 2, wherein the beamforming sensor comprises a phased array radar sensor.

4. The system of claim 1, wherein the beamforming sensor comprises a beamforming ultrasound sensor.

5. The system of any one of the preceding claims, wherein the swath property data comprises swath area data representing a cross-sectional area of the swath.

6. The system of claim 5, wherein the controller is configured to:

process the sensor data to determine:

(i) swath profile data representing a position of an outer surface of the swath; and

(ii) ground profile data representing a location of a surface of the ground; and

process the swath profile data and the ground profile data to determine the swath area data.

7. The system of claim 1, wherein the swath property data comprises swath volume data representing a volume of the swath.

8. The system of claim 1, wherein the swath property data comprises foreign object indicator data.

9. The system of claim 8, wherein the controller is configured to: set the foreign object indicator data to indicate detection of a foreign object if the power of a received imaging signal represented by the sensor data is greater than a power threshold level.

10. The system of claim 1, wherein the swath property data comprises swath moisture data and/or swath density data representing a moisture and/or a density of the swath.

11. The system of claim 10, wherein the controller is configured to set the swath moisture data and/or swath density data based on a phase and/or amplitude of the sensor data.

12. The system of claim 1, wherein the beamforming sensor is associated with an agricultural vehicle and is configured to acquire sensor data representing a swath in an agricultural field in proximity to the agricultural vehicle.

13. The system of claim 12, wherein the controller is configured to determine vehicle control instructions for the agricultural vehicle based on the swath property data.

14. The system of claim 13, wherein the vehicle control instructions comprise:

vehicle steering instructions for automatically controlling a direction of travel of the agricultural vehicle; and/or

vehicle speed instructions for automatically controlling a speed of the agricultural vehicle.

15. The system of claim 13, wherein the system further comprises an agricultural vehicle configured to operate in accordance with the vehicle control instructions.

Background

Determining attributes of a swath to be collected/picked up by an agricultural vehicle, such as a baler, forage harvester or windrower, may be beneficial for improving the operation of swath collection.

Disclosure of Invention

According to a first aspect of the invention, there is provided a system comprising:

a beamforming sensor configured to acquire sensor data representing a swath in an agricultural field; and

a controller configured to determine swath property data (swath-property-data) based on the sensor data.

Advantageously, the use of a beamforming sensor may provide highly directional sensor data due to the inherent properties of beamforming. This may enable accurate determination of the properties of the swath at a particular location.

The beamforming sensor may comprise a beamforming radar sensor. The beamforming sensor may comprise a phased array radar sensor. The beamforming sensor may comprise a beamforming ultrasound sensor.

The swath property data may include swath area data (swath-area-data) representing a cross-sectional area of the swath.

The controller may be configured to:

process the sensor data to determine:

(i) swath profile data (swath-profile-data) representing a position of an outer surface of the swath; and

(ii) ground profile data (ground-profile-data), which represents the position of the surface of the ground; and

process the swath profile data and the ground profile data to determine the swath area data.

The swath property data may include swath volume data (swath-volume-data) representing a volume of the swath.

The swath property data may include foreign object indicator data (foreign-object-indicator-data). The controller may be configured to set the foreign object indicator data to indicate that a foreign object has been detected if the power of a received imaging signal, as represented by the sensor data, is greater than a power threshold level.

The swath property data may comprise swath moisture data (swath-moisture-data) and/or swath density data (swath-density-data) representing the moisture and/or density of the swath.

The controller may be configured to set the swath moisture data and/or the swath density data based on the phase and/or amplitude of the sensor data.

The beamforming sensor may be associated with an agricultural vehicle and may be configured to acquire sensor data representing a swath in an agricultural field in the vicinity of the agricultural vehicle.

The controller may be configured to determine vehicle control instructions for the agricultural vehicle based on the swath property data. The vehicle control instructions may include: vehicle steering instructions for automatically controlling a direction of travel of the agricultural vehicle; and/or vehicle speed instructions for automatically controlling a speed of the agricultural vehicle.

The system may also include an agricultural vehicle configured to operate according to vehicle control instructions.

The vehicle control instructions may be configured to cause the output device to provide instructions to an operator of the agricultural vehicle to set a speed and/or direction of travel of the agricultural vehicle.

A computer program may be provided which, when run on a computer, causes the computer to configure any apparatus comprising a controller, processor, machine, vehicle or device disclosed herein or to perform any method disclosed herein. The computer program may be a software implementation and the computer may be considered any suitable hardware, including a digital signal processor, a microcontroller, and implementations in read-only memory (ROM), erasable programmable read-only memory (EPROM), or electronically erasable programmable read-only memory (EEPROM), as non-limiting examples.

The computer program may be provided on a computer readable medium, which may be a physical computer readable medium such as a disk or a memory device, or may be embodied as a transient signal. Such transient signals may be downloaded over a network, including the internet.

Drawings

Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:

FIG. 1a shows an example of an agricultural field;

FIG. 1b schematically shows a cross-section of a swath;

FIG. 2 schematically illustrates a system associated with determining swath property data;

FIGS. 3a and 3b show examples of locations where beamforming sensors may be positioned on an agricultural vehicle;

FIG. 4 shows an example of how a swath may be imaged by a beamforming radar sensor;

FIG. 5 shows another example of how a swath may be imaged by a beamforming radar sensor;

FIG. 6 schematically illustrates a system that may determine vehicle control instructions for an agricultural vehicle based on swath property data; and

FIG. 7 shows another example of how a phased array radar sensor may image a swath.

Detailed Description

Fig. 1a schematically shows an agricultural field 102. The field 102 includes rows of crop material, which may be hay, straw, or a similar product that has been left in the field 102 in the form of swaths 104. The swaths 104 are elongate rows of the product in question, which are piled up at the transverse center and tend to flatten out towards the respective transverse edges. Typically, as shown in Fig. 1a, a field 102 that has been harvested comprises a plurality of swaths 104 that are substantially parallel to one another. The swaths 104 are spaced apart from one another by a substantially uniform gap. By way of non-limiting example, the crop material in the swaths 104 may be picked up by an agricultural machine such as a baler, a forage harvester, or a windrower.

Fig. 1b schematically shows a cross-section of a swath 104 on the ground 106.

Fig. 2 schematically illustrates a system for determining swath property data 216, the swath property data 216 representing one or more properties of a swath in a field. The system includes a beamforming sensor 210 that can acquire sensor data 212 representing a swath in an agricultural field. As will be discussed in more detail below, the beamforming sensor 210 may be mounted on an agricultural machine (not shown) and may be operated while the agricultural machine is picking up a swath from the field. That is, the beamforming sensor 210 may have a field of view that contains a portion of the swath to be picked up.

The beamforming sensor 210 may transmit imaging signals (transmitted-imaging-signals) and receive imaging signals (received-imaging-signals) reflected from objects such as the swath or the ground. One or both of the transmitted imaging signals and the received imaging signals may be directional, beamformed signals. In some examples, it may be particularly advantageous to provide a beamforming sensor 210 that is configured to beamform the received imaging signals.

The system also includes a controller 214, which may determine swath property data 216 based on the sensor data 212. It will be appreciated that the controller 214 may be located on the agricultural machine, or remote from the agricultural machine. For example, the functions of the controller 214 may be performed on a remote server (such as a server "in the cloud").

Advantageously, due to the inherent properties of beamforming, the beamforming sensor 210 may provide highly directional sensor data 212, as will be described below. This may enable accurate determination of the properties of the swath at a particular location. For example, the presence or absence of the swath, and thus the position of the end of the swath, may be accurately determined. In some examples, this may be used to better control an agricultural vehicle, as will be described below.

In some examples, the beamforming sensor 210 may be provided as a beamforming radar sensor or a beamforming ultrasonic sensor. In this way, the beamforming sensor 210 can generate electromagnetic or ultrasonic signals capable of penetrating obstructions such as dust and fog that may be present in an agricultural environment. The use of a beamforming radar sensor or a beamforming ultrasonic sensor may therefore be considered beneficial compared with optical sensor systems. For a beamforming radar sensor, this is because it may use electromagnetic waves (e.g. radio waves or microwaves) with a wavelength long enough that scattering caused by such obstructions is low. Similarly, a beamforming ultrasonic sensor may use ultrasonic waves that are not significantly scattered by the obstructions. In this way, beamforming radar sensors and beamforming ultrasonic sensors can generate sensor data 212 that represents the object of interest (including the swath and/or the ground, as will be discussed below) better than can be achieved with optical sensors. Reliability can therefore be improved under challenging environmental conditions.

It is also advantageous that the beamforming sensor 210 may be used at night and in foggy conditions, which may not be possible or convenient with an optical system.

The sensor data 212 may represent: (i) distance to the detected object; and (ii) a direction from the beamforming sensor 210 to the detected object. The sensor data 212 may be provided as a plurality of coordinates representing locations from which received imaging signals have been received, and in this example, they are provided as polar coordinates.

Optionally, the sensor data 212 may also include: (iii) the power/amplitude of the received imaging signals; and/or (iv) a phase difference between the transmitted imaging signal and the corresponding received imaging signal. Any of the beamforming sensors described herein may be a two-dimensional beamforming sensor or a three-dimensional beamforming sensor.
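As an illustration of the kind of sensor data described above, the following sketch (in Python, with invented field names that are not an interface from this disclosure) shows one way a single polar-coordinate reflection could be represented and converted to Cartesian coordinates for further processing.

```python
# Illustrative sketch only: field names and structure are assumptions, not the patent's API.
import math
from dataclasses import dataclass

@dataclass
class Reflection:
    range_m: float      # (i) distance to the detected object, in metres
    azimuth_rad: float  # (ii) direction from the sensor to the detected object
    power_db: float     # (iii) optional: power/amplitude of the received imaging signal
    phase_rad: float    # (iv) optional: phase difference between transmitted and received signals

def to_cartesian(reflection: Reflection) -> tuple[float, float]:
    """Convert a polar reflection into (x, y) coordinates in the sensor frame."""
    x = reflection.range_m * math.cos(reflection.azimuth_rad)
    y = reflection.range_m * math.sin(reflection.azimuth_rad)
    return x, y

# Example: a reflection 3.2 m away, 10 degrees to the left of boresight.
sample = Reflection(range_m=3.2, azimuth_rad=math.radians(-10), power_db=-42.0, phase_rad=1.1)
print(to_cartesian(sample))
```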

The beamforming sensor may be implemented as a MIMO (multiple-input multiple-output) sensor, such as a MIMO radar sensor, or as a phased array sensor. The phased array sensor may be a phased array radar sensor or a phased array ultrasonic sensor. A phased array radar sensor may be configured: to beamform the signals transmitted by the radar transmit antenna; and/or to beamform the signals received at the radar receive antenna. (As is known in the art, the radar transmit antenna may be the same physical antenna as the radar receive antenna.)

The swath property data 216 may include swath area data representing a cross-sectional area of the swath. The cross-section may be in a direction transverse to the longitudinal direction of the elongate row of the swath, which may also be transverse to the direction of movement of the agricultural vehicle that is to pick up the swath. Such a cross-section is shown in Fig. 1b.

The swath property data 216 may include swath width data (swath-width-data) representing the lateral width of the swath.

The swath property data 216 may include swath height data (swath-height-data) representing the height of the swath.

The swath property data 216 may include swath center data (swath-center-data) representing the center of the swath. The swath center data may be one-dimensional, in that it may represent the lateral center of the swath (from side to side of the swath as shown in Fig. 1b) or the height center of the swath (from top to bottom of the swath as shown in Fig. 1b), or the swath center data may be two-dimensional, in that it may represent both the lateral center and the height center of the swath.

The swath property data 216 may include swath end data (swath-end-data) representing the position of an end of the swath. The swath end data may be one-dimensional, in that it may represent a lateral end of the swath or a height end of the swath. Alternatively, the swath end data may be two-dimensional, in that it may represent both a lateral end and a height end of the swath. The swath property data may also include swath profile data that represents a perimeter of the swath.
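By way of a hedged illustration, the sketch below shows how swath width, height, center and end data might be derived from swath profile data, if that profile is available as simple (lateral position, height) points; the profile values and the data structure are assumptions made for the example, not data from this disclosure.

```python
# Illustrative sketch: derive swath width/height/center/end data from an assumed
# list of (lateral_position_m, height_m) points along the outer surface of the swath.
def swath_geometry(profile):
    xs = [p[0] for p in profile]
    zs = [p[1] for p in profile]
    left_end, right_end = min(xs), max(xs)         # swath end data (lateral ends)
    width = right_end - left_end                   # swath width data
    height = max(zs)                               # swath height data (ground taken as 0)
    lateral_center = 0.5 * (left_end + right_end)  # swath center data (one-dimensional)
    return {"width": width, "height": height,
            "center": lateral_center, "ends": (left_end, right_end)}

# Example profile: a flattened mound roughly 1.5 m wide and 0.4 m tall.
profile = [(-0.75, 0.0), (-0.5, 0.2), (0.0, 0.4), (0.5, 0.2), (0.75, 0.0)]
print(swath_geometry(profile))
```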

The swath property data 216 may include swath volume data representing the volume of the swath. Further details of one example of how the controller 214 can determine the swath area data and the swath volume data are described below with reference to Fig. 4.

In some examples, the swath property data 216 includes foreign object indicator data that indicates whether a foreign object has been detected. Advantageously, by appropriately selecting the beamforming sensor (such as a beamforming radar sensor that emits radar waves at frequencies that can penetrate the swath), the sensor data 212 may represent objects within the swath. That is, objects present inside the swath may provide additional reflections represented by the sensor data 212. The presence of a foreign object inside the swath can therefore be identified in advance, and appropriate action can be taken on the agricultural vehicle before it picks up the portion of the swath containing the foreign object.

If the power/amplitude of the received radar imaging signal is greater than the power threshold level, the controller 214 may set the foreign object indicator data to indicate the presence of a foreign object. This is particularly useful for detecting metallic foreign objects, since metallic objects are known to provide reflected radar signals with high power.

Advantageously, the controller 214 may process the sensor data 212 to detect the presence of non-ferromagnetic foreign objects. The beamforming sensor 210 may therefore be used to detect objects that cannot be detected by a metal detector, such as indium and aluminium objects. As will be appreciated, picking up such foreign objects may cause significant damage to the agricultural machine that picks up the swath and/or to any machinery or people that subsequently handle the processed swath material, for example in a bale.

Further, the controller 214 may set the foreign object indicator data to indicate a foreign object based on the distance to the detected object; in particular, based on the difference between the distance to the detected object and the distance to the ground and/or the swath, as will be described below with reference to Fig. 5.

In some examples, the swath property data 216 may include swath moisture data representing a moisture of the swath. The controller 214 may set the swath moisture data based on the phase and/or amplitude of the sensor data 212. For example, one or more calibration operations may be performed to determine how the moisture of the swath affects the phase and/or amplitude of the received imaging signals, such that algorithm parameter values may be stored in memory, or a database may be loaded with appropriate reference values. Then, in use, the controller 214 may apply an appropriate algorithm (with the set parameter values), or use the data stored in the database, to determine the swath moisture data based on the received sensor data 212.
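The following sketch illustrates the calibration idea in minimal form: a lookup table built during calibration maps an observed attenuation of the received imaging signal to a swath moisture value by interpolation. The table values, and the choice of attenuation as the input quantity, are placeholder assumptions rather than figures from this disclosure.

```python
# Hedged sketch of a calibration lookup: the values below are invented placeholders.
import bisect

CAL_ATTENUATION_DB = [2.0, 5.0, 9.0, 14.0]     # attenuation of the received imaging signal
CAL_MOISTURE_PCT   = [10.0, 20.0, 35.0, 55.0]  # moisture measured during calibration runs

def moisture_from_attenuation(attenuation_db: float) -> float:
    """Linearly interpolate swath moisture from the stored calibration table."""
    if attenuation_db <= CAL_ATTENUATION_DB[0]:
        return CAL_MOISTURE_PCT[0]
    if attenuation_db >= CAL_ATTENUATION_DB[-1]:
        return CAL_MOISTURE_PCT[-1]
    i = bisect.bisect_right(CAL_ATTENUATION_DB, attenuation_db)
    x0, x1 = CAL_ATTENUATION_DB[i - 1], CAL_ATTENUATION_DB[i]
    y0, y1 = CAL_MOISTURE_PCT[i - 1], CAL_MOISTURE_PCT[i]
    return y0 + (y1 - y0) * (attenuation_db - x0) / (x1 - x0)

print(moisture_from_attenuation(7.0))  # ~27.5 % with this placeholder table
```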

In some examples, the swath property data 216 may include swath density data representing a density of the swath. The controller 214 may set the swath density data based on the phase and/or amplitude of the sensor data 212, in the same manner as for the swath moisture data discussed above.

Fig. 3a and 3b show examples of locations where beamforming sensors 310a, 310b may be positioned on an agricultural vehicle. In this example, the agricultural vehicle is a tractor that pulls a baler. In other examples, the agricultural vehicle may be a baler, a forage harvester, a tractor, or a windrower. Any of these vehicles may or may not be self-propelled.

In Fig. 3a, the beamforming sensor 310a is located on a lower portion of the agricultural vehicle such that it has a low field of view 318a. In Fig. 3b, the beamforming sensor 310b is located on an upper portion of the agricultural vehicle such that it has a high field of view 318b. The advantage of placing the radar at a higher position is that the field of view can be increased. However, this may involve a trade-off with reduced cross-range resolution, which is the ability to resolve objects in a plane perpendicular to the direction of wave propagation. The cross-range resolution may depend on the angular resolution and the distance.

The beamforming sensor may be associated with the agricultural vehicle in any manner such that it acquires sensor data representing a swath in an agricultural field in the vicinity of the agricultural vehicle. As shown in Figs. 3a and 3b, the field of view 318a, 318b of the beamforming sensor is located in front of the agricultural machine (in the direction in which the vehicle moves when picking up the swath), so that the sensor data represents a swath in front of the agricultural vehicle. In other examples, the beamforming sensor may have a field of view to the side of the agricultural machine (in a direction transverse to the direction in which the vehicle moves when picking up the swath), such that the sensor data represents a swath to the side of the agricultural vehicle. Such an example may be used to scan a parallel swath row that is to be picked up by the agricultural vehicle later. That is, swath property data may be acquired for a swath row that is different from the row currently being picked up by the agricultural vehicle. This may allow planned and future control operations to be determined before the agricultural machine picks up the parallel swath row.

In some examples, the beamforming sensor may be located on another vehicle (not shown) that is different from the agricultural machine that is to pick up the swath, but may still be considered to be associated with the agricultural machine; for example, because the other vehicle may be controlled such that it takes a route associated with the agricultural machine, or is otherwise positioned relative to the agricultural vehicle. The other vehicle may be manned or unmanned, and may be a land or air vehicle (an unmanned air vehicle may be referred to as a drone). The use of an air vehicle may enable sensor data to be acquired from the beamforming sensor at a relatively high altitude to obtain an overview of the field, thereby providing a wide field of view. Subsequently or alternatively, the air vehicle may be kept at a lower altitude, with the agricultural vehicle, for example by flying above or in front of the agricultural vehicle. The collected sensor data may be streamed to the controller and/or to the "cloud".

Fig. 4 shows an example of a beamforming radar sensor 410 used to acquire sensor data representing a swath on the soil. The sensor data represents two distinct profiles: (i) a swath profile 404 (as represented by a portion of the sensor data that may be considered swath profile data), which represents the position of the outer surface of the swath; and (ii) a ground profile 406 (as represented by a portion of the sensor data that may be considered ground profile data), which represents the position of the surface of the ground/soil. By appropriately selecting the operating frequency of the radar signals 422, 424, the transmitted imaging signals may penetrate the swath. In this way, received imaging signals 423, 425 due to both the swath profile and the soil can be detected.

Accordingly, the controller may process the swath profile data and the ground profile data to determine swath area data representing a cross-sectional area of the swath. Further, in examples using a three-dimensional beamforming radar sensor, the swath profile data and the ground profile data may represent three-dimensional profiles, and the controller may process them to determine swath volume data representing a volume of the swath.
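A minimal sketch of this processing is given below, assuming the swath profile and ground profile have been resampled onto common lateral positions; the trapezoidal integration and the sample values are illustrative choices, not taken from this disclosure.

```python
# Illustrative sketch: cross-sectional area between swath and ground profiles, and an
# approximate volume from a series of such cross-sections along the travel direction.
def swath_cross_section_area(lateral_m, swath_height_m, ground_height_m):
    """Trapezoidal integration of (swath surface - ground surface) across the swath."""
    area = 0.0
    for i in range(1, len(lateral_m)):
        dx = lateral_m[i] - lateral_m[i - 1]
        h0 = max(swath_height_m[i - 1] - ground_height_m[i - 1], 0.0)
        h1 = max(swath_height_m[i] - ground_height_m[i], 0.0)
        area += 0.5 * (h0 + h1) * dx
    return area

def swath_volume(section_areas_m2, spacing_m):
    """Approximate volume as the sum of cross-sectional areas times their along-track spacing."""
    return sum(a * spacing_m for a in section_areas_m2)

lateral = [-0.75, -0.25, 0.25, 0.75]
swath   = [0.05, 0.40, 0.40, 0.05]
ground  = [0.00, 0.02, 0.02, 0.00]
a = swath_cross_section_area(lateral, swath, ground)
print(a, swath_volume([a, a, a], spacing_m=0.5))
```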

Fig. 5 shows an example of a beamforming radar sensor 510 used to acquire sensor data representing a swath on the soil, in which swath a foreign object is located. In the same manner as in Fig. 4, the sensor data represents a swath profile 504 and a ground profile 506. In this example, the sensor data also includes foreign object data, as represented by a received imaging signal 527 that is reflected back to the beamforming radar sensor 510 by a foreign object 526 within the swath. As discussed above, if the power/amplitude of the received imaging signal 527 is greater than a power threshold level, the controller may classify that portion of the sensor data as foreign object data (and thus set the foreign object indicator data accordingly).

In some examples, the controller may classify a portion of the sensor data as foreign object data based on the distance to the detected object 526 (and thus set the foreign object indicator data accordingly). For example, the controller may determine the distance to the ground 506 and/or to the outer surface of the swath 504 for a particular direction. The controller may associate a received imaging signal with the ground or with the swath based on its correlation (e.g., in one or more of range/power/phase) with received imaging signals received from other directions. Then, if the sensor data also includes a received imaging signal 527 that is not sufficiently correlated with the swath profile data or the ground profile data, the controller may determine that the received imaging signal 527 is associated with a foreign object. In one example, if the received imaging signal 527 represents a reflection from an object that is more than a threshold distance from the swath profile 504 and/or the ground profile 506, the received imaging signal 527 may be considered insufficiently correlated with the swath profile data or the ground profile data.

In some examples, the controller may apply different power threshold levels to determine whether to classify a detected object as a foreign object. For example, if the received imaging signal 527 represents a reflection from an object that is:

less than a threshold distance from the swath profile 504 or the ground profile 506, the controller may compare the power of the received imaging signal 527 with a first power threshold; and

greater than a threshold distance from the swath profile 504 and/or the ground profile 506, the controller may compare the power of the received imaging signal 527 with a second power threshold.

The second power threshold (applied when a potential foreign object is not near the outer surface of the swath) may be lower than the first power threshold, on the basis that such a received imaging signal 527 is less likely to come from a discontinuity in the surface of the swath or the ground.
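The two-threshold logic described above might be sketched as follows; the threshold values and the distance metric are assumptions chosen purely for illustration.

```python
# Illustrative sketch of the distance-dependent power thresholds; values are assumptions.
FIRST_POWER_THRESHOLD_DB  = -30.0  # applied near the swath/ground profiles
SECOND_POWER_THRESHOLD_DB = -40.0  # lower threshold, applied away from the profiles
PROFILE_DISTANCE_THRESHOLD_M = 0.10

def is_foreign_object(power_db: float, distance_to_nearest_profile_m: float) -> bool:
    """Decide whether a single reflection should set the foreign object indicator data."""
    if distance_to_nearest_profile_m < PROFILE_DISTANCE_THRESHOLD_M:
        # Close to the swath or ground surface: demand a stronger echo before flagging.
        return power_db > FIRST_POWER_THRESHOLD_DB
    # Inside the swath, away from either surface: a weaker echo is already suspicious.
    return power_db > SECOND_POWER_THRESHOLD_DB

print(is_foreign_object(power_db=-35.0, distance_to_nearest_profile_m=0.25))  # True
print(is_foreign_object(power_db=-35.0, distance_to_nearest_profile_m=0.05))  # False
```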

Fig. 6 schematically illustrates a system that may determine vehicle control instructions 628 for an agricultural vehicle 630 based on swath property data 616.

The system includes a beamforming sensor 610, which may be any beamforming sensor described herein. The beamforming sensor 610 provides sensor data 612 to a controller 614. The controller 614 processes the sensor data 612 and determines swath property data 616, which may be any type of swath property data described herein. The controller 614 also processes the swath property data 616 to determine vehicle control instructions 628 for the agricultural machine 630. As discussed above, by way of non-limiting example, the agricultural machine 630 may be a baler, a forage harvester, a tractor, or a windrower, and may or may not be self-propelled.

The vehicle control instructions 628 may include vehicle steering instructions for automatically controlling the direction of travel of the agricultural machine 630. In this way, if the controller 614 determines that the agricultural machine 630 is not centered on the swath (e.g., by identifying an offset between (i) the lateral center of the swath, as defined by, for example, the swath center data, and (ii) the lateral center of the pickup/header of the agricultural machine 630), the controller 614 may provide vehicle control instructions 628 that cause the steering of the agricultural machine 630 to be adjusted so as to center the agricultural machine 630 relative to the swath (e.g., to reduce the offset). In some examples, the controller 614 may determine the center of the pickup/header of the agricultural machine 630, and/or the offset, by using a known relationship between the field of view of the beamforming sensor 610 and the center of the pickup/header of the agricultural machine 630. For example, the lateral center of the field of view of the beamforming sensor 610 may correspond to the lateral center of the pickup/header of the agricultural machine 630.
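As a hedged sketch of this centering behavior, the offset between the swath center and the pickup/header center could be converted into a proportional steering correction as below; the gain and limit values are illustrative assumptions, not parameters from this disclosure.

```python
# Illustrative sketch: proportional steering correction from the lateral offset.
def steering_instruction(swath_center_m: float, pickup_center_m: float,
                         gain_deg_per_m: float = 20.0, max_deg: float = 10.0) -> float:
    """Return a steering angle (degrees) that reduces the lateral offset to the swath."""
    offset = swath_center_m - pickup_center_m
    angle = gain_deg_per_m * offset
    return max(-max_deg, min(max_deg, angle))

# Swath center 0.3 m to the right of the pickup center -> steer right by 6 degrees.
print(steering_instruction(swath_center_m=0.3, pickup_center_m=0.0))
```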

In this way, the agricultural machine 630 may be autonomously controlled such that it picks up the swath in an improved manner (e.g., a manner that results in less of the swath being wasted/missed). That is, swath guidance may be provided, for example by identifying the swath profile.

The vehicle control instructions 628 may also, or alternatively, include vehicle speed instructions for automatically controlling the speed of the agricultural machine 630. For example, the controller 614 may determine swath area data or swath volume data (as swath property data 616) and provide vehicle speed instructions based on the swath area data or the swath volume data. In one example, the controller 614 may provide vehicle speed instructions that automatically increase the speed of the agricultural machine 630 when the sensor data 612 indicates a decrease in the value of the swath area data or the swath volume data, and vice versa. In some examples, the controller 614 may apply an algorithm to the swath area data or the swath volume data to determine the vehicle speed instructions. In other examples, the controller 614 may use a database or lookup table to determine the vehicle speed instructions based on the swath area data or the swath volume data.
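One possible form of such a lookup table approach is sketched below; the table entries mapping swath cross-sectional area to a target speed are invented placeholders, not values from this disclosure.

```python
# Illustrative sketch: feed-forward speed selection from a placeholder lookup table,
# so that a larger upcoming swath leads to a lower target speed (roughly constant intake).
SPEED_TABLE = [  # (swath cross-sectional area in m^2, target speed in km/h)
    (0.10, 12.0),
    (0.25, 9.0),
    (0.40, 6.5),
    (0.60, 4.5),
]

def vehicle_speed_instruction(swath_area_m2: float) -> float:
    """Pick the target speed for the largest table entry not exceeding the measured area."""
    speed = SPEED_TABLE[0][1]
    for area, table_speed in SPEED_TABLE:
        if swath_area_m2 >= area:
            speed = table_speed
    return speed

print(vehicle_speed_instruction(0.3))  # 9.0 km/h with this placeholder table
```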

In this way, feed-forward control of the speed of the agricultural vehicle 630 (which may be a tractor towing a baler) may be performed, for example based on the volume or cross-sectional area of the swath.

In some examples, the controller 614 may determine vehicle control instructions 628 based on the foreign object indicator data (which is an example of swath property data 616). For example, the controller 614 may determine vehicle speed instructions for automatically stopping the agricultural vehicle 630 in front of a detected foreign object. In some examples, the controller 614 may also cause an output device (such as a display or audio device) to provide information about the detected foreign object to an operator of the agricultural vehicle 630; for example, its position in the swath or its size, or any other information that will assist the operator in removing the foreign object.

In some examples, the controller 614 may determine vehicle steering instructions for automatically steering the agricultural vehicle 630 around a detected foreign object. Further, in some examples, the controller 614 may determine vehicle pickup instructions for automatically controlling the pickup/header of the agricultural machine 630 such that it does not pick up the swath near the foreign object. For example, the vehicle pickup instructions may automatically control the pickup/header such that the pickup/header is raised in front of the detected foreign object (or the agricultural vehicle 630 is otherwise placed in a non-pickup mode), and is then lowered behind the detected foreign object (or the agricultural vehicle 630 is otherwise placed in a pickup mode).
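A simple sketch of this raise/lower behavior is shown below, using assumed distance margins around the detected foreign object; it is illustrative only and not a control law from this disclosure.

```python
# Illustrative sketch: pickup raise/lower decision from the along-track distance to the object.
RAISE_MARGIN_M = 1.0  # raise the pickup this far before the object
LOWER_MARGIN_M = 1.0  # lower it again this far after the object

def pickup_command(distance_to_object_m: float) -> str:
    """Return 'raise', 'lower' or 'hold'.

    Positive distances mean the object is still ahead of the pickup; negative
    distances mean it has already been passed.
    """
    if 0.0 <= distance_to_object_m <= RAISE_MARGIN_M:
        return "raise"   # object is imminent: enter non-pickup mode
    if distance_to_object_m < -LOWER_MARGIN_M:
        return "lower"   # object safely behind: resume pickup mode
    return "hold"        # keep the current pickup state

print(pickup_command(0.5), pickup_command(-1.5), pickup_command(5.0))
```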

In some examples, vehicle control instructions 628 may cause an output device (such as a display or audio device in the cab of agricultural vehicle 630) to provide instructions to the operator of agricultural vehicle 630 to set the travel speed and/or travel direction of agricultural vehicle 630.

In this way, foreign object detection may be performed before the foreign object (such as a stone or a metal object) is harvested. This is particularly useful for self-propelled forage harvesters, where indium or aluminium objects may not be detected by a metal detection system. Detecting such foreign objects before harvesting is particularly advantageous because the operator of the agricultural machine can remove the foreign object from the swath more quickly and easily than from inside the agricultural machine. This can therefore make the crop harvesting operation much more efficient than if the foreign object were taken into the agricultural machine 630.

Fig. 7 shows another example of how a phased array radar sensor 710 may image a swath 704.

The phased array radar sensor 710 is associated with an agricultural vehicle, which in this example is a tractor 730. The phased array radar sensor 710 includes a plurality of antennas, wherein: (i) the transmitted signals are phase shifted relative to each other before being provided to the transmit antennas such that, collectively, they combine constructively in a particular direction; and/or (ii) the received signals are phase shifted relative to each other such that, collectively, they combine constructively in a particular direction. The phase shifts of the transmitted and/or received signals are controlled by a controller 714 such that a particular beam direction θ is achieved. The controller 714 may be the same as, or different from, the controller that determines the swath property data (not shown in Fig. 7).

In this way, beamforming techniques may be used to scan the surface, and thereby acquire swath profile data and ground profile data (as examples of surface profile data), as discussed above. Beamforming may be thought of as the ability to drive each antenna/transducer of an antenna/transducer array with a different signal phase. In this way it is possible to obtain a microwave/ultrasound beam with high directivity, because the interference between the different signals generated by each antenna/transducer can be varied. Using beamforming techniques, a beam (for transmit, receive, or both) may be focused on a plurality of points, possibly each specific point, in the field of view of the beamforming sensor 710, so that the controller can evaluate the reflection from each point and identify the location of the swath.
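As a minimal sketch of the phase-shift calculation that underlies this beam steering, a uniform linear array steered to an angle theta needs a progressive per-element phase shift of 2*pi*d*sin(theta)/lambda; the operating frequency and element spacing below are assumptions for the example, not parameters from this disclosure.

```python
# Illustrative sketch: per-element phase shifts for a uniform linear phased array.
import math

def element_phase_shifts(num_elements: int, spacing_m: float,
                         wavelength_m: float, steer_angle_rad: float):
    """Phase shifts (radians) so the element signals add constructively at the steer angle."""
    k = 2.0 * math.pi / wavelength_m
    return [(k * spacing_m * n * math.sin(steer_angle_rad)) % (2.0 * math.pi)
            for n in range(num_elements)]

# Example: 8-element array, half-wavelength spacing at an assumed 77 GHz carrier,
# steered 15 degrees off boresight.
wavelength = 3e8 / 77e9
print(element_phase_shifts(8, spacing_m=wavelength / 2, wavelength_m=wavelength,
                           steer_angle_rad=math.radians(15)))
```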

From the above description it will be appreciated that the distance and angle of arrival associated with each reflection can be obtained by appropriate post-processing of the sensor data. Further, by processing the shape of the received imaging signal, such as the radar signal waveform (e.g., amplitude, phase), the controller can determine physical and/or chemical information (e.g., density, moisture) about the swath.

It will be appreciated that any of the control operations disclosed herein, such as setting the travel speed or direction of travel of the baler or the associated tractor, may be performed by comparing data with one or more thresholds, by applying an algorithm to the data, or by using a lookup table/database to determine control values based on the received/determined data.
