Active seeker system

Document No.: 1145906 | Publication date: 2020-09-11

Reading note: This invention, "Active seeker system", was designed and created by Shahar Levy on 2018-09-05. Its main content is as follows: Active seeker systems for detecting and/or tracking moving targets are disclosed herein. The system comprises: an illumination module that generates an illumination beam output from the system; an optical assembly for shaping a solid angle of a field of view (FOV) of the output beam; an optical path scanning module adapted to angularly deflect an output optical path of the output beam about a scanning axis to perform one or more scanning cycles; and an imaging module adapted to image light, in a spectral range of the beam, from a particular field of view around the output optical path of the beam. In some cases, the solid angle of the output beam is shaped such that it has an elongated FOV cross-section extending along a particular side of the beam, and the output optical path is angularly deflected in a direction transverse to the long axis of the elongated FOV of the beam so as to sweep the elongated FOV of the beam, thereby covering the desired area of interest (FOR) with a one-dimensional scan. In some implementations, the system is adapted to monitor the FOR for detecting and tracking a target. The monitoring may include a target detection phase, during which the FOV of the light beam is set to be smaller than the extent of the FOR and the FOR is imaged in a scanning imaging mode. The monitoring may include a target tracking phase, during which one or more imaging parameters are adjusted according to particular estimated attributes associated with the target being tracked.

1. An active seeker system comprising:

an illumination module configured and operable to generate a light beam for propagation along an output optical path of the system;

an imaging module operable to image light in a spectral range of the beam of light from the output optical path;

an optical assembly comprising a beam shaper adapted to shape the beam to form an output beam having an elongated cross-section extending along a particular transverse axis across the output optical path for illuminating a particular field of view; and

an optical path scanning module configured and operable to angularly deflect a direction of the output optical path about a scanning axis to perform a one-dimensional scanning cycle by sweeping a field of view of the elongated output beam to cover an area of interest (FOR).

2. The system of claim 1, wherein the beam shaper is configured and operable to shape the beam such that a field of view of the elongated output beam has a line shape and a transverse aspect ratio between a wide transverse dimension and a narrow transverse dimension of the elongated beam is about 40.

3. The system of claim 1 or 2, wherein the elongated output beam extends to cover a first lateral dimension of the FOR, and the optical path scanning module is configured and operable to perform the one-dimensional scanning cycle by: deflecting a direction of the output optical path to scan a particular angular range around the scanning axis so as to sweep the beam to cover a second dimension of the FOR being imaged in the one-dimensional scanning cycle.

4. The system of any preceding claim, comprising a control system connectable to the imaging system and configured and operable for monitoring the region of interest while searching for the target by receiving a sequence of images captured by the imaging system during the one-dimensional scan cycle, and processing the images to detect the target illuminated by the beam; the detecting includes determining whether return light reflected/scattered by the light beam from the target is captured by one or more pixels of at least one image in the sequence of images.

5. The system of claim 4, wherein the control system is configured and operable for determining the one or more pixels by determining whether an intensity of light captured by the pixels exceeds a particular predetermined signal-to-noise ratio (SNR) threshold.

6. The system of claim 4 or 5, wherein the intensity of the return light reflected/scattered from the target and captured by the one or more pixels of the at least one image is a function of τ and 1/R⁴, where R is the distance between the imaging module and the target and τ is the exposure time of the at least one image; and wherein the controller is adapted to estimate a change in the distance R of the target during the scan and to dynamically adjust the exposure time τ of the images with the change in the distance, thereby reducing the total time T of the scan cycle while keeping the SNR of the target detection above a predetermined threshold.

7. The system of claim 5 or 6, wherein the intensity of the return light scattered from the target and captured by the one or more pixels in the sequence of images is a function of T/(R⁴Ω), where R is the distance between the imaging module and the target, Ω is the solid angle of the total area of interest (FOR) covered by the scan cycle, and T is the duration of the scan cycle; and wherein the controller is adapted to perform a plurality of scan cycles for tracking the target while dynamically adjusting at least one of a duration T of the scan cycle and a solid angle Ω of the FOR scanned in the scan cycle.

8. The system of claim 7, wherein, during tracking of the target, the controller is adapted to dynamically reduce the duration T of the scanning cycle as the distance R to the target becomes shorter, thereby dynamically increasing the rate 1/T at which FOR frames are scanned during the one-dimensional scanning cycles.

9. The system of claim 7 or 8, wherein the controller is adapted to determine an angular velocity of the target during tracking of the target and to dynamically adjust a solid angle Ω of the FOR covered by one or more scan cycles, thereby optimizing the tracking flexibility.

10. The system of any preceding claim, wherein the imaging system comprises a one-dimensional photodetector array, and wherein the imaging system is configured to image a field of view of the elongated output beam on the one-dimensional photodetector array.

11. The system of any one of claims 1 to 9, wherein the imaging system comprises a two-dimensional photodetector array.

12. The system of claim 11, wherein the imaging system is configured to have a field of view greater than a field of view of the illumination module, whereby the field of view illuminated by the elongated output beam is imaged onto a subset of pixels of the two-dimensional photodetector array.

13. The system of claim 12, comprising a false alarm processor adapted to process images captured by the two-dimensional photodetector array to determine false detection of a target by comparing parameters of light intensities captured by a subset of the pixels to corresponding noise-related parameters associated with light intensities measured by other pixels of the two-dimensional photodetector array.

14. The system of claim 13, wherein the noise-related parameter is estimated using one or more of: an average background clutter level and slope associated with the standard deviation of the clutter, an estimated size of structures in the image, and a comparison between the images.

15. The system of any preceding claim, wherein the scan axis is a transverse axis across the output optical path.

16. The system of claim 15, wherein the scan axis is not orthogonal to the particular transverse axis.

17. The system of claim 16, wherein the scan axis is parallel to the particular transverse axis.

18. The system of any preceding claim, wherein the optical path scanning module comprises a gimbal rotatable about the scanning axis, the gimbal being configured and operable for performing the angular deflection of the direction of the output optical path.

19. The system of claim 18, wherein the gimbal comprises an actuation module for rotating about the scan axis.

20. The system of any one of claims 18 or 19, wherein the imaging module is mounted on the gimbal.

21. The system of any one of claims 18 or 19, wherein the imaging module is external to the gimbal and has a fixed imaging optical path relative to the gimbal, and wherein the optical assembly comprises one or more imaging optical elements disposed on the gimbal and configured and operable for directing light from the output optical path to propagate along the fixed imaging optical path for imaging by the imaging module.

22. The system of any one of claims 15 to 20, wherein the illumination module is mounted on the gimbal.

23. The system of any one of claims 15 to 20, wherein the illumination module is external to the gimbal and has a fixed light projection optical path along which the light beam emanates from the illumination module, and wherein the optical assembly comprises one or more light-guiding optical elements arranged on the gimbal and configured and operable for guiding the light beam from the fixed light projection optical path along the output optical path.

24. The system of any one of claims 1 to 18, wherein the optical path scanning module comprises one or more scanning optical deflectors configured and operable for performing the angled deflection of the direction of the output optical path.

25. The system of claim 24, wherein the one or more scanning optical deflectors comprise at least one MEMS turning mirror.

26. The system of claim 24 or 25, wherein the illumination module has a fixed light projection path along which the light beam emanates from the illumination module; and wherein the optical module comprises one or more optical elements for directing light from the projection optical path to propagate along the output optical path defined by the one or more scanning optical deflectors.

27. The system of any one of claims 24 to 26, wherein the imaging module has a fixed imaging optical path for imaging light arriving along the fixed imaging optical path; and wherein the optical module comprises one or more optical elements for directing light from the output optical path defined by the one or more scanning optical deflectors to propagate along the imaging optical path for imaging by the imaging module.

28. The system of claim 4, wherein upon detection of the target, the control system is configured and operable to initiate a target tracking phase to steer a platform carrying the system toward the target.

29. The system of claim 28, wherein during the target tracking phase, the control system stops operation of the one-dimensional scan and initiates a snapshot mode of operation in which the output beam is directed in a forward direction to continuously illuminate the target; the control system operates the beam shaping module to adjust the cross-section of the output beam such that the transverse aspect ratio of the cross-section is in the range of 1 to 2 (i.e., approximately 1), and expands the solid angle of the field of view of the output beam so as to cover the angular extent of the target.

30. The system of claim 28 or 29, wherein the beam shaping module comprises an optical element configured and operable for adjusting a transverse aspect ratio of the output light beam.

31. The system of any one of claims 28 to 30, wherein the beam shaping module comprises an adjustable beam expander configured and operable for expanding a field of view of the light beam.

32. The system of claim 31, wherein during the target tracking phase, the control system processes images captured by the imaging module to estimate a number of pixels illuminated by light returning from the target, and upon determining that the number of pixels exceeds a predetermined threshold, operates the beam expander module to expand the beam and the field of view of the imaging assembly.

33. The system of any one of claims 28 to 32, wherein the control system comprises a steering controller connectable to a steering module of the platform and configured and operable for operating the steering module to steer the platform towards the target.

34. The system of claim 33, wherein the steering controller comprises a closed-loop controller adapted to process images acquired by the imaging module and operate the steering module so as to minimize a difference between identities of pixels in successive images illuminated by light returning from the target, thereby minimizing an angular velocity of the platform relative to the target and directing the platform toward the target.

35. An active seeker system comprising:

an illumination module configured and operable for generating an illumination beam for propagation along an output optical path of the system;

an optical assembly including a beam shaper adapted to adjust a field of view (FOV) solid angle of the beam to form an output beam having the adjusted FOV propagating along an output optical path of the system;

an imaging module operable to image light in a spectral range of a light beam from a particular FOV around the output optical path;

an optical path scanning module configured and operable to angularly deflect a direction of the output optical path about a scanning axis to perform one or more scanning cycles; and

a control system configured and operable for operating the optical assembly, the imaging module and the optical path scanning module to monitor an area of interest (FOR) for detecting and tracking a target; whereby the monitoring comprises performing a target detection phase for detecting the target within a predetermined FOR, wherein the target detection phase comprises:

(i) setting the FOR to a first range;

(ii) operating the optical assembly to adjust the FOV of the light beam to a range less than the FOR;

(iii) operating the scanning module and the imaging system in a scanning imaging mode to obtain FOR frame image data by scanning the FOR with the illumination beam and capturing a plurality of images of different portions of the FOR; and

(iv) processing the plurality of images to identify image pixels in the plurality of images that are indicative of the target, thereby detecting the target;

and when the target is detected, performing a target tracking phase for tracking the target.

36. The system of claim 35, wherein in (iii), the control system is configured and operable to dynamically adjust exposure times of the plurality of images during capture of the plurality of images, thereby optimizing the time required for the detection of the target.

37. The system of claim 36, wherein the exposure time is adjusted based on an estimated distance of the target during the detection phase.

38. The system of any one of claims 35 to 37, wherein performing the tracking phase comprises sequentially capturing a plurality of FOR frame image data indicative of the FOR, and processing each of the FOR frame image data to identify a target in each of the FOR frame image data; and wherein the controller is adapted to dynamically adjust at least one of the following parameters of the capture of one or more of the FOR frame image data during the tracking:

(i) a solid angle range of the FOR captured in each FOR frame image data;

(ii) a frame rate 1/T for the sequential capture of the FOR frame image data;

(iii) a selected imaging mode for capturing the FOR frame image data, whereby the selected imaging mode is selected as one of a scan imaging mode and a snapshot imaging mode.

39. The system of claim 38, wherein the solid angle range of the FOR covered in capturing a particular FOR frame image data is adjusted based on an estimated angular velocity of the target appearing in a preceding FOR frame image data.

40. The system of claim 38 or 39, wherein the frame rate 1/T used to capture a particular FOR frame image data is adjusted based on an estimated SNR of the target in a preceding FOR frame image data.

41. The system of claim 38 or 39, wherein the selected imaging mode for capturing particular FOR frame image data is adjusted based on an estimated distance of the target.

Technical field and background

A seeker head (also referred to hereinafter interchangeably as a seeker) is a guidance system, typically mounted on a movable platform such as a missile, that is operable to capture/image radiation (e.g., emitted and/or reflected/scattered radiation) returning from a target and process the captured/imaged radiation to detect the target and then track and follow it.

Seeker types can be divided into two main categories: passive and active. Passive seekers do not actively illuminate the target with a radiation beam, while active seekers usually comprise a radiation source for illuminating the target.

An example of a passive seeker is disclosed, for example, in U.S. Patent No. 5,389,791, which describes a conventional rate-stabilized IR seeker with an improvement that yields a wider FOV and much less image blur. The improvement comprises a circular optical wedge rotatably mounted in front of the optical assembly of the seeker sensor. The combination of the primary scan vector generated by the circular sensor of the seeker and the secondary scan vector generated by the rotating wedge produces a scan pattern with a wider overall FOV than that of the sensor alone, while providing points with zero or near-zero spatial scan velocity. Such points enable "snapshot" data to be collected with little or no image blur.

A disadvantage of the passive seeker type is that the signal emission (intensity) from the target cannot be controlled by the interceptor and may depend on many unpredictable parameters, such as the target's emissivity, temperature, and trajectory. Thus, the signal-to-noise ratio (SNR) of such a seeker may be relatively low. This is because passive seekers typically do not include any radiation source and do not actively illuminate the target, but are merely configured to detect radiation emitted/scattered/reflected from the target.

It should be noted that in the following description, the term SNR is used to denote the ratio of the signal strength to the instrument noise of the detector and the background (e.g., clutter) radiation sensed by the detector. The above disadvantage is particularly evident when highly agile detection and tracking is required to follow high-angular-velocity targets (due to high target speed or large misalignment), since in this case the detection should be performed at a high rate (short exposure time); therefore, the signal strength obtained in each image, and the corresponding SNR, may be poor and may prevent reliable detection and/or tracking by the passive seeker.

In some cases, to overcome the disadvantages of passive seekers, active seekers are used, which typically include a radiation source (e.g., a light source) in addition to a radiation detector. The radiation source is used to actively illuminate the target so that radiation of sufficient intensity returns from the target. Active seeker systems therefore generally provide improved seeking performance: since the target is actively illuminated by the active seeker, the signal strength, and the corresponding SNR of the light returned from the target, may be much higher than in passive seeker systems.

Examples of active seeker systems are disclosed, for example, in U.S. Patent No. 5,200,606 and U.S. Patent No. 4,024,392. U.S. Patent No. 5,200,606 discloses an apertured mirror that allows the outgoing transmitted laser beam to pass through and reflects the returning reflected beam onto a detector array; U.S. Patent No. 4,024,392 discloses a gimbaled active optical system having a laser beam output that coincides with the instantaneous field of view of the gimbal system over a large angle, creating an active laser seeker head.

SUMMARY

As mentioned above, active seekers are commonly used for detecting and tracking moving targets, especially when there is a need for reliable and agile detection and tracking of fast moving targets (e.g., targets with high angular velocity).

A major drawback of conventional active seekers is that they require a powerful radiation source to illuminate the area of interest (FOR), i.e., the entire solid angle in which the target is expected, in order to reliably detect the target within a sufficiently short detection time T. Due to the need for a powerful radiation source, the seeker payload carried by the platform (e.g., missile) must also include a relatively large and heavy energy source to power the radiation source. As a result, the weight, and possibly the size, of the active seeker payload (including the seeker itself and its power source) may become too heavy/large to be carried by an agile platform (e.g., an interceptor). This places severe limitations on the platforms and missions with which an active seeker may be employed (e.g., depending on the size of the platform carrying the active seeker and/or the duration of the platform's missions).

The present invention provides a novel active seeker system and method for improving the above-mentioned shortcomings of conventional active seeker systems. Advantageously, with the seeker system and method of the present invention, the need for using a high power radiation source is greatly relaxed.

The present invention is based on the inventors' understanding that in order to achieve reliable homing to fast moving targets, different radiation exposure schemes with different and/or dynamically adjusted imaging exposure/integration times τ and/or with different and/or dynamically adjusted total FOR Frame Capture Times (FCT) should be employed during different phases of the detection operation and tracking operation of the system. In this regard, it should be noted that the phrase fast moving target is used herein to indicate a target that may attain (as viewed from the seeker reference frame) a large angular velocity relative to the platform (e.g., interceptor). To this end, it should be understood that the phrases exposure time τ and/or integration time are used interchangeably herein to refer to the exposure time of an imaging module used to capture an image. The phrases "FOR frame capture time" and/or "FOR capture time" (also abbreviated herein as FCT) are used herein to refer to the total time required to capture image data indicative of the entire FOR. When operating in snapshot mode, the entire FOR is captured in a single image, so the FOR Frame Capture Time (FCT) T is effectively equal to the exposure time τ. However, when operating in scan mode, M images are grabbed to cover the entire FOR, and thus in this case, the FOR Frame Capture Time (FCT) may be the sum of the exposure times τ of the M images (according to the present invention, the exposure times of the M images may be different and dynamically adjusted).
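The exposure-time and frame-capture-time relations described above can be sketched as follows (a minimal illustration; the function names are assumptions, not from the patent):

```python
def fct_snapshot(tau):
    """Snapshot mode: the entire FOR is captured in one image, so the
    FOR Frame Capture Time (FCT) T equals the exposure time tau."""
    return tau

def fct_scan(exposures):
    """Scan mode: M images cover the FOR, so the FCT T is the sum of
    the (possibly unequal, dynamically adjusted) exposure times."""
    return sum(exposures)

# e.g., three scan steps with growing exposures give T = 0.06 s,
# whereas a snapshot with tau = 0.06 s gives the same T in one image.
```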

Thus, according to various embodiments of the present invention, the active seeker is configured and operable for dynamically adjusting various parameters of the illumination and imaging schemes, whereby the target is illuminated and imaged according to the operating phase of the system (e.g., the target detection phase and/or the target tracking phase) and/or according to the distance of the target from the seeker. The adjusted parameters may include, for example, the illumination intensity, the FOR (Ω), the imaging exposure time (τ), the FCT (T), and the like.
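As an illustration of such dynamic adjustment, the τ/R⁴ dependence of the returned intensity noted in the claims implies that, to hold the received signal at a reference level as the range shrinks, the exposure time can be reduced with the fourth power of the range. A hedged sketch (the helper name and reference values are assumptions for illustration):

```python
def adjust_exposure(tau_ref, r_ref, r):
    """Scale the exposure time so that the received intensity
    (~ tau / R**4) stays at its reference level as the target
    range R changes (hypothetical helper, assumed names)."""
    return tau_ref * (r / r_ref) ** 4

# Halving the range allows a 16x shorter exposure for the same
# received intensity, shortening the total scan-cycle time T.
```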

In this regard, the following should be understood. The term FCT is used herein to denote the total time (T) that the system takes to capture the image(s) covering the entire FOR. Such a set of images (one set per FOR capture cycle) is referred to herein as FOR frame data/images. FOR frame data may be acquired by a scanning imaging technique, in which case the information of the FOR frame consists of several (e.g., M) images of different FOVs captured during a scan cycle. Alternatively or additionally, FOR frame data may be acquired by a snapshot imaging technique, in which case the information of the detection frame typically consists of one snapshot image with a FOV equal to or greater than the FOR. In this context, the term area of interest (FOR) is used to denote the entire solid angle imaged in each frame period, and the term "illumination and imaging mode" is used to refer to an illumination and imaging technique for capturing a FOR frame image of the FOR in which the target is expected to reside. This may include a snapshot imaging mode and a scanning imaging mode.

In snapshot imaging mode, the system is run at a frame rate whereby, in each frame period, the illuminating radiation floods the entire FOR simultaneously, and a snapshot of the entire FOR is captured on the detector/imager. In this case, the solid angle field of view (FOV) of the projected illuminating radiation is set to cover the entire FOR, and is thus equal to or larger than the solid angle of the FOR. Accordingly, the FOV of the imaging system is also set equal to or greater than the FOR in this mode.

It should be noted that the term field of view (FOV) as used herein generally refers to the solid angle of imaging and/or illumination. In various implementations of the system according to the invention, the FOVs of the imaging and illumination may not be similar, and more particularly may have different ranges/solid angles. To this end, the imaging FOV (denoted herein as I-FOV) is the total FOV of the imaging system. The illuminator FOV is the solid angle of the illuminator beam, called the L-FOV.

In a scanning imaging mode, the illuminating radiation is projected onto the FOR at a field-of-view (FOV) solid angle much smaller than the FOR. In this case, the frame period corresponds to a scanning cycle during which the illuminating radiation is scanned to cover the entire FOR. Each "step" of the scan cycle lasts for a particular "integration period", during which the imager/detector is operated to capture at least the portion of the FOR illuminated by the beam of illuminating radiation (i.e., at least the portion of the FOR that overlaps the FOV of the illuminating beam). Thus, at the end of a scanning cycle, a plurality of images is obtained, which together constitute detection frame information indicative of the radiation response (reflection/scattering) obtained from the illumination beam scanned over the whole FOR.
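The scanning cycle described above can be sketched as a simple loop (all hardware hooks here are hypothetical placeholders, not the patent's API):

```python
def scan_cycle(deflect, illuminate, capture, m_steps, exposures):
    """Assemble FOR frame data from M scan steps.

    deflect(step)   -- steer the output optical path to the next FOV
    illuminate()    -- project the beam onto this portion of the FOR
    capture(tau)    -- integrate the imager for tau seconds, return image
    exposures       -- per-step integration times tau_i (may differ)
    """
    frame = []
    for step, tau in zip(range(m_steps), exposures):
        deflect(step)
        illuminate()
        frame.append(capture(tau))
    return frame  # together: one FOR frame of M images
```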

To this end, according to various embodiments of the present invention, one or more of the above-described parameters (i.e., one or more of (1) the illumination and imaging mode, (2) the frame rate, and (3) the solid angle of the FOR) are dynamically adjusted during system operation in order to optimize the illumination and, more specifically, the SNR of the radiation returned from the target during system operation.

For example, according to some embodiments, the system is configured and operable in two modes of operation: a detection mode and a tracking mode. During the detection mode, there may be no information about the position of the target relative to the seeker, or only approximate information (e.g., information provided from an external source, such as a long-range radar system). In this case, assuming that the target may be very far from the seeker, the seeker is operated in a scanning imaging mode such that the SNR of the signal returned from the target is optimized (e.g., maximized) for a given illumination/radiation source (with a given output intensity) of the system. The scanning imaging mode may provide this SNR optimization because, in this mode, the FOV of the illumination beam is much smaller than the FOR. Thus, when operating in a scanning mode, the flux of illuminating radiation falling on the target, and correspondingly the instantaneous intensity of radiation returning (reflected/scattered) from the target, is stronger. Accordingly, the received SNR is enhanced.

More specifically, the inventors of the present invention have noted that the scanning imaging mode is advantageous compared to the snapshot imaging mode in terms of the SNR obtained for a given frame rate (for a given scan cycle duration) and illumination power P (total output power of the illumination module of the system). This is because in scan mode, during each scan cycle (whose duration T corresponds to the frame rate), a sequence of multiple images (e.g., M images) of the FOR is acquired, whereby each image is grabbed/integrated over a smaller integration time (also referred to herein as exposure time) τ such that the sum of all M exposure times equals T (the exposure times are not necessarily equal). Thus, the amount of noise N (i.e., clutter noise, and possibly instrument noise) collected in each image is smaller compared to the snapshot imaging mode, in which a single image is captured over an integration time τ that matches the frame period (τ = T). On the other hand, in scan mode, only a portion of the solid angle Ω of the FOR (the FOV of the illuminator in scan mode) is illuminated during the integration time τ over which each image is captured. This portion is the solid angle φ of the FOV of the illuminator and may be up to M times smaller than the solid angle Ω of the FOR, Ω = Mφ (since M images are used to cover the FOR); thus, for a given illuminator power P, the flux (power delivered per unit area) is up to M times larger in scan mode.
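The SNR argument above can be illustrated numerically under a simple shot-noise-like model (the model and all names are assumptions for illustration, not the patent's formula):

```python
import math

def toy_snr(power, omega, n_images, frame_time, noise_rate=1.0):
    """Toy SNR for covering a FOR of solid angle omega with n_images
    images in total frame time T (n_images = 1 is snapshot mode).
    Signal ~ illuminator flux * integration time; noise ~ sqrt of the
    background collected during the integration time (assumed model)."""
    tau = frame_time / n_images   # per-image exposure (equal split)
    phi = omega / n_images        # illuminated solid angle per image
    signal = (power / phi) * tau  # flux on the target * exposure
    noise = math.sqrt(noise_rate * tau)
    return signal / noise

# With M scan steps the per-image signal is unchanged (flux is M times
# larger, exposure M times shorter), while the per-image noise shrinks,
# so the SNR improves by a factor of sqrt(M) in this toy model.
```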

Another advantage of a shorter integration time is that the target spans fewer pixels on the detector in each frame, an effect that improves the SNR: during the exposure time, the target may cross pixels on the detector due to its angular velocity relative to the seeker.

However, scanning imaging modes may be advantageous in terms of SNR, and in some cases/scenarios scanning also introduces some difficulties, especially when tracking moving targets. This is because scanning imaging modes, particularly where high speed scanning is used, may result in side effects such as scan jitter (e.g., due to vibrations or other disturbances), and/or so-called rolling shutter effects, which may in turn result in less accurate detection and positioning of the target. In addition, high speed scanning may require a fast motor to operate the scanning gimbals/mirrors and stabilize them against vibration. These in turn can be heavy and power consuming (requiring a heavy/large power source) and are therefore less suitable for use with a seeker mounted on a relatively small and flexible platform. Furthermore, scanning may be less efficient when it involves detection and tracking of fast moving targets. This is due in particular to the so-called rolling shutter effect of the scanning. More specifically, in the case of an angular velocity of the target relative to the system (which is relatively large relative to the angular velocity of the seeker, FOR example, relative to a platform carrying the seeker), a rolling shutter effect may be exhibited because, when the image scans capture a set of consecutive images of different portions of the FOR, a rapidly moving target (having a high angular velocity) within the FOR may be lost, FOR example, at an integration time τ1The target is located in the FOV-B portion of the FOR and has moved to FOV-A by the time τ 2 the scan moves to illuminate and image the FOV-B portion of the FOR. Thus, height scanning is preferably avoided in cases where the target may move at a high angular velocity relative to the system, due to the rolling shutter effect of the scanning. 
In this regard, it should be understood that the term rolling shutter effect/artifact is used herein to refer to effects associated with scanning, and not to a particular shutter/readout operation of the imaging sensor, which may, in various implementations/embodiments, be configured for operation with a global and/or rolling shutter.

The present invention provides several techniques that, individually or in combination, allow the above-described drawbacks of the scanning imaging mode to be alleviated while also obtaining the SNR benefits of the scanning imaging mode, which allows the use of less powerful and therefore smaller and lighter illumination modules and power sources.

According to some embodiments, the seeker of the invention is configured for operating a one-dimensional scanning pattern to cover the FOR. In this case, an elongated illumination beam (e.g., having a cross-sectional aspect ratio of about 40:1) is scanned in one direction across (e.g., perpendicular to) the longer dimension of its cross-section (typically a transverse scan of the beam, or a rotation of the beam cross-section about an axis substantially parallel to its optical path). The use of one-dimensional scanning with an elongated beam has several advantages over two-dimensional scanning (e.g., raster scanning) of a typically point-like beam. Two-dimensional scanning typically has a "fast" scan axis, along which the spot of the illuminating beam moves very fast, and a "slow" scan axis, along which the beam spot moves much more slowly. One-dimensional scanning eliminates the need to scan along the "fast" scan axis, since in a 1D scan the entire span of the "fast" axis is illuminated and captured in each snapshot image of the scan. This in turn removes the need for a fast scan actuator (e.g., a gimbal motor) along the fast axis. In this case the system employs a one-dimensional scan of the elongated illumination beam over the FOR, wherein the broad side/dimension of the beam cross-section completely covers one dimension of the angular span of the FOR, while at each scanning step a snapshot image is captured with a FOV covering the entire FOV of the elongated illumination beam. Thus, in embodiments of the seeker of the present invention employing one-dimensional active scanning of the FOR, a fast and powerful scanning actuator may not be required; a slower and less powerful actuator (gimbal motor) may suffice, and accordingly a smaller and lighter battery/energy source. Furthermore, when using a slower one-dimensional scan, the jitter effects of the scan are also mitigated.
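The saving from eliminating the fast scan axis can be illustrated with a small sketch (the beam and FOR dimensions are assumed for illustration only): tiling the FOR with a point-like beam requires a 2D grid of snapshots, whereas an elongated beam spanning one full dimension of the FOR needs only a 1D sequence.

```python
import math

def raster_steps(for_w, for_h, beam_w, beam_h):
    """Number of snapshots needed to tile a FOR with a given beam footprint
    (all angular extents in mrad; values assumed for illustration)."""
    return math.ceil(for_w / beam_w) * math.ceil(for_h / beam_h)

# 40 x 40 mrad FOR; point-like 1 x 1 mrad beam vs elongated 40 x 1 mrad beam:
steps_2d = raster_steps(40.0, 40.0, 1.0, 1.0)   # 2D raster scan
steps_1d = raster_steps(40.0, 40.0, 40.0, 1.0)  # 1D scan with elongated beam
```

With these assumed numbers, the 1D scan needs 40 snapshots instead of 1600, which is why a much slower (and lighter) actuator suffices.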

Furthermore, one-dimensional scanning is also advantageous with respect to the rolling shutter effect. In particular, in this case the rolling shutter effect may only be present along the single scan axis, and its artifacts along this axis are also smaller and less pronounced, since the elongated beam sweeps the FOR along only one dimension, thereby reducing the chance that targets present in the FOR will be missed by the scan.

Furthermore, according to some embodiments of the invention, the one-dimensional scan is performed along a preferred scan direction, relative to which the angular velocity of the target with respect to the seeker is expected to be relatively small. This reduces the above-mentioned rolling shutter effect, which largely depends on the angular velocity of the target, and thus improves the reliability of the system in detecting and tracking the target. Indeed, in some implementations of the system the scan direction may be controlled/adjusted (e.g., by a controller of the seeker) so that it can be set to the preferred direction, while in other embodiments the scan direction is fixedly set to the preferred direction. For example, in many cases the velocity of the target in the horizontal direction/plane is expected to be much greater than its vertical velocity. In turn, the angular velocity of the target relative to the seeker about the horizontal axis (yaw plane) may be higher than its angular velocity relative to the seeker about the vertical axis (pitch plane). Thus, in some embodiments, the scanning direction is set/adjusted/controlled to scan in the vertical direction (in elevation), whereby the illumination beam is shaped to be elongated horizontally (in yaw). This substantially eliminates the rolling shutter effect that may otherwise arise due to the angular velocity of the target in the horizontal/yaw plane.
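A toy selection rule along these lines might look as follows (a sketch only; the axis names and the decision rule are assumptions, not the invention's control law): scan across the axis on which the target's expected angular velocity is smaller, and elongate the beam along the other axis.

```python
def preferred_scan_direction(omega_yaw, omega_pitch):
    """Pick the 1D scan axis: scan along the axis on which the target's
    expected angular velocity (rad/s, assumed estimates) is smaller,
    elongating the beam along the other axis."""
    if abs(omega_pitch) <= abs(omega_yaw):
        return "vertical"    # scan in elevation; beam elongated horizontally (yaw)
    return "horizontal"      # scan in azimuth; beam elongated vertically (pitch)

# Ground targets typically move faster in yaw than in pitch:
direction = preferred_scan_direction(omega_yaw=0.2, omega_pitch=0.01)
```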

It should be noted that, as described in the various embodiments above, the scanning imaging mode may also be continued after the target is detected. For example, the scan imaging mode may be maintained as long as the distance of the target is large and prevents a sufficient SNR from being obtained in the snapshot imaging mode. However, in some embodiments, once a target is detected and during the tracking mode, various parameters may be dynamically adjusted (e.g., based on the distance and velocity of the target): (1) the illumination and imaging mode; (2) the exposure time τ; (3) the frame rate 1/T; and (4) the solid angle of the FOR. These adjustments optimize the illumination, or more specifically the SNR of the radiation returned from the target, during the tracking operation, while also possibly optimizing the frame rate 1/T and thereby the agility of tracking the target. Additionally, where one-dimensional scanning is employed, the direction of the one-dimensional scan may be dynamically adjusted based on the angular velocity of the target relative to the seeker, as described above, in order to mitigate/reduce the rolling shutter effect of the scanning.

Thus, the system may continue to operate in the scanning imaging mode, for example, until the target is close enough so that, for a given illuminator of the system, the SNR of the return radiation from the target is sufficiently high (i.e., above a particular desired SNR threshold) even if operating in the snapshot imaging mode.

To this end, according to some embodiments of the present invention, the controller of the system changes the mode of operation of illumination and imaging to snapshot mode when the target is close enough to occupy a large portion of the FOR, resulting in a sufficiently high SNR. This is advantageous because the angular velocity of the target relative to the system reaches higher values as the target gets closer.

Thus, operation in the snapshot mode avoids the rolling shutter effect of scanning and provides more reliable tracking. In this mode, the FOV of the illumination beam is expanded/adjusted (e.g., with a beam expander) to cover/fill the entire FOR at once, and the FOV of the imaging may also be correspondingly adjusted so as to cover at least the FOV of the illumination beam. Thus, images of the entire FOR are captured sequentially without scanning. As will be described below, the frame rate 1/T of the imaging may be dynamically adjusted/increased as the target approaches, depending on the SNR from the target, as long as the SNR remains above a certain minimum threshold level. This enables better tracking agility for close-range targets that may be moving at high angular velocities relative to the seeker.

Alternatively or additionally, according to some embodiments, the system is configured and operable for adjusting the frame rate/period of imaging according to the estimated distance between the target and the system (and/or possibly according to the mode of operation of the system, whether detection mode or tracking mode). Indeed, the farther away the target, the smaller the portion of the solid angle of the FOR that the target occupies. Moreover, for a given illumination intensity flux, the return (scattered flux) received by the detector/imager is proportional to R⁻⁴, where R is the distance to the target. Thus, according to some embodiments of the present invention, the frame period of imaging (whether in scan mode or snapshot mode) is adjusted approximately in proportion to R⁴, such that for a given intensity of the illumination beam, a sufficient radiation signal (sufficient intensity) returned from the target will be collected on the detector/imager. This in turn dynamically optimizes (e.g., stabilizes) the SNR of the detected signal from the target while the target may be at different distances.
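Under the stated R⁻⁴ scaling of the returned flux, holding the per-frame collected energy roughly constant implies a frame period scaling as R⁴. A minimal sketch (the reference values are arbitrary assumptions):

```python
def frame_period(t_ref, r_ref, r):
    """Frame period keeping per-frame collected target energy roughly constant:
    returned flux ~ R**-4, so the period must scale as R**4.
    t_ref/r_ref are an assumed reference operating point."""
    return t_ref * (r / r_ref) ** 4

t_far = frame_period(t_ref=0.02, r_ref=1000.0, r=1000.0)   # reference distance
t_near = frame_period(t_ref=0.02, r_ref=1000.0, r=500.0)   # half the distance
```

Halving the distance permits a 16x shorter frame period, i.e. a 16x higher frame rate for the same collected signal, consistent with the scaling in the text.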

In this regard, it should be noted that, in practice, the intensity of the radiation returned from the target depends not only on the distance from the target, but also on the size of the target, or more specifically, on the cross-sectional area of the target covering the region of interest. To this end, in some embodiments, the frame rate is adjusted according to an estimated/determined number of detector pixels illuminated by radiation returning (scattered/reflected) from the target. This is because the number of pixels illuminated is actually indicative of the size of the target in the FOR (cross-sectional projection) and the distance of the target.

Furthermore, in some embodiments, the solid angle of the FOR itself is dynamically adjusted during operation of the system, for example according to the mode of operation of the system and/or according to the distance of the target or the number of detector pixels illuminated by radiation returning from the target. More specifically, for example, during a detection mode of operation of the system, during which a target is searched for and has not yet been detected/identified, the FOR of the system may be set to cover a relatively large solid angle (e.g., in the range of 6·10⁻⁶ to 50·10⁻⁶ steradians), such that a large FOR is covered in searching for a target during each frame period. After the target has been detected (i.e., during the tracking mode of operation), the FOR of the system may be set to cover a substantially reduced solid angle, as long as it is large enough to ensure that the moving target remains within the FOR, thereby enabling continuous tracking. In this regard, it is noted that the smaller the FOR, the higher the illumination flux from a given illuminator (e.g., the radiation/light source of the system), and correspondingly, for a given frame rate and a given target distance, the better the SNR of the signal returned from the target and captured by the detector. This therefore allows a desired SNR to be achieved at a higher frame rate, or a reduced-power illumination source to be included in the system, while still achieving sufficient illumination of the target for tracking it even over a relatively large range of distances.

In this regard, it should also be noted that, in general, the farther away the target (the greater its distance from the seeker system), the smaller the angular velocity of the target relative to the system (assuming the target has a particular given velocity), and thus a smaller FOR suffices in successive FOR frame capture cycles to ensure that the target remains within the FOR of the seeker. Thus, in some implementations, during the tracking mode, the system operates to change the FOR of the system as a function of the estimated distance of the target from the system and/or as a function of the angular velocity of the target relative to the system (the angular velocity of the target may be estimated, for example, by comparing changes in the position of the target between successive frames). In general, the FOR may be adjusted to cover a larger solid angle as the target becomes closer to the system and/or as the relative angular velocity of the target increases. The controller is configured to dynamically adjust the FOR during the tracking mode to increase its solid angle, typically as the target approaches, to ensure (1) that the target remains within the FOR in successive frames, and (2) preferably (within the limits of the system) that the entire target is captured in each frame during most of the tracking mode (i.e., until the target is very close to the system).
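One possible way to size the FOR from the target's estimated angular velocity is sketched below (the margin factor and numbers are assumptions): the per-axis angular half-width must at least cover the worst-case drift of the target during one frame period.

```python
def required_for_halfwidth(omega_max, t_frame, margin=2.0):
    """Per-axis angular half-width (rad) keeping the target inside the FOR
    over one frame: worst-case drift omega_max * t_frame, times an assumed
    safety margin."""
    return margin * abs(omega_max) * t_frame

hw_slow = required_for_halfwidth(omega_max=0.05, t_frame=0.02)  # distant target
hw_fast = required_for_halfwidth(omega_max=0.5, t_frame=0.02)   # close target
```

A closer (hence faster-appearing) target demands a proportionally wider FOR, matching the behavior the controller implements.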

According to yet another aspect of the invention, the system is configured and operable for further improving the SNR of target detection by providing and utilizing means for assessing the amount of clutter radiation in the target's surroundings. To this end, according to some embodiments, the imager and the illumination beam are directed to cover substantially overlapping FOVs, while the FOV of the imager is adjusted/configured to be larger than the FOV of the illumination beam, extending beyond it at least during the scan imaging mode. It is therefore expected that radiation signals returned from the target are provided only by those pixels of the detector whose field of view overlaps the FOV of the projected illumination beam (since the signals from the target are based on radiation of the illumination beam reflected from the target). Other pixels, whose field of view does not overlap the FOV of the illumination beam, are considered to sense clutter radiation only (because these pixels cannot receive illumination-beam radiation reflected/scattered from the target). Thus, according to some embodiments, the controller processes some pixels outside the L-FOV to estimate the level of clutter noise.
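The clutter-baseline idea can be sketched as follows (a simplified 1D model with hypothetical values): pixels outside the illuminated L-FOV supply the clutter statistics, and a pixel inside the L-FOV is flagged only if it exceeds that baseline by k standard deviations.

```python
import statistics

def detect_with_clutter_baseline(row, lfov, k=5.0):
    """Flag illuminated pixels exceeding mean + k*std of clutter-only pixels.

    row  -- one line of pixel intensities (hypothetical units)
    lfov -- (lo, hi) index range of pixels inside the illuminated L-FOV
    k    -- assumed SNR threshold in clutter standard deviations
    """
    lo, hi = lfov
    clutter = row[:lo] + row[hi:]             # pixels outside the L-FOV
    mu = statistics.mean(clutter)
    sigma = statistics.pstdev(clutter) or 1e-12
    return [i for i in range(lo, hi) if row[i] > mu + k * sigma]

row = [1.0, 1.2, 0.9, 1.1, 9.0, 1.0, 1.1, 0.95, 1.05]
hits = detect_with_clutter_baseline(row, lfov=(3, 6))
```

Here only the bright pixel inside the L-FOV survives the test; fluctuations of the same order as the clutter baseline are rejected.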

Thus, in accordance with one salient aspect of the present invention, a novel and inventive active seeker system and method is provided. The active seeker system/method of this aspect includes:

an illumination module configured and operable for generating an illumination beam for propagation along an output optical path of the system;

an optical assembly including a beam shaper adapted to adjust a field of view (FOV) solid angle of a light beam to form an output light beam having an adjusted FOV propagating along an output optical path of the system;

an imaging module operable to image light in a spectral range of the light beam from a particular FOV around the output optical path;

an optical path scanning module configured and operable to angularly deflect a direction of the output optical path about a scanning axis to perform one or more scanning cycles; and

a control system configured and operable for operating the optical assembly, the imaging module and the optical path scanning module to monitor a region of interest (FOR) for detecting and tracking the target. According to this aspect of the invention, the monitoring comprises performing a target detection phase for detecting targets within a predetermined FOR, wherein the target detection phase comprises:

(i) setting FOR to a first range;

(ii) operating the optical assembly to adjust the FOV of the light beam to a range smaller than the FOR;

(iii) operating the scanning module and the imaging system in a scanning imaging mode to obtain FOR frame image data by scanning the FOR with the illumination beam and capturing a plurality of images of different portions of the FOR; and

(iv) the plurality of images are processed to identify image pixels indicative of the target in the plurality of images and thereby detect the target.

In some embodiments, in operation (iii) above, the control system is configured and operable to dynamically adjust exposure times of the plurality of images during capture of the plurality of images, thereby optimizing the time required for detection of the target. For example, the exposure time may be dynamically adjusted during the detection phase based on the estimated distance of the target.

According to some implementations, upon detecting an object, the system performs an object tracking phase for tracking the object. In some embodiments, performing the tracking phase includes sequentially capturing a plurality of FOR frame image data indicative of FOR and processing each of the FOR frame image data to identify a target therein. The controller is adapted to dynamically adjust at least one of the following parameters of capturing one or more FOR frame image data during tracking:

(i) a solid angle range of FOR captured in each FOR frame image data;

(ii) a frame rate 1/T for sequential capture of the FOR frame image data;

(iii) a selected imaging mode FOR capturing FOR frame image data, wherein the selected imaging mode may be one of a scan imaging mode and a snapshot imaging mode.

According to some implementations, the solid angle range of the FOR captured in particular FOR frame image data is adjusted based on an estimated angular velocity of the target appearing in the preceding FOR frame image data.

According to some implementations, the frame rate 1/T used to capture particular FOR frame image data is adjusted based on an estimated SNR of the target in the preceding FOR frame image data.

According to some implementations, the selected imaging mode used to capture the particular FOR frame image data is adjusted based on the estimated distance of the target.
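The three per-frame adjustments listed above might be combined into a single update step as in the sketch below (all thresholds, gains and the specific rules are illustrative assumptions, not values taken from the invention):

```python
def adjust_tracking_params(est_omega, est_snr, est_range,
                           t_frame=0.02, snr_min=6.0, snapshot_range=300.0):
    """One tracking-loop update of (i) FOR solid angle, (ii) frame rate,
    (iii) imaging mode.  All thresholds are illustrative assumptions."""
    return {
        # (i) FOR solid angle ~ square of worst-case angular drift per frame
        "for_solid_angle": (2.0 * est_omega * t_frame) ** 2,
        # (ii) raise the frame rate only while the SNR stays above a floor
        "frame_rate": 50.0 if est_snr > snr_min else 25.0,
        # (iii) switch to snapshot mode once the target is close enough
        "mode": "snapshot" if est_range < snapshot_range else "scan",
    }

p = adjust_tracking_params(est_omega=0.5, est_snr=10.0, est_range=200.0)
```

For a close, fast target with good SNR this yields snapshot mode at the higher frame rate with a widened FOR, matching the qualitative rules in the text.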

In accordance with yet another salient aspect of the present invention, a novel and inventive active seeker system and method is provided. The active seeker system/method of this aspect includes:

an illumination module configured and operable to generate a light beam for propagation along an output optical path of the system;

an imaging module operable to image light in a spectral range of the light beam from the output optical path;

an optical assembly comprising a beam shaper adapted to shape the beam to form an output beam having an elongated cross-section extending along a particular transverse axis across the output optical path for illuminating a particular field of view; and

an optical path scanning module configured and operable to angularly deflect a direction of the output optical path about a scanning axis to perform a one-dimensional scanning cycle by sweeping a field of view of the elongated output beam to cover an area of interest (FOR).

According to some embodiments, the beam shaper is configured and operable for shaping the beam such that the field of view of the elongated output beam has a linear shape, with a transverse aspect ratio between the wide and narrow transverse dimensions of the elongated beam of about 40:1. Alternatively or additionally, according to some embodiments, the elongated output beam extends to cover a first transverse dimension of the FOR. The optical path scanning module is configured and operable to perform a one-dimensional scanning cycle by deflecting the output optical path in a direction scanning a particular angular range about the scanning axis, so as to sweep the beam to cover the second dimension of the FOR imaged in the one-dimensional scanning cycle.

In some embodiments, the system includes a control system connectable to the imaging system and configured and operable for monitoring a region of interest while searching for a target by receiving a sequence of images captured by the imaging system during a one-dimensional scan cycle. The control system is adapted to process the image to detect the target illuminated by the beam. The detecting includes determining whether return light of the light beam reflected/scattered from the target is captured by one or more pixels of at least one image in the sequence of images. For example, the control system is configured and operable to determine one or more pixels by determining whether the intensity of light captured by the pixels exceeds a particular predetermined signal-to-noise ratio (SNR) threshold.

In this regard, in general, the intensity of return light reflected/scattered from a target and captured by one or more pixels in at least one image is an increasing function of τ and a decreasing function of R, where R is the distance between the imaging module and the target and τ is the exposure time of the at least one image. According to some embodiments, the controller is adapted to estimate the change in the distance R to the target during scanning, and to use this change to dynamically adjust the exposure time τ of the images, thereby reducing the total time T of the scanning cycle while keeping the SNR of target detection above a predetermined threshold.

More specifically, in some cases the intensity of return light scattered from a target and captured by one or more pixels in the sequence of images may be proportional to T/(R⁴·Ω), where R is the distance between the imaging module and the target, Ω is the solid angle of the total region of interest (FOR) covered by the scan cycle, and T is the duration of the scan cycle. To this end, according to some embodiments, the controller is adapted to perform a plurality of scan cycles for tracking the target while dynamically adjusting at least one of the duration T of the scan cycle and the solid angle Ω of the FOR scanned in the scan cycle.
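The T/(R⁴·Ω) scaling can be checked numerically with a short sketch (the numbers are arbitrary): halving the target distance permits either a 16x shorter scan cycle or, at the same cycle time, a 16x larger FOR solid angle, for the same collected signal.

```python
def relative_signal(t_cycle, r, omega_solid):
    """Relative per-cycle return signal, per the scaling S ~ T / (R**4 * Omega).
    Units are arbitrary; only ratios between calls are meaningful."""
    return t_cycle / (r ** 4 * omega_solid)

s_ref = relative_signal(t_cycle=1.0, r=1000.0, omega_solid=1e-5)
# Target twice as close: same signal with a 16x shorter cycle ...
s_fast = relative_signal(t_cycle=1.0 / 16, r=500.0, omega_solid=1e-5)
# ... or, keeping the cycle time, with a 16x larger FOR solid angle:
s_wide = relative_signal(t_cycle=1.0, r=500.0, omega_solid=16e-5)
```

This mirrors the trade-off the controller exploits: as R shrinks, either the frame rate 1/T or the FOR solid angle Ω can be increased without losing SNR.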

For example, during tracking of the target, the controller is adapted to dynamically decrease the duration T of the scanning cycle as the distance R of the target becomes shorter, thereby dynamically optimizing the frame rate 1/T of the FOR frames scanned during the one-dimensional scanning cycles. Alternatively or additionally, the controller is adapted to determine the angular velocity of the target during tracking and to dynamically adjust the solid angle Ω of the FOR covered by one or more scan cycles, thereby optimizing the agility of the tracking.

According to some embodiments, the system comprises a one-dimensional photodetector array, and wherein the imaging system is configured to image the field of view of the elongated output beam on the one-dimensional photodetector array.

Optionally, the imaging system comprises a two-dimensional photodetector array, and preferably the imaging system is configured to have a field of view larger than the field of view of the illumination module, thereby imaging the field of view illuminated by the elongated output beam onto a subset of pixels of the two-dimensional photodetector array. To this end, the system may comprise a noise estimator module (e.g. a false alarm processor) adapted to process images captured by the two-dimensional photodetector array to determine false detection of the target by comparing parameters of light intensities captured by a subset of the pixels with corresponding noise related parameters associated with light intensities measured by other pixels of the two-dimensional photodetector array. The noise-related parameter may be estimated, for example, based on one or more of: average background clutter level and slope associated with standard deviation of clutter, estimated size of structures in images, comparison between images.

According to various embodiments of the present invention, the scan axis of the 1D scan is a horizontal axis transverse to the output optical path of the system. Generally, the scanning axis is not orthogonal to the particular transverse axis along which the elongated cross-section of the illumination beam extends (so that the beam is swept across the region of interest), and preferably the scanning axis may be parallel to that particular transverse axis of the elongated cross-section of the illumination beam.

For example, according to some embodiments of the present invention, the optical path scanning module comprises a gimbal rotatable about the scanning axis, the gimbal being configured and operable to perform the angular deflection of the output optical path direction, the gimbal comprising an actuation module for rotation about the scanning axis.

In various implementations, the imaging module may be mounted (e.g., directly mounted) on the gimbal, or the imaging module may be located external to the gimbal (and have a fixed imaging optical path relative to the gimbal). In the latter case, the optical assembly of the system may include one or more imaging optical elements disposed on the gimbal, the one or more imaging optical elements configured and operable to direct light from the output optical path to propagate along a fixed imaging optical path for imaging by the imaging module.

In various implementations, the illumination module may be mounted (e.g., directly mounted) on the gimbal, or the illumination module may be located external to the gimbal and have a fixed light projection path along which the light beam emanates from the illumination module. In the latter case, the optical assembly may include one or more light-guiding optical elements disposed on the gimbal, the one or more light-guiding optical elements configured and operable to guide the light beam from the fixed light projection optical path to propagate along the output optical path.

Instead of or in addition to using a gimbal, the optical path scanning module may include one or more scanning optical deflectors configured and operable to perform angular deflection of the output optical path direction. For example, the one or more scanning optical deflectors include at least one MEMs turning mirror.

To this end, in some implementations, the illumination module may have a fixed light projection path along which the light beam emanates from the illumination module; and the optical module may include one or more optical elements for directing light from the projection optical path to propagate along an output optical path defined by the one or more scanning optical deflectors. Alternatively or additionally, the imaging module may have a fixed imaging optical path for imaging light arriving therealong, and the optical assembly may include one or more optical elements for directing light from an output optical path defined by the one or more scanning optical deflectors and propagating along the imaging optical path for imaging by the imaging module.

According to various embodiments of the invention, upon detection of a target, the control system is configured and operable to initiate a target tracking phase to steer the platform carrying the system towards the target.

In some embodiments, during the target tracking phase, the control system stops operation of the one-dimensional scan and initiates a snapshot mode of operation in which the output beam is directed in a forward direction to continuously illuminate the target and the beam shaping module is operated to adjust the cross-section of the output beam such that the transverse aspect ratio of the cross-section of the output beam is approximately 1 (e.g., in the range of 1 to 2) and the solid angle of the field of view of the output beam is expanded to cover the angular range of the target. To this end, the beam shaping module includes an optical element configured and operable for adjusting a transverse aspect ratio of the output light beam. Further, the beam shaping module may include an adjustable beam expander configured and operable to expand the field of view of the light beam.

In some embodiments, during the target tracking phase, the control system processes images captured by the imaging module to estimate a number of pixels illuminated by light returning from the target, and operates the beam expander module to expand the beam and the field of view of the imaging assembly upon determining that the number of pixels exceeds a predetermined threshold.

According to some embodiments, the control system includes a steering controller connectable to a steering module of the platform and configured and operable for operating the steering module to steer the platform to the target. For example, the steering controller may comprise a closed loop controller adapted to process images acquired by the imaging module and operate the steering module so as to minimize differences between identities (identities) of pixels in successive images illuminated by light returned from the target, thereby minimizing the angular velocity of the platform relative to the target and directing the platform towards the target.
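The closed-loop criterion of minimizing the change in the identities of target-lit pixels between successive images can be approximated by nulling the inter-frame shift of their centroid, as in this sketch (the gain and the proportional law are assumptions, not the invention's controller):

```python
def centroid(pixel_indices):
    """Mean index of the pixels lit by target return (a crude position proxy)."""
    return sum(pixel_indices) / len(pixel_indices)

def steering_command(prev_pixels, curr_pixels, gain=0.5):
    """Proportional command driving the inter-frame centroid shift to zero,
    i.e. nulling the platform-to-target angular rate (assumed control law)."""
    return -gain * (centroid(curr_pixels) - centroid(prev_pixels))

# Target drifted by 4 pixels between frames -> corrective command issued:
cmd = steering_command(prev_pixels=[10, 11, 12], curr_pixels=[14, 15, 16])
```

When the same pixels stay lit in successive frames the command is zero, corresponding to the platform being steered directly toward the target.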

Further aspects and advantages of the invention are described in more detail below with reference to the accompanying drawings.

Brief Description of Drawings

In order to better understand the subject matter disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

fig. 1A illustrates a block diagram of an active seeker system 100 according to some embodiments of the present invention;

FIG. 1B illustrates a configuration of a control unit used in the system of FIG. 1A;

figs. 2A and 2B illustrate flowcharts of the operation of the control unit, respectively, during a target detection stage in which a target is detected, and during a tracking stage performed after the target has been detected in order to track it;

FIG. 3 is a schematic diagram of an example of system operation when switching from scan mode to snapshot mode;

FIGS. 4A and 4B illustrate the FOV of the imaging module at two different scan times, respectively; and

fig. 5A to 5C schematically show examples of different scanning patterns performed by a one-dimensional scanner.

Detailed Description

Fig. 1A-1B are block diagrams illustrating an active seeker system 100 according to some embodiments of the present invention.

Active seeker system 100 includes an illumination module 110, a beam shaping assembly 130, a scan module 140, an imaging module 120, and a controller 150 (the controller may be interchangeably referred to herein as a control system/unit).

The illumination module 110 is configured for generating a radiation beam LB for propagation along the output optical path OP, forming an output beam OB propagating from the system 100 along the output optical path OP. The radiation beam is typically an optical beam (e.g., light in the visible, IR, or another range), but may also be a non-optical beam of electromagnetic radiation, such as radio frequency (RF). The imaging module 120 is configured for imaging radiation in the spectral range of the radiation beam LB generated by the illumination module. Indeed, the imaging module referred to herein is typically an imager operable in the optical range of the radiation beam LB. However, in general, the imaging module 120 may also be operable to image radiation in other spectral ranges (e.g., non-optical ranges); for example, it may include a set of one or more antenna modules (e.g., an antenna array) operable to image radiation in the RF range. The beam shaping assembly 130 includes one or more beam shapers configured to shape the radiation beam LB to modify at least its FOV, and possibly also the FOV of the imaging module. The beam shaping assembly 130 is configured and operable to adjust the FOV of the output beam, and possibly also the FOV of the imaging module, such that in the snapshot mode of operation the FOV covers the entire FOR, while in the scan mode of operation the FOV of the output beam (and possibly also the imaged FOV) covers only a portion of the FOR. The scanning module 140 generally operates in the scanning mode and is configured to angularly deflect the direction of the optical path OP about at least one scanning axis to perform a one-dimensional scan. To this end, the scanning module 140 is configured and operable for sweeping the field of view of the output beam OB to cover the region of interest FOR.

It should be noted that in some embodiments of the present invention, the optical path of the imaging module and the optical path of the illumination module may be separate (e.g., spaced apart co-aligned optical paths). The optical components (e.g., beam shaping module and/or other optical modules) may also be different/separate optical components of the imaging module and the illumination module (although some optical elements along the optical path of imaging and illumination may optionally be included).

To this end, according to some embodiments of the present invention, the optical/propagation axes/paths of both the illumination module 110 and the imaging module 120 are coupled to the scanning module, such that during a scanning operation both propagation paths are deflected together by the scanning module to scan the FOR. For example, in accordance with one embodiment of the present invention, the scanning module 140 includes an actuatable gimbal (e.g., a single-axis gimbal with a suitable scanning actuator) on which both the illumination module 110 and the imaging module 120 are mounted. Thus, when the gimbal is actuated, the optical paths of both the illumination module 110 and the imaging module 120 are deflected to scan (i.e., illuminate and image) the FOR. Alternatively or additionally, the illumination module 110 and/or the imaging module 120 may not be mounted directly on the gimbal; instead, the gimbal carries mirrors/deflectors adapted to couple the optical path of the illumination module 110 and/or the imaging module 120 to the output optical path OP scanned by the scanning module.

In some embodiments, the beam shaping component 130 comprises at least one global beam shaper 130.1, which is configured and operable to globally adjust the FOV of the output beam OB (hereinafter denoted L-FOV), e.g. to enlarge/reduce its total solid angle Φ, while possibly not changing the aspect ratio of the cross-section of the output beam OB. In some embodiments, the global beam shaper 130.1 is positioned only along the optical path of the illumination (and not along the optical path of the imaging module), so that simpler beam shaping optics (of a non-imaging type) can be used. Alternatively, in some embodiments, the optical path of the imaging module 120 also passes through the global beam shaper 130.1, such that the global adjustment of the L-FOV also adjusts the FOV of the imaging module (hereinafter denoted I-FOV). As described above and in more detail below, in some embodiments, the FOV of the imaging module (I-FOV) is maintained larger than the FOV of the output beam OB (L-FOV), such that the image captured by the imaging module further includes regions in which no radiation of the output beam OB returning (reflected/scattered) from the target should be sensed/imaged/collected, and in which only clutter radiation and possibly instrument noise are sensed/imaged/collected. These regions are therefore used to estimate the global noise/clutter that may affect the SNR of the target response, thereby enabling an increase in SNR. To this end, the imaging module may have a wider spectral range than the illuminator. The imaging module may also observe a wider FOV than the output optical path, i.e. not only image the light/radiation from the output optical path.
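
The clutter-estimation idea above can be sketched in code. The following is a minimal illustration (not from the patent; the function and variable names are hypothetical), assuming a 2D image array and a boolean mask marking the pixels whose FOV overlaps the illuminated L-FOV; the remaining pixels see only clutter/noise and supply the background statistics.

```python
import numpy as np

def target_snr(image, illuminated_mask):
    """Estimate the SNR of the strongest return inside the illuminated
    region, using the non-illuminated pixels of the I-FOV (which can
    contain only clutter radiation and instrument noise) to estimate the
    global background statistics."""
    clutter = image[~illuminated_mask]           # pixels outside the L-FOV
    noise_mean = float(clutter.mean())
    noise_std = float(clutter.std())
    peak = float(image[illuminated_mask].max())  # candidate target return
    return (peak - noise_mean) / max(noise_std, 1e-12)
```

In practice the mask would follow the instantaneous position of the illuminated strip reported by the scanning module.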

In some embodiments, the beam shaping component 130 comprises at least one asymmetric beam shaper 130.2, configured and operable to apply an asymmetric adjustment to the aspect ratio of the FOV (L-FOV) of the output beam OB. As described above and in more detail below, the asymmetric beam shaper 130.2 may be configured and operable to stretch the aspect ratio of the illumination beam to an elongate aspect ratio, to form an output beam OB having a so-called linear cross-section: an elongate cross-section extending along a transverse axis across the optical path. In this case, the output beam OB will illuminate a field of view L-FOV having an elongated cross-section. Optionally, the asymmetric beam shaper is configured to shape the output beam OB such that the beam has an aspect ratio of about 40:1 between its wide and narrow lateral dimensions (e.g., the aspect ratio may be in the range of 10-50, and the angular range of the beam along its wide axis may be in the range of 1 milliradian to 10 milliradians). Preferably, in this case, the FOV of the output beam OB is adjusted such that the long transverse dimension of the cross-section of the output beam OB is at least as wide as the FOR in which the target should be detected/tracked, while the short transverse dimension of the cross-section is smaller than the FOR. This enables the entire FOR to be scanned in a single 1D scan.
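
As a rough illustration of this geometry (our sketch only; the names and the small-angle tiling approximation are assumptions, not from the patent), the narrow axis of the linear beam and the number of strip exposures needed to sweep the FOR in one 1D scan follow directly from the aspect ratio:

```python
import math

def line_beam_narrow_axis(wide_axis_rad, aspect_ratio):
    """Narrow angular extent of the elongated beam cross-section, given
    its wide-axis extent and the wide:narrow aspect ratio (e.g., ~40:1)."""
    return wide_axis_rad / aspect_ratio

def strips_per_scan(for_height_rad, narrow_axis_rad):
    """Number of strip images M needed so that successive narrow-axis
    positions tile the FOR in a single one-dimensional scan."""
    return math.ceil(for_height_rad / narrow_axis_rad)
```

For example, a 4-milliradian-wide beam at 40:1 gives a 0.1-milliradian narrow axis, so a 2-milliradian-tall FOR is tiled by 20 strips.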

In some embodiments, the beam shaping component 130 further comprises a beam expander module 130.3, the beam expander module 130.3 being configured and operable to expand/contract the solid angle of the illumination FOV (L-FOV) relative to the solid angle of the imaging FOV (I-FOV). This may be used to adjust the imaging FOV and/or illumination FOV such that the imaging FOV (I-FOV) is larger than the illumination FOV (L-FOV), thereby enabling improved noise/clutter estimation, as described in more detail above and below. It should be noted that the system 100 may alternatively be configured such that the imaging FOV (I-FOV) is set to be inherently larger than the illumination FOV (L-FOV) without any specific adjustment to the FOV ratio.

Thus, according to some embodiments, the scanning module 140 is configured to angularly deflect the direction of the optical path OP about one scanning axis (typically a transverse scanning axis) to perform a one-dimensional scan. To this end, the scanning module 140 is configured for sweeping the field of view (L-FOV) of the output beam, preferably together with the imaged field of view I-FOV, to cover the area of interest (FOR). In this case, the scanning module includes a gimbal on which a deflection mirror is mounted for deflecting the output beam and the imaging optical path, or the imaging module 120 and/or the illumination module 110 are mounted directly on the gimbal.

Further optionally, in some embodiments, the scanning module 140 may further comprise a rotational actuator coupled to the asymmetric beam shaper 130.2 and/or the gimbal for rotating the asymmetric beam shaper about an axis (e.g. about an optical axis of the asymmetric beam shaper) such that the linear output beam may be rotated to scan the FOR.

As described above, the system 100 is adapted to operate in two phases. In the first phase (the target detection phase), the system operates to search for and detect targets within a relatively large FOR. During the second phase, the system operates to track the target. According to some embodiments, the system is configured and operable in two modes of operation (a scan imaging mode and a snapshot mode). As described in more detail below, in some implementations, the system is configured to operate in the scan mode during the detection phase, and in both modes during the tracking phase: initially in the scan mode, and then transitioning to operate in the snapshot mode (e.g., when the target is determined to be close). The controller 150 is configured and operable to control and/or adjust the operation of the system in the two phases so as to optimize the SNR of the target (e.g., maintain the SNR above a particular threshold level) while also adjusting the flexibility (speed/rate) of detection and/or tracking, so that fast moving targets can be detected and tracked with improved SNR and flexibility, while reducing the illumination power requirements and thus reducing the weight and size of the system. To this end, the controller 150 is configured and operable for adjusting the illumination parameters and imaging parameters, and selecting the imaging and illumination modes. In some cases, imaging is performed in the snapshot mode, while in other cases imaging is performed in the scan mode, whereby the scanning module is configured to perform, for example, a one-dimensional scan in a repetitive (e.g., cyclic) manner. In the latter case, the FOR is covered once per cycle, as the scanning module 140 deflects the direction of the output optical path over the specific angle 111.

Fig. 1B is a block diagram of the system 100, showing in more detail the different modules of the controller 150 and how they are connected to the other modules of the system 100. It should be noted that the controller is typically a computerized system, or a combination of a computerized system and analog/electronic circuit modules, and that the modules 151 to 159 shown in the figures may be implemented as software modules, hardware modules, electronic circuits, and/or any combination of the above.

Reference is now made to fig. 2A and 2B, which are flow charts illustrating in greater detail a method 1000 of operation of the controller 150 and system 100 according to some embodiments of the present invention, performed during a target detection period (1100, shown in fig. 2A), in which a target is detected, and during a tracking period (1200, shown in fig. 2B) after the target is detected, in which the target is tracked.

Turning more specifically to the detection period 1100 shown in FIG. 2A, for example, the following operations may be performed:

optionally, in operation 1110, input data indicative of a heading position of a target to be detected and tracked may be received, for example, from an external source (such as a radar system external to system 100). In this case, the system 100 (e.g., steering controller 159) may operate the platform steering module (see optional operation 1115) to direct the platform toward the target heading position.

To this end, the seeker acquires its LOS direction before detection of the target begins. This can be done with a star tracker mounted on the seeker or using a star tracker implemented by the seeker's own detection camera. Another option for measuring the LOS of the seeker is a gyroscope (this will be described in more detail below). Then, in 1115, the seeker points its line of sight (LOS) in the direction of the sought area (toward the FOR, e.g., toward the edge of the FOR). In the event that the illumination beam is not properly aligned with the entire sought area (i.e., with the FOR), some preconditioning may be performed (as illustrated and described below with reference to fig. 5B).

In operation 1117, a distance R to the target and a closing velocity VC of the platform toward the target are estimated (e.g., based on the data obtained in operation 1110 above). In this regard, it should be noted that the term closing velocity refers to the radial velocity between the platform and the target (in other words, the rate of change dR/dt of the distance to the target).

The illuminator is then turned on, and target detection and tracking is enabled, as described below with reference to method 1000. The FCT (and hence the frame rate FR = 1/T) and the camera exposure time τ may be dynamically adjusted as time passes, exploiting the fact that the detected signal becomes stronger as the seeker-target distance decreases. For each image that captures the FOR or a portion thereof, an image analysis algorithm is performed to determine whether radiation (e.g., a flash of radiation) having an intensity above a threshold has reached the detection area of interest in the image. If so, the target is detected; if not, the target detection period continues until the target is detected.

As the target detection period continues, the seeker system 100 recalculates the predicted/expected location of the target based on the information that the target has not yet been found, and then performs corrective maneuvers, while keeping platform maneuvering to a minimum.

After the target is detected, as described below with reference to tracking period 1200, the seeker keeps illuminating and tracking the target while adjusting (e.g., expanding) the solid angle of the FOR, and/or decreasing the exposure time (i.e., integration time) τ of the imaging module in the event the detected signal becomes too strong, in order to prevent image saturation. At some stage, as the solid angle subtended by the target relative to the seeker becomes larger than the illuminator solid angle, the illuminator beam shaper adjusts the beam shape accordingly.

Returning to the detection period, at which stage the target has not yet been detected, operation 1120 may be performed (e.g., by the FOR control module 155) to set/adjust the FOR parameters of the system to a particular initial wide range (e.g., a solid angle of up to 50e-6 steradian) to effectively search for the target.

Further, operation 1130 is performed by the controller 150 to optimize the SNR of the signal from the target in the detection mode (considering that the target may be remote and the FOR is set to a wide range so that the target can be detected in a large area). To this end, to operate with a sufficiently high SNR, at this stage the imaging mode control module 157 performs operation 1132 to set the illumination and imaging modes to the scanning imaging mode, which provides an improved SNR (compared to the snapshot mode), as described above. To do so, the imaging mode control module 157 performs operation 1132.2 and operates the beam shaper 130 to adjust the angular range (φ) of the illumination beam FOV (L-FOV) to be less than the angular range Ω of the FOR: φ << Ω. Further optionally, the FCT control module 156 performs operations 1137 (specifically 1137.2 and 1137.4) to dynamically adjust the exposure time of the imager every frame (or every few frames) based on the expected SNR of the target in the frame, to reduce/optimize the duration T required to capture an image covering the entire FOR (T is referred to herein as the FOR capture duration (FCT) and/or the scan cycle duration).

Optionally, to further achieve improved SNR while reducing or eliminating the rolling-shutter effect associated with the scan, the imaging mode control module 157 may perform operation 1132.4 and operate the asymmetric beam shaper 130.2 to adjust the aspect ratio (AR in fig. 3) of the illumination beam OB such that the cross-section of the output beam OB is elongated (linear) and suitable for performing a 1D scan over the FOR. In this case, the long side of the beam matches the width of the FOR, so the entire FOR can be swept in a single one-dimensional scan.

Further optionally, the imaging mode control module 157 may perform operation 1132.6 to operate the beam shaper module 130, in particular the optional beam expander module 130.3, to set/adjust the ratio between the angular range (solid angle) of the imaging FOV (I-FOV) and the angular range of the illumination beam FOV (L-FOV), such that the imaging FOV is larger than the FOV (L-FOV) of the illumination beam. As mentioned above, this ensures that some predetermined parts of the image captured from the I-FOV will inherently not be affected by reflection/scattering of the illuminating radiation of the output beam from the target. Accordingly, upon receiving such images, the noise estimator 153 processes these portions of the images to estimate the level of noise/clutter returned from the environment, to enable estimation/improvement of the SNR of the target signal, and/or utilizes a false alarm rate (FAR) module (such as constant false alarm rate (CFAR)) to eliminate/reduce/filter out detection false alarms that may be associated with high clutter/noise levels, as described below.

Thus, after setting the above parameters (e.g., FOR, τ, T, I-FOV, L-FOV, and/or AR), the imaging mode control module 157 performs operation 1135 for operating the optical path scanning module 140 to scan the FOR while optimizing/reducing the duration of the scanning cycle T (FCT), and also performs operation 1137 for operating the imaging module 120 during each scanning cycle to capture/grab a plurality of images (e.g., M images), so as to obtain image data indicative of the entire FOR being illuminated in each scanning cycle. Optionally, as described above, operation 1135.2 is performed for operating the optical path scanning module 140 to perform a one-dimensional scan.

In each scan cycle, the target detection module 152 processes data indicative of M captured images of FOR captured during the scan cycle to detect a target.

Optionally, the imaging mode control module 157 dynamically adjusts the exposure time τ of different ones of the M images in order to optimize/reduce the FOR capture duration (FCT) while maintaining the SNR above an acceptable threshold level. This may be accomplished, for example, by performing optional operations 1137.2 and 1137.4 for each, or every few, of the M images captured during the scan cycle, in order to dynamically optimize/reduce their exposure times. More specifically, in operation 1137.2, the imaging mode control module 157 estimates the distance R to the target. This may be based, for example, on the initial distance R and the closing velocity VC obtained in operation 1117 above, together with the time elapsed since that operation. As described above, the SNR of the target is typically a function of 1/R (e.g., theoretically proportional to 1/R^4). Accordingly, in operation 1137.2, an expected improvement/change in SNR due to the decrease/change in target distance is estimated. In operation 1137.4, the exposure time τ of one or more of the subsequent M images to be captured during the scan cycle is set/adjusted according to the change/improvement in SNR. More specifically, the SNR is a function of the exposure time τ; thus, once it is determined at 1137.2 that the SNR is expected to improve in subsequent images (due to the closing distance to the target), the exposure time τ of subsequent images is successively shortened, so as to reduce the FOR capture duration (FCT) while keeping the SNR just above the acceptable threshold.
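
Operations 1137.2 and 1137.4 can be sketched as follows. This is a simplified model under our own assumptions (straight-line closing at constant VC, and SNR proportional to τ/R^4 as stated in the text); the function names are illustrative, not the patent's.

```python
def predicted_range(r0, closing_velocity, elapsed):
    """Operation 1137.2 sketch: R(t) = R0 - VC * t (clamped at zero)."""
    return max(r0 - closing_velocity * elapsed, 0.0)

def adjusted_exposure(tau0, r0, r_now, snr0, snr_threshold):
    """Operation 1137.4 sketch: with SNR proportional to tau / R**4, the
    exposure can shrink by the fourth power of the range ratio (and by
    the current SNR margin) while keeping the SNR just at the threshold."""
    tau = tau0 * (r_now / r0) ** 4 * (snr_threshold / snr0)
    return min(tau, tau0)  # never lengthen beyond the initial exposure
```

Shrinking τ this way is what shortens the total scan-cycle duration (FCT) as the target closes in.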

It should be appreciated that the scan controller (scanning module 140) operates at a varying scan speed (e.g., scan angular speed) during a scan cycle, in accordance with the dynamic adjustment of the exposure times τ of the M images acquired during the scan cycle. This produces a time-dependent scan profile, whereby the scan dwells at each respective location for the exposure time τ required to capture the respective one of the M images.

The detection module 152 performs operation 1139 to process the data associated with the M images to identify radiation reflected/scattered from the target and determine the target location. This is accomplished, for example, by identifying pixels in the image that are sufficiently intense to be above noise, and/or utilizing image analysis algorithms (e.g., statistical image analysis) to determine external and/or internal correlations of pixel intensities in one or more images.

To this end, the FOR frame image processor 151 optionally segments each image received from the imaging module 120 into image segments associated with the regions of the imaging FOV (I-FOV) that are being illuminated when the respective image is grabbed (i.e., image segments whose FOV overlaps with the L-FOV of the illumination beam OB; referred to herein as "target-indication image segments"), and image segments associated with the regions of the imaging FOV (I-FOV) that are not illuminated during the capture of the respective image (referred to herein as "clutter-indication image segments"). Optionally, the FOR frame image processor performs operation 1135.2 to extract the "clutter-indication" image segments (i.e., the image segments outside the illuminated L-FOV) and provide them to the noise estimator module 153. The latter processes these image segments to evaluate the value of the global noise/clutter, and thereby improve the SNR of target detection in the image portions overlapping the illuminated L-FOV and/or reduce false detections (false alarms) using, for example, CFAR techniques. Optionally, the FOR frame image processor further comprises a scan mitigator module 151.2, which receives data from the optical scanning module 140 and/or from the imaging mode control module 157 indicating the coordinates of each of the M images obtained in a scan cycle relative to the coordinates of the FOR (more particularly, data indicating the coordinates of each target-indication image segment in the FOR), and, based on their respective coordinates, merges (e.g., concatenates) the target-indication image segments to obtain an image frame of the FOR indicating the FOR region imaged during the scan cycle. The latter is then provided to the detection module 152 for performing the above-described operation 1139 to detect the position of the target in the FOR based on the image frame of the FOR thus obtained.
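
The merge step of the scan mitigator (module 151.2) can be sketched like this. It is our own minimal illustration, assuming each target-indication segment arrives as a (scan-coordinate, 2D strip) pair:

```python
import numpy as np

def merge_for_frame(segments):
    """Concatenate the M target-indication image segments of one scan
    cycle into a single FOR frame, ordered by their scan coordinate."""
    ordered = sorted(segments, key=lambda s: s[0])
    return np.vstack([strip for _, strip in ordered])
```

A real implementation would also handle overlap/gaps between adjacent strips; sorting by scan coordinate is what makes the concatenation order independent of the order of arrival.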

Once the target is detected, the system 100 proceeds to a tracking period 1200, as shown in FIG. 2B. During the tracking period, the controller repeatedly performs the following operations as shown in fig. 2B for tracking the target.

In operation 1210, the controller 150 obtains data indicative of the detected target location, and optionally, in operation 1215, the steering controller 159 utilizes the data for operating the steering controller of the platform to steer the platform and direct the platform toward the target.

To this end, in order to navigate (steer) the platform to the target, the steering controller may comprise a closed-loop controller adapted to operate the steering module on the basis of the processed FOR frame images, so as to minimize the difference between the locations of the pixels illuminated by light returning from the target in successive images, thereby minimizing the angular velocity of the platform relative to the target and thus directing the platform towards the target.
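
A minimal sketch of such a closed-loop law (ours, not the patent's; the gain value is illustrative): command a deflection proportional to the target's pixel drift between successive FOR frames, driving the drift, and hence the relative angular velocity, toward zero.

```python
def steering_command(prev_px, curr_px, gain=0.5):
    """Proportional steering sketch: return (cmd_x, cmd_y) opposing the
    target's pixel displacement between two successive FOR frames."""
    dx = curr_px[0] - prev_px[0]
    dy = curr_px[1] - prev_px[1]
    return (-gain * dx, -gain * dy)
```

A practical controller would add integral/derivative terms and convert pixel drift to angles via the pixel pitch, but the proportional form captures the stated goal.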

While tracking the target, the controller 150 (more specifically, the imaging mode control module 157) repeatedly performs operation 1220 of obtaining FOR frame data indicating an image (or images) of the entire FOR at each FCT (in the scan or snapshot mode). Initially (at least at the beginning of the tracking period), FOR frame data is obtained via the scanning imaging mode. In this case, operations 1132, 1135, and 1139 described above, and optionally their sub-operations, may be performed to track and locate the target in the scan imaging mode. Typically, when the tracking phase is initiated, the FOR is reduced compared to the FOR of the detection phase, and the illuminated field of view L-FOV is on the order of the FOR, but may be only slightly smaller than the FOR.

Generally, as will be described in more detail below, at some stage the imaging mode control module 157 may change the imaging mode to the snapshot mode. In this case, the imaging mode control module 157 operates the beam shaper as follows. The asymmetric beam shaper 130.2 adjusts the cross-section of the beam to a spot beam (i.e. with an aspect ratio AR on the order of 1:1). The global beam shaper 130.1 may (if needed) be adjusted such that the illumination FOV (L-FOV) overlaps the whole FOR (whereby the imaging FOV (I-FOV) may also be adjusted at this stage, or it is a priori set to cover the whole FOR as well). Thus, the imaging module is operated to capture a single image capturing the entire FOR in each FOR frame duration T, whereby the integration time τ of the image is set to about τ ≈ T. The FOR image obtained by the imaging module 120 at each FOR frame imaging cycle is then optionally provided to the FOR frame image processor 151, which may be adapted to perform operation 1139 on the single image of the FOR. In particular, the snapshot mitigator module 151.1 operates to segment the image into a segment showing the entire illuminated FOR (the target-indicative image segment) and a segment outside of the FOR (the clutter-indicative image segment). The target-indicative image segment is then provided to the target detection module for detecting the position of the target in the FOR, and the clutter-indicative image segment may be provided to the noise estimator 153 for estimating the global level of noise/clutter as described above.

During tracking (e.g., at each FOR frame imaging cycle or occasionally (e.g., every few cycles)), the controller 150 performs an operation 1230 to repeatedly monitor parameters of the tracked object appearing in the FOR frame images/data and adjust one or more tracking parameters, such as FOR, I-FOV, L-FOV, FCT (T), exposure time τ, and/or imaging mode (scan or snapshot mode) accordingly. In the following, the term Frame Rate (FR) is used to denote the rate 1/T at which FOR frames are acquired (either by scan mode or snapshot mode).

Optionally, the controller 150 performs operation 1231 to determine/estimate the SNR of the target. For example, the controller 150 processes the pixels of the target appearing in the most recent (e.g., last) one or more FOR frames to determine the SNR with which the target is detected. For example, it determines the degree to which the intensity of these pixels is above the global noise/clutter value, and how significant/reliable the detection is, taking into account, for example, the standard deviation/volatility (in time and/or space) of the noise; these parameters may be determined by the noise estimator (e.g., from the clutter-indication image segments obtained during one or more cycles). It should be noted that the SNR of the target (T-SNR) should be kept above a certain SNR threshold (SNR-TH) in order to ensure reliable detection and tracking; however, in the case of a high SNR well above the threshold, the tracking flexibility can be increased (e.g., the frame rate 1/T can be increased at the expense of a certain reduction of the SNR, as long as it is kept above the threshold).

Further optionally, the controller 150 performs an operation 1232 to determine/estimate the angular velocity of the target relative to the platform (relative to the system 100). For example, the controller 150 may process the locations (coordinates) within the FOR at which the target appears in the respective most recent (e.g., last) two or more FOR frames to determine the lateral (e.g., vertical and/or horizontal) velocity of the target relative to the platform in the FOR, which is associated with the relative angular velocity of the target. It should be noted that for higher angular velocities, a larger FOR solid angle is needed to reliably track the target, to ensure that the target is not lost from the FOR frame. Generally, the closer the target, the higher its angular velocity relative to the platform may be, and thus a larger FOR solid angle (hereinafter the required FOR, R-FOR) is needed.
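
Operation 1232 can be sketched as follows (our illustration; `pixel_pitch_rad`, the assumed angular size of one pixel, is a hypothetical parameter):

```python
def target_angular_velocity(pos_prev, pos_curr, frame_period, pixel_pitch_rad):
    """Per-axis angular velocity (rad/s) of the target relative to the
    platform, from its FOR-frame pixel coordinates in two consecutive
    frames captured frame_period seconds apart."""
    return tuple((c - p) * pixel_pitch_rad / frame_period
                 for p, c in zip(pos_prev, pos_curr))
```

With more than two frames, a least-squares fit over the track would give a smoother estimate, but the two-frame difference is the essential measurement.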

To this end, in optional operation 1234, the controller 150 (e.g., FOR control module 155) optionally adjusts the FOR range of the system to the required R-FOR (which is determined based on the monitored angular velocity and SNR) in order to reliably track the target. This may be performed by adjusting the beam shaping module and/or the scanning module to cover the required FOR during the capture of each FOR frame (during each duration T of the FOR frame grabbing cycle, in snapshot mode or scan mode).

In this regard, it will be appreciated that for a given frame period T and a given imaging mode (scan or snapshot), the SNR decreases when the FOR is extended, and increases when the FOR is set to a smaller range. Thus, once the required solid angle of the FOR (R-FOR) is determined, the controller 150 may estimate the expected SNR (E-SNR) of the target, i.e., the SNR with which the target is expected to be detected in the next FOR frame.

Thus, in operation 1233, the controller 150 (e.g., the frame rate control module 156) may optionally adjust the frame rate FR = 1/T based on the angular velocity and SNR of the target monitored in the previous FOR frame(s) (the SNR and angular velocity of the target as determined in operations 1231 and 1232 above).
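
Operation 1233 might look like the following (a sketch only; the margin and step factors are illustrative assumptions, not values from the text):

```python
def adjust_frame_period(T, snr, snr_threshold, margin=2.0, step=0.8):
    """Trade excess SNR for tracking flexibility: shorten the FOR-frame
    period (raise FR = 1/T) when the SNR comfortably exceeds the
    threshold; lengthen it to recover SNR when below the threshold."""
    if snr > margin * snr_threshold:
        return T * step   # faster frames, some SNR sacrificed
    if snr < snr_threshold:
        return T / step   # slower frames to restore SNR
    return T
```

This implements the rule stated above: the frame rate is raised only at the expense of SNR headroom, never below the reliability threshold.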

This generally results in better flexibility (i.e., a higher frame rate) as the target approaches/becomes closer to the system. This is because, although the angular velocity of the target, and hence the required FOR solid angle, may increase as the target approaches, the amount of radiation returned from the target also increases, in proportion to 1/R^4. Thus, with this technique, tracking becomes dynamically more flexible as the target approaches, thereby enabling improved tracking of fast moving targets.

Optionally, in operation 1235, the controller 150 (e.g., imaging mode control module 157) operates to select the illumination and imaging modes based on the monitored angular velocity and SNR. In this regard, the imaging mode control module 157 may determine whether to continue operating in the scan mode or to initiate operation in the snapshot mode, for example, based on the angular velocity and the SNR. Where the expected SNR from the target is sufficiently high (e.g., well above the threshold) and the target is moving at a high angular velocity (which may be the case when the target is relatively close), the imaging mode control module 157 may switch to the snapshot mode, which, although associated with a slightly lower SNR compared to the scan mode, is less/not affected by the rolling-shutter artifacts associated with scanning.
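
The mode-selection logic of operation 1235 can be sketched as below (the SNR headroom factor and angular-speed threshold are illustrative assumptions):

```python
def select_imaging_mode(expected_snr, angular_speed_rad_s, snr_threshold,
                        snr_headroom=3.0, speed_threshold=0.05):
    """Return 'snapshot' only when the expected SNR leaves enough
    headroom above the detection threshold AND the target moves fast
    enough for rolling-shutter artifacts to matter; otherwise keep the
    higher-SNR scan mode."""
    if (expected_snr >= snr_headroom * snr_threshold
            and angular_speed_rad_s >= speed_threshold):
        return "snapshot"
    return "scan"
```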

Nonetheless, in accordance with some embodiments, in the event that the expected SNR does not yet allow operation in the snapshot mode, the imaging mode control module 157 appropriately adjusts the scanning scheme by performing optional operation 1236, thereby mitigating/reducing rolling-shutter artifacts. For example, in the case where a 1D scan is used, the imaging mode control module determines the direction of the target's angular velocity in the FOR, and preferably adjusts the 1D scan direction (scanning axis) to a direction approximately perpendicular to the target's angular velocity, so that the rolling-shutter effect in the FOR frame is reduced/suppressed. This may be accomplished by operating the beam shaping module to align the elongate scanning beam approximately with the direction of the angular velocity, while operating the gimbal assembly of the scanning module to scan in the perpendicular direction. Such adjustments are possible in embodiments of the system comprising a rotary actuator connected to the scanning module 140 (to the gimbal) and/or to the asymmetric beam shaping module 130.2, for changing their orientation about the longitudinal axis of the optical path.
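
The geometric adjustment of operation 1236 can be sketched as: rotate the elongated beam so its long axis lies along the target's angular-velocity direction, and scan perpendicular to it (our illustration, assuming a small-angle planar model of the FOR):

```python
import math

def beam_rotation_angles(vx, vy):
    """Given the target's angular-velocity components (vx, vy) in the FOR
    plane, return (beam_long_axis_angle, scan_direction_angle) in radians:
    the long axis follows the velocity, the scan runs 90 degrees to it,
    suppressing rolling-shutter skew along the direction of motion."""
    long_axis = math.atan2(vy, vx)
    return long_axis, long_axis + math.pi / 2
```

The first angle would drive the rotary actuator of the asymmetric beam shaper; the second sets the gimbal scan direction.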

Once the above parameters are adjusted, the controller 150 continues to perform operation 1240 to repeatedly operate the system to acquire the next FOR frame(s) with the selected imaging and illumination mode, frame rate, and FOR range. In operation 1250, the controller 150 also processes the next FOR frame to identify the target, and causes operation 1200 to continue until the target is reached or unless otherwise indicated.

Fig. 3 illustrates a seeker system whose operation may be controllably switched from a one-dimensional scan mode to a snapshot mode as a target becomes closer to the system. As shown, a light source (laser device) 110 produces a light beam having a particular fixed beam shape that propagates along an optical path and interacts with a beam shaper to produce an output beam having a desired cross-section according to the operating mode of the system. In the scan mode, the beam has an elongated cross-section CS-MOD-I extending along a particular transverse axis across the output optical path for illuminating a particular field of view; and in the snapshot mode, the beam has a spot-shaped cross-section CS-MOD-II covering the FOR. In the scan mode, the beam shaper 130 operates to shape the beam such that the field of view of the elongated output beam has a line shape, and the elongated output beam has a transverse aspect ratio of, for example, about 40 between its wide and narrow transverse dimensions (e.g., the aspect ratio may be in the range of 10-50, and the angular range of the beam along its wide axis may be in the range of 1 to 10 milliradians). The optical path scanning module performs a plurality of cycles of one-dimensional scanning, such that in each cycle the direction of the output optical path is deflected to scan a particular angular range/segment around the scanning axis, thereby sweeping the beam to illuminate the FOR, which is illuminated and imaged as a scan frame in each one-dimensional scan cycle. The scanning axis is a transverse axis across the output optical path and may be horizontal, vertical, or a rotational axis; the scanning axis is not orthogonal to the particular transverse axis and may, for example, be parallel to the particular transverse axis.

For this purpose, as described above, the control system 150 performs a monitoring phase to monitor the area of interest while searching for the target, by receiving a sequence of images captured by the imaging system during each one-dimensional scan cycle and processing the images to determine whether a target illuminated by the light beam is detected, i.e., whether return light of the beam reflected/scattered from the target is captured at one or more pixels.

To this end, the controller 150 is preprogrammed with a specific predetermined signal-to-noise ratio (SNR) threshold, to determine whether the intensity of light captured by a pixel exceeds the threshold. The intensity of the return light reflected/scattered from a target and captured by one or more pixels is proportional to τ/R^4 (where R is the distance between the imaging module and the target, and τ is the exposure time of each of the M frames).

As the range R to the target becomes shorter, the controller 150 operates to dynamically reduce the duration T of the one-dimensional scan cycle in order to improve tracking flexibility, thereby optimizing the frame rate 1/T of the scan frames produced in each one-dimensional scanning cycle. In practice, this can be achieved by dynamically reducing the exposure time of the images captured during the scan. The range R can be estimated by applying previous measurements of target location and velocity (obtained with other systems such as radar or IR sensors) to a target state-vector prediction model, together with measurements of the interceptor's own position. As also described above, the imaging system may include a one-dimensional photodetector array and be configured to image the field of view of the elongated output beam onto that array. Optionally, the imaging system may comprise a two-dimensional photodetector array and be configured to image the field of view illuminated by the elongated output beam onto a subset of pixels (SS in Figs. 4A and 4B) of the two-dimensional array. In this case, a spectral line filter may be applied over that particular subset to reduce the noise level. A false-alarm processor may also process the images captured by the two-dimensional photodetector array to identify false detections of a target by comparing parameters of the light intensities captured by the subset of pixels with corresponding noise-related parameters associated with the light intensities measured by the other pixels of the two-dimensional photodetector array. The noise-related parameters include: background constant levels and slopes, and/or structure sizes, and/or comparison with previous images, clutter, etc.
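Because the captured return intensity scales as τ/R⁴, the exposure time can be shortened as the fourth power of the range ratio while preserving a constant per-frame signal level, which is what allows the scan-cycle duration T to shrink as the target closes in. A hypothetical sketch (the reference values and the minimum-exposure clamp are assumptions):

```python
def adjusted_exposure(tau_ref, r_ref, r_current, tau_min=1e-6):
    """Exposure time that keeps the return signal (proportional to
    tau / R**4) at the same level as a reference measurement taken
    with exposure tau_ref at range r_ref, clamped to a hardware
    minimum tau_min."""
    tau = tau_ref * (r_current / r_ref) ** 4
    return max(tau, tau_min)

# Halving the range allows a 16x shorter exposure at the same signal level.
print(adjusted_exposure(1e-3, 10_000.0, 5_000.0))  # 6.25e-05
```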

Figs. 4A and 4B show the FOV of the imaging module at two different scan times, respectively. In the figures, the large rectangle represents the total FOV (I-FOV); the large dotted ellipse is the entire sought area (FOR). The small elongated dark ellipse is the illumination beam with the scan-mode cross-section CS-MOD-I, which defines a small sought area. The small rectangle (SS) around the small sought area marks the detection region of interest, which is the smallest area on the detector (the small sought area plus some margin) in which a detection signal can appear. Target detection from other areas of the FOV is not possible because the laser never illuminates those areas (the illuminator and imaging module are fixed relative to one another). As mentioned above, this small rectangle means fewer false-alarm detections and less data to analyze. To further reduce background noise during detection, a narrow spectral filter matched to the illuminator wavelength may be mounted over this small rectangle. Such a filter reduces residual background noise (stray light, stars, etc.) without affecting the target signal.
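Restricting the detection search to the small rectangle SS can be sketched as a slice of the full detector frame (the frame size, ROI coordinates and threshold are hypothetical):

```python
import numpy as np

def detect_in_roi(frame, roi, threshold):
    """Search only the subset SS of the 2D detector where return light
    can appear; pixels outside the ROI are never illuminated by the
    beam, so ignoring them cuts both false alarms and processing load.

    roi = (row_start, row_stop, col_start, col_stop)."""
    r0, r1, c0, c1 = roi
    sub = frame[r0:r1, c0:c1]
    hits = np.argwhere(sub > threshold)
    return hits + [r0, c0]   # map ROI indices back to full-frame indices

frame = np.zeros((480, 640))
frame[242, 333] = 50.0       # simulated target return inside the ROI
print(detect_in_roi(frame, roi=(230, 260, 0, 640), threshold=10.0))
```

Note that only the 30 rows of the ROI are examined rather than the full 480-row frame; a narrow spectral filter over this same region would suppress background light before it ever reaches these pixels.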

Fig. 4A shows the case in which the entire sought area is at the center of the total FOV and the small sought area is a horizontal slice at the center of the entire sought area. After the gimbal scans upward, the situation of Fig. 4B is reached at a later scan stage: the small sought area is now a narrow slice at the top of the entire sought area. Note that the detection ROI (SS) remains at the same location on the detector.

It should be noted that, in general, the gimbal unit points the payload in the relevant direction and stabilizes its LOS. In the present invention, the gimbal is also used for scanning. Additional scanning options are possible if the gimbal has more than one degree of freedom. In this regard, reference is made to Figs. 5A-5C. Fig. 5A shows the situation in which the entire elongated sought area is rotated relative to the reference frame of the imaging module; in that reference frame, a single up/down scan does not cover the entire sought area. To handle this, as shown in Fig. 5B, a gimbal with a rotational degree of freedom (relative to the reference frame) followed by an up/down degree of freedom may be used (it should be noted that the rotation may also be performed by the interceptor itself). Fig. 5C shows the simultaneous use of the up/down and left/right degrees of freedom, which allows the entire sought area to be scanned despite its misalignment with the reference frame. The scan direction/axis is marked with a dashed diagonal.
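The simultaneous use of the two degrees of freedom in Fig. 5C amounts to decomposing the desired scan rate along the rotated FOR into elevation and azimuth components. A hypothetical sketch of that decomposition (function and parameter names are illustrative):

```python
import math

def gimbal_rates(scan_rate, theta_rad):
    """Split a desired scan rate along a FOR rotated by theta (relative
    to the gimbal reference frame) into simultaneous elevation (up/down)
    and azimuth (left/right) gimbal rates."""
    elevation_rate = scan_rate * math.cos(theta_rad)
    azimuth_rate = scan_rate * math.sin(theta_rad)
    return elevation_rate, azimuth_rate

# A FOR aligned with the reference frame (theta = 0) needs only the
# up/down axis; a 90-degree rotation needs only left/right.
print(gimbal_rates(1.0, 0.0))
print(gimbal_rates(1.0, math.pi / 2))
```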

Thus, the optical path scanning module may comprise a gimbal rotatable about the scanning axis to provide angular deflection of the direction of the output optical path. The gimbal may include an actuation module for rotation about a scan axis. The imaging module may be mounted on a gimbal. Both the illumination module and the imaging module may be mounted on a movable gimbal.

In general, the imaging module may be external to the gimbal and have a fixed imaging optical path relative to the gimbal. In this case, the optical assembly includes one or more imaging optical elements disposed on the gimbal that direct light from the output optical path to propagate along the fixed imaging optical path for imaging by the imaging module. Similarly, the illumination module may be external to the gimbal and have a fixed light projection path along which the light beam emanates from the illumination module. In this case, the optical assembly includes one or more light-guiding optical elements disposed on the gimbal for guiding the light beam from the fixed light projection path to propagate along the output optical path.

The scanning module may include one or more scanning optical deflectors to provide the angular deflection of the direction of the output optical path. For this purpose, one or more MEMS steering mirrors may be used.

It should also be noted that the illumination module may be configured with a fixed light projection path along which the light beam is emitted from the illumination module, and/or the imaging module may be configured with a fixed imaging optical path. In that case, the optical assembly comprises optical elements for directing light from the projection optical path to propagate along the output optical path defined by the scanning optical deflector(s), and/or for directing light from the output optical path defined by the scanning optical deflector(s) to propagate along the imaging optical path.

As described above, the control system 150 performs a monitoring phase to determine whether a target illuminated by the light beam is detected by one or more pixels in the sequence of images. The control system 150 may also be configured, upon detection of the target, to initiate a target tracking phase for steering the platform carrying the system toward the target. During the target tracking phase, the control system 150 operates to stop the one-dimensional scanning so that the output beam is continuously directed in the forward direction and the target is continuously illuminated during this phase. Further, the control system operates the beam shaping module to form the output beam into a tracking beam by adjusting the cross-section of the output beam so that its transverse aspect ratio is about 1, and by expanding the FOV solid angle of the output beam. To this end, the beam shaping module includes one or more optical elements configured and operable to adjust the transverse aspect ratio of the output light beam, and an adjustable beam expander for expanding the FOV of the beam.
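The scan-to-track transition described above can be summarized as a simple state change (the field names and values are an illustrative sketch, not the patent's control interface; the aspect-ratio figure of 40 is the example given earlier in the description):

```python
def tracking_transition(state):
    """Sketch of the transition into the target tracking phase: stop
    the one-dimensional scan and reshape the output beam to a
    near-unity transverse aspect ratio with an expandable FOV."""
    state = dict(state)              # leave the caller's state untouched
    state["scanning"] = False        # beam now dwells on the target
    state["aspect_ratio"] = 1        # elongated line beam -> spot beam
    state["fov_expandable"] = True   # adjustable beam expander engaged
    return state

scan_state = {"scanning": True, "aspect_ratio": 40, "fov_expandable": False}
print(tracking_transition(scan_state))
```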

The control system may be configured to process the images captured by the imaging module during the target tracking phase to estimate the number of pixels illuminated by light returning from the target, and, upon identifying that this number exceeds a predetermined threshold, to operate the beam expander module to expand the FOV.
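This adjustment can be sketched as a simple rule; the pixel threshold and expansion factor below are hypothetical values, not figures from the patent:

```python
def maybe_expand_fov(fov_mrad, illuminated_pixel_count,
                     pixel_threshold=100, expand_factor=1.5):
    """Widen the tracking beam's FOV once the target's return spills
    over more pixels than the threshold, i.e. as the target begins to
    fill the current FOV at close range."""
    if illuminated_pixel_count > pixel_threshold:
        return fov_mrad * expand_factor
    return fov_mrad

print(maybe_expand_fov(10.0, 150))  # expanded
print(maybe_expand_fov(10.0, 50))   # unchanged
```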

As described above, the control system may include a steering controller 159, which may be connected to a steering module of the platform for operating the steering module to steer the platform toward the target. To this end, the steering controller includes a closed-loop controller that processes the images acquired by the imaging module and operates the steering module so as to minimize the frame-to-frame difference between the pixels illuminated by light returning from the target in successive images, thereby minimizing the angular velocity of the platform relative to the target and directing the platform toward it.
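One way such a closed loop can be sketched is to estimate the target's centroid on the detector in successive frames and command the steering module to null the frame-to-frame shift, which serves as a proxy for the platform's angular velocity relative to the target (the centroid estimator and the proportional gain below are assumptions for illustration, not the patent's controller):

```python
import numpy as np

def centroid(frame):
    """Intensity-weighted centroid of the target return on the detector."""
    rows, cols = np.indices(frame.shape)
    total = frame.sum()
    return np.array([(rows * frame).sum() / total,
                     (cols * frame).sum() / total])

def steering_command(prev_frame, frame, gain=0.5):
    """Proportional command that drives the frame-to-frame centroid
    shift (a proxy for the line-of-sight rate) toward zero."""
    return -gain * (centroid(frame) - centroid(prev_frame))

prev = np.zeros((5, 5)); prev[2, 2] = 1.0
curr = np.zeros((5, 5)); curr[2, 3] = 1.0   # target drifted one pixel right
print(steering_command(prev, curr))         # command steers back to center
```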
