Enhanced depth mapping using visual inertial odometry

Reading note: this technology, "Enhanced depth mapping using visual inertial odometry," was designed and created by D·西尔韦, E·希尔什, M·拉芬费尔德 and T·凯特茨 on 2019-09-02. Its main content is as follows: An imaging device (22) is disclosed, comprising a radiation source (40) emitting a pulsed beam (42) of optical radiation towards a target scene (24). An array (52) of sensing elements outputs signals indicative of respective times of incidence of photons on the sensing elements. Objective optics (54) form a first image of the target scene on the array of sensing elements. An image sensor (64) captures a second image of the target scene. Processing and control circuitry (56, 58) is configured to process the second image to detect relative motion between at least one object in the target scene and the apparatus, to construct a histogram of times of incidence of photons on the sensing elements in response to the signals from the array, to adjust the histogram in response to the detected relative motion, and to generate a depth map of the target scene based on the adjusted histogram.

1. An imaging apparatus, comprising:

a radiation source configured to emit a pulsed beam of optical radiation towards a target scene;

an array of sensing elements configured to output signals indicative of respective times of incidence of photons on the sensing elements;

objective optics configured to form a first image of the target scene on the array of sensing elements;

an image sensor configured to capture a second image of the target scene; and

processing and control circuitry configured to process the second image to detect relative motion between at least one object in the target scene and the apparatus, and to construct a histogram of the times of incidence of the photons on the sensing elements in response to the signals from the array, and to adjust the histogram in response to the detected relative motion and generate a depth map of the target scene based on the adjusted histogram.

2. The apparatus of claim 1, wherein the relative motion is due to movement of the apparatus, and wherein the processing and control circuitry is configured to filter the histogram to compensate for the movement of the apparatus.

3. The apparatus of claim 2, and comprising an inertial sensor configured to sense the movement of the apparatus and output an indication of the movement, wherein the processing and control circuitry is configured to apply the indication output by the inertial sensor in connection with processing the second image in detecting the movement of the apparatus.

4. The apparatus of claim 1, wherein the processing and control circuitry is configured to extend an exposure time over which the histogram is accumulated upon detecting an absence of the relative motion between the target scene and the apparatus.

5. The apparatus of claim 1, wherein the relative motion comprises movement of an object in the target scene, and wherein the processing and control circuitry is configured to filter the histogram to compensate for the movement of the object.

6. The apparatus of claim 5, wherein the processing and control circuitry is configured to process the second image to extract a trajectory of the movement of the object, and to correct the histogram for the sensing element onto which the trajectory is imaged by the objective optics.

7. The apparatus of claim 1, wherein the processing and control circuitry is configured to identify edges in the histogram and to apply the identified edges in detecting the relative motion.

8. An imaging apparatus, comprising:

a radiation source configured to emit a pulsed beam of optical radiation towards a target scene;

an array of sensing elements configured to output signals indicative of respective times of incidence of photons on the sensing elements;

objective optics configured to form a first image of the target scene on the array of sensing elements;

an image sensor configured to capture a second image of the target scene; and

processing and control circuitry configured to process the second image to estimate a depth range of at least one object in the target scene, and to construct histograms of the times of incidence of the photons on the sensing elements in response to the signals from the array while gating a time range of one or more of the histograms in response to the estimated depth range, and to generate a depth map of the target scene based on the gated histograms.

9. The apparatus of any of claims 1-8, wherein the second image is a color image.

10. The apparatus of any one of claims 1 to 8, wherein the sensing elements comprise Single Photon Avalanche Diodes (SPADs).

11. A method for imaging, comprising:

directing a pulsed beam of optical radiation toward a target scene;

imaging the target scene onto an array of sensing elements in an imaging device;

receiving signals from the sensing elements, the signals being indicative of respective times of incidence of photons on the sensing elements;

capturing an image of the target scene;

processing the captured image to detect relative motion between at least one object in the target scene and the imaging device;

constructing a histogram of the times of incidence of the photons on the sensing elements in response to the signals from the array;

adjusting the histogram in response to the detected relative motion; and

generating a depth map of the target scene based on the adjusted histogram.

12. The method of claim 11, wherein the relative motion is due to movement of the imaging device, and wherein adjusting the histogram comprises filtering the histogram to compensate for the movement of the imaging device.

13. The method of claim 12, and comprising sensing the movement of the imaging device with an inertial sensor that outputs an indication of the movement, wherein filtering the histogram comprises applying the indication output by the inertial sensor in connection with processing the image in detecting the movement of the imaging device.

14. The method of claim 11, wherein constructing the histogram comprises extending an exposure time over which the histogram is accumulated upon detecting an absence of the relative motion between the target scene and the imaging device.

15. The method of claim 11, wherein the relative motion comprises movement of an object in the target scene, and wherein adjusting the histogram comprises filtering the histogram to compensate for the movement of the object.

16. The method of claim 15, wherein processing the image comprises extracting a trajectory of the movement of the object, and filtering the histogram comprises correcting the histogram for the sensing element onto which the trajectory is imaged by the objective optics.

17. The method of claim 11, wherein adjusting the histogram comprises identifying edges in the histogram, and applying the identified edges in detecting the relative motion.

18. A method for depth mapping, comprising:

directing a pulsed beam of optical radiation towards a target scene, and receiving signals indicative of respective times of incidence of photons reflected from the target scene on an array of sensing elements in an imaging device;

constructing a histogram of the times of incidence of the photons on the sensing elements in response to the signals from the array, accumulated over a selected exposure time;

capturing an image of the target scene;

processing the captured image to detect relative motion between an object in the target scene and the imaging device;

receiving an indication of movement of the imaging device from an inertial sensor;

upon detecting that the imaging device and the target scene are stationary, increasing the exposure time over which the histogram is accumulated;

upon detecting that the imaging device has moved, filtering the histogram to correct for the movement; and

upon detecting that the object has moved, correcting the histogram for the movement of the object.

19. The method of any of claims 11 to 18, wherein the captured image is a color image.

20. The method of any one of claims 11 to 18, wherein the sensing elements comprise Single Photon Avalanche Diodes (SPADs).

Technical Field

The present invention relates generally to systems and methods for depth mapping, and in particular to depth mapping using time-of-flight sensing.

Background

Existing and emerging consumer applications have created an increasing demand for real-time three-dimensional (3D) imagers. These imaging devices, also referred to as depth sensors or depth mappers, enable the remote measurement of the distance (and often also the intensity) to each point in a target scene, referred to as the target scene depth, by illuminating the target scene with a light beam and analyzing the reflected optical signal. Some systems also capture a color image of the target scene and register the depth map with the color image.

One common technique for determining the distance to each point in the target scene involves transmitting one or more pulsed beams of light toward the target scene and then measuring the round-trip time, i.e., the time of flight (ToF), taken by the light in traveling from the source to the target scene and back to a detector array adjacent to the source.

Some ToF systems use Single Photon Avalanche Diodes (SPADs), also known as Geiger-mode avalanche photodiodes (GAPDs), or arrays of such sensing elements, in measuring photon arrival times. In some systems, a bias control circuit sets the bias voltages of different SPADs in the array to different respective values.

Disclosure of Invention

Embodiments of the present invention described below provide improved depth mapping systems and methods of operation of such systems.

There is thus provided in accordance with an embodiment of the present invention an imaging apparatus including a radiation source configured to emit a pulsed beam of optical radiation toward a target scene. An array of sensing elements is configured to output signals indicative of respective times of incidence of photons on the sensing elements. Objective optics are configured to form a first image of the target scene on the array of sensing elements. An image sensor is configured to capture a second image of the target scene. Processing and control circuitry is configured to process the second image to detect relative motion between at least one object in the target scene and the apparatus, to construct a histogram of the times of incidence of the photons on the sensing elements in response to the signals from the array, to adjust the histogram in response to the detected relative motion, and to generate a depth map of the target scene based on the adjusted histogram.

In some embodiments, the relative motion is a movement of the device, and the processing and control circuitry is configured to filter the histogram to compensate for the movement of the device. In a disclosed embodiment, the apparatus includes an inertial sensor configured to sense movement of the apparatus and output an indication of the movement, wherein the processing and control circuitry is configured to apply the indication output by the inertial sensor in conjunction with processing the second image in detecting the movement of the apparatus.

Additionally or alternatively, the processing and control circuitry is configured to extend the exposure time over which the histogram is accumulated when no relative motion between the target scene and the apparatus is detected.

In further embodiments, the relative motion includes movement of an object in the target scene, and the processing and control circuitry is configured to filter the histogram to compensate for the movement of the object. In disclosed embodiments, the processing and control circuitry is configured to process the second image to extract a trajectory of the object movement and correct the histogram for the sensing element onto which the trajectory is imaged by the objective optics.

Additionally or alternatively, the processing and control circuitry is configured to identify edges in the histogram and apply the identified edges in detecting relative motion.

There is also provided, in accordance with an embodiment of the present invention, an imaging apparatus including a radiation source configured to emit a pulsed beam of optical radiation toward a target scene. An array of sensing elements is configured to output signals indicative of respective times of incidence of photons on the sensing elements. Objective optics are configured to form a first image of the target scene on the array of sensing elements. An image sensor is configured to capture a second image of the target scene. Processing and control circuitry is configured to process the second image to estimate a depth range of at least one object in the target scene, to construct histograms of the times of incidence of the photons on the sensing elements in response to the signals from the array while gating a time range of one or more of the histograms in response to the estimated depth range, and to generate a depth map of the target scene based on the gated histograms.

In some embodiments, the second image is a color image. Additionally or alternatively, the sensing element comprises a Single Photon Avalanche Diode (SPAD).

There is also provided, in accordance with an embodiment of the present invention, a method for imaging, including directing a pulsed beam of optical radiation toward a target scene. The target scene is imaged onto an array of sensing elements in an imaging device. Signals are received from the sensing elements, indicative of respective times of incidence of photons on the sensing elements. An image of the target scene is captured, and the captured image is processed to detect relative motion between at least one object in the target scene and the imaging device. In response to the signals from the array, a histogram of the times of incidence of the photons on the sensing elements is constructed. The histogram is adjusted in response to the detected relative motion, and a depth map of the target scene is generated based on the adjusted histogram.

There is also provided, in accordance with an embodiment of the present invention, a method for depth mapping, including directing a pulsed beam of optical radiation toward a target scene, and receiving signals indicative of respective times of incidence of photons reflected from the target scene on an array of sensing elements in an imaging device. In response to the signals from the array, accumulated over a selected exposure time, a histogram of the times of incidence of the photons on the sensing elements is constructed. An image of the target scene is captured and processed to detect relative motion between an object in the target scene and the imaging device. An indication of movement of the imaging device is received from an inertial sensor. Upon detecting that the imaging device and the target scene are stationary, the exposure time over which the histogram is accumulated is increased. Upon detecting that the imaging device has moved, the histogram is filtered to correct for the movement. Upon detecting that the object has moved, the histogram is corrected for the movement of the object.

The present invention will be more fully understood from the detailed description of embodiments of the invention given below, taken together with the drawings, in which:

Drawings

FIG. 1 is a schematic, pictorial illustration of a depth mapping system, in accordance with an embodiment of the present invention;

FIG. 2 is a schematic side view of the depth mapping system shown in FIG. 1, according to an embodiment of the present invention;

FIG. 3 is a flow diagram schematically illustrating a method for processing ToF information, in accordance with an embodiment of the invention;

FIG. 4A is a schematic top view of a stationary imaging device and a scene containing edges, captured in a depth map generated by the imaging device, according to an embodiment of the present invention;

FIG. 4B is a ToF histogram captured by the imaging device of FIG. 4A at a location of an edge in a scene, according to an embodiment of the invention;

FIG. 5A is a schematic top view of a mobile imaging device and a scene captured in a depth map generated by the imaging device, according to an embodiment of the present invention;

FIG. 5B is a ToF histogram captured by the imaging device of FIG. 5A, according to an embodiment of the invention;

FIG. 6A is a schematic top view of a stationary imaging device and a scene of graduated depth, captured in a depth map generated by the imaging device, in accordance with an embodiment of the present invention;

FIG. 6B is a ToF histogram captured by the imaging device of FIG. 6A, according to an embodiment of the invention;

FIG. 7A is a schematic top view of a mobile imaging device and a scene captured in a depth map generated by the imaging device, according to an embodiment of the present invention; and

FIG. 7B is a ToF histogram captured by the imaging device of FIG. 7A, in accordance with an embodiment of the invention.

Detailed Description

Overview

For depth mapping with fine distance resolution, fine temporal resolution of the ToF measurement is required. To this end, averaging and multi-measurement techniques have been developed, such as time-correlated single-photon counting (TCSPC). In this technique, each measurement cycle starts with a start or sync signal and ends with a stop signal provided by the SPAD when the first photon arrives in that cycle (assuming a photon arrives before the next cycle starts). A histogram of the arrival times is typically constructed over many cycles of this sort and then processed to locate the statistical peak.

These capabilities may be implemented in an array of processing circuits coupled to the array of sensing elements and including a memory that records the times of incidence of photons on each sensing element in each acquisition cycle. To this end, the processing circuitry coupled to each SPAD sensing element may include a respective time-to-digital converter (TDC), which increments, in memory, counts of the times of incidence of photons on the sensing element in a plurality of different time bins. At the end of each frame, the controller processes the histogram of the respective counts stored in each pixel's memory to derive and output a respective arrival-time value for the corresponding sensing element.
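By way of illustration only (this sketch is not part of the patent disclosure), the following Python fragment emulates the TCSPC flow for a single pixel: a TDC quantizes simulated photon arrival times into time bins, the per-pixel memory accumulates the counts, and the statistical peak is then located. The bin width, cycle counts, and photon statistics are illustrative assumptions.

```python
import numpy as np

# Illustrative TCSPC parameters (assumptions, not values from this patent).
NUM_BINS = 256        # time bins per histogram
BIN_WIDTH_PS = 250    # TDC resolution, picoseconds per bin

rng = np.random.default_rng(0)

# Simulated first-photon arrival times for one pixel over many pulse
# cycles: a reflected-signal peak near 20 ns plus uniform background.
signal_ps = rng.normal(20_000, 300, size=8_000)
background_ps = rng.uniform(0, NUM_BINS * BIN_WIDTH_PS, size=2_000)
arrivals_ps = np.concatenate([signal_ps, background_ps])

# TDC + pixel memory: increment the count of the bin that each
# arrival time falls into.
histogram = np.zeros(NUM_BINS, dtype=np.uint32)
bins = (arrivals_ps // BIN_WIDTH_PS).astype(int)
bins = bins[(bins >= 0) & (bins < NUM_BINS)]
np.add.at(histogram, bins, 1)

# Locate the statistical peak and report the corresponding ToF.
peak_bin = int(np.argmax(histogram))
print(f"peak bin {peak_bin}: ToF ~ {(peak_bin + 0.5) * BIN_WIDTH_PS / 1000:.2f} ns")
```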

In this way, a ToF-based depth mapper is able to measure the target depth over a large range of distances under varying ambient light conditions. However, such existing depth mappers suffer from high noise and low resolution. The signal/noise ratio and resolution can be improved by increasing the exposure time used in constructing the ToF histograms, meaning that each histogram is accumulated over a larger number of pulses of the beam or beams illuminating the target scene.

However, increasing the exposure time also increases the susceptibility of the depth measurements to motion artifacts. These artifacts may arise due to various types of relative motion between objects in the target scene and the depth mapping device, including both movement of objects in the scene and movement of the depth mapping device itself. Although some motion artifacts can be inferred and corrected by comparing histograms constructed at different times and locations in the scene, this approach is computationally inefficient and may be unable to distinguish certain types of motion artifacts from features of the scene that produce similar histogram signatures.

Embodiments of the invention described herein utilize auxiliary information, i.e., information from sources other than the ToF sensing elements, in detecting relative motion between objects in a target scene and the depth mapping device. In some embodiments, the auxiliary information is provided by processing an additional image of the scene, such as a color video image, captured by an image sensor associated with the device. The additional image may also provide depth information, for example using pattern-based or stereoscopic depth sensing, which can be indicative of three-dimensional motion. Additionally or alternatively, the auxiliary information may be provided by an inertial sensor in the device, which indicates whether the device has moved and, if so, in what direction. The combined processing of image and inertial signals of this sort is known as "visual-inertial odometry."

Accordingly, an embodiment of the present invention provides an imaging apparatus including a radiation source, which emits a pulsed beam of optical radiation toward a target scene; an array of sensing elements, which output signals indicative of respective times of incidence of photons on the sensing elements; and objective optics, which form an image of the target scene on the array. Processing and control circuitry constructs a histogram of the times of incidence of photons (also referred to as the "arrival times" of the photons) on the sensing elements, based on the signals from the sensing elements.

An additional image sensor captures its own image of the target scene. In some embodiments, the processing and control circuitry processes the latter image to detect relative motion between at least one object in the target scene and the apparatus. The processing and control circuitry adjusts the histogram based on the detected motion (or lack thereof) and generates a depth map of the target scene based on the adjusted histogram. If movement of the device or of an object in the scene is detected in this manner, the processing and control circuitry may filter the histogram to compensate for the movement, and may thus eliminate or at least reduce the corresponding motion artifacts in the depth map. On the other hand, when no relative motion is detected (meaning that both the apparatus and the target scene are stationary), the processing and control circuitry may extend the exposure time over which the histogram is accumulated, thereby enhancing the signal/noise ratio and accuracy of the depth map.
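The following minimal Python sketch (an illustration, not the patented algorithm) captures this per-frame decision: accumulate further when no relative motion is detected, and otherwise apply a compensation filter. The smoothing filter in the usage example is a hypothetical stand-in for whatever motion compensation is actually applied.

```python
import numpy as np

def update_histogram(frame_hist, accumulated_hist, device_moved,
                     object_moved, motion_filter):
    """Per-frame histogram handling as described above: extend the
    effective exposure when everything is static, otherwise filter to
    suppress motion artifacts. `motion_filter` is a placeholder for
    the actual compensation; this is a sketch, not the patent's
    algorithm."""
    if not device_moved and not object_moved:
        # Static scene and device: keep accumulating, improving the
        # signal/noise ratio of the histogram.
        return accumulated_hist + frame_hist
    # Relative motion detected: compensate before depth extraction.
    return motion_filter(frame_hist)

# Usage example with a trivial smoothing filter standing in for the
# real motion compensation:
smooth = lambda h: np.convolve(h, np.ones(3) / 3, mode="same")
hist = np.zeros(256)
hist[80] = 100.0
print(update_histogram(hist, hist, device_moved=True,
                       object_moved=False, motion_filter=smooth)[79:82])
```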

In other embodiments, the processing and control circuitry processes the additional image to estimate a depth range of an object in the target scene, and uses the estimated depth range to gate the time range of the histogram of photon incidence times. The histogram can thus be constructed with higher resolution, while artifacts that fall outside the gated range are ignored. Filters (such as cross-bilateral and guided filters) optimized to produce bounded errors (rather than to minimize average error) may be used in this context to predict the estimated depth of each point in each frame. These estimates can then be used to filter out parts of the histogram in which the true signal cannot be verified, giving a higher detection rate.
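A simple gating step along these lines might look as follows in Python. The conversion of depth range to time bins is standard round-trip ToF arithmetic, but the specific gating rule shown here is an assumption for illustration, not the patent's method.

```python
import numpy as np

C_MM_PER_PS = 0.299792458  # speed of light, mm per picosecond

def gate_histogram(histogram, bin_width_ps, depth_min_mm, depth_max_mm):
    """Zero all bins whose round-trip ToF corresponds to a depth
    outside the range estimated from the auxiliary image (a sketch;
    the patent does not spell out the gating arithmetic)."""
    tof_ps = (np.arange(len(histogram)) + 0.5) * bin_width_ps
    depth_mm = 0.5 * tof_ps * C_MM_PER_PS  # one-way distance per bin
    keep = (depth_mm >= depth_min_mm) & (depth_mm <= depth_max_mm)
    return np.where(keep, histogram, 0)
```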

Description of the System

FIG. 1 is a schematic, pictorial illustration of a depth mapping system 20, in accordance with an embodiment of the present invention. In the depicted scene, the imaging device 22 generates a depth map of the target scene 24 within the field of view 26 of the device. In this example, the target scene 24 contains moving objects, such as a human figure 28, as well as stationary objects, including a chair 30, wall 32, picture 34, window 36, and carpet 38. Although the imaging device 22 is illustrated in FIG. 1 as a desktop unit, the imaging device may alternatively be a mobile device or a handheld device, and thus may itself be moved during the acquisition of the depth map.

The imaging device 22 measures depth values by directing a beam of optical radiation toward a point in the target scene 24 and measuring the arrival time of photons reflected from each point. For convenience, the front plane of the device 22 is taken as the X-Y plane, and the depth coordinates of points in the target scene are measured along the Z axis. Thus, the depth map generated by imaging device 22 represents target scene 24 as a grid of points in the X-Y plane, with depth coordinates indicating the measured distance to each point.

Fig. 2 is a schematic side view of system 20 showing details of imaging device 22, according to an embodiment of the present invention. For the sake of brevity and clarity, these details are shown by way of example to aid in understanding the principles of operation of the invention in generating depth maps, and in particular the use of auxiliary motion-related information in generating such depth maps. Alternatively, these principles may be applied to other types of systems with suitable depth mapping and imaging capabilities.

The imaging device 22 includes a radiation source 40, which emits multiple pulsed beams 42 of optical radiation toward the target scene 24. The term "optical radiation" is used in the present specification and claims to refer to any electromagnetic radiation in the visible, infrared, and ultraviolet ranges, and is used interchangeably with the term "light" in this context. In this example, the radiation source 40 includes a two-dimensional array 44 of Vertical Cavity Surface Emitting Lasers (VCSELs), which are driven to emit sequences of short pulses of optical radiation. A Diffractive Optical Element (DOE) 46 may optionally be used to replicate the beams emitted by the VCSELs in array 44, so as to output a larger number of beams 42 (for example, about 500 beams) at different, respective angles relative to the radiation source 40. A collimating lens 48 projects the beams 42 toward the target scene 24.

A receiver 50 (also referred to as a "depth camera") in the imaging device includes a two-dimensional array 52 of sensing elements, such as SPADs or Avalanche Photodiodes (APDs), that output signals indicative of respective times of incidence of photons on the sensing elements. Objective optics 54 form an image of target scene 24 on array 52. The processing unit 56 is coupled to groups of mutually adjacent sensing elements (which are referred to herein as "super-pixels") and processes the signals from the sensing elements in each super-pixel together to generate a measure of the arrival time of a photon on a sensing element in the group after each pulse of the beam 42. For clarity of explanation, processing unit 56 is illustrated in FIG. 2 as being separate from array 52, but in some implementations, the processing unit and array are integrated in a single integrated circuit device. Alternatively, each sensing element may have its own dedicated processing unit.

The processing unit 56 includes hardware amplification and logic circuitry that senses and records the pulses output by the SPAD (or other sensing element). Thus, the processing unit 56 measures the arrival time of the photons that produced the pulse output by the SPAD, and possibly the intensity of the reflected laser pulse incident on the array 52. Processing unit 56 may include, for example, a time-to-digital converter (TDC) and digital circuitry for constructing a histogram of the arrival times of photons incident on a respective sensing element (or superpixel group of sensing elements) over a plurality of pulses emitted by the VCSELs in array 44. The processing unit 56 thus outputs a value indicative of the distance from the corresponding point in the scene 24, and may also output an indication of the signal strength.

Alternatively or in addition, some or all of the components of the processing unit 56 may be separate from the array 52 and may be integrated with the control processor 58, for example. For the sake of generality, the control processor 58 and the processing unit 56 are collectively referred to herein as "processing and control circuitry".

Based on the histograms constructed by the processing unit 56, the control processor 58 calculates the time of flight of the photons in each beam 42, and thus generates a depth map comprising depth coordinates that correspond to the distances to the corresponding points in the target scene 24. This mapping is based on the timing of the emission of the beams 42 by the radiation source 40 and the arrival times (i.e., the times of incidence of the reflected photons) measured by the processing unit 56. The control processor 58 stores the depth coordinates in a memory 60 and may output the corresponding depth map for display and/or further processing.
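The underlying conversion is the standard round-trip relation z = c·t/2, illustrated by this short Python helper (an illustration only; the constant is the exact speed of light, and the function name is ours, not the patent's):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_to_depth_m(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time of flight to a one-way
    depth coordinate: z = c * t / 2."""
    return 0.5 * SPEED_OF_LIGHT_M_PER_S * round_trip_seconds

# A photon returning 20 ns after the pulse was emitted corresponds to
# a point roughly 3 m from the device:
print(tof_to_depth_m(20e-9))  # ~2.998 m
```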

In addition to the depth sensing functionality described above, the imaging device 22 also includes a two-dimensional imaging camera 62. The camera 62 in this example includes an image sensor 64, such as an RGB color sensor, as is known in the art. An imaging lens 66 forms an image of the target scene 24 on the image sensor 64, which thus outputs an electronic image of the target scene. Because the camera 62 is mounted in a fixed spatial and optical relationship with respect to the receiver 50, the electronic image output by the camera 62 will generally be aligned with the image formed by the objective optics 54 on the array 52. Control processor 58 receives the image data output by the camera 62, and uses the image data in detecting relative motion between objects in the target scene 24 and the imaging device 22, and in adjusting the histograms constructed by the processing unit 56 in response to the detected relative motion, as described further below.

In the illustrated embodiment, the imaging device 22 also includes an inertial sensor 68, such as the kind of solid-state accelerometer found in most smart phones and other kinds of mobile devices. The inertial sensor 68 senses and outputs an indication of movement of the imaging device 22, as is well known in the art. The control processor 58 applies this indication, typically in conjunction with image data provided by the camera 62, in adjusting the histogram constructed by the processing unit to compensate for movement of the imaging device. By processing the outputs of the inertial sensors in conjunction with the images output by camera 62, as explained further below, control processor 58 is able to more accurately model the effect of movement of the imaging device on the depth map and to distinguish between the effect of movement of imaging device 22 and the effect of movement of objects in the target scene.

Control processor 58 typically includes a programmable processor programmed in software and/or firmware to perform the functions described herein. Alternatively or additionally, control processor 58 includes hardwired and/or programmable hardware logic circuits that perform at least some of these functions. Although control processor 58 is illustrated in FIG. 2 as a single monolithic functional block for simplicity, in implementation the control processor may comprise a single chip or a set of two or more chips, with suitable interfaces for receiving and outputting the signals shown in the figures and described herein.

Method of Operation

Fig. 3 is a flow diagram schematically illustrating a method for processing ToF information, in accordance with an embodiment of the present invention. For clarity and brevity, the method is described herein with reference to elements of system 20 and imaging device 22, as shown in fig. 1 and 2. Alternatively, the principles of the present method may be applied to other kinds of ToF-based depth mapping systems, mutatis mutandis. Although the steps of the method of fig. 3 are shown in a certain order, in practice, these steps may be performed in parallel. Additionally or alternatively, one or more of these steps may be omitted, and the remaining steps may be performed separately or in various different sub-combinations. All such alternative implementations are considered to be within the scope of the present invention.

The control processor 58 initiates the method of FIG. 3 periodically, for example after each frame of histogram data acquired by the receiver (depth camera) 50. The control processor examines the output of the inertial sensor 68 and processes the image output by the camera 62 to determine whether there has been any relative motion between the imaging device 22 and the objects in the scene 24 during the current frame. For example, the control processor 58 may compare the position of an object in the current frame to the position of the same object in the previous frame to determine whether its position has changed; if so, it may compare the object motion to the camera motion indicated by the inertial sensor 68 to determine whether the change in object position is due to motion of the object or motion of the camera. Alternatively, this step may be based solely on processing of the image from camera 62 and finding no change from the image captured in the previous frame.
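Schematically, this attribution logic reduces to a comparison of the following sort (a Python sketch with an illustrative pixel threshold, not the patent's decision rule):

```python
def classify_motion(object_displacement_px: float, imu_moved: bool,
                    threshold_px: float = 2.0) -> str:
    """Attribute an object's frame-to-frame displacement, as described
    above. The pixel threshold is an illustrative assumption.
    Returns 'static', 'device_motion', or 'object_motion'."""
    if abs(object_displacement_px) < threshold_px:
        return "static"
    # Displacement accompanied by inertial-sensor movement is most
    # simply explained by motion of the imaging device itself.
    return "device_motion" if imu_moved else "object_motion"

print(classify_motion(5.0, imu_moved=False))  # -> object_motion
```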

When no motion at all is detected in this step, control processor 58 may decide to extend the exposure time over which the histogram is accumulated. For example, control processor 58 may instruct processing unit 56 to continue accumulating photon arrival times over one or more additional frames and to construct the corresponding histograms. Alternatively, control processor 58 may read out and sum successive histograms in memory 60. In either case, the signal/noise ratio of the resulting histogram will generally increase with the square root of the exposure time, thereby increasing the accuracy of the depth coordinates that can be extracted from the histogram.
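The square-root behavior can be demonstrated numerically. The following sketch (illustrative Poisson statistics, not measured data) sums simulated per-frame histograms and reports the growing SNR of the signal bin, which scales roughly as the square root of the number of accumulated frames:

```python
import numpy as np

rng = np.random.default_rng(1)
NUM_BINS, SIGNAL_BIN = 64, 20

def one_frame():
    # Poisson background of ~5 counts/bin plus ~20 signal counts
    # (illustrative assumptions).
    h = rng.poisson(5.0, NUM_BINS)
    h[SIGNAL_BIN] += rng.poisson(20.0)
    return h

for n in (1, 4, 16):
    acc = sum(one_frame() for _ in range(n))
    bg = np.delete(acc, SIGNAL_BIN)
    snr = (acc[SIGNAL_BIN] - bg.mean()) / bg.std()
    print(f"{n:2d} frames: SNR ~ {snr:.1f}")
```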

Otherwise, the control processor 58 checks whether the motion detected in the image output by the camera 62 is due to motion of the imaging device 22 or to motion in the scene 24. In this regard, the signal from the inertial sensor 68 provides a reliable indicator. Additionally or alternatively, control processor 58 may use image processing techniques known in the art to calculate the optical flow field across the image, or a sequence of images, of scene 24 that it receives from camera 62. A consistent translational and/or rotational flow over the entire field will generally indicate movement of the imaging device 22, whereas localized flow will delineate the object or objects that are moving. The effect of such movements on the ToF histograms is shown in the figures that follow.
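As one possible realization (the patent does not name a flow algorithm; Farneback dense optical flow and the thresholds below are illustrative assumptions), the global-versus-local distinction might be drawn as follows:

```python
import cv2
import numpy as np

def classify_flow(prev_gray, curr_gray,
                  mag_thresh=1.0, uniformity_thresh=0.5):
    """Distinguish device motion from object motion using a dense
    optical-flow field computed between two 8-bit grayscale frames.
    A sketch under stated assumptions, not the patented method."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    if mag.mean() < mag_thresh:
        return "static"
    # A consistent translational flow has low spread relative to its
    # mean across the whole frame; localized flow concentrates on the
    # moving object(s).
    if mag.std() / (mag.mean() + 1e-9) < uniformity_thresh:
        return "device_motion"
    return "object_motion"
```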

Upon finding that imaging device 22 has moved relative to scene 24, control processor 58 may filter the histograms constructed by processing unit 56 to compensate for this movement. For example, a sudden movement in the Z direction (toward or away from the scene 24, in the coordinate system defined in FIGS. 1 and 2) may produce a histogram with multiple peaks at different times of flight, while a gradual movement will result in a broadening of the histogram peaks. (These effects are shown in FIGS. 5A/B and 7A/B.) In such cases, control processor 58 may be able to filter out the invalid data, or to merge the multiple or broadened peaks, in order to give the correct depth values.
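As an illustration of the multi-peak case, the following sketch folds a spurious second peak back onto the primary peak, given a peak displacement derived from the inertial data. The folding rule and window are assumptions made for illustration (and the sketch assumes the displacement exceeds the window), not the patented filter:

```python
import numpy as np

def fold_double_peak(histogram, displacement_bins, window=3):
    """Fold the spurious second peak produced by a sudden device move
    along Z back onto the primary peak. `displacement_bins` would be
    derived from the inertial data (illustrative assumption)."""
    corrected = histogram.astype(np.int64)
    primary = int(np.argmax(corrected))
    secondary = primary + displacement_bins
    lo = max(secondary - window, 0)
    hi = min(secondary + window + 1, len(corrected))
    # Move the counts under the secondary peak onto the primary one.
    corrected[primary] += corrected[lo:hi].sum()
    corrected[lo:hi] = 0
    return corrected
```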

On the other hand, when control processor 58 identifies an object that has moved in target scene 24, it may filter the histograms to compensate for the movement of the object. For example, the control processor may process the image output by the camera 62, or possibly a sequence of such images, to extract the trajectory of the movement of the object, and then correct the histograms for those sensing elements in the array 52 onto which the trajectory is imaged by the objective optics. For example, movement of the object in the Z direction (toward or away from the imaging device 22) will produce a sequence of histogram peaks at different depths in successive frames. Movement of the object laterally, i.e., in the X-Y plane (parallel to the plane of the image sensor 64, as shown in FIG. 2), will produce a histogram peak at a certain depth coordinate that shifts across the depth map in successive frames. Once control processor 58 has identified a moving object, it may filter out or merge the peaks at the different Z or X-Y coordinates in order to correct the sequence of frames under consideration. The decision as to whether to filter or merge the histogram data depends on the application requirements and the quality of the data.
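A minimal sketch of the trajectory-based correction might merge the per-frame histograms gathered along the object's image-plane path. The data layout assumed here (a per-frame array of per-pixel histograms, plus a per-frame pixel trajectory) is an illustrative assumption, not the patent's data structure:

```python
import numpy as np

def merge_along_trajectory(histograms, trajectory):
    """Re-associate per-frame histograms along a moving object's
    image-plane trajectory, merging the counts gathered at the pixel
    the object occupies in each frame. Assumes histograms[t] is an
    (H, W, NUM_BINS) array for frame t and trajectory[t] gives the
    object's (row, col) in that frame (hypothetical layout)."""
    row0, col0 = trajectory[0]
    merged = np.zeros_like(histograms[0][row0, col0])
    for t, (row, col) in enumerate(trajectory):
        merged += histograms[t][row, col]
    return merged
```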

Reference is now made to FIGS. 4A and 4B, which schematically illustrate the effect of an edge 72 in a scene 70 mapped by the imaging device 22, and the manner in which this effect is processed, in accordance with an embodiment of the present invention. FIG. 4A is a schematic top view of the imaging device 22 and the scene 70, while FIG. 4B is a ToF histogram captured by the imaging device 22 at the location of the edge 72 in the scene 70. The edge 72 appears as a double peak in this histogram, due to the finite width of the pixels in the depth map generated by the imaging device 22. (The pixel includes photons reflected from both sides of the edge 72; in FIG. 4A, photons reflected from the surface on the upper side of the edge produce one peak at a shorter distance, while those reflected from the surface on the lower side produce another peak at a longer distance.) Control processor 58 may identify such edges in the histograms, along with the corresponding edges in the images output by the camera 62, and use them in detecting and correcting for relative motion between objects in the scene and the imaging device 22.

FIGS. 5A and 5B schematically illustrate the effect of a sudden motion of the imaging device 22 relative to a scene 74 captured in a depth map generated by the imaging device, according to an embodiment of the present invention. FIG. 5A is a schematic top view illustrating the movement of imaging device 22 between positions 76 and 78. FIG. 5B is a ToF histogram captured by the imaging device, in which the two peaks correspond to the two positions of the imaging device. Although the histogram in FIG. 5B is very similar to the histogram in FIG. 4B, control processor 58 is able to distinguish between the two cases based on the edges appearing in the image captured by the camera 62 and on the signal output by the inertial sensor 68. For example, if an edge appears in the image captured by the camera 62, but the inertial sensor 68 does not detect any movement, control processor 58 may infer that the double peak in the histogram is due to an actual edge in the depth map. On the other hand, when the inertial sensor detects movement of the imaging device and/or no corresponding edge is found in the image, the control processor will conclude that the double peak is spurious. Control processor 58 will then correct the histogram of FIG. 5B accordingly, for example by eliminating one of the two peaks or by merging the two peaks into a single peak, adjusted for the movement from position 76 to position 78.
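The disambiguation described in this paragraph reduces to a small decision table, sketched here in Python for illustration only (the labels and function are ours, not the patent's):

```python
def classify_double_peak(edge_in_camera_image: bool,
                         imu_moved: bool) -> str:
    """Disambiguate a double-peaked ToF histogram as described above:
    a matching edge in the camera image with no inertial movement
    indicates a real depth edge; otherwise the second peak is treated
    as a motion artifact to be corrected."""
    if edge_in_camera_image and not imu_moved:
        return "real_edge"        # keep both peaks as a depth edge
    return "motion_artifact"      # merge or eliminate one peak

print(classify_double_peak(edge_in_camera_image=True, imu_moved=False))
```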

FIGS. 6A/B and 7A/B show a similar kind of comparison, according to another embodiment of the present invention. FIG. 6A is a schematic top view of the imaging device 22 in the process of generating a depth map of a scene 80 of graduated depth, while FIG. 6B is a ToF histogram captured by the imaging device 22 in this situation. In this case the imaging device 22 is stationary, but the peak in the histogram of FIG. 6B is broadened due to the range of depth values covered by the corresponding pixel in the depth map.

In the schematic top view of FIG. 7A, on the other hand, the imaging device 22 moves gradually in the Z direction over a range 82, thereby producing a similarly broadened peak, as shown in the ToF histogram of FIG. 7B. Again, control processor 58 can detect the motion of the imaging device 22 based on the output of the inertial sensor 68 and/or analysis of the images captured by the camera 62, and can therefore adjust and correct the histogram, for example by narrowing the broadened peak to compensate for the detected motion, and thus calculate more accurate depth values.

Although the embodiments described above relate to a particular physical configuration of device 22, the principles of the present invention may similarly be applied to other kinds of ToF-based depth mapping devices, and the techniques described above may be applied in such devices to improve the accuracy of the resulting depth maps. It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
