Method and apparatus for compensating for light reflection from a cover of a time-of-flight camera

Document No.: 1041587. Publication date: 2020-10-09.

Description: the present technique, "Method and apparatus for compensating for light reflection from a cover of a time-of-flight camera", was created by A. Schoenlieb, D. Lugitsch and H. Plank on 2020-03-27. Abstract: A method is provided for compensating for light reflections from a cover of a time-of-flight camera in an image of a scene sensed by the time-of-flight camera. The method receives an image of the scene from the time-of-flight camera. Further, the method includes modifying the image of the scene using a reference image to obtain a compensated image of the scene. The pixels of the reference image indicate reference values that are related only to light reflections from the cover of the time-of-flight camera. Additionally, the method includes outputting the compensated image.

1. A method (100) for compensating for light reflections in an image of a scene sensed by a time-of-flight camera, the light reflections coming from a cover of the time-of-flight camera, the method (100) comprising:

receiving (102) an image of the scene from the time-of-flight camera;

modifying (104) the image of the scene using a reference image to obtain a compensated image of the scene, wherein pixels of the reference image indicate reference values relating only to light reflections from the cover of the time-of-flight camera; and

outputting (106) the compensated image.

2. The method of claim 1, wherein an illumination element of the time-of-flight camera for illuminating the scene and a light capture element of the time-of-flight camera for receiving reflected light from the scene are arranged in a common cavity covered by the cover.

3. The method of claim 1 or 2, wherein the image of the scene is one of a raw image, an image derived from the raw image, an intensity image, an image derived from the intensity image, a depth image, or an image derived from the depth image.

4. The method of any of claims 1-3, wherein the reference image is a reference raw image.

5. The method of claim 4, wherein the time-of-flight camera illuminates the scene with a continuous wave modulation signal and generates a measurement signal based on reflected light from the scene, wherein the image of the scene is a raw image based on a correlation between the continuous wave modulation signal and the measurement signal according to a correlation function, and wherein the reference raw image is selected from a plurality of reference raw images based on a phase shift between the continuous wave modulation signal and the measurement signal used for the correlation.

6. The method of claim 4 or 5, wherein the time-of-flight camera illuminates the scene using a continuous wave modulation signal, and wherein the reference raw image is selected from a plurality of reference raw images based on a frequency of the continuous wave modulation signal.

7. The method of any of claims 4 to 6, wherein the reference raw image is based on a factory calibration.

8. The method according to any one of claims 4 to 6, wherein the method (100) further comprises: controlling the time-of-flight camera to illuminate with a coded modulation signal such that a measurement range of the time-of-flight camera ends immediately after the cover, wherein a raw image captured by the time-of-flight camera based on the coded modulation signal is used as the reference raw image.

9. The method of any preceding claim, further comprising:

receiving a depth image from the time-of-flight camera;

determining a closest reflecting object rendered in the depth image;

determining an exposure time of the time-of-flight camera such that a measurement range of the time-of-flight camera ends before the closest reflective object; and

controlling the time-of-flight camera to capture a secondary raw image using the exposure time, wherein the secondary raw image is used as the reference image.

10. The method of any preceding claim, further comprising:

receiving a depth image from the time-of-flight camera;

determining whether any reflective object at a greater distance than the cover is rendered in the depth image; and

controlling the time-of-flight camera to capture a secondary raw image if no reflective object at a greater distance than the cover is rendered in the depth image, wherein the secondary raw image is used as the reference image.

11. The method of any of claims 1-3, wherein the reference image is a reference depth image.

12. The method of claim 11, further comprising:

receiving a depth image from the time-of-flight camera; and

determining whether any reflective object at a greater distance than the cover is rendered in the depth image, wherein the depth image is used as the reference depth image if no reflective object at a greater distance than the cover is rendered in the depth image.

13. The method of any of claims 1 to 3, wherein the reference image is a reference intensity image.

14. The method of claim 13, wherein modifying (104) the image of the scene using the reference intensity image comprises:

scaling the reference values indicated by pixels of the reference intensity image by a scaling factor to obtain a scaled reference intensity image; and

modifying an image of the scene using the scaled reference intensity image to obtain the compensated image.

15. The method of any of claims 1-14, wherein the exposure times of the image of the scene and the reference image are equal.

16. The method of any of claims 1-14, wherein modifying (104) the image of the scene using the reference image if the exposure times of the image of the scene and the reference image are different comprises:

scaling the reference values indicated by pixels of the reference image by a scaling factor to obtain a scaled reference image, wherein the scaling factor is based on a ratio of exposure times of an image of the scene and the reference image; and

modifying an image of the scene using the scaled reference image to obtain the compensated image.

17. An apparatus (700) for compensating for light reflections from a cover of a time-of-flight camera in an image of a scene sensed by the time-of-flight camera, the apparatus (700) comprising:

an input circuit (710) configured to receive an image of the scene from the time-of-flight camera;

processing circuitry (720) configured to modify the image of the scene using a reference image to obtain a compensated image of the scene, wherein pixels of the reference image indicate reference values relating only to light reflections from the cover of the time-of-flight camera; and

an output circuit (730) configured to output the compensated image.

18. An apparatus for compensating for light reflections in an image of a scene sensed by a time-of-flight camera, the light reflections coming from a cover of the time-of-flight camera, the apparatus comprising:

means for receiving an image of the scene from the time-of-flight camera;

means for modifying the image of the scene using a reference image to obtain a compensated image of the scene, wherein pixels of the reference image indicate reference values that are related only to light reflections from the cover of the time-of-flight camera; and

means for outputting the compensated image.

19. A method (600) for compensating for light reflections from a cover of a time-of-flight camera, wherein a light capturing element of the time-of-flight camera is covered by the cover and comprises an array of photonic mixing devices, wherein each photonic mixing device separates charges generated by light reaching the photonic mixing device so as to provide two charge values for the respective photonic mixing device, the method (600) comprising:

receiving (602) a charge value of the photonic mixing device;

modifying (604) the charge value using a reference value related only to light reflections from the cover of the time-of-flight camera to obtain a compensated charge value; and

outputting (606) the compensated charge value.

20. An apparatus (700) for compensating for light reflections from a cover of a time-of-flight camera, wherein a light capturing element of the time-of-flight camera is covered by the cover and comprises an array of photonic mixing devices, wherein each photonic mixing device separates charges generated by light reaching the photonic mixing device so as to provide two charge values for the respective photonic mixing device, the apparatus (700) comprising:

an input circuit (710) configured to receive a charge value of the photonic mixing device;

a processing circuit (720) configured to modify the charge value using a reference value related only to light reflections from the cover of the time-of-flight camera to obtain a compensated charge value; and

an output circuit (730) configured to output the compensated charge value.

21. An apparatus for compensating for light reflections from a cover of a time-of-flight camera, wherein a light capturing element of the time-of-flight camera is covered by the cover and comprises an array of photonic mixing devices, wherein each photonic mixing device separates charges generated by light reaching the photonic mixing device to provide two charge values for the respective photonic mixing device, the apparatus comprising:

means for receiving a charge value of the photonic mixing device;

means for modifying the charge value using a reference value related only to light reflections from the cover of the time-of-flight camera to obtain a compensated charge value; and

means for outputting the compensated charge value.

22. A non-transitory machine-readable medium having stored thereon a program with program code for performing the method of any one of claims 1 to 16 or 19, when the program is executed on a processor or programmable hardware.

23. A program having program code for performing the method of any one of claims 1 to 16 or 19, when the program is executed on a processor or programmable hardware.

Technical Field

The present disclosure relates to error correction for time-of-flight (ToF) sensing. In particular, examples relate to methods and devices for compensating for light reflections from the cover of a ToF camera.

Background

ToF cameras are typically covered by a cover glass to protect the imaging and illumination elements from the surrounding environment. However, the light emitted by the illumination element is partially reflected by the cover glass. These reflections cause erroneous measurements, since the light reflected by the cover glass can mix undesirably with the light reflected by the scene sensed by the ToF camera.

Disclosure of Invention

Therefore, there is a need to compensate for light reflections from the cover of the ToF camera.

This need may be met by the subject matter of the appended claims.

One example relates to a method for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera. The method includes receiving an image of a scene from a ToF camera. Further, the method includes modifying an image of the scene using the reference image to obtain a compensated image of the scene. The pixels of the reference image indicate reference values that are related only to light reflections from the cover of the ToF camera. Additionally, the method includes outputting the compensated image.

Another example relates to yet another method for compensating for light reflections from the cover of a ToF camera. The light capturing element of the ToF camera is covered by a cover and comprises an array of photonic mixing devices. Each photonic mixing device separates charges generated by light reaching the photonic mixing device to provide two charge values for the respective photonic mixing device. The method includes receiving a charge value of a photonic mixing device. Furthermore, the method comprises modifying the charge value using a reference value related only to light reflections from a cover of the ToF camera to obtain a compensated charge value. The method also includes outputting a compensated charge value.

Drawings

Some examples of apparatus and/or methods will now be described, by way of example only, with reference to the accompanying drawings, in which:

fig. 1 shows a flow chart of an example of a method for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera;

fig. 2 shows an example of a ToF camera;

fig. 3 shows an exemplary comparison between two intensity images of a ToF camera;

FIG. 4 shows an exemplary comparison between two depth images of a ToF camera;

FIG. 5 shows an example of measurement settings for determining a reference image;

FIG. 6 shows a flow chart of an example of another method for compensating for light reflections from the cover of a ToF camera; and

fig. 7 shows an example of an apparatus for compensating for light reflection from the cover of a ToF camera.

Detailed Description

Various examples will now be described more fully with reference to the accompanying drawings, in which some examples are shown. In the drawings, the thickness of lines, layers and/or regions may be exaggerated for clarity.

Thus, while further examples are capable of various modifications and alternative forms, specific examples thereof are shown in the drawings and will be described below in detail. However, such detailed description does not limit further examples to the particular forms described. Further examples may cover all modifications, equivalents, and alternatives falling within the scope of the disclosure. The same or similar reference numerals denote the same or similar elements throughout the description of the drawings, and these elements may be embodied in the same or modified forms as compared with each other while providing the same or similar functions.

It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled through one or more intervening elements. If an "or" is used to combine two elements, a and B, this is to be understood as disclosing all possible combinations, i.e. only a, only B and a and B, if not explicitly or implicitly otherwise. An alternative wording of the same combination is "at least one of a and B" or "a and/or B". The same applies, with variations in detail, to combinations of more than two elements.

The terminology used herein to describe particular examples is not intended to be limiting of further examples. Whenever singular forms such as "a," "an," and "the" are used, and the use of only a single element is neither explicitly nor implicitly defined as being mandatory, further examples may also use multiple elements to implement the same functionality. Similarly, when functionality is subsequently described as being implemented using multiple elements, further examples may use a single element or processing entity to implement the same functionality. It will be further understood that the terms "comprises" and/or "comprising," when used, specify the presence of stated features, integers, steps, operations, processes, actions, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, processes, actions, elements, components, and/or groups thereof.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood in the art to which the examples belong.

Fig. 1 shows a flow diagram of a method 100 for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera. Before describing the method 100 in detail, the following paragraphs introduce some basics of ToF depth measurement with reference to fig. 2.

ToF camera 200 includes an illumination element 210 for illuminating a scene with modulated light 211 (e.g., infrared light). The illumination element 210 generates the modulated light 211 based on an (electrical) modulated radio-frequency signal, such as a continuous-wave (CW) modulation signal, for example by controlling one or more light-emitting diodes (LEDs) or one or more laser diodes based on the modulation signal. An object 230 in the scene illuminated by the modulated light 211 reflects at least a portion of the modulated light 211 back to the light capturing element 220 (e.g., comprising optics, an image sensor, and driver electronics) of the ToF camera 200. In other words, the light capturing element 220 receives reflected light 231 from the object 230. For example, if ToF camera 200 is used for secure facial recognition, object 230 may be a human face.

The image sensor of the light capturing element 220 is pixelated, and each pixel measures a small fraction of the reflected light 231. Accordingly, an (electrical) measurement signal based on the reflected light 231 from the scene is generated. For example, each pixel may include a photonic mixing device (PMD) for measuring the reflected light 231.

Depending on the distance d_obj between the ToF camera 200 and the object 230, i.e., depending on the depth, the reflected light 231 exhibits a delay with respect to the emission of the modulated light 211. The measurement signal therefore experiences a distance-dependent (depth-dependent) phase shift with respect to the modulated radio-frequency signal.

According to an (auto-)correlation function c(φ), the modulation signal and the measurement signal are correlated for each pixel to obtain a correlation value L for each pixel. The correlation function c(φ) models a phase-distance function describing the relation between the distance d_obj measured by each pixel of the ToF camera 200 and the phase value.

The output of the correlation function c(φ) is the correlation value L for each pixel. The determined correlation values L are then combined into a raw image (sometimes also referred to as a "phase image"). That is, the raw image comprises a plurality of pixels, each of which represents a corresponding correlation value L.

To sample the correlation function c(φ), a plurality of raw images is generated. The phase shift between the modulation signal used for the correlation and the measurement signal is varied between the individual raw images. In other words, different phase shifts are used for correlating the modulation signal and the measurement signal in order to obtain the respective raw images.

Sampling the same object at the same distance and with the same reflectivity enables the correlation function c(φ) to be sampled. For example, phase shifts of 0°, 90°, 180° and 270° may be used to generate four raw images, each of which comprises a plurality of pixels representing the corresponding correlation values L_0°, L_90°, L_180° and L_270°.

For a correlation value L_φ, the correlation function c(φ) is shifted, with respect to the zero value of its function argument, by the distance-dependent phase shift φ_d between the measurement signal and the modulation signal of the pixel.

Using the four correlation values L_0°, L_90°, L_180° and L_270° obtained by sampling the correlation function c(φ), the phase shift (phase angle) φ_d can be determined as follows:

    φ_d = arctan( (L_270° − L_90°) / (L_0° − L_180°) )

Taking into account the speed of light c and the modulation frequency f_p of the emitted light 211 (i.e., the modulation frequency of the modulation signal), the distance d_obj to the object 230, i.e., the depth, can be calculated as follows:

    d_obj = c · φ_d / (4π · f_p)
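The four-phase depth computation above can be sketched as follows (a minimal sketch in Python; the function name and the use of atan2 to resolve the full phase range are implementation choices, not part of the original disclosure):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # c, in m/s

def phase_to_depth(l_0, l_90, l_180, l_270, f_mod):
    """Estimate the depth of one pixel from its four correlation values.

    l_0 .. l_270: correlation values sampled at phase shifts of
    0 deg, 90 deg, 180 deg and 270 deg. f_mod: modulation frequency f_p in Hz.
    """
    # atan2 resolves the phase angle over the full 0..2*pi range.
    phi = math.atan2(l_270 - l_90, l_0 - l_180)
    if phi < 0.0:
        phi += 2.0 * math.pi
    # d_obj = c * phi / (4 * pi * f_p); one full phase cycle
    # corresponds to half the modulation wavelength.
    return SPEED_OF_LIGHT * phi / (4.0 * math.pi * f_mod)
```

At f_p = 20 MHz, for example, the unambiguous measurement range of this computation is c/(2·f_p) ≈ 7.5 m.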

The illumination element 210 and the light capturing element 220 are arranged (housed) in a common cavity 250 covered by a cover 240 to protect the illumination element 210 and the light capturing element 220 from the surrounding environment (e.g., dust or moisture). For example, the cover 240 may be made of glass, plastic, or any other suitable material. For example, the cover 240 may be a glass cover of a mobile phone or an automotive ToF system. In some examples, the cover 240 may be an OLED (organic light-emitting diode) display. It should be noted that the cover 240 may be any element that is capable of protecting the illumination element 210 and the light capturing element 220 from the surrounding environment and that is partially transparent to the modulated light 211 emitted by the illumination element 210 and the reflected light 231 to be received by the light capturing element 220.

As shown in fig. 2, the modulated light 211 emitted by the illumination element 210 is partially reflected by the cover 240. Thus, in addition to the desired reflected light 231 from the object 230, the light capturing element 220 also receives undesired reflected light 241 from the cover 240. The unwanted reflected light 241 from the cover 240 mixes with the wanted reflected light 231 from the object 230, resulting in erroneous measurements by the ToF camera. This is illustrated in figs. 3 and 4 for another scene sensed by the ToF camera.

Fig. 3 shows a comparison between two intensity images captured by ToF camera 200. The upper intensity image 310 was captured by ToF camera 200 without the cover 240. In other words, the cover 240 was removed while ToF camera 200 captured image 310. For comparison, the lower intensity image 320 was captured by ToF camera 200 with the cover 240 installed.

Unwanted reflected light 241 from the cover 240 mixes with wanted reflected light 231 from the sensing scene. As can be seen from the intensity images 310 and 320, the contrast of the intensity images deteriorates due to the unwanted reflected light 241 from the cover 240.

Fig. 4 shows two depth images 410 and 420 of ToF camera 200 corresponding to the intensity images 310 and 320 shown in fig. 3. The upper depth image 410 was captured by ToF camera 200 without the cover 240, while the lower depth image 420 was captured by ToF camera 200 with the cover 240 installed. As can be seen from depth images 410 and 420, the unwanted reflected light 241 from the cover 240 affects the depth measurement of ToF camera 200.

Returning to fig. 1, the method 100 may allow for compensation of light reflections from the cover of the ToF camera in the image of the scene sensed by the ToF camera. Method 100 includes receiving (102) an image of the scene from the ToF camera. The image of the scene may be any type of image that the ToF camera can provide, such as a raw image, an image derived from a raw image, an intensity image, an image derived from an intensity image, a depth image, or an image derived from a depth image. For example, an image derived from a raw image may be a phase-angle image comprising pixels that represent phase shifts (phase angles) derived from one or more correlation values of the respective pixels of the light capturing element. In other examples, an image derived from one of the images listed above may be an error-corrected image based on the respective image and one or more error corrections for the ToF camera. In some examples, an image derived from one of the images listed above may be an image based on a combination of the respective image and another image (e.g., a combination of two depth images captured using modulation signals of different frequencies).

Further, the method 100 includes modifying an image of the scene using the reference image to obtain a compensated image of the scene (104). The pixels of the reference image indicate (represent) reference values that are only related to the light reflection from the cover of the ToF camera. In other words, the pixels of the reference image indicate values that are (substantially) caused only by light reflections from the cover of the ToF camera and not by light reflections from any other object, such as the surroundings of the ToF camera. For example, the reference image may be a reference original image, a reference depth image, or a reference intensity image. For example, the pixels of the reference depth image only indicate depth values related/caused by light reflections from the cover of the ToF camera. The reference image thus characterizes the effect of the light reflection from the cover on the image taken by the ToF camera. The reference image thus allows compensation of the effect of light reflection from the cover in the image of the scene.

When modifying (104) the image of the scene, the values indicated by the pixels of the image of the scene are modified by the reference values indicated by the pixels of the reference image. For example, the image of the scene may be modified pixel by pixel using the reference image. In other words, the value indicated by a pixel of the image of the scene is modified by the reference value indicated by the pixel located at the same pixel position in the reference image. For example, the reference value indicated by a pixel of the reference image may be subtracted from the value indicated by the corresponding pixel of the image of the scene. However, the proposed concept is not limited to subtracting the reference values from the values indicated by the pixels of the image of the scene. In general, the pixels of the image of the scene may be modified/adjusted/changed in any suitable manner based on the pixels of the reference image.
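The pixel-wise modification described above can be sketched as follows (a minimal sketch using plain Python lists; the simple subtraction shown here is just one of the possible modifications mentioned above):

```python
def compensate(image, reference):
    """Modify an image of the scene pixel by pixel using a reference image.

    The reference value at each pixel position is subtracted from the
    value indicated by the pixel at the same position in the image.
    Both images are 2D lists (rows of pixel values) of equal size.
    """
    if len(image) != len(reference) or any(
            len(r1) != len(r2) for r1, r2 in zip(image, reference)):
        raise ValueError("image and reference image must have equal size")
    return [
        [pix - ref for pix, ref in zip(img_row, ref_row)]
        for img_row, ref_row in zip(image, reference)
    ]
```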

Further, the method 100 includes outputting (106) the compensated image of the scene. Similar to the received image of the scene described above, the compensated image of the scene may be, for example, a raw image, an image derived from a raw image, an intensity image, an image derived from an intensity image, a depth image, or an image derived from a depth image.

The compensated image of the scene is corrected for the effects of light reflections from the cover of the ToF camera. Thus, erroneous measurements made by the ToF camera, represented by pixels of an image of the scene, may be mitigated at least in a compensated image of the scene. In some examples, the compensated image of the scene may not be affected by light reflections from the cover of the ToF camera.

As described above, in some examples the reference image may be a reference raw image. To sense the scene, the ToF camera may illuminate the scene using a continuous-wave modulation signal and generate a measurement signal based on the reflected light from the scene, as described above. Thus, according to the general concept of ToF sensing described above, the raw image of the scene is based on the correlation of the continuous-wave modulation signal with the measurement signal according to a correlation function c(φ). In order to properly correct the raw image of the scene, a reference raw image may be selected from a plurality of reference raw images based on the phase shift between the continuous-wave modulation signal and the measurement signal used for the correlation. The plurality of reference raw images may correspond to different phase shifts. Thus, for each phase shift used, an appropriate (corresponding) reference raw image for correcting the light reflections from the cover of the ToF camera can be selected.

Similarly, a reference raw image may be selected from a plurality of reference raw images based on the frequency of the continuous-wave modulation signal. The plurality of reference raw images may correspond to different frequencies of the continuous-wave modulation signal. Thus, for each frequency of the continuous-wave modulation signal, a (corresponding) reference raw image for correcting light reflections from the cover of the ToF camera can be selected.

For example, if four raw images using four different phase shifts are captured by the ToF camera, four reference raw images may be provided for correcting the raw images. If four additional raw images using the four different phase shifts are captured for another frequency of the continuous-wave modulation signal, four additional reference raw images may be provided for correcting the four additional raw images for the other frequency. In other words, a separate reference image may be provided for each parameter combination (phase shift and frequency).
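The per-parameter selection can be sketched as a simple lookup keyed by modulation frequency and phase shift (a hypothetical sketch; the table layout and the values are illustrative only, e.g. from a factory calibration):

```python
# Hypothetical calibration table: one reference raw image per
# (modulation frequency in Hz, phase shift in degrees) pair.
REFERENCE_RAW_IMAGES = {
    (60_000_000, 0): [[0.20, 0.10]],
    (60_000_000, 90): [[0.10, 0.05]],
    # ... further phase shifts, further frequencies
}

def select_reference_raw_image(frequency_hz, phase_deg):
    """Select the reference raw image matching the capture parameters."""
    try:
        return REFERENCE_RAW_IMAGES[(frequency_hz, phase_deg)]
    except KeyError:
        raise LookupError(
            f"no reference raw image for {frequency_hz} Hz / {phase_deg} deg")
```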

For example, the reference raw image may be based on a factory calibration. That is, the reference raw image may be generated or provided during the production process of the ToF camera. For example, the reference raw image may be captured by the ToF camera in a defined production (calibration) environment that does not contain any reflective objects. For example, the ToF camera may sense white walls, such that only the cover of the ToF camera causes reflections. As described above, the reference raw image may be stored and subtracted from subsequently captured images of the scene. This approach is feasible because the light capturing element integrates all incident light into a correlation value (or phase value).

In some examples, the reference raw image may be generated or (e.g., sporadically) updated during operation of the ToF camera. For example, coded modulation may be used to confine the measurement range of the ToF camera to the cover of the ToF camera in order to characterize the cover. For coded modulation, a coded modulation signal is used for illumination instead of a continuous-wave modulation signal. In a coded modulation signal, the pulse sequence is varied. In other words, the coded modulation signal exhibits pulses of variable length, whereas the continuous-wave modulation signal exhibits an alternating sequence of high and low pulses of equal length (duration). For example, Kasami code sequences or m-sequences may be used for the coded modulation signal.

Using a coded modulation signal for illumination has the following consequence: the correlation function differs from a constant value only for reflected light originating from a certain range of distances relative to the ToF camera. In other words, only light reflected from objects within a certain distance range causes the value of the correlation function to differ from a constant value. Mathematically, this can be expressed as follows:

c(d) = f(d) for d_min ≤ d ≤ d_max, and c(d) = a otherwise,

wherein c(d) denotes the correlation function, d denotes the distance of an object reflecting light to the ToF camera, a denotes a constant, f(d) denotes a distance-dependent correlation function, d_min denotes the minimum distance of an object reflecting light to the ToF camera for which the correlation function is sensitive to the reflected light, and d_max denotes the maximum distance of an object reflecting light to the ToF camera for which the correlation function is sensitive to the reflected light.

In other words, compared to a continuous-wave modulation signal, the correlation range of the correlation function is limited for a coded modulation signal. The correlation range in which the correlation function is sensitive to reflected light from an object sensed by the ToF camera defines the measurement range of the ToF camera. That is, the measurement range of the ToF camera corresponds to the correlation range of the correlation function, i.e., the range in which its output value is distance-dependent.
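The limited correlation range of a coded modulation signal can be sketched as the piecewise function given above (a minimal sketch; `f` stands in for the distance-dependent part of the correlation function):

```python
def coded_correlation(d, f, a, d_min, d_max):
    """Correlation value as a function of object distance d for a coded
    modulation signal: distance-dependent within [d_min, d_max] (the
    measurement range), the constant a everywhere else."""
    return f(d) if d_min <= d <= d_max else a
```

A cover-only reference capture then corresponds to choosing the code so that the interval [d_min, d_max] ends immediately after the cover.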

Thus, in some examples, method 100 may include controlling ToF camera 200 to use a coded modulation signal for illumination such that the measurement range of ToF camera 200 ends immediately after the cover 240, as shown in fig. 5. That is, the coded modulation signal is designed to achieve the desired correlation range. Thus, the raw image captured by ToF camera 200 based on the coded modulation signal includes only pixels indicating values related only to light reflections from the cover 240 of ToF camera 200. In other words, the raw image characterizes only the cover 240 of ToF camera 200. This raw image can therefore be used as the reference raw image. Coded modulation allows the cover of ToF camera 200 to be characterized during operation. In addition, changes in the reflectivity of the cover 240 (e.g., caused by fingerprints, dirt, scratches, protective films, fog, rain, etc.) may be detected.

In other examples, a continuous wave modulated signal may be used instead of a coded modulated signal to characterize the cover of the ToF camera. For example, method 100 may include receiving a depth image from the ToF camera and determining the closest reflecting object drawn in the depth image. Further, the method 100 may include determining an exposure time of the ToF camera such that the measurement range of the ToF camera ends before the closest reflecting object. The power of the reflected light decreases with the distance of the reflecting object from the ToF camera. Thus, by adjusting the exposure time of the ToF camera, only reflected light from objects up to a maximum distance related to the exposure time will contribute to the measurement signal. Reflected light from objects at greater distances will vanish in the noise floor of the light capturing element. Hence, by setting the exposure time of the ToF camera appropriately, it can be ensured that the reflected light of the closest reflecting object contributes only to the noise floor of the light capturing element. The method 100 may therefore further comprise controlling the ToF camera to capture an auxiliary raw image using the determined (set) exposure time. The auxiliary raw image may accordingly be used as the reference image, since it comprises only pixels indicating values caused solely by the cover of the ToF camera.
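The steps above can be sketched as follows. The camera driver object and its method names, as well as the linear mapping from desired range to exposure time, are hypothetical assumptions for illustration only:

```python
import numpy as np

def exposure_for_range(max_distance, full_range_m=5.0, full_exposure_us=1000.0):
    """Hypothetical mapping from a desired maximum measurement range to
    an exposure time; the linear model here is illustrative only."""
    return full_exposure_us * min(max_distance, full_range_m) / full_range_m

def capture_reference_raw(camera):
    """Sketch of the continuous-wave reference capture described above.

    `camera` is a hypothetical driver object; its method names are
    assumptions, not a real ToF camera API.
    """
    depth = camera.get_depth_image()          # per-pixel distance, metres
    closest = float(np.min(depth))            # closest reflecting object
    # Pick an exposure so the measurement range ends before that object;
    # only cover reflections then rise above the noise floor.
    camera.set_exposure_time(exposure_for_range(0.9 * closest))
    return camera.capture_raw()               # auxiliary raw image
```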

In still further examples, the method 100 may include receiving a depth image from the ToF camera and determining whether any reflecting object at a distance greater than that of the cover is drawn in the depth image. If no such object is drawn in the depth image, the method 100 may include controlling the ToF camera to capture an auxiliary raw image. The auxiliary raw image may accordingly be used as the reference image, since it comprises only pixels indicating values caused solely by the cover of the ToF camera.

In some examples, the reference image may be a reference depth image as described above. To generate or update the reference depth image, method 100 may, for example, include receiving a depth image from the ToF camera and determining whether any reflecting object at a distance greater than that of the cover is drawn in the depth image. If no such object is drawn in the depth image, the depth image may be used as the reference depth image, since it includes only pixels indicating values caused solely by the cover of the ToF camera.
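A minimal sketch of this update rule follows. The cover distance threshold and the policy of keeping the previous reference unchanged are illustrative assumptions:

```python
import numpy as np

def update_reference_depth(depth_image, cover_distance, reference):
    """Use the current depth image as the reference depth image when no
    reflecting object beyond the cover is drawn in it.

    `cover_distance` is the (assumed known) distance of the cover in
    metres; otherwise the previous reference is kept.
    """
    if np.all(depth_image <= cover_distance):
        return depth_image.copy()   # only cover reflections: new reference
    return reference                # an object is present: keep old reference
```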

Alternatively, in some examples, the reference image may be a reference intensity image (e.g., a grayscale image). The reference intensity image may be used to correct the intensity image of the ToF camera as well as other images of the ToF camera, such as the raw image. For example, if the light capturing element of the ToF camera uses a Photonic Mixer Device (PMD) to detect reflected light, the gates of the PMD are modulated during normal operation such that the charges induced by the incident light are separated (e.g., collected in two separate capacitors). In grayscale mode, all charges are added (e.g., collected in one capacitor). A scaling factor may be used to take the different operating principles into account. Thus, modifying 104 the image of the scene using the reference intensity image may comprise scaling the reference values indicated by the pixels of the reference intensity image by a scaling factor to obtain a scaled reference intensity image. Further, modifying 104 the image of the scene using the reference intensity image may include modifying the image of the scene using the scaled reference intensity image, similar to the above, to obtain the compensated image.
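As a sketch of the grayscale-reference case: the scaling factor value is an assumption, and per-pixel subtraction is used here as one possible modification (the description leaves the exact modification open):

```python
import numpy as np

def compensate_with_grayscale_reference(image, reference_intensity, scaling_factor):
    """Compensate an image using a scaled grayscale reference image.

    The scaling factor accounts for the different PMD operating
    principle in grayscale mode (all charge collected together rather
    than separated). Its value and the subtractive modification are
    illustrative assumptions.
    """
    scaled_reference = scaling_factor * reference_intensity
    return image - scaled_reference  # subtraction as one possible modification
```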

In some examples, the exposure times of the scene image and the reference image may be equal. In other words, the exposure time for capturing the scene image and the reference image may be the same.

Alternatively, the exposure times for capturing the scene image and the reference image may be different. The different exposure times of the scene image and the reference image may be compensated by a scaling factor. For example, if the exposure times of the scene image and the reference image are different, modifying 104 the scene image using the reference image may include scaling reference values indicated by pixels of the reference image by a scaling factor to obtain a scaled reference image. The scaling factor is based on (e.g., is the same as or proportional to) the ratio of the exposure times of the scene image and the reference image. Further, modifying 104 the image of the scene using the reference image may include modifying the image of the scene using the scaled reference image to obtain a compensated image of the scene as described above. Thus, an image of a scene captured by a ToF camera using a first exposure time may be corrected using a reference image captured with a different second exposure time. This may allow, for example, to omit saving a reference image for each exposure time supported by the ToF camera.
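The exposure-time compensation above can be sketched as follows, assuming the scaling factor equals the ratio of exposure times and using subtraction as the modification:

```python
import numpy as np

def compensate_exposure_mismatch(scene_image, reference_image,
                                 t_scene_us, t_ref_us):
    """Scale the reference image by the exposure-time ratio before
    compensating the scene image.

    Equating the scaling factor with the exposure-time ratio and the
    subtractive modification are illustrative assumptions.
    """
    scaling_factor = t_scene_us / t_ref_us        # ratio of exposure times
    scaled_reference = scaling_factor * reference_image
    return scene_image - scaled_reference
```

This allows a reference image captured with, say, a 1000 µs exposure to correct a scene image captured with 2000 µs, so a separate reference per supported exposure time need not be stored.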

In the foregoing description, it was described that the light reflection from the cover of the ToF camera is corrected at the image level. However, the proposed concept is not limited to image-level error correction. In some examples, compensation of light reflection from the cover of the ToF camera may be performed at the charge level of the light capturing element. This will be described in more detail in connection with fig. 6.

Fig. 6 shows a flow diagram of a method 600 for compensating for light reflections from a cover of a ToF camera. As described above (e.g., fig. 2), the light capturing element of the ToF camera is covered by a cover. In the example of fig. 6, the light capturing element comprises an array of PMDs, such that each PMD measures a fraction of the light reaching the light capturing element (e.g., each pixel of the light capturing element can comprise a PMD). Such PMDs are known so that the structure of the PMD will not be described in detail in this disclosure. Each PMD separates the charges generated by the light arriving at the PMD based on the modulation signal used for illumination, so that (at least) two charge values are provided for the respective PMD (i.e. for each PMD of the PMD array).

The method 600 includes receiving (602) a charge value for a PMD. For example, the charge value of the PMD may be represented by an analog or digital value provided by the PMD or a connected circuit device (e.g., analog-to-digital converter, ADC).

Similar to that described above for the image level, method 600 also includes modifying (604) the charge value using a reference (charge) value that is related only to light reflections from the cover of the ToF camera to obtain a compensated charge value. In other words, the reference value is (substantially) caused only by light reflections from the cover of the ToF camera, and not by light reflections of any other object, such as an object in the surrounding environment of the ToF camera. The reference value thus characterizes the effect of light reflection from the cover on the PMD-generated charge. The reference value thus allows compensating the influence of light reflection from the cover in the charge generated by the PMD.

For example, the charge value of each PMD may be modified using the reference value. In some examples, a separate reference value may be provided for each PMD. In other examples, separate reference values may be provided for different subsets of the PMD array. In a further example, the same reference value may be provided for all PMDs.

For example, the reference value may be subtracted from the charge value of the PMD. However, the proposed concept is not limited to subtracting the reference value from the charge value of the PMD. In general, the charge value of the PMD may be modified/adjusted/changed in any suitable manner based on the reference value.

Further, method 600 includes outputting (606) the compensated charge value. For example, the original image may be determined based on the compensated charge values.
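Method 600 can be sketched for a single PMD as follows. The tuple representation of the two charge values, the optional scale factor, and subtraction as the modification are illustrative assumptions:

```python
def compensate_charges(charges, reference, scale=1.0):
    """Charge-level compensation for one PMD (sketch of method 600).

    `charges` holds the two charge values provided by the PMD,
    `reference` the two reference charge values attributed solely to
    light reflections from the cover. Subtraction is used here as one
    possible modification.
    """
    a, b = charges
    ref_a, ref_b = reference
    return (a - scale * ref_a, b - scale * ref_b)
```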

Similar to the image level described above, the compensated charge values are corrected for the effects of light reflection from the cover of the ToF camera. Thus, erroneous measurements of the ToF camera represented by the charge values of the PMD may be at least mitigated in the compensated charge values. In some examples, the compensated charge values may not be affected by light reflections from the cover of the ToF camera at all.

The reference value for modifying the charge value of the PMD may be determined similarly to what was described above for the reference image. For example, if no reflecting object at a distance greater than that of the cover is drawn in the depth image of the ToF camera, the ToF camera may be controlled to capture an auxiliary image. The charge values of the PMD obtained when taking the auxiliary image can be used as reference values, since they are caused solely by the cover of the ToF camera. Alternatively, the charge values of the PMD obtained when taking the depth image may be used as reference values, since they are caused solely by the cover of the ToF camera. Similarly, the charge values of the PMD obtained when taking an image with a coded modulation signal that substantially limits the measurement range of the ToF camera to the cover, as described above, may be used as reference values.

Scaling may also be used to compensate for different exposure times of the reference value and the charge value of the PMD. That is, modifying (604) the charge value of the PMD using the reference value may include scaling the reference value by a scaling factor to obtain a scaled reference value. For example, the scaling factor may be based on (e.g., the same as or proportional to) a ratio of an exposure time used to generate the reference value to an exposure time used to generate the charge value for the PMD. Further, modifying (604) the charge value of the PMD using the reference value may include modifying the charge value of the PMD using a scaled reference value as described above to obtain a compensated charge value.

Further, scaling can be used to correct the charge values of the PMD using a reference value obtained when capturing an intensity (grayscale) image, similar to what was described above for the image level.

An example of a device 700 for compensating light reflections from the cover of a ToF camera according to the proposed concept is further shown in fig. 7. The apparatus 700 comprises a processing circuit 720. For example, the processing circuit 720 may be a single special-purpose processor, a single shared processor, or multiple individual processors, some or all of which may be shared, Digital Signal Processor (DSP) hardware, an Application Specific Integrated Circuit (ASIC), or a Field Programmable Gate Array (FPGA). The processing circuit 720 may optionally be coupled to Read Only Memory (ROM), Random Access Memory (RAM), and/or non-volatile memory, for example, for storing software. Device 700 may also include other hardware, conventional and/or custom.

The apparatus 700 comprises an input circuit 710 configured to receive input data 701 representing an image of a scene from a ToF camera, or charge values of a photonic mixing device of the ToF camera. The processing circuit 720 processes the input data 701 according to the concepts described above for compensating for light reflections from the cover of the ToF camera. The output circuit 730 of the apparatus then outputs output data 702 representing a compensated image of the scene or compensated charge values.

For example, the functionality of device 700 may be implemented in an application processor coupled to the ToF camera module providing the image, or in the ToF camera module itself.

Examples described herein may be summarized as follows:

some examples relate to a method for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera. The method includes receiving an image of a scene from a ToF camera. Further, the method includes modifying an image of the scene using the reference image to obtain a compensated image of the scene. The pixels of the reference image indicate reference values that are related only to light reflections from the cover of the ToF camera. Additionally, the method includes outputting the compensated image.

According to some examples, the illumination element of the ToF camera for illuminating the scene and the light capturing element of the ToF camera for receiving reflected light from the scene are arranged in a common cavity covered by a cover.

In some examples, the image of the scene is one of an original image, an image derived from an original image, an intensity image, an image derived from an intensity image, a depth image, or an image derived from a depth image.

According to some examples, the reference image is a reference original image.

In some examples, the ToF camera illuminates a scene using a continuous wave modulated signal and generates a measurement signal based on reflected light from the scene. The image of the scene is an original image based on a correlation between the continuous wave modulated signal and the measurement signal according to a correlation function. The reference original image is selected from a plurality of reference original images based on a phase shift between the continuous wave modulation signal and the measurement signal used for correlation.

According to some examples, the ToF camera illuminates the scene using a continuous wave modulated signal. The reference original image is selected from a plurality of reference original images based on the frequency of the continuous wave modulation signal.

In some examples, the reference raw image is based on a factory calibration.

According to some examples, the method further comprises: the ToF camera is controlled to illuminate with a coded modulation signal such that the measurement range of the ToF camera ends immediately after the cover. An original image captured by the ToF camera based on the coded modulation signal is used as a reference original image.

In some examples, the method further comprises: the method includes receiving a depth image from a ToF camera and determining a closest reflective object rendered in the depth image. Further, the method comprises: the exposure time of the ToF camera is determined such that the measurement range of the ToF camera ends before the closest reflecting object. Further, the method comprises: controlling the ToF camera to capture a secondary raw image using the exposure time, wherein the secondary raw image is used as a reference image.

According to some examples, the method further comprises: the method includes receiving a depth image from a ToF camera and determining whether any reflective objects having a distance greater than a cover are drawn in the depth image. If no reflective object is drawn in the depth image that is a distance greater than the cover, the method further includes controlling the ToF camera to capture a secondary raw image, wherein the secondary raw image is used as a reference image.

In some examples, the reference image is a reference depth image.

According to some examples, the method further comprises: the method includes receiving a depth image from a ToF camera and determining whether any reflective objects having a distance greater than a cover are drawn in the depth image. If no reflective object having a distance greater than the cover is drawn in the depth image, the depth image is used as a reference depth image.

In some examples, the reference image is a reference intensity image.

According to some examples, modifying the phase image of the scene using the reference intensity image comprises: the reference values indicated by the pixels of the reference intensity image are scaled by a scaling factor to obtain a scaled reference intensity image. Further, modifying the phase image of the scene using the reference intensity image comprises: an image of the scene is modified using the scaled reference intensity image to obtain a compensated image.

In some examples, the exposure times of the image of the scene and the reference image are equal.

If the exposure times of the image of the scene and the reference image are different, then, according to some examples, modifying the image of the scene using the reference image includes scaling, by a scaling factor, reference values indicated by pixels of the reference image to obtain a scaled reference image. The scaling factor is based on the ratio of the exposure times of the image of the scene and the reference image. Further, modifying the image of the scene using the reference image includes modifying the image of the scene using the scaled reference image to obtain a compensated image.

Further examples relate to an apparatus for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera. The apparatus includes an input circuit configured to receive a phase image of a scene from a ToF camera. Further, the apparatus includes a processing circuit configured to modify an image of the scene using the reference image to obtain a compensated image of the scene. The pixels of the reference image indicate reference values that are related only to light reflections from the cover of the ToF camera. The apparatus also includes an output circuit configured to output the compensated image.

A further example relates to another device for compensating for light reflections from a cover of a ToF camera in an image of a scene sensed by the ToF camera. The apparatus includes means for receiving a phase image of a scene from a ToF camera. Further, the apparatus includes means for modifying an image of the scene using the reference image to obtain a compensated image of the scene. The pixels of the reference image indicate reference values that are related only to light reflections from the cover of the ToF camera. The apparatus also includes means for outputting a compensated image.

Other examples relate to further methods for compensating for light reflections from the cover of a ToF camera. The light capturing element of the ToF camera is covered by a cover and comprises an array of photonic mixing devices. Each photonic mixing device separates charges generated by light reaching the photonic mixing device to provide two charge values for the respective photonic mixing device. The method includes receiving a charge value of a photonic mixing device. Furthermore, the method comprises modifying the charge value using a reference value related only to light reflections from a cover of the ToF camera to obtain a compensated charge value. The method also includes outputting a compensated charge value.

Still other examples relate to devices for compensating for light reflections from the cover of a ToF camera. The light capturing element of the ToF camera is covered by a cover and comprises an array of photonic mixing devices. Each photonic mixing device separates charges generated by light reaching the photonic mixing device such that two charge values are provided for the respective photonic mixing device. The apparatus includes an input circuit configured to receive a charge value of a photonic mixing device. Furthermore, the device comprises a processing circuit configured to modify the charge value using a reference value related only to light reflections from a cover of the ToF camera to obtain a compensated charge value. The apparatus also includes an output circuit configured to output the compensated charge value.

Further examples relate to a device for compensating for light reflections from a cover of a ToF camera. The light capturing element of the ToF camera is covered by a cover and comprises an array of photonic mixing devices. Each photonic mixing device separates the charges generated by light reaching the photonic mixing device such that two charge values are provided for the respective photonic mixing device. The apparatus includes means for receiving a charge value of a photonic mixing device. Furthermore, the apparatus comprises means for modifying the charge value using a reference value related only to light reflections from a cover of the ToF camera to obtain a compensated charge value. The apparatus also includes means for outputting a compensated charge value.

Examples relate to a non-transitory machine readable medium having a program stored thereon, with program code for compensating light reflection from a cover of a ToF camera as described herein, when the program is executed on a processor or programmable hardware.

Other examples relate to a program having program code for performing any method of compensating for light reflection from a cover of a ToF camera as described herein, when the program is executed on a processor or programmable hardware.

Examples according to the proposed concept may allow cover glass error correction (e.g. using continuous wave or coded modulation measurements) of ToF cameras.

The specification and drawings merely illustrate the principles of the disclosure. Moreover, all examples mentioned herein are intended expressly to be only for illustrative purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventors to furthering the art. All statements herein reciting principles, aspects, and examples of the disclosure, as well as specific examples thereof, are intended to encompass equivalents thereof.

For example, the block diagrams may illustrate high-level circuit diagrams embodying the principles of the present disclosure. Similarly, flowcharts, state transition diagrams, pseudocode, and the like may represent various processes, operations, or steps which may be substantially represented in computer readable media and executed by a computer or processor, for example, whether or not such computer or processor is explicitly shown. The methods disclosed in the specification or claims may be implemented by an apparatus having means to perform each respective action of the methods.

It should be understood that the disclosure of acts, processes, operations, steps or functions disclosed in the specification or claims may not be construed as to be within a particular order unless explicitly or implicitly indicated otherwise (e.g., for technical reasons). Thus, the disclosure of multiple acts or functions does not limit them to a particular order unless, for technical reasons, the acts or functions are not interchangeable. Further, in some examples, a single action, function, process, operation, or step may include or may be broken down into multiple sub-actions, sub-functions, sub-processes, sub-operations, or sub-steps. Such sub-acts may be included in or part of the disclosure of such single acts, unless specifically excluded.

Furthermore, the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate example. Although each claim may exist individually as a separate example, it should be noted that although a dependent claim may refer in the claims to a particular combination with one or more other claims, other examples may also include combinations of the dependent claim with the subject matter of the claims dependent or independent from each other. Such combinations are expressly set forth herein unless a specific combination is not intended. Furthermore, it is also intended to include features of a claim dependent on any other independent claim, even if that claim is not directly dependent on that independent claim.
