Distance measuring sensor, signal processing method and distance measuring module

Document No.: 277734  Publication date: 2021-11-19

Abstract: The present technology relates to a distance measuring sensor, a signal processing method, and a distance measuring module that are capable of performing calculation of distance information and outdoor judgment by the distance measuring sensor alone. The distance measuring sensor includes: a distance measuring unit that calculates distance information to an object from a signal obtained by a light receiving unit, the light receiving unit receiving reflected light obtained by the object reflecting irradiation light emitted from a predetermined light emitting source; an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light receiving unit; and an outdoor information calculation unit that calculates outdoor information based on the ambient light component. The present technology can be applied to, for example, a distance measurement module that measures a distance to a subject. (Created by 小野博明 and 佐野弘长, 2020-03-26.)

1. A distance measuring sensor comprising:

a distance measuring unit that calculates distance information to an object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting irradiation light emitted from a predetermined light emitting source;

an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light receiving unit; and

an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

2. The distance measuring sensor according to claim 1, further comprising

a normalization unit that normalizes the ambient light component calculated by the ambient light calculation unit,

wherein the outdoor information calculation unit estimates the outdoor information based on the normalized ambient light component.

3. The distance measuring sensor according to claim 2,

wherein the normalization unit normalizes the ambient light component by using an exposure time and a number of pixels.

4. The distance measuring sensor according to claim 2,

wherein the normalization unit normalizes the ambient light component by using an exposure time, a number of pixels, and the distance information.

5. The distance measuring sensor according to claim 1,

wherein the ambient light calculation unit calculates the ambient light component by subtracting a dark current component from the signal obtained by the light receiving unit.

6. The distance measuring sensor according to claim 1,

wherein both the distance information and the outdoor information are calculated by using a signal obtained by the light receiving unit.

7. The distance measuring sensor according to claim 1, provided with, as operation modes,

a first operation mode in which the distance information and the outdoor information are calculated,

a second operation mode in which the distance information is calculated without calculating the outdoor information, or

a third operation mode in which the outdoor information is calculated without calculating the distance information.

8. The distance measuring sensor according to claim 1,

wherein the ambient light calculation unit calculates the ambient light component of a region of interest that is a part of a light receiving region of the light receiving unit.

9. The distance measuring sensor according to claim 8,

wherein the ambient light calculation unit acquires region information indicating the region of interest, and calculates the ambient light component of the region of interest.

10. The distance measuring sensor according to claim 9,

wherein the ambient light calculation unit acquires region information indicating the region of interest, the region information having been detected by an imaging sensor that generates a captured image obtained by receiving RGB light, and calculates the ambient light component of the region of interest.

11. The distance measuring sensor according to claim 8,

wherein the distance measurement unit calculates confidence information in addition to the distance information,

the distance measurement sensor further includes a region detection unit that detects the region of interest by using at least one of the distance information or the confidence information, and

the ambient light calculation unit calculates the ambient light component of the region of interest.

12. The distance measuring sensor according to claim 1, further comprising

a filtering unit that performs a predetermined filtering process on the distance information,

wherein the filtering unit performs the predetermined filtering process based on the outdoor information.

13. The distance measuring sensor according to claim 1, further comprising

a control unit that controls an exposure time based on the signal obtained by the light receiving unit.

14. The distance measuring sensor according to claim 1,

wherein the outdoor information calculation unit calculates, as the outdoor information, a value indicating whether the environment is outdoors or indoors.

15. The distance measuring sensor according to claim 1,

wherein the outdoor information calculation unit calculates a probability of being outdoors as the outdoor information.

16. A signal processing method, comprising:

calculating distance information to an object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting irradiation light emitted from a predetermined light emitting source, by using a distance measurement sensor;

calculating an ambient light component included in the signal obtained by the light receiving unit by using the distance measurement sensor; and

calculating outdoor information based on the ambient light component by using the distance measurement sensor.

17. A distance measurement module comprising:

a predetermined light emitting source; and

a distance measuring sensor,

the distance measuring sensor including

a distance measuring unit that calculates distance information to an object based on a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source,

an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light receiving unit, and

an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

Technical Field

The present technology relates to a distance measuring sensor, a signal processing method, and a distance measuring module, and more particularly, to a distance measuring sensor, a signal processing method, and a distance measuring module capable of performing calculation of distance information and outdoor judgment by the distance measuring sensor alone.

Background

In recent years, advances in semiconductor technology have driven the miniaturization of distance measurement modules that measure the distance to an object. This has made it possible, for example, to mount a distance measuring module on a mobile terminal such as a so-called smartphone, a small information processing apparatus having a communication function.

Examples of the distance measurement method in the distance measurement module include an indirect time-of-flight (ToF) method and a structured light method. In the indirect ToF method, light is emitted toward an object and light reflected on the surface of the object is detected, and a distance to the object is calculated based on a measurement value obtained by measuring the time of flight of the light. In the structured light method, pattern light is emitted toward an object, and a distance to the object is calculated based on an image obtained by imaging distortion of the pattern on the surface of the object.

Distance measurement methods in which an object is irradiated with active light and the light reflected by the object is received, such as the indirect ToF method and the structured light method, are susceptible to the influence of ambient light such as sunlight. It is therefore useful to be able to determine whether the measurement site is outdoors or indoors.

For example, there is an imaging device in which an outdoor detection sensor is provided separately from the imaging element to detect that the imaging device is outdoors (see, for example, Patent Document 1).

Reference list

Patent document

Patent Document 1: Japanese Patent Application Laid-Open No. 2005-175888

Disclosure of Invention

Problems to be solved by the invention

However, it is desirable to perform outdoor determination while acquiring distance information by a distance measuring sensor alone, rather than separately providing an outdoor detection sensor.

The present technology has been proposed in view of such a situation, and enables calculation of distance information and outdoor determination to be performed by the distance measuring sensor alone.

Solution to the problem

A distance measuring sensor according to a first aspect of the present technology includes: a distance measuring unit that calculates distance information to the object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source; an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light reception unit; and an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

A signal processing method according to a second aspect of the present technology includes: calculating distance information to the object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source, by using a distance measuring sensor; calculating an ambient light component included in the signal obtained by the light receiving unit by using the distance measuring sensor; and calculating outdoor information based on the ambient light component by using the distance measuring sensor.

A distance measuring module according to a third aspect of the present technology includes a predetermined light emitting source and a distance measuring sensor including: a distance measuring unit that calculates distance information to the object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source; an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light reception unit; and an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

In the first to third aspects of the present technology, distance information to an object is calculated from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting irradiation light emitted from a predetermined light emitting source, an ambient light component included in the signal obtained by the light receiving unit is calculated, and outdoor information is calculated based on the ambient light component.

The distance measuring sensor and the distance measuring module may be separate devices or may be a module incorporated in another device.

Drawings

Fig. 1 is a block diagram showing a schematic configuration example in a first embodiment of a distance measurement module to which the present technology is applied.

Fig. 2 is a diagram illustrating the operation of the distance measuring sensor.

Fig. 3 is a diagram illustrating an operation of a pixel according to the 4-phase method.

Fig. 4 is a diagram illustrating the 4-phase method.

Fig. 5 is a diagram illustrating the 4-phase method.

Fig. 6 is a diagram illustrating a method of calculating depth values by using the 2-phase method and the 4-phase method.

Fig. 7 is a block diagram of a distance measurement module including a detailed configuration of a distance measurement sensor.

Fig. 8 is a diagram showing a configuration of RAW data obtained by one pixel.

Fig. 9 is a diagram illustrating an example of RAW data according to the 4-phase method.

Fig. 10 is a flowchart showing a depth value calculation process performed by the distance measurement module.

Fig. 11 is a diagram illustrating a second operation mode of the distance measurement module.

Fig. 12 is a diagram illustrating a third operation mode of the distance measurement module.

Fig. 13 is a block diagram of a second embodiment of a distance measurement module.

Fig. 14 is a diagram illustrating processing of the ambient light calculation unit according to the second embodiment.

Fig. 15 is a block diagram of a third embodiment of a distance measurement module.

Fig. 16 is a block diagram showing a configuration example of an electronic device to which the present technology is applied.

Fig. 17 is a block diagram showing a configuration example of an embodiment of a computer to which the present technology is applied.

Fig. 18 is a block diagram showing an example of a schematic configuration of a vehicle control system.

Fig. 19 is an explanatory view showing an example of the mounting positions of the vehicle exterior information detecting unit and the imaging unit.

Detailed Description

Hereinafter, an embodiment (hereinafter, referred to as an embodiment) for realizing the present technology will be described. Note that the description will be given in the following order.

1. Schematic configuration example of distance measurement module

2. Operation of distance measuring sensor by indirect ToF method

3. Detailed configuration example of distance measuring sensor

4. Depth value calculation processing

5. Second embodiment

6. Third embodiment

7. Configuration example of electronic device

8. Configuration example of computer

9. Application examples of moving objects

<1. schematic configuration example of distance measuring Module >

Fig. 1 is a block diagram showing a schematic configuration example in a first embodiment of a distance measurement module to which the present technology is applied.

The distance measuring module 11 shown in fig. 1 is a distance measuring module that performs distance measurement by the indirect ToF method, and includes a light emitting source 12, a light emitting control unit 13, and a distance measuring sensor 14.

The distance measurement module 11 irradiates an object with light, receives the light (reflected light) obtained by the object reflecting the light (irradiation light), and thereby measures and outputs a depth value and a confidence as distance information to the object.

The light emission source 12 includes, for example, an infrared laser diode or the like as a light source, emits light while performing modulation at a timing corresponding to a light emission control signal supplied from the light emission control unit 13, and irradiates the object with irradiation light.

The light emission control unit 13 controls the light emission source 12 by supplying a light emission control signal of a predetermined frequency (e.g., 20MHz, etc.) to the light emission source 12. Further, the light emission control unit 13 supplies a light emission control signal to the distance measuring sensor 14 so as to drive the light receiving unit 21 according to the light emission timing of the light emission source 12.

The distance measurement sensor 14, which will be described in detail later, receives the reflected light from the object with a pixel array unit 32 in which a plurality of pixels 31 are two-dimensionally arranged. The distance measurement sensor 14 then generates and outputs a depth value and a confidence for each pixel 31 of the pixel array unit 32.

<2. operation of distance measuring sensor by indirect ToF method >

Next, the operation of the distance measuring sensor 14 will be described with reference to fig. 2 to 6.

The distance measuring sensor 14 includes a light receiving unit 21 shown in fig. 2.

The light receiving unit 21 includes a pixel array unit 32, in which pixels 31 that generate electric charge according to the amount of received light and output a signal corresponding to the electric charge are two-dimensionally arranged in a matrix in the row and column directions, and a drive control circuit 33 arranged in a peripheral region of the pixel array unit 32.

The drive control circuit 33 outputs control signals for controlling the driving of the pixels 31 (for example, a distribution signal DIMIX, a selection signal ADDRESS DECODE, a reset signal RST, and the like, described later) based on, for example, the light emission control signal supplied from the light emission control unit 13.

The pixel 31 includes a photodiode 51, and a first tap 52A and a second tap 52B that detect electric charges photoelectrically converted by the photodiode 51. In the pixel 31, the electric charge generated in one photodiode 51 is distributed to the first tap 52A or the second tap 52B. Then, the electric charge generated in the photodiode 51 and distributed to the first tap 52A is output from the signal line 53A as the detection signal a, and the electric charge generated in the photodiode 51 and distributed to the second tap 52B is output from the signal line 53B as the detection signal B.

The first tap 52A includes a transfer transistor 41A, a floating diffusion (FD) unit 42A, a selection transistor 43A, and a reset transistor 44A. Similarly, the second tap 52B includes a transfer transistor 41B, an FD unit 42B, a selection transistor 43B, and a reset transistor 44B.

The operation of the pixel 31 will be described.

As shown in fig. 3, the light emitting source 12 outputs modulated irradiation light (one period is 2T) so that irradiation is repeatedly turned on and off at an irradiation time T, and the photodiode 51 receives the reflected light with a delay of a delay time ΔT corresponding to the distance to the object. A distribution signal DIMIX_A controls on/off of the transfer transistor 41A, and a distribution signal DIMIX_B controls on/off of the transfer transistor 41B. The distribution signal DIMIX_A is a signal having the same phase as the irradiation light, and the distribution signal DIMIX_B has a phase obtained by inverting the phase of DIMIX_A.

Accordingly, in fig. 2, when the transfer transistor 41A is turned on in accordance with the distribution signal DIMIX_A, the electric charge generated by the photodiode 51 receiving the reflected light is transferred to the FD unit 42A, and when the transfer transistor 41B is turned on in accordance with the distribution signal DIMIX_B, the electric charge is transferred to the FD unit 42B. Therefore, during a predetermined period in which irradiation with the irradiation light of the irradiation time T is performed periodically, the electric charge transferred via the transfer transistor 41A is sequentially accumulated in the FD unit 42A, and the electric charge transferred via the transfer transistor 41B is sequentially accumulated in the FD unit 42B.

Then, when the selection transistor 43A is turned on in accordance with a selection signal ADDRESS DECODE_A after the charge accumulation period ends, the electric charge accumulated in the FD unit 42A is read via the signal line 53A, and the detection signal A corresponding to the amount of charge is output from the light receiving unit 21. Similarly, when the selection transistor 43B is turned on in accordance with a selection signal ADDRESS DECODE_B, the electric charge accumulated in the FD unit 42B is read via the signal line 53B, and the detection signal B corresponding to the amount of charge is output from the light receiving unit 21. Further, when the reset transistor 44A is turned on in accordance with a reset signal RST_A, the electric charge accumulated in the FD unit 42A is discharged, and when the reset transistor 44B is turned on in accordance with a reset signal RST_B, the electric charge accumulated in the FD unit 42B is discharged.

As described above, the pixel 31 distributes the electric charge generated by the reflected light received by the photodiode 51 to the first tap 52A or the second tap 52B according to the delay time ΔT, and outputs the detection signal A and the detection signal B. The delay time ΔT corresponds to the time in which the light emitted from the light emitting source 12 travels to the object, is reflected by the object, and then travels to the light receiving unit 21, that is, to the distance to the object. Accordingly, the distance measurement module 11 can obtain the distance (depth value) to the object from the delay time ΔT based on the detection signal A and the detection signal B.

However, in the pixel array unit 32, the detection signal A and the detection signal B may be affected differently for each pixel 31 due to a deviation (sensitivity difference) in the characteristics of the elements included in each pixel 31, such as the photodiode 51 and pixel transistors such as the transfer transistors 41A and 41B. Therefore, the indirect ToF distance measurement module 11 employs a technique of removing the sensitivity difference between the taps of each pixel and improving the SN ratio by acquiring the detection signal A and the detection signal B while changing the phase in the same pixel 31.

As a method of receiving reflected light by changing the phase and calculating the depth value, for example, a detection method by using two phases (2-phase method) and a detection method by using four phases (4-phase method) will be described.

As shown in fig. 4, the light receiving unit 21 receives the reflected light at light receiving timings shifted by phases of 0 °, 90 °, 180 °, and 270 ° from the irradiation timing of the irradiation light. More specifically, the light receiving unit 21 receives the reflected light by changing the phase in a time-division manner such that, in a certain frame period, the light is received with the phase set to 0 ° with respect to the irradiation timing of the irradiation light, in the next frame period, the light is received with the phase set to 90 °, in the next frame period, the light is received with the phase set to 180 °, and in the next frame period, the light is received with the phase set to 270 °.

Fig. 5 is a diagram in which the exposure periods of the first tap 52A of the pixel 31 in the respective phases of 0 °, 90 °, 180 °, and 270 ° are arranged so that the phase difference can be easily understood.

As shown in fig. 5, in the first tap 52A, the detection signal A obtained by receiving light in the same phase (phase 0°) as the irradiation light is referred to as detection signal A0, the detection signal A obtained by receiving light in a phase shifted by 90° (phase 90°) from the irradiation light is referred to as detection signal A90, the detection signal A obtained by receiving light in a phase shifted by 180° (phase 180°) from the irradiation light is referred to as detection signal A180, and the detection signal A obtained by receiving light in a phase shifted by 270° (phase 270°) from the irradiation light is referred to as detection signal A270.

Similarly, although illustration is omitted, in the second tap 52B, the detection signals B obtained by receiving light in the phases 0°, 90°, 180°, and 270° are referred to as detection signals B0, B90, B180, and B270, respectively.

Fig. 6 is a diagram illustrating a method of calculating depth values and confidences by using the 2-phase method and the 4-phase method.

In the indirect ToF method, the depth value d may be obtained by the following formula (1).

d = (c / 2) × ΔT = (c / (4πf)) × φ ………(1)

In formula (1), c represents the speed of light, ΔT represents the delay time, and f represents the modulation frequency of the light. Further, φ in formula (1) denotes the amount of phase shift [rad] of the reflected light and is expressed by the following formula (2).

φ = arctan(Q / I) ………(2)

In the 4-phase method, by using the detection signals A0 to A270 and B0 to B270 obtained by setting the phase to 0°, 90°, 180°, and 270°, I and Q in formula (2) are calculated by the following formula (3). I and Q are signals obtained by assuming that the change in luminance of the irradiation light is a cos wave and converting the phase of the cos wave from polar coordinates to an orthogonal coordinate system (IQ plane).

I=c0-c180=(A0-B0)-(A180-B180)

Q=c90-c270=(A90-B90)-(A270-B270)………(3)

In the 4-phase method, by taking the difference between the detection signals of opposite phases in the same pixel, such as "A0 - A180" and "A90 - A270" in formula (3), the characteristic variation between the taps existing in each pixel, that is, fixed pattern noise, can be removed.

In contrast, in the 2-phase method, the depth value d to the object can be obtained by using only two phases in an orthogonal relationship among the detection signals A0 to A270 and B0 to B270 obtained by setting the phase to 0°, 90°, 180°, and 270°. For example, in the case of using the detection signals A0 and B0 of the 0° phase and the detection signals A90 and B90 of the 90° phase, I and Q are expressed by the following formula (4).

I=c0-c180=(A0-B0)

Q=c90-c270=(A90-B90)………(4)

Alternatively, in the case of using the detection signals A180 and B180 of the 180° phase and the detection signals A270 and B270 of the 270° phase, I and Q are expressed by the following formula (5).

I=c0-c180=-(A180-B180)

Q=c90-c270=-(A270-B270)………(5)

In the 2-phase method, the characteristic variation between the taps existing in each pixel cannot be removed; however, since the depth value d to the object can be obtained from the detection signals of only two phases, distance measurement can be performed at twice the frame rate of the 4-phase method. The characteristic variation between the taps can be adjusted by correction parameters such as gain and offset.

In the 2-phase method and the 4-phase method, the confidence cnf can be obtained by the following equation (6).

cnf = √(I² + Q²) ………(6)

In the present embodiment, the depth value d and the confidence cnf may be obtained from the I and Q signals corresponding to the delay time ΔT calculated by either the 4-phase method or the 2-phase method. One of the two methods may be used fixedly, or, for example, the 4-phase method and the 2-phase method may be selected or blended as appropriate according to the motion of the object or the like. In the following, for simplicity, it is assumed that the 4-phase method is employed.
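
As a concrete illustration of the computation described by formulas (1) to (6), the following is a minimal Python sketch that derives the depth value d and the confidence cnf from the 4-phase detection signals of one pixel. The function name, the dict-based signal layout, and the wrapping of the phase into [0, 2π) are illustrative assumptions, not details taken from this document.

    import math

    SPEED_OF_LIGHT = 299_792_458.0  # c, in m/s

    def depth_and_confidence(a, b, mod_freq_hz):
        # a, b: dicts mapping each phase in {0, 90, 180, 270} to the
        # detection signals A_i and B_i of one pixel.
        # Formula (3): project the phase measurements onto the IQ plane.
        i = (a[0] - b[0]) - (a[180] - b[180])
        q = (a[90] - b[90]) - (a[270] - b[270])
        # Formula (2): phase shift of the reflected light, wrapped to [0, 2*pi).
        phi = math.atan2(q, i) % (2.0 * math.pi)
        # Formula (1): d = c * phi / (4 * pi * f).
        d = SPEED_OF_LIGHT * phi / (4.0 * math.pi * mod_freq_hz)
        # Formula (6): confidence as the magnitude of the IQ vector.
        cnf = math.hypot(i, q)
        return d, cnf

For example, with a modulation frequency of 20 MHz the unambiguous range is c / (2f) ≈ 7.5 m, which is what the wrap of φ to [0, 2π) reflects.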

<3. detailed configuration example of distance measuring sensor >

Fig. 7 is a block diagram of the distance measurement module 11 including a detailed configuration of the distance measurement sensor 14.

The distance measuring sensor 14 includes, in addition to the light receiving unit 21 shown in fig. 2, a distance measuring unit 22, an ambient light calculating unit 23, an ambient light normalizing unit 24, an outdoor information calculating unit 25, and a filtering unit 26.

The distance measurement module 11 irradiates a predetermined object with light, receives light (reflected light) obtained by reflecting light (irradiated light) by the object, and thus measures and outputs a depth value and confidence as distance information to the object.

Specifically, the light receiving unit 21 sets each pixel 31 of the pixel array unit 32 as a measurement target pixel, and supplies RAW data, which is a detection signal corresponding to the light receiving amount of reflected light received by the measurement target pixel, to the light emission control unit 13, the distance measuring unit 22, and the ambient light calculating unit 23.

The light emission control unit 13 controls the light emission source 12 by supplying a light emission control signal of a predetermined frequency to the light emission source 12, controls the exposure time based on the RAW data from the light receiving unit 21, and generates a light emission control signal for realizing the set exposure time. That is, the light emission control unit 13 has an auto exposure (AE) function based on the RAW data from the light receiving unit 21, and supplies the set exposure time to the ambient light normalization unit 24.
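
The document states only that an AE function based on the RAW data exists; the sketch below shows one plausible form such a control loop could take. The proportional update rule, the target level, and the clamping bounds are all assumptions.

    def update_exposure(current_exp_time, raw_mean, target_mean=512.0,
                        min_exp=1e-5, max_exp=1e-2):
        # Scale the exposure time so the mean RAW level approaches the
        # target, clamped to a valid range (all numbers are assumptions).
        if raw_mean <= 0:
            return max_exp  # nothing received: expose as long as allowed
        proposed = current_exp_time * (target_mean / raw_mean)
        return min(max(proposed, min_exp), max_exp)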

Based on the RAW data of the measurement target pixel supplied from the light receiving unit 21, the distance measurement unit 22 calculates a depth value d, which is distance information from the distance measurement module 11 to the object in the measurement target pixel, and a confidence cnf of the depth value d, and supplies the depth value d and the confidence cnf to the filtering unit 26. The method of calculating the depth value d and its confidence cnf is as described above.

The ambient light calculation unit 23 calculates an ambient light component included in the RAW data of the measurement target pixel supplied from the light receiving unit 21, and supplies the ambient light component to the ambient light normalization unit 24.

Fig. 8 shows a configuration of RAW data (detection signal) obtained by one pixel.

The RAW data includes an active component acv, an ambient light component amb, and a dark current component drk. Active component acv is the light component of the illumination light emitted from light emitting source 12, reflected by the object, and returned. The ambient light component amb is a light component of ambient light such as sunlight. The dark current component drk is a noise component generated by the dark current generated in the light receiving unit 21, regardless of light reception.

Fig. 9 shows the ratios of the active component acv, the ambient light component amb, and the dark current component drk in the detection signals Ai and Bi (i = 0, 90, 180, or 270) in the case where the distance measuring sensor 14 measures different distances D1 and D2.

Here, the ambient light component amb can be obtained by the following formula (7).

amb = {(A0 + B0) + (A90 + B90) + (A180 + B180) + (A270 + B270)} / 8 - drk ………(7)

The dark current component drk is a fixed value acquired in advance, for example, by acquiring the detection signals A and B in a state where the light receiving unit 21 is shielded from light.

Formula (7) is the formula for obtaining the ambient light component amb in the 4-phase method. Even in the 2-phase method, the ambient light component amb can be calculated similarly by omitting the detection signals Ai and Bi that have not been measured and changing the denominator of the fraction to 4.
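
A Python sketch of this calculation, following the structure just described (average the measured detection signals, then subtract the pre-calibrated dark current); the dict-based layout and the function name are illustrative.

    def ambient_component(a, b, drk):
        # a, b: dicts mapping each *measured* phase to the detection
        # signals A_i and B_i, so the denominator is 8 for the 4-phase
        # method (formula (7)) and 4 for the 2-phase method.
        total = sum(a.values()) + sum(b.values())
        return total / (len(a) + len(b)) - drk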

Returning to fig. 7, the ambient light normalization unit 24 normalizes the ambient light component amb of the measurement target pixel supplied from the ambient light calculation unit 23 by using the number of pixels and the exposure time.

Specifically, the ambient light normalization unit 24 normalizes the ambient light component amb of the measurement target pixel supplied from the ambient light calculation unit 23 by using formula (8) to calculate a normalized ambient light component amb_norm.

amb_norm = (Σamb / pix_n) × (base_exp_time / current_exp_time) ………(8)

In formula (8), Σamb represents the sum of the ambient light components amb of all the pixels of the pixel array unit 32, and pix_n represents the number of pixels of the pixel array unit 32. Further, base_exp_time denotes a preset basic exposure time used as an initial value, and current_exp_time denotes the current exposure time of the measurement target pixel supplied from the light emission control unit 13.

Further, the ambient light normalization unit 24 may acquire the depth value d of the measurement target pixel from the distance measurement unit 22 and normalize the ambient light component amb by using the number of pixels, the exposure time, and the depth value. In this case, the normalized ambient light component amb_norm is calculated by the following formula (9).

[ mathematical expression 6]

By normalizing the ambient light component amb, individual adjustments based on the number of pixels, the exposure time, the distance, and the like become unnecessary. The ambient light normalization unit 24 supplies the calculated normalized ambient light component amb_norm to the outdoor information calculation unit 25.
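
A sketch of the pixel-count and exposure-time normalization of formula (8), as reconstructed from the term definitions above; the distance-based variant of formula (9) is not sketched here.

    def normalize_ambient(amb_per_pixel, pix_n, base_exp_time, current_exp_time):
        # Formula (8): average over the pixels, then rescale to the
        # preset basic exposure time so that amb_norm is comparable
        # across frames captured with different exposure settings.
        return (sum(amb_per_pixel) / pix_n) * (base_exp_time / current_exp_time)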

Based on the normalized ambient light component amb_norm supplied from the ambient light normalization unit 24, the outdoor information calculation unit 25 calculates outdoor information regarding whether the current environment measured by the distance measurement module 11 is outdoors, and supplies the outdoor information to the filtering unit 26. The outdoor information may be a probability of being outdoors (hereinafter referred to as an outdoor probability) or a binary value indicating outdoors or indoors. In the case where the outdoor information is represented by a binary value, it is only necessary to perform the outdoor judgment by using a probability of 50% as a threshold value. In the present embodiment, a description will be given assuming that the outdoor information calculation unit 25 calculates and outputs the outdoor probability as the outdoor information.

For example, the outdoor information calculation unit 25 calculates the outdoor probability α (0 ≦ α ≦ 1) of the measurement target pixel by the following formula (10).

α = a × amb_norm + b ………(10)

In formula (10), a and b are predetermined constants.

The filtering unit 26 performs an optimum filtering process on the distance measurement result from the distance measuring unit 22 based on the outdoor probability α as the outdoor information supplied from the outdoor information calculating unit 25.

Specifically, when the outdoor probability α as the outdoor information is supplied from the outdoor information calculation unit 25 and the depth value d and the confidence cnf of the measurement target pixel are supplied from the distance measurement unit 22, the filtering unit 26 calculates the filtered depth value d' and confidence cnf' by the following formula (11).

d’=α·f1(d)+(1-α)·f2(d)

cnf’=α·g1(cnf)+(1-α)·g2(cnf)………(11)

Here, f1() represents the parameter set of an outdoor filter with the depth value d as input, f2() the parameter set of an indoor filter with the depth value d as input, g1() the parameter set of an outdoor filter with the confidence cnf as input, and g2() the parameter set of an indoor filter with the confidence cnf as input. The outdoor filter and the indoor filter are obtained by tuning an arbitrary filter, such as a noise reduction filter or a filter for sharpening the boundary portion of an object, for outdoor or indoor use. In the case where the outdoor information is a binary value indicating outdoors or indoors, α is 1 or 0, so either the outdoor filter or the indoor filter is selected.
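
The following Python sketch implements formulas (10) and (11); clamping α to [0, 1] and modeling the filter parameter sets f1, f2, g1, g2 as plain callables are illustrative simplifications.

    def outdoor_probability(amb_norm, a, b):
        # Formula (10), clamped so that 0 <= alpha <= 1.
        return min(max(a * amb_norm + b, 0.0), 1.0)

    def blend_filtered(d, cnf, alpha, f1, f2, g1, g2):
        # Formula (11): blend the outdoor and indoor filter outputs by
        # the outdoor probability alpha.
        d_out = alpha * f1(d) + (1.0 - alpha) * f2(d)
        cnf_out = alpha * g1(cnf) + (1.0 - alpha) * g2(cnf)
        return d_out, cnf_out

With binary outdoor information (α of 1 or 0), the blend degenerates to selecting one of the two filters, as noted above.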

The filtering unit 26 outputs the depth value d 'and the confidence cnf' that have been filtered as the depth value and the confidence of the measurement target pixel to the outside of the distance measurement module 11.

Note that the filtering unit 26 may generate mapping data in which the depth value d 'or the confidence cnf' is stored as a pixel value of each pixel 31 of the pixel array unit 32, and output the mapping data to a subsequent stage. In this case, a depth map in which a depth value d 'is stored as a pixel value of each pixel 31 of the pixel array unit 32 and a confidence map in which a confidence cnf' is stored as a pixel value of each pixel 31 of the pixel array unit 32 are generated and output.

<4. depth value calculation processing >

The depth value calculation process performed by the distance measurement module 11 will be described with reference to the flowchart of fig. 10. This process is started, for example, when a light emission control signal is supplied from the light emission control unit 13 to the light emission source 12 and the distance measurement sensor 14.

First, in step S1, the light receiving unit 21 receives the reflected light based on the light emission control signal from the light emission control unit 13. Specifically, the light receiving unit 21 receives the reflected light by changing the phase in a time-division manner so that the light receiving timing of each pixel 31 of the pixel array unit 32 is set to the phases of 0°, 90°, 180°, and 270°, respectively, with respect to the irradiation timing of the irradiation light. The detection signals A0 to A270 and B0 to B270 of each pixel, obtained by sequentially setting the phase to 0°, 90°, 180°, and 270°, are supplied as RAW data to the light emission control unit 13, the distance measurement unit 22, and the ambient light calculation unit 23.

In step S2, the distance measurement unit 22 sequentially sets the respective pixels 31 of the pixel array unit 32 as measurement target pixels, calculates the depth value d and the confidence cnf of each measurement target pixel based on the RAW data of the measurement target pixel supplied from the light receiving unit 21, and supplies the depth value d and the confidence cnf to the filtering unit 26. The depth value d is calculated by formula (1), and the confidence cnf by formula (6).

In step S3, the ambient light calculation unit 23 calculates the ambient light component amb included in the RAW data of the measurement target pixel supplied from the light receiving unit 21, and supplies the ambient light component amb to the ambient light normalization unit 24. The ambient light component amb is obtained by the above-described formula (7).

In step S4, the ambient light normalization unit 24 normalizes the ambient light component amb of the measurement target pixel supplied from the ambient light calculation unit 23 by using the number of pixels and the exposure time. For example, the ambient light normalization unit 24 calculates the normalized ambient light component amb_norm by formula (8) and supplies it to the outdoor information calculation unit 25. Note that, as described above, the normalized ambient light component amb_norm can also be calculated by formula (9).

In step S5, based on the normalized ambient light component amb_norm supplied from the ambient light normalization unit 24, the outdoor information calculation unit 25 calculates outdoor information regarding whether the current environment measured by the distance measurement module 11 is outdoors, and supplies the outdoor information to the filtering unit 26. For example, the outdoor information calculation unit 25 calculates the outdoor probability α of the measurement target pixel by the above-described formula (10).

In step S6, the filtering unit 26 performs an optimum filtering process on the distance measurement result from the distance measurement unit 22 according to the outdoor probability α as the outdoor information supplied from the outdoor information calculation unit 25. Specifically, for the depth value d and the confidence cnf of the measurement target pixel from the distance measurement unit 22, the filtering unit 26 calculates the filtered depth value d' and confidence cnf' by using formula (11). The filtered depth value d' and confidence cnf' are output to the outside of the distance measurement module 11 as the depth value and the confidence of the measurement target pixel.

The processing of steps S2 to S6 is performed on all the pixels 31 of the pixel array unit 32 by sequentially setting the respective pixels 31 of the pixel array unit 32 as measurement target pixels.

In step S7, the light emission control unit 13 sets the next exposure time based on the RAW data supplied from the light receiving unit 21. The process of step S7 may be performed in parallel with steps S2 to S6.

Thus, the depth value calculation process performed by the distance measurement module 11 is completed.

In the above-described depth value calculation process, both the distance information to the object (depth value d, confidence cnf) and the outdoor information are calculated from the detection signal obtained in each pixel 31 of the pixel array unit 32, and the process of reflecting the outdoor information (outdoor probability) in the distance information is performed. However, the outdoor information that has been calculated may be output to the outside in addition to the distance information.

Further, the distance measuring sensor 14 may also perform an operation of outputting only one of the distance information and the outdoor information.

Specifically, the distance measuring sensor 14 includes, as operation modes, a first operation mode for calculating both the distance information and the outdoor information, a second operation mode for calculating the distance information without calculating the outdoor information and outputting only the distance information, and a third operation mode for calculating only the outdoor information without calculating the distance information, and performs processing according to the operation mode specified by the setting screen or the setting control signal.

In the first operation mode, the distance measurement sensor 14 performs the depth value calculation process shown in fig. 10.

In the second operation mode, the distance measuring sensor 14 operates the light receiving unit 21, the distance measuring unit 22, and the filter unit 26 indicated by the solid line in fig. 11, calculates the distance information without calculating the outdoor information, and outputs only the distance information. The filtering unit 26 performs, for example, predetermined filtering processing determined in advance.

In the third operation mode, the distance measuring sensor 14 operates the light receiving unit 21, the ambient light calculation unit 23, the ambient light normalization unit 24, and the outdoor information calculation unit 25, which are indicated by solid lines in fig. 12, and calculates outdoor information without calculating distance information. In this case, the distance measuring sensor 14 operates as an outdoor determination sensor. Since the resolution of the pixel array unit 32 can be made lower than that of an outdoor determination sensor using a general RGB sensor that receives RGB light, driving power can be suppressed and outdoor determination can be achieved with lower power consumption.
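
A minimal sketch of the three-mode dispatch described above. The enum names and the compute_distance/compute_outdoor callables are hypothetical stand-ins for the unit chains shown in fig. 11 and fig. 12 (the distance measuring unit 22 plus the filtering unit 26, and units 23 to 25, respectively).

    from enum import Enum

    class OperationMode(Enum):
        DISTANCE_AND_OUTDOOR = 1  # first operation mode
        DISTANCE_ONLY = 2         # second operation mode
        OUTDOOR_ONLY = 3          # third operation mode

    def process_frame(mode, raw, compute_distance, compute_outdoor):
        # Run only the unit chains that the selected mode requires, so
        # the unused units can stay powered down.
        distance = outdoor = None
        if mode is not OperationMode.OUTDOOR_ONLY:
            distance = compute_distance(raw)
        if mode is not OperationMode.DISTANCE_ONLY:
            outdoor = compute_outdoor(raw)
        return distance, outdoor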

As described above, the distance measuring sensor 14 and the distance measuring module 11 can perform the calculation of distance information and the outdoor judgment by the distance measuring sensor alone. Since no separate outdoor detection sensor is required, power consumption and mounting volume can be reduced. Furthermore, since the distance measuring sensor 14 receives infrared light, which an RGB camera cannot capture, the presence of sunlight can be detected with higher accuracy than by imaging with an RGB camera.

<5. second embodiment >

Fig. 13 shows a block diagram of a second embodiment of a distance measuring module.

In fig. 13, portions corresponding to those in fig. 7 in the first embodiment are denoted by the same reference numerals, and description thereof will be omitted as appropriate.

The second embodiment of fig. 13 is configured similarly to the first embodiment shown in fig. 7, except that an object region detection unit 81 is newly provided and the ambient light calculation unit 23 is replaced with an ambient light calculation unit 82.

The depth value d and the confidence cnf of each pixel 31 of the pixel array unit 32 are supplied from the distance measurement unit 22 to the object region detection unit 81.

The object region detection unit 81 generates a confidence map in which the confidence cnf is stored as the pixel value of each pixel 31 of the pixel array unit 32. Then, based on the generated confidence map, the object region detection unit 81 detects an object region, which is a region including the object in the entire pixel region (hereinafter also referred to as a light receiving region) of the pixel array unit 32, and supplies the detected object region to the ambient light calculation unit 82 as region of interest (ROI) information indicating a region of interest, that is, a region to be focused on in the light receiving region.

Note that the object region detection unit 81 may also generate a depth map in which the depth value d is stored as the pixel value of each pixel 31 of the pixel array unit 32, and detect the object region by also using the depth map. By also using the distance information, the object region can be detected more accurately. Alternatively, the object region may be detected by using only the depth map without using the confidence map.

The ambient light calculation unit 82 performs processing similar to that in the first embodiment on the region of interest, in the light receiving region, indicated by the ROI information supplied from the object region detection unit 81. That is, the ambient light calculation unit 82 calculates the ambient light component amb included in the RAW data of each pixel 31 in the region of interest, and supplies the ambient light component amb to the ambient light normalization unit 24.

Further, the ambient light calculation unit 82 may also receive ROI information indicating a region of interest from the outside of the distance measurement sensor 14 or the like. In the case of providing the ROI information, the ambient light calculation unit 82 calculates the ambient light component amb included in the RAW data of the region of interest indicated by the ROI information in the light receiving region, and supplies the ambient light component amb to the ambient light normalization unit 24.

Fig. 14 is a diagram for explaining the processing of the ambient light calculation unit 82.

A of fig. 14 shows an example of the confidence map in which the confidence cnf of each pixel 31 supplied from the distance measurement unit 22 is stored. Note that A of fig. 14 is a conceptual illustration, since the actual confidence map is a grayscale image.

For example, region information indicating the region 91 in B of fig. 14 is provided as the ROI information for the confidence map shown in A of fig. 14. In the case where the region information indicating the region 91 is the object region detected by the object region detection unit 81, the region 91 may change dynamically according to the motion of the object. Alternatively, in the case where the region information indicating the region 91 is ROI information provided from the outside, the region 91 is fixed unless the ROI information is updated.

The ambient light calculation unit 82 calculates the ambient light component amb of every pixel in the region 91 of the confidence map. Alternatively, the ambient light calculation unit 82 may calculate the ambient light component amb only for predetermined sampling pixels 92 in the region 91. In this case, the sampling pixels 92 are determined in advance by a parameter or the like.

As described above, the ambient light calculation unit 82 acquires ROI information indicating a region of interest that is a part of the entire light receiving region, calculates the ambient light component amb of the region of interest, and supplies the ambient light component amb to the ambient light normalization unit 24.
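
A sketch of this region-restricted calculation, assuming a per-pixel map of amb values is already available; the (top, left, height, width) ROI layout and the stride-based sampling of pixels 92 are illustrative assumptions.

    def roi_ambient_samples(amb_map, roi, stride=1):
        # amb_map: 2D list of per-pixel ambient light components amb.
        # roi: (top, left, height, width) of the region of interest 91.
        # stride > 1 visits only the sampling pixels 92.
        top, left, height, width = roi
        return [amb_map[y][x]
                for y in range(top, top + height, stride)
                for x in range(left, left + width, stride)]

The returned samples would then feed the normalization of formula (8) in place of the whole-frame sum.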

The ambient light normalization unit 24 and the outdoor information calculation unit 25 in fig. 13 perform processing similar to that in the first embodiment, with respect to the region of interest. That is, the ambient light normalization unit 24 normalizes the ambient light component amb of each pixel 31 in the region of interest and supplies the normalized ambient light component amb_norm to the outdoor information calculation unit 25. The outdoor information calculation unit 25 calculates the outdoor information of each pixel 31 in the region of interest and supplies the outdoor information to the filtering unit 26.

The filtering unit 26 performs similar processing to that of the first embodiment with respect to the region of interest. That is, the filtering unit 26 performs the optimum filtering process on the distance measurement result from the distance measuring unit 22 based on the outdoor information of each pixel 31 in the region of interest. Note that, as for the region other than the region of interest in the entire region of the light receiving region, the value from the distance measuring unit 22 may be used as it is, or processing according to filter processing of the region of interest, for example, average filter processing of filter processing performed on respective pixels of the region of interest, or the like may be performed.

Since the depth value calculation process in the second embodiment is substantially similar to that in the first embodiment described with reference to fig. 10, a detailed description thereof will be omitted. In the depth value calculation process of the second embodiment, between steps S2 and S3 in fig. 10, an object region detection process performed by the object region detection unit 81 or a process in which the ambient light calculation unit 82 acquires ROI information from the outside is added. Then, in steps S4 to S6, the ambient light component amb is calculated for the pixels 31 in the region of interest, the calculated ambient light component amb is normalized, and the outdoor information is calculated. The rest of the process is similar to the depth value calculation process of the first embodiment described with reference to fig. 10.

<6 > third embodiment

Fig. 15 is a block diagram of a third embodiment of a distance measurement module.

In fig. 15, portions corresponding to those in the second embodiment shown in fig. 13 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.

In the third embodiment of fig. 15, an imaging sensor 101 is provided in addition to the distance measuring module 11. The imaging sensor 101 is an image sensor that receives light of RGB wavelengths and generates an image (RGB image) of a subject. The distance measuring module 11 and the imaging sensor 101 in fig. 15 constitute a distance measuring system (imaging system).

The imaging sensor 101 includes a light receiving unit 111 and a signal processing unit 112, and the signal processing unit 112 includes a demosaic processing unit 121, an ROI determining unit 122, and a filtering unit 123.

The distance measuring module 11 shown in fig. 15 is different from the distance measuring module 11 of the second embodiment shown in fig. 13 in that the object region detecting unit 81 is omitted. The ROI information generated by the ROI determining unit 122 of the imaging sensor 101 is provided to the ambient light calculation unit 82. Further, the outdoor information generated by the outdoor-information calculating unit 25 is supplied to the filtering unit 26, and is also supplied to the filtering unit 123 of the imaging sensor 101. The other part of the distance measuring module 11 of the third embodiment is similar to that of the second embodiment described above.

The light receiving unit 111 includes a pixel array unit in which pixels, each provided with a red (R), green (G), or blue (B) color filter arranged in a Bayer array or the like, are two-dimensionally arranged, and a signal obtained by photoelectrically converting the R, G, or B wavelength light received by each pixel is supplied to the demosaic processing unit 121 as an imaging signal.

The demosaic processing unit 121 generates an image signal including R, G, and B pixel signals for each pixel by performing color information interpolation processing or the like on the pixel signals supplied from the light receiving unit 111, each of which is an R, G, or B signal, and supplies the image signal to the ROI determining unit 122 and the filtering unit 123.

The ROI determining unit 122 performs a region of interest determination process for determining a region of interest on the image signal supplied from the demosaic processing unit 121. The ROI determining unit 122 performs processing similar to that of the object region detection unit 81, except that the processing target is not a grayscale image but an RGB image. Needless to say, the ROI determining unit 122 may determine the region of interest by a process different from that of the object region detection unit 81. The ROI determining unit 122 supplies ROI information indicating the region of interest obtained as a result of the region of interest determination process to the ambient light calculation unit 82 of the distance measuring sensor 14 and to the filtering unit 123.

The filtering unit 123 is supplied with the image signal from the demosaic processing unit 121 and the ROI information from the ROI determining unit 122. Further, the outdoor information is also supplied from the outdoor information calculation unit 25 of the distance measurement sensor 14 to the filtering unit 123.

The filtering unit 123 performs an optimal filtering process on the image signal of the region of interest among the image signals from the demosaic processing unit 121, based on the outdoor information of the region of interest. Note that the region other than the region of interest in the image captured by the light receiving unit 111 may be left as it is, or processing derived from the filtering applied to the region of interest, for example, filtering with the average of the filter processing applied to the respective pixels of the region of interest, may be performed.

The filtering unit 123 outputs a filtered image signal obtained by applying a predetermined filtering process to at least the region of interest to the outside.

Since the depth value calculation process of the distance measurement module 11 is similar to that of the above-described second embodiment, a description thereof will be omitted.

As described above, according to the third embodiment, the distance measurement module 11 can calculate the ambient light component amb based on the ROI information detected by the imaging sensor 101, which receives RGB light and generates a captured image, and output the depth value d' and the confidence cnf'. Further, since the filtering unit 123 of the imaging sensor 101 performs appropriate filter processing on the RGB image signal based on the outdoor information, for example, the adjustment of hue and edges can be optimally controlled according to the scene or the like.

The distance measuring module 11 in fig. 1 can be applied to, for example, an in-vehicle system that is mounted on a vehicle and measures a distance to an object outside the vehicle. Further, for example, the distance measurement module 11 in fig. 1 may be applied to a gesture recognition system that measures a distance to an object such as a hand of a user and recognizes a gesture of the user based on the measurement result, or the like.

<7. configuration example of electronic apparatus >

The distance measurement module 11 described above may be mounted on an electronic device such as a smartphone, a tablet terminal, a mobile phone, a personal computer, a game console, a television receiver, a wearable terminal, a digital still camera, or a digital video camera.

Fig. 16 is a block diagram showing a configuration example of a smartphone as an electronic device on which a distance measurement module is mounted.

As shown in fig. 16, a smartphone 201 is configured by connecting a distance measuring module 202, an imaging device 203, a display 204, a speaker 205, a microphone 206, a communication module 207, a sensor unit 208, a touch panel 209, and a control unit 210 via a bus 211. Further, by a CPU executing a program, the control unit 210 functions as an application processing unit 221 and an operating system processing unit 222.

The distance measurement module 11 in fig. 1 is applied to the distance measurement module 202. For example, the distance measurement module 202 is disposed on the front surface of the smartphone 201, and performs distance measurement on the user of the smartphone 201, so that the depth value of the surface shape of the user's face, hand, finger, or the like can be output as the distance measurement result.

The imaging device 203 is disposed on the front surface of the smartphone 201, and performs imaging with the user of the smartphone 201 as a subject to acquire an image of the user. Note that although not shown, an imaging device may also be provided on the rear surface of the smartphone 201.

The display 204 displays an operation screen for executing processing by the application processing unit 221 and the operating system processing unit 222, an image captured by the imaging device 203, and the like. For example, when a call is made by using the smartphone 201, the speaker 205 and the microphone 206 output the voice of the other party and collect the voice of the user.

The communication module 207 performs communication via a communication network. The sensor unit 208 senses speed, acceleration, proximity, and the like, and the touch panel 209 acquires a touch operation of the user on an operation screen displayed on the display 204.

The application processing unit 221 executes processing for providing various services by the smartphone 201. For example, the application processing unit 221 can perform processing of creating a computer-graphics face that virtually reproduces the user's expression based on the depth values supplied from the distance measurement module 202 and displaying the face on the display 204. Further, the application processing unit 221 can perform processing of creating three-dimensional shape data of an arbitrary solid object based on the depth values supplied from the distance measurement module 202, for example.

The operating system processing unit 222 executes processing for realizing the basic functions and operations of the smartphone 201. For example, the operating system processing unit 222 may perform processes of authenticating the user's face and unlocking the smartphone 201 based on the depth value provided from the distance measurement module 202. Further, based on the depth values provided from the distance measurement module 202, the operating system processing unit 222 may perform, for example, processing of recognizing a gesture of the user and processing of inputting various operations according to the gesture.

In the smartphone 201 configured as described above, by applying the distance measurement module 11 as described above, for example, the calculation of distance measurement information and the outdoor determination can be performed simultaneously. Therefore, the smartphone 201 can obtain distance measurement information more accurately.

<8. configuration example of computer >

The series of processes described above may be executed by hardware or by software. In the case where the series of processes is executed by software, a program constituting the software is installed on a general-purpose computer or the like.

Fig. 17 is a block diagram showing a configuration example of an embodiment of a computer in which a program for executing the series of processes described above is installed.

In the computer, a Central Processing Unit (CPU) 301, a Read Only Memory (ROM) 302, a Random Access Memory (RAM) 303, and an Electrically Erasable and Programmable Read Only Memory (EEPROM) 304 are connected to each other via a bus 305. Further, an input/output interface 306 is connected to the bus 305, and the input/output interface 306 is connected to the outside.

In the computer configured as described above, the CPU 301 loads a program stored in the ROM 302 or the EEPROM 304 into the RAM 303 via the bus 305 and executes it, whereby the series of processes described above is performed. The program executed by the computer (CPU 301) may be written in the ROM 302 in advance, or may be installed into the EEPROM 304 from the outside via the input/output interface 306, or may be updated.

Thus, the CPU 301 performs the processing according to the above-described flowcharts or the processing performed by the configurations of the above-described block diagrams. Then, the CPU 301 can output the processing results to the outside via the input/output interface 306 as necessary, for example.

In this specification, the processes performed by the computer according to the program do not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processes performed by the computer according to the program also include processes executed in parallel or individually (for example, parallel processing or object-based processing).

Further, the program may be processed by one computer (processor), or may be distributed-processed by a plurality of computers. Further, the program may be transferred to a remote computer and executed.

<9. application example of moving object >

The technique according to the present disclosure (present technique) can be applied to various products. For example, the technology according to the present disclosure may be implemented as a device mounted on any type of moving object such as an automobile, an electric automobile, a hybrid electric automobile, a motorcycle, a bicycle, a personal mobile vehicle, an airplane, a drone, a ship, or a robot.

Fig. 18 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a moving object control system to which the technique according to the present disclosure can be applied.

The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in fig. 18, the vehicle control system 12000 includes a drive system control unit 12010, a vehicle body system control unit 12020, a vehicle exterior information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050. Further, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network interface (I/F) 12053 are shown.

The drive system control unit 12010 controls the operations of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device of a driving force generation device, such as an internal combustion engine or a drive motor, for generating a driving force of the vehicle, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a brake device that generates a braking force of the vehicle.

The vehicle body system control unit 12020 controls the operations of various devices provided on the vehicle body according to various programs. For example, the vehicle body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal lamp, or a fog lamp. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the vehicle body system control unit 12020. The vehicle body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, and the like of the vehicle.

The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle, and receives the captured image. The vehicle exterior information detection unit 12030 may perform processing of detecting an object such as a person, a vehicle, an obstacle, a sign, or a character on a road surface, or distance detection processing, based on the received image.

The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of light received. The imaging unit 12031 may output the electric signal as an image, or may output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.

The in-vehicle information detection unit 12040 detects information inside the vehicle. For example, a driver condition detector 12041 that detects the condition of the driver is connected to the in-vehicle information detection unit 12040. The driver condition detector 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing off, based on the detection information input from the driver condition detector 12041.

The microcomputer 12051 can calculate control target values of the driving force generation device, the steering mechanism, or the brake device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an Advanced Driver Assistance System (ADAS), including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, traveling while maintaining the vehicle speed, vehicle collision warning, vehicle lane departure warning, and the like.

Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.

Further, the microcomputer 12051 can output a control command to the vehicle body system control unit 12020 based on the vehicle exterior information acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.

The audio image output unit 12052 transmits an output signal of at least one of audio or an image to an output device capable of visually or aurally notifying the passenger or the outside of the vehicle of information. In the example of fig. 18, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are shown as examples of the output device. For example, the display unit 12062 may include at least one of an in-vehicle display or a head-up display.

Fig. 19 is a diagram illustrating an example of the mounting position of the imaging unit 12031.

In fig. 19, a vehicle 12100 includes, as the imaging unit 12031, imaging units 12101, 12102, 12103, 12104, and 12105.

For example, the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the rear door, and an upper portion of the windshield inside the vehicle cabin of the vehicle 12100. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper portion of the windshield inside the vehicle cabin mainly acquire images in front of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the lateral sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the rear door mainly acquires images behind the vehicle 12100. The front images acquired by the imaging units 12101 and 12105 are mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a traffic lane, and the like.

Note that fig. 19 shows an example of the imaging ranges of the imaging units 12101 to 12104. An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the rear door. For example, by superimposing the pieces of image data captured by the imaging units 12101 to 12104, a bird's eye view of the vehicle 12100 viewed from above can be obtained.

At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or an imaging element having pixels for phase difference detection.

For example, the microcomputer 12051 can extract, as a preceding vehicle, the nearest three-dimensional object on the traveling path of the vehicle 12100, the object traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100, by determining the distance to each three-dimensional object in the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104. Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on the driver's operation.
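
A minimal sketch of this preceding-vehicle extraction logic might look as follows; the object representation, the lane test, and the speed floor are assumptions made for illustration.

```python
# Hypothetical preceding-vehicle extraction from per-object distance and
# relative-speed estimates derived from the imaging units.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrackedObject:
    distance_m: float           # current distance from the own vehicle
    relative_speed_kmh: float   # temporal change of distance (closing < 0)
    in_own_lane: bool           # whether it lies on the traveling path

def extract_preceding_vehicle(objects: List[TrackedObject],
                              own_speed_kmh: float) -> Optional[TrackedObject]:
    candidates = [
        o for o in objects
        if o.in_own_lane
        # Absolute speed = own speed + relative speed; keep objects moving
        # in substantially the same direction at 0 km/h or more.
        and own_speed_kmh + o.relative_speed_kmh >= 0.0
    ]
    # The nearest qualifying three-dimensional object is the preceding vehicle.
    return min(candidates, key=lambda o: o.distance_m, default=None)
```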

For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where there is a possibility of collision because the collision risk is equal to or greater than a set value, the microcomputer 12051 can output an alarm to the driver through the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering through the drive system control unit 12010, to provide driving assistance for collision avoidance.
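
For illustration, collision risk is often expressed through time-to-collision (TTC); the sketch below maps TTC to a risk score and compares it with a set value. All thresholds are assumed rather than taken from the document.

```python
# Hypothetical TTC-based collision risk scoring.
def collision_risk(distance_m: float, closing_speed_ms: float) -> float:
    """Returns a risk score in [0, 1]; closing_speed_ms > 0 means the
    obstacle is getting closer."""
    if closing_speed_ms <= 0.0:
        return 0.0                       # distance is opening: no collision course
    ttc = distance_m / closing_speed_ms  # seconds until contact
    ttc_alarm = 3.0                      # assumed: start warning below 3 s
    return max(0.0, min(1.0, 1.0 - ttc / ttc_alarm))

def should_intervene(risk: float, set_value: float = 0.7) -> bool:
    """Forced deceleration / avoidance steering when risk >= the set value."""
    return risk >= set_value
```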

At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
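
The two-step recognition described above (feature extraction followed by pattern matching on a contour) could be sketched as follows; Canny edge extraction and OpenCV template matching are illustrative stand-ins for the unspecified feature extractor and matcher.

```python
# Sketch of pedestrian recognition by contour feature extraction and
# template pattern matching on an infrared image.
import cv2
import numpy as np

def find_pedestrians(ir_image: np.ndarray, template: np.ndarray,
                     score_threshold: float = 0.6):
    """ir_image, template: single-channel uint8 infrared images.
    Returns a list of (x, y, w, h) boxes to emphasize on the display."""
    edges = cv2.Canny(ir_image, 50, 150)       # feature (contour) extraction
    tmpl_edges = cv2.Canny(template, 50, 150)  # pedestrian contour template
    scores = cv2.matchTemplate(edges, tmpl_edges, cv2.TM_CCOEFF_NORMED)
    h, w = template.shape
    ys, xs = np.where(scores >= score_threshold)
    return [(int(x), int(y), w, h) for x, y in zip(xs, ys)]
```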

An example of a vehicle control system to which the technique according to the present disclosure can be applied has been described above. The technique according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the in-vehicle information detection unit 12040 in the configuration described above. Specifically, by using the distance measurement by the distance measurement module 11 in the vehicle exterior information detection unit 12030 and the in-vehicle information detection unit 12040, it is possible to perform processing of recognizing a gesture of the driver, to perform various operations (for example, on an audio system, a navigation system, or an air conditioning system) according to the gesture, and to detect the driver's condition more accurately. Further, the distance measurement by the distance measurement module 11 can be used to recognize the unevenness of the road surface and reflect it in the control of the suspension.

Note that the present technology can be applied to a method of amplitude-modulating the light projected onto the object, called the continuous wave method, among indirect ToF methods. Further, as the structure of the photodiode 51 of the light receiving unit 21, the present technology can be applied to a distance measuring sensor having a structure that distributes charge to two charge storage units, for example, a distance measuring sensor having a Current Assisted Photonic Demodulator (CAPD) structure, or a gate-type distance measuring sensor in which the charge of the photodiode is alternately distributed to two gates in a pulsed manner. Furthermore, the present technology can be applied to a distance measuring sensor of the structured light method.
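
For reference, the continuous wave method mentioned above typically recovers distance from four charge samples taken at phase shifts of 0, 90, 180, and 270 degrees relative to the emitted modulation. The sketch below shows the standard textbook demodulation under one common sign convention, not necessarily the exact processing of the distance measuring unit 22.

```python
# Standard four-phase continuous-wave (indirect ToF) demodulation sketch.
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def cw_tof(a0, a1, a2, a3, f_mod=20e6):
    """a0..a3: charge samples at 0/90/180/270 degrees (arrays or scalars)."""
    i = a0 - a2                                  # in-phase component
    q = a1 - a3                                  # quadrature component
    phase = np.arctan2(q, i) % (2.0 * np.pi)     # phase delay of the echo
    depth = C * phase / (4.0 * np.pi * f_mod)    # unambiguous range: c / (2 f)
    confidence = np.sqrt(i * i + q * q) / 2.0    # reflected-light amplitude
    # The per-tap mean contains the ambient light component plus offsets;
    # subtracting the dark current component would isolate the ambient part.
    offset = (a0 + a1 + a2 + a3) / 4.0
    return depth, confidence, offset
```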

Embodiments of the present technology are not limited to the above-described embodiments, and various modifications may be made without departing from the scope of the present technology.

Each of the plurality of present techniques described in this specification can be implemented independently as long as no contradiction arises. Needless to say, any plurality of the present techniques may be implemented in combination. For example, some or all of the present techniques described in any embodiment may be implemented in combination with some or all of the present techniques described in another embodiment. Furthermore, some or all of any of the present techniques described above may be implemented in combination with another technique not described above.

Further, for example, the configuration described as one apparatus (or processing unit) may be divided and configured as a plurality of apparatuses (or processing units). Conversely, the configurations described above as a plurality of apparatuses (or processing units) may be collectively configured as one apparatus (or processing unit). Further, it goes without saying that a configuration other than those described above may be added to the configuration of each apparatus (or each processing unit). Furthermore, as long as the configuration and operation of the entire system are substantially the same, a part of the configuration of a certain apparatus (or processing unit) may be included in the configuration of another apparatus (or another processing unit).

Further, in this specification, a system means a set of a plurality of constituent elements (devices, modules (components), and the like), and it does not matter whether or not all the constituent elements are in the same housing. Therefore, a plurality of apparatuses accommodated in separate housings and connected via a network, and one apparatus in which a plurality of modules are accommodated in one housing, are both systems.

Further, for example, the above-described program may be executed in any device. In this case, it is sufficient that the apparatus has necessary functions (function blocks and the like) and can obtain necessary information.

Note that the present technology can be configured as follows.

(1)

A distance measuring sensor comprising

A distance measuring unit that calculates distance information to the object based on the signal obtained by the light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source,

an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light receiving unit, and

an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

(2)

The distance measuring sensor according to (1), further comprising

A normalization unit that normalizes the ambient light component calculated by the ambient light calculation unit,

wherein the outdoor-information calculating unit estimates the outdoor information based on the ambient-light component that has been normalized.

(3)

The distance measuring sensor according to (2),

wherein the normalization unit normalizes the ambient light component by using the exposure time and the number of pixels.

(4)

The distance measuring sensor according to (2),

wherein the normalization unit normalizes the ambient light component by using the exposure time, the number of pixels, and the distance information.

(5)

The distance measuring sensor according to any one of (1) to (4),

wherein the ambient light calculation unit calculates the ambient light component by subtracting the dark current component from the signal obtained by the light receiving unit.

(6)

The distance measuring sensor according to any one of (1) to (5),

wherein both the distance information and the outdoor information are calculated by using the signal obtained by the light receiving unit.

(7)

The distance measuring sensor according to any one of (1) to (6), wherein as an operation mode, there is provided

A first operation mode of calculating distance information and outdoor information,

a second operation mode in which distance information is calculated without calculating outdoor information, or

A third operation mode of calculating outdoor information without calculating distance information.

(8)

The distance measuring sensor according to any one of (1) to (7),

wherein the ambient light calculation unit calculates an ambient light component of a region of interest that is a part of a light receiving region of the light receiving unit.

(9)

The distance measuring sensor according to (8),

wherein the ambient light calculation unit acquires region information indicating a region of interest, and calculates an ambient light component of the region of interest.

(10)

The distance measuring sensor according to (9),

wherein the ambient light calculation unit acquires region information indicating a region of interest, the region information having been detected by an imaging sensor that generates a captured image obtained by receiving RGB light, and calculates an ambient light component of the region of interest.

(11)

The distance measuring sensor according to (8),

wherein, in addition to the distance information, the distance measurement unit also calculates confidence information,

the distance measurement sensor further includes a region detection unit that detects the region of interest by using at least one of the distance information or the confidence information, and

the ambient light calculation unit calculates an ambient light component of the region of interest.

(12)

The distance measuring sensor according to any one of (1) to (11), further comprising

A filtering unit that performs a predetermined filtering process on the distance information,

the filtering unit performs predetermined filtering processing based on the outdoor information.

(13)

The distance measuring sensor according to any one of (1) to (12), further comprising

A control unit that controls an exposure time based on the signal obtained by the light receiving unit.

(14)

The distance measuring sensor according to any one of (1) to (13),

wherein the outdoor information calculation unit calculates, as the outdoor information, whether the environment is outdoor or indoor.

(15)

The distance measuring sensor according to any one of (1) to (13),

wherein the outdoor information calculation unit calculates the probability of being outdoors as the outdoor information.

(16)

A signal processing method comprising

Calculating, by a distance measuring sensor, distance information to the object from a signal obtained by a light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source;

Calculating, by the distance measuring sensor, an ambient light component included in the signal obtained by the light receiving unit; and

by using the distance measuring sensor, outdoor information is calculated based on the ambient light component.

(17)

A distance measuring module comprising

A predetermined light emitting source, and

a distance-measuring sensor is provided which is,

the distance measuring sensor comprises

A distance measuring unit that calculates distance information to the object based on the signal obtained by the light receiving unit that receives reflected light obtained by the object reflecting the irradiation light emitted from the predetermined light emitting source,

an ambient light calculation unit that calculates an ambient light component included in the signal obtained by the light receiving unit, and

an outdoor information calculation unit that calculates outdoor information based on the ambient light component.

List of reference marks

11 distance measuring module

12 light emitting source

13 light emission control unit

14 distance measuring sensor

21 light receiving unit

22 distance measuring unit

23 ambient light calculation unit

24 ambient light normalization unit

25 outdoor information calculation unit

26 filtering unit

81 object region detection unit

82 ambient light calculation unit

91 region

101 imaging sensor

111 light receiving unit

112 signal processing unit

121 demosaicing processing unit

122 ROI determination unit

123 filtering unit

201 smartphone

202 distance measuring module

301 CPU

302 ROM

303 RAM
