Photodetector and distance measuring apparatus

Document No.: 863457    Publication date: 2021-03-16

Description: This technology, Photodetector and distance measuring apparatus, was designed and created by 槙本宪太 on 2019-07-17. Its main content is as follows: A photodetector according to the present disclosure is provided with: a photodetecting unit having a plurality of pixels arranged two-dimensionally; a signal line connected to the pixels; a time measuring unit that is connected to the signal line and measures the time from the light emission command timing to the light reception timing; a histogram generating unit that generates a histogram of the measurement values measured by the time measuring unit; a storage unit that stores a correction value corresponding to the position of a pixel in the photodetecting unit; a correction processing unit that performs correction processing on the histogram generated by the histogram generating unit based on the correction value stored in the storage unit; and an output unit that outputs the signal subjected to the correction processing performed by the correction processing unit. The distance measuring apparatus according to the present invention uses the photodetector having the above-described configuration.

1. A light receiving device comprising:

a light receiving section having a plurality of pixels arranged in a two-dimensional shape;

a signal line connected to each of the pixels;

a time measuring section that is connected to the signal line and measures a time from a light emission instruction timing to a light reception timing;

a histogram creation section that creates a histogram of the measurement values measured by the time measurement section;

a storage section that stores a correction value corresponding to a position of the pixel in the light receiving section;

a correction processing section that performs correction processing on the histogram created by the histogram creation section based on the correction value stored in the storage section; and

an output section that outputs the signal subjected to the correction processing by the correction processing section.

2. The light receiving device according to claim 1, wherein the correction value is a value based on a distance from the pixel to the time measuring section.

3. The light receiving device according to claim 2, wherein the correction values for the other pixels are calculated by linear interpolation based on the correction values for the pixels at the ends in the light receiving section.

4. The light receiving device according to claim 1, wherein a plurality of the histogram creation sections are provided corresponding to pixel rows in the light receiving section, and

the correction processing section performs correction processing on each histogram created by each of the plurality of histogram creation sections.

5. The light receiving device according to claim 4, wherein the correction processing section performs correction processing in units of a bin in the histogram.

6. The light receiving device according to claim 4, wherein the correction processing section performs correction processing using a system correction value common to all the histograms created by each of the plurality of histogram creation sections.

7. The light receiving device according to claim 6, wherein the system correction value is a value corresponding to a delay common to all the histograms created by each of the plurality of histogram creation sections.

8. The light receiving device according to claim 1, wherein the storage section includes a set of correction registers in which the correction value is set for each histogram.

9. The light receiving device according to claim 8, wherein the correction processing section is provided at a subsequent stage of the histogram creation section, and the correction processing section performs correction processing by adding the correction value to the bin value of the histogram created by each of the histogram creation sections.

10. The light receiving device according to claim 8, wherein the correction processing section is provided in a preceding stage of the histogram creation section, and the correction processing section performs correction processing by adding the correction value to each measurement value measured by the time measurement section.

11. The light receiving device according to claim 1, wherein the light receiving element in each of the pixels includes an element that generates a signal in response to reception of a photon.

12. The light receiving device according to claim 1, wherein the light receiving section includes a pixel group in units of the plurality of pixels,

the signal lines include a signal line group having a plurality of the signal lines as a unit, and

the plurality of pixels included in the pixel group are respectively connected one-to-one to the plurality of signal lines included in the signal line group.

13. A distance measuring device comprising:

a light source that irradiates the measurement target with light; and

a light receiving device that receives light reflected by the measurement target, the light receiving device including

a light receiving section having a plurality of pixels arranged in a two-dimensional shape,

a signal line connected to each of the pixels,

a time measuring section connected to the signal line and measuring a time from a light emission instruction timing to a light reception timing,

a histogram creation section that creates a histogram of the measurement values measured by the time measurement section,

a storage section that stores a correction value corresponding to a position of the pixel in the light receiving section,

a correction processing section that performs correction processing on the histogram created by the histogram creation section based on the correction value stored in the storage section, and

an output section that outputs the signal subjected to the correction processing by the correction processing section.

Technical Field

The present invention relates to a light receiving device and a distance measuring apparatus.

Background

Some light receiving devices use, as a light receiving element, an element that generates a signal in response to the reception of photons (for example, see PTL 1). As a measuring method for measuring the distance to a measurement target, this type of light receiving device employs the TOF (time of flight) method, which measures the time from the irradiation of light toward the measurement target until the light returns after being reflected by the measurement target. The direct TOF method, one type of TOF method, calculates the distance directly from the time of flight of light, and thus requires accurate determination of the time of flight of a photon.

A light receiving device in which pixels each including a light receiving element are arranged in a two-dimensional shape can acquire a three-dimensional depth map. In such a device, however, the length of the path from each pixel to the time-to-digital converter (TDC) varies, which disadvantageously causes propagation delay skew in the two-dimensional plane (hereinafter referred to as "in-plane delay skew").

[ citation list ]

[ patent document ]

[ PTL1] Japanese patent laid-open publication No. 2016-211881

Disclosure of Invention

[ problem ]

To eliminate the in-plane delay skew, one possible technique is to insert buffers for delay adjustment directly into the routes from the plurality of light receiving elements (pixels) to the time-to-digital converter (TDC). However, characteristic variations between the added buffers may further degrade the in-plane delay skew. It is therefore difficult to correct the in-plane delay skew by adding buffers.

In addition, the in-plane delay skew may be corrected using an application processor provided in a stage subsequent to the light receiving device. However, when the in-plane delay skew is corrected by the application processor, the processing delay of the system as a whole occurs in units of frames in which all signals from the plurality of pixels are acquired. The processing delay therefore becomes significant, adversely affecting applications that require an immediate response.

Accordingly, an object of the present disclosure is to provide a light receiving device that can perform an excellent correction process for in-plane delay skew, and a distance measuring apparatus using the light receiving device.

[ solution to problem ]

The light receiving device of the present disclosure for achieving the above object includes: a light receiving section having a plurality of pixels arranged in a two-dimensional shape; a signal line connected to each pixel; a time measuring section connected to the signal line and measuring a time from a light emission instruction timing to a light reception timing; a histogram creation section that creates a histogram of the measurement values measured by the time measurement section; a storage section that stores a correction value corresponding to a position of the pixel in the light receiving section; a correction processing section that performs correction processing on the histogram created by the histogram creation section based on the correction value stored in the storage section; and an output section that outputs the signal subjected to the correction processing by the correction processing section.

In addition, a distance measuring apparatus of the present disclosure for achieving the above object includes: a light source that irradiates a measurement target with light, and a light receiving device that receives light reflected by the measurement target, and as the light receiving device, the light receiving device configured as described above is used.

Drawings

Fig. 1 is a schematic configuration diagram showing a distance measuring apparatus according to an embodiment of the present disclosure.

Fig. 2A and 2B are block diagrams showing a specific configuration of a distance measuring apparatus according to an embodiment of the present disclosure.

Fig. 3 is a circuit diagram showing a basic pixel circuit of a light receiving device using a SPAD element.

Fig. 4A is a characteristic diagram depicting a current-voltage characteristic of a PN junction of a SPAD element, and fig. 4B is a waveform diagram for describing a circuit operation of a pixel circuit.

Fig. 5 is a schematic plan view showing an example of a light receiving portion of a light receiving device.

Fig. 6 is a block diagram showing a basic configuration of the distance measurement control section of the light receiving device.

Fig. 7 is a diagram showing delay skew in a two-dimensional plane.

Fig. 8 is a block diagram showing a configuration of a light receiving device according to example 1.

Fig. 9 is a block diagram showing a configuration example of an in-plane delay correcting section of a light receiving device according to example 1.

Fig. 10 is a timing chart showing a timing relationship among DATA related to each histogram, an address ADDR of the histogram, a correction amount OFST, and a BIN value BIN of each corrected histogram.

Fig. 11A is a flowchart depicting the flow of the correction process for the in-plane delay skew in the light receiving device according to example 1, and Fig. 11B is a diagram showing the positional relationship, in the time axis direction, between data relating to the uncorrected histogram and data relating to the corrected histogram in the case of example 1.

Fig. 12A is a diagram showing a positional relationship of each pixel with respect to a time measuring section in the case of example 2, and fig. 12B is a diagram showing that a delay from each pixel to the time measuring section is linear in a plane.

Fig. 13 is a diagram showing a positional relationship between data relating to an uncorrected histogram and data relating to a corrected histogram in the time axis direction in the case of example 3.

Fig. 14 is a block diagram showing a configuration of a light receiving device according to example 4.

Fig. 15 is a block diagram showing an example of a schematic configuration of a vehicle control system to which an example according to the technique of the present disclosure can be applied.

Fig. 16 is a diagram to help explain an example of the installation position of the distance measuring apparatus.

Detailed Description

Embodiments (hereinafter, referred to as "embodiments") for implementing the technology of the present disclosure will be described in detail below using the drawings. The technique of the present disclosure is not limited to the embodiments, and various numerical values and the like in the embodiments are illustrative. In the following description, the same elements or elements having the same function are denoted by the same reference numerals, and overlapping description is omitted. Note that the description will be given in the following order.

1. Overview of light receiving device and distance measuring apparatus of the present disclosure

2. Distance measuring apparatus according to embodiments

2-1 basic configuration of light receiving device using SPAD element

2-2. Structure of light receiving part of light receiving device

2-3 basic structure of signal processing section of light receiving device

2-4. In-plane delay skew

3. Light receiving device according to embodiments

3-1. example 1 (example of performing correction processing of the in-plane delay skew when data relating to the histogram is read out from the histogram creation section)

3-2. example 2 (modified example of example 1: example in which the delay of each pixel to the time measuring section tends to be linear in plane)

3-3. example 3 (modified example of example 1: example in which correction processing is also performed on a delay common to all histograms)

3-4. example 4 (example of performing correction processing of the in-plane delay skew when writing data to the histogram creation section)

4. Technical application example (mobile example) according to the present disclosure

5. Configurations that the present disclosure may take

< overview of the light receiving device and distance measuring apparatus of the present disclosure >

The light receiving device and the distance measuring apparatus in the present disclosure may be configured such that the correction value is a value based on a distance from the pixel to the time measuring section. The light receiving device and the distance measuring apparatus in the present disclosure may be configured such that, based on the correction values for the pixels at the ends in the light receiving section, the correction values for the other pixels may be calculated by linear interpolation.
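The linear-interpolation idea above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name and the example correction values (in TDC bin units) are assumptions.

```python
def interpolate_corrections(first_value, last_value, num_pixels):
    """Linearly interpolate per-pixel delay-correction values from the
    correction values at the two end pixels of a row (illustrative)."""
    if num_pixels < 2:
        return [first_value] * num_pixels
    step = (last_value - first_value) / (num_pixels - 1)
    return [first_value + step * i for i in range(num_pixels)]

# Hypothetical end-pixel corrections of 0 and 12 bins across 5 pixels:
corrections = interpolate_corrections(0.0, 12.0, 5)
# → [0.0, 3.0, 6.0, 9.0, 12.0]
```

Storing only the end-pixel values and interpolating the rest keeps the correction register set small, which is the motivation stated in claim 3.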

The light receiving device and the distance measuring apparatus of the present disclosure including the above-described preferred embodiment and configuration may be configured such that a plurality of histogram creation sections are provided corresponding to pixel rows in the light receiving section. In this case, the light receiving device and the distance measuring apparatus may be configured such that the correction processing section performs the correction processing on each histogram created by each of the plurality of histogram creation sections. In addition, the light receiving device and the distance measuring apparatus may be configured such that the correction processing section performs the correction processing in units of bins in the histogram.

Further, the light receiving device and the distance measuring apparatus of the present disclosure including the above-described preferred embodiment and configuration may be configured such that the correction processing section performs the correction processing using the system correction value common to all the histograms created by each of the plurality of histogram creation sections. The light receiving device and the distance measuring apparatus of the present disclosure may be configured such that the system correction value is a value corresponding to a delay common to all histograms created by each of the plurality of histogram creation sections.

Further, the light receiving device and the distance measuring apparatus including the above-described preferred embodiments and configurations may be configured such that the storage section includes a set of correction registers in which a correction value is set for each histogram. Further, the light receiving device and the distance measuring apparatus may be configured such that the correction processing section is provided in a later stage of the histogram creation section, and the correction processing is performed by adding a correction value to a bin value of the histogram created by each histogram creation section. Alternatively, the light receiving device and the distance measuring apparatus may be configured such that the correction processing section is provided in a preceding stage of the histogram creation section, and the correction processing is performed by adding the correction value to each measurement value measured by the time measurement section.
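The two placements of the correction processing section can be sketched as follows. This is a hedged illustration of the idea only; function names, the bin-granular offset, and the discarding of out-of-range values are assumptions not stated in the source.

```python
def correct_after_histogram(histogram, offset_bins):
    """Post-stage correction: shift the finished histogram along the
    time axis by the correction offset (in bins), dropping counts that
    fall outside the bin range."""
    out = [0] * len(histogram)
    for i, count in enumerate(histogram):
        j = i + offset_bins
        if 0 <= j < len(histogram):
            out[j] = count
    return out

def correct_before_histogram(measurements, offset, num_bins):
    """Pre-stage correction: add the correction offset to each raw
    time measurement before binning, then build the histogram."""
    hist = [0] * num_bins
    for m in measurements:
        b = m + offset
        if 0 <= b < num_bins:
            hist[b] += 1
    return hist
```

For an integer offset the two placements produce the same corrected histogram; the difference lies in where the adder sits relative to the histogram memory.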

In addition, the light receiving device and the distance measuring apparatus including the above-described preferred embodiments and configurations may be configured such that the light receiving element in each pixel includes an element that generates a signal in response to the reception of photons.

In addition, the light receiving device and the distance measuring apparatus including the above-described preferred embodiments and configurations may be configured such that the light receiving section includes a group of pixels in units of a plurality of pixels, such that the signal line includes a group of signal lines in units of a plurality of signal lines, and such that each of the plurality of pixels included in the pixel group is connected one-to-one with each of the plurality of signal lines included in the signal line group.

< distance measuring apparatus according to the embodiment >

Fig. 1 is a schematic configuration diagram illustrating a distance measuring apparatus according to an embodiment of the present disclosure. The distance measuring apparatus 1 according to the present embodiment employs a TOF (time of flight) method (as a measuring method for measuring a distance to the subject 10 corresponding to a measurement target) to measure a time from radiation of light (for example, laser light having a peak wavelength in an infrared wavelength range) to the subject 10 until the light returns after being reflected by the subject 10. In order to realize distance measurement according to the TOF method, the distance measuring apparatus 1 according to the present embodiment includes: a light source 20 and a light receiving device 30. As the light receiving device 30, a light receiving device according to an embodiment of the present disclosure described below is used.

Fig. 2A and 2B show a specific configuration of the distance measuring apparatus 1 according to the present embodiment. The light source 20 includes, for example, a laser driver 21, a laser light source 22, and a diffusion lens 23 to irradiate the object 10 with laser light. The laser driver 21 drives the laser light source 22 under the control of the control section 40. The laser light source 22 includes, for example, a semiconductor laser, which emits laser light by being driven by the laser driver 21. The diffusion lens 23 diffuses the laser light emitted from the laser light source 22 to irradiate the object 10 with the laser light.

The light receiving device 30 includes a light receiving lens 31, an optical sensor 32 as a light receiving portion, and a logic circuit 33, and receives the reflected laser light, i.e., the laser light emitted from the light source 20 and reflected by the subject 10. The light receiving lens 31 focuses the reflected laser light from the subject 10 on the light receiving surface of the optical sensor 32. The optical sensor 32 receives the reflected laser light from the subject 10, which has passed through the light receiving lens 31, in units of pixels, and then performs photoelectric conversion.

The output signal from the optical sensor 32 is fed to the control section 40 via the logic circuit 33. The optical sensor 32 will be described in detail below. The control section 40 includes, for example, a CPU (central processing unit) or the like, and controls the light source 20 and the light receiving device 30, and measures a time t from laser irradiation of the light source 20 toward the subject 10 until laser light returns after being reflected by the subject 10. Based on the time t, the distance L to the subject 10 can be obtained.

One method of time measurement starts a timer at the time when pulsed light is radiated from the light source 20, stops the timer at the time when the light receiving device 30 receives the pulsed light, and thereby measures the time t. Another method radiates pulsed light from the light source 20 at a predetermined period, detects the timing at which the light receiving device 30 receives the pulsed light, and measures the time t from the phase difference between the light emission period and the light reception period. The time measurement is performed a plurality of times, and the time t is measured by detecting the peak of a histogram generated by accumulating the times measured the plurality of times.
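Because the light travels to the subject and back, the distance L follows from the measured round-trip time t as L = c·t/2. A minimal sketch of this arithmetic (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_tof(t_seconds):
    """Direct-TOF distance: the light covers the path twice (out and
    back), so the one-way distance is c * t / 2."""
    return C * t_seconds / 2.0

# A 10 ns round trip corresponds to about 1.5 m:
d = distance_from_tof(10e-9)  # ≈ 1.499 m
```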

As the optical sensor 32, a two-dimensional array sensor (so-called area sensor), in which pixels each including a light receiving element are two-dimensionally arranged in a matrix, may be used, or a one-dimensional array sensor (so-called line sensor), in which such pixels are linearly arranged, may be used.

In the present embodiment, as the optical sensor 32, a sensor is used in which the light receiving element of each pixel includes an element that generates a signal in response to the reception of photons, such as a SPAD (single photon avalanche diode) element. Specifically, the light receiving device 30 according to the present embodiment is configured such that the light receiving element of each pixel includes a SPAD element. Note that the light receiving element is not limited to the SPAD element, and may be any of various elements, such as an APD (avalanche photodiode) or a CAPD (current assisted photonic demodulator).

[ basic Circuit of light receiving device Using SPAD element ]

Fig. 3 shows a circuit diagram of a basic pixel circuit of the light receiving device 30 using the SPAD element. Here, a basic configuration of one pixel is depicted.

The pixel circuit in the pixel 50 according to the present embodiment is configured such that the cathode electrode of the SPAD element 51 is connected to a supply voltage V_DD via a P-type MOS transistor Q_L serving as a load, and the anode electrode is connected to a terminal 53 that supplies an anode voltage V_bd. As the anode voltage V_bd, a large negative voltage at which avalanche multiplication occurs is applied. A capacitive element C is connected between the anode electrode and ground. The cathode voltage V_CA of the SPAD element 51 is derived as the SPAD output (pixel output) through a CMOS inverter 54 including a series-connected P-type MOS transistor Q_p and N-type MOS transistor Q_n.

A voltage equal to or higher than the breakdown voltage V_BD is applied to the SPAD element 51. The amount by which the applied voltage exceeds the breakdown voltage V_BD is referred to as the excess bias voltage V_EX and is typically about 2 V to 5 V. The SPAD element 51 operates in a region with no DC stabilization point, called the Geiger mode. Fig. 4A shows the I (current)-V (voltage) characteristics of the PN junction of the SPAD element 51.

Now, the circuit operation of the pixel circuit in the pixel 50 configured as described above will be described using the waveform diagram in fig. 4B.

In the case where no current flows through the SPAD element 51, a voltage V_DD − V_bd is applied to the SPAD element 51. The value of this voltage (V_DD − V_bd) is (V_BD + V_EX). Electrons generated at the PN junction of the SPAD element 51, whether by the dark current generation rate (dark count rate, DCR) or by light irradiation, cause avalanche multiplication, and an avalanche current is generated. This phenomenon occurs randomly even in a light-shielded state (i.e., a state in which no light is incident); this is the incidence of the dark current DCR.

When the cathode voltage V_CA decreases so that the voltage between the terminals of the SPAD element 51 equals the breakdown voltage V_BD of the PN diode, the avalanche current stops. The electrons generated and accumulated by the avalanche multiplication are then discharged through the resistance element R (or the P-type MOS transistor Q_L), the cathode voltage V_CA rises back to the supply voltage V_DD, and the element returns to its original state.

When light enters the SPAD element 51 and generates at least one electron-hole pair, an avalanche current is generated using the electron-hole pair as a seed. Thus, even the incidence of a single photon can be detected with a certain probability, the PDE (photon detection efficiency). In many cases, the PDE is on the order of a few percent to 20%.

The above operation is repeated. In this series of operations, the cathode voltage V_CA has its waveform shaped by the CMOS inverter 54, and the SPAD output (pixel output) is a pulse signal with a pulse width T whose starting point corresponds to the arrival time of one photon.

[ arrangement of light-receiving sections of light-receiving devices ]

A configuration example of the light receiving section of the light receiving device 30 in which the pixels 50 configured as described above are two-dimensionally arranged in a matrix form will be described with reference to fig. 5. Fig. 5 shows a light receiving section 60 including a group of pixels 50 two-dimensionally arranged in n rows and m columns.

The light receiving section 60 has a plurality of signal lines 61 for each pixel row in the pixel arrangement of n rows and m columns. The pixels 50 are grouped in units equal to the number of signal lines 61, and within each unit each pixel is connected to its own signal line 61. Specifically, x pixels 50 are defined as one unit and are connected to the x signal lines 61 in order: the first pixel within the unit to the first of the x signal lines 61, the second pixel to the second, and so on. Note that a unit of x pixels 50 is an example of the "pixel group" described in the claims, and a unit of x signal lines 61 is an example of the "signal line group" described in the claims.

Therefore, in one pixel row, every x-th pixel 50 sends its signal through the same shared signal line 61 to the subsequent distance measurement control section 70 (see Fig. 6). However, the timing of each pixel 50 is controlled such that the pixels 50 sharing the same signal line 61 are not active at the same time, i.e., such that they use the same signal line 61 in a time-division manner.

With this configuration, even in the case where pulse signals are output from adjacent pixels substantially simultaneously, the pulse signals are output through different signal lines 61, so that interference in a plurality of pulse signals can be prevented. Note that it is desirable to make the number x of pixels 50 defined as a unit as large as possible, but an excessively large x requires a large space for arranging the signal lines 61 and is undesirable in terms of layout. The number x of pixels 50 defined as a unit may be in the range of 2 to 50, and may also desirably be in the range of 5 to 15.
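The pixel-to-line assignment described above can be sketched as a simple modular mapping. This is one plausible reading of the shared-line scheme, not the patented wiring; the function name is an assumption.

```python
def signal_line_index(column, x):
    """Map a pixel's column within a row to one of the x shared signal
    lines: the k-th pixel inside each unit of x pixels uses line k, so
    pixels x columns apart share a line while adjacent pixels do not."""
    return column % x

# With x = 4 lines per row, columns 0..7 map to lines 0,1,2,3,0,1,2,3:
lines = [signal_line_index(c, 4) for c in range(8)]
```

The mapping makes the anti-interference property visible: adjacent columns always land on different lines, and only pixels a full unit apart (which are time-division multiplexed) share one.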

[ basic configuration of distance measurement control section of light receiving device ]

Fig. 6 shows a basic configuration of the distance measurement control section of the light receiving device 30. The light receiving device 30 includes a light receiving section 60 corresponding to the optical sensor 32 in fig. 2A and a distance measurement control section 70 corresponding to the logic circuit 33 in fig. 2A. The distance measurement control section 70 processes the signal of the pixel 50 fed from the light receiving section 60 through the signal line 61.

The distance measurement control section 70 includes a multiplexer (MUX) 71, time measuring sections (TDC) 72, histogram creation sections (Hist) 73, and an output section 74. n time measuring sections 72_0 to 72_(n-1) and n histogram creation sections 73_0 to 73_(n-1) are provided, corresponding to the pixel rows 0 to n-1 of the light receiving section 60.

For each pixel row of the light receiving section 60, the multiplexer 71 sequentially selects the signals of the pixels 50 fed through the x signal lines 61 and feeds them to the time measuring sections 72_0 to 72_(n-1). The time measuring sections 72_0 to 72_(n-1) measure, for each pixel row in the light receiving section 60, the time from the timing at which a light emission instruction is issued to the laser light source 22 to the light reception timing at the light receiving elements of the pixels 50. Specifically, the time measuring sections 72_0 to 72_(n-1) measure, using the well-known TOF method, the time from when the laser light is emitted from the laser light source 22 toward the object as a measurement target until the laser light, after being reflected by the object, is received by the light receiving element of the pixel 50.

The distance measurement control section 70 performs measurement a plurality of times in one measurement sequence; for example, tens or hundreds of measurements are performed. Each histogram creation section 73_0 to 73_(n-1) then creates a histogram of the measurement values (times) repeatedly measured by the time measuring sections 72_0 to 72_(n-1); specifically, a histogram with time on the horizontal axis and measurement frequency on the vertical axis.

The output section 74 sequentially outputs, for each pixel row, the data relating to the histograms created by the histogram creation sections 73_0 to 73_(n-1) to the application processor 80 provided outside the light receiving device 30, as information on the time of flight (TOF) of the laser light from the light emission instruction timing to the light reception timing.

The application processor 80 corresponds to the control section 40 in Fig. 2A and extracts the maximum value of each histogram based on the histogram data output through the output section 74. The application processor 80 then calculates the distance corresponding to the extracted maximum value of the histogram as the distance to the subject.

As described above, a histogram of the measurement values (times) measured by each of the time measuring sections 72_0 to 72_(n-1) is created, and the maximum value of the histogram is extracted as the laser time of flight from the light emission instruction timing to the light reception timing. This makes it possible to accurately measure the time of flight without being affected by ambient light or the like.
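The accumulate-then-find-peak procedure above can be sketched as follows. A minimal illustration with assumed function names; measurements are taken to be integer TDC bin indices, and out-of-range values are simply discarded.

```python
def build_histogram(measurements, num_bins):
    """Accumulate repeated time measurements (in bin units) from many
    laser shots into a histogram; out-of-range values are discarded."""
    hist = [0] * num_bins
    for m in measurements:
        if 0 <= m < num_bins:
            hist[m] += 1
    return hist

def peak_bin(hist):
    """The bin with the maximum count estimates the true time of
    flight: the real echo repeats at the same bin every shot, while
    ambient-light counts scatter across many bins."""
    return max(range(len(hist)), key=hist.__getitem__)

# Signal repeatedly lands in bin 7; ambient noise is scattered:
h = build_histogram([7, 7, 7, 2, 9, 7, 4, 7], 16)
best = peak_bin(h)  # bin 7
```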

[ in-plane delay skew ]

As described above, in the light receiving device 30 in which the plurality of pixels 50 are two-dimensionally arranged, the pixels 50 and the distance measurement control section 70 are connected through the signal lines 61 provided for the respective pixel rows. Therefore, the length of the path from the pixel 50 to the time measuring sections 72_0 to 72_(n-1) varies. Where the path length from the pixel 50 to the time measuring sections 72_0 to 72_(n-1) varies in this way, the wiring delay in the signal lines 61 disadvantageously causes delay skew in the two-dimensional plane.

For example, in Fig. 6, when the pixel 50 in the 0th row and the 0th column is defined as pixel 0 and the pixel 50 in the (n-1)th row and the (m-1)th column is defined as pixel N, an in-plane delay skew occurs between the maximum value of the histogram of pixel 0 and the maximum value of the histogram of pixel N, as shown in Fig. 7. In the histogram of Fig. 7, the horizontal axis represents time, and the vertical axis represents measurement frequency.

When the in-plane delay skew is corrected using the application processor 80 provided at a stage subsequent to the light receiving device 30, the application processor 80 performs processing using the histogram data stored in a memory, so that the processing delay in the entire system occurs in units of frames. The resulting processing delay is significant and adversely affects applications requiring immediate response. Incidentally, a light receiving device 30 having a driving frequency of 60 fps has a processing delay of about 17 milliseconds.

An example of an application requiring immediate response is cooperative control intended for automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation by controlling the driving force generation device, the steering mechanism, the braking device, and the like on the basis of information about the vehicle surroundings acquired by the distance measuring device 1 (including the light receiving device 30).

< Light receiving device according to the embodiment >

In the present embodiment, correction of the in-plane delay skew is performed within the light receiving device 30 to realize high-speed correction processing for the in-plane delay skew. More specifically, the histograms created by the histogram creation sections 73_0 to 73_(n-1) are collectively shifted in the time axis direction to correct the in-plane delay skew. As described above, the light receiving device 30 according to the present embodiment can realize high-speed correction processing for the in-plane delay skew, and thus can be used for applications requiring immediate response (high-speed response), such as automatic driving operation and distance measurement of a measurement target corresponding to a moving object.

Specific examples of the present embodiment, in which the histograms are collectively shifted in the time axis direction within the light receiving device 30 to correct the in-plane delay skew, will be described below.

[ example 1]

Example 1 is an example in which the correction processing for the in-plane delay skew is performed when the data related to the histograms are read out from the histogram creation sections 73_0 to 73_(n-1). Fig. 8 shows a configuration of the light receiving device 30 according to example 1.

As shown in Fig. 8, the light receiving device 30 according to example 1 includes an in-plane delay correction section 75 at a stage subsequent to the histogram creation sections 73 (i.e., at a stage preceding the output section 74), and the in-plane delay correction section 75 performs the correction processing for the in-plane delay skew.

In the light receiving device 30 according to example 1, the processing speed at which the data related to the measurement values from the time measuring sections (TDCs) 72_0 to 72_(n-1) are written into the histogram creation sections 73_0 to 73_(n-1) is high, about several hundred MHz. In contrast, the processing speed at which the data related to the histograms are read out from the histogram creation sections 73_0 to 73_(n-1) is low, about several tens of MHz.

Fig. 9 shows an example of the configuration of the in-plane delay correction section 75 in the light receiving device 30 according to example 1. Here, a configuration is shown in which the in-plane delay correction section 75 is built into the output section 74. However, the present example is not limited to the built-in configuration.

The output section 74 includes a multiplexer (MUX) 741 and a control counter 742. The multiplexer 741 receives as inputs the data DATA on the histograms supplied from the histogram creation sections 73_0 to 73_(n-1) and, under the control of the control counter 742, sequentially selects and outputs the data to the subsequent application processor 80 as the data on the corresponding histogram.

The in-plane delay correction section 75 includes an address counter 751, a storage section 752, a multiplexer (MUX) 753, and an adder 754. The address counter 751 counts the address ADDR of the histograms created by the histogram creation sections 73_0 to 73_(n-1). The address ADDR is a BIN value, which is a unit of the histogram, and is supplied to the two-input adder 754 as one of its inputs.

The storage section 752 includes n correction registers reg_0 to reg_(n-1) (a correction register group) corresponding to the histogram creation sections 73_0 to 73_(n-1) (i.e., to the pixel rows in the light receiving section 60). The correction registers reg_0 to reg_(n-1) store correction values (correction amounts) corresponding to the positions of the pixels 50 in the light receiving section 60. Each correction value is a value for correcting the in-plane delay skew; specifically, it is a value based on the distance from the pixel 50 to the time measuring sections 72_0 to 72_(n-1).

The correction values (correction amounts) stored in the correction registers reg_0 to reg_(n-1) are specific to the light receiving device 30 and can therefore be acquired in advance, by a predetermined method, in pre-shipment verification, evaluation measurement, and the like of the light receiving device 30 as values for correcting the in-plane delay skew. However, the present example is not limited to acquisition by pre-shipment verification, evaluation measurement, or the like. For example, the correction values may also be acquired using a predetermined technique when the light receiving device 30 is activated and then stored in the correction registers reg_0 to reg_(n-1) of the storage section 752.

Under the control of the control counter 742, the multiplexer 753 sequentially selects, in synchronization with the multiplexer 741, the correction values stored in the correction registers reg_0 to reg_(n-1), and outputs the selected value as the correction value OFST for collectively shifting each histogram in the time axis direction. The correction value OFST is supplied as the other input of the two-input adder 754.

For each histogram, the adder 754 adds the correction value OFST, which is the other input, to the BIN value BIN, which is one of its inputs, thereby collectively shifting the histogram in the time axis direction. Each histogram is thus collectively shifted in the time axis direction, realizing the correction processing for the in-plane delay skew.

As is apparent from the above description, the in-plane delay correction section 75 is a correction processing section that performs correction processing on the histograms created by the histogram creation sections 73_0 to 73_(n-1) on the basis of the correction values stored in the storage section 752. Fig. 10 shows the data DATA related to each histogram, the address ADDR of the histogram, the correction value OFST, and the corrected BIN value BIN for each histogram.
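The read-side operation of the adder 754 can be sketched as follows. This is a minimal software model, not the hardware itself; the array sizes and register values are illustrative assumptions.

```python
# Minimal sketch of example 1's read-side correction (array sizes and
# register values are illustrative): as histogram data are read out, the
# per-row correction value OFST is added to the bin address BIN, so each
# histogram is shifted collectively along the time axis.
NUM_BINS = 8

def read_out_corrected(histograms, correction_regs):
    """histograms[row][bin] -> corrected histograms, row by row, mimicking
    MUX 741/753 selecting each row's data together with its register."""
    corrected = []
    for row, hist in enumerate(histograms):
        ofst = correction_regs[row]      # reg_0 .. reg_(n-1)
        out = [0] * NUM_BINS
        for bin_addr, count in enumerate(hist):
            shifted = bin_addr + ofst    # adder 754: BIN + OFST
            if 0 <= shifted < NUM_BINS:
                out[shifted] = count
        corrected.append(out)
    return corrected

# Row 0 sits near the TDCs and needs no shift; row 1's wiring delay pushes
# its peak one bin late, so its register holds -1 to pull the peak back.
raw = [
    [0, 0, 9, 1, 0, 0, 0, 0],  # peak already in bin 2
    [0, 0, 1, 9, 0, 0, 0, 0],  # peak delayed to bin 3
]
aligned = read_out_corrected(raw, correction_regs=[0, -1])
```

After correction, both rows peak in the same bin, which is exactly the alignment the in-plane delay correction section 75 is meant to achieve.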

Now, the flow of the correction process of the in-plane delay skew in the light receiving device 30 according to example 1 will be described using the flowchart in fig. 11A.

To correct the in-plane delay skew, first, correction values for correcting the in-plane delay skew are acquired in advance (step S11). For example, correction values specific to the light receiving device 30 may be acquired during evaluation measurement of the light receiving device 30, or during activation of the light receiving device 30, by using a predetermined technique as described above.

Then, the correction values acquired in advance are set in the correction registers reg_0 to reg_(n-1) of the storage section 752 (step S12). Each correction value set (stored) in the correction registers reg_0 to reg_(n-1) of the storage section 752 is then used as the correction value OFST for collectively shifting the histogram in the time axis direction, and the correction value OFST is added to the BIN value BIN of the corresponding histogram to perform the correction processing for the in-plane delay skew (step S13). This addition processing realizes the correction of the in-plane delay skew so as to align the histograms.

Through the correction processing for the in-plane delay skew described above, when the data related to the histograms are read out from the histogram creation sections 73_0 to 73_(n-1), the correction processing for the in-plane delay skew can be performed at high speed for each histogram by the simple addition processing of adding the correction value OFST to the BIN value BIN of the histogram. This processing takes about one cycle of the operation clock, i.e., a processing delay of about several tens of nanoseconds.

Therefore, the present example can significantly reduce the processing delay compared with the case where the subsequent application processor 80 performs the correction processing. Incidentally, in the case where the subsequent application processor 80 performs the correction processing, the data related to the histograms are accumulated in a memory and then processed. The processing delay in the entire system therefore occurs in units of frames; in a light receiving device having a driving frequency of 60 fps, the processing delay is about 17 milliseconds.
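The two delay figures quoted above follow from simple arithmetic; the in-device delay of "several tens of nanoseconds" is taken here as an assumed 50 ns for the comparison.

```python
# Rough arithmetic behind the figures quoted above (the 50 ns in-device
# delay is an assumption standing in for "several tens of nanoseconds"):
frame_rate_hz = 60
frame_period_ms = 1000 / frame_rate_hz   # frame-unit delay at 60 fps
in_device_delay_ns = 50                  # ~one cycle of the operation clock
speedup = (frame_period_ms * 1e6) / in_device_delay_ns
# frame_period_ms is about 16.7 ms, i.e., the "about 17 milliseconds" above.
```

Under these assumptions the in-device correction is several orders of magnitude faster than frame-unit processing in the application processor.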

Fig. 11B shows the positional relationship in the time axis direction (BIN direction) between the data related to an uncorrected histogram and the data related to a corrected histogram. Here, a case is shown in which a histogram having three BINs is collectively shifted by one BIN in the BIN direction (time axis direction). As is apparent from Fig. 11B, the correction using the correction value OFST is performed in units of bins. Note that the correction by the correction value OFST is performed here in units of bins, but the present example is not limited to units of bins; for example, half a bin may be used as the unit, or the resolution may be further improved.

[ example 2]

Example 2 is a modification of example 1, and corresponds to a case where the delay from the pixel 50 to the time measuring sections (TDCs) 72_0 to 72_(n-1) shows a linear trend within the plane of the light receiving section 60.

Here, in the pixel arrangement of n rows and m columns in the light receiving section 60 shown in Fig. 12A, the delay amount between the pixels 50 in the (m-1)th column, which are closest to the time measuring sections 72_0 to 72_(n-1), and the pixels 50 in the 0th column, which are farthest from the time measuring sections 72_0 to 72_(n-1), varies linearly as shown in Fig. 12B.

In the case where the delay from the pixel 50 to the time measuring sections 72_0 to 72_(n-1) exhibits such a linear in-plane trend, in example 2, the correction values of the other pixels 50 (i.e., the pixels 50 between the 0th column and the (m-1)th column) are calculated by linear interpolation from the delay correction value of the pixels 50 at the end of the light receiving section 60 (i.e., the pixels 50 in the 0th column, which are farthest from the time measuring sections 72_0 to 72_(n-1)).

With reference to the flowchart of Fig. 11A showing the flow of the correction processing for the in-plane delay skew, example 2, in which the correction values are obtained by linear interpolation, can shorten the time required to acquire the correction values in step S11 compared with example 1.
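The linear interpolation of example 2 can be sketched as follows. The column count and the endpoint correction value are assumptions; only the value for the farthest column (column 0) is taken as measured, with the value taken as zero at column m-1 nearest the TDCs.

```python
# Sketch of example 2's linear interpolation (column count and the endpoint
# correction value are assumptions): only the value for the farthest column
# (column 0) is measured; intermediate columns are interpolated linearly
# down to zero at column m-1, the column nearest the TDCs.
def interpolated_corrections(m, corr_col0, corr_last=0.0):
    """Return one correction value per column, linear in the column index."""
    step = (corr_last - corr_col0) / (m - 1)
    return [corr_col0 + step * col for col in range(m)]

corrs = interpolated_corrections(m=5, corr_col0=4.0)
# corrs == [4.0, 3.0, 2.0, 1.0, 0.0]
```

Only a single endpoint measurement is needed, which is why step S11 becomes shorter than in example 1, where every position's correction value is measured individually.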

[ example 3]

Example 3 is a modification of example 1; in example 3, correction processing is also performed for a delay common to all histograms. Here, examples of the "delay common to all histograms" include a processing delay in a circuit and a delay outside the light receiving device 30, specifically, a delay of the wiring through which a trigger signal for causing the laser light source 22 of the light source 20 shown in Fig. 2A to emit light is transmitted.

In example 1, different delay corrections are performed for the respective histograms. However, in addition to the in-plane delays, there are also the above delays that are common to all histograms. The presence of the delay common to all the histograms causes an error between the distance measured at the light receiving device 30 and the actual distance, which corresponds to the delay common to all the histograms.

Therefore, in example 3, different delay corrections are performed for the respective histograms, and correction processing is also performed for the delay common to all histograms by using a system correction value that is common to all the histograms and corresponds to that delay. For example, the system correction value may be calculated in advance by dividing the difference (error) between the distance measured by the light receiving device 30 and the actual distance by the speed of light.

Fig. 13 shows the positional relationship in the time axis direction (BIN direction) between the data related to the uncorrected histograms and the data related to the corrected histograms. Here, a histogram Hist_0 and a histogram Hist_(n-1) are shown; the histogram Hist_0 is created by the histogram creation section 73_0 corresponding to the pixel row 0, and the histogram Hist_(n-1) is created by the histogram creation section 73_(n-1) corresponding to the pixel row n-1.

In Fig. 13, solid arrows indicate the skew correction values used when each histogram is corrected individually, and broken arrows indicate the system correction value used when all histograms are corrected in common. In the present example, the delay correction is performed in units of bins. However, the present example is not limited to units of bins; for example, half a bin may be used as the unit, or the resolution may be further improved.

According to example 3, in addition to different delay corrections for the respective histograms, correction may be performed on a delay common to all the histograms. Therefore, even if there is a delay common to all histograms, the actual distance can be accurately measured.
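The derivation of the system correction value in example 3 can be sketched as follows; the reference distances used are hypothetical numbers chosen for illustration.

```python
# Sketch of example 3 (reference distances are hypothetical): besides the
# per-histogram skew correction, one system correction value common to all
# histograms is derived from the error between a distance measured by the
# device and the known actual distance, divided by the speed of light.
C_M_PER_S = 299_792_458.0

def system_correction_s(measured_m, actual_m):
    """Common time offset corresponding to the distance error."""
    return (measured_m - actual_m) / C_M_PER_S

def corrected_time_s(raw_time_s, skew_corr_s, sys_corr_s):
    """Apply the row-specific skew correction plus the common system value."""
    return raw_time_s + skew_corr_s - sys_corr_s

# A 0.3 m overshoot corresponds to roughly 1 ns of delay common to all rows.
sys_corr = system_correction_s(measured_m=10.3, actual_m=10.0)
```

Because the system value is shared by all rows, it can be computed once from a single reference measurement and then applied alongside the per-row skew corrections.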

[ example 4]

Example 4 is an example in which the correction processing for the in-plane delay skew is performed when the data related to the measurement values are written into the histogram creation sections 73_0 to 73_(n-1). Fig. 14 shows a configuration of the light receiving device 30 according to example 4.

As shown in Fig. 14, the light receiving device 30 according to example 4 has the in-plane delay correction section 75 at a stage preceding the histogram creation sections 73_0 to 73_(n-1), and the in-plane delay correction section 75 performs the correction processing for the in-plane delay skew. The in-plane delay correction section 75 includes a storage section 752 that stores correction values corresponding to the positions of the pixels 50 within the light receiving section 60, and n adders 754_0 to 754_(n-1) provided at the respective stages preceding the histogram creation sections 73_0 to 73_(n-1).

The storage section 752 includes n correction registers reg_0 to reg_(n-1) (a correction register group) corresponding to the n time measuring sections 72_0 to 72_(n-1). As in example 1, correction values for correcting the in-plane delay skew, specifically, correction values based on the distance from the pixel 50 to the time measuring sections 72_0 to 72_(n-1), are set in the correction registers reg_0 to reg_(n-1).

Each of the n adders 754_0 to 754_(n-1) has, as one of its inputs, the measurement value from the corresponding one of the time measuring sections 72_0 to 72_(n-1), and has, as the other input, the correction value set in the corresponding one of the correction registers reg_0 to reg_(n-1). For each histogram, the adders 754_0 to 754_(n-1) add the correction value from the corresponding one of the correction registers reg_0 to reg_(n-1) to the measurement value from the corresponding one of the time measuring sections 72_0 to 72_(n-1), thereby correcting the in-plane delay skew.

As described above, like example 1 (in which the correction processing is performed when the data related to the histograms are read out from the histogram creation sections 73_0 to 73_(n-1)), example 4, in which the correction processing is performed when the data related to the histograms are written into the histogram creation sections 73_0 to 73_(n-1), can perform the correction processing for the in-plane delay skew. Further, as in example 1, the present example can significantly reduce the processing delay compared with the case where the correction processing is performed by the subsequent application processor 80.

Note that the technique in example 2 and the technique in example 3 can also be applied to example 4; the technique in example 2 involves calculating, by linear interpolation from the delay correction value for the pixels 50 in the 0th column, which are farthest from the time measuring sections 72_0 to 72_(n-1), the correction values for the pixels 50 between the 0th column and the (m-1)th column, and the technique in example 3 involves correcting the delay common to all histograms.
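The write-side variant of example 4 can be sketched as follows; the array sizes and register value are illustrative assumptions. The key difference from example 1 is that the offset is applied to each raw TDC measurement before binning.

```python
# Sketch of example 4's write-side correction (sizes and values are
# illustrative): the adder 754_i offsets each raw TDC measurement BEFORE
# it is binned, so the histogram is built already free of in-plane skew.
NUM_BINS = 8

def build_corrected_histogram(tdc_values, correction):
    """Add the row's correction to every raw measurement, then histogram."""
    hist = [0] * NUM_BINS
    for v in tdc_values:
        b = v + correction   # adder 754_i at the TDC output
        if 0 <= b < NUM_BINS:
            hist[b] += 1
    return hist

# A row far from the TDCs measures one bin late; its register holds -1.
hist = build_corrected_histogram([4, 4, 4, 3, 5], correction=-1)
# hist == [0, 0, 1, 3, 1, 0, 0, 0] -> peak at bin 3 after correction
```

Because the stored histogram is already corrected, no further shifting is needed at read-out, at the cost of one adder per pixel row rather than a single shared adder.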

< application example of the technique according to the present disclosure >

The techniques according to the present disclosure may be applied to a variety of products. A more specific application example will be described below. For example, the technology according to the present disclosure may be implemented as a distance measuring device installed in any of various types of moving bodies such as automobiles, electric automobiles, hybrid automobiles, motorcycles, bicycles, personal mobile devices, airplanes, unmanned airplanes, ships, robots, construction machines, and agricultural machines (tractors).

[ moving body ]

Fig. 15 is a block diagram showing an example of a schematic configuration of a vehicle control system 7000 that is an example of a mobile body control system to which the technique according to an embodiment of the present disclosure is applicable. The vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010. In the example shown in Fig. 15, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. For example, the communication network 7010 connecting the plurality of control units to each other may be an in-vehicle communication network conforming to an arbitrary standard, such as a controller area network (CAN), a local interconnect network (LIN), a local area network (LAN), or FlexRay (registered trademark).

Each control unit includes: a microcomputer that performs arithmetic processing according to various programs; a storage section that stores the programs executed by the microcomputer, parameters used for various operations, and the like; and a drive circuit that drives various control target devices. Each control unit further includes: a network interface (I/F) for communicating with other control units via the communication network 7010; and a communication I/F for communicating with devices inside and outside the vehicle, sensors, and the like by wired or wireless communication. The functional configuration of the integrated control unit 7600 shown in Fig. 15 includes a microcomputer 7610, a general communication I/F 7620, a dedicated communication I/F 7630, a positioning section 7640, a beacon receiving section 7650, an in-vehicle device I/F 7660, a sound/image output section 7670, an in-vehicle network I/F 7680, and a storage section 7690. Similarly, the other control units each include a microcomputer, a communication I/F, a storage section, and the like.

The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device of a drive force generation device for generating a drive force of a vehicle, such as an internal combustion engine, a drive motor, or the like, a drive force transmission mechanism for transmitting the drive force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a brake device for generating a brake force of the vehicle, and the like. The drive system control unit 7100 may have a function as a control device of an Antilock Brake System (ABS), an Electronic Stability Control (ESC), or the like.

Drive system control unit 7100 is connected to vehicle state detection unit 7110. The vehicle state detection unit 7110 includes, for example, at least one of: a gyro sensor that detects an angular velocity of axial rotation of a vehicle body, an acceleration sensor that detects an acceleration of a vehicle, and a sensor for detecting an operation amount of an accelerator pedal, an operation amount of a brake pedal, a steering angle of a steering wheel, an engine rotational speed, a rotational speed of a wheel, or the like. The drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection section 7110, and controls an internal combustion engine, a drive motor, an electric power steering apparatus, a brake apparatus, and the like.

The main body system control unit 7200 controls the operations of various devices provided on the vehicle body according to various programs. For example, the main body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as a headlamp, a backup lamp, a brake lamp, a turn lamp, a fog lamp, or the like. In this case, radio waves transmitted from the mobile device in place of signals of a key or various switches may be input to the main body system control unit 7200. The main body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.

The battery control unit 7300 controls a secondary battery (which serves as a power source for driving a motor) 7310 according to various programs. For example, information on the battery temperature, the battery output voltage, the amount of charge remaining in the battery, and the like is provided from a battery device including the secondary battery 7310 to the battery control unit 7300. Battery control unit 7300 performs arithmetic processing using these signals, and performs control for adjusting the temperature of secondary battery 7310 or control of a cooling device provided to a battery device or the like.

The off-vehicle information detection unit 7400 detects information on the outside of the vehicle including the vehicle control system 7000. For example, the vehicle exterior information detecting means 7400 is connected to at least one of the imaging unit 7410 and the vehicle exterior information detecting unit 7420. The imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The vehicle exterior information detection unit 7420 includes, for example, at least one of the following: an environment sensor for detecting a current atmospheric condition or weather condition, and a peripheral information detection sensor for detecting another vehicle, an obstacle, a pedestrian, or the like around the vehicle including the vehicle control system 7000.

The environmental sensor may be at least one of a raindrop sensor that detects rain, a fog sensor that detects fog, a sunlight sensor that detects a degree of sunshine, and a snow sensor that detects snowfall, for example. The peripheral information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR device (light detection and ranging device, or a laser imaging detection and ranging device). Each of the imaging section 7410 and the off-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.

Fig. 16 shows an example of the mounting positions of the imaging section 7410 and the vehicle exterior information detecting section 7420. The imaging portions 7910, 7912, 7914, 7916, and 7918 are arranged, for example, at at least one of positions on the front nose, the side mirrors, the rear bumper, and the rear door of the vehicle 7900 and a position at an upper portion of the windshield inside the vehicle. The imaging portion 7910 provided at the front nose and the imaging portion 7918 provided at the upper portion of the windshield inside the vehicle mainly obtain images of the front of the vehicle 7900. The imaging portions 7912 and 7914 provided on the side mirrors mainly obtain images of the sides of the vehicle 7900. The imaging portion 7916 provided on the rear bumper or the rear door mainly obtains an image of the rear of the vehicle 7900. The imaging portion 7918 provided at the upper portion of the windshield inside the vehicle is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, and the like.

Incidentally, Fig. 16 shows an example of the imaging ranges of the respective imaging portions 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the imaging portion 7910 provided at the front nose. The imaging ranges b and c indicate the imaging ranges of the imaging portions 7912 and 7914 provided on the side mirrors, respectively. The imaging range d indicates the imaging range of the imaging portion 7916 provided on the rear bumper or the rear door. For example, by superimposing the image data imaged by the imaging portions 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 viewed from above can be obtained.

The vehicle exterior information detecting portions 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, side, and corner of the vehicle 7900 and the upper portion of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The outside-vehicle information detecting portions 7920, 7926 and 7930 provided at the front nose and rear bumper of the vehicle 7900, the rear door of the vehicle 7900, and the upper portion of the windshield inside the vehicle may be LIDAR devices, for example. These vehicle exterior information detecting portions 7920 to 7930 are mainly used for detecting a vehicle, a pedestrian, an obstacle, and the like in front.

Returning to fig. 15, the description will be continued. The vehicle exterior information detecting unit 7400 images an image of the outside of the vehicle by the imaging section 7410 and receives the imaged image data. Vehicle exterior information detecting section 7400 receives detection information from vehicle exterior information detecting unit 7420 connected to vehicle exterior information detecting section 7400. When vehicle exterior information detecting unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, vehicle exterior information detecting section 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information of the received reflected waves. Based on the received information, the off-vehicle information detecting unit 7400 may perform a process of detecting an object such as a person, a vehicle, an obstacle, a sign, a character on a road, or the like, or a process of detecting a distance thereof. The vehicle-exterior information detecting unit 7400 may perform an environment recognition process of recognizing rainfall, fog, road surface conditions, etc., based on the received information. The vehicle exterior information detecting unit 7400 may calculate the distance to an object outside the vehicle based on the received information.

In addition, based on the received image data, the vehicle exterior information detecting unit 7400 may perform an image recognition process of recognizing a person, a vehicle, an obstacle, a sign, a character on a road, or the like, or a process of detecting a distance thereof. The vehicle exterior information detecting unit 7400 may perform processing such as distortion correction, alignment, and the like on the received image data, and combine the image data imaged by the plurality of different imaging sections 7410 to generate a bird's eye view image or a panoramic image. The vehicle exterior information detecting unit 7400 may perform viewpoint conversion processing using image data imaged by the imaging section 7410 including different imaging means.

The in-vehicle information detection unit 7500 detects information about the inside of the vehicle. The in-vehicle information detection unit 7500 is connected to a driver state detection unit 7510 that detects the state of the driver, for example. The driver state detection portion 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound inside the vehicle, and the like. The biosensor is, for example, arranged in a seat surface, a steering wheel, or the like, and detects biological information of an occupant seated in the seat or a driver holding the steering wheel. The in-vehicle information detection unit 7500 can calculate the fatigue degree of the driver or the concentration degree of the driver, or can determine whether the driver is dozing, based on the detection information input from the driver state detection unit 7510. The in-vehicle information detection unit 7500 can perform processing such as noise cancellation processing or the like on an audio signal obtained by collection of sound.

The integrated control unit 7600 controls the overall operation within the vehicle control system 7000 according to various programs. The integrated control unit 7600 is connected to the input unit 7800. The input portion 7800 is realized by a device capable of input operation by the occupant, such as a touch panel, a button, a microphone, a switch, a joystick, or the like. The integrated control unit 7600 may be provided with data obtained by voice recognition of voice input through a microphone. The input 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an external connection device supporting the operation of the vehicle control system 7000, such as a mobile phone, a Personal Digital Assistant (PDA), or the like. The input 7800 may be, for example, a camera. In this case, the occupant can input information by a gesture. Alternatively, data obtained by detecting a motion of a wearable device worn by the occupant may be input. The input unit 7800 may include, for example, an input control circuit or the like, which generates an input signal from information input by an occupant or the like using the input unit 7800, and outputs the generated input signal to the integrated control unit 7600. The occupant or the like inputs various data or instructs processing to the vehicle control system 7000 by operating the input unit 7800.

The storage portion 7690 may include: a Read Only Memory (ROM) that stores various programs executed by the microcomputer, and a Random Access Memory (RAM) that stores various parameters, operation results, sensor values, and the like. In addition, the storage portion 7690 can be realized by a magnetic storage device such as a Hard Disk Drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.

The general communication I/F 7620 is a widely used communication I/F that mediates communication with various devices present in the external environment 7750. The general communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM) (registered trademark), worldwide interoperability for microwave access (WiMAX), long term evolution (LTE), or LTE-advanced (LTE-A), or another wireless communication protocol such as wireless LAN (also referred to as wireless fidelity (Wi-Fi) (registered trademark)) or Bluetooth (registered trademark). The general communication I/F 7620 may be connected to a device (e.g., an application server or a control server) present on an external network (e.g., the Internet, a cloud network, or a company-specific network), for example, via a base station or an access point. In addition, the general communication I/F 7620 may be connected to a terminal present in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a shop, or a machine type communication (MTC) terminal) using, for example, peer-to-peer (P2P) technology.

The dedicated communication I/F7630 is a communication I/F supporting a communication protocol developed for vehicle use. The dedicated communication I/F7630 may implement a standard protocol such as wireless access in a vehicular environment (WAVE), which is a combination of Institute of Electrical and Electronics Engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, Dedicated Short Range Communication (DSRC), or a cellular communication protocol. The dedicated communication I/F7630 typically performs V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.

The positioning portion 7640 performs positioning by, for example, receiving a Global Navigation Satellite System (GNSS) signal from a GNSS satellite (for example, a GPS signal from a Global Positioning System (GPS) satellite), and generates position information including the latitude, longitude, and altitude of the vehicle. Incidentally, the positioning portion 7640 may recognize the current position by exchanging signals with a wireless access point, or may obtain position information from a terminal such as a mobile phone, a Personal Handyphone System (PHS), or a smart phone having a positioning function.

The beacon receiving section 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station installed on a road or the like, and thereby obtains information on the current position, congestion, closed roads, necessary time, and the like. Incidentally, the function of the beacon receiving section 7650 may be included in the above-described dedicated communication I/F7630.

The in-vehicle device I/F7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle. The in-vehicle device I/F7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, bluetooth (registered trademark), Near Field Communication (NFC), or Wireless Universal Serial Bus (WUSB). In addition, the in-vehicle device I/F7660 may establish a wired connection through a Universal Serial Bus (USB), a high-definition multimedia interface (HDMI) (registered trademark), a mobile high-definition link (MHL), or the like via a connection terminal not shown in the drawing (a cable may also be used if necessary). The in-vehicle device 7760 may include, for example, at least one of a mobile device and a wearable device owned by an occupant and an information device carried or attached in a vehicle. The in-vehicle device 7760 may also include a navigation device that searches for a route to any destination. The in-vehicle device I/F7660 exchanges control signals or data signals with these in-vehicle devices 7760.

The in-vehicle network I/F7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F7680 transmits and receives signals and the like according to a predetermined protocol supported by the communication network 7010.

The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information obtained via at least one of the general communication I/F7620, the dedicated communication I/F7630, the positioning portion 7640, the beacon receiving portion 7650, the in-vehicle device I/F7660, and the in-vehicle network I/F7680. For example, the microcomputer 7610 can calculate control target values of the driving force generation apparatus, the steering mechanism, or the brake apparatus based on the obtained information on the interior and exterior of the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may execute cooperative control intended to realize Advanced Driver Assistance System (ADAS) functions including collision avoidance or impact mitigation for the vehicle, following driving based on inter-vehicle distance, vehicle-speed-maintaining driving, warning of a vehicle collision, warning of a vehicle lane departure, and the like. In addition, the microcomputer 7610 can execute cooperative control intended for automatic driving, which causes the vehicle to travel autonomously without depending on the operation of the driver, by controlling the driving force generation device, the steering mechanism, the brake device, and the like, based on the obtained information about the environment around the vehicle.

Based on information obtained via at least one of the general communication I/F7620, the dedicated communication I/F7630, the positioning portion 7640, the beacon receiving portion 7650, the in-vehicle device I/F7660, and the in-vehicle network I/F7680, the microcomputer 7610 can generate three-dimensional distance information between the vehicle and an object such as a surrounding structure or a person, and generate local map information including information on the surrounding environment of the current position of the vehicle. In addition, the microcomputer 7610 can predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road, based on the obtained information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or illuminating a warning lamp.

The sound/image output portion 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or aurally notifying information to an occupant of the vehicle or the outside of the vehicle. In the example of fig. 15, an audio speaker 7710, a display portion 7720, and an instrument panel 7730 are shown as output devices. The display portion 7720 may include, for example, at least one of an in-vehicle display and a head-up display. The display portion 7720 may have an Augmented Reality (AR) display function. The output device may be a device other than these, such as headphones, a wearable device (e.g., a glasses-type display worn by an occupant), a projector, a lamp, or the like. In the case where the output device is a display device, the display device visually displays results obtained by various processes performed by the microcomputer 7610, or information received from another control unit, in various forms such as text, images, tables, and charts. In the case where the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, sound data, or the like into an analog signal and outputs the analog signal aurally.

Incidentally, in the example shown in fig. 15, at least two control units connected to each other via the communication network 7010 may be integrated into one control unit. Alternatively, each individual control unit may comprise a plurality of control units. Furthermore, the vehicle control system 7000 may comprise a further control unit, which is not shown in the figure. In addition, part or all of the functions performed by one control unit in the above description may be allocated to another control unit. That is, predetermined arithmetic processing may be performed by any control unit as long as information is transmitted and received via the communication network 7010. Similarly, a sensor or a device connected to one control unit may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.

An example of the vehicle control system to which the technique according to the present disclosure can be applied has been described above. The technique according to the present disclosure can be applied to, for example, the imaging portions 7910, 7912, 7914, 7916, and 7918, the vehicle exterior information detecting portions 7920, 7922, 7924, 7926, 7928, and 7930, the driver state detecting portion 7510, and the like in the above-described configuration. Applying the technique according to the present disclosure enables excellent correction processing to be performed for in-plane delay skew in the light receiving device, thereby allowing a vehicle control system with high-speed response to be constructed. More specifically, applying the technique according to the present disclosure allows variation in distance measurement results depending on the positions of pixels within the same plane to be suppressed, thereby enabling accurate distance measurement. As a result, a distance measurement error in the detection of an oncoming vehicle or a pedestrian is reduced, so that safe vehicle travel can be achieved.
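The distance measurement flow described above can be illustrated by a minimal sketch: the time-of-flight is taken from the peak bin of the histogram, a per-pixel in-plane delay correction (in bins) is applied, and the result is converted to a distance. All names, bin widths, and correction values here are illustrative assumptions, not values from the actual device.

```python
# Sketch of distance calculation with in-plane delay correction.
# BIN_WIDTH_S and the correction value are hypothetical examples.

C = 299_792_458.0        # speed of light (m/s)
BIN_WIDTH_S = 1e-9       # assumed TDC bin width: 1 ns per bin

def distance_from_histogram(histogram, correction_bins):
    """Return distance (m) from a ToF histogram after bin correction."""
    # Find the bin holding the reflected-light peak.
    peak_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    # Apply the per-pixel in-plane delay correction.
    corrected_bin = peak_bin + correction_bins
    tof = corrected_bin * BIN_WIDTH_S
    # Light travels to the target and back, hence the factor of 2.
    return C * tof / 2.0

# A pixel far from the TDC sees extra wiring delay, so its correction
# shifts the peak back by a couple of bins (values illustrative):
hist = [0] * 100
hist[40] = 25            # reflected-light peak lands in bin 40
print(distance_from_histogram(hist, correction_bins=-2))  # ≈ 5.70 m
```

The key point is that without the correction, two pixels observing the same target would report different distances purely because of their wiring distance to the time measuring section.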

< configurations that the present disclosure can adopt >

The present disclosure may also adopt the following configurations.

< A. light receiving device >

[ A-1] A light receiving device comprising:

a light receiving section having a plurality of pixels arranged in a two-dimensional shape;

a signal line connected to each of the pixels;

a time measuring section that is connected to the signal line and measures a time from a light emission instruction timing to a light reception timing;

a histogram creation section that creates a histogram of the measurement values measured by the time measurement section;

a storage section that stores a correction value corresponding to a position of the pixel in the light receiving section;

a correction processing section that performs correction processing on the histogram created by the histogram creation section based on the correction value stored in the storage section; and

an output section that outputs the signal subjected to the correction processing by the correction processing section.

[ A-2] the light receiving device according to the above [ A-1], wherein the correction value is a value based on a distance from the pixel to the time measurement section.

[ A-3] the light receiving device according to the above [ A-2], wherein the correction values for the other pixels are calculated by linear interpolation based on the correction values for the pixels at the ends in the light receiving section.
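The linear interpolation of [ A-3] can be sketched as follows: only the correction values for the end pixels of a row are stored, and the values for intermediate pixels are computed from them. The function name and the example values are illustrative assumptions.

```python
# Sketch of deriving per-pixel correction values by linear interpolation
# from the two end-pixel corrections, as described in [A-3].

def interpolate_corrections(corr_first, corr_last, num_pixels):
    """Linearly interpolate correction values across a pixel row."""
    if num_pixels == 1:
        return [corr_first]
    step = (corr_last - corr_first) / (num_pixels - 1)
    return [corr_first + step * i for i in range(num_pixels)]

# e.g. a row of 5 pixels whose near end needs no correction and whose
# far end needs a -4-bin correction for extra signal-line delay:
print(interpolate_corrections(0.0, -4.0, 5))  # [0.0, -1.0, -2.0, -3.0, -4.0]
```

Storing only the end values and interpolating keeps the correction-register footprint small while still giving every pixel position its own correction.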

[ A-4] the light receiving device according to any one of [ A-1] to [ A-3] above, wherein a plurality of the histogram creation sections are provided corresponding to pixel rows in the light receiving section, and

the correction processing section performs correction processing on each histogram created by each of the plurality of histogram creation sections.

[ A-5] the light receiving device according to the above [ A-4], wherein the correction processing section performs the correction processing in units of bins in the histogram.

[ A-6] the light-receiving device according to [ A-4] or [ A-5] above, wherein the correction processing section performs correction processing using a system correction value common to all the histograms created by each of the plurality of histogram creation sections.

[ A-7] the light receiving device according to [ A-6] above, wherein the system correction value is a value corresponding to a delay common to all the histograms created by each of the plurality of histogram creation sections.
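The system correction value of [ A-6] and [ A-7] can be sketched as a single offset, common to every histogram, combined with each pixel's own in-plane correction. All values here are illustrative assumptions.

```python
# Sketch of combining a system correction value common to all histograms
# (e.g. a fixed delay shared by every pixel) with per-pixel corrections.

SYSTEM_CORRECTION_BINS = -3          # hypothetical common delay, in bins

def corrected_peak_bin(raw_peak_bin, pixel_correction_bins):
    """Apply both the common system correction and the per-pixel correction."""
    return raw_peak_bin + SYSTEM_CORRECTION_BINS + pixel_correction_bins

# Three histograms whose raw peaks differ only because of in-plane wiring
# delay all collapse to the same corrected bin:
raw_peaks = [43, 44, 45]
pixel_corrections = [0, -1, -2]
print([corrected_peak_bin(p, c) for p, c in zip(raw_peaks, pixel_corrections)])  # [40, 40, 40]
```

Separating the common delay from the per-pixel delay means the system correction value can be updated once for all histograms instead of rewriting every per-pixel register.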

[ A-8] the light-receiving device according to any one of [ A-1] to [ A-7] above, wherein the storage section includes a set of correction registers, wherein the correction value is set for each histogram.

[ A-9] the light-receiving device according to [ A-8] above, wherein the correction processing section is provided at a subsequent stage of the histogram creation section, and the correction processing section performs correction processing by adding the correction value to the bin value of the histogram created by each of the histogram creation sections.

[ A-10] the light receiving device according to the above [ A-8], wherein the correction processing section is provided in a preceding stage of the histogram creation section, and the correction processing section performs correction processing by adding the correction value to each of the measurement values measured by the time measurement section.
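The two placements in [ A-9] and [ A-10] can be contrasted with a small sketch: the correction can be applied after histogram creation (shifting bin values) or before it (offsetting each TDC measurement), and both produce the same corrected histogram. The function names and values are illustrative assumptions.

```python
# Sketch contrasting post-stage correction ([A-9]: shift the finished
# histogram) with pre-stage correction ([A-10]: offset each measurement
# before histogramming). Both yield the same corrected histogram.

NUM_BINS = 16

def build_histogram(measurements, offset=0):
    """Histogram TDC measurements, optionally correcting each one ([A-10])."""
    hist = [0] * NUM_BINS
    for m in measurements:
        b = m + offset
        if 0 <= b < NUM_BINS:
            hist[b] += 1
    return hist

def shift_histogram(hist, offset):
    """Correct a finished histogram by shifting its bin values ([A-9])."""
    out = [0] * len(hist)
    for b, count in enumerate(hist):
        if 0 <= b + offset < len(hist):
            out[b + offset] += count
    return out

tdc_values = [7, 7, 8, 7, 9]          # hypothetical TDC measurements
pre  = build_histogram(tdc_values, offset=-2)            # pre-stage correction
post = shift_histogram(build_histogram(tdc_values), -2)  # post-stage correction
print(pre == post)  # True
```

Which placement is preferable is a hardware trade-off: correcting before histogram creation needs one adder per measurement path, while correcting afterward touches every bin of the finished histogram.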

[ A-11] the light-receiving device according to any one of [ A-1] to [ A-10] above, wherein the light-receiving element in each of the pixels includes an element that generates a signal in response to reception of photons.

[ A-12] the light-receiving device according to any one of [ A-1] to [ A-11] above, wherein the light-receiving section includes a pixel group having a plurality of the pixels as a unit,

the signal lines include a signal line group having a plurality of the signal lines as a unit, and

the plurality of pixels included in the pixel group are connected one-to-one to the plurality of signal lines included in the signal line group.

< B. distance measuring apparatus >

[ B-1] A distance measuring apparatus comprising:

a light source that irradiates the measurement target with light; and

a light receiving device that receives light reflected by the measurement target, the light receiving device including

a light receiving section having a plurality of pixels arranged in a two-dimensional shape,

a signal line connected to each of the pixels,

a time measuring section connected to the signal line and measuring a time from a light emission instruction timing to a light reception timing,

a histogram creation section that creates a histogram of the measurement values measured by the time measurement section,

a storage section that stores a correction value corresponding to a position of the pixel in the light receiving section,

a correction processing section that performs correction processing on the histogram created by the histogram creation section based on the correction value stored in the storage section, and

an output section that outputs the signal subjected to the correction processing by the correction processing section.

[ B-2] the distance measuring apparatus according to the above [ B-1], wherein each of the correction values includes a value based on a distance from a corresponding one of the pixels to the time measuring section.

[ B-3] the distance measuring apparatus according to the above [ B-2], wherein the correction values of the other pixels are calculated by linear interpolation based on the correction values of the pixels at the ends in the light receiving section.

[ B-4] the distance measuring apparatus according to any one of [ B-1] to [ B-3] above, wherein a plurality of histogram creation sections are provided corresponding to pixel rows in the light receiving section, and

the correction processing section performs correction processing on each histogram created by each of the plurality of histogram creation sections.

[ B-5] the distance measuring apparatus according to the above [ B-4], wherein the correction processing section performs the correction processing in units of bins in the histogram.

[ B-6] the distance measuring apparatus according to [ B-4] or [ B-5] above, wherein the correction processing section performs the correction processing using a system correction value common to all the histograms created by each of the plurality of histogram creation sections.

[ B-7] the distance measuring apparatus according to the above [ B-6], wherein the system correction value includes a value corresponding to a delay common to all the histograms created by each of the plurality of histogram creation sections.

[ B-8] the distance measuring apparatus according to any one of [ B-1] to [ B-7] above, wherein the storage section includes a set of correction registers in which a correction value is set for each histogram.

[ B-9] the distance measuring apparatus according to the above [ B-8], wherein the correction processing section is provided at a subsequent stage of the histogram creation section and performs the correction processing by adding the correction value to the bin value of the histogram generated by each histogram creation section.

[ B-10] the distance measuring apparatus according to the above [ B-8], wherein the correction processing section is provided in a preceding stage of the histogram creation section, and the correction processing is performed by adding the correction value to each of the measurement values measured by the time measurement section.

[ B-11] the distance measuring apparatus according to any one of [ B-1] to [ B-10] above, wherein the light receiving element in each pixel includes an element that generates a signal in response to reception of photons.

[ B-12] the distance measuring apparatus according to any one of [ B-1] to [ B-11] above, wherein the light receiving section includes a pixel group having a plurality of the pixels as a unit,

the signal lines include a signal line group having a plurality of the signal lines as a unit, and

the plurality of pixels included in the pixel group are connected one-to-one to the plurality of signal lines included in the signal line group.

[ list of reference symbols ]

1 ... distance measuring apparatus, 10 ... subject (measurement target), 20 ... light source, 21 ... laser driver, 22 ... laser light source, 23 ... diffusing lens, 30 ... light receiving device, 31 ... light receiving lens, 32 ... optical sensor, 33 ... circuit section, 40 ... control section, 50 ... pixel, 51 ... SPAD element, 60 ... light receiving section, 61 ... signal line, 70 ... distance measurement control section, 71 ... multiplexer (MUX), 72 (72_0 to 72_(n-1)) ... time measuring section (TDC), 73 (73_0 to 73_(n-1)) ... histogram creation section, 74 ... output section, 75 ... in-plane delay correction section, 80 ... application processor, 752 ... storage section, ... adder
