Imaging apparatus and imaging system

Document No.: 1804626 Publication date: 2021-11-05

Reading note: This technique, "Imaging apparatus and imaging system", was designed and created by 中川庆 and 高桥裕嗣 on 2020-03-06. Abstract: An imaging apparatus according to the present disclosure is provided with an event detection sensor that detects an event and a control unit that controls the event detection sensor. Further, the control unit switches the resolution of the event detection sensor according to the traveling state of the mobile body. Further, an imaging system according to the present disclosure includes: an event detection sensor that detects an event; a control unit that switches the resolution of the event detection sensor according to the traveling state of the mobile body; and an object recognition unit that performs event recognition based on an event signal output from the event detection sensor.

1. An imaging apparatus comprising:

an event detection sensor that detects an event; and

a control unit that controls the event detection sensor, wherein:

the control unit switches the resolution of the event detection sensor according to a traveling state of the mobile body.

2. The imaging apparatus of claim 1, wherein:

the event detection sensor includes an asynchronous imaging device that detects, as an event, a luminance change exceeding a predetermined threshold in a pixel that photoelectrically converts incident light.

3. The imaging apparatus of claim 2, wherein:

the imaging apparatus is mounted on and used in the moving body.

4. The imaging apparatus of claim 3, wherein:

the control unit sets the resolution of the event detection sensor to a first resolution mode in which the resolution is relatively low or a second resolution mode in which the resolution is relatively high, according to the traveling state of the mobile body.

5. The imaging apparatus of claim 4, wherein:

the control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed, and sets the second resolution mode when the speed of the mobile body is less than the certain speed.

6. The imaging apparatus of claim 4, wherein:

the control unit sets the second resolution mode when a relative speed with a preceding object is greater than or equal to a certain relative speed, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed.

7. The imaging apparatus of claim 4, wherein:

the control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the second resolution mode when the speed of the mobile body is less than the certain speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

8. The imaging apparatus of claim 4, wherein:

the control unit sets the second resolution mode when a relative speed with a preceding object is greater than or equal to a certain relative speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

9. The imaging apparatus of claim 4, wherein:

in the traveling state of the first resolution mode, when the speed of the mobile body is less than a predetermined threshold, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that congestion has occurred and switches from the first resolution mode to the second resolution mode.

10. The imaging apparatus of claim 4, wherein:

in the traveling state of the first resolution mode, when the speed of the mobile body is greater than or equal to a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that the mobile body is traveling on a highway and switches from the first resolution mode to the second resolution mode.

11. The imaging apparatus of claim 4, wherein:

the control unit sets the first resolution mode when the mobile body travels straight, and sets the second resolution mode when a route is changed.

12. The imaging apparatus of claim 11, wherein:

the control unit determines the route change of the mobile body when rotation of a steering wheel is greater than or equal to a certain angle and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold.

13. The imaging apparatus of claim 11, wherein:

the control unit determines that the moving body travels straight when the rotation of the steering wheel is within a certain angle and the number of events detected by the event detection sensor is less than a predetermined threshold value.

14. The imaging apparatus of claim 4, wherein:

the control unit switches the resolution of the event detection sensor for each area of a pixel array unit in the event detection sensor.

15. An imaging system, comprising:

an event detection sensor that detects an event;

a control unit that switches a resolution of the event detection sensor according to a traveling state of the mobile body; and

an object recognition unit that performs an event recognition process based on an event signal output from the event detection sensor.

16. The imaging system of claim 15, further comprising:

an image sensor that performs imaging at a predetermined frame rate.

17. The imaging system of claim 16, wherein:

the object recognition unit performs the event recognition processing based on image data output from the image sensor.

18. The imaging system of claim 17, wherein:

when determining that the recognition processing cannot be performed using only the event signal output from the event detection sensor, the control unit performs control so that the event recognition processing uses both the event signal output from the event detection sensor and the image data output from the image sensor.

19. The imaging system of claim 17, wherein:

based on a result of the event recognition processing, the control unit specifies a region that can be detected as a moving body in the angle of view of the event detection sensor, and when the region that can be detected as the moving body is greater than or equal to a predetermined threshold value, the control unit determines that congestion has occurred on the condition that the traveling speed of the moving body is less than a predetermined threshold value, and sets a second resolution mode for the specified region.

20. The imaging system of claim 17, wherein:

based on a result of the event recognition processing, the control unit specifies a region that can be detected as a moving body in the angle of view of the event detection sensor, and when the region that can be detected as the moving body is greater than or equal to a predetermined threshold value, the control unit determines that the moving body is traveling on a highway on the condition that the traveling speed of the moving body is greater than or equal to a predetermined threshold value, and sets a second resolution mode for the specified region.

Technical Field

The present disclosure relates to an imaging apparatus and an imaging system.

Background

As one of the event-driven imaging apparatuses, there is an asynchronous imaging apparatus called a Dynamic Vision Sensor (DVS). The asynchronous imaging device detects, as an event, a luminance change exceeding a predetermined threshold in a pixel that photoelectrically converts incident light. Accordingly, such asynchronous imaging devices may be referred to as event detection sensors. An event detection sensor is, for example, mounted on a vehicle and used as an event-based vision sensor for monitoring a traveling road surface (for example, see Patent Document 1).

CITATION LIST

Patent document

Patent Document 1: Japanese Patent Application Laid-Open No. 2013-79937

Disclosure of Invention

Problems to be solved by the invention

Incidentally, a vehicle encounters various traveling states during travel, such as a congested state and travel on a general road, as well as travel on a highway. Therefore, it is desirable that an event detection sensor mounted on and used in a moving body such as a vehicle be capable of detecting an event, such as another vehicle or a pedestrian, regardless of the traveling state of the moving body.

An object of the present disclosure is to provide an imaging apparatus capable of accurately detecting an event regardless of a traveling state of a moving body and an imaging system using the imaging apparatus.

Solution to the problem

The imaging apparatus of the present disclosure for achieving the above object includes: an event detection sensor that detects an event; and a control unit that controls the event detection sensor. The control unit switches the resolution of the event detection sensor according to a traveling state of a moving body.

Further, an imaging system of the present disclosure for achieving the above object includes: an event detection sensor that detects an event; a control unit that switches the resolution of the event detection sensor according to a traveling state of a moving body; and an object recognition unit that performs event recognition based on an event signal output from the event detection sensor.

Drawings

Fig. 1 is a block diagram showing an example of a system configuration of an imaging system according to a first embodiment of the present disclosure.

Fig. 2A is a block diagram showing an example of a configuration of a motion recognition unit in the imaging system according to the first embodiment, and Fig. 2B is a block diagram showing an example of a configuration of an object recognition unit in the imaging system according to the first embodiment.

Fig. 3 is a block diagram showing an example of the configuration of an event detection sensor in the imaging system according to the first embodiment.

Fig. 4 is a block diagram showing an example of the configuration of a pixel array unit in the event detection sensor.

Fig. 5 is a circuit diagram showing an example of a circuit configuration of a pixel in the event detection sensor.

Fig. 6 is a block diagram showing an example of a circuit configuration of an event detection unit in a pixel of an event detection sensor.

Fig. 7 is a circuit diagram showing an example of the configuration of the current-voltage conversion unit in the event detection unit.

Fig. 8 is a circuit diagram showing an example of the configuration of the subtraction unit and the quantization unit in the event detection unit.

Fig. 9 is an exploded perspective view showing the outline of the stacked-chip structure of the event detection sensor.

Fig. 10 is a circuit diagram showing an example of a specific configuration of the event detection sensor whose resolution is variable.

Fig. 11 is a diagram showing an operation mode of a connection control unit in the event detection sensor whose resolution is variable.

Figs. 12A, 12B, and 12C are diagrams showing the calculation tables TL1, TL2, and TL3 used for calculation in the calculation unit.

Fig. 13 is a diagram showing the flow of photocurrent in the case where the operation mode of the connection control unit is the high resolution mode.

Fig. 14 is a diagram showing the flow of photocurrent in the case where the operation mode of the connection control unit is the low resolution mode.

Fig. 15 is a diagram showing the flow of photocurrent in the case where the operation mode of the connection control unit is the current-averaging mode.

Fig. 16 is a diagram showing an example of connection of a plurality of pixels to be subjected to connection control by the connection control unit.

Fig. 17 is a flowchart showing a flow of resolution switching control according to example 1.

Fig. 18 is a flowchart showing a flow of resolution switching control according to example 2.

Fig. 19 is a flowchart showing a flow of resolution switching control according to example 3.

Fig. 20 is a flowchart showing a flow of resolution switching control according to example 4.

Fig. 21 is a flowchart showing a flow of resolution switching control according to example 5.

Fig. 22 is a flowchart showing a flow of resolution switching control according to example 6.

Fig. 23 is a flowchart showing a flow of resolution switching control according to example 7.

Fig. 24A is a flowchart showing an example of a specific process of determining a route change, and Fig. 24B is a flowchart showing an example of a specific process of determining straight traveling.

Fig. 25 is a flowchart showing a flow of resolution switching control according to example 8.

Fig. 26 is a block diagram showing an example of a system configuration of an imaging system according to a second embodiment of the present disclosure.

Fig. 27 is a block diagram showing an outline of a configuration of a CMOS image sensor as an example of an image sensor in the imaging system according to the second embodiment.

Fig. 28 is a circuit diagram showing an example of a circuit configuration of a pixel in an image sensor.

Fig. 29 is a plan view showing an outline of a horizontally mounted chip structure of the image sensor.

Fig. 30 is a plan view showing an outline of a stacked chip structure of the image sensor.

Fig. 31 is a flowchart showing a flow of resolution switching control according to example 9.

Fig. 32 is a flowchart showing a flow of resolution switching control according to example 10.

Fig. 33 is a flowchart showing a flow of resolution switching control according to example 11.

Fig. 34 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technique according to the present disclosure can be applied.

Fig. 35 is a diagram showing an example of the mounting position of the imaging unit in the vehicle control system.

Detailed Description

Hereinafter, modes for implementing the technique of the present disclosure (hereinafter, referred to as "embodiments") will be described in detail using the drawings. The techniques of this disclosure are not limited to the embodiments. In the following description, the same reference numerals will be used for the same elements or elements having the same functions, and redundant description will be omitted. Note that description will be made in the following order.

1. General description of the imaging apparatus and imaging system of the present disclosure

2. First embodiment of the present disclosure

2-1. Configuration example of the imaging system according to the first embodiment

2-2. Configuration example of the event detection sensor

2-2-1. Configuration example of the pixel array unit

2-2-2. Configuration example of the pixels

2-2-3. Configuration example of the event detection unit

2-2-3-1. Configuration example of the current-voltage conversion unit

2-2-3-2. Configuration example of the subtraction unit and the quantization unit

2-2-4. Configuration example of the chip structure

2-2-5. Configuration example of the variable-resolution event detection sensor

2-2-5-1. Case of the high resolution mode

2-2-5-2. Case of the low resolution mode

2-2-5-3. Case of the current averaging mode

2-2-5-4. Connection example of the pixels subjected to connection control by the connection control unit

2-3. Example 1 (example of switching the resolution based on the vehicle speed of the host vehicle)

2-4. Example 2 (example of switching the resolution based on the relative speed with another vehicle)

2-5. Example 3 (example of switching the resolution based on the vehicle speed of the host vehicle and the number of events)

2-6. Example 4 (example of switching the resolution based on the relative speed with another vehicle and the number of events)

2-7. Example 5 (example of switching the resolution when the occurrence of congestion is detected)

2-8. Example 6 (example of switching the resolution when highway traveling is detected)

2-9. Example 7 (example of switching the resolution when a route change is detected)

2-10. Example 8 (example of switching the resolution for each area of the pixel array unit)

3. Second embodiment of the present disclosure

3-1. Configuration example of the imaging system according to the second embodiment

3-2. Configuration example of the image sensor

3-2-1. Configuration example of the CMOS image sensor

3-2-2. Configuration example of the pixels

3-2-3. Configuration example of the chip structure

3-2-3-1. Horizontally mounted chip structure (so-called horizontally mounted structure)

3-2-3-2. Stacked chip structure (so-called stacked structure)

3-3. Example 9 (example of switching the resolution when the occurrence of congestion is detected)

3-4. Example 10 (example of switching the resolution when highway traveling is detected)

3-5. Example 11 (example mounted on a vehicle with an auto cruise function)

4. Modification examples

5. Application example of the technique according to the present disclosure

5-1. Application example to a mobile body

6. Configurations that the present disclosure may take

<General description of the imaging apparatus and imaging system of the present disclosure>

In the imaging apparatus and the imaging system of the present disclosure, the event detection sensor may include an asynchronous imaging device that detects, as an event, a luminance change exceeding a predetermined threshold in a pixel that photoelectrically converts incident light. The imaging apparatus of the present disclosure is preferably mounted on and used in a moving body.

In the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configurations, the control unit may be configured to set the resolution of the event detection sensor to a first resolution mode in which the resolution is relatively low or a second resolution mode in which the resolution is relatively high, in accordance with the traveling state of the moving body.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to set the first resolution mode when the speed of the moving body is greater than or equal to a certain speed, and set the second resolution mode when the speed of the moving body is less than the certain speed.
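As a purely illustrative sketch, not part of the disclosure, the speed-based selection between the two modes described above can be expressed as follows. The concrete threshold value, the mode labels, and the function name are assumptions for the example; the text itself only refers to a "certain speed".

```python
# Sketch of speed-based resolution mode selection (assumed threshold).

SPEED_THRESHOLD_KMH = 60.0  # assumed "certain speed"; not given in the disclosure


def select_resolution_mode(speed_kmh: float) -> str:
    """Low resolution (first mode) at or above the threshold speed,
    high resolution (second mode) below it."""
    if speed_kmh >= SPEED_THRESHOLD_KMH:
        return "first resolution mode"   # relatively low resolution
    return "second resolution mode"      # relatively high resolution
```

At high speed the scene changes quickly and a coarser resolution keeps the event rate manageable; at low speed a finer resolution aids recognition of nearby objects.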

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to set the second resolution mode when the relative speed with a preceding object is greater than or equal to a certain relative speed, and set the first resolution mode when the relative speed with the preceding object is less than the certain relative speed.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to set the first resolution mode when the speed of the moving body is greater than or equal to a certain speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and set the second resolution mode when the speed of the moving body is less than the certain speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configurations, the control unit may be configured to set the second resolution mode when the relative speed with the preceding object is greater than or equal to a certain relative speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and set the first resolution mode when the relative speed with the preceding object is less than the certain relative speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to determine that congestion has occurred and switch from the first resolution mode to the second resolution mode when the speed of the mobile body is less than a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to the predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio in the traveling state of the first resolution mode.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to determine that the mobile body is traveling on a highway and switch from the first resolution mode to the second resolution mode when, in the traveling state of the first resolution mode, the speed of the mobile body is greater than or equal to a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio.
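The congestion and highway determinations described in the two preceding paragraphs can be sketched together as a single classifier. All threshold values and names below are assumptions for illustration; the disclosure does not give concrete figures.

```python
# Sketch of travel-state classification while in the first (low) resolution mode.
# Either non-None result would trigger a switch to the second (high) resolution mode.


def classify_travel_state(speed_kmh, event_count, object_area_ratio,
                          congestion_speed_th=20.0, highway_speed_th=80.0,
                          event_th=1000, area_th=0.3):
    """Return 'congestion', 'highway', or None (keep the current mode)."""
    # Both determinations require a busy scene: many events and a large
    # object area within the angle of view.
    if event_count >= event_th and object_area_ratio >= area_th:
        if speed_kmh < congestion_speed_th:
            return "congestion"   # slow, busy scene -> congestion
        if speed_kmh >= highway_speed_th:
            return "highway"      # fast, busy scene -> highway travel
    return None
```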

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to set the first resolution mode when the moving body travels straight, and set the second resolution mode when the route is changed. At this time, the control unit may be configured to determine that the course of the moving body is changed when the rotation of the steering wheel is greater than or equal to a certain angle and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, or alternatively, determine that the moving body travels straight when the rotation of the steering wheel is within a certain angle and the number of events detected by the event detection sensor is less than a predetermined threshold.
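The route-change and straight-travel determinations above combine the steering-wheel angle with the event count; a minimal sketch follows. The angle and event thresholds are assumed values, not taken from the disclosure.

```python
# Sketch of route-change / straight-travel determination (assumed thresholds).


def is_route_change(steering_angle_deg, event_count,
                    angle_th=15.0, event_th=500):
    """Route change: large steering rotation and many detected events."""
    return abs(steering_angle_deg) >= angle_th and event_count >= event_th


def is_straight_travel(steering_angle_deg, event_count,
                       angle_th=15.0, event_th=500):
    """Straight travel: steering within the angle and few detected events."""
    return abs(steering_angle_deg) < angle_th and event_count < event_th
```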

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configurations, the control unit may be configured to switch the resolution of the event detection sensor for each region of the pixel array unit in the event detection sensor.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, an image sensor that performs imaging at a predetermined frame rate may be included. Then, the object recognition unit may be configured to perform an event recognition process based on the image data output from the image sensor.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, when determining that the recognition processing cannot be performed using only the event signal output from the event detection sensor, the control unit may be configured to perform the event recognition processing using both the event signal output from the event detection sensor and the image data output from the image sensor.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to specify an area detectable as the moving body in the angle of view of the event detection sensor based on a result of the event recognition processing, and when the area detectable as the moving body is greater than or equal to a predetermined threshold value, determine that congestion has occurred on a condition that the traveling speed of the moving body is less than the predetermined threshold value, and set the second resolution mode for the specified area.

Further, in the imaging apparatus and the imaging system of the present disclosure including the above-described preferred configuration, the control unit may be configured to specify an area detectable as the moving body in the angle of view of the event detection sensor based on a result of the event recognition processing, and when the area detectable as the moving body is greater than or equal to a predetermined threshold value, determine to travel on a highway on a condition that a travel speed of the moving body is greater than or equal to the predetermined threshold value, and set the second resolution mode for the specified area.
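The region-wise mode setting in the two preceding paragraphs can be sketched as follows, assuming the pixel array is divided into a fixed number of regions. The region layout, thresholds, and function name are all illustrative assumptions.

```python
# Sketch of per-region resolution assignment: regions where a moving body is
# detected get the second (high) resolution mode when the congestion or
# highway condition holds; all other regions stay in the first (low) mode.


def assign_region_modes(moving_body_regions, total_moving_area, area_th,
                        speed_kmh, congestion_speed_th=20.0,
                        highway_speed_th=80.0, n_regions=16):
    """Return a list of per-region modes ('first' or 'second')."""
    modes = ["first"] * n_regions
    congested = speed_kmh < congestion_speed_th       # congestion condition
    on_highway = speed_kmh >= highway_speed_th        # highway condition
    if total_moving_area >= area_th and (congested or on_highway):
        for region in moving_body_regions:
            modes[region] = "second"
    return modes
```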

<First embodiment of the present disclosure>

<Configuration example of the imaging system according to the first embodiment>

Fig. 1 is a block diagram showing an example of a system configuration of an imaging system according to a first embodiment of the present disclosure.

As shown in fig. 1, an imaging system 1A according to a first embodiment of the present disclosure includes an event detection sensor 10, a motion recognition unit 30, an object recognition unit 40, a control unit 50, an operation mode definition unit 60, a recording unit 70, and an interface 80. The imaging system 1A according to the first embodiment can be mounted and used on a mobile body such as a vehicle.

Taking the case of being mounted on a vehicle as an example, the imaging system 1A is arranged and used at a predetermined position of the vehicle, such as at least one position of a front nose, a rear view mirror, a rear bumper, a rear door, or an upper portion of a windshield in the vehicle. Details of an application example of the technique according to the present disclosure (i.e., the imaging system 1A according to the first embodiment) will be described later.

As the event detection sensor 10, an asynchronous imaging device called a DVS can be used, which detects, as an event, a luminance change exceeding a predetermined detection threshold in a pixel that photoelectrically converts incident light. In contrast to a synchronous imaging device that performs imaging in synchronization with a vertical synchronization signal, an asynchronous imaging device detects an event asynchronously with the vertical synchronization signal. Details of the event detection sensor 10 including the asynchronous imaging device will be described later.

The motion recognition unit 30 recognizes (detects) the motion of the object based on an event signal (event data) indicating the occurrence of an event output from the event detection sensor 10. An example of a specific configuration of the motion recognition unit 30 is shown in fig. 2A. For example, the motion recognition unit 30 includes an event frame generation unit 31 and a motion detection unit 32.

The event frame generation unit 31 generates an event frame by aggregating the events that have occurred within a certain period of time, based on the event signal output from the event detection sensor 10. The motion detection unit 32 performs motion detection between the event frames generated by the event frame generation unit 31. Note that the event frame generation unit 31 does not necessarily have to perform framing; that is, the motion recognition unit 30 may directly receive the asynchronously output event signal and perform motion detection.
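Framing asynchronous events over a fixed time window, as the event frame generation unit is described as doing, can be sketched minimally as follows. The event tuple layout (timestamp, x, y, polarity) and the count-based frame representation are assumptions for the example.

```python
# Sketch of event frame generation: accumulate asynchronous (t, x, y, p)
# events into per-window 2D event-count frames.


def build_event_frames(events, window_us, width, height):
    """Group events by time window and return frames in time order."""
    frames = {}
    for t, x, y, p in events:
        key = t // window_us  # index of the time window this event falls in
        frame = frames.setdefault(key, [[0] * width for _ in range(height)])
        frame[y][x] += 1      # count events per pixel, ignoring polarity
    return [frames[k] for k in sorted(frames)]
```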

The object recognition unit 40 performs recognition processing on the object detected as the event based on the result of the motion detection given by the motion recognition unit 30. An example of a specific configuration of the object recognition unit 40 is shown in fig. 2B. For example, the object recognition unit 40 includes an ROI extraction unit 41 and a recognition processing unit 42.

The ROI extraction unit 41 extracts a specific region for performing object recognition, that is, a region of interest (ROI). The recognition processing unit 42 performs recognition processing of the object based on the data of the region extracted by the ROI extraction unit 41. For the object recognition in the recognition processing unit 42, a pattern recognition technique based on machine learning using a neural network or the like can be used, for example, a technique that performs image recognition by comparing feature points of an image given as teaching data with feature points of the captured subject image.
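The ROI extraction step amounts to cropping a rectangular sub-region from a frame before recognition; a minimal sketch follows. The list-of-lists frame format and the (x0, y0, w, h) ROI convention are assumptions.

```python
# Sketch of ROI extraction: crop a w-by-h rectangle starting at (x0, y0).


def extract_roi(frame, x0, y0, w, h):
    """Return the sub-array of `frame` covering the region of interest."""
    return [row[x0:x0 + w] for row in frame[y0:y0 + h]]
```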

For example, the control unit 50 includes a processor (CPU) and controls the event detection sensor 10; specifically, it controls the resolution of the event detection sensor 10 based on information given from the operation mode defining unit 60. Various information such as the vehicle speed is provided to the control unit 50 via the interface 80 from a vehicle control system 7000 (see Fig. 34), which is an example of a mobile body control system to which the technique according to the present disclosure can be applied. The details of controlling the resolution of the event detection sensor 10 will be described below.

Under the control of the control unit 50, the operation mode defining unit 60 detects the traveling state of the vehicle, which is an example of the moving body, such as a congested state or traveling on a highway, using the motion recognition result given by the motion recognition unit 30 and the object recognition result given by the object recognition unit 40.

The information output from the operation mode defining unit 60 is supplied to the control unit 50 as information for controlling the resolution of the event detection sensor 10, and is stored in the recording unit 70 as needed. Further, the information output from the operation mode defining unit 60 is supplied to the vehicle control system 7000 (see Fig. 34) via the interface 80.

In the imaging system 1A according to the first embodiment of the present disclosure having the above-described configuration, the imaging apparatus of the present disclosure includes at least the event detection sensor 10 and the control unit 50. In the imaging apparatus of the present disclosure, the control unit 50 controls switching of the resolution of the event detection sensor 10 according to the traveling state of the vehicle as an example of the moving body. Further, an imaging system in which the vehicle control system 7000 has the functions of the operation mode defining unit 60 and the recording unit 70 may also be configured.

<Configuration example of the event detection sensor>

Hereinafter, details of the event detection sensor 10 will be described. Fig. 3 is a block diagram showing an example of the configuration of the event detection sensor 10 in the imaging system 1A having the above-described configuration.

As shown in Fig. 3, the event detection sensor 10 includes a pixel array unit 12 in which a plurality of pixels 11 are two-dimensionally arranged in a matrix (array). Each of the plurality of pixels 11 generates, as a pixel signal, an analog signal of a voltage corresponding to the photocurrent generated by photoelectric conversion. Further, each of the plurality of pixels 11 detects the presence or absence of an event according to whether a change exceeding a predetermined threshold occurs in the photocurrent corresponding to the luminance of the incident light. In other words, each of the plurality of pixels 11 detects a luminance change exceeding a predetermined threshold as an event.
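The per-pixel behavior described above can be illustrated with a toy model: an event fires when the log photocurrent (which tracks log luminance) departs from the level recorded at the last event by more than a threshold. The class, threshold value, and the use of a natural logarithm are illustrative assumptions, not circuit details from the disclosure.

```python
import math

# Toy model of per-pixel event detection in a DVS-style sensor.


class PixelEventDetector:
    def __init__(self, threshold=0.2):
        self.threshold = threshold
        self.reference = None  # log intensity at the last event (or at start)

    def feed(self, intensity):
        """Return +1 for an ON event, -1 for an OFF event, 0 otherwise."""
        level = math.log(intensity)
        if self.reference is None:
            self.reference = level  # initialize on the first sample
            return 0
        diff = level - self.reference
        if diff > self.threshold:
            self.reference = level  # reset reference on event, as the
            return 1                # arbitration/reset path would do
        if diff < -self.threshold:
            self.reference = level
            return -1
        return 0
```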

The event detection sensor 10 includes, as peripheral circuit units of the pixel array unit 12, a drive unit 13, an arbitration unit (arbitration unit) 14, a column processing unit 15, and a signal processing unit 16 in addition to the pixel array unit 12.

When an event is detected, each of the plurality of pixels 11 outputs a request for outputting event data indicating the occurrence of the event to the arbitration unit 14. Then, in a case where a response indicating permission to output the event data is received from the arbitration unit 14, each of the plurality of pixels 11 outputs the event data to the drive unit 13 and the signal processing unit 16. Further, the pixel 11 that detects an event outputs an analog pixel signal generated by photoelectric conversion to the column processing unit 15.

The driving unit 13 drives each pixel 11 of the pixel array unit 12. For example, the driving unit 13 drives the pixels 11 that detect an event and output event data, and outputs analog pixel signals of the pixels 11 to the column processing unit 15.

The arbitration unit 14 arbitrates an output request of event data supplied from each of the plurality of pixels 11, and transmits a response based on the arbitration result (permission/non-permission of output of event data) and a reset signal for resetting event detection to the pixels 11.

For example, the column processing unit 15 includes an analog-to-digital conversion unit including a group of analog-to-digital converters provided for each pixel column of the pixel array unit 12. As the analog-to-digital converter, for example, a single slope type analog-to-digital converter can be exemplified.

The column processing unit 15 performs processing of converting analog pixel signals output from the pixels 11 of the pixel array unit 12 into digital signals for each pixel column of the pixel array unit 12. The column processing unit 15 can also perform Correlated Double Sampling (CDS) processing on the digitized pixel signals.

The signal processing unit 16 performs predetermined signal processing on the digitized pixel signals supplied from the column processing unit 15 and the event data output from the pixel array unit 12, and outputs the event data and the pixel signals after the signal processing.

As described above, a change in the photocurrent generated in the pixel 11 can also be regarded as a change in the amount of light incident on the pixel 11 (a change in luminance). Therefore, it can be said that the event is that the change in the amount of light (change in brightness) of the pixel 11 exceeds a predetermined threshold. The event data indicating the occurrence of an event includes at least position information such as coordinates indicating the position of the pixel 11 in which a change in the amount of light as an event has occurred. In addition to the position information, the event data may include the polarity of the light amount change.
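The event condition described above (a change in the amount of light exceeding a predetermined threshold, reported together with position information and polarity) can be illustrated with a simple numeric sketch. This is an illustrative model only, not the circuit implementation described later; the names and threshold value are assumptions:

```python
# Illustrative sketch: per-pixel event detection, firing an event when the
# change in the light amount exceeds a predetermined threshold.

def detect_event(prev_level, curr_level, threshold):
    """Return +1 (increase), -1 (decrease), or None when the change in the
    light amount at a pixel does not exceed the threshold."""
    delta = curr_level - prev_level
    if delta > threshold:
        return +1   # polarity: light amount increased
    if delta < -threshold:
        return -1   # polarity: light amount decreased
    return None     # no event

def scan_frame(prev, curr, threshold):
    """Collect event data (x, y, polarity) for every pixel whose change
    exceeds the threshold; non-event pixels produce no output."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            pol = detect_event(p, c, threshold)
            if pol is not None:
                events.append((x, y, pol))
    return events
```

As in the sensor itself, only pixels whose light amount changed beyond the threshold contribute any output.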

For a series of event data output from the pixels 11 at the timing of occurrence of an event, it can be said that the event data implicitly includes time information indicating the relative time when the event occurred, as long as the interval between the event data is maintained as it was when the event occurred. However, once the event data is stored in a memory or the like, the interval between the event data is no longer maintained as it was when the event occurred, and the time information implicitly included in the event data is lost. For this reason, before the interval between the event data is no longer maintained as it was when the event occurred, the signal processing unit 16 includes, in the event data, time information indicating the relative time when the event occurred, such as a time stamp.

[ configuration example of Pixel array Unit ]

Fig. 4 is a block diagram showing an example of the configuration of the pixel array unit 12 in the event detection sensor 10.

In the pixel array unit 12 in which a plurality of pixels 11 are two-dimensionally arranged in a matrix, each of the plurality of pixels 11 includes a light receiving unit 61, a pixel signal generating unit 62, and an event detecting unit 63.

In the pixel 11 having the above-described configuration, the light receiving unit 61 photoelectrically converts incident light to generate a photocurrent. Then, according to the control of the driving unit 13 (see fig. 3), the light receiving unit 61 supplies a voltage signal corresponding to a photocurrent generated by photoelectrically converting incident light to the pixel signal generating unit 62 or the event detecting unit 63.

The pixel signal generation unit 62 generates a signal of a voltage corresponding to the photocurrent supplied from the light reception unit 61 as an analog pixel signal SIG. Then, the pixel signal generating unit 62 supplies the generated analog pixel signal SIG to the column processing unit 15 via the vertical signal line VSL wired for each pixel column of the pixel array unit 12 (see fig. 3).

The event detecting unit 63 detects the presence or absence of an event according to whether or not the amount of change in the photocurrent from the light receiving unit 61 exceeds a predetermined threshold value. For example, the events include an on-event indicating that the amount of change in the photocurrent exceeds an upper threshold value and an off-event indicating that the amount of change falls below a lower threshold value. Further, for example, the event data indicating the occurrence of an event includes 1 bit indicating the on-event detection result and 1 bit indicating the off-event detection result. Note that the event detecting unit 63 may be configured to detect only the on-event.

When an event occurs, the event detection unit 63 outputs a request for outputting event data indicating the occurrence of the event to the arbitration unit 14 (see fig. 3). Then, in the case of receiving a response to the request from the arbitration unit 14, the event detection unit 63 outputs event data to the drive unit 13 and the signal processing unit 16.

[ example of Circuit configuration of Pixel ]

Fig. 5 is a circuit diagram showing an example of the circuit configuration of the pixels 11 of the pixel array unit 12 in the event detection sensor 10.

As described above, each of the plurality of pixels 11 includes the light receiving unit 61, the pixel signal generating unit 62, and the event detecting unit 63.

In the pixel 11 having the above-described configuration, the light receiving unit 61 includes a light receiving element (photoelectric conversion element) 611, a transfer transistor 612, and a transfer transistor 613. For example, N-type Metal Oxide Semiconductor (MOS) transistors may be used as the transfer transistor 612 and the transfer transistor 613. The transfer transistor 612 and the transfer transistor 613 are connected in series with each other.

The light receiving element 611 is connected between the ground and the common connection node N1 of the transfer transistor 612 and the transfer transistor 613, and photoelectrically converts incident light to generate charges of a charge amount corresponding to the amount of the incident light.

The transfer signal TRG is supplied from the driving unit 13 shown in fig. 3 to the gate electrode of the transfer transistor 612. The transfer transistor 612 is turned on in response to the transfer signal TRG, thereby supplying an electric signal generated by photoelectric conversion of the light receiving element 611 to the pixel signal generation unit 62.

The control signal OFG is supplied from the driving unit 13 to the gate electrode of the transfer transistor 613. The transfer transistor 613 is turned on in response to the control signal OFG, thereby supplying an electric signal generated by photoelectric conversion of the light receiving element 611 to the event detection unit 63. The electrical signal supplied to the event detecting unit 63 is a photocurrent including electric charges.

The pixel signal generating unit 62 includes a reset transistor 621, an amplification transistor 622, a selection transistor 623, and a floating diffusion layer 624. For example, N-type MOS transistors may be used as the reset transistor 621, the amplification transistor 622, and the selection transistor 623.

The electric charges photoelectrically converted by the light receiving element 611 of the light receiving unit 61 are supplied to the pixel signal generating unit 62 through the transfer transistor 612. The charges supplied from the light receiving unit 61 are accumulated in the floating diffusion layer 624. The floating diffusion layer 624 generates a voltage signal of a voltage value corresponding to the amount of charge of the accumulated charges. That is, the floating diffusion layer 624 is a charge-voltage conversion unit that converts charges into voltages.

The reset transistor 621 is connected between the power supply voltage VDD and the floating diffusion layer 624. A reset signal RST is supplied from the driving unit 13 to the gate electrode of the reset transistor 621. The reset transistor 621 is turned on in response to the reset signal RST, thereby initializing (resetting) the floating diffusion layer 624.

The amplification transistor 622 and the selection transistor 623 are connected in series between the power supply voltage VDD and the vertical signal line VSL. The amplification transistor 622 amplifies the voltage signal subjected to charge-voltage conversion in the floating diffusion layer 624.

A selection signal SEL is supplied from the driving unit 13 to the gate electrode of the selection transistor 623. The selection transistor 623 is turned on in response to the selection signal SEL, thereby outputting the voltage signal amplified by the amplification transistor 622 as an analog pixel signal SIG to the column processing unit 15 (see fig. 3) via the vertical signal line VSL.

In the event detection sensor 10 including the pixel array unit 12 in which the pixels 11 having the above-described configuration are two-dimensionally arranged, the control unit 50 shown in fig. 1 instructs the drive unit 13 to start event detection. Then, when an instruction to start event detection is given, the driving unit 13 drives the transfer transistor 613 by supplying the control signal OFG to the transfer transistor 613 of the light receiving unit 61, and causes a photocurrent corresponding to the electric charge generated by the light receiving element 611 to be supplied to the event detecting unit 63.

Then, when an event is detected in a certain pixel 11, the driving unit 13 turns off the transfer transistor 613 of the pixel 11 and stops supplying the photocurrent to the event detecting unit 63. Next, the driving unit 13 drives the transfer transistor 612 by supplying the transfer signal TRG to the transfer transistor 612, and causes the electric charges photoelectrically converted by the light receiving element 611 to be transferred to the floating diffusion layer 624.

In this way, the event detection sensor 10 including the pixel array unit 12 in which the pixels 11 having the above-described configuration are two-dimensionally arranged outputs only pixel signals of the pixels 11 in which an event is detected to the column processing unit 15. As a result, power consumption and the amount of image processing of the event detection sensor 10 can be reduced compared to the case where pixel signals of all pixels are output regardless of the presence or absence of an event.

Note that the configuration of the pixel 11 shown here is an example, and is not limited to this configuration example. For example, in the case where it is not necessary to output the pixel signal, a pixel configuration not including the pixel signal generating unit 62 may be used. In the case of such a pixel configuration, it is only necessary to omit the transfer transistor 612 in the light receiving unit 61. Further, the column processing unit 15 of fig. 3 may be configured not to include an analog-to-digital conversion function. By adopting a pixel configuration that does not output a pixel signal, the scale of the event detection sensor 10 can be suppressed.

[ example of configuration of event detecting Unit ]

Fig. 6 is a block diagram showing an example of the circuit configuration of the event detection unit 63 in the pixel 11 of the event detection sensor 10.

As shown in fig. 6, the event detecting unit 63 according to the present example includes a current-voltage converting unit 631, a buffer 632, a subtracting unit 633, a quantizing unit 634, and a transmitting unit 635.

The current-voltage conversion unit 631 converts the photocurrent supplied from the light receiving unit 61 of the pixel 11 into a voltage signal corresponding to the logarithm of the photocurrent (hereinafter, may be referred to as a "photovoltage"), and supplies the voltage signal to the buffer 632. The buffer 632 buffers the photovoltage supplied from the current-voltage conversion unit 631 and supplies the photovoltage to the subtraction unit 633.

The subtraction unit 633 calculates the difference between the current photovoltage and the photovoltage at a time a minute time earlier than the current time, and supplies a difference signal corresponding to the difference to the quantization unit 634. The quantization unit 634 quantizes the difference signal supplied from the subtraction unit 633 into a digital signal, and supplies the digital value of the difference signal to the transmission unit 635.
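A simplified numeric model of the subtraction and quantization stages is sketched below. It latches a reference photovoltage, outputs the difference from that reference, and signals an event when the magnitude of the difference exceeds the quantizer threshold, re-latching the reference as the reset does in the circuit. The class name and threshold value are illustrative assumptions:

```python
# Illustrative model of the subtraction unit 633 (difference from a latched
# reference photovoltage) and the quantization unit 634 (1-bit threshold
# comparison), with the reference re-latched after each detected event.

class EventDetector:
    def __init__(self, threshold):
        self.threshold = threshold
        self.reference = None  # photovoltage latched at the last reset

    def step(self, photovoltage):
        """Feed the current photovoltage; return True when the difference
        from the latched reference exceeds the threshold (an event)."""
        if self.reference is None:
            self.reference = photovoltage
            return False
        diff = photovoltage - self.reference   # subtraction stage
        if abs(diff) > self.threshold:         # quantization stage
            self.reference = photovoltage      # reset after the event
            return True
        return False
```

Small drifts below the threshold accumulate against the latched reference until an event fires, which mirrors how the circuit responds to gradual luminance changes.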

When the digital value of the difference signal is supplied from the quantization unit 634, the transmission unit 635 supplies a request for transmission of event data to the arbitration unit 14. Then, when receiving a response to the request, i.e., a response to allow the event data to be output, from the arbitration unit 14, the transmission unit 635 supplies the event data to the drive unit 13 and the signal processing unit 16 according to the digital value of the difference signal supplied from the quantization unit 634.

Subsequently, a configuration example of the current-voltage conversion unit 631, the subtraction unit 633, and the quantization unit 634 in the event detection unit 63 will be described.

(configuration example of Current-Voltage conversion Unit)

Fig. 7 is a circuit diagram showing an example of the configuration of the current-voltage conversion unit 631 in the event detection unit 63.

As shown in fig. 7, the current-voltage conversion unit 631 according to the present example has a circuit configuration including a transistor 6311, a transistor 6312, and a transistor 6313. N-type MOS transistors may be used as the transistor 6311 and the transistor 6313, and a P-type MOS transistor may be used as the transistor 6312.

The transistor 6311 is connected between the power supply voltage VDD and the signal input line 6314. The transistor 6312 and the transistor 6313 are connected in series between the power supply voltage VDD and the ground. Then, the gate electrode of the transistor 6311 and the input terminal of the buffer 632 shown in fig. 6 are connected to the common connection node N2 of the transistor 6312 and the transistor 6313.

A predetermined bias voltage Vbias is applied to the gate electrode of the transistor 6312. As a result, the transistor 6312 supplies a constant current to the transistor 6313. A photocurrent is input from the light receiving unit 61 to the gate electrode of the transistor 6313 through the signal input line 6314.

The drain electrode of the transistor 6311 is connected to the power supply voltage VDD, forming a source follower configuration. The gate electrode of the transistor 6313 is connected to the source electrode of the transistor 6311. Then, the photocurrent from the light receiving unit 61 is converted by the transistor 6311 and the transistor 6313 having the source follower configuration into a photovoltage corresponding to the logarithm of the photocurrent.
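The logarithmic conversion above can be modeled numerically. In the subthreshold region the gate-source voltage of a MOS transistor varies with the logarithm of its drain current, so the pair produces a photovoltage proportional to log(photocurrent). The constants `v0` and `slope` below are illustrative assumptions, not values from the present disclosure:

```python
# Illustrative model of the logarithmic current-voltage conversion:
# photovoltage = v0 + slope * log10(photocurrent / 1 A).
# v0 and slope are assumed constants for illustration only.

import math

def photo_voltage(photocurrent, v0=0.5, slope=0.06):
    """Photovoltage (V) for a given photocurrent (A), logarithmic response."""
    return v0 + slope * math.log10(photocurrent)
```

A decade change in photocurrent moves the output by only `slope` volts, which is what gives this front end its wide dynamic range.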

(configuration example of subtracting unit and quantizing unit)

Fig. 8 is a circuit diagram showing an example of the configuration of the subtraction unit 633 and the quantization unit 634 in the event detection unit 63.

The subtraction unit 633 according to the present example includes a capacitive element 6331, an operational amplifier 6332, a capacitive element 6333, and a switching element 6334.

One end of the capacitor element 6331 is connected to the output end of the buffer 632 shown in fig. 6, and the other end of the capacitor element 6331 is connected to the input end of the operational amplifier 6332. As a result, the photo voltage supplied from the buffer 632 is input to the input terminal of the operational amplifier 6332 via the capacitance element 6331.

The capacitive element 6333 is connected in parallel with the operational amplifier 6332. The switching element 6334 is connected between both ends of the capacitance element 6333. The reset signal is supplied from the arbitration unit 14 shown in fig. 3 to the switching element 6334 as a control signal for opening and closing the switching element 6334. The switching element 6334 opens and closes a path connecting both ends of the capacitive element 6333 in response to a reset signal.

In the subtraction unit 633 having the above-described configuration, the photovoltage input to the terminal on the buffer 632 side of the capacitive element 6331 when the switching element 6334 is turned on (closed) is denoted by Vinit. When the photovoltage Vinit is input to the terminal of the capacitive element 6331 on the buffer 632 side, the terminal on the opposite side becomes a virtual ground terminal. For convenience, the potential of the virtual ground terminal is set to zero. At this time, when the capacitance value of the capacitive element 6331 is C1, the electric charge Qinit accumulated in the capacitive element 6331 is represented by the following equation (1).

Qinit=C1×Vinit (1)

Further, when the switching element 6334 is turned on, both ends of the capacitive element 6333 are short-circuited, so that the electric charge accumulated in the capacitive element 6333 is zero. Thereafter, the switching element 6334 is turned off (opened). The photovoltage at the terminal on the buffer 632 side of the capacitive element 6331 after the switching element 6334 is turned off is denoted by Vafter. The electric charge Qafter accumulated in the capacitive element 6331 when the switching element 6334 is turned off is represented by the following equation (2).

Qafter=C1×Vafter (2)

When the capacitance value of the capacitive element 6333 is denoted by C2 and the output voltage of the operational amplifier 6332 is denoted by Vout, the electric charge Q2 accumulated in the capacitive element 6333 is represented by the following equation (3).

Q2=-C2×Vout (3)

Before and after the switching element 6334 is turned off, the total charge amount including the charge amount of the capacitor element 6331 and the charge amount of the capacitor element 6333 does not change, and thus the following equation (4) holds.

Qinit=Qafter+Q2 (4)

When equations (1) to (3) are substituted into equation (4), the following equation (5) is obtained.

Vout=-(C1/C2)×(Vafter-Vinit) (5)

According to equation (5), the subtraction unit 633 performs subtraction between the photovoltage Vinit and the photovoltage Vafter, and calculates the difference signal Vout corresponding to the difference (Vinit - Vafter) between the photovoltage Vinit and the photovoltage Vafter. Further, according to equation (5), the subtraction gain of the subtraction unit 633 is C1/C2. In general, it is desirable to maximize the subtraction gain of the subtraction unit 633, and thus it is preferable to design the capacitance value C1 of the capacitive element 6331 to be large and the capacitance value C2 of the capacitive element 6333 to be small.
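The relation of equation (5) can be checked numerically as below; the capacitance and voltage values are illustrative only, not values from the present disclosure:

```python
# Numeric check of equation (5): the difference signal output by the
# subtraction unit is Vout = -(C1/C2) * (Vafter - Vinit), so the
# subtraction gain is C1/C2. All values are illustrative.

def subtraction_output(c1, c2, v_init, v_after):
    """Difference signal Vout per equation (5)."""
    return -(c1 / c2) * (v_after - v_init)
```

With C1 = 8 and C2 = 2, a photovoltage change of 0.1 V is amplified by the gain C1/C2 = 4 into a 0.4 V difference signal, which is why a large C1 and small C2 are preferred.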

On the other hand, if the capacitance value C2 of the capacitive element 6333 is too small, kTC noise may increase and the noise characteristics may deteriorate, so that the reduction of the capacitance value C2 of the capacitive element 6333 is limited to a range in which the noise can be tolerated. Further, since the event detecting unit 63 including the subtraction unit 633 is mounted on each pixel 11, the areas of the capacitive element 6331 and the capacitive element 6333 are limited. In view of the above, the capacitance value C1 of the capacitive element 6331 and the capacitance value C2 of the capacitive element 6333 are determined.

In fig. 8, the quantization unit 634 includes a comparator 6341. In the comparator 6341, the difference signal from the subtraction unit 633 (i.e., the output signal of the operational amplifier 6332) is the non-inverting (+) input, and a predetermined threshold voltage Vth is the inverting (-) input. The comparator 6341 compares the difference signal Vout from the subtraction unit 633 with the predetermined threshold voltage Vth, and outputs a high level or a low level indicating the comparison result to the transmission unit 635 shown in fig. 6 as the quantized value of the difference signal Vout.

When the transmission unit 635 recognizes, based on the quantized value of the difference signal Vout from the quantization unit 634, that a change in the amount of light (change in luminance) as an event has occurred, that is, when the difference signal Vout is greater than (or less than) the predetermined threshold voltage Vth, the transmission unit 635 outputs, for example, high-level event data indicating the occurrence of the event to the signal processing unit 16 of fig. 3. That is, the threshold voltage Vth is a threshold value for detecting an event based on a change in the amount of light (change in luminance) of the pixel 11.

The signal processing unit 16 includes, in the event data supplied from the transmission unit 635, position information of the pixels 11 that detect the event represented by the event data, and time information indicating the time when the event occurs, and further includes polarity information of a change in the amount of light as an event as necessary, and outputs the event data.

For example, as the data format of event data including the position information of the pixel 11 that detects an event, the time information indicating the time when the event occurs, and the polarity information of the change in the amount of light as the event, a data format called Address Event Representation (AER) can be employed.
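A minimal AER-style packing of one event into an integer word is sketched below. The field widths chosen here are illustrative assumptions, since the present disclosure does not fix a bit layout and real AER encodings vary by sensor:

```python
# Illustrative AER-style event word: position (x, y), timestamp, polarity.
# Field widths (14 bits for x, 14 bits for y, 1 polarity bit, remaining
# bits for the timestamp) are assumptions for this sketch.

def pack_event(x, y, timestamp_us, polarity):
    """Pack one event into an integer word."""
    assert polarity in (0, 1)
    return (timestamp_us << 29) | (x << 15) | (y << 1) | polarity

def unpack_event(word):
    """Recover (x, y, timestamp_us, polarity) from a packed event word."""
    polarity = word & 0x1
    y = (word >> 1) & 0x3FFF
    x = (word >> 15) & 0x3FFF
    timestamp_us = word >> 29
    return x, y, timestamp_us, polarity
```

The round trip preserves all four fields, matching the event data contents listed above (position, time, polarity).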

Note that the pixel 11 may receive any light as incident light by providing a filter (e.g., a color filter) that transmits predetermined light. For example, in the case where the pixels 11 receive visible light as incident light, the event data represents the occurrence of a change in pixel value in an image in which a visible object appears. Further, for example, in the case where the pixels 11 receive infrared rays, millimeter waves, or the like for distance measurement as incident light, the event data indicates the occurrence of a change in distance to the subject. Further, for example, in the case where the pixel 11 receives infrared rays for measuring temperature as incident light, the event data indicates the occurrence of a temperature change of the subject. In the present embodiment, the pixel 11 receives visible light as incident light.

[ configuration example of chip Structure ]

For example, a stacked chip structure may be adopted as the chip (semiconductor integrated circuit) structure of the event detection sensor 10 described above. Fig. 9 is an exploded perspective view showing the outline of the stacked-chip structure of the event detection sensor 10.

As shown in fig. 9, the stacked chip structure, that is, the so-called stacked structure, is a structure in which at least two chips, a light receiving chip 101 as a first chip and a detection chip 102 as a second chip, are stacked. Then, in the circuit configuration of the pixel 11 shown in fig. 5, each light receiving element 611 is arranged on the light receiving chip 101, and all the elements other than the light receiving elements 611, the elements of the other circuit portions of the pixel 11, and the like are arranged on the detection chip 102. The light receiving chip 101 and the detection chip 102 are electrically connected via a connection portion such as a through-hole (VIA), Cu-Cu bonding, or a bump.

Note that here, a configuration example has been exemplified in which the light receiving element 611 is disposed on the light receiving chip 101, and elements other than the light receiving element 611, elements of other circuit portions of the pixel 11, and the like are disposed on the detection chip 102; however, the present disclosure is not limited to this configuration example.

For example, in the circuit configuration of the pixel 11 shown in fig. 5, each element of the light receiving unit 61 may be arranged on the light receiving chip 101, and elements other than the light receiving unit 61, elements of other circuit portions of the pixel 11, and the like may be arranged on the detection chip 102. Further, each element of the light receiving unit 61, the reset transistor 621 of the pixel signal generating unit 62, and the floating diffusion layer 624 may be disposed on the light receiving chip 101, and elements other than these elements may be disposed on the detection chip 102. Further, some of the elements constituting the event detection unit 63 may be arranged on the light receiving chip 101 together with the elements of the light receiving unit 61 and the like.

[ example of configuration of variable-resolution event detection sensor ]

The event detection sensor 10 having the above-described configuration may have a variable resolution. Fig. 10 shows an example of a specific configuration of the variable-resolution event detection sensor 10.

In order to make the resolution variable, the event detection sensor 10 includes pixels 11 and a connection control unit 64 between the pixels 11, and in the pixel array unit 12, a plurality of pixels 11 are arranged in a matrix. Fig. 10 shows a configuration in which the connection control unit 64 is arranged between two pixels 11 adjacent in the column direction (longitudinal direction/vertical direction).

The connection control unit 64 performs connection control of turning on/off the connection between a plurality of pixels 11 (in this example, between two pixels 11 adjacent in the longitudinal direction) according to an operation mode described later, that is, connects or disconnects the two pixels 11 to or from each other. Specifically, the connection control unit 64 turns on/off the connection between the pixel 11 in a certain column of an odd-numbered row and the pixel 11 in the same column of the next (even-numbered) row. In this example, the pixel array unit 12 is provided with connection control units 64, the number of which is 1/2 the number of the pixels 11, each turning on/off the connection between two pixels 11 adjacent in the column direction.
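The halving of the vertical resolution obtained by pairing pixels adjacent in the column direction can be sketched as follows. Summing the two photocurrents here stands in for the analog combination performed when the connection control unit connects a pixel pair; the function and values are illustrative:

```python
# Illustrative sketch of the low resolution behavior: each pair of rows
# (rows 2k and 2k+1) is combined into one row, modeling the connection of
# two pixels adjacent in the column direction.

def combine_column_pairs(photocurrents):
    """photocurrents: row-major grid of per-pixel photocurrents.
    Returns a half-height grid where each output row combines two
    adjacent input rows."""
    assert len(photocurrents) % 2 == 0
    return [
        [a + b for a, b in zip(photocurrents[i], photocurrents[i + 1])]
        for i in range(0, len(photocurrents), 2)
    ]
```

A 4-row array becomes a 2-row array, i.e. half the vertical resolution with the same number of columns.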

Here, for convenience, one of the two pixels 11 of the connection control object of the connection control unit 64 is described as a pixel 11A, and the other is described as a pixel 11B. Further, the event detection unit 63 of the pixel 11A is described as an event detection unit 63A, and the event detection unit 63 of the pixel 11B is described as an event detection unit 63B.

As shown in fig. 10, the connection control unit 64 includes a transistor 641 and a calculation unit 642. For example, an N-type MOS transistor may be used as the transistor 641.

The transistor 641 functions as a switching element that selectively connects the pixel 11A and the pixel 11B together so that the photocurrents generated by the two pixels 11A and 11B are combined between the two pixels. For example, the transistor 641 is turned on/off according to the operation mode of the connection control unit 64, thereby turning on/off the connection between the source electrode of the transistor 6311 through which the photocurrent of the pixel 11A flows and the source electrode of the transistor 6311 through which the photocurrent of the pixel 11B flows.

For example, the operation mode of the connection control unit 64 is specified from the drive unit 13 or the arbitration unit 14 of the event detection sensor 10 shown in fig. 3, or from the outside of the event detection sensor 10. Details of the operation mode of the connection control unit 64 will be described below.

The event data α is supplied from the quantization unit 634 in the event detection unit 63A of the pixel 11A to the calculation unit 642, and the event data β is supplied from the quantization unit 634 in the event detection unit 63B of the pixel 11B to the calculation unit 642.

The calculation unit 642 performs calculation on the event data α supplied from the pixel 11A and the event data β supplied from the pixel 11B according to the operation mode of the connection control unit 64. Then, the calculation unit 642 supplies new event data α 'and β' obtained by the calculation of the event data α and β to the transmission unit 635 of the respective pixels 11A and 11B.

Note that the connection control unit 64 performs on/off control of the connection between the pixel 11A and the pixel 11B, and also performs control of the on/off operation of the current-voltage conversion unit 631 including the transistor 6311, the transistor 6312, and the transistor 6313, by means of the bias voltage Vbias applied to the transistor 6312 of the current-voltage conversion unit 631 of the event detection unit 63.

That is, the connection control unit 64 turns on the transistor 6312, thereby turning on the operation of the current-voltage converting unit 631, i.e., putting the current-voltage converting unit 631 in an operating state. Further, the connection control unit 64 turns off the transistor 6312, thereby turning off the operation of the current-voltage conversion unit 631, i.e., putting the current-voltage conversion unit 631 in a stopped state.

Here, in the configuration example of the connection control unit 64 of fig. 10, under the control of the connection control unit 64, the transistor 6312 of one of the event detection unit 63A or the event detection unit 63B, for example, the transistor 6312 of the event detection unit 63A is always on, and the transistor 6312 of the other event detection unit 63B is turned on/off.

Note that, in fig. 10, in the event detection unit 63A and the event detection unit 63B, the buffer 632 of fig. 6 is not shown.

Incidentally, a vehicle, which is an example of a moving body, encounters various traveling states during traveling, such as a state of traveling in congestion, a state of traveling on a general road, and a state of traveling on a highway. Therefore, it is desirable that the event detection sensor 10 mounted and used on the vehicle be capable of accurately detecting events such as another vehicle and a pedestrian regardless of the traveling state of the vehicle.

Therefore, in the present embodiment, in order to be able to accurately detect an event regardless of the running state of the vehicle, in the event detection sensor 10 having the above-described configuration, the resolution of the event detection sensor 10 is switched according to the running state of the vehicle under the control of the control unit 50. Specifically, the operation mode of the connection control unit 64 is set according to the running state of the vehicle, and switching of the resolution of the event detection sensor 10 is performed according to the operation mode.

Examples of the operation mode of the connection control unit 64 include a high resolution (normal) mode, a low resolution mode, and a current averaging mode. Here, the low resolution mode is the first resolution mode in which the resolution of the event detection sensor 10 is relatively low (i.e., low resolution). The high resolution mode is the second resolution mode in which the resolution of the event detection sensor 10 is relatively high (i.e., high resolution).
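A policy sketch of the mode selection is given below. Which traveling state maps to which resolution mode is an illustrative assumption of this sketch; the description above only specifies that the control unit switches the resolution according to the traveling state:

```python
# Illustrative policy sketch only: the control unit 50 selects the
# resolution mode of the event detection sensor 10 from the traveling
# state of the vehicle. The state names and the mapping are assumptions.

FIRST_RESOLUTION_MODE = "low resolution"    # relatively low resolution
SECOND_RESOLUTION_MODE = "high resolution"  # relatively high resolution

def select_resolution_mode(traveling_state):
    """Map a traveling state to a resolution mode (illustrative mapping)."""
    mapping = {
        "congestion": SECOND_RESOLUTION_MODE,
        "general_road": SECOND_RESOLUTION_MODE,
        "highway": FIRST_RESOLUTION_MODE,
    }
    return mapping[traveling_state]
```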

Fig. 11 is a diagram showing an operation mode of the connection control unit 64 in the variable-resolution event detection sensor 10. Note that the transistor 641 of the connection control unit 64 and the transistors of the event detection unit 63A and the event detection unit 63B are turned on or off under the control of the connection control unit 64.

In the high resolution mode, the transistor 641 of the connection control unit 64 is turned off, and the transistor 6312 of the event detection unit 63B is turned on. Further, the calculation in the calculation unit 642 is performed according to a calculation table TL1 described later.

In the low resolution mode, the transistor 641 of the connection control unit 64 is turned on, and the transistor 6312 of the event detection unit 63B is turned off. Further, the calculation in the calculation unit 642 is performed according to a calculation table TL2 described later.

In the current average mode, the transistor 641 of the connection control unit 64 is turned on, and the transistor 6312 of the event detection unit 63B is turned off. Further, the calculation in the calculation unit 642 is performed according to a calculation table TL3 described later.

Fig. 12A, 12B, and 12C show a calculation table TL1, a calculation table TL2, and a calculation table TL3 for calculation in the calculation unit 642.

According to the calculation table TL1 in fig. 12A, the event data α and the event data β are output as they are, as the event data α' and the event data β', respectively.

According to the calculation table TL2 of fig. 12B, the event data α is output as it is as the event data α'. Further, as the event data β', 0 (0 volts) indicating that no event has occurred is output. Therefore, in the calculation according to the calculation table TL2, a calculation that restricts the output of event data indicating the occurrence of an event is performed on the event data β.

According to the calculation table TL3 in fig. 12C, the calculation result of the expression (α == β) ? α : 0 is output as the event data α'. Further, as the event data β', 0 indicating that no event has occurred is output.

The above expression (α == β) ? α : 0 is a conditional expression that takes the value of the event data α when the event data α and the event data β are equal to each other, and takes the value 0 otherwise.

According to the calculation table TL3, in the case where the event data α and the event data β are equal to each other, the event data α (═ β) is output as the event data α'. In the case where the event data α and the event data β are not equal, 0 indicating that no event has occurred is output as the event data α'.

Further, as the event data β', 0 indicating that no event has occurred is output. Therefore, in the calculation according to the calculation table TL3, the calculation of limiting the output of the event data indicating the occurrence of an event is performed in the case where the event data α and the event data β are not equal to each other for the event data α, and the calculation of limiting the output of the event data indicating the occurrence of an event is always performed for the event data β.
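The three calculation tables reduce to a simple mapping from the pair (α, β) to the pair (α', β'). The following Python sketch models that mapping (the function name and the 0/1 encoding of event data are illustrative, not from the disclosure; 1 stands for "event occurred", 0 for "no event"):

```python
def apply_table(table, alpha, beta):
    """Model of calculation tables TL1/TL2/TL3 (figs. 12A-12C).

    Returns the pair (alpha', beta') supplied to the transfer units.
    """
    if table == "TL1":   # high resolution mode: pass both values through unchanged
        return alpha, beta
    if table == "TL2":   # low resolution mode: only pixel 11A may report an event
        return alpha, 0
    if table == "TL3":   # current averaging mode: report only when both pixels agree
        return (alpha if alpha == beta else 0), 0
    raise ValueError(table)
```

For example, `apply_table("TL3", 1, 0)` yields `(0, 0)`: when the two event data disagree, the output of event data indicating the occurrence of an event is restricted.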

(case of high resolution mode)

Fig. 13 is a diagram illustrating the flow of photocurrent in the case where the operation mode of the connection control unit 64 of fig. 10 is the high resolution mode.

In the connection control unit 64 of fig. 10, the transistor 6312 of the event detection unit 63A is always on. Then, in the case where the operation mode of the connection control unit 64 is the high resolution mode, the transistor 641 of the connection control unit 64 is turned off, and the transistor 6312 of the event detection unit 63B is turned on.

As a result, in the high resolution mode, the connection between the pixel 11A and the pixel 11B is electrically disconnected by the transistor 641 in the off state, and the event detection unit 63A of the pixel 11A and the event detection unit 63B of the pixel 11B operate independently.

Further, in the high resolution mode, the calculation according to the calculation table TL1 of fig. 12A is performed in the calculation unit 642, and the event data α and the event data β ' are supplied as they are to the transmission unit 635 of the event detection unit 63A and the event detection unit 63B as the event data α ' and the event data β ', respectively.

As a result, the pixels 11A and 11B operate similarly to the case where the connection control unit 64 is not provided, and in the pixel array unit 12, resolution corresponding to the number of pixels 11 arranged in the pixel array unit 12, that is, high-resolution event data can be output as event data indicating occurrence of an event.

Here, the photocurrents generated by the light receiving elements (photoelectric conversion elements) 611 of the pixels 11A and 11B are denoted by Iph and Iph', respectively. In the high resolution mode, the photocurrent Iph generated by the pixel 11A flows through the transistor 6311 of the event detection unit 63A, and the photocurrent Iph' generated by the pixel 11B flows through the transistor 6311 of the event detection unit 63B.

(case of Low resolution mode)

Fig. 14 is a diagram illustrating the flow of photocurrent in the case where the operation mode of the connection control unit 64 of fig. 10 is the low resolution mode.

In the connection control unit 64 of fig. 10, the transistor 6312 of the event detection unit 63A is always on. Then, in a case where the operation mode of the connection control unit 64 is the low resolution mode, the transistor 641 of the connection control unit 64 is turned on, and the transistor 6312 of the event detection unit 63B is turned off.

As a result, in the low resolution mode, the pixel 11A and the pixel 11B are electrically connected together via the transistor 641 in an on state. That is, the source electrode of the transistor 6311 of the pixel 11A and the source electrode of the transistor 6311 of the pixel 11B are connected together, whereby the pixel 11A and the pixel 11B are connected together.

Further, in the low resolution mode, the transistor 6312 of the event detection unit 63B is turned off, whereby the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63B are turned off.

Further, in the low resolution mode, the calculation according to the calculation table TL2 of fig. 12B is performed in the calculation unit 642, and the event data α is output as the event data α' as it is. As the event data β', 0 indicating that no event has occurred is always output. Then, these event data α 'and event data β' are supplied to the transmission unit 635 of the event detection unit 63A and the event detection unit 63B, respectively.

As a result, with respect to the pixel 11A and the pixel 11B, only the pixel 11A outputs event data indicating the occurrence of an event, and the pixel 11B always outputs event data indicating that no event has occurred (i.e., does not output event data indicating the occurrence of an event).

Therefore, the pixel array unit 12 can output, as event data indicating the occurrence of an event, event data having a resolution of 1/2 of the resolution corresponding to the number of pixels 11 arranged in the pixel array unit 12. That is, in the present example, the resolution (maximum number) of event data indicating the occurrence of an event in the low resolution mode is 1/2 of that in the high resolution mode.

As described above, in the low resolution mode, the number of pixels 11 that output event data indicating the occurrence of an event can be suppressed. As a result, in the low resolution mode, a large number of events can be suppressed from occurring simultaneously as compared with the case of the high resolution mode.

Further, in the low resolution mode, the source electrode of the transistor 6311 of the pixel 11A and the source electrode of the transistor 6311 of the pixel 11B are connected together, and the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63B are turned off. As a result, a combined current (Iph + Iph'), in which the photocurrent Iph generated by the pixel 11A and the photocurrent Iph' generated by the pixel 11B are combined, flows through the transistor 6311 of the event detection unit 63A.

As a result, shot noise is relatively reduced, so that the S/N of the signal processed by the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63A can be improved compared with the case of the high resolution mode, and the reliability of event detection can be improved. In addition, since the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63B are turned off, power consumption can be reduced.
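The S/N improvement from combining the two photocurrents can be made plausible with an idealized shot-noise estimate: the signal doubles, while shot noise grows only as the square root of the current, so the S/N improves by a factor of √2. A rough numeric sketch under these assumptions (equal photocurrents, pure shot-noise scaling; all names are illustrative):

```python
import math

def shot_noise_snr(current):
    # Shot noise scales as sqrt(current); proportionality constants
    # cancel in the ratio, so S/N ~ current / sqrt(current) = sqrt(current).
    return current / math.sqrt(current)

iph = 1.0                                  # photocurrent of one pixel (arbitrary units)
snr_single = shot_noise_snr(iph)           # high resolution mode: one pixel per unit
snr_combined = shot_noise_snr(2 * iph)     # low resolution mode: combined Iph + Iph'

# Combining two equal photocurrents improves S/N by a factor of sqrt(2).
assert abs(snr_combined / snr_single - math.sqrt(2)) < 1e-12
```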

(case of current average mode)

Fig. 15 is a diagram illustrating the flow of photocurrent in the case where the operation mode of the connection control unit 64 of fig. 10 is the current averaging mode.

In the connection control unit 64 of fig. 10, the transistor 6312 of the event detection unit 63A is always on. Then, in a case where the operation mode of the connection control unit 64 is the current averaging mode, the transistor 641 of the connection control unit 64 is turned on, and the transistor 6312 of the event detection unit 63B is turned on.

As a result, in the current average mode, the pixel 11A and the pixel 11B are electrically connected together via the transistor 641 in an on state. That is, the source electrode of the transistor 6311 of the pixel 11A and the source electrode of the transistor 6311 of the pixel 11B are connected together, whereby the pixel 11A and the pixel 11B are connected together.

Further, in the current averaging mode, the transistor 6312 of the event detection unit 63B is turned on, whereby the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63B are turned on. As a result, in both the event detecting unit 63A and the event detecting unit 63B, the current-voltage converting unit 631 is in an operating state.

Further, in the current average mode, the calculation according to the calculation table TL3 of fig. 12C is performed in the calculation unit 642, and in the case where the event data α and the event data β are equal to each other, the event data α (═ β) is output as the event data α'. Further, in the case where the event data α and the event data β are not equal to each other, 0 indicating that no event has occurred is output. As the event data β', 0 indicating that no event has occurred is always output. Then, these event data α 'and event data β' are supplied to the transmission unit 635 of the event detection unit 63A and the event detection unit 63B, respectively.

As a result, with respect to the pixel 11A and the pixel 11B, only the pixel 11A outputs event data indicating the occurrence of an event, and the pixel 11B always outputs event data indicating that no event has occurred (i.e., does not output event data indicating the occurrence of an event).

Therefore, the pixel array unit 12 can output, as event data indicating the occurrence of an event, event data having a resolution of 1/2 of the resolution corresponding to the number of pixels 11 arranged in the pixel array unit 12. That is, in the current average mode, similarly to the case of the low resolution mode, the resolution (maximum number) of event data indicating the occurrence of an event is 1/2 of that in the high resolution mode.

As described above, in the current averaging mode, the number of pixels 11 that output event data indicating the occurrence of an event can be suppressed. As a result, in the current averaging mode, similarly to the low resolution mode, it is possible to suppress a large number of events from occurring simultaneously.

Further, in the current average mode, the source electrode of the transistor 6311 of the pixel 11A and the source electrode of the transistor 6311 of the pixel 11B are connected together, and the transistors 6311 to 6313 of the current-voltage conversion unit 631 of each of the event detection unit 63A and the event detection unit 63B are turned on. As a result, the average value of the photocurrent Iph generated by the pixel 11A and the photocurrent Iph' generated by the pixel 11B flows through each transistor 6311 of the event detection unit 63A and the event detection unit 63B.

As a result, noise is suppressed, so that S/N of signals processed by the transistors 6311 to 6313 of the current-voltage conversion unit 631 of the event detection unit 63A can be improved, and reliability of event detection can be improved.
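The noise suppression from averaging can be illustrated numerically: averaging two currents carrying independent noise keeps the mean signal but shrinks the noise standard deviation by roughly 1/√2. The following simulation sketch assumes uncorrelated Gaussian noise of equal magnitude on the two photocurrents (all names and values are illustrative):

```python
import random
import statistics

random.seed(0)

def noisy_current(mean, sigma, n):
    """n samples of a photocurrent with uncorrelated Gaussian noise."""
    return [random.gauss(mean, sigma) for _ in range(n)]

n = 100_000
a = noisy_current(1.0, 0.1, n)                 # samples of Iph  (pixel 11A)
b = noisy_current(1.0, 0.1, n)                 # samples of Iph' (pixel 11B)
avg = [(x + y) / 2 for x, y in zip(a, b)]      # averaged current

# Averaging preserves the mean but reduces the noise std by about 1/sqrt(2).
ratio = statistics.stdev(avg) / statistics.stdev(a)
assert 0.65 < ratio < 0.77                     # ~0.707 expected
```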

Further, in the current average mode, in the case where the event data α and the event data β are equal to each other according to the calculation of the calculation table TL3 in fig. 12C, the equal event data α and the event data β are output as the event data α', so that the reliability of the event data can be improved.

Note that the configuration of the variable-resolution event detection sensor 10 shown in fig. 10 is an example, and the configuration of the variable-resolution event detection sensor 10 is not limited to that of fig. 10. For example, it is also possible to provide a switching element (e.g., a transistor) that selectively connects together the gate electrode of the transistor 6311 through which the photocurrent of the pixel 11A flows and the gate electrode of the transistor 6311 through which the photocurrent of the pixel 11B flows, thereby selectively combining the photovoltages corresponding to the respective photocurrents.

(connection example of pixels to be connection-controlled by the connection control Unit)

In the above, the connection control unit 64 performs connection control for two pixels 11 adjacent to each other in the column direction (vertical direction/longitudinal direction), but this is not a limitation. That is, in addition to two pixels 11 adjacent to each other in the column direction, for example, 4 pixels 11 in 2 × 2 (horizontal (lateral) × vertical (vertical)), 9 pixels 11 in 3 × 3, 16 pixels 11 in 4 × 4, 4 pixels 11 in 4 × 1, 8 pixels 11 in 4 × 2, and any other combination of a plurality of pixels 11 may be the subject of connection control by the connection control unit 64.

Here, as an example, a connection example in which a plurality of pixels 11 are connected to maintain an aspect ratio will be described with reference to fig. 16. Fig. 16 is a diagram showing an example of connection of the plurality of pixels 11 by the connection control unit 64.

In the connection example of fig. 16, 2 × 2 pixels 11 are connected together as a first connection. In fig. 16, the first connected pixel group represents a group of 2 × 2 pixels 11. Further, the 2 × 2 pixels 11 connected together in the first connection are taken as a block, and the 2 × 2 pixel blocks are connected together as a second connection. In fig. 16, the second connected pixel group represents a group of 2 × 2 pixel blocks.

Further, in the connection example of fig. 16, the control signal line L11 for controlling on/off of the first connection (i.e., whether the 2 × 2 pixels 11 are connected together by the first connection) and the control signal line L12 for controlling on/off of the second connection (i.e., whether the 2 × 2 pixel blocks are connected together by the second connection) are wired in the column direction.

In fig. 16, in the case where the first connection and the second connection are disconnected, the resolution of event data indicating the occurrence of an event (the number of pixels 11 from which event data can be output) is a high resolution of 12 × 12. In the case where the first connection is turned on and the second connection is turned off, the resolution of event data indicating the occurrence of an event is a low resolution of 6 × 6. In the case where the first connection and the second connection are turned on, the resolution of event data indicating the occurrence of an event is a lower resolution of 3 × 3.

By adding a connection method of the pixels 11 and a connection control line for controlling connection in addition to the first connection and the second connection in fig. 16, lower resolution can be achieved as resolution of event data indicating occurrence of an event.
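The hierarchy of fig. 16 can be modeled as successive halvings of the array side length. The sketch below is an illustrative model (function and parameter names are not from the disclosure); it assumes, as in fig. 16, that the second connection groups blocks formed by the first connection:

```python
def event_resolution(base=12, first_on=False, second_on=False):
    """Effective event-data resolution for the fig. 16 connection example.

    base:      pixels per side of the array (12 x 12 in fig. 16)
    first_on:  2x2 pixels joined into one block (first connection)
    second_on: 2x2 blocks joined together (second connection,
               effective only when the first connection is on)
    """
    side = base
    if first_on:
        side //= 2          # 12 x 12 -> 6 x 6
        if second_on:
            side //= 2      # 6 x 6 -> 3 x 3
    return (side, side)
```

With both connections off this gives (12, 12); with only the first connection on, (6, 6); with both on, (3, 3), matching the three resolutions described above. Adding further connection levels and control lines would halve the side length again.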

Hereinafter, a specific example will be described in which, in the imaging system 1A according to the first embodiment, the resolution of the event detection sensor 10 is switched according to the running state of the vehicle to enable the event to be accurately detected regardless of the running state of the vehicle. In each example described below, the switching of the resolution is performed under the control of the control unit 50 of fig. 1. In this control, various types of information, such as the vehicle speed of the host vehicle, are provided from the vehicle control system 7000 shown in fig. 34 to the control unit 50 via the interface 80.

< example 1>

Example 1 is an example of switching the resolution of the event detection sensor 10 based on the vehicle speed of the host vehicle. The flow of resolution switching control according to example 1 is shown in the flowchart of fig. 17.

During vehicle travel, the control unit 50 determines whether the vehicle speed of the host vehicle is greater than or equal to a certain speed (step S11), and if so, sets the resolution of the event detection sensor 10 to the low resolution mode (step S12). In a running state where the vehicle speed is high, many event detections occur and the power consumption of the event detection sensor 10 increases, so the low resolution mode is set.

In the low resolution mode, event detection can be suppressed. In other words, in the low resolution mode, the region of the pixel array unit 12 that is in an operating state for event detection can be reduced. As a result, the number of pixels in an operating state for event detection is reduced, and the power consumption of the event detection sensor 10 can be reduced accordingly.

In the running state in the low resolution mode, the control unit 50 determines whether the vehicle speed of the host vehicle is less than a certain speed (step S13), and if the vehicle speed is less than a certain speed, switches from the low resolution mode to the high resolution mode (step S14) in which event detection can be performed with high accuracy. Then, the control unit 50 monitors the stop of the vehicle (step S15), and returns to step S11 to repeat the series of processes described above until the vehicle stops.

According to the resolution switching control of example 1 described above, the low resolution mode in which event detection can be suppressed is set in the running state at or above a certain speed at which multiple event detections may occur, based on the vehicle speed of the own vehicle, so that the power consumption of the event detection sensor 10 can be reduced.
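The speed-based rule of fig. 17 amounts to a simple threshold comparison repeated while the vehicle is moving. The sketch below models it in Python (the function name and the 60 km/h threshold are illustrative; the disclosure specifies only "a certain speed"):

```python
LOW, HIGH = "low", "high"

def select_mode_example1(vehicle_speed_kmh, threshold_kmh=60.0):
    """Mode selection of fig. 17 (steps S11-S14).

    At or above the threshold the low resolution mode is set to curb the
    number of simultaneously reported events and the sensor's power draw;
    below it the high resolution mode is restored for accurate detection.
    """
    return LOW if vehicle_speed_kmh >= threshold_kmh else HIGH
```

In a real system this function would be evaluated each control cycle until the vehicle stops (steps S15 and the return to S11).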

< example 2>

Example 2 is an example of switching the resolution of the event detection sensor 10 based on the relative speed with a preceding object (e.g., a vehicle). The flow of resolution switching control according to example 2 is shown in the flowchart of fig. 18.

During the vehicle travel, based on the recognition result of the object recognition unit 40 in fig. 1, the control unit 50 recognizes another vehicle traveling ahead of the own vehicle (step S21), then calculates the relative speed between the own vehicle and the other vehicle (step S22), and determines whether the relative speed is greater than or equal to a certain relative speed (step S23). Then, if the relative speed is greater than or equal to a certain relative speed, the control unit 50 sets a high resolution mode in which event detection can be performed with high accuracy, with respect to the resolution of the event detection sensor 10 (step S24).

In the running state in the high resolution mode, the control unit 50 calculates a relative speed between the host vehicle and another vehicle (step S25), and determines whether the relative speed is less than a certain relative speed (step S26). If the relative speed is less than a certain relative speed, the control unit 50 switches from the high resolution mode to the low resolution mode (step S27). The low resolution mode is set, whereby the power consumption of the event detection sensor 10 can be reduced. Then, the control unit 50 monitors the stop of the vehicle (step S28), and returns to step S21 to repeat the series of processes described above until the vehicle stops.

According to the resolution switching control of example 2 described above, the resolution of the event detection sensor 10 can be set to a mode suitable for the relative speed based on the relative speed between the host vehicle and the preceding vehicle.
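The relative-speed rule of fig. 18 inverts the direction of example 1: a large closing speed selects the high resolution mode. An illustrative sketch (the function name and the 20 km/h threshold are assumptions; speeds are taken in the same units):

```python
def select_mode_example2(own_speed, lead_speed, threshold=20.0):
    """Mode selection of fig. 18 (steps S21-S27).

    A large relative speed to the preceding vehicle calls for the high
    resolution mode so that events can be detected with high accuracy;
    otherwise the low resolution mode saves power.
    """
    relative_speed = abs(own_speed - lead_speed)
    return "high" if relative_speed >= threshold else "low"
```

The relative speed itself would come from the recognition result of the object recognition unit 40 (steps S22 and S25).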

< example 3>

Example 3 is an example in which the resolution of the event detection sensor 10 is switched based on the vehicle speed of the host vehicle and the number of events (the number of occurrences). The flow of resolution switching control according to example 3 is shown in the flowchart of fig. 19.

During the running of the vehicle, the control unit 50 determines whether the vehicle speed of the host vehicle is greater than or equal to a certain speed (step S31), and then determines whether the number of events detected by the event detection sensor 10 is greater than or equal to a predetermined threshold (step S32). If the vehicle speed is greater than or equal to a certain speed and the number of events is greater than or equal to a predetermined threshold, the control unit 50 sets the resolution of the event detection sensor 10 to the low resolution mode (step S33). In a running state where the vehicle speed is high, a plurality of event detections occur, and the power consumption of the event detection sensor 10 increases, thereby setting the low resolution mode. The low resolution mode is set, whereby the power consumption of the event detection sensor 10 can be reduced.

In the running state in the low resolution mode, the control unit 50 determines whether the vehicle speed of the own vehicle is less than a certain speed (step S34), and then determines whether the number of events detected by the event detection sensor 10 is less than a predetermined threshold (step S35). If the vehicle speed is less than a certain speed and the number of events is less than a predetermined threshold, the control unit 50 switches from the low resolution mode to the high resolution mode (step S36) in which event detection can be performed with high accuracy. Then, the control unit 50 monitors the stop of the vehicle (step S37), returns to step S31 to repeat the series of processes described above until it is determined that the vehicle has stopped (yes in S37).

According to the resolution switching control of example 3 described above, the low resolution mode is set in the running state at or above a certain speed at which a plurality of event detections may occur, based on the vehicle speed of the own vehicle and the number of events (the number of events occurring), so that the power consumption of the event detection sensor 10 can be reduced.
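In fig. 19, both conditions (speed and event count) must hold before the mode changes; if only one holds, the current mode is kept. The sketch below models this latching behavior (thresholds and names are illustrative):

```python
def update_mode_example3(mode, speed, n_events,
                         speed_th=60.0, event_th=1000):
    """Fig. 19 switching rule (steps S31/S32 and S34/S35).

    Both conditions must hold to leave the current mode, mirroring the
    two consecutive decisions in the flowchart; otherwise the previous
    mode is retained.
    """
    if speed >= speed_th and n_events >= event_th:
        return "low"     # fast and busy: suppress events, save power
    if speed < speed_th and n_events < event_th:
        return "high"    # slow and quiet: detect with high accuracy
    return mode          # mixed conditions: keep the current mode
```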

< example 4>

Example 4 is an example of switching the resolution of the event detection sensor 10 based on the relative speed with another vehicle and the event number. The flow of resolution switching control according to example 4 is shown in the flowchart of fig. 20.

During vehicle travel, based on the recognition result of the object recognition unit 40 in fig. 1, the control unit 50 recognizes another vehicle traveling ahead of the host vehicle (step S41), and then calculates the relative speed between the host vehicle and the other vehicle (step S42). Next, the control unit 50 determines whether the relative speed is greater than or equal to a certain relative speed (step S43), and then determines whether the number of events detected by the event detection sensor 10 is greater than or equal to a predetermined threshold (step S44). Then, if the relative speed is greater than or equal to a certain relative speed and the number of events is greater than or equal to a predetermined threshold, the control unit 50 sets the resolution of the event detection sensor 10 to the high resolution mode, in which event detection can be performed with high accuracy (step S45).

In the running state in the high resolution mode, the control unit 50 calculates the relative speed between the host vehicle and the other vehicle (step S46), determines whether the relative speed is less than a certain relative speed (step S47), and then determines whether the number of events detected by the event detection sensor 10 is less than a predetermined threshold (step S48). Then, if the relative speed is less than a certain relative speed and the number of events is less than a predetermined threshold, the control unit 50 switches from the high resolution mode to the low resolution mode (step S49). The low resolution mode is set, whereby the power consumption of the event detection sensor 10 can be reduced. Then, the control unit 50 monitors the stop of the vehicle (step S50), and returns to step S41 to repeat the series of processes described above until the vehicle stops.

According to the resolution switching control of example 4 described above, the resolution of the event detection sensor 10 can be set to a mode suitable for the relative speed and the number of events, based on the relative speed between the host vehicle and the preceding vehicle and the number of events (the number of occurrences) detected by the event detection sensor 10.
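Fig. 20 combines the conditions of examples 2 and 3, again with both conditions required to leave the current mode. Note that, unlike example 3, high relative speed and many events select the high resolution mode here. An illustrative sketch (thresholds and names assumed):

```python
def update_mode_example4(mode, relative_speed, n_events,
                         rel_th=20.0, event_th=1000):
    """Fig. 20 switching rule (steps S43/S44 and S47/S48).

    High relative speed together with many events selects the HIGH
    resolution mode, trading power for detection accuracy; the reverse
    pair of conditions restores the low resolution mode.
    """
    if relative_speed >= rel_th and n_events >= event_th:
        return "high"
    if relative_speed < rel_th and n_events < event_th:
        return "low"
    return mode          # mixed conditions: keep the current mode
```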

< example 5>

Example 5 is an example of switching the resolution of the event detection sensor 10 when the occurrence of congestion is detected. A flow of resolution switching control according to example 5 is shown in the flowchart of fig. 21.

Here, it is assumed that the host vehicle is traveling in a state where the event detection sensor 10 is in the low resolution mode (step S51). The control unit 50 determines whether the host vehicle is traveling at a low speed at which the vehicle speed of the host vehicle is less than a predetermined threshold value during traveling in the low resolution mode (step S52), then determines whether the number of events (the number of occurrences) is greater than or equal to the predetermined threshold value (step S53), and then determines whether the area of the vehicle occupying the angle of view (imageable range) of the event detection sensor 10 is greater than or equal to a certain ratio based on the recognition result of the object recognition unit 40 in fig. 1 (step S54).

The vehicle speed of the own vehicle, the number of events detected by the event detection sensor 10, and the area of the vehicle occupying the angle of view are parameters for detecting the occurrence of congestion. For example, during congestion, the number of vehicles around the host vehicle increases, and the number of events detected by the event detection sensor 10 increases accordingly. During low-speed travel in which the vehicle speed of the host vehicle is less than the threshold value, if the number of events is greater than or equal to the threshold value and the area of the vehicle occupying the angle of view is greater than or equal to a certain ratio, the control unit 50 detects that congestion has occurred (step S55), and since safe driving is generally required during congestion, the control unit 50 switches from the low-resolution mode to the high-resolution mode in which event detection can be performed with high accuracy (step S56).

Next, the control unit 50 determines whether the congestion has been resolved (step S57). The determination process is the reverse of steps S52, S53, and S54 for detecting the occurrence of congestion. That is, the control unit 50 may determine that the congestion has been resolved on the condition that the vehicle speed of the host vehicle is greater than or equal to the threshold value, the number of events is less than the threshold value, and the area of the vehicle occupying the angle of view is less than a certain ratio.

In the case where it is determined that the congestion has been resolved, the control unit 50 returns to step S51, switches from the high resolution mode to the low resolution mode, and repeats the series of processes described above. The low resolution mode is set, whereby the power consumption of the event detection sensor 10 can be reduced. Further, the control unit 50 monitors the stop of the vehicle (step S58), and in the case where it is determined that the vehicle is stopped, the control unit 50 ends the above-described series of processes when congestion is detected.

According to the resolution switching control of example 5 described above, when the occurrence of congestion is detected, by switching from the low resolution mode to the high resolution mode (in the high resolution mode, event detection can be performed with high accuracy), the occurrence of an event can be detected more accurately, so that safe driving in congestion can be facilitated.
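The congestion test of steps S52 to S54 is a conjunction of three comparisons. A minimal sketch, with all thresholds assumed for illustration (the disclosure specifies only "a predetermined threshold" and "a certain ratio"):

```python
def congestion_detected(speed, n_events, vehicle_area_ratio,
                        speed_th=20.0, event_th=1000, area_th=0.5):
    """Congestion test of fig. 21, steps S52-S54.

    Low own-vehicle speed combined with many detected events and a large
    share of the angle of view occupied by vehicles is taken as congestion;
    the sensor is then switched to the high resolution mode (step S56).
    """
    return (speed < speed_th
            and n_events >= event_th
            and vehicle_area_ratio >= area_th)
```

The reverse conjunction (step S57) would detect that congestion has been resolved and return the sensor to the low resolution mode.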

< example 6>

Example 6 is an example of switching the resolution of the event detection sensor 10 when detecting highway travel. The flow of resolution switching control according to example 6 is shown in the flowchart of fig. 22.

Here, it is assumed that the host vehicle is traveling in a state where the event detection sensor 10 is in the low resolution mode (step S61). The control unit 50 determines whether the host vehicle is traveling at a high speed at which the vehicle speed is greater than or equal to a predetermined threshold value during traveling in the low resolution mode (step S62), then determines whether the number of events (the number of occurrences) is greater than or equal to a predetermined threshold value (step S63), and then determines whether the area of the vehicle occupying the angle of view is greater than or equal to a certain ratio based on the recognition result of the object recognition unit 40 in fig. 1 (step S64).

The vehicle speed of the own vehicle, the number of events detected by the event detection sensor 10, and the area of the vehicle occupying the angle of view are parameters for detecting highway travel. For example, during high-speed travel, the number of events detected by the event detection sensor 10 increases. During high-speed travel at which the vehicle speed of the host vehicle is greater than or equal to the predetermined threshold value, if the number of events is greater than or equal to the threshold value and the area of the vehicle occupying the angle of view is greater than or equal to a certain ratio, the control unit 50 detects that the host vehicle is traveling on an expressway (step S65), and since safe driving is generally required during high-speed travel, the control unit 50 switches from the low resolution mode to the high resolution mode in which event detection can be performed with high accuracy (step S66).

Next, the control unit 50 determines whether highway travel has been completed in the running state in the high resolution mode (step S67). The determination process is the reverse of steps S62, S63, and S64 for detecting highway travel. That is, the control unit 50 may determine that highway travel has been completed on the condition that the vehicle speed of the host vehicle is less than the threshold value, the number of events is less than the threshold value, and the area of the vehicle occupying the angle of view is less than a certain ratio.

In the case where it is determined that the highway travel has been completed, the control unit 50 returns to step S61, switches from the high resolution mode to the low resolution mode, and repeats the series of processes described above. The low resolution mode is set, whereby the power consumption of the event detection sensor 10 can be reduced. Further, the control unit 50 monitors the stop of the vehicle (step S68), and in the case where it is determined that the vehicle is stopped, the control unit 50 ends the above-described series of processes when the highway travel is detected.

According to the resolution switching control of example 6 described above, when highway travel is detected, the control unit switches from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy. The occurrence of an event can thus be detected more accurately, facilitating safe driving on a highway.
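The decision logic of example 6 can be sketched as a small hysteresis function: the mode is raised only when all three parameters reach their thresholds (steps S62 to S64), and lowered only when all three fall below them (step S67). This is a minimal illustration; the function name, the mode strings, and all threshold values are assumptions made for the sketch, not values from the source.

```python
# Illustrative thresholds; the actual values are not specified in the source.
SPEED_THRESHOLD_KMH = 80.0     # assumed vehicle-speed threshold
EVENT_COUNT_THRESHOLD = 5000   # assumed event-count threshold
VEHICLE_AREA_RATIO = 0.3       # assumed ratio of the angle of view

def select_resolution_mode(current_mode, speed_kmh, event_count, area_ratio):
    """Return the next resolution mode ('low' or 'high') of the sensor."""
    if current_mode == "low":
        # Steps S62-S65: all three conditions must hold to detect highway travel.
        if (speed_kmh >= SPEED_THRESHOLD_KMH
                and event_count >= EVENT_COUNT_THRESHOLD
                and area_ratio >= VEHICLE_AREA_RATIO):
            return "high"  # step S66: event detection with high accuracy
        return "low"
    # Step S67: the reverse condition ends highway travel.
    if (speed_kmh < SPEED_THRESHOLD_KMH
            and event_count < EVENT_COUNT_THRESHOLD
            and area_ratio < VEHICLE_AREA_RATIO):
        return "low"       # back to step S61: lower power consumption
    return "high"
```

Requiring all three conditions in both directions gives the switching a hysteresis, so the mode does not oscillate when one parameter hovers near its threshold.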

< example 7>

Example 7 is an example of switching the resolution of the event detection sensor 10 when a change in the route of the host vehicle is detected. A flow of resolution switching control according to example 7 is shown in the flowchart of fig. 23.

Here, it is assumed that the host vehicle is traveling straight (step S71). The control unit 50 determines whether a route change of a right turn or a left turn is performed during the straight traveling (step S72), and sets the resolution of the event detection sensor 10 to the high resolution mode when it is determined that a route change of a right turn or a left turn is performed (step S73). That is, during a route change of a right turn or a left turn, the change in the scenery around the host vehicle is large, so the high resolution mode, in which event detection can be performed with high accuracy, is set. The specific process of determining the route change in step S72 will be described below.

Next, the control unit 50 determines whether the host vehicle is traveling straight (step S74). When it is determined that the vehicle is traveling straight, the change in the surrounding scenery is smaller than during a route change, so the control unit 50 switches from the high resolution mode to the low resolution mode (step S75). Setting the low resolution mode reduces the power consumption of the event detection sensor 10. The specific process of determining straight traveling in step S74 will be described below.

Next, the control unit 50 monitors the stop of the vehicle (step S68), and if the vehicle is not stopped, returns to step S71 to resume straight traveling and repeats the series of processes described above. In the case where it is determined that the vehicle is stopped, the control unit 50 ends the above-described series of processes for detecting a change in the route of the host vehicle.

According to the resolution switching control of example 7 described above, when a route change is detected, the control unit switches from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy. The occurrence of an event can thus be detected more accurately, facilitating safe driving during the route change.

Subsequently, examples of specific processes for the determination of the route change in step S72 and the determination of straight traveling in step S74 will be described.

An example of a specific process of determining a route change is shown in the flowchart of fig. 24A, and an example of a specific process of determining straight traveling is shown in the flowchart of fig. 24B. In either determination process, one of various types of information provided from the vehicle control system 7000 shown in fig. 34 described below, for example, information about the steering angle of the steering wheel, may be used.

In the route change determination process of fig. 24A, the control unit 50 determines whether the steering wheel has been rotated by an angle greater than or equal to a certain angle based on the information about the steering angle (step S721), and then determines whether the number of events (the number of occurrences) detected by the event detection sensor 10 is greater than or equal to a predetermined threshold (step S722). When the steering rotation is greater than or equal to the certain angle and the number of events is greater than or equal to the threshold value, the control unit 50 detects that a right turn or a left turn has been made (step S723), and then returns to the flow of fig. 23 and proceeds to step S73.

In the straight traveling determination process of fig. 24B, the control unit 50 determines whether the steering is within a certain angle (step S741), and then determines whether the number of events (the number of occurrences) detected by the event detection sensor 10 is less than a predetermined threshold (step S742). When the steering is within the certain angle and the number of events is less than the threshold value, the control unit 50 detects that the vehicle is traveling straight (step S743), and then returns to the flow of fig. 23 and proceeds to step S75.

Note that, in this example, information about the steering angle of the steering wheel is used to detect a route change, but this is not a limitation; for example, the acceleration of the steering wheel may be detected and the acceleration information used.
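The two determination processes of fig. 24A and fig. 24B can be sketched as a pair of predicates. The angle and event-count thresholds below are illustrative assumptions, not values from the source.

```python
STEERING_ANGLE_DEG = 30.0   # assumed "certain angle" of steering rotation
EVENT_THRESHOLD = 2000      # assumed event-count threshold

def is_route_change(steering_angle_deg, event_count):
    """Fig. 24A (steps S721-S723): a right or left turn is detected when the
    steering rotation and the number of events both reach their thresholds."""
    return (abs(steering_angle_deg) >= STEERING_ANGLE_DEG
            and event_count >= EVENT_THRESHOLD)

def is_straight_travel(steering_angle_deg, event_count):
    """Fig. 24B (steps S741-S743): straight travel is detected when the
    steering stays within the angle and the event count stays below it."""
    return (abs(steering_angle_deg) < STEERING_ANGLE_DEG
            and event_count < EVENT_THRESHOLD)
```

Note that the two predicates are not complements of each other: while the steering is within the angle but the event count is still high, neither holds, and the current mode is simply kept.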

< example 8>

Example 8 is an example of switching the resolution of the event detection sensor 10 for each area of the pixel array unit 12. A flow of resolution switching control according to example 8 is shown in the flowchart of fig. 25.

Here, it is assumed that the host vehicle is traveling in a state where the event detection sensor 10 is in the high resolution mode (step S81). In the high resolution mode, the control unit 50 divides the pixel array unit 12 into a plurality of areas in units of a predetermined number of pixels, and calculates the number of events (the number of detected events) detected by the event detection sensor 10 for each area subjected to ROI extraction (step S82).

Next, the control unit 50 determines whether there is an area in which the number of events calculated for each area exceeds a predetermined threshold (step S83), and if there is such an area, the control unit 50 specifies the area in which the number of events exceeds the threshold (step S84). For example, the area may be specified based on information such as the addresses of the pixels 11 forming the area.

Next, the control unit 50 sets the low resolution mode for the area specified as the area in which the number of events exceeds the threshold value (step S85). Thereafter, the control unit 50 determines whether the number of events detected by the event detection sensor 10 in the area where the low resolution mode is set is less than a predetermined threshold (step S86), and if the number of events is less than the threshold, proceeds to step S81 and returns to the high resolution mode.

If the number of events is not less than the threshold value, the control unit 50 monitors the stop of the vehicle in a state in which the low resolution mode is set for the area specified as the area in which the number of events exceeds the threshold value (step S87), and in the case where it is determined that the vehicle is stopped, ends the above-described series of processes for switching the resolution of each area of the pixel array unit 12.

The technique of example 8 described above, i.e., switching the resolution of the event detection sensor 10 for each area of the pixel array unit 12, can also be applied to the resolution switching control according to examples 1 to 7 described above.
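The per-area processing of steps S82 to S85 can be sketched as follows: count events in fixed-size areas of the pixel array, then assign the low resolution mode to any area whose count exceeds the threshold. The helper names and the tile size are assumptions made for the sketch, not from the source.

```python
def area_event_counts(event_map, area_size):
    """Step S82: divide a per-pixel event-count map (a list of rows) into
    square areas of area_size x area_size pixels and sum the events per area."""
    h, w = len(event_map), len(event_map[0])
    counts = []
    for r0 in range(0, h - h % area_size, area_size):
        row = []
        for c0 in range(0, w - w % area_size, area_size):
            row.append(sum(event_map[r][c]
                           for r in range(r0, r0 + area_size)
                           for c in range(c0, c0 + area_size)))
        counts.append(row)
    return counts

def area_modes(counts, threshold):
    """Steps S83-S85: areas whose event count exceeds the threshold are set
    to the low resolution mode; the remaining areas keep high resolution."""
    return [["low" if n > threshold else "high" for n in row]
            for row in counts]
```

In this sketch, the area addresses of step S84 are implicit in the row and column indices of the returned mode map.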

In the imaging system 1A according to the first embodiment described above, the identification of another vehicle traveling ahead in example 4 and the determination of the area of the vehicle occupying the angle of view in examples 5 and 6 are performed based on the output of the event detection sensor 10, but this is not a limitation. The identification of other vehicles and the determination of the area of the vehicle occupying the angle of view may also be performed based on the output of the synchronous imaging device provided in the imaging system 1B according to the second embodiment described later.

< second embodiment of the present disclosure >

< configuration example of imaging system according to second embodiment >

Fig. 26 is a block diagram showing an example of a system configuration of an imaging system according to a second embodiment of the present disclosure.

As shown in fig. 26, an imaging system 1B according to the second embodiment of the present disclosure includes an event detection sensor 10, an image sensor 20, a motion recognition unit 30, an object recognition unit 40, a control unit 50, an operation mode definition unit 60, and a recording unit 70.

The functions of the event detection sensor 10, the motion recognition unit 30, the object recognition unit 40, the control unit 50, the operation mode definition unit 60, and the recording unit 70 are as described in the imaging system 1A according to the first embodiment. The imaging system 1B according to the second embodiment can also be mounted and used on a mobile body such as a vehicle, similarly to the imaging system 1A according to the first embodiment.

< example of configuration of image sensor >

A basic configuration of the image sensor 20 in the imaging system 1B according to the second embodiment will be described. Here, as the image sensor 20, a CMOS image sensor, which is a kind of XY address type image sensor, will be described as an example. A CMOS image sensor is an image sensor manufactured by applying or partially using a CMOS process. However, the image sensor 20 is not limited to the CMOS image sensor.

[ configuration example of CMOS image sensor ]

Fig. 27 is a block diagram showing an outline of the configuration of a CMOS image sensor as an example of the image sensor 20 in the imaging system 1B according to the second embodiment.

The image sensor 20 according to this example includes a pixel array unit 22, in which pixels 21 including light receiving units (photoelectric conversion units) are arranged two-dimensionally in a matrix in the row direction and the column direction, and peripheral circuit units of the pixel array unit 22. Here, the row direction refers to the arrangement direction of the pixels 21 in a pixel row, and the column direction refers to the arrangement direction of the pixels 21 in a pixel column. The pixel 21 generates and accumulates photocharge corresponding to the amount of received light by performing photoelectric conversion.

The image sensor 20 according to the present example is an RGB sensor in which, for example, red (R), green (G), and blue (B) color filters are included in each pixel 21 of a pixel array unit 22. However, the image sensor 20 is not limited to the RGB sensor.

For example, the peripheral circuit units of the pixel array unit 22 include a row selection unit 23, a constant current source unit 24, an analog-to-digital conversion unit 25, a horizontal transfer scanning unit 26, a signal processing unit 27, a timing control unit 28, and the like.

In the pixel array unit 22, pixel driving lines 31₁ to 31ₘ (hereinafter, may be collectively referred to as "pixel driving line 31") are wired in the row direction for each pixel row of the matrix-like pixel array. In addition, vertical signal lines 32₁ to 32ₙ (hereinafter, may be collectively referred to as "vertical signal line 32") are wired in the column direction for each pixel column. The pixel driving line 31 transmits a driving signal for performing driving during reading of a signal from each pixel 21. In fig. 27, the pixel driving line 31 is illustrated as one wiring, but the number of wirings is not limited to one. One end of the pixel driving line 31 is connected to an output terminal of the row selecting unit 23 corresponding to each row.

Next, a description will be given of each circuit unit of the peripheral circuit units of the pixel array unit 22, that is, the row selecting unit 23, the constant current source unit 24, the analog-to-digital converting unit 25, the horizontal transfer scanning unit 26, the signal processing unit 27, and the timing control unit 28.

The row selection unit 23 includes a shift register, an address decoder, and the like, and controls scanning of pixel rows and addressing of the pixel rows when each pixel 21 of the pixel array unit 22 is selected. Although a specific configuration of the row selection unit 23 is not shown, it generally includes two scanning systems: a read scanning system and a sweep scanning system.

The read scanning system sequentially performs selective scanning of the pixels 21 of the pixel array unit 22 in units of rows to read pixel signals from the pixels 21. The pixel signal read from the pixel 21 is an analog signal. The sweep scanning system performs sweep scanning on a read row, on which the read scanning system will perform read scanning, earlier than the read scanning by a time corresponding to the shutter speed.

By the sweep scanning of the sweep scanning system, the unnecessary electric charges are swept from the light receiving units (photoelectric conversion units) of the pixels 21 of the read row, whereby the light receiving units are reset. Then, a so-called electronic shutter operation is performed by sweeping (resetting) the unnecessary electric charges with the sweep scanning system. Here, the electronic shutter operation refers to an operation of discarding the photo-charges of the light receiving unit and starting a new exposure (starting accumulation of the photo-charges).

The constant current source unit 24 includes a plurality of current sources I (see fig. 18), each including, for example, a MOS transistor, connected to the vertical signal lines 32₁ to 32ₙ for each pixel column, and supplies a bias current through each of the vertical signal lines 32₁ to 32ₙ to each pixel 21 of the pixel row selectively scanned by the row selection unit 23.

The analog-to-digital conversion unit 25 includes a group of a plurality of analog-to-digital converters provided corresponding to the pixel columns of the pixel array unit 22 (e.g., one for each pixel column). The analog-to-digital conversion unit 25 is a column-parallel type analog-to-digital conversion unit that converts the analog pixel signal output from each pixel 21 through the vertical signal lines 32₁ to 32ₙ of each pixel column into a digital signal.

For example, a single-slope type analog-to-digital converter, which is an example of a reference-signal comparison type analog-to-digital converter, may be used as the analog-to-digital converter in the column-parallel analog-to-digital conversion unit 25. However, the analog-to-digital converter is not limited to the single-slope type, and a sequential comparison (successive approximation) type analog-to-digital converter, a delta-sigma modulation type analog-to-digital converter, or the like may also be used.

As for the example of the analog-to-digital converter in the column-parallel analog-to-digital conversion unit 25, the same applies to the analog-to-digital converter in the analog-to-digital conversion unit constituting the column processing unit 15 (see fig. 3) of the above-described event detection sensor 10.

The horizontal transfer scanning unit 26 includes a shift register, an address decoder, and the like, and controls scanning of pixel columns and addressing of the pixel columns when reading a signal of each pixel 21 of the pixel array unit 22. The pixel signals converted into digital signals by the analog-to-digital conversion unit 25 are read to the horizontal transfer lines (horizontal output lines) 29 in units of pixel columns under the control of the horizontal transfer scanning unit 26.

The signal processing unit 27 performs predetermined signal processing on the digital pixel signals supplied through the horizontal transfer line 29 to generate two-dimensional image data. For example, the signal processing unit 27 corrects vertical line defects and point defects, clamps signals, or performs digital signal processing such as parallel-to-serial conversion, compression, encoding, addition, averaging, and intermittent operations. The signal processing unit 27 outputs the generated image data to a subsequent apparatus as an output signal of the image sensor 20.

The timing control unit 28 generates various timing signals, clock signals, control signals, and the like based on a vertical synchronization signal VD, a horizontal synchronization signal HD, and further a master clock MCK (not shown) and the like supplied from the outside. Then, the timing control unit 28 performs drive control of the row selecting unit 23, the constant current source unit 24, the analog-to-digital converting unit 25, the horizontal transfer scanning unit 26, the signal processing unit 27, and the like based on these generated signals.

Under the control of the timing control unit 28, the image sensor 20 performs imaging in synchronization with a synchronization signal (for example, a vertical synchronization signal VD). That is, the image sensor 20 is a synchronous imaging device that performs imaging at a predetermined frame rate.

[ example of Circuit configuration of Pixel ]

Fig. 28 is a circuit diagram showing an example of the circuit configuration of the pixels 21 of the pixel array unit 22 in the image sensor 20.

For example, the pixel 21 includes a photodiode 211 as a light receiving unit (photoelectric conversion unit). The pixel 21 includes, in addition to the photodiode 211, a transfer transistor 212, a reset transistor 213, an amplification transistor 214, and a selection transistor 215.

Note that, here, for example, N-type MOS transistors are used as the four transistors of the transfer transistor 212, the reset transistor 213, the amplification transistor 214, and the selection transistor 215, but combinations of conductivity types of the four transistors 212 to 215 illustrated here are only examples, and are not limited to combinations of these.

For the pixels 21, a plurality of pixel driving lines are wired, as the above-described pixel driving line 31, in common to the pixels 21 in the same pixel row. The plurality of pixel driving lines are connected, in units of pixel rows, to output terminals of the row selecting unit 23 corresponding to each pixel row. The row selecting unit 23 appropriately outputs a transfer signal TRG, a reset signal RST, and a selection signal SEL to the plurality of pixel driving lines.

In the photodiode 211, an anode electrode is connected to a low-potential-side power supply (e.g., ground), and received light is photoelectrically converted into photocharges (here, photoelectrons) having an electric charge amount corresponding to the light amount, and the photocharges are accumulated. A cathode electrode of the photodiode 211 is electrically connected to a gate electrode of the amplifying transistor 214 via the transfer transistor 212. Here, a region to which the gate electrode of the amplification transistor 214 is electrically connected is a floating diffusion (floating diffusion region/impurity diffusion region) FD. The floating diffusion FD is a charge-voltage conversion unit that converts charges into voltages.

A transfer signal TRG that becomes active at a high level (e.g., V_DD level) is supplied from the row selecting unit 23 to the gate electrode of the transfer transistor 212. When turned on in response to the transfer signal TRG, the transfer transistor 212 transfers the photocharge, which has been photoelectrically converted by the photodiode 211 and accumulated in the photodiode 211, to the floating diffusion FD.

The reset transistor 213 is connected between the power supply line of the power supply voltage V_DD and the floating diffusion FD. A reset signal RST that becomes active at a high level is supplied from the row selecting unit 23 to the gate electrode of the reset transistor 213. The reset transistor 213 is turned on in response to the reset signal RST, and resets the floating diffusion FD by discharging the charge of the floating diffusion FD to the node of the power supply voltage V_DD.

In the amplifying transistor 214, the gate electrode is connected to the floating diffusion FD, and the drain electrode is connected to the power supply line of the power supply voltage V_DD. The amplifying transistor 214 serves as an input unit of a source follower that reads a signal obtained by photoelectric conversion in the photodiode 211. The source electrode of the amplifying transistor 214 is connected to the vertical signal line 32 via the selection transistor 215. The amplifying transistor 214 and the current source I connected to one end of the vertical signal line 32 constitute a source follower that converts the voltage of the floating diffusion FD into the potential of the vertical signal line 32.

In the selection transistor 215, the drain electrode is connected to the source electrode of the amplification transistor 214, and the source electrode is connected to the vertical signal line 32. A selection signal SEL active at a high level is supplied from the row selecting unit 23 to the gate electrode of the selection transistor 215. The selection transistor 215 is turned on in response to a selection signal SEL, thereby selecting the pixel 21 to transmit the signal output from the amplification transistor 214 to the vertical signal line 32.

Note that, here, a 4Tr configuration including the transfer transistor 212, the reset transistor 213, the amplifying transistor 214, and the selection transistor 215, that is, including four transistors (Tr), is exemplified as the pixel circuit of the pixel 21, but this is not a limitation. For example, a 3Tr configuration in which the selection transistor 215 is omitted and its function is given to the amplifying transistor 214 may be adopted, or a configuration of 5Tr or more, in which the number of transistors is increased, may be adopted as necessary.

[ configuration example of chip Structure ]

Examples of the chip (semiconductor integrated circuit) structure of the image sensor 20 having the above-described configuration include a horizontally mounted chip structure and a stacked chip structure. In either of these chip structures, when the substrate surface on the side where the wiring layer is arranged is defined as the front surface of the pixel 21, a front-illuminated pixel structure that captures light incident from the front surface side may be adopted, or a back-illuminated pixel structure that captures light incident from the rear surface side (the side opposite to the front surface) may be adopted. The horizontally mounted chip structure and the stacked chip structure will be described below.

(chip Structure for horizontal mounting)

Fig. 29 is a plan view showing the outline of the horizontally mounted chip structure of the image sensor 20.

As shown in fig. 29, the horizontally mounted chip structure (so-called horizontal mounting structure) has a structure in which the circuit portion surrounding the pixel array unit 22 is formed on the same semiconductor substrate 201 as the pixel array unit 22, in which the pixels 21 are arranged in a matrix. Specifically, the row selection unit 23, the constant current source unit 24, the analog-to-digital conversion unit 25, the horizontal transfer scanning unit 26, the signal processing unit 27, the timing control unit 28, and the like are formed on the same semiconductor substrate 201 as the pixel array unit 22.

(stacked chip Structure)

Fig. 30 is an exploded perspective view showing the outline of the stacked-chip structure of the image sensor 20.

As shown in fig. 30, the stacked chip structure (so-called stacked structure) has a structure in which at least two semiconductor substrates, a first semiconductor substrate 202 and a second semiconductor substrate 203, are stacked. In this stacked structure, the pixel array unit 22 is formed on the first semiconductor substrate 202 of the first layer. Further, the circuit portions such as the row selection unit 23, the constant current source unit 24, the analog-to-digital conversion unit 25, the horizontal transfer scanning unit 26, the signal processing unit 27, and the timing control unit 28 are formed on the second semiconductor substrate 203 of the second layer. Then, the first semiconductor substrate 202 of the first layer and the second semiconductor substrate 203 of the second layer are electrically connected together through connection portions 33A and 33B (such as vias and Cu-Cu bonding).

According to the image sensor 20 having the stacked structure, a process suitable for manufacturing the pixels 21 can be applied to the first semiconductor substrate 202 of the first layer, and a process suitable for manufacturing the circuit portion can be applied to the second semiconductor substrate 203 of the second layer, so that the process can be optimized when manufacturing the image sensor 20. In particular, an advanced process can be applied to manufacture the circuit portion.

Note that, here, a stacked structure having a two-layer structure in which the first semiconductor substrate 202 and the second semiconductor substrate 203 are stacked is illustrated, but the stacked structure is not limited to the two-layer structure and may have three or more layers. In the case of a stacked structure of three or more layers, the circuit portions such as the row selection unit 23, the constant current source unit 24, the analog-to-digital conversion unit 25, the horizontal transfer scanning unit 26, and the signal processing unit 27 may be formed dispersedly on the semiconductor substrates of the second and subsequent layers.

In the imaging system 1B according to the second embodiment having the above-described configuration, the event detection sensor 10 and the image sensor 20 perform the event detection operation and the imaging operation, respectively, under the control of the control unit 50. The event signal (event data) output from the event detection sensor 10 and the image data output from the image sensor 20 are supplied to the motion recognition unit 30.

The motion recognition unit 30 recognizes (detects) the motion of the object based on the event signal output from the event detection sensor 10. More specifically, the motion recognition unit 30 generates event frames by framing event signals output from the event detection sensor 10, and performs motion detection between the event frames. In the case where object recognition of an event is performed using an event signal output from the event detection sensor 10, the object recognition unit 40 performs object recognition based on the result of motion detection given by the motion recognition unit 30.

The image sensor 20 includes a synchronous imaging device and performs imaging at a predetermined frame rate (e.g., a fixed frame rate), so it is not necessary to generate frames as in the case of the event detection sensor 10. Accordingly, the image data output from the image sensor 20 in units of frames is directly supplied to the object recognition unit 40. The object recognition unit 40 then performs object recognition in units of frames based on the image data.
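The framing performed by the motion recognition unit 30 on the asynchronous event stream can be sketched as grouping events into frames of fixed duration, after which frame-to-frame motion detection can run. The tuple layout and the frame period below are assumptions made for the sketch, not from the source.

```python
def frame_events(events, frame_period_us):
    """Accumulate asynchronous events, each (timestamp_us, x, y, polarity),
    into event frames covering consecutive intervals of frame_period_us."""
    frames = {}
    for t, x, y, p in events:
        frames.setdefault(t // frame_period_us, []).append((x, y, p))
    # Return the frames in temporal order; empty intervals are simply absent.
    return [frames[k] for k in sorted(frames)]
```

Each returned frame holds the pixel coordinates and polarities of the events that occurred in its interval, which is the form a frame-based motion detector can consume.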

Incidentally, since the event detection sensor 10 including the asynchronous imaging device has a pixel configuration including the event detection unit 63, the pixel size is inevitably larger than that of the image sensor 20 including the synchronous imaging device. Therefore, the event detection sensor 10 has a lower resolution than the image sensor 20, which performs imaging at a fixed frame rate. Conversely, the image sensor 20 including the synchronous imaging device is superior to the asynchronous imaging device in terms of resolution.

Hereinafter, specific examples will be described in which, in the imaging system 1B according to the second embodiment, the resolution of the event detection sensor 10 is switched according to the traveling state of the vehicle so that an event can be accurately detected regardless of the traveling state of the vehicle. In each example described below, the resolution switching is performed under the control of the control unit 50 of fig. 26. In this control, various types of information, such as the vehicle speed of the host vehicle, are provided from the vehicle control system 7000 shown in fig. 34 to the control unit 50 via the interface 80.

< example 9>

Example 9 is an example of switching the resolution of the event detection sensor 10 when the occurrence of congestion is detected. A flow of resolution switching control according to example 9 is shown in the flowchart of fig. 31.

Here, it is assumed that the host vehicle is traveling in a state where the event detection sensor 10 is in the low resolution mode (step S91). The control unit 50 determines whether the host vehicle is traveling at a low speed (the vehicle speed of the host vehicle is less than a predetermined threshold) during traveling in the low resolution mode (step S92), and then determines whether an object (e.g., a vehicle) can be detected only by information of the event detection sensor 10 including the asynchronous imaging device (step S93).

If the vehicle can be identified only with the information of the event detection sensor 10, the control unit 50 detects the vehicle using the information of the event detection sensor 10 (e.g., event data indicating the occurrence of an event) (step S94), and then specifies an area within the angle of view that can be detected as the vehicle (step S95). If the vehicle cannot be detected only with the information of the event detection sensor 10, the control unit 50 detects the vehicle using the information of both the event detection sensor 10 and the image sensor 20 (step S96), and then proceeds to the process of step S95.

Next, the control unit 50 determines whether the area that can be detected as the vehicle is greater than or equal to a predetermined threshold (step S97), and if it is less than the predetermined threshold, returns to the process of step S95. If it is greater than or equal to the predetermined threshold, the control unit 50 detects that congestion has occurred (step S98), and since safe driving is generally required during congestion, the control unit 50 switches, for the area specified in step S95, from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy (step S99).

Next, the control unit 50 determines whether the congestion has been resolved (step S100). This determination process is the reverse of the process of detecting the occurrence of congestion. That is, the control unit 50 may determine that the congestion has been resolved on the condition that the vehicle speed of the host vehicle is greater than or equal to the threshold value and the area that can be detected as the vehicle is less than the predetermined threshold value.

In the case where it is determined that the congestion has been resolved, the control unit 50 returns to step S91, switches from the high resolution mode to the low resolution mode, and repeats the series of processes described above. Setting the low resolution mode reduces the power consumption of the event detection sensor 10. Further, the control unit 50 monitors the stop of the vehicle (step S101), and in the case where it is determined that the vehicle is stopped, the control unit 50 ends the above-described series of processes for detecting congestion.

According to the resolution switching control of example 9 described above, the occurrence of congestion can be accurately detected by using the image data of the image sensor 20, whose resolution is superior to that of the event detection sensor 10. Then, when the occurrence of congestion is detected, switching from the low resolution mode to the high resolution mode, in which the occurrence of an event is easily detected, allows the occurrence of an event to be detected more accurately, so that safe driving in congestion can be facilitated.
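The congestion test of example 9 and its reverse (step S100) reduce to two conditions on the vehicle speed and on the share of the angle of view detectable as vehicles. A minimal sketch follows, with illustrative threshold values that are not taken from the source.

```python
SPEED_LOW_KMH = 10.0   # assumed low-speed threshold (step S92)
AREA_THRESHOLD = 0.5   # assumed vehicle-area threshold (step S97)

def detect_congestion(speed_kmh, vehicle_area_ratio):
    """Steps S92 and S97-S98: congestion occurs when the host vehicle is slow
    and the area detectable as vehicles reaches the threshold."""
    return speed_kmh < SPEED_LOW_KMH and vehicle_area_ratio >= AREA_THRESHOLD

def congestion_resolved(speed_kmh, vehicle_area_ratio):
    """Step S100: the reverse condition resolves the congestion."""
    return speed_kmh >= SPEED_LOW_KMH and vehicle_area_ratio < AREA_THRESHOLD
```

As in example 6, requiring both conditions in each direction keeps the mode stable while only one of the two parameters crosses its threshold.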

< example 10>

Example 10 is an example of switching the resolution of the event detection sensor 10 when detecting highway travel. The flow of resolution switching control according to example 10 is shown in the flowchart of fig. 32.

Here, it is assumed that the host vehicle travels with the event detection sensor 10 in the low resolution mode (step S111). During traveling in the low resolution mode, the control unit 50 determines whether the host vehicle is traveling at high speed (that is, whether the vehicle speed of the host vehicle is greater than or equal to a predetermined threshold) (step S112), and then determines whether an object (e.g., a vehicle) can be detected from the information of the event detection sensor 10 alone (step S113).

If the vehicle can be identified from the information of the event detection sensor 10 alone, the control unit 50 detects the vehicle using the information of the event detection sensor 10 (e.g., event data indicating the occurrence of an event) (step S114), and then specifies an area within the angle of view detectable as a vehicle (step S115). If the vehicle cannot be detected from the information of the event detection sensor 10 alone, the control unit 50 detects the vehicle using the information of both the event detection sensor 10 and the image sensor 20 (step S116), and then proceeds to the process of step S115.
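The fallback of steps S113, S114, and S116 can be sketched as follows. The recognizer functions are stand-ins invented for this illustration: the patent does not fix a recognition algorithm, so the data shapes and the rule for "the event data alone is sufficient" are assumptions.

```python
def detect_from_events(event_data):
    """Stand-in recognizer for the event detection sensor 10 alone.

    Returns the regions recognized as vehicles, or an empty list when
    the event data is not sufficient to identify a vehicle.
    """
    return [e for e in event_data if e.get("kind") == "vehicle"]

def detect_vehicle_regions(event_data, image_data):
    """Detect vehicle regions, preferring the event data (steps S113/S114)
    and falling back to the image sensor 20 (step S116) when needed."""
    # Step S113: can the vehicle be detected from the event information only?
    regions = detect_from_events(event_data)
    if regions:
        return regions  # step S114: the event data alone was sufficient
    # Step S116: supplement with the image sensor's frame-based detections.
    return [r for r in image_data if r.get("kind") == "vehicle"]
```

Either branch then feeds step S115, which specifies the in-view area covered by the returned regions.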

Next, the control unit 50 determines whether the area detectable as a vehicle is greater than or equal to a predetermined threshold value (step S117), and if it is less than the predetermined threshold value, returns to the process of step S115. If it is greater than or equal to the predetermined threshold value, the control unit 50 determines that the vehicle is traveling on an expressway (step S118), and since safe driving is generally required during high-speed traveling, the control unit 50 switches the area specified in step S115 from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy (step S119).

Next, in the traveling state in the high resolution mode, the control unit 50 determines whether the highway travel has ended (step S120). This determination is the reverse of the determination for detecting highway travel. That is, the control unit 50 can determine that the highway travel has ended on the condition that the vehicle speed of the host vehicle is less than the threshold value and that the area detectable as a vehicle is less than the predetermined threshold value.

In the case where it is determined that the highway travel has ended, the control unit 50 returns to step S111, switches from the high resolution mode to the low resolution mode, and repeats the series of processes described above. Setting the low resolution mode reduces the power consumption of the event detection sensor 10. Further, the control unit 50 monitors whether the vehicle has stopped (step S121), and in the case where it is determined that the vehicle has stopped, ends the above-described series of processes for highway travel detection.

According to the resolution switching control of example 10 described above, highway travel can be detected accurately by using the image data of the image sensor 20, whose resolution is superior to that of the event detection sensor 10. Then, when highway travel is detected, switching from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy, allows events to be detected more accurately, thereby facilitating safe driving on the highway.

< example 11>

Example 11 is an example of switching the resolution of the event detection sensor 10 in the auto cruise mode in the case where the imaging system 1B according to the second embodiment is mounted on a vehicle having an auto cruise function. A flow of resolution switching control according to example 11 is shown in the flowchart of fig. 33.

The auto cruise mode is selected by the user (driver); when the user turns on (selects) the auto cruise mode, information indicating that the auto cruise mode is on is given to the control unit 50 from the vehicle control system 7000 shown in fig. 34.

The control unit 50 monitors the user's selection of the auto cruise mode (step S131), and sets the resolution of the event detection sensor 10 to the low resolution mode when it is determined that the auto cruise mode is on (step S132). Next, the control unit 50 specifies an area within the angle of view in which a vehicle can be detected (step S133), and switches the specified area from the low resolution mode to the high resolution mode, in which event detection can be performed with high accuracy (step S134).

Next, the control unit 50 specifies the vehicle traveling ahead of the host vehicle (step S135), and shifts the region other than the specified vehicle to the low resolution mode (step S136). Setting the low resolution mode reduces the power consumption of the event detection sensor 10.

Next, the control unit 50 calculates the relative speed between the host vehicle and the preceding vehicle (step S137), and then determines whether the relative speed is zero (step S138). If the relative speed is not zero, the control unit 50 changes the vehicle speed of the host vehicle (step S139) and then returns to step S138.

If the relative speed with the preceding vehicle is zero, the control unit 50 maintains the vehicle speed of the host vehicle (step S140), and then monitors whether the user turns off the auto cruise mode (step S141). If the auto cruise mode is to be continued, the control unit 50 returns to step S132 and repeats the series of processes described above. If the auto cruise mode is turned off, the control unit 50 monitors whether the vehicle has stopped (step S142), and in the case where it is determined that the vehicle has stopped, ends the series of processes described above.
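The speed-keeping loop of steps S137 to S140 can be sketched as below. The fixed speed-adjustment step size is an assumption made for illustration; an actual cruise controller would use the vehicle control system's own speed regulation.

```python
def cruise_speed(host_speed, preceding_speed, step=1.0):
    """Return the host vehicle speed after the auto cruise adjustment loop
    (steps S137 to S140) has brought the relative speed to zero."""
    # Steps S137/S138: while the relative speed is not zero, adjust speed.
    while host_speed != preceding_speed:
        relative = preceding_speed - host_speed
        # Step S139: move the host speed toward the preceding vehicle's speed.
        if abs(relative) <= step:
            host_speed = preceding_speed
        else:
            host_speed += step if relative > 0 else -step
    # Step S140: the relative speed is zero; maintain this speed.
    return host_speed
```

Whether the host vehicle starts slower or faster than the preceding vehicle, the loop converges to the preceding vehicle's speed, which is the condition under which step S140 holds the vehicle speed.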

As described above, according to the resolution switching control of example 11, even in the case where the imaging system 1B according to the second embodiment is mounted on a vehicle having an auto cruise function, the control of switching the resolution of the event detection sensor 10 according to the traveling state can be performed in the auto cruise mode.

< modification >

The technique according to the present disclosure has been described above based on the preferred embodiments, but the technique according to the present disclosure is not limited to these embodiments. The configurations and structures of the imaging apparatus and the imaging system described in the above embodiments are examples and may be changed. For example, in the above-described embodiments, the pixel signal generating unit 62 is provided for each light receiving unit 61 to form the pixel 11; however, a configuration may be adopted in which a pixel block is formed in units of a plurality of light receiving units 61, a pixel signal generating unit 62 is provided for each pixel block, and the pixel signal generating unit 62 is shared by the plurality of light receiving units 61 in the pixel block.

< application example of the technique according to the present disclosure >

The techniques according to the present disclosure may be applied to a variety of products. A more specific application example will be described below. The technology according to the present disclosure may be implemented as an imaging apparatus or an imaging system mounted on any type of moving body, for example, an automobile, an electric automobile, a hybrid electric automobile, a motorcycle, a bicycle, a personal mobile device, an airplane, an unmanned aerial vehicle, a ship, a robot, a construction machine, an agricultural machine (tractor), or the like.

< moving body >

Fig. 34 is a block diagram showing a schematic configuration example of a vehicle control system 7000, and the vehicle control system 7000 is an example of a mobile body control system to which the technique according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010. In the example shown in fig. 34, the vehicle control system 7000 includes a drive system control unit 7100, a vehicle body system control unit 7200, a battery control unit 7300, a vehicle external information detection unit 7400, a vehicle internal information detection unit 7500, and an integrated control unit 7600. For example, the communication network 7010 that connects these plural control units to each other may be an in-vehicle communication network conforming to any standard, such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), or FlexRay (registered trademark).

Each control unit includes a microcomputer that performs arithmetic processing according to various programs; a storage unit that stores the programs executed by the microcomputer, parameters used in various calculations, and the like; and a drive circuit that drives a device to be controlled. Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices, sensors, and the like inside and outside the vehicle via wired or wireless communication. Fig. 34 shows a microcomputer 7610, a general communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, a vehicle interior device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 as functional configurations of the integrated control unit 7600. Similarly, the other control units each include a microcomputer, a communication I/F, a storage unit, and the like.

The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device of a drive force generation device (e.g., an internal combustion engine or a drive motor) for generating a drive force of the vehicle, a drive force transmission mechanism for transmitting the drive force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a brake device for generating a braking force of the vehicle, and the like. The drive system control unit 7100 may include a function as a control device, such as an anti-lock brake system (ABS) or an Electronic Stability Control (ESC).

The drive system control unit 7100 is connected to a vehicle state detection unit 7110. For example, the vehicle state detection unit 7110 includes at least one of a gyro sensor that detects the angular velocity of axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, or a sensor for detecting an operation amount of an accelerator pedal, an operation amount of a brake pedal, a steering angle of a steering wheel, an engine speed, a wheel rotational speed, or the like. The drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls the internal combustion engine, the drive motor, an electric power steering apparatus, a brake apparatus, and the like.

The vehicle body system control unit 7200 controls the operation of various devices mounted to the vehicle body according to various programs. For example, the vehicle body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device of various lamps (e.g., headlights, tail lights, brake lights, turn signal lights, and fog lights). In this case, a radio wave or a signal of various switches transmitted from the portable device instead of the key may be input to the vehicle body system control unit 7200. The vehicle body system control unit 7200 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.

The battery control unit 7300 controls the secondary battery 7310 as a power source for driving the motor according to various programs. For example, information such as battery temperature, battery output voltage, and remaining battery capacity is input to battery control unit 7300 from a battery device including secondary battery 7310. Battery control unit 7300 uses these signals to perform arithmetic processing, and performs temperature adjustment control of secondary battery 7310 or control of a cooling apparatus or the like provided in the battery apparatus.

The vehicle external information detection unit 7400 detects information about the outside of the vehicle on which the vehicle control system 7000 is mounted. For example, at least one of the imaging unit 7410 or the vehicle external information detection unit 7420 is connected to the vehicle external information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (time of flight) camera, a stereo camera, a monocular camera, an infrared camera, or another camera. To the vehicle external information detection unit 7420, for example, at least one of an environment sensor for detecting the current climate or weather, or a peripheral information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle in which the vehicle control system 7000 is installed, is connected.

For example, the environment sensor may be at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, or a snow sensor that detects snowfall. The peripheral information detection sensor may be at least one of an ultrasonic sensor, a radar device, or a LIDAR (light detection and ranging, or laser imaging detection and ranging) device. The imaging unit 7410 and the vehicle external information detection unit 7420 may each be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated together.

Here, fig. 35 shows an example of the mounting positions of the imaging unit 7410 and the vehicle external information detection unit 7420. For example, the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at at least one of the following positions: the front nose, the side-view mirrors, the rear bumper, and the rear door of the vehicle 7900, and the upper portion of the windshield inside the vehicle. The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided on the upper portion of the windshield inside the vehicle mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided on the side-view mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided on the rear bumper or the rear door mainly acquires an image of the rear of the vehicle 7900. The imaging unit 7918 provided on the upper portion of the windshield inside the vehicle is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, and the like.

Note that fig. 35 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side-view mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the rear door. For example, the image data captured by the imaging units 7910, 7912, 7914, and 7916 are superimposed on each other, thereby obtaining an overhead image of the vehicle 7900 viewed from above.

For example, the vehicle external information detecting units 7920, 7922, 7924, 7926, 7928, and 7930 are provided at the front, rear, sides, and corners of the vehicle 7900 and at the upper portion of the windshield inside the vehicle, and may be ultrasonic sensors or radar devices. For example, the vehicle external information detecting units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the rear door, and the upper portion of the windshield inside the vehicle 7900 may be LIDAR devices. These vehicle external information detecting units 7920 to 7930 are mainly used to detect a preceding vehicle, a pedestrian, an obstacle, and the like.

Returning to fig. 34, the description is continued. The vehicle external information detection unit 7400 causes the imaging unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the vehicle external information detection unit 7400 receives detection information from the connected vehicle external information detection unit 7420. In the case where the vehicle external information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle external information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves. Based on the received information, the vehicle external information detection unit 7400 may perform object detection processing or distance detection processing on a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like. Based on the received information, the vehicle external information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like. Based on the received information, the vehicle external information detection unit 7400 may calculate the distance to an object outside the vehicle.

Further, based on the received image data, the vehicle external information detection unit 7400 may perform distance detection processing or image recognition processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like. The vehicle exterior information detecting unit 7400 may perform processing such as distortion correction or alignment on the received image data, and synthesize the image data captured by the different imaging units 7410 to generate an overhead view image or a panoramic image. The vehicle external information detection unit 7400 may perform viewpoint conversion processing using image data captured by the different imaging unit 7410.

The vehicle interior information detection unit 7500 detects information about the vehicle interior. For example, the vehicle interior information detection unit 7500 is connected to a driver state detection unit 7510 that detects the state of the driver. The driver state detection unit 7510 may include a camera that captures an image of the driver, a bio-sensor that detects bio-information of the driver, a microphone that collects sound inside the vehicle, and the like. For example, the biosensor is provided on a seat surface, a steering wheel, or the like, and detects biological information of a passenger sitting on the seat or a driver holding the steering wheel. Based on the detected information input from the driver state detection unit 7510, the vehicle interior information detection unit 7500 can calculate the degree of fatigue or concentration of the driver, and can determine whether the driver is dozing. The vehicle interior information detection unit 7500 may perform noise cancellation processing or the like on the collected sound signal.

The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. The integrated control unit 7600 is connected to the input unit 7800. The input unit 7800 is implemented by devices on which a passenger can perform an input operation, such as a touch panel, buttons, a microphone, switches, and a lever. Data obtained by performing voice recognition on sound input from the microphone may be input to the integrated control unit 7600. For example, the input unit 7800 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a personal digital assistant (PDA) adapted to the operation of the vehicle control system 7000. For example, the input unit 7800 may be a camera, in which case the passenger can input information through gestures. Alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit or the like that generates an input signal based on information input by the passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and gives instructions for processing operations.

The storage unit 7690 may include a Read Only Memory (ROM) that stores various programs executed by the microcomputer and a Random Access Memory (RAM) that stores various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be implemented by a magnetic storage device, such as a Hard Disk Drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.

The general communication I/F 7620 is a general-purpose communication I/F that coordinates communication with various devices present in the external environment 7750. The general communication I/F 7620 may implement a cellular communication protocol such as Global System for Mobile communications (GSM (registered trademark)), WiMAX, Long Term Evolution (LTE), or LTE-Advanced (LTE-A), or other wireless communication protocols such as wireless LAN (also referred to as Wi-Fi (registered trademark)) and Bluetooth (registered trademark). For example, the general communication I/F 7620 may be connected to a device (e.g., an application server or a control server) on an external network (e.g., the Internet, a cloud network, or a company private network) through a base station or an access point. Further, for example, the general communication I/F 7620 may be connected to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a shop, or a machine type communication (MTC) terminal) by using peer-to-peer (P2P) technology.

The dedicated communication I/F 7630 is a communication I/F supporting a communication protocol designed for use in vehicles. For example, the dedicated communication I/F 7630 may implement a standard protocol such as wireless access in vehicle environment (WAVE), which is a combination of IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, dedicated short range communications (DSRC), or a cellular communication protocol. Typically, the dedicated communication I/F 7630 performs V2X communication, which is a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.

For example, the positioning unit 7640 receives Global Navigation Satellite System (GNSS) signals (e.g., Global Positioning System (GPS) signals from GPS satellites) from GNSS satellites to perform positioning, and generates position information including latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or a smartphone having a positioning function.

For example, the beacon receiving unit 7650 receives radio waves or electromagnetic waves transmitted from a radio station or the like installed on a road, and acquires information such as the current location, congestion, road closure, or required time. Note that the function of the beacon reception unit 7650 may be included in the dedicated communication I/F7630 described above.

The vehicle interior device I/F 7660 is a communication interface that coordinates the connection between the microcomputer 7610 and various vehicle interior devices 7760 existing inside the vehicle. The vehicle interior device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless USB (WUSB). Further, the vehicle interior device I/F 7660 may establish a wired connection, such as Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI) (registered trademark), or Mobile High-definition Link (MHL), via a connection terminal (not shown) (and a cable if necessary). For example, the vehicle interior devices 7760 may include at least one of a mobile device or a wearable device owned by a passenger, or an information device carried into or attached to the vehicle. Further, the vehicle interior devices 7760 may include a navigation device that searches for a route to an arbitrary destination. The vehicle interior device I/F 7660 exchanges control signals or data signals with these vehicle interior devices 7760.

The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like according to a predetermined protocol supported by the communication network 7010.

The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the vehicle interior device I/F 7660, or the in-vehicle network I/F 7680. For example, based on the acquired information about the inside and outside of the vehicle, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the brake device, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control aimed at realizing the functions of an advanced driver assistance system (ADAS), including collision avoidance or impact mitigation of the vehicle, following travel based on inter-vehicle distance, vehicle-speed-maintaining travel, collision warning of the vehicle, lane departure warning of the vehicle, and the like. Further, by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on the acquired information about the vehicle's surroundings, the microcomputer 7610 may perform cooperative control for automated driving or the like, in which the vehicle travels autonomously without depending on the operation of the driver.

The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and persons based on information acquired via at least one of the general communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the vehicle interior device I/F 7660, or the in-vehicle network I/F 7680, and create local map information including peripheral information about the current position of the vehicle. Further, based on the acquired information, the microcomputer 7610 may predict a danger such as a collision of the vehicle, the approach of a pedestrian or the like, or entry into a closed road, and generate a warning signal. For example, the warning signal may be a signal for generating a warning sound or turning on a warning lamp.

The audio image output unit 7670 transmits an output signal of at least one of audio or an image to an output device capable of visually or audibly notifying information to a passenger inside the vehicle or to the outside of the vehicle. In the example of fig. 34, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as the output devices. For example, the display unit 7720 may include at least one of an in-vehicle display or a head-up display. The display unit 7720 may have an augmented reality (AR) display function. The output device may be a device other than these, such as a lamp, a projector, or a wearable device (e.g., headphones or a glasses-type display worn by a passenger). In the case where the output device is a display device, the display device visually displays results obtained by various types of processing performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, or graphs. Further, in the case where the output device is an audio output device, the audio output device converts an audio signal including reproduced audio data, sound data, and the like into an analog signal, and outputs the analog signal audibly.

Note that in the example shown in fig. 34, at least two control units connected together via the communication network 7010 may be integrated into one control unit. Alternatively, each control unit may be configured by a plurality of control units. Furthermore, the vehicle control system 7000 may comprise a further control unit, not shown. Further, in the above description, part or all of the functions performed by any control unit may be performed by another control unit. That is, predetermined arithmetic processing may be performed by any control unit as long as information is transmitted and received via the communication network 7010. Similarly, a sensor or a device connected to any control unit may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.

Examples of vehicle control systems to which the techniques according to the present disclosure may be applied have been described above. The technique according to the present disclosure can be applied to, for example, the imaging units 7910, 7912, 7914, 7916, 7918, and the like in the above-described configuration. In particular, the imaging system of the present disclosure may be applied to these imaging units. Since the imaging system of the present disclosure can accurately detect an event regardless of the driving state of the vehicle, it is possible to contribute to achieving safe driving of the vehicle.

< configurations that can be adopted by the present disclosure >

Note that the present disclosure may also adopt the following configuration.

< A. image forming apparatus >

[ A-1] an image forming apparatus comprising:

an event detection sensor that detects an event; and

a control unit which controls the event detection sensor, wherein

The control unit switches the resolution of the event detection sensor according to the traveling state of the mobile body.

[ A-2] the image forming apparatus according to [ A-1], wherein

The event detection sensor includes an asynchronous imaging device that detects, as an event, that a luminance change in a pixel that photoelectrically converts incident light exceeds a predetermined threshold.

[ A-3] the image forming apparatus according to [ A-2], wherein

The imaging apparatus is used by being mounted on a moving body.

[ A-4] the image forming apparatus according to [ A-3], wherein

The control unit sets the resolution of the event detection sensor to a first resolution mode in which the resolution is relatively low or a second resolution mode in which the resolution is relatively high, according to the traveling state of the mobile body.

[ A-5] the imaging apparatus according to [ A-4], wherein

The control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed, and sets the second resolution mode when the speed of the mobile body is less than the certain speed.
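The rule in [ A-5] can be sketched as follows; the concrete threshold value (60 km/h here) and the mode labels are hypothetical, since the claim does not fix them:

```python
def select_mode_by_speed(speed_kmh: float, threshold_kmh: float = 60.0) -> str:
    """[A-5] sketch: at or above a certain speed, set the first
    (relatively low) resolution mode; below it, the second
    (relatively high) resolution mode."""
    if speed_kmh >= threshold_kmh:
        return "first"   # low-resolution mode at high speed
    return "second"      # high-resolution mode at low speed
```

Note that the claim pairs high speed with the low-resolution mode; the relative-speed variant in [ A-6] inverts this pairing.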

[ A-6] the imaging apparatus according to [ A-4], wherein

The control unit sets the second resolution mode when the relative speed with the preceding object is greater than or equal to a certain relative speed, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed.

[ A-7] the imaging apparatus according to [ A-4], wherein

The control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the second resolution mode when the speed of the mobile body is less than the certain speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

[ A-8] the imaging apparatus according to [ A-4], wherein

The control unit sets the second resolution mode when the relative speed with the preceding object is greater than or equal to a certain relative speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed and the number of events detected by the event detection sensor is less than the predetermined threshold.
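[ A-7] and [ A-8] combine a speed (or relative-speed) condition with an event-count condition. A minimal sketch of [ A-7], with hypothetical threshold values; the claim leaves the remaining speed/event-count combinations unspecified:

```python
def select_mode_by_speed_and_events(speed_kmh: float, event_count: int,
                                    speed_threshold: float = 60.0,
                                    event_threshold: int = 10_000):
    """[A-7] sketch: high speed AND many events -> first (low
    resolution) mode; low speed AND few events -> second (high
    resolution) mode; otherwise keep the current mode (None)."""
    if speed_kmh >= speed_threshold and event_count >= event_threshold:
        return "first"
    if speed_kmh < speed_threshold and event_count < event_threshold:
        return "second"
    return None  # combination not covered by the claim
```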

[ A-9] the imaging apparatus according to [ A-4], wherein

In the traveling state of the first resolution mode, when the speed of the mobile body is less than a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that congestion has occurred and switches from the first resolution mode to the second resolution mode.

[ A-10] the imaging apparatus according to [ A-4], wherein

In the traveling state of the first resolution mode, when the speed of the mobile body is greater than or equal to a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that the mobile body is traveling on a highway and switches from the first resolution mode to the second resolution mode.
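[ A-9] and [ A-10] apply the same event-count and area-ratio conditions and differ only in the speed test. A combined sketch; all threshold values are hypothetical:

```python
def scene_decision(speed_kmh: float, event_count: int,
                   object_area_ratio: float,
                   congestion_speed: float = 10.0,
                   highway_speed: float = 80.0,
                   event_threshold: int = 10_000,
                   area_ratio_threshold: float = 0.3) -> str:
    """While traveling in the first (low resolution) mode, decide
    whether to switch to the second (high resolution) mode."""
    busy = (event_count >= event_threshold
            and object_area_ratio >= area_ratio_threshold)
    if busy and speed_kmh < congestion_speed:
        return "congestion: switch to second mode"   # [A-9]
    if busy and speed_kmh >= highway_speed:
        return "highway: switch to second mode"      # [A-10]
    return "stay in first mode"
```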

[ A-11] the imaging apparatus according to [ A-4], wherein

The control unit sets the first resolution mode when the mobile body travels straight, and sets the second resolution mode when the course is changed.

[ A-12] the imaging apparatus according to [ A-11], wherein

The control unit determines that the course of the mobile body has changed when the rotation of the steering wheel is greater than or equal to a certain angle and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold.

[ A-13] the imaging apparatus according to [ A-11], wherein

The control unit determines that the mobile body travels straight when the rotation of the steering wheel is within a certain angle and the number of events detected by the event detection sensor is less than a predetermined threshold value.
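The determinations in [ A-12] and [ A-13] can be sketched as one predicate; the angle and event-count thresholds are hypothetical. A single boolean collapses the two claims, so the mixed cases (e.g. large steering angle but few events) that the claim pair leaves open default here to "no course change":

```python
def course_changed(steering_angle_deg: float, event_count: int,
                   angle_threshold: float = 15.0,
                   event_threshold: int = 5_000) -> bool:
    """[A-12]: steering rotation at or above a certain angle AND many
    events -> course change (second mode per [A-11]). [A-13] is the
    complementary straight-travel case (first mode)."""
    return (abs(steering_angle_deg) >= angle_threshold
            and event_count >= event_threshold)
```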

[ A-14] the imaging apparatus according to any one of [ A-4] to [ A-13], wherein

The control unit switches the resolution of the event detection sensor for each area of the pixel array unit in the event detection sensor.
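The per-area switching of [ A-14] could look like the following sketch, in which the region identifiers and mode labels are hypothetical:

```python
def build_resolution_map(regions, second_mode_regions):
    """[A-14] sketch: assign a resolution mode to each area of the
    pixel array unit, e.g. the high-resolution second mode only for
    areas that need attention, and the first mode elsewhere."""
    return {region: ("second" if region in second_mode_regions
                     else "first")
            for region in regions}
```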

< B. imaging system >

[ B-1] an imaging system comprising:

an event detection sensor that detects an event;

a control unit that switches a resolution of the event detection sensor according to a traveling state of the mobile body; and

an object recognition unit that performs an event recognition process based on an event signal output from the event detection sensor.

[ B-2] the imaging system according to [ B-1], wherein

The event detection sensor includes an asynchronous imaging device that detects, as an event, that a luminance change of a pixel that photoelectrically converts incident light exceeds a predetermined threshold.

[ B-3] the imaging system according to [ B-2], wherein

The imaging system is used by being mounted on a mobile body.

[ B-4] the imaging system according to [ B-3], wherein

The control unit sets the resolution of the event detection sensor to a first resolution mode in which the resolution is relatively low or a second resolution mode in which the resolution is relatively high, according to the traveling state of the mobile body.

[ B-5] the imaging system according to [ B-4], wherein

The control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed, and sets the second resolution mode when the speed of the mobile body is less than the certain speed.

[ B-6] the imaging system according to [ B-4], wherein

The control unit sets the second resolution mode when the relative speed with the preceding object is greater than or equal to a certain relative speed, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed.

[ B-7] the imaging system according to [ B-4], wherein

The control unit sets the first resolution mode when the speed of the mobile body is greater than or equal to a certain speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the second resolution mode when the speed of the mobile body is less than the certain speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

[ B-8] the imaging system according to [ B-4], wherein

The control unit sets the second resolution mode when the relative speed with the preceding object is greater than or equal to a certain relative speed and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold, and sets the first resolution mode when the relative speed with the preceding object is less than the certain relative speed and the number of events detected by the event detection sensor is less than the predetermined threshold.

[ B-9] the imaging system according to [ B-4], wherein

In the traveling state of the first resolution mode, when the speed of the mobile body is less than a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that congestion has occurred and switches from the first resolution mode to the second resolution mode.

[ B-10] the imaging system according to [ B-4], wherein

In the traveling state of the first resolution mode, when the speed of the mobile body is greater than or equal to a predetermined threshold value, the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold value, and the area of the object occupying the angle of view of the event detection sensor is greater than or equal to a certain ratio, the control unit determines that the mobile body is traveling on a highway and switches from the first resolution mode to the second resolution mode.

[ B-11] the imaging system according to [ B-4], wherein

The control unit sets the first resolution mode when the mobile body travels straight, and sets the second resolution mode when the course is changed.

[ B-12] the imaging system according to [ B-11], wherein

The control unit determines that the course of the mobile body has changed when the rotation of the steering wheel is greater than or equal to a certain angle and the number of events detected by the event detection sensor is greater than or equal to a predetermined threshold.

[ B-13] the imaging system according to [ B-11], wherein

The control unit determines that the mobile body travels straight when the rotation of the steering wheel is within a certain angle and the number of events detected by the event detection sensor is less than a predetermined threshold value.

[ B-14] the imaging system according to any one of [ B-4] to [ B-13], wherein

The control unit switches the resolution of the event detection sensor for each area of the pixel array unit in the event detection sensor.

[ B-15] the imaging system according to any one of [ B-1] to [ B-14], further comprising

An image sensor that performs imaging at a predetermined frame rate.

[ B-16] the imaging system according to [ B-15], wherein

The object recognition unit performs an event recognition process based on image data output from the image sensor.

[ B-17] the imaging system according to [ B-16], wherein

When the control unit determines that the recognition processing cannot be performed using only the event signal output from the event detection sensor, the control unit performs control such that the event recognition processing is performed using both the event signal output from the event detection sensor and the image data output from the image sensor.
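The fallback in [ B-17] can be sketched as follows; the confidence measure and its threshold are hypothetical, since the claim only requires a determination that the event signal alone is insufficient:

```python
def recognition_inputs(event_only_confidence: float,
                       min_confidence: float = 0.5):
    """[B-17] sketch: if recognition from the event signal alone is
    judged insufficient, also use the image data from the
    frame-based image sensor for the event recognition processing."""
    if event_only_confidence >= min_confidence:
        return ("event_signal",)
    return ("event_signal", "image_data")
```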

[ B-18] the imaging system according to [ B-16] or [ B-17], wherein

Based on the result of the event recognition processing, the control unit specifies a region in the angle of view of the event detection sensor in which a moving object can be detected, and when the region in which a moving object can be detected is greater than or equal to a predetermined threshold value, the control unit determines that congestion has occurred on the condition that the traveling speed of the mobile body is less than a predetermined threshold value, and sets the second resolution mode for the specified region.

[ B-19] the imaging system according to [ B-16] or [ B-17], wherein

Based on the result of the event recognition processing, the control unit specifies a region in the angle of view of the event detection sensor in which a moving object can be detected, and when the region in which a moving object can be detected is greater than or equal to a predetermined threshold value, the control unit determines that the mobile body is traveling on a highway on the condition that the traveling speed of the mobile body is greater than or equal to a predetermined threshold value, and sets the second resolution mode for the specified region.
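[ B-18] and [ B-19] both select regions from the recognition result and then apply the second mode to those regions; a combined sketch, in which the detection-record format and the thresholds are hypothetical:

```python
def regions_to_upgrade(detections, view_area: float, speed_kmh: float,
                       area_ratio_threshold: float = 0.25,
                       congestion_speed: float = 10.0,
                       highway_speed: float = 80.0):
    """[B-18]/[B-19] sketch: detections is a list of dicts like
    {"region": id, "area": float, "moving": bool}. If the area share
    of moving objects in the angle of view is large and the speed
    indicates congestion ([B-18]) or highway travel ([B-19]), return
    the regions to set to the second (high resolution) mode."""
    moving = [d for d in detections if d["moving"]]
    share = sum(d["area"] for d in moving) / view_area
    if share >= area_ratio_threshold and (
            speed_kmh < congestion_speed or speed_kmh >= highway_speed):
        return [d["region"] for d in moving]
    return []
```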

REFERENCE SIGNS LIST

1A imaging system according to the first embodiment

1B imaging system according to the second embodiment

10 event detection sensor

11 pixel

12 pixel array unit

13 drive unit

14 arbitration unit

15 column processing unit

16 signal processing unit

20 image sensor

21 pixel

22 pixel array unit

23 row select unit

24 constant current source unit

25 analog-to-digital conversion unit

26 horizontal transfer scanning unit

27 signal processing unit

28 timing control unit

30 motion recognition unit

40 object recognition unit

50 control unit

60 operating mode defining unit

70 recording unit
