Object detection device, object detection system, and object detection method

Document No.: 1804635    Publication date: 2021-11-05

Note: This technology, "Object detection device, object detection system, and object detection method", was devised by 中川庆 and 北野伸 on 2020-03-10. Its main content is as follows: An object of the present invention is to perform object detection more quickly and accurately by detecting the characteristics of a flicker component. An object detection device according to an embodiment of the present invention is provided with: a solid-state image capturing device (200) having a plurality of pixels arranged in a matrix and detecting occurrence of an event in the pixels according to an amount of light incident on each pixel; a flicker detection unit (12) that generates flicker information based on the occurrence of an event detected by the solid-state image capturing device; and an object detection unit (15) that detects an object based on the flicker information detected by the solid-state image capturing device.

1. An object detecting apparatus comprising:

a first solid-state imaging device provided with a plurality of pixels arranged in a matrix, the first solid-state imaging device detecting occurrence of an event in the pixels according to an amount of light incident on each pixel;

a flicker detection unit that generates flicker information based on an occurrence of an event detected by the first solid-state imaging device; and

an object detection unit that detects an object based on flicker information detected by the first solid-state imaging device.

2. The object detecting apparatus according to claim 1,

the flicker detection unit specifies an event detection area in which occurrence of an event is detected among a plurality of pixels, and generates the flicker information based on the number of occurrence events detected per predetermined time for the event detection area.

3. The object detecting apparatus according to claim 2,

the flicker detection unit generates the flicker information based on a maximum value or an average value of the number of occurrence events detected per predetermined time in the respective pixels belonging to the event detection area.

4. The object detecting apparatus according to claim 2,

the flicker information includes an edge shape of the event detection area and the number of occurrence events detected per predetermined time.

5. The object detecting apparatus according to claim 2,

the flicker detection unit sets, as a flicker detection area, a region of the event detection area in which pixels whose number of occurrence events detected per predetermined time is equal to or greater than a first threshold are arranged;

the object detection unit detects an object imaged in the flicker detection area based on the flicker information.

6. The object detecting apparatus according to claim 5,

the flicker detection unit sets, as the flicker detection area, a region of the event detection area in which the number of occurrence events detected per predetermined time is equal to or greater than the first threshold and the number of pixels is equal to or greater than a second threshold.

7. The object detecting device according to claim 2, further comprising:

a second solid-state imaging device that obtains image data, wherein,

the object detection unit detects an object based on the flicker information and the image data.

8. The object detecting apparatus according to claim 7,

the flicker detection unit specifies an event detection area in which occurrence of an event is detected among a plurality of pixels, and generates flicker information based on the number of occurrence events detected per predetermined time for the event detection area,

the object detection unit specifies an object detection area that includes an area on the image data corresponding to the event detection area and corresponds to an object, and detects the object based on the object detection area and the flicker information.

9. The object detecting apparatus according to claim 7,

the object detection unit specifies an object detection area corresponding to an object from the image data;

the flicker detection unit generates the flicker information based on the number of occurrence events detected per predetermined time for an area corresponding to the object detection area among the plurality of pixels, and

the object detection unit detects an object based on the object detection region and the flicker information.

10. The object detecting device according to claim 2, further comprising:

a control unit that controls reading of the first solid-state imaging device, wherein,

the control unit reduces a resolution at the time of monitoring the occurrence of the event for an area in which pixels other than pixels included in the event detection area are arranged among the plurality of pixels.

11. The object detecting device according to claim 7, further comprising:

a control unit that controls reading of the second solid-state imaging device, wherein,

the control unit reduces a resolution of image data read from an area corresponding to the event detection area in the second solid-state imaging device.

12. An object detection system comprising:

a first solid-state imaging device provided with a plurality of pixels arranged in a matrix, the first solid-state imaging device detecting occurrence of an event in the pixels according to an amount of light incident on each pixel;

a flicker detection unit that generates flicker information based on an occurrence of an event detected by the first solid-state imaging device; and

an object detection unit that detects an object based on the flicker information detected by the first solid-state imaging device.

13. An object detection method, comprising:

generating flicker information based on an occurrence of an event detected by a solid-state imaging device provided with a plurality of pixels arranged in a matrix, the solid-state imaging device detecting the occurrence of the event in the pixels according to an amount of light incident on each pixel; and

an object is detected based on the flicker information.

Technical Field

The present disclosure relates to an object detection apparatus, an object detection system, and an object detection method.

Background

In recent years, techniques for detecting an object present around a vehicle from an image obtained by imaging the surroundings of the vehicle and, based on the detection result, supporting automatic driving in which the vehicle travels autonomously or assisting driving by the driver have been actively developed.

Documents of the prior art

Patent document

Patent document 1: Japanese Unexamined Patent Publication No. 2016-

Disclosure of Invention

Problems to be solved by the invention

Here, traffic lights that perform traffic control, electronic bulletin boards, vehicle lamps, and the like emit light containing various flicker components corresponding to their light sources, power supplies, and the like, and it is considered that objects can be detected with higher accuracy and at higher speed by using these flicker components.

However, in a general image sensor that obtains image data at a predetermined frame rate, it is difficult to measure characteristics such as the frequency of a flicker component emitted from a traffic light, an electronic bulletin board, a car light, or the like.

Accordingly, the present disclosure proposes an object detection apparatus, an object detection system, and an object detection method capable of detecting characteristics of a flicker component and detecting an object at higher speed and with higher accuracy.

Problem solving scheme

In order to solve the above-described problem, an object detection device according to an embodiment of the present disclosure is provided with: a solid-state imaging device provided with a plurality of pixels arranged in a matrix, the solid-state imaging device detecting occurrence of an event in each pixel according to an amount of light incident on the pixel; a flicker detection unit that generates flicker information based on the occurrence of the event detected by the solid-state imaging device; and an object detection unit that detects an object based on the flicker information detected by the solid-state imaging device.

Drawings

Fig. 1 is a diagram for explaining objects for which flicker components are detected within an angle of view.

Fig. 2 is a block diagram showing a schematic configuration example of an object detection apparatus (system) of the first embodiment.

Fig. 3 is a block diagram showing a functional configuration example of the DVS according to the first embodiment.

Fig. 4 is a circuit diagram showing a schematic configuration example of the unit pixel of the first embodiment.

Fig. 5 is a block diagram showing a schematic configuration example of the address event detection unit of the first embodiment.

Fig. 6 is a circuit diagram showing another example of the current-voltage converting unit according to the first embodiment.

Fig. 7 is a circuit diagram showing a schematic configuration example of the subtractor and the quantizer according to the first embodiment.

Fig. 8 is a block diagram showing a schematic configuration example of a vehicle control system as an example of a mobile body control system according to the first embodiment.

Fig. 9 is a view showing an example of the mounting position of the DVS with respect to the vehicle according to the first embodiment.

Fig. 10 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the first embodiment.

Fig. 11 is a diagram for explaining flicker components detected within a viewing angle.

Fig. 12 is a block diagram showing a schematic configuration example of an object detection apparatus (or system) of the second embodiment.

Fig. 13 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the second embodiment.

Fig. 14 is a flowchart showing an example of an object detection operation according to a modification of the second embodiment.

Fig. 15 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the third embodiment.

Fig. 16 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the fourth embodiment.

Fig. 17 is a flowchart showing an action mode determination operation according to a first example of the fifth embodiment.

Fig. 18 is a flowchart showing an action mode determination operation according to a second example of the fifth embodiment.

Fig. 19 is a flowchart showing an action mode determination operation according to a third example of the fifth embodiment.

Fig. 20 is a flowchart showing an action mode determination operation according to a fourth example of the fifth embodiment.

Detailed Description

Hereinafter, embodiments of the present disclosure are described in detail with reference to the accompanying drawings. Note that, in the following embodiments, the same portions are denoted by the same reference numerals, and the description thereof is not repeated.

Further, the present disclosure is described according to the following sequence of items.

1. First embodiment

1.1 Configuration example of object detection apparatus (or system)

1.2 Configuration example of DVS

1.3 Configuration example of unit pixel

1.4 Configuration example of address event detection unit

1.4.1 Configuration example of current-voltage conversion unit

1.4.2 Configuration example of subtractor and quantizer

1.5 Application example to a moving body

1.6 Layout example of DVS

1.7 Object detection operation example

1.8 Actions and effects

2. Second embodiment

2.1 Configuration example of object detection apparatus (or system)

2.2 Object detection operation example

2.3 Actions and effects

2.4 Modification

3. Third embodiment

3.1 Object detection operation example

3.2 Actions and effects

4. Fourth embodiment

4.1 Object detection operation example

4.2 Actions and effects

5. Fifth embodiment

5.1 First example

5.2 Second example

5.3 Third example

5.4 Fourth example

1. First embodiment

First, the first embodiment is described in detail with reference to the drawings.

As described above, objects such as traffic lights, electronic bulletin boards, and vehicle lamps emit flicker components that repeatedly flicker at high speed. For example, as shown in fig. 1, in the case where the traffic light 51 and the preceding vehicles 52 and 53 are present within the angle of view G1, the lit lamp of the traffic light 51, the lit arrow of an arrow-type traffic light, and the tail lamps, brake lamps, and strobe lamps of the preceding vehicles 52 and 53 emit flicker components that repeatedly flash at high speed.

It is difficult to obtain characteristics (hereinafter, referred to as flicker characteristics) such as the frequency and duty ratio of a flicker component emitted by such an object by a synchronous image sensor that obtains image data in synchronization with a synchronization signal such as a vertical synchronization signal at a certain predetermined frame rate.

Therefore, in the present embodiment, the characteristics of the flicker component emitted by the subject are acquired using an asynchronous image sensor (hereinafter, referred to as a Dynamic Vision Sensor (DVS)) in which a detection circuit that detects in real time that the light reception amount exceeds a threshold value as an address event is provided for each pixel.

In a general DVS, a so-called event-driven driving system is employed in which whether an address event occurs per unit pixel is detected, and in the case where the occurrence of the address event is detected, a pixel signal is read from the unit pixel where the address event occurs.

Further, since the reading operation is performed on the unit pixel where the occurrence of the address event is detected in the general DVS, the DVS has a characteristic of being able to perform the reading at a much higher speed than a synchronous image sensor that performs the reading operation on all the unit pixels.

By using the DVS having such characteristics, the flicker characteristics of the object can be detected with high accuracy. Therefore, an object detection system and an object detection method capable of detecting an object at higher speed and with higher accuracy can be realized.

Note that a unit pixel in this specification is a minimum unit of a pixel including one photoelectric conversion element (also referred to as a light receiving element), and corresponds to, for example, each point in image data read from an image sensor. Further, the address event is an event occurring in each address assigned to each of the plurality of unit pixels arranged in the two-dimensional lattice pattern, for example, an event in which a current value of a current based on charges generated in the photoelectric conversion element (hereinafter referred to as a photocurrent) or a variation amount thereof exceeds a certain threshold value.

1.1 Configuration example of object detection apparatus (or system)

Fig. 2 is a block diagram showing a schematic configuration example of an object detection apparatus (or an object detection system, hereinafter referred to as an object detection apparatus) according to the first embodiment. As shown in fig. 2, the object detection apparatus 1 has, for example, an imaging lens 11, a DVS (first solid-state imaging apparatus) 200, and a data processing unit 120, and monitors occurrence of an address event in the DVS 200 based on an instruction from an external control unit 130.

The imaging lens 11 is an example of an optical system that condenses incident light and forms an image thereof on a light receiving surface of the DVS 200. The light receiving surface may be a surface where the photoelectric conversion element is disposed in the DVS 200.

The DVS 200 detects the occurrence of an address event based on the amount of incident light, and generates address information for specifying a unit pixel in which the occurrence of the address event is detected as event detection data. The event detection data may include time information, such as a timestamp indicating the time at which the address event was detected to occur.
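As a concrete (non-limiting) illustration of the event detection data described here, it can be thought of as a stream of records each carrying a pixel address and a time stamp. The sketch below uses hypothetical field and function names; the logarithmic-change criterion simply mirrors the address event definition given above and is not the actual pixel circuit.

```python
from dataclasses import dataclass
import math

@dataclass
class EventRecord:
    """One item of event detection data: the address of the unit pixel in
    which an address event occurred and the time at which it was detected."""
    x: int             # column address of the unit pixel
    y: int             # row address of the unit pixel
    timestamp_us: int  # time stamp of the detected occurrence

def maybe_emit_event(prev_log_i, photocurrent, threshold):
    """Simplified per-pixel model: an address event occurs when the
    logarithmic photocurrent has changed by more than `threshold` since
    the last event; the reference level is then updated."""
    new_log_i = math.log(photocurrent)
    if abs(new_log_i - prev_log_i) >= threshold:
        return True, new_log_i    # event occurs, new reference level
    return False, prev_log_i      # no event, keep the old reference level
```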

The generated event detection data is output to the data processing unit 120 via the signal line 209.

The data processing unit 120 is provided with, for example, a memory 121 and a DSP (digital signal processor) 122.

The memory 121 includes, for example, a flash memory, a DRAM (dynamic random access memory), an SRAM (static random access memory), and the like, and records data input from the DVS 200.

For example, the DSP 122 performs predetermined signal processing on event detection data input from the DVS 200 to perform object detection processing on an object causing an address event to occur.

More specifically, the DSP 122 serves as a flicker detection unit that specifies a region in which a flicker component is detected (corresponding to a flicker detection area described later), for example, based on the number of pieces of event detection data input from the DVS 200 per unit time (corresponding to the number N of detected events described later), a duty ratio, and the like. Then, the DSP 122 serves as an object detection unit that performs object detection processing on the object causing the address event to occur, based on the specified flicker detection area.

At this time, the DSP 122 may function as a machine learning unit using a Deep Neural Network (DNN), for example, by executing a program stored in the memory 121. In this case, the DSP 122 executes the program of the learning model stored in the memory 121, thereby executing the process of multiplying the dictionary coefficient stored in the memory 121 by the event detection data. Note that as a method of machine learning, various methods such as RNN (recurrent neural network) and CNN (convolutional neural network) may be used.

Note that the DSP 122 does not necessarily have to perform all the steps of the object detection processing, and may perform at least some of the steps. For example, in the case of detecting objects from event detection data using CNN, DSP 122 may execute a convolutional layer and/or a pooling layer as part of a hidden layer.
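For illustration only: if, as mentioned above, the DSP 122 executes a convolutional layer and a pooling layer as part of a hidden layer, the input can be viewed as a two-dimensional map of per-pixel event counts. The following toy sketch (NumPy, hypothetical function names, unrelated to the actual learning model or dictionary coefficients) shows what those two operations compute.

```python
import numpy as np

def conv2d(event_map: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 'valid' 2D convolution over a per-pixel event-count map."""
    kh, kw = kernel.shape
    h, w = event_map.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(event_map[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map: np.ndarray, size: int = 2) -> np.ndarray:
    """Non-overlapping max pooling of a feature map."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    pooled = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return pooled.max(axis=(1, 3))
```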

The result obtained by such object detection processing (including also the result obtained by the DSP 122 performing a part of the object detection processing) is output to, for example, an external information processing apparatus (corresponding to the vehicle external information detection unit 12030 and/or the microcomputer 12051 described later) or the like.

The control unit 130 includes, for example, a CPU (central processing unit) or the like, and controls each unit in the object detection apparatus 1, for example, the DVS 200, by outputting various instructions via a signal line 139.

In the above configuration, the DVS 200 and the data processing unit 120 may include, for example, a single semiconductor chip. Further, the semiconductor chip may be a multilayer chip obtained by attaching a plurality of dies to each other.

1.2 Configuration example of DVS

Subsequently, a configuration example of the DVS 200 is described in detail with reference to the drawings. Fig. 3 is a block diagram showing a functional configuration example of the DVS according to the first embodiment. As shown in fig. 3, the DVS 200 is equipped with a driving circuit 211, a signal processing unit 212, an arbiter 213, and a pixel array unit 300.

In the pixel array unit 300, a plurality of unit pixels are arranged in a two-dimensional lattice pattern. As described later in detail, each unit pixel includes a photoelectric conversion element such as a photodiode and a pixel circuit (in this embodiment, corresponding to an address event detection unit 400 described later) that detects whether an address event occurs based on whether the current value of a photocurrent based on the electric charge generated in the photoelectric conversion element, or the amount of change thereof, exceeds a predetermined threshold value. Here, the pixel circuit may be shared by a plurality of photoelectric conversion elements. In this case, each unit pixel includes one photoelectric conversion element and the shared pixel circuit.

The plurality of unit pixels of the pixel array unit 300 may be grouped into a plurality of pixel blocks, each pixel block including a predetermined number of unit pixels. Hereinafter, a group of unit pixels or pixel blocks arranged in the horizontal direction is referred to as "rows", and a group of unit pixels or pixel blocks arranged in the direction perpendicular to the rows is referred to as "columns".

When the occurrence of an address event is detected in the pixel circuit, each unit pixel outputs a request for reading a signal from the unit pixel to the arbiter 213.

The arbiter 213 arbitrates requests from one or more unit pixels and transmits a predetermined response to the requesting unit pixel based on the result of the arbitration. The unit pixel receiving the response outputs a detection signal indicating the occurrence of an address event to the driving circuit 211 and the signal processing unit 212.
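As a rough illustration of this request/response handshake (not the actual circuit; the first-come-first-served policy and the class name below are assumptions made only for the sketch):

```python
from collections import deque

class Arbiter:
    """Toy model of the arbiter 213: requests from unit pixels that detected
    an address event are granted one at a time, here in arrival order."""
    def __init__(self):
        self.requests = deque()

    def request(self, pixel_address):
        """A unit pixel files a read request when an address event occurs."""
        self.requests.append(pixel_address)

    def grant_next(self):
        """Return the address of the pixel allowed to output its detection
        signal next, or None if no request is pending."""
        return self.requests.popleft() if self.requests else None

# Usage: two pixels detect events and file requests; the granted pixel then
# outputs its detection signal to the drive circuit and signal processing unit.
arbiter = Arbiter()
arbiter.request((10, 4))
arbiter.request((11, 4))
print(arbiter.grant_next())  # (10, 4)
```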

The driving circuit 211 sequentially drives the unit pixels outputting the detection signals, thereby allowing the unit pixels detecting the occurrence of the address event to output a signal according to the light reception amount to, for example, the signal processing unit 212. Note that the DVS 200 may be provided with an analog-to-digital converter for converting a signal read from a photoelectric conversion element 333 described later into a signal of a digital value according to the amount of charge thereof (for example, for each unit pixel or a plurality of unit pixels, or for each column).

The signal processing unit 212 performs predetermined signal processing on the signal input from the unit pixel, and supplies the result of the signal processing as event detection data to the data processing unit 120 via the signal line 209. Note that, as described above, the event detection data may include address information of the unit pixel in which the occurrence of the address event is detected, and time information such as a time stamp indicating the timing of the occurrence of the address event.

1.3 Configuration example of unit pixel

Next, a configuration example of the unit pixel 310 is described. Fig. 4 is a circuit diagram showing a schematic configuration example of the unit pixel of the first embodiment. As shown in fig. 4, the unit pixel 310 is provided with, for example, a light receiving unit 330 and an address event detecting unit 400. Note that the logic circuit 210 in fig. 4 may be, for example, a logic circuit including the driving circuit 211, the signal processing unit 212, and the arbiter 213 in fig. 3.

The light receiving unit 330 is provided with a photoelectric conversion element 333 such as a photodiode, and the output thereof is connected to the address event detection unit 400.

The address event detection unit 400 is provided with, for example, a current-voltage conversion unit 410 and a subtractor 430. In addition to these, the address event detection unit 400 is also equipped with a buffer, a quantizer, and a transmission unit. The address event detection unit 400 will be described in detail later with reference to fig. 5 and the like.

In such a structure, the photoelectric conversion element 333 of the light receiving unit 330 photoelectrically converts incident light to generate electric charge. The electric charge generated in the photoelectric conversion element 333 is input to the address event detection unit 400 as a photocurrent having a current value corresponding to the amount of the charge.

1.4 Configuration example of address event detection unit

Fig. 5 is a block diagram showing a schematic configuration example of the address event detection unit of the first embodiment. As shown in fig. 5, the address event detection unit 400 is provided with a buffer 420, a quantizer 440, and a transmission unit 450 in addition to the current-voltage conversion unit 410 and the subtractor 430 shown in fig. 4.

The current-voltage converting unit 410 converts the photocurrent from the light receiving unit 330 into a logarithmic voltage signal thereof and outputs the resultant voltage signal to the buffer 420.

The buffer 420 corrects the voltage signal from the current-voltage converting unit 410 and outputs the corrected voltage signal to the subtractor 430.

The subtractor 430 reduces the voltage level of the voltage signal from the buffer 420 according to the row driving signal from the driving circuit 211, and outputs the reduced voltage signal to the quantizer 440.

The quantizer 440 quantizes the voltage signal from the subtractor 430 into a digital signal, and outputs the resulting digital signal as a detection signal to the transmission unit 450.

The transmission unit 450 transmits the detection signal from the quantizer 440 to the signal processing unit 212 and the like. For example, when detecting the occurrence of an address event, the transmission unit 450 outputs a request to the arbiter 213 requesting transmission of the detection signal of the address event from the transmission unit 450 to the drive circuit 211 and the signal processing unit 212. Then, upon receiving a response to the request from the arbiter 213, the transmission unit 450 outputs a detection signal to the drive circuit 211 and the signal processing unit 212.

1.4.1 Configuration example of current-voltage conversion unit

The current-voltage converting unit 410 in the configuration shown in fig. 5 may be, for example, a so-called source-follower current-voltage converting unit equipped with an LG transistor 411, an amplifying transistor 412, and a constant current circuit 415, as shown in fig. 4. However, it is not limited thereto; it may also be a so-called gain-boost current-voltage converting unit equipped with, for example, two LG transistors 411 and 413, two amplifying transistors 412 and 414, and a constant current circuit 415, as shown in fig. 6.

As shown in fig. 4, the source of the LG transistor 411 and the gate of the amplifying transistor 412 are connected to, for example, the cathode of the photoelectric conversion element 333 of the light receiving unit 330. The drain of LG transistor 411 is connected to, for example, the power supply terminal VDD.

Further, for example, the source of the amplifying transistor 412 is grounded, and the drain thereof is connected to the power supply terminal VDD via the constant current circuit 415. The constant current circuit 415 may include, for example, a load MOS transistor such as a P-type MOS (metal oxide semiconductor) transistor.

In contrast, in the case of the gain-boost type, as shown in fig. 6, the source of the LG transistor 411 and the gate of the amplifying transistor 412 are connected to, for example, the cathode of the photoelectric conversion element 333 of the light receiving unit 330. Further, the drain of the LG transistor 411 is connected to, for example, the source of the LG transistor 413 and the gate of the amplifying transistor 414. For example, the drain of the LG transistor 413 is connected to the power supply terminal VDD.

Further, for example, the source of the amplifying transistor 414 is connected to the gate of the LG transistor 411 and the drain of the amplifying transistor 412. The drain of the amplifying transistor 414 is connected to the power supply terminal VDD via, for example, a constant current circuit 415.

A ring-shaped source follower circuit is formed by the connection relationship shown in fig. 4 or fig. 6. Accordingly, the photocurrent from the light receiving unit 330 is converted into a logarithmic voltage signal according to the amount of charge thereof. Note that each of the LG transistors 411 and 413 and the amplifying transistors 412 and 414 may include, for example, an NMOS transistor.

1.4.2 Configuration example of subtractor and quantizer

Fig. 7 is a circuit diagram showing a schematic configuration example of the subtractor and the quantizer according to the first embodiment. As shown in fig. 7, the subtractor 430 is provided with capacitors 431 and 433, an inverter 432, and a switch 434. Further, the quantizer 440 is provided with a comparator 441.

One end of the capacitor 431 is connected to the output terminal of the buffer 420, and the other end is connected to the input terminal of the inverter 432. The capacitor 433 is connected in parallel with the inverter 432. The switch 434 opens/closes a path connecting both ends of the capacitor 433 according to a row driving signal.

The inverter 432 inverts a voltage signal input via the capacitor 431. The inverter 432 outputs an inverted signal to the non-inverting input terminal (+) of the comparator 441.

When the switch 434 is turned on, the voltage signal Vinit is input to the buffer 420 side of the capacitor 431, and the opposite side becomes a virtual ground terminal. For convenience, the potential of the virtual ground terminal is set to zero. At this time, when the capacitance of the capacitor 431 is set to C1, the charge Qinit accumulated in the capacitor 431 is represented by the following expression (1). In contrast, since both ends of the capacitor 433 are short-circuited, the accumulated charge thereof is zero.

Qinit=C1×Vinit (1)

Next, considering a case where the switch 434 is turned off and the voltage on the buffer 420 side of the capacitor 431 changes to Vafter, the charge Qafter accumulated in the capacitor 431 is represented by the following expression (2).

Qafter=C1×Vafter (2)

In contrast, when the capacitance of the capacitor 433 is set to C2 and the output voltage is set to Vout, the charge Q2 accumulated in the capacitor 433 is represented by the following expression (3).

Q2=-C2×Vout (3)

At this time, the total charge amount of the capacitors 431 and 433 does not change, so that the following expression (4) is established.

Qinit=Qafter+Q2 (4)

By substituting expressions (1) to (3) into expression (4) and transforming, expression (5) below is obtained.

Vout=-(C1/C2)×(Vafter-Vinit) (5)
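Spelled out, substituting expressions (1) to (3) into expression (4) and rearranging gives (same quantities as above):

```latex
C1 \times V_{\mathrm{init}} = C1 \times V_{\mathrm{after}} + (-C2 \times V_{\mathrm{out}})
\;\;\Rightarrow\;\;
C2 \times V_{\mathrm{out}} = -C1 \times (V_{\mathrm{after}} - V_{\mathrm{init}})
\;\;\Rightarrow\;\;
V_{\mathrm{out}} = -\frac{C1}{C2} \times (V_{\mathrm{after}} - V_{\mathrm{init}})
```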

Expression (5) represents the subtraction of the voltage signal, and the gain of the subtraction result is C1/C2. Since it is generally desirable to maximize gain, it is preferable to design C1 larger and C2 smaller. In contrast, if C2 is too small, kTC noise increases and noise characteristics may deteriorate, so that the capacitance reduction of C2 is limited within the range of allowable noise. Further, since the address event detection unit 400 including the subtractor 430 is installed for each unit pixel, there is an area limitation in the capacitance C1 and the capacitance C2. In view of these factors, the values of the capacitance C1 and the capacitance C2 are determined.

The comparator 441 compares the voltage signal from the subtractor 430 with a predetermined threshold voltage Vth applied to the inverting input terminal (-). The comparator 441 outputs a signal indicating the comparison result as a detection signal to the transmission unit 450.

Further, when the conversion gain of the current-voltage converting unit 410 is set to CGlog and the gain of the buffer 420 is set to '1', the gain A of the entire address event detection unit 400 described above is represented by the following expression (6).

A = CGlog × (C1/C2) × Σ(n=1..N) iphoto_n (6)

In expression (6), iphoto_n denotes the photocurrent of the nth unit pixel, and the unit thereof is, for example, ampere (A). N denotes the number of unit pixels 310 in the pixel block, and is '1' in the present embodiment.

1.5 Application example to a moving body

The object detection apparatus 1 described above can be applied to various products. For example, it may be mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.

Fig. 8 is a block diagram showing a schematic configuration example of a vehicle control system as an example of a mobile body control system according to the first embodiment.

As shown in fig. 8, the vehicle control system 12000 is provided with a plurality of electronic control units connected to each other via a communication network 12001. In the example shown in fig. 8, the vehicle control system 12000 is provided with a drive system control unit 12010, a vehicle body system control unit 12020, a vehicle external information detection unit 12030, a vehicle internal information detection unit 12040, and an integrated control unit 12050. Further, the microcomputer 12051, the audio image output unit 12052, and the in-vehicle network interface (I/F)12053 are shown as a functional configuration of the integrated control unit 12050.

The drive system control unit 12010 controls the operations of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating a driving force of a vehicle such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a brake device for generating a braking force of the vehicle, and the like.

The vehicle body system control unit 12020 controls the operations of various devices mounted on the vehicle body according to various programs. For example, the vehicle body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps (such as a headlight, a backup lamp, a brake lamp, a flash lamp, or a fog lamp). In this case, radio waves transmitted from a portable device substituting for a key, or signals of various switches, may be input to the vehicle body system control unit 12020. The vehicle body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.

The vehicle external information detection unit 12030 detects information on the outside of the vehicle equipped with the vehicle control system 12000. For example, the imaging unit 12031 is connected to the vehicle external information detection unit 12030. The vehicle external information detection unit 12030 causes the imaging unit 12031 to obtain information on the outside of the vehicle and receives the obtained data. The vehicle external information detection unit 12030 may perform detection processing or distance detection processing of objects such as a person, a vehicle, an obstacle, a sign, or characters on the road surface based on the received data.

The imaging unit 12031 may be an image sensor that outputs an electric signal as an image, or may be a ranging sensor that outputs the electric signal as ranging information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.

The imaging unit 12031 is provided with, for example, a plurality of imaging units 12101 to 12105 (see fig. 9) described later. The DVS 200 described above is used for at least one of the plurality of imaging units 12101 to 12105. The DVS 200 and the vehicle external information detection unit 12030 connected thereto form the object detection apparatus 1 according to the first embodiment. In this case, the vehicle external information detection unit 12030 and/or the microcomputer 12051 function as the data processing unit 120.

The vehicle interior information detection unit 12040 detects information of the vehicle interior. The vehicle interior information detecting unit 12040 is connected to, for example, a driver state detecting unit 12041 that detects the state of the driver. The driver state detection unit 12041 includes, for example, a camera that images the driver, and the vehicle interior information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing based on the detection information input from the driver state detection unit 12041.

The microcomputer 12051 can perform arithmetic operation of control target values of the driving force generating device, the steering mechanism, or the braking device based on the vehicle interior information obtained by the vehicle interior information detecting unit 12040 and/or the vehicle exterior information obtained by the vehicle exterior information detecting unit 12030, and output a control instruction to the drive system control unit 12010. For example, the microcomputer 12051 can execute cooperative control for realizing functions of an ADAS (advanced driver assistance system) including collision avoidance or impact attenuation of the vehicle, follow-up running based on an inter-vehicle distance, vehicle speed hold running, vehicle collision warning, vehicle lane departure warning, and the like.

Further, the microcomputer 12051 can perform cooperative control for realizing automatic driving or the like to autonomously travel independently of the operation by the driver by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on the information around the vehicle obtained by the vehicle external information detection unit 12030 or the vehicle internal information detection unit 12040.

Further, the microcomputer 12051 can output a control instruction to the vehicle body system control unit 12020 based on the vehicle external information obtained by the vehicle external information detecting unit 12030. For example, the microcomputer 12051 may control headlights according to the position of the preceding vehicle or the oncoming vehicle detected by the vehicle outside information detection unit 12030 to perform cooperative control for realizing glare protection, such as switching high beam to low beam.

The audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying an occupant of the vehicle or the outside of the vehicle of information. In the example of fig. 8, an audio speaker 12061, a display unit 12062, and a dashboard 12063 are shown as the output devices. The display unit 12062 may include, for example, at least one of an on-board display or a head-up display.

1.6 Layout example of DVS

Fig. 9 is a view showing an example of the mounting position of the DVS with respect to the vehicle according to the first embodiment. In fig. 9, as the imaging unit 12031, a total of five imaging units including the imaging units 12101, 12102, 12103, 12104, and 12105 are provided for the vehicle 12100. At least one of the imaging units 12101 to 12105 includes the DVS 200. For example, the DVS 200 is used for an imaging unit 12101 that images an area in front of the vehicle 12100.

The imaging units 12101, 12102, 12103, 12104, and 12105 are disposed in positions such as the front nose, side mirrors, rear bumper, rear door, and the upper portion of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided on the nose and the imaging unit 12105 provided in the upper portion of the windshield inside the vehicle mainly obtain images of the area in front of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly obtain images of the side of the vehicle 12100. An imaging unit 12104 provided on a rear bumper or a rear door mainly obtains an image of an area behind the vehicle 12100. The imaging unit 12105 provided on the upper portion of the windshield inside the vehicle is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, and the like.

Further, in fig. 9, an example of the imaging ranges of the imaging units 12101 to 12104 is shown. An imaging range 12111 indicates an imaging range of the imaging unit 12101 provided on the nose, imaging ranges 12112 and 12113 indicate imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, and an imaging range 12114 indicates an imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, image data imaged by the imaging units 12101 to 12104 are superimposed, thereby obtaining an overhead view image of the vehicle 12100 viewed from above.

At least one of the imaging units 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element including pixels for phase difference detection.

For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 (refer to fig. 8) may obtain the distance to each three-dimensional object in the imaging ranges 12111 to 12114 and the temporal change of the distance (relative speed with respect to the vehicle 12100), thereby extracting, as a preceding vehicle, the closest three-dimensional object on the travel path of the vehicle 12100 that travels at a predetermined speed (e.g., 0 km/h or more) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 may set in advance an inter-vehicle distance to be secured from the preceding vehicle, and may perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control for realizing automatic driving or the like, in which the vehicle travels autonomously independently of the operation of the driver, can be performed.

For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 may extract three-dimensional object data while classifying the objects into motorcycles, standard vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, and use the data to automatically avoid obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 and the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.

At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 serving as infrared cameras and a procedure of performing pattern matching on a series of feature points indicating the contour of an object to determine whether the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a rectangular contour line for emphasizing the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.

1.7 Object detection operation example

Next, an operation example of the object detection apparatus 1 according to the first embodiment is described. Fig. 10 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the first embodiment. Note that in the following description, attention is focused on the operation of the data processing unit 120.

As shown in fig. 10, in this operation, first, the data processing unit 120 inputs event detection data from the DVS 200 (step S101). Note that since the DVS 200 operates asynchronously as described above, when the occurrence of an address event is detected, event detection data is input from the DVS 200 to the data processing unit 120 as necessary.

The data processing unit 120 then specifies one or more areas (hereinafter referred to as event detection areas) in which the occurrence of an address event is detected, based on the address information included in the input event detection data (step S102). For example, in a case where the objects shown in fig. 1 are included within the angle of view of the DVS 200, the data processing unit 120 specifies the region R1 of the traffic light 51 and the regions R2 and R3 of the tail lamps, brake lamps, and the like of the preceding vehicles 52 and 53. Note that, in step S102, one or more event detection areas in which the occurrence of an address event is detected may be specified based on, for example, one or more pieces of event detection data input within a predetermined time (e.g., 1 millisecond (ms)).

Next, the data processing unit 120 counts the number of address events (hereinafter, referred to as the number of detected events) N whose occurrence is detected per unit time (for example, 1 second) for each event detection area specified in step S102 (step S103). Note that the number N of detected events is a value for determining whether each event detection region includes a flicker component, and is a feature value corresponding to the frequency of the flicker component. As the number N of detected events, a value counted for each address (i.e., pixel) belonging to each event detection area may be used, or a maximum value, an average value, or the like of the value counted for each address in each event detection area may be used.

Next, the data processing unit 120 determines whether there is an event detection area in which the number N of detected events is equal to or greater than a preset threshold Nth (step S104). In the absence of an event detection area in which the number N of detection events is equal to or greater than the threshold Nth (no in step S104), the data processing unit 120 determines that no flicker object is detected, and proceeds to step S108.

In contrast, in the case where there is an event detection region in which the number N of detected events is equal to or greater than the threshold Nth (yes in step S104), the data processing unit 120 specifies a region in which flicker is detected (hereinafter, referred to as a flicker detection region) from the above-described one or more event detection regions (step S105).

Next, the data processing unit 120 detects the object imaged in each flicker detection area based on the edge shape of the flicker detection area specified in step S105 and the number N of detection events counted for the flicker detection area (step S106).

Note that the object in each flicker detection area may be detected based on, for example, the edge shape of the object specified from the address information and characteristics such as the frequency and duty ratio of the flicker component specified from the number N of detected events. For example, the object may be detected by registering in the memory 121 a correspondence between objects and their edge shapes, flicker characteristics, and the like, and performing pattern matching between the registered edge shapes, flicker characteristics, and the like and the detected edge shape, flicker characteristics, and the like. Alternatively, the object may be detected by machine learning using the edge shape, the flicker characteristics, and the like as inputs.
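To make steps S102 to S106 concrete, the following sketch (hypothetical function and variable names; the grouping by 4-neighbour connectivity, the thresholds, and the frequency-range matching criterion are illustrative assumptions, not values from the present disclosure) groups events into event detection areas, counts the number N of detected events per unit time, keeps areas with N equal to or greater than the threshold as flicker detection areas, and matches each area against registered flicker characteristics.

```python
from collections import defaultdict

def detect_flicker_objects(events, unit_time_s, n_threshold, registered_objects):
    """events: iterable of (x, y, timestamp_s) address-event records collected
    over one unit time. registered_objects: list of dicts with 'name',
    'min_hz' and 'max_hz' (a fuller version would also hold an edge-shape
    template for pattern matching)."""
    # Steps S102/S103: count events per pixel over the unit time.
    per_pixel = defaultdict(int)
    for x, y, _t in events:
        per_pixel[(x, y)] += 1

    # Group adjacent active pixels into event detection areas
    # (4-neighbour connected components).
    areas, seen = [], set()
    for start in per_pixel:
        if start in seen:
            continue
        stack, area = [start], []
        seen.add(start)
        while stack:
            x, y = stack.pop()
            area.append((x, y))
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in per_pixel and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        areas.append(area)

    # Steps S104/S105: keep areas whose event count N reaches the threshold Nth.
    detections = []
    for area in areas:
        n_events = max(per_pixel[p] for p in area)   # or an average value
        if n_events < n_threshold:
            continue
        # Step S106: use N per unit time as a frequency-like flicker feature;
        # a real implementation would also match the edge shape of the area.
        freq_hz = n_events / unit_time_s
        for obj in registered_objects:
            if obj["min_hz"] <= freq_hz <= obj["max_hz"]:
                detections.append((obj["name"], area))
                break
    return detections
```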

Next, the data processing unit 120 outputs the result of the object detection processing performed as described above to, for example, the integrated control unit 12050 via the communication network 12001 (step S107), and proceeds to step S108.

In step S108, the data processing unit 120 determines whether to end the operation, and in the case of ending (yes in step S108), ends the operation. In contrast, in the case of continuing (no in step S108), the data processing unit 120 returns to step S101 and performs the subsequent operation.

1.8 Actions and effects

As described above, according to the present embodiment, the object can be detected in consideration of not only the edge shape of the object but also the characteristics of flicker generated by the object and the like. Therefore, the object can be detected at higher speed and with higher accuracy.

2. Second embodiment

Next, the second embodiment is described in detail with reference to the drawings. Note that in the following description, configurations and operations similar to those of the first embodiment are referred to, and a description thereof is not repeated.

In the first embodiment described above, the case where the object emitting the flicker component is detected using the event detection data obtained from the DVS 200 is exemplified. In contrast, in the second embodiment, a case where an object emitting a flicker component is detected using image data obtained by a normal image sensor in addition to event detection data is described by way of example.

For example, as shown in fig. 11, in the case where the road surface is wet with rainwater or the like, light output from the headlights of the oncoming vehicle 55 is reflected on the road surface; in this case, within the angle of view G2, not only the region R5 of the headlights but also the region R5a on the road surface reflecting the headlight light has a flicker component. Therefore, in object detection based only on the flicker component, erroneous detection may occur, for example, an object may be detected on a road surface where nothing actually exists. Note that, in fig. 11, the region R4 corresponding to a street lamp is also detected as a region having a flicker component.

Therefore, in the present embodiment, by combining the event detection data and the image data, the occurrence of erroneous detection is suppressed, and more accurate object detection is made possible.

2.1 Configuration example of object detection apparatus (or system)

Fig. 12 is a block diagram showing a schematic configuration example of the object detection apparatus (or system) of the second embodiment. As shown in fig. 12, the object detection apparatus 2 is provided with an imaging lens 11, a DVS 200, a flicker detection unit 12, an imaging lens 13, an image sensor (second solid-state imaging apparatus) 14, an object detection unit 15, an action mode determination unit 16, a storage unit 17, and an interface (I/F) unit 18. In this configuration, the imaging lens 11 and the DVS 200 may be similar to the imaging lens 11 and the DVS 200 according to the first embodiment.

The imaging lens 13 is an example of an optical system that condenses incident light and forms an image thereof on a light receiving surface of the image sensor 14, as in the case of the imaging lens 11. The light receiving surface may be a surface where photoelectric conversion elements are arranged in the image sensor 14.

The image sensor 14 may be various image sensors capable of obtaining image data, such as a CCD (charge coupled device) image sensor and a CMOS (complementary metal oxide semiconductor) image sensor. The image sensor 14 has a configuration, for example, included in the imaging unit 12031 in fig. 8, and is mounted on the vehicle 12100 to image at substantially the same angle of view as that of the DVS 200.

The flicker detection unit 12 specifies one or more flicker detection areas based on, for example, event detection data input from the DVS 200. The flicker detection unit 12 may be, for example, the data processing unit 120 in the first embodiment. In this case, the flicker detection unit 12 can not only specify the flicker detection areas but also detect the subject imaged in each flicker detection area.

The object detection unit 15 detects an object imaged in each flicker detection area by using the flicker detection area detected by the flicker detection unit 12 and the image data input from the image sensor 14.

The action mode determination unit 16 determines the action mode of the vehicle 12100 based on, for example, the object detected by the flicker detection unit 12 or the object detected by the object detection unit 15. The action mode determination unit 16 may be, for example, the microcomputer 12051 in fig. 8, or may be a computer different from the microcomputer 12051.

The storage unit 17 stores various programs and data necessary for the action mode determination unit 16 to determine the action mode of the vehicle 12100.

For example, when the action mode determination unit 16 is a computer different from the microcomputer 12051, the I/F unit 18 is an interface for transmitting and receiving data to and from the integrated control unit 12050 via the communication network 12001. In contrast, in the case where the action mode determination unit 16 is the microcomputer 12051, the I/F unit 18 corresponds to, for example, the in-vehicle network I/F 12053.

2.2 Object detection operation example

Next, an operation example of the object detection apparatus 2 according to the second embodiment is described. Fig. 13 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the second embodiment. Note that in the following description, an operation similar to that described with reference to fig. 10 in the first embodiment is cited, and the description thereof is not repeated.

As shown in fig. 13, in this operation, first, the flicker detection unit 12 performs operations similar to those described in steps S101 to S105 in fig. 10 in the first embodiment, thereby specifying one or more flicker detection regions. Note that region information (for example, address information) for specifying each flicker detection region is input from the flicker detection unit 12 to the object detection unit 15.

Next, image data from the image sensor 14 and the area information of the flicker detection areas are input to the object detection unit 15 (step S201). Note that the image sensor 14 may periodically output image data at a predetermined frame rate, or may output image data at a timing instructed by the object detection unit 15 or the control unit 130 (refer to fig. 2).

The object detection unit 15, to which the image data is input, specifies a region of an object (hereinafter referred to as an object detection area) that includes a flicker detection area in the input image data (step S202). For example, in the example shown in fig. 11, the region of the vehicle 55 including the region R5 corresponding to a flicker detection area is specified as the object detection area R6. On the other hand, since the region R5a corresponding to a flicker detection area is not included in any specific object, the object detection unit 15 does not specify an object detection area for the region R5a.

Next, the object detection unit 15 detects the object imaged in each object detection area based on the edge shape of the object included in the object detection area specified in step S202 and the number N of detected events counted for the flicker detection area overlapping that object detection area (step S203). Note that, as in step S106 in fig. 10, pattern matching, machine learning, or the like may be used for the object detection.
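One possible way to realize steps S202 and S203 is sketched below (hypothetical names; representing both kinds of regions as bounding boxes and using a simple overlap test are assumptions made for illustration): flicker detection areas that do not fall within any object detection area obtained from the image data, such as a headlight reflection on a wet road surface, are simply discarded.

```python
def boxes_overlap(a, b):
    """a, b: (x0, y0, x1, y1) bounding boxes; True if they intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def mask_flicker_by_objects(flicker_areas, object_areas):
    """flicker_areas: list of (bounding_box, n_events) from the DVS side.
    object_areas: list of (bounding_box, edge_shape) from the image data.
    Returns, for each object detection area overlapping a flicker detection
    area, the features used for the detection in step S203."""
    kept = []
    for f_box, n_events in flicker_areas:
        for o_box, edge_shape in object_areas:
            if boxes_overlap(f_box, o_box):
                kept.append((o_box, edge_shape, n_events))
                break
    return kept
```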

Next, as in step S107 and the subsequent steps in fig. 10, the object detection unit 15 outputs the object detection result to, for example, the integrated control unit 12050 via the communication network 12001 (step S107), and thereafter determines in step S108 whether to end the operation.

2.3 action and Effect

As described above, according to the present embodiment, the flicker detection region in which the flicker component is detected is masked by the object detection region specified from the image data. This makes it possible to reduce or avoid false detection, such as detecting an object on a road surface that does not actually exist, and to detect objects with higher accuracy.

Other configurations, operations, and effects may be similar to those of the above-described embodiments, and thus detailed descriptions thereof are omitted herein.

2.4 modifications

In the second embodiment described above, the case is exemplified in which false detection is reduced or avoided by masking the flicker information (e.g., the flicker detection region and/or the number N of detected events) obtained from the event detection data with the object information (e.g., the object detection region) obtained from the image data. However, the method of reducing or avoiding false detection by combining flicker information and object information is not limited to this. For example, false detection may also be reduced or avoided by using the flicker information to assist object detection on the image data.

Fig. 14 is a flowchart showing an example of an object detection operation according to a variation of the second embodiment. As shown in fig. 14, in this variation, first, image data is input to the object detection unit 15 (step S221). The object detection unit 15 to which the image data is input performs object detection processing on the input image data, thereby specifying one or more regions of the object (object detection regions) included in the image data (step S222). For example, in the example shown in fig. 11, the object detection unit 15 specifies the object detection region R6 including the vehicle 55. Note that region information (for example, address information) for specifying each object detection region is input from the object detection unit 15 to the flicker detection unit 12.

Next, the flicker detection unit 12 counts the number N of events detected per unit time in each object detection region (step S223). Note that, as the number N of detected events, the value counted for each address (i.e., pixel) belonging to the object detection region may be used, or the maximum value, the average value, or the like of the values counted for each address within the region may be used. The number N of detected events for each object detection region is input from the flicker detection unit 12 to the object detection unit 15.
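
A minimal sketch of the counting in step S223 is shown below, assuming a hypothetical 2D per-pixel event-count array and (x0, y0, x1, y1) object detection regions; whether the maximum or the average is used is left as a parameter, mirroring the note above.

import numpy as np

def count_events_per_region(event_counts, object_boxes, mode="max"):
    counts = np.asarray(event_counts)
    results = []
    for (x0, y0, x1, y1) in object_boxes:
        patch = counts[y0:y1, x0:x1]
        if patch.size == 0:
            results.append(0)
        elif mode == "max":
            results.append(int(patch.max()))      # maximum N within the region
        else:
            results.append(float(patch.mean()))   # average N within the region
    return results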

Next, the object detection unit 15 detects the object imaged in each object detection region based on the edge shape of the object included in the object detection region and the number N of events detected in the object detection region in the image data specified in step S222 (step S224). Note that, as in step S106 in fig. 10, pattern matching, machine learning, or the like may be used for the object detection.

Next, as in step S107 and the subsequent steps in fig. 10, the object detection unit 15 outputs the object detection result to, for example, the integrated control unit 12050 via the communication network 12001 (step S107), and thereafter determines whether or not to end the operation (step S108).

According to the operation described above, since the object detection for the image data is supported by using the flicker information, it is possible, as in the second embodiment, to reduce or avoid false detection such as detecting an object on a road surface that does not actually exist, and to perform object detection with higher accuracy.

3. Third embodiment

Next, the third embodiment is described in detail with reference to the drawings. Note that in the following description, configurations and operations similar to those of the above-described embodiments are referred to, and a description thereof is not repeated.

For example, an object that emits a flicker component, such as a traffic light, an electronic bulletin board, or a car light, is not a point but has a certain area. In contrast, most of the false detections output from the DVS 200 are very small regions of one or a few points.

Therefore, in the present embodiment, a lower limit is set on the size of the region designated as a flicker detection region, and a region smaller than this lower limit is not set as a flicker detection region, thereby reducing false detection of objects and reducing the amount of data to be processed in the object detection processing.

A configuration example of the object detection apparatus (or system) according to the present embodiment may be similar to, for example, the configuration example of the object detection apparatus 1 illustrated in the first embodiment or the object detection apparatus 2 illustrated in the second embodiment.

3.1 example of object detection operations

Fig. 15 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the third embodiment. Note that in the following description, operations similar to those of the above-described embodiments are cited, and descriptions thereof are not repeated. Further, in the following description, a case based on the operation described with reference to fig. 13 in the second embodiment is exemplified, but the base embodiment is not limited to the second embodiment, and may be any other embodiment described above or later or a variation thereof.

As shown in fig. 15, in the object detection operation according to the present embodiment, for example, in the same operation as that described with reference to fig. 13 in the second embodiment, step S301 is added between step S104 and step S105.

In step S301, the flicker detection unit 12 determines whether each event detection area designated in step S102 contains a certain number or more of pixels (i.e., addresses) for which the number N of detected events is equal to or greater than the threshold Nth. Note that the certain number may be, for example, a number corresponding to a size larger than the typical area size of a falsely detected address event.

In the case where there is no event detection area containing a certain number or more of pixels for which the number N of detected events is equal to or greater than the threshold Nth (no in step S301), the operation proceeds to step S108. In contrast, in the case where such an event detection area exists (yes in step S301), the flicker detection unit 12 designates the event detection area containing a certain number or more of pixels for which the number N of detected events is equal to or greater than the threshold Nth as a flicker detection area (step S105), and performs the subsequent operations.
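
The added check of step S301 amounts to counting, within each event detection area, the pixels whose event count reaches Nth and comparing that count against the lower limit; the sketch below assumes hypothetical array inputs and is only an illustration.

import numpy as np

def passes_size_check(event_counts, region_mask, n_threshold, min_pixels):
    # region_mask: hypothetical boolean mask selecting the pixels of one event detection area.
    counts_in_region = np.asarray(event_counts)[np.asarray(region_mask)]
    num_flicker_pixels = int((counts_in_region >= n_threshold).sum())
    # Promote to a flicker detection area only if it is large enough.
    return num_flicker_pixels >= min_pixels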

3.2 actions and effects

As described above, according to the present embodiment, since only event detection areas of a certain size or larger are designated as flicker detection areas, false detection of objects is reduced, enabling more accurate object detection, and the amount of data to be processed in the object detection processing is reduced, enabling higher-speed object detection.

Other configurations, operations, and effects may be similar to those of the above-described embodiments, and thus detailed descriptions thereof are omitted herein.

4. Fourth embodiment

Next, the fourth embodiment is described in detail with reference to the drawings. Note that in the following description, configurations and operations similar to those of the above-described embodiments are referred to, and a description thereof is not repeated.

In the above-described embodiments, the occurrence of the address event is monitored without changing the resolution between the flicker detection region and the other regions. Further, in the second and third embodiments, image data of the same resolution is read from the flicker detection region and the other regions.

In contrast, in the fourth embodiment, the resolution at which the occurrence of the address event is monitored and/or the resolution of the image data read from the image sensor 14 is changed between the flicker detection region and the other regions.

This can reduce the data processing amount in the region other than the flicker detection region, for example, and can realize high-speed object detection.

A configuration example of the object detection apparatus (or system) according to the present embodiment may be similar to, for example, the configuration example of the object detection apparatus 1 illustrated in the first embodiment or the object detection apparatus 2 illustrated in the second embodiment.

4.1 example of object detection operations

Fig. 16 is a flowchart showing an example of an object detection operation performed by the object detection apparatus according to the fourth embodiment. Note that in the following description, operations similar to those of the above-described embodiments are cited, and descriptions thereof are not repeated. In the following description, the operation shown in fig. 15 according to the third embodiment is taken as the base, but the base embodiment is not limited to the third embodiment, and may be any other embodiment or variation thereof described above or later.

As shown in fig. 16, in the object detection operation according to the present embodiment, for example, in the same operation as that described with reference to fig. 15 in the third embodiment, step S401 is added between step S107 and step S108.

In step S401, the flicker detection unit 12 sets the regions other than the flicker detection regions specified in step S105 as low resolution regions. The setting of the low resolution regions may be made, for example, to the control unit 130.

Then, for the low resolution region in the DVS 200, the control unit 130 lowers the resolution at which the occurrence of the address event is monitored. For example, for the low resolution region in the DVS 200, the control unit 130 controls the DVS 200 so as to stop monitoring the occurrence of the address event in the unit pixels 310 in the odd-numbered rows and to monitor the occurrence of the address event only in the unit pixels 310 in the even-numbered rows.

Further, the control unit 130 reduces the resolution of the image data read from the low resolution region in the image sensor 14. For example, for the low resolution region in the image sensor 14, the control unit 130 generates image data by reading pixel signals only from the pixels in the odd-numbered rows.
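
As a simplified, row-based illustration of step S401 (the actual regions are two-dimensional), the sketch below builds a per-row monitoring/readout mask that keeps every row inside flicker detection regions and skips the odd-numbered rows elsewhere; all names are hypothetical.

def build_row_monitor_mask(num_rows, flicker_rows):
    # flicker_rows: assumed set of row indices covered by flicker detection regions.
    monitor = []
    for r in range(num_rows):
        if r in flicker_rows:
            monitor.append(True)           # full resolution inside flicker detection regions
        else:
            monitor.append(r % 2 == 0)     # low resolution: skip odd-numbered rows
    return monitor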

4.2 actions and effects

As described above, according to the present embodiment, the data processing amount of the region other than the flicker detection region can be reduced, and the object detection can be performed at a higher speed.

Other configurations, operations, and effects may be similar to those of the above-described embodiments, and thus detailed descriptions thereof are omitted herein.

5. Fifth embodiment

Next, the fifth embodiment is described in detail with reference to the drawings. Note that in the following description, configurations and operations similar to those of the above-described embodiments are referred to, and a description thereof is not repeated.

In the fifth embodiment, the action mode determination operation performed by the microcomputer 12051 or the action mode determination unit 16 in the above-described embodiments is described with some examples. Note that in the following description, it is assumed for clarity that the action mode determination unit 16 performs the action mode determination operation.

5.1 first example

Fig. 17 is a flowchart showing an action mode determination operation according to the first example. In the first example, a case is exemplified in which the object detection device 1 or 2 according to the above-described embodiments detects a traffic light while the vehicle 12100 waits at the traffic light. Further, it is assumed that, at this time, the engine has been automatically stopped by an idle stop system mounted on the vehicle 12100.

As shown in fig. 17, in the first example, the action mode determination unit 16 first waits for an object detection result to be input from the object detection apparatus 1 or 2 (no in step S501), and when an object detection result is input (yes in step S501), recognizes that a traffic light has been detected on the basis of the object detection result (step S502).

Next, the action mode determination unit 16 specifies the current illumination color of the traffic light from the object detection result (step S503). Then, in the case where the current illumination color is blue (yes in step S504), the action mode determination unit 16 starts the engine (step S505), and proceeds to step S507. At this time, if the automatic driving function of the vehicle 12100 is active, the action mode determination unit 16 may start forward movement of the vehicle 12100 or the like together with the start of the engine.

In contrast, in the case where the current illumination color is a color other than blue, for example red (no in step S504), the action mode determination unit 16 keeps the engine stopped (step S506), and proceeds to step S507.

In step S507, the action mode determination unit 16 determines whether or not to end the operation, and in the case of ending (yes in step S507), ends the operation. In contrast, in the case of continuing (no in step S507), the action mode determination unit 16 returns to step S501 and performs the subsequent operations.
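
The decision flow of fig. 17 can be summarized by the sketch below; the dictionary-style detection result with "label" and "color" fields, and the behavior when no traffic light is recognized, are assumptions made only to keep the example self-contained.

def decide_idle_stop_action(detection):
    # detection: hypothetical result such as {"label": "traffic light", "color": "blue"}.
    if detection.get("label") != "traffic light":
        return "keep_stopped"
    if detection.get("color") == "blue":
        return "start_engine"      # and, if automatic driving is active, start moving forward
    return "keep_stopped"          # e.g. red: remain stopped and keep waiting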

5.2 second example

Fig. 18 is a flowchart showing an action mode determination operation according to the second example. In the second example, a case is illustrated in which an oncoming vehicle is detected by the object detection device 1 or 2 according to the above-described embodiments while the vehicle 12100 is traveling.

As shown in fig. 18, in the second example, when an object detection result is input from the object detection device 1 or 2 (yes in step S521), the action mode determination unit 16 first detects the headlights of an oncoming vehicle based on the object detection result (step S522). Subsequently, the action mode determination unit 16 detects the oncoming vehicle from the region including the headlights in the object detection result (step S523).

When the presence of the oncoming vehicle is detected in this way, the action mode determination unit 16 drives the audio image output unit 12052 in fig. 8 so as to issue a warning notifying the driver of the presence of the oncoming vehicle from, for example, the audio speaker 12061 and/or to display the presence of the oncoming vehicle on the display unit 12062 (step S524), and proceeds to step S525.

In step S525, the action mode determination unit 16 determines whether or not to end the operation, and in the case of ending (yes in step S525), ends the operation. In contrast, in the case of continuing (no in step S525), the action mode determination unit 16 returns to step S521 and performs the subsequent operations.

5.3 third example

Fig. 19 is a flowchart showing an action mode determination operation according to the third example. In the third example, another action mode determination operation is illustrated for a case in which an oncoming vehicle is detected by the object detection device 1 or 2 according to the above-described embodiments while the vehicle 12100 is traveling.

As shown in fig. 19, in the third example, the action mode determination unit 16 first detects an oncoming vehicle by performing operations similar to those described in steps S521 to S523 in fig. 18 in the second example.

Next, the action mode determination unit 16 calculates a motion vector of the oncoming vehicle from the object detection results continuously input within a predetermined period (step S531). Note that, in step S531, for example, the center of gravity or the center of the object detection region corresponding to the oncoming vehicle may be calculated, and the motion vector of the oncoming vehicle may be calculated from the amount and direction of movement of the center of gravity or the center per unit time.

Next, the action mode determination unit 16 determines whether the motion vector calculated in step S531 is directed toward the host vehicle (vehicle 12100) (step S532). In the case where the motion vector is not directed toward the host vehicle 12100 (no in step S532), the action mode determination unit 16 determines that the risk of collision is low or nonexistent, and proceeds to step S525.

In contrast, in the case where the motion vector of the oncoming vehicle is directed toward the host vehicle 12100 (yes in step S532), the action mode determination unit 16 then determines whether the magnitude M of the motion vector of the oncoming vehicle is equal to or greater than a preset threshold Mth (step S533). In the case where the magnitude M of the motion vector is smaller than the threshold Mth (no in step S533), the action mode determination unit 16 determines that the collision risk is low or nonexistent, and proceeds to step S525.

Further, in the case where the magnitude M of the motion vector is equal to or larger than the threshold Mth (yes in step S533), the action mode determination unit 16 detects a risk of collision (step S534). Then, the action mode determination unit 16 executes automatic braking control, for example, by controlling the drive system control unit 12010 of fig. 8, and drives the audio image output unit 12052 of fig. 8 so that a warning notifying the driver of the risk of collision with the oncoming vehicle is issued from the audio speaker 12061 and/or the risk of collision with the oncoming vehicle is displayed on the display unit 12062 (step S535), and proceeds to step S525.
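
The motion-vector test of steps S531 to S534 can be pictured as follows; the 2D coordinates of the region centers, the use of the first and last samples in the period, and the dot-product test for "directed toward the host vehicle" are all illustrative assumptions.

import numpy as np

def assess_collision_risk(centers, timestamps, host_position, m_threshold):
    # centers: hypothetical list of (x, y) centers of the object detection region
    # corresponding to the oncoming vehicle, sampled over a predetermined period.
    p0, p1 = np.asarray(centers[0], float), np.asarray(centers[-1], float)
    dt = timestamps[-1] - timestamps[0]            # assumed to be positive
    velocity = (p1 - p0) / dt                      # motion vector per unit time
    to_host = np.asarray(host_position, float) - p1
    heading_toward_host = float(np.dot(velocity, to_host)) > 0.0
    magnitude = float(np.linalg.norm(velocity))    # magnitude M
    return heading_toward_host and magnitude >= m_threshold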

In step S525, as in the second example, the action mode determination unit 16 determines whether or not to end the operation; in the case of ending (yes in step S525), it ends the operation, and in the case of continuing (no in step S525), it returns to step S521 and performs the subsequent operations.

5.4 fourth example

Fig. 20 is a flowchart showing an action mode determination operation according to the fourth example. In the fourth example, a case is illustrated in which the object detection device 1 or 2 according to the above-described embodiments detects that a blinker (flashing light) of a preceding vehicle has started flashing while the vehicle 12100 is traveling.

As shown in fig. 20, in the fourth example, when an object detection result is input from the object detection device 1 or 2 (yes in step S541), the action mode determination unit 16 detects the blinker of a vehicle based on the object detection result (step S542). Subsequently, the action mode determination unit 16 detects the preceding vehicle from the region including the blinker in the object detection result (step S543).

Next, the action mode determination unit 16 specifies the center of the object detection region corresponding to the detected preceding vehicle (step S544), and specifies the route change direction indicated by the preceding vehicle with the blinker from the positional relationship between the specified center position and the flicker detection region corresponding to the blinker (step S545).

Next, the action mode determination unit 16 drives the audio image output unit 12052 in fig. 8, for example, so as to issue a warning notifying the driver of the route change of the preceding vehicle from the audio speaker 12061 and/or to display the route change of the preceding vehicle on the display unit 12062 (step S546), and proceeds to step S525.
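
The direction decision of step S545 can be illustrated as below; the (x0, y0, x1, y1) box format and the simple left/right rule based on the horizontal position of the blinker relative to the vehicle center are assumptions made for illustration only.

def route_change_direction(vehicle_box, blinker_center):
    # vehicle_box: hypothetical object detection region of the preceding vehicle.
    # blinker_center: hypothetical center of the flicker detection region of the blinker.
    x0, _, x1, _ = vehicle_box
    vehicle_center_x = (x0 + x1) / 2.0
    return "right" if blinker_center[0] > vehicle_center_x else "left"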

In step S525, the action mode determination unit 16 determines whether or not to end the operation, and in the case of ending (yes in step S525), ends the operation. In contrast, in the case of continuing (no in step S525), the action mode determination unit 16 returns to step S521 and performs the subsequent operations.

Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present disclosure. Further, the components of the different embodiments and variations may be combined as appropriate.

Further, the effects described in each embodiment of the present specification are merely illustrative and not restrictive; other effects may also be present.

Note that the present technology may have the following configuration.

(1)

An object detection apparatus has:

a first solid-state imaging device provided with a plurality of pixels arranged in a matrix, the first solid-state imaging device detecting occurrence of an event in the pixels according to an amount of light incident on each of the pixels;

a flicker detection unit that generates flicker information based on an occurrence of an event detected by the first solid-state imaging device; and

an object detection unit that detects an object based on flicker information detected by the first solid-state imaging device.

(2)

The object detecting apparatus according to the above (1), wherein,

the flicker detection unit specifies an event detection area in which occurrence of an event is detected among the plurality of pixels, and generates flicker information based on the number of occurrence events detected per predetermined time for the event detection area.

(3)

The object detecting device according to the above (2), wherein

The flicker detection unit generates flicker information based on a maximum value or an average value of the number of occurrence events detected per predetermined time in each pixel belonging to the event detection area.

(4)

The object detecting apparatus according to the above (2) or (3), wherein,

the flicker information includes the edge shape of the event detection area and the number of occurrence events detected per predetermined time.

(5)

The object detection apparatus according to any one of the above (2) to (4), wherein,

the flicker detection unit sets, as a flicker detection region, a region of the event detection region in which pixels for which the number of occurrence events detected per predetermined time is a first threshold or more are arranged;

the object detection unit detects an object imaged in the flicker detection area based on the flicker information.

(6)

The object detecting apparatus according to the above (5), wherein,

the flicker detection unit sets, as a flicker detection region, a region in the event detection region in which the number of the occurrence events detected per predetermined time is a first threshold or more and the number of pixels is a second threshold or more.

(7)

The object detection device according to the above (2), further comprising:

a second solid-state imaging device that obtains image data, wherein,

the object detection unit detects an object based on the flicker information and the image data.

(8)

The object detecting apparatus according to the above (7), wherein,

the flicker detection unit specifies an event detection area in which occurrence of an event is detected among the plurality of pixels, and generates flicker information based on the number of occurrence events detected per predetermined time for the event detection area,

the object detection unit specifies an object detection area that includes an area on the image data corresponding to the event detection area and corresponds to the object, and detects the object based on the object detection area and the flicker information.

(9)

The object detecting apparatus according to the above (7), wherein,

the object detection unit specifies an object detection area corresponding to an object from the image data;

the flicker detection unit generates flicker information based on the number of occurrence events detected per predetermined time for an area corresponding to the object detection area among the plurality of pixels, and

the object detection unit detects an object based on the object detection area and the flicker information.

(10)

The object detection device according to the above (2), further comprising:

a control unit that controls reading of the first solid-state imaging device, wherein,

the control unit reduces the resolution at which the occurrence of the event is monitored for a region, among the plurality of pixels, in which pixels other than the pixels included in the event detection area are arranged.

(11)

The object detection device according to the above (7), further comprising:

a control unit that controls reading of the second solid-state imaging device, wherein,

the control unit reduces the resolution of image data read from an area other than the area corresponding to the event detection area in the second solid-state imaging device.

(12)

An object detection system having:

a first solid-state imaging device provided with a plurality of pixels arranged in a matrix, the first solid-state imaging device detecting occurrence of an event in the pixels according to an amount of light incident on each of the pixels;

a flicker detection unit that generates flicker information based on an occurrence of an event detected by the first solid-state imaging device; and

an object detection unit that detects an object based on flicker information detected by the first solid-state imaging device.

(13)

An object detection method having:

generating flicker information based on an occurrence of an event detected by a solid-state imaging device provided with a plurality of pixels arranged in a matrix, the solid-state imaging device detecting the occurrence of the event in the pixels according to an amount of light incident on each pixel; and

an object is detected based on the flicker information.

Description of the symbols

1, 2 object detection device (system)

11, 13 imaging lens

12 flicker detection unit

14 image sensor

15 object detection unit

16 action mode determination unit

17 storage unit

18I/F unit

120 data processing unit

121 memory

122 DSP

130 control unit

200 DVS

210 logic circuit

211 drive circuit

212 signal processing unit

213 arbitrator

300 pixel array unit

310 unit pixel

330 light receiving unit

333 photoelectric conversion element

400 address event detection unit

410 current-voltage conversion unit

411, 413 LG transistor

412, 414 amplifying transistor

415 constant current circuit

420 buffer

430 subtracter

431, 433 capacitor

432 inverter

434 switch

440 quantizer

441 comparator

450 transfer unit

12001 communication network

12010 drive system control unit

12030 vehicle external information detecting unit

12031, 12101 to 12105 imaging unit

12050 Integrated control Unit

12051 microcomputer

12052 audio image output unit

12053 in-vehicle network I/F

12061 audio speaker

12062 display unit

12100 vehicle.
