Image processing apparatus, image processing method, and program

Document No.: 621630 | Published: 2021-05-07

Note: This technology, "Image processing apparatus, image processing method, and program", was devised by 佐藤竜太, 李骏, and 山本启太郎 on 2019-09-13. Its main content is as follows: The comparison area detection unit 31 of the image processing unit 30-1 detects an image area of the fading determination object as a comparison area based on the image signal generated by the image capturing unit 20 using the color filter. The color information generating unit 32a generates color information from the image signal of the comparison area and uses the color information as comparison target information. The color information comparing unit 33a compares the color information of the fading determination object, as fading determination reference information, with the comparison target information. The fading information generating unit 35a generates, based on comparison information indicating the result of comparison between the fading determination reference information and the comparison target information, fading information indicating that the fading level of the color filter exceeds a predetermined level or indicating the fading level of the color filter. The color fading state of the color filter can thereby be accurately detected.

1. An image processing apparatus comprising:

a color fading information generating unit configured to generate color fading information indicating color fading of the color filter based on comparison target information and color fading determination reference information, the comparison target information being based on an image signal generated by an image capturing unit using the color filter.

2. The image processing apparatus according to claim 1, further comprising:

a color information generating unit configured to generate color information from the image signal and use the color information as the comparison target information; and

a color information comparison unit configured to use, as the fading determination reference information, color information obtained in a case where the color filter is not faded, and to compare the comparison target information with the fading determination reference information,

wherein the fading information generating unit generates the fading information based on the comparison result of the color information comparing unit.

3. The image processing apparatus according to claim 2, further comprising:

a comparison area detection unit configured to detect a comparison area indicating an image area of a fading determination object based on the image signal,

wherein the color information generating unit generates the color information from the image signal of the comparison area detected by the comparison area detecting unit and uses the color information as the comparison target information; and

the color information comparison unit uses the color information of the fading determination object as the fading determination reference information.

4. The image processing apparatus according to claim 3,

the fading information generating unit estimates a level of fading based on the comparison result of the color information comparing unit, and

the comparison area detection unit detects the comparison area according to the level of fading estimated by the fading information generation unit.

5. The image processing apparatus according to claim 3,

the comparison area detection unit detects the comparison area within a preset image area.

6. The image processing apparatus according to claim 2,

the image capturing unit is provided with color component pixels provided with color filters and reference pixels in which color fading does not occur,

the color information generating unit generates color information from the image signal generated in the color component pixel and uses the color information as the comparison target information, and

the color information comparison unit uses color information generated based on the signal of the reference pixel as the fading determination reference information.

7. The image processing apparatus according to claim 6,

the reference pixels include pixels not provided with color filters.

8. The image processing apparatus according to claim 6,

the reference pixels include pixels provided with a spectral filter in place of the color filter.

9. The image processing apparatus according to claim 6, further comprising:

an image capturing unit provided with the color component pixels and the reference pixels.

10. The image processing apparatus according to claim 2, further comprising:

a comparison information accumulation unit configured to accumulate comparison information indicating a comparison result of the color information comparison unit,

wherein the fading information generating unit generates the fading information based on the comparison information accumulated by the comparison information accumulating unit.

11. The image processing apparatus according to claim 1, further comprising:

an illuminance information accumulation unit configured to accumulate illuminance information generated based on the image signal,

wherein the fading information generating unit uses the illuminance information accumulated by the illuminance information accumulating unit as the comparison target information, and uses information indicating a relationship between a fading level of the color filter and an accumulation result of the illuminance information as the fading determination reference information.

12. The image processing apparatus according to claim 11, further comprising:

an environment information accumulation unit configured to accumulate environment information at the time of generating the image signal and include the environment information in the comparison target information,

wherein the fading information generating unit includes the environmental information accumulated by the environmental information accumulating unit in the comparison target information, and uses information indicating a relationship between a fading level of the color filter and an accumulation result of illuminance information and environmental information as the fading determination reference information.

13. The image processing apparatus according to claim 1,

the fading information indicates that a level of fading of the color filter exceeds a predetermined level, or indicates a level of fading of the color filter.

14. An image processing method comprising:

generating, by a fading information generating unit, fading information indicating fading of a color filter based on comparison target information and fading determination reference information, the comparison target information being based on an image signal generated by an image capturing unit using the color filter.

15. A program for causing a computer to execute generation of information on a color filter used in an image capturing unit, the program being configured to cause the computer to execute:

generating comparison target information based on an image signal generated by the image capturing unit using the color filter; and

generating fading information indicating fading of the color filter based on the comparison target information and fading determination reference information.

Technical Field

The present technology relates to an image processing apparatus, an image processing method, and a program, and enables accurate detection of the color fading state of a color filter.

Background

For a color filter mounted on the front surface of an image capturing element, it is known that long-time exposure accelerates fading and deteriorates performance. Therefore, in Patent Document 1, a notification is issued when the accumulated exposure time, that is, the sum of the times during which the shutter is open, exceeds the limit exposure time of the image capturing element.

CITATION LIST

Patent document

Patent Document 1: Japanese Patent Application Laid-Open No. 2011-

Disclosure of Invention

Problems to be solved by the invention

However, in capturing a moving image, the amount of light incident on the image capturing element differs between, for example, a dark scene and a bright scene, even if the image capturing time is the same. Therefore, the fading state cannot be accurately determined from the accumulated exposure time, the sum of the times during which the shutter is open, as in Patent Document 1.

Therefore, an object of the present technology is to provide an image processing apparatus, an image processing method, and a program capable of accurately detecting the color fading state of a color filter.

Solution to the problem

A first aspect of the present technology is

An image processing apparatus comprising:

a color fading information generating unit configured to generate color fading information indicating color fading of the color filter based on comparison target information and color fading determination reference information, the comparison target information being based on an image signal generated by an image capturing unit using the color filter.

In the present technology, color information generated from an image signal generated by an image capturing unit using a color filter is used as comparison target information. Further, color information in the case where the color filter is not faded is used as fading determination reference information. The comparison target information is compared with the fading determination reference information, and fading information is generated based on the comparison result. In the image processing apparatus, for example, an image area of the fading determination object is detected as a comparison area based on an image signal generated by the image capturing unit, and color information generated from the image signal of the comparison area is used as comparison target information. Further, the detection of the comparison area may be performed according to a level of color fading estimated based on the comparison result, and the detection of the comparison area may be performed within an image area set in advance.

Further, the image capturing unit may be provided with color component pixels provided with color filters and reference pixels in which color fading does not occur. Color information generated from an image signal generated in a color component pixel may be used as comparison target information, and color information generated based on a signal of a reference pixel may be used as fading determination reference information. The reference pixel is, for example, a pixel provided with no color filter, or a pixel provided with a spectral filter instead of a color filter.

Comparison information indicating a result of comparison of the comparison target information with the fading determination reference information is accumulated, and fading information indicating that a level of fading of the color filter exceeds a predetermined level or indicating a level of fading of the color filter is generated based on the accumulated comparison information.

Further, illuminance information generated based on the image signal is accumulated, alone or together with environment information at the time of generating the image signal. The accumulated illuminance information, or the illuminance information and the environment information, is used as comparison target information. Information indicating the relationship between the fading level of the color filter and the accumulation result of the illuminance information, or of the illuminance information and the environment information, is used as fading determination reference information. Fading information indicating that the fading level of the color filter exceeds a predetermined level, or indicating the fading level of the color filter, is generated based on the comparison target information and the fading determination reference information.

A second aspect of the present technology is

An image processing method comprising:

generating, by a fading information generating unit, fading information indicating fading of a color filter based on comparison target information and fading determination reference information, the comparison target information being based on an image signal generated by an image capturing unit using the color filter.

A third aspect of the present technology is

A program for causing a computer to execute generation of information on a color filter used in an image capturing unit, the program being configured to cause the computer to execute:

generating comparison target information based on an image signal generated by the image capturing unit using the color filter; and

generating fading information indicating fading of the color filter based on the comparison target information and fading determination reference information.

Note that the program of the present technology can be provided, for example, to a general-purpose computer capable of executing various program codes, via a storage medium in a computer-readable format (for example, an optical disc, a magnetic disk, or a semiconductor memory) or via a communication medium (for example, a network). By providing such a program in a computer-readable format, processing according to the program can be realized on a computer.

Drawings

Fig. 1 is a diagram showing the configuration of an image capturing system.

Fig. 2 is a diagram illustrating the configuration of the first embodiment.

Fig. 3 is a flowchart illustrating the operation of the first embodiment.

Fig. 4 is a flowchart illustrating the operation of the first embodiment.

Fig. 5 is a diagram illustrating an object without fading.

Fig. 6 is a diagram illustrating the configuration of an image capturing unit that generates an image signal and a reference signal.

Fig. 7 is a diagram illustrating the configuration of the second embodiment.

Fig. 8 is a flowchart illustrating the operation of the second embodiment.

Fig. 9 is a flowchart illustrating the operation of the second embodiment.

Fig. 10 is a diagram illustrating the configuration of the third embodiment.

Fig. 11 is a flowchart illustrating the operation of the third embodiment.

Fig. 12 is a flowchart illustrating the operation of the third embodiment.

Fig. 13 is a block diagram showing a schematic functional configuration example of the vehicle control system.

Detailed Description

Modes for carrying out the present technology will be described below. Note that description will be made in the following order.

1. Image capturing system

2. First embodiment

2-1. Configuration of the first embodiment

2-2. Operation of the first embodiment

3. Second embodiment

3-1. Configuration of the second embodiment

3-2. Operation of the second embodiment

4. Third embodiment

4-1. Configuration of the third embodiment

4-2. Operation of the third embodiment

5. Other embodiments

6. Fading information utilizing unit

7. Application example

<1. Image capturing system>

Fig. 1 shows a configuration of an image capturing system using an image processing apparatus of the present technology. The image capturing system 10 includes an image capturing unit 20(21) and an image processing unit 30. Further, the image capturing system 10 may be provided with an environmental information generating unit 40, a fading information utilizing unit 50, and an image utilizing unit 60.

The image capturing unit 20(21) includes an image capturing element such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge Coupled Device (CCD) sensor. Further, a color filter is provided on the image capturing surface side of the image capturing element. The image capturing unit 20 performs photoelectric conversion of the subject optical image, generates an image signal corresponding to the subject optical image, and outputs the image signal to the image processing unit 30 and to the fading information utilizing unit 50 or the image utilizing unit 60. Further, as described later, the image capturing unit 21 generates an image signal and a reference signal and outputs them to the image processing unit 30.

The image processing unit 30 generates fading information indicating fading of the color filter by using comparison target information based on the image signal generated by the image capturing unit 20(21) using the color filter, and fading determination reference information for determining fading of the color filter. For example, the image processing unit 30 may generate color information from the image signal to be used as comparison target information, or may accumulate illuminance information generated based on the image signal, alone or together with environment information at the time of generating the image signal, to be used as the comparison target information. Further, the image processing unit 30 may use, as fading determination reference information, color information that is not affected by fading (e.g., color information of an object that does not fade) or color information based on a signal generated in a reference pixel in which fading does not occur. The image processing unit 30 may also use, as fading determination reference information, information indicating the relationship between the fading level of the color filter and the accumulation result of the illuminance information, or of the illuminance information and the environment information. The image processing unit 30 compares the comparison target information with the fading determination reference information, determines fading of the color filter based on the comparison result, and generates fading information. The image processing unit 30 outputs the generated fading information to the fading information utilizing unit 50.

The environment information generation unit 40 includes a sensor that detects the image capturing environment (e.g., the temperature at the time of image capturing), and outputs image capturing environment information indicating the detected temperature to the image processing unit 30.

The fading information utilizing unit 50 issues a warning about fading and executes control corresponding to fading based on the fading information generated by the image processing unit 30. For example, when the fading information indicates that fading exceeding a predetermined level has occurred, the fading information utilizing unit 50 presents a warning display or a warning sound to the user. Further, in a case where a plurality of image capturing units 20(21) are switchably provided in the image capturing system 10, switching to another image capturing unit may be performed when the fading information indicates that fading exceeding a predetermined level has occurred. Also, the fading information utilizing unit 50 may correct fading of the image signal generated by the image capturing unit 20(21) based on the fading information.

The image utilizing unit 60 performs, for example, application operations such as driving control and monitoring, recording of image signals, and the like by using the image signals acquired by the image capturing unit 20(21) or the image signals whose color fading has been corrected by the color fading information utilizing unit 50.

<2. First embodiment>

Next, a first embodiment of the present technology will be described. The first embodiment shows a case where color information generated from an image signal is used as comparison target information and color information of an object that does not fade is used as fading determination reference information.

<2-1. Configuration of the first embodiment>

Fig. 2 illustrates the configuration of the first embodiment. The image processing unit 30-1 includes a comparison area detecting unit 31, a color information generating unit 32a, a color information comparing unit 33a, a comparison information accumulating unit 34a, and a fading information generating unit 35a.

The comparison area detection unit 31 identifies a subject by using the image signal generated by the image capturing unit 20, and detects an image area of an object that does not fade as a comparison area. The comparison area detection unit 31 outputs the image signal of the detected comparison area to the color information generation unit 32a. Further, as described later, the fading information generating unit 35a may notify the comparison area detecting unit 31 of the fading level, and the comparison area detecting unit 31 may perform identification processing corresponding to the fading level. For example, the comparison area detection unit 31 switches the dictionary used for the recognition processing according to the notified fading level. If the object is recognized in this way, the comparison area can be accurately detected regardless of the fading state of the color filter. Further, the comparison area detection unit 31 may set the object recognition area in accordance with the object detected as the comparison area. For example, as described later with reference to Fig. 5, in the case of detecting a traffic signal or a lighting device as an object that does not fade, since the traffic signal or the lighting device is located above the image center of a captured image, the traffic signal or the lighting device can be detected effectively if the area above the image center is set as the subject recognition area in the captured image.

The color information generating unit 32a generates color information (e.g., color information indicating the level of each color component of the three primary colors) from the image signal of the comparison area detected by the comparison area detecting unit 31, and outputs the color information to the color information comparing unit 33a as comparison target information.

The color information comparing unit 33a stores in advance color information of an object that does not fade as fading determination reference information. The color information comparing unit 33a compares the color information (comparison target information) generated by the color information generating unit 32a with the color information (fading determination reference information) of the object detected by the comparison area detecting unit 31. The color information comparing unit 33a generates comparison information indicating the ratio (fade level) of the comparison target information to the fading determination reference information, or indicating whether fading exceeding a predetermined level has occurred. The color information comparing unit 33a outputs the generated comparison information to the comparison information accumulating unit 34a.
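
As a concrete illustration (not part of the claimed configuration), the comparison performed by the color information comparing unit 33a could look like the following minimal Python sketch; the stored reference color, the threshold, and all names are hypothetical assumptions for this example only.

```python
import numpy as np

# Hypothetical fading determination reference information: the color of one
# fading determination object (e.g., the red lamp of a traffic signal)
# recorded while the color filter was not faded.
REFERENCE_RGB = np.array([180.0, 40.0, 35.0])
FADE_RATIO_THRESHOLD = 0.8  # illustrative "predetermined level"

def generate_comparison_info(comparison_area: np.ndarray) -> dict:
    """Compare an HxWx3 RGB comparison area against the reference color.

    Returns the per-channel ratio of the observed color to the reference
    color (the fade level) and whether fading exceeds the predetermined level.
    """
    observed = comparison_area.reshape(-1, 3).mean(axis=0)  # comparison target info
    ratio = observed / REFERENCE_RGB                        # fade level per channel
    return {"ratio": ratio, "faded": bool(ratio.min() < FADE_RATIO_THRESHOLD)}
```

Each frame's result would then be appended to the comparison information accumulating unit 34a.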

The comparison information accumulation unit 34a accumulates the comparison information generated by the color information comparison unit 33a. Further, the comparison information accumulation unit 34a outputs the accumulated comparison information as the accumulation result of the comparison information to the fading information generation unit 35a.

The fading information generating unit 35a performs statistical processing or the like on the accumulated comparison information supplied from the comparison information accumulating unit 34a to calculate fading determination information. For example, based on the accumulated comparison information, when the most recently accumulated comparison information shows that fading exceeding a preset predetermined level has occurred, the fading information generating unit 35a calculates, as fading determination information, the duration of comparison results indicating fading beyond the predetermined level and the number of such durations. Further, the fading information generating unit 35a may estimate the current fading level based on the accumulated comparison information and output the estimated fading level to the comparison area detecting unit 31. In the case where the duration and the number of such durations exceed their threshold values, the fading information generating unit 35a determines that fading has occurred, generates fading information indicating that fading has occurred, and outputs the fading information to the fading information utilizing unit 50. Further, the fading information generating unit 35a may include, in the fading information, not only an indication that fading has occurred but also the fading level.
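
The duration-based determination described above can be sketched as follows, assuming frame-wise Boolean comparison results; the window length and both thresholds are illustrative assumptions, not values from the present technology.

```python
from collections import deque

class FadingInformationGenerator:
    """Minimal sketch of the fading information generating unit 35a:
    report fading once enough sufficiently long runs of "faded"
    comparison results have been accumulated."""

    def __init__(self, min_duration=30, min_run_count=3, history=1000):
        self.min_duration = min_duration      # frames a run must last
        self.min_run_count = min_run_count    # qualifying runs required
        self.results = deque(maxlen=history)  # accumulated comparison information

    def add_comparison(self, faded: bool) -> None:
        self.results.append(faded)

    def fading_detected(self) -> bool:
        runs, current = 0, 0
        for faded in self.results:
            current = current + 1 if faded else 0
            if current == self.min_duration:  # count each run once, when it
                runs += 1                     # first reaches the duration
        return runs >= self.min_run_count
```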

Note that the comparison information accumulation unit 34a is not a necessary component, and the comparison information generated by the color information comparison unit 33a may be output to the fading information generation unit 35a. In the case where the comparison information accumulation unit 34a is provided, even if the comparison information generated by the color information comparison unit 33a changes, stable fading information can be generated based on the accumulated comparison information. Further, in the case where the comparison information accumulation unit 34a is not provided, the configuration of the image processing unit 30-1 can be simplified.

<2-2. Operation of the first embodiment>

Fig. 3 and Fig. 4 are flowcharts illustrating the operation of the first embodiment; Fig. 3 shows the processing of information to be used for fading determination, and Fig. 4 shows the generation of the fading information. The processing of the information to be used for fading determination is performed during operation of the image capturing unit, and the generation of the fading information is performed at predetermined time intervals.

In step ST1 of Fig. 3, the image processing unit acquires an image signal. The image processing unit 30-1 acquires the image signal generated by the image capturing unit 20, and proceeds to step ST2.

In step ST2, the image processing unit performs the comparison area detection process. The image processing unit 30-1 detects, as a comparison area, an image area of an object that does not fade in the image based on the image signal acquired in step ST1. Fig. 5 illustrates objects that do not fade: a traffic signal light with clear colors as shown in (a) of Fig. 5, a lighting device emitting illumination light of a predetermined wavelength as shown in (b) of Fig. 5, and the like are objects detected as comparison areas. Further, as shown in (c) of Fig. 5, a sign or the like of a predetermined color provided as a Road Side Unit (RSU) may be used as the object detected as a comparison area. The image processing unit 30-1 performs the comparison area detection process, and proceeds to step ST3.

In step ST3, the image processing unit determines whether a comparison area has been detected. In the case where a comparison area is detected in the comparison area detection process of step ST2, the image processing unit 30-1 proceeds to step ST4; otherwise it returns to step ST1.

In step ST4, the image processing unit executes the comparison target information generation process. The image processing unit 30-1 uses, as comparison target information, color information generated from the image signal of the comparison area detected in step ST2, and proceeds to step ST5.

In step ST5, the image processing unit performs the information comparison process. The image processing unit 30-1 compares the comparison target information generated in step ST4 with the fading determination reference information registered in advance as the color information of the object that does not fade, generates comparison information based on the comparison result, and proceeds to step ST6.

In step ST6, the image processing unit performs the comparison information accumulation process. The image processing unit 30-1 accumulates the comparison information generated in step ST5, and returns to step ST1.

In step ST11 of Fig. 4, the image processing unit acquires accumulated comparison information. The image processing unit 30-1 acquires the accumulated comparison information generated by the comparison information accumulation process of step ST6 in Fig. 3, and proceeds to step ST12.

In step ST12, the image processing unit calculates fading determination information. The image processing unit 30-1 calculates the fading determination information based on the accumulated comparison information acquired in step ST11. For example, the image processing unit 30-1 calculates, as fading determination information, the duration of comparison results indicating that fading exceeding a predetermined level has occurred and the number of such durations, and proceeds to step ST13.

In step ST13, the image processing unit determines whether fading has occurred. The image processing unit 30-1 determines whether the fading determination information calculated in step ST12 exceeds a preset threshold value; it proceeds to step ST14 in the case where it is determined from the fading determination information that fading exceeding the threshold has occurred, or to step ST15 in the case where it is determined that such fading has not occurred.

In step ST14, the image processing unit generates fading information. The image processing unit 30-1 generates fading information indicating, for example, that fading exceeding the threshold has occurred in the color filter used in the image capturing unit 20.

In step ST15, the image processing unit performs the area detection feedback process. The image processing unit 30-1 estimates the current fading level based on the accumulated comparison information acquired in step ST11 and feeds the estimated level back to the comparison area detection process of step ST2 in Fig. 3, thereby enabling the comparison area to be detected accurately, and then returns to step ST11.

In this way, the first embodiment enables accurate detection of the fading state. Further, an image area of an object that does not fade can be detected from the captured image, and fading can be determined from the color information of the detected area and the color information of the object. Therefore, if the color information of the object that does not fade is stored in advance, fading of the color filter can be detected without using information acquired by an external device or the like.

<3. Second embodiment>

Next, a second embodiment of the present technology will be described. The second embodiment shows a case where color information generated from an image signal is used as comparison target information and color information based on a signal generated in a reference pixel in which fading does not occur is used as fading determination reference information.

<3-1. Configuration of the second embodiment>

Fig. 6 illustrates a configuration of an image capturing unit that generates an image signal and a reference signal. The image capturing element of the image capturing unit 21 includes color pixels provided with color filters and reference pixels in which color fading does not occur.

Fig. 6 (a) illustrates a case where the reference pixel is a pixel provided with no color filter. For example, each block of 2 × 2 pixels includes a red pixel, a green pixel, a blue pixel, and a reference pixel.

Fig. 6 (b) illustrates a case where the reference pixel is a pixel provided with a spectral filter. As the spectral filter, for example, a filter using surface plasmon resonance (hereinafter referred to as a "surface plasmon filter") is used. In a surface plasmon filter, a metal thin-film pattern having a periodicity corresponding to the wavelength of light to be transmitted is formed on the surface of a dielectric. The reference pixels generate signals that are not affected by fading of the color filters. Note that in the case where a spectral filter is used for the reference pixel, the spectral filter (red, green, blue) is formed so as to transmit light of wavelengths equal to those of the color filters. Note that in (b) of Fig. 6, for example, each block of 2 × 2 pixels includes a red pixel, a green pixel, a blue pixel, and a reference pixel, and reference pixels in neighboring blocks use spectral filters of different colors.

Fig. 6 (c) illustrates a case where the reference pixels are pixels provided with no color filter and are arranged in the row direction. Note that the arrangements of the reference pixels shown in Fig. 6 are examples, and the arrangement is not limited to those of Fig. 6.

The image capturing unit 21 outputs an image signal generated by reading signals from the color pixels and a reference signal generated by reading signals from the reference pixels in which fading does not occur. Note that, as shown in (a) of Fig. 6 or (b) of Fig. 6, if a reference pixel is provided for each pixel block including a plurality of pixels, the reference pixels can be arranged without bias. Further, with the arrangement shown in (c) of Fig. 6, the reference signal can easily be generated simply by reading the corresponding rows.
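
For illustration, separating the image signal and the reference signal from a RAW frame with a Fig. 6 (a)-style layout might look as follows; the exact in-block positions of the R, G, B, and reference pixels are assumptions, since they depend on the sensor.

```python
import numpy as np

def split_mosaic(raw: np.ndarray):
    """Split a RAW frame into per-color signals and the reference signal.

    Assumes each 2x2 block holds R at (0, 0), G at (0, 1), B at (1, 0),
    and a filterless reference pixel at (1, 1) (hypothetical layout).
    """
    r = raw[0::2, 0::2].astype(np.float64)
    g = raw[0::2, 1::2].astype(np.float64)
    b = raw[1::2, 0::2].astype(np.float64)
    ref = raw[1::2, 1::2].astype(np.float64)  # unaffected by filter fading
    return r, g, b, ref
```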

Fig. 7 illustrates the configuration of the second embodiment. The image processing unit 30-2 includes color information generating units 32b and 32c, a color information comparing unit 33b, a comparison information accumulating unit 34b, and a fading information generating unit 35b.

The color information generating unit 32b generates color information (e.g., color information indicating the level of each of the three primary colors) as comparison target information from the image signal generated by the image capturing unit 21, and outputs the color information to the color information comparing unit 33b.

The color information generating unit 32c generates color information from the reference signal generated by the image capturing unit 21, and uses the color information as fading determination reference information. In the case where the reference pixel is a pixel provided with no color filter, the color information generating unit 32c generates information indicating the illuminance level as the fading determination reference information. Further, in the case where the reference pixel is a pixel provided with a spectral filter, the color information generating unit 32c generates fading determination reference information indicating the level of each color component equal to that of the color filter. The color information generating unit 32c outputs the generated fading determination reference information to the color information comparing unit 33b.

The color information comparing unit 33b compares the comparison target information generated by the color information generating unit 32b with the fading determination reference information generated by the color information generating unit 32c. The color information comparing unit 33b generates comparison information indicating the ratio (fade level) of the comparison target information to the fading determination reference information, or indicating whether fading exceeding a predetermined level has occurred.

In the case where the reference pixel is a pixel provided with no color filter, the color information comparing unit 33b calculates, from the color information (the red component R, the green component G, and the blue component B) generated by the color information generating unit 32b, color information Lrgb indicating the illuminance level as comparison target information, as shown in formula (1). Note that, in formula (1), the coefficient kr indicates the proportion of the red component, the coefficient kg indicates the proportion of the green component, and the coefficient kb indicates the proportion of the blue component.

Lrgb=kr×R+kg×G+kb×B...(1)

The color information comparing unit 33b uses the signal level of the reference pixel provided with no color filter as the fading determination reference information Lref, compares the comparison target information Lrgb with the fading determination reference information Lref, and outputs comparison information indicating the comparison result to the comparison information accumulating unit 34b.

In the case where the reference pixel is a pixel provided with a spectral filter, the color information comparing unit 33b compares, for each color component, the comparison target information (red component R, green component G, and blue component B) generated by the color information generating unit 32b with the fading determination reference information (reference red component Rref, reference green component Gref, and reference blue component Bref) generated by the color information generating unit 32c. The color information comparing unit 33b outputs comparison information indicating the comparison result to the comparison information accumulating unit 34b.
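
A sketch of the filterless-reference comparison built around formula (1) follows; the proportions kr, kg, and kb here reuse the ITU-R BT.601 luma weights purely as an assumed example, and a real system would calibrate them to its own sensor.

```python
import numpy as np

KR, KG, KB = 0.299, 0.587, 0.114  # assumed kr, kg, kb of formula (1)

def luminance_comparison_info(r, g, b, ref) -> float:
    """Compare Lrgb of formula (1) with the reference-pixel level Lref.

    A decreasing ratio suggests the color filters pass less light than
    the filterless reference pixels predict, i.e., fading.
    """
    lrgb = KR * np.mean(r) + KG * np.mean(g) + KB * np.mean(b)  # formula (1)
    lref = float(np.mean(ref))  # fading determination reference information
    return lrgb / lref
```

In practice, the ratio observed while the filter is known to be unfaded would be recorded first and used as the baseline against which the predetermined level is set.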

Note that in the case where the reference pixels are arranged as shown in (a) of Fig. 6, the color information comparing unit 33b may generate the comparison information for each pixel block, for example. Further, in the case where the reference pixels are arranged as shown in (b) of Fig. 6, the comparison information may be generated for each pixel block, or for each group of pixel blocks that together include reference pixels of every color. Further, in the case where the reference pixels are arranged as shown in (c) of Fig. 6, the comparison information may be generated, for example, for each region having a predetermined line width with the row of reference pixels as a reference.

The comparison information accumulation unit 34b accumulates the comparison information generated by the color information comparison unit 33b. Further, the comparison information accumulation unit 34b outputs the accumulated comparison information as the accumulation result of the comparison information to the fading information generation unit 35b.

The fading information generating unit 35b performs statistical processing or the like on the accumulated comparison information supplied from the comparison information accumulating unit 34b to calculate fading determination information. For example, based on the accumulated comparison information, when the most recently accumulated comparison information shows that fading exceeding a preset predetermined level has occurred, the fading information generating unit 35b calculates, as fading determination information, the duration of comparison results indicating fading beyond the predetermined level and the number of such durations. In the case where the duration and the number of such durations exceed their threshold values, the fading information generating unit 35b determines that fading has occurred, generates fading information indicating that fading has occurred, and outputs the fading information to the fading information utilizing unit 50. Further, the fading information generating unit 35b may include, in the fading information, not only an indication that fading has occurred but also the fading level.

Note that the comparison information accumulation unit 34b is not a necessary component, and the comparison information generated by the color information comparison unit 33b may be output to the fading information generation unit 35b. In the case where the comparison information accumulation unit 34b is provided, even if the comparison information generated by the color information comparison unit 33b changes, stable fading information can be generated based on the accumulated comparison information. Further, in the case where the comparison information accumulation unit 34b is not provided, the configuration of the image processing unit 30-2 can be simplified.

<3-2. Operation of the second embodiment>

Fig. 8 and Fig. 9 are flowcharts illustrating the operation of the second embodiment; Fig. 8 shows the processing of information to be used for fading determination, and Fig. 9 shows the generation of the fading information. The processing of the information to be used for fading determination is performed during operation of the image capturing unit, and the generation of the fading information is performed at predetermined time intervals.

In step ST21 of Fig. 8, the image processing unit acquires an image signal. The image processing unit 30-2 acquires the image signal generated by the image capturing unit 21, and proceeds to step ST22.

In step ST22, the image processing unit executes the comparison target information generation process. Based on the image signal acquired in step ST21, the image processing unit 30-2 generates color information indicating the illuminance or the signal level of each color component as comparison target information, and proceeds to step ST23.

In step ST23, the image processing unit acquires a reference signal. The image processing unit 30-2 acquires the reference signal generated by the image capturing unit 21, and proceeds to step ST24.

In step ST24, the image processing unit executes the fading determination reference information generation process. Based on the reference signal acquired in step ST23, the image processing unit 30-2 generates color information indicating the illuminance or the signal level of each color component as fading determination reference information, and proceeds to step ST25.

In step ST25, the image processing unit performs the information comparison process. The image processing unit 30-2 compares the comparison target information generated in step ST22 with the fading determination reference information generated in step ST24, generates comparison information based on the comparison result, and proceeds to step ST26.

In step ST26, the image processing unit performs the comparison information accumulation process. The image processing unit 30-2 accumulates the comparison information generated in step ST25, and returns to step ST21.

In step ST31 of Fig. 9, the image processing unit acquires accumulated comparison information. The image processing unit 30-2 acquires the accumulated comparison information generated by the comparison information accumulation process of step ST26 in Fig. 8, and proceeds to step ST32.

In step ST32, the image processing unit calculates fading determination information. The image processing unit 30-2 calculates the fading determination information based on the accumulated comparison information acquired in step ST31. For example, the image processing unit 30-2 calculates, as fading determination information, the duration of comparison results indicating that fading exceeding a predetermined level has occurred and the number of such durations, and proceeds to step ST33.

In step ST33, the image processing unit determines whether fading has occurred. The image processing unit 30-2 determines whether the fading determination information calculated in step ST32 exceeds a preset threshold value; it proceeds to step ST34 in the case where it is determined from the fading determination information that fading exceeding the threshold has occurred, or returns to step ST31 in the case where it is determined that such fading has not occurred.

In step ST34, the image processing unit generates fading information. The image processing unit 30-2 generates fading information indicating, for example, that fading exceeding the threshold has occurred in the color filter used in the image capturing unit 21.

In this way, the second embodiment enables accurate detection of the fading state, as in the first embodiment. Further, since color fading can be determined based on the image signal generated by the image capturing unit 21 and the reference signal, color fading of the color filter can be detected without using information or the like acquired by an external device.

<4. Third embodiment>

Next, a third embodiment of the present technology will be described. The third embodiment shows a case where accumulated illuminance information obtained by accumulating illuminance information generated based on the image signal, together with accumulated environment information obtained by accumulating environment information, is used as the comparison target information, and information indicating the relationship between the fading level of the color filter and the accumulation results of the illuminance information and the environment information is used as the fading determination reference information.

<4-1. Configuration of the third embodiment>

Fig. 10 illustrates the configuration of the third embodiment. The image processing unit 30-3 includes an illuminance information accumulation unit 36, an environment information accumulation unit 37, and a fading information generation unit 38.

The illuminance information accumulation unit 36 generates illuminance information based on the image signal supplied from the image capturing unit 20. The illuminance information accumulation unit 36 accumulates, for example, the average value of the pixel values in a captured image as illuminance information, and outputs, to the fading information generation unit 38, the time-integrated value of the illuminance information from the first use of the color filter of the image capturing unit 20 until the present as accumulated illuminance information.

The environment information accumulation unit 37 accumulates the environment information (e.g., temperature measurement values) supplied from the environment information generation unit 40, and outputs, to the fading information generation unit 38, the time-integrated value of the temperature measurement values from the first use of the color filter of the image capturing unit 20 until the present as accumulated environment information. Note that the environment information accumulation unit may use not only temperature measurement values but also humidity measurement values as environment information.

The fading information generating unit 38 determines whether fading exceeding a preset predetermined level has occurred based on the accumulated illuminance information supplied from the illuminance information accumulation unit 36 and the accumulated temperature information supplied from the environment information accumulation unit 37. The fading information generating unit 38 stores in advance, as fading determination reference information, characteristics (a fading model or the like) indicating the fading level with respect to the accumulated illuminance information and the accumulated temperature information. By using these characteristics, the fading information generating unit 38 determines the fading level based on the accumulated illuminance information supplied from the illuminance information accumulation unit 36 and the accumulated temperature information supplied from the environment information accumulation unit 37. Alternatively, by separately using a characteristic indicating the fading level with respect to the accumulated illuminance information and a characteristic indicating the fading level with respect to the accumulated temperature information, the fading level based on the accumulated illuminance information and the fading level based on the accumulated temperature information may be determined and then integrated into the fading level of the color filter. Also, an evaluation value may be calculated using the accumulated illuminance information and the accumulated temperature information as parameters, and the calculated evaluation value may be used as the fading level. The fading information generating unit 38 compares the determined fading level with a preset threshold value, generates fading information indicating whether fading exceeding the threshold has occurred based on the comparison result, and outputs the fading information to the fading information utilizing unit 50. Further, the fading information generating unit 38 may include, in the fading information, not only an indication that fading has occurred but also the fading level.
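
As a sketch only, a fading model of this kind could be evaluated as follows; the linear form, the coefficients, and the threshold are placeholder assumptions, since a real model would be calibrated for the specific color filter.

```python
A_LUX = 2.0e-9   # fade per unit of accumulated illuminance (assumed)
A_TMP = 5.0e-10  # fade per unit of accumulated temperature (assumed)
FADE_THRESHOLD = 0.3  # illustrative threshold

def estimate_fade_level(accum_illuminance: float, accum_temperature: float) -> float:
    """Fading determination reference information as a function: map the
    time-integrated illuminance and temperature to a fade level in [0, 1]."""
    return min(1.0, A_LUX * accum_illuminance + A_TMP * accum_temperature)

def fading_information(accum_illuminance: float, accum_temperature: float) -> dict:
    level = estimate_fade_level(accum_illuminance, accum_temperature)
    return {"level": level, "exceeds_threshold": level > FADE_THRESHOLD}
```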

<4-2. Operation of the third embodiment>

Fig. 11 and Fig. 12 are flowcharts illustrating the operation of the third embodiment; Fig. 11 shows the processing of information to be used for fading determination, and Fig. 12 shows the generation of the fading information. The processing of the information to be used for fading determination is performed during operation of the image capturing unit, and the generation of the fading information is performed at predetermined time intervals.

In step ST41 of Fig. 11, the image processing unit acquires an image signal. The image processing unit 30-3 acquires the image signal generated by the image capturing unit 20, and proceeds to step ST42.

In step ST42, the image processing unit accumulates illuminance information. The image processing unit 30-3 generates illuminance information based on the image signal acquired in step ST41. Also, the image processing unit 30-3 accumulates the illuminance information, and proceeds to step ST43.

In step ST43, the image processing unit acquires environmental information. The image processing unit 30-3 acquires the environmental information generated by the environmental information generating unit 40, and proceeds to step ST44.

In step ST44, the image processing unit accumulates the environmental information. The image processing unit 30-3 accumulates the environmental information acquired in step ST43, and returns to step ST41.

In step ST51 of Fig. 12, the image processing unit acquires comparison target information. The image processing unit 30-3 acquires the illuminance information accumulated in step ST42 and the environment information accumulated in step ST44 of Fig. 11 as comparison target information, and proceeds to step ST52.

In step ST52, the image processing unit estimates the level of fading. The image processing unit 30-3 determines the fading level corresponding to the comparison target information acquired in step ST51 by using fading determination reference information indicating the fading level with respect to the accumulated illuminance information and the accumulated temperature information, and proceeds to step ST53.

In step ST53, the image processing unit determines whether fading has occurred. The image processing unit 30-3 determines whether the fading level determined in step ST52 exceeds a preset threshold value; it proceeds to step ST54 in the case where it is determined that fading exceeding the threshold has occurred, or returns to step ST51 in the case where it is determined that such fading has not occurred.

In step ST54, the image processing unit generates color fading information. The image processing unit 30-3 generates color fading information indicating that color fading exceeding a threshold value has occurred in the color filter used in the image capturing unit 20, and outputs the color fading information.

In this way, the third embodiment enables accurate detection of the fading state, as in the first and second embodiments. Further, by using as comparison target information the accumulated illuminance information obtained by accumulating illuminance information generated based on the image signal, alone or together with the accumulated environment information, fading can be determined from the comparison target information and fading determination reference information indicating the relationship between the fading level of the color filter and the accumulation results of the illuminance information and the environment information. Therefore, fading of the color filter can be detected easily without performing recognition processing or the like.

<5. Other embodiments>

The image processing unit 30 may perform the first to third embodiments in combination. For example, if the first embodiment and the third embodiment are performed in combination, fading of the color filter can be detected based on the accumulated illuminance information or the like even in a case where an object that does not fade cannot be detected for a long time.

Further, in the second embodiment, the image capturing unit 21 is provided with reference pixels in which fading does not occur, and generates the fading determination reference information. Alternatively, a spectral camera may be provided independently of the image capturing unit, and the fading determination reference information may be generated by the spectral camera.

Note that the effects described in the above-described first to third embodiments and other embodiments are merely illustrative and not restrictive, and additional effects may be produced.

<6. Fading information utilizing unit>

Next, the fading information utilizing unit will be described. In the case where the fading state has been obtained by the image processing unit 30, the fading information utilizing unit 50 performs fading correction on the image signal acquired by the image capturing unit 20.

The fading information utilizing unit 50 performs fading correction using correction coefficients corresponding to the fading level of the color filter. For example, a red correction coefficient Cr, a green correction coefficient Cg, and a blue correction coefficient Cb are determined according to the fading level. Then, the calculations of formulas (2) to (4) are performed on the red component R, the green component G, and the blue component B of the image signal generated by the image capturing unit 20, and the corrected red component Rc, green component Gc, and blue component Bc are calculated.

Rc=Cr×R...(2)

Gc=Cg×G...(3)

Bc=Cb×B...(4)

Further, since the fading level can be estimated using the characteristics (a fading model or the like) indicating the fading level with respect to the accumulated illuminance and accumulated temperature, fading can also be corrected by using lookup tables. For example, as shown in formula (5), the red component Rc after fading correction corresponding to the red component R is calculated by using the red lookup table LUTr corresponding to the fading level. Similarly, as shown in formula (6), the green component Gc after fading correction corresponding to the green component G is calculated by using the green lookup table LUTg corresponding to the fading level, and, as shown in formula (7), the blue component Bc after fading correction corresponding to the blue component B is calculated by using the blue lookup table LUTb corresponding to the fading level.

Rc=LUTr[R]...(5)

Gc=LUTg[G]...(6)

Bc=LUTb[B]...(7)
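
The gain correction of formulas (2) to (4) and the lookup-table correction of formulas (5) to (7) can be sketched as follows, assuming 8-bit image data and precomputed 256-entry tables; the function names are illustrative.

```python
import numpy as np

def correct_fading_gain(rgb: np.ndarray, cr: float, cg: float, cb: float) -> np.ndarray:
    """Formulas (2) to (4): Rc = Cr x R, Gc = Cg x G, Bc = Cb x B."""
    return rgb * np.array([cr, cg, cb])

def correct_fading_lut(rgb: np.ndarray, lut_r, lut_g, lut_b) -> np.ndarray:
    """Formulas (5) to (7): per-channel lookup tables chosen for the
    estimated fade level; `rgb` is assumed to be an 8-bit HxWx3 array."""
    out = np.empty_like(rgb)
    out[..., 0] = lut_r[rgb[..., 0]]
    out[..., 1] = lut_g[rgb[..., 1]]
    out[..., 2] = lut_b[rgb[..., 2]]
    return out
```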

Also, the fading correction is not limited to being performed independently for each color component; the fading information utilizing unit 50 may calculate each corrected color component Rc, Gc, and Bc from the set of color components R, G, and B. For example, the calculation of formula (8) may be performed using a correction coefficient matrix Cw corresponding to the fading level to calculate the color components Rc, Gc, and Bc, or each color component after fading correction may be calculated as shown in formulas (9) to (11).

[Rc,Gc,Bc]T=Cw[R,G,B]T...(8)

Rc=LUTr[R,G,B]...(9)

Gc=LUTg[R,G,B]...(10)

Bc=LUTb[R,G,B]...(11)
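
Formula (8) applies one matrix to all three components at once; a minimal sketch, with an assumed example matrix Cw, follows.

```python
import numpy as np

# Illustrative correction matrix Cw for some fade level (assumed values).
CW = np.array([[1.20, -0.10, -0.05],
               [-0.08, 1.15, -0.07],
               [-0.04, -0.09, 1.18]])

def correct_fading_matrix(rgb: np.ndarray, cw: np.ndarray = CW) -> np.ndarray:
    """Formula (8): [Rc, Gc, Bc]^T = Cw [R, G, B]^T for every pixel."""
    return rgb @ cw.T  # matrix-multiply each pixel's [R, G, B] vector
```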

Further, the fading correction can also be performed by machine learning. For example, using a Conditional Generative Adversarial Network (Conditional GAN), a rule for converting faded colors into the original colors is learned, and an image in which fading has occurred is converted into an image without fading based on the learning result. In addition, a method called Pix2Pix may be used to learn in advance the relationship between non-faded images and faded images, and a non-faded image may be generated from a faded image acquired by the image capturing unit based on the learned relationship. A non-faded image can also be generated by using a Cycle-Consistent Generative Adversarial Network (CycleGAN), which does not require paired training images.
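As a rough illustration of the machine-learning approach, the sketch below shows only the inference side in PyTorch: a generator network maps a faded image to a restored one. The tiny convolutional stack is a hypothetical stand-in for a generator actually trained in a Pix2Pix- or CycleGAN-style setup; it is not the trained model itself.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in generator: in a Pix2Pix-style setup, this network
# would have been trained on pairs of (faded image, non-faded image).
# Here it only illustrates the shape of the inference pipeline.
generator = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 3, kernel_size=3, padding=1),
    nn.Sigmoid(),
)
generator.eval()

with torch.no_grad():
    faded = torch.rand(1, 3, 64, 64)   # faded RGB input, values in [0, 1]
    restored = generator(faded)        # restored image without fading
print(restored.shape)                  # -> torch.Size([1, 3, 64, 64])
```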

<7. application example >

The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be implemented as an apparatus mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).

Fig. 13 is a block diagram showing a schematic functional configuration example of a vehicle control system 100 as one example of a mobile body control system to which the present technology can be applied.

Note that, hereinafter, when the vehicle provided with the vehicle control system 100 is to be distinguished from other vehicles, it is referred to as the own vehicle.

The vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, an in-vehicle device 104, an output control unit 105, an output unit 106, a drive association control unit 107, a drive association system 108, a vehicle body association control unit 109, a vehicle body association system 110, a storage unit 111, and an automatic driving control unit 112. The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive association control unit 107, the vehicle body association control unit 109, the storage unit 111, and the automatic driving control unit 112 are connected to each other via a communication network 121. The communication network 121 includes, for example, an in-vehicle communication network, a bus, or the like conforming to an arbitrary standard such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), or FlexRay (registered trademark). Note that the units of the vehicle control system 100 may also be directly connected to each other without passing through the communication network 121.

Note that, hereinafter, in the case where each unit of the vehicle control system 100 performs communication via the communication network 121, description of the communication network 121 will be omitted. For example, in the case where the input unit 101 and the automatic driving control unit 112 communicate with each other via the communication network 121, it is simply described that the input unit 101 and the automatic driving control unit 112 communicate with each other.

The input unit 101 includes a device for use by an occupant to input various data, instructions, and the like. For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and a joystick, and operation devices that enable input by methods other than manual operation (such as voice and gestures), and the like. Further, for example, the input unit 101 may be a remote control device using infrared rays or other radio waves, or may be an external connection device including a mobile device, a wearable device, or the like that supports the operation of the vehicle control system 100. The input unit 101 generates an input signal based on data, an instruction, and the like input by an occupant, and supplies the input signal to each unit of the vehicle control system 100.

The data acquisition unit 102 includes various sensors and the like that acquire data used for processing of the vehicle control system 100, and supplies the acquired data to each unit of the vehicle control system 100.

For example, the data acquisition unit 102 includes various sensors for detecting the state of the own vehicle and the like. Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an Inertial Measurement Unit (IMU), and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering angle of the steering wheel, the engine speed, the motor speed, the wheel rotation speed, and the like.

Further, for example, the data acquisition unit 102 includes various sensors for detecting information outside the own vehicle. Specifically, for example, the data acquisition unit 102 includes image capturing devices such as a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. Further, for example, the data acquisition unit 102 includes an environment sensor for detecting the weather or other meteorological phenomena, and a surrounding information detection sensor for detecting objects around the own vehicle. The environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like. The surrounding information detection sensor includes, for example, an ultrasonic sensor, a radar, a Light Detection and Ranging/Laser Imaging Detection and Ranging (LiDAR) device, a sonar, and the like.

Also, for example, the data acquisition unit 102 includes various sensors for detecting the current position of the own vehicle. Specifically, for example, the data acquisition unit 102 includes a Global Navigation Satellite System (GNSS) receiver that receives GNSS signals from GNSS satellites, and the like.

Further, for example, the data acquisition unit 102 includes various sensors for detecting in-vehicle information. Specifically, for example, the data acquisition unit 102 includes an image capture device that captures an image of the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound inside the vehicle, and the like. The biometric sensor is provided, for example, on a seat surface, a steering wheel, or the like, and detects biometric information of an occupant seated on the seat or a driver holding the steering wheel.

The communication unit 103 communicates with the in-vehicle device 104 and various devices, servers, base stations, and the like outside the vehicle. The communication unit 103 transmits data supplied from each unit of the vehicle control system 100, and supplies the received data to each unit of the vehicle control system 100. Note that the communication protocol supported by the communication unit 103 is not particularly limited, and further, the communication unit 103 may support a plurality of types of communication protocols.

For example, the communication unit 103 wirelessly communicates with the in-vehicle device 104 by wireless LAN, Bluetooth (registered trademark), Near Field Communication (NFC), Wireless USB (WUSB), or the like. Further, for example, the communication unit 103 performs wired communication with the in-vehicle device 104 via a connection terminal (not shown) (and a cable, if necessary) using Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI) (registered trademark), Mobile High-definition Link (MHL), or the like.

Also, for example, the communication unit 103 communicates with a device (e.g., an application server or a control server) on an external network (e.g., the internet, a cloud network, or an operator-specific network) via a base station or an access point. Further, for example, the communication unit 103 communicates with a terminal near the own vehicle (e.g., a pedestrian or shop terminal, or a Machine Type Communication (MTC) terminal) by using a peer-to-peer (P2P) technique. Also, for example, the communication unit 103 performs V2X communication such as vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication. Further, for example, the communication unit 103 includes a beacon receiving unit that receives radio waves or electromagnetic waves transmitted from a wireless station or the like installed on a road and acquires information such as the current position, traffic congestion, traffic regulations, and required time.

The in-vehicle device 104 includes, for example, a mobile device or a wearable device owned by an occupant, an information device carried in or attached to the own vehicle, a navigation device for searching for a route to an arbitrary destination, and the like.

The output control unit 105 controls output of various kinds of information to an occupant of the own vehicle or to the outside of the vehicle. For example, the output control unit 105 generates an output signal including at least one of visual information (e.g., image data) or auditory information (e.g., voice data) and supplies the output signal to the output unit 106, thereby controlling output of the visual information and the auditory information from the output unit 106. Specifically, for example, the output control unit 105 combines image data captured by different image capturing devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106. Further, for example, the output control unit 105 generates voice data including a warning sound, a warning message, and the like for hazards such as a collision, contact, or entry into a danger zone, and supplies an output signal including the generated voice data to the output unit 106.

The output unit 106 includes a device capable of outputting visual information or auditory information to an occupant of the own vehicle or to the outside of the vehicle. For example, the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by an occupant, a projector, a lamp, and the like. The display device included in the output unit 106 may be a device having a conventional display, or may be a device that displays visual information in the driver's field of view, such as a head-up display, a transmissive display, or a device having an Augmented Reality (AR) display function.

The drive association control unit 107 generates various control signals and supplies the control signals to the drive association system 108, thereby controlling the drive association system 108. Further, the drive association control unit 107 supplies a control signal to each unit other than the drive association system 108 as necessary, and performs notification of a control state of the drive association system 108 and the like.

The drive association system 108 includes various devices associated with the drive system of the own vehicle. For example, the drive association system 108 includes a driving force generation device for generating a driving force, such as an internal combustion engine or a drive motor; a driving force transmission mechanism for transmitting the driving force to the wheels; a steering mechanism for adjusting the steering angle; a brake device for generating a braking force; an antilock brake system (ABS); an electronic stability control (ESC) system; an electric power steering device; and the like.

The vehicle body association control unit 109 generates various control signals and supplies the control signals to the vehicle body association system 110, thereby controlling the vehicle body association system 110. Further, the vehicle body association control unit 109 supplies a control signal to each unit other than the vehicle body association system 110 as necessary, and performs notification of the control state of the vehicle body association system 110 and the like.

The vehicle body association system 110 includes various devices of the vehicle body system equipped on the vehicle body. For example, the vehicle body association system 110 includes a keyless entry system, a smart key system, a power window device, power seats, a steering wheel, an air conditioner, various lamps (e.g., head lamps, tail lamps, brake lamps, turn signal lamps, fog lamps, etc.), and the like.

The storage unit 111 includes, for example, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic storage device such as a Hard Disk Drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like. The storage unit 111 stores various programs, data, and the like used by each unit of the vehicle control system 100. For example, the storage unit 111 stores map data including a three-dimensional high-precision map (such as a dynamic map), a global map that is lower in precision than the high-precision map and covers a wide area, a local map that includes information around the own vehicle, and the like.

The automatic driving control unit 112 controls automatic driving including autonomous traveling, driving assistance, and the like. Specifically, for example, the automatic driving control unit 112 performs cooperative control intended to realize the functions of an Advanced Driver Assistance System (ADAS), including collision avoidance or impact mitigation for the own vehicle, follow-up traveling based on the inter-vehicle distance, traveling while maintaining the vehicle speed, collision warning for the own vehicle, lane departure warning for the own vehicle, and the like. Further, for example, the automatic driving control unit 112 performs cooperative control aimed at automatic driving or the like in which the vehicle travels autonomously without depending on the operation of the driver. The automatic driving control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.

The detection unit 131 detects various types of information required to control the automated driving. The detection unit 131 includes a vehicle exterior information detection unit 141, a vehicle interior information detection unit 142, and a vehicle state detection unit 143.

The vehicle exterior information detection unit 141 performs detection processing of information outside the own vehicle based on data or signals from each unit of the vehicle control system 100. For example, the vehicle exterior information detection unit 141 performs detection processing, recognition processing, and tracking processing of objects around the own vehicle, and detection processing of the distance to such objects. The objects to be detected include, for example, vehicles, people, obstacles, buildings, roads, traffic lights, traffic signs, road markings, and the like. Further, for example, the vehicle exterior information detection unit 141 performs detection processing of the environment around the own vehicle. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface conditions, and the like. The vehicle exterior information detection unit 141 supplies data indicating the result of the detection processing to the self-position estimation unit 132; the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133; the emergency avoidance unit 171 of the operation control unit 135; and the like.

The in-vehicle information detection unit 142 performs detection processing of information inside the vehicle based on data or a signal from each unit of the vehicle control system 100. For example, the in-vehicle information detection unit 142 executes driver authentication processing and recognition processing, driver state detection processing, occupant detection processing, in-vehicle environment detection processing, and the like. The driver's state to be detected includes, for example, physical condition, arousal level, attention, fatigue, direction of sight, and the like. The environment in the vehicle to be detected includes, for example, temperature, humidity, brightness, odor, and the like. The in-vehicle information detection unit 142 supplies data indicating the result of the detection processing to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.

The vehicle state detection unit 143 performs detection processing of the state of the own vehicle based on data or a signal from each unit of the vehicle control system 100. The state of the own vehicle to be detected includes, for example, speed, acceleration, steering angle, presence or absence of abnormality and details of the abnormality, driving operation state, power seat position and inclination, door lock state, state of other in-vehicle devices, and the like. The vehicle state detection unit 143 supplies data indicating the result of the detection process to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.

The self-position estimation unit 132 performs estimation processing of the position, orientation, and the like of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. Further, the self-position estimation unit 132 generates a local map for self-position estimation (hereinafter referred to as a self-position estimation map) as necessary. The self-position estimation map is a high-precision map using a technique such as simultaneous localization and mapping (SLAM). The self-position estimation unit 132 supplies data indicating the result of the estimation processing to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 111.

The situation analysis unit 133 performs analysis processing of the situation of the own vehicle and its surroundings. The situation analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situation recognition unit 153, and a situation prediction unit 154.

The map analysis unit 151 performs analysis processing of the various maps stored in the storage unit 111, using data or signals from each unit of the vehicle control system 100 (such as the self-position estimation unit 132 and the vehicle exterior information detection unit 141) as necessary, and constructs a map including information necessary for automatic driving processing. The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, and the situation prediction unit 154, as well as to the route planning unit 161, the behavior planning unit 162, the operation planning unit 163, and the like of the planning unit 134.

The traffic rule recognition unit 152 performs recognition processing of the traffic rules around the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, and the map analysis unit 151. Through this recognition processing, for example, the positions and states of traffic lights around the own vehicle, the details of traffic regulations around the own vehicle, drivable lanes, and the like are recognized. The traffic rule recognition unit 152 supplies data indicating the result of the recognition processing to the situation prediction unit 154 and the like.

The situation recognition unit 153 performs recognition processing regarding the situation of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the in-vehicle information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 executes recognition processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver of the own vehicle, and the like. Further, the situation recognition unit 153 generates a local map used for recognizing the situation around the own vehicle (hereinafter referred to as a situation recognition map) as necessary. The situation recognition map is, for example, an occupancy grid map.

The situation of the own vehicle to be recognized includes, for example, the position, orientation, and movement (e.g., speed, acceleration, moving direction, etc.) of the own vehicle, the presence or absence and details of an abnormality, and the like. The situation around the own vehicle to be recognized includes, for example, the types and positions of surrounding stationary objects; the types, positions, and movements (e.g., speed, acceleration, moving direction, etc.) of surrounding moving objects; the configuration of surrounding roads and the road surface conditions; and the surrounding weather, temperature, humidity, brightness, and the like. The state of the driver to be recognized includes, for example, physical condition, degree of arousal, attention, fatigue, movement of the line of sight, driving operation, and the like.

The situation recognition unit 153 supplies data indicating the result of the recognition processing (including the situation recognition map as necessary) to the self-position estimation unit 132, the situation prediction unit 154, and the like. Further, the situation recognition unit 153 stores the situation recognition map in the storage unit 111.

The situation prediction unit 154 performs prediction processing regarding the situation of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153. For example, the situation prediction unit 154 executes prediction processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver, and the like.

The situation of the own vehicle to be predicted includes, for example, the behavior of the own vehicle, the occurrence of an abnormality, the travelable distance, and the like. The situation around the own vehicle to be predicted includes, for example, the motion of moving objects around the own vehicle, changes in the states of traffic lights, changes in the environment such as the weather, and the like. The situation of the driver to be predicted includes, for example, the behavior and physical condition of the driver, and the like.

The situation prediction unit 154 supplies data indicating the result of the prediction processing to the route planning unit 161, the behavior planning unit 162, the operation planning unit 163, and the like of the planning unit 134 together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153.

The route planning unit 161 plans a route to the destination based on data or signals from each unit of the vehicle control system 100 (such as the map analysis unit 151 and the situation prediction unit 154). For example, the route planning unit 161 sets a route from the current position to the specified destination based on the global map. Further, the route planning unit 161 appropriately changes the route based on, for example, traffic congestion, accidents, traffic regulations, construction, the physical condition of the driver, and the like. The route planning unit 161 supplies data indicating the planned route to the behavior planning unit 162 and the like.

The behavior planning unit 162 plans the behavior of the own vehicle for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the vehicle control system 100 (such as the map analysis unit 151 and the situation prediction unit 154). For example, the behavior planning unit 162 plans starting, stopping, the traveling direction (e.g., forward, backward, left turn, right turn, turning, etc.), the traveling lane, the traveling speed, passing, and the like. The behavior planning unit 162 supplies data indicating the planned behavior of the own vehicle to the operation planning unit 163 and the like.

The operation planning unit 163 plans the operation of the own vehicle for realizing the behavior planned by the behavior planning unit 162, based on data or signals from each unit of the vehicle control system 100 (such as the map analysis unit 151 and the situation prediction unit 154). For example, the operation planning unit 163 plans acceleration, deceleration, the travel trajectory, and the like. The operation planning unit 163 supplies data indicating the planned operation of the own vehicle to the acceleration-deceleration control unit 172, the direction control unit 173, and the like of the operation control unit 135.

The operation control unit 135 controls the operation of the own vehicle. The operation control unit 135 includes an emergency avoidance unit 171, an acceleration-deceleration control unit 172, and a direction control unit 173.

The emergency avoidance unit 171 performs detection processing of an emergency such as a collision, contact, entry into a danger zone, a driver abnormality, or a vehicle abnormality based on the detection results of the vehicle exterior information detection unit 141, the in-vehicle information detection unit 142, and the vehicle state detection unit 143. In the case where the occurrence of an emergency is detected, the emergency avoidance unit 171 plans an operation of the own vehicle for avoiding the emergency, such as a sudden stop or a sharp turn. The emergency avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the acceleration-deceleration control unit 172, the direction control unit 173, and the like.

The acceleration-deceleration control unit 172 performs acceleration-deceleration control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the acceleration-deceleration control unit 172 calculates a control target value of the driving force generation device or the brake device for achieving the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive association control unit 107.

The direction control unit 173 performs direction control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the direction control unit 173 calculates a control target value of the steering mechanism for realizing the travel trajectory or sharp turn planned by the operation planning unit 163 or the emergency avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive association control unit 107.

In the vehicle control system 100 described above, the image capturing unit 20 (21) shown in the present embodiment corresponds to the data acquisition unit 102, and the image processing unit 30-1 (30-2, 30-3) corresponds to the vehicle exterior information detection unit 141. When the image capturing unit 20 (21) and the image processing unit 30-1 (30-2, 30-3) are provided in the vehicle control system 100 and the surrounding environment is captured while the vehicle is driven, the image capturing unit 20 (21) performs the image capturing operation for a long time, so fading of the color filter occurs more easily than in a portable image capturing apparatus. However, since fading correction or the like can be performed using the fading information generated by the image processing unit 30-1 (30-2, 30-3), the influence of fading of the color filter can be reduced. For example, in scenes where colors are important, such as recognition processing of traffic lights, road signs, or license plates using the image signal generated by the image capturing unit, or in scenes where accuracy can be improved by using colors, such as road surface detection, surrounding object recognition, or driving lane recognition, the recognition processing can be performed accurately by switching the dictionary used for the recognition processing according to the fading information, as with the comparison area detection unit of the first embodiment (a sketch of such switching follows below).
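A hypothetical sketch of such dictionary switching; the fading-level scale, thresholds, and dictionary names are all assumptions for illustration and are not part of the vehicle control system described above.

```python
def select_recognition_dictionary(fading_level: float) -> str:
    # fading_level is assumed to be normalized to [0, 1] by the fading
    # information generating unit; the thresholds are hypothetical.
    if fading_level < 0.1:
        return "dictionary_normal"        # trained on non-faded images
    if fading_level < 0.5:
        return "dictionary_mild_fading"   # trained on mildly faded images
    return "dictionary_strong_fading"     # trained on strongly faded images

print(select_recognition_dictionary(0.3))  # -> dictionary_mild_fading
```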

Further, the present technology can be applied not only to a moving body control system but also to, for example, a monitoring system that performs the image capturing operation for a long time; the influence of fading of the color filter on the monitoring operation or the like can thereby be reduced.

The series of processes described in this specification can be executed by hardware, by software, or by a combined configuration of both. When the processing is executed by software, a program in which the processing sequence is recorded is installed in a memory of a computer incorporated in dedicated hardware and executed. Alternatively, the program can be installed and executed in a general-purpose computer capable of executing various kinds of processing.

For example, the program can be recorded in advance on a hard disk, a Solid State Drive (SSD), or a Read Only Memory (ROM) serving as a recording medium. Alternatively, the program can be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a compact disc read only memory (CD-ROM), a magneto-optical (MO) disk, a Digital Versatile Disc (DVD), a Blu-ray Disc (BD) (registered trademark), a magnetic disk, or a semiconductor memory card. Such a removable recording medium can be provided as so-called packaged software.

Further, in addition to being installed on the computer from the removable recording medium, the program may be transferred from a download site to the computer wirelessly or by wire via a Local Area Network (LAN), the internet, or the like. The computer can receive the program transferred in this manner and install it on a recording medium such as a built-in hard disk.

Note that the effects described in this specification are merely illustrative and not restrictive, and additional effects not described may be produced. Furthermore, the present technology should not be construed as being limited to the embodiments described above. The embodiments disclose the present technology in an illustrative form, and it is obvious that those skilled in the art can modify or replace the embodiments without departing from the gist of the present technology. That is, the claims should be taken into consideration in order to determine the gist of the present technology.

Further, the image processing apparatus of the present technology may also have the following configuration.

(1) An image processing apparatus includes a color fading information generation unit configured to generate color fading information indicating color fading of a color filter based on comparison target information and color fading determination reference information, the comparison target information being based on an image signal generated by an image capturing unit using the color filter.

(2) The image processing apparatus according to (1), further comprising:

a color information generating unit configured to generate color information from the image signal and use the color information as the comparison target information; and

a color information comparison unit configured to use color information in a case where the color filter is not faded as the fading determination reference information and compare the comparison target information with the fading determination reference information,

wherein the fading information generating unit generates the fading information based on the comparison result of the color information comparing unit.

(3) The image processing apparatus according to (2), further comprising a comparison area detection unit configured to detect a comparison area indicating an image area of the fading determination object based on the image signal,

wherein the color information generating unit generates the color information from the image signal of the comparison area detected by the comparison area detecting unit and uses the color information as the comparison target information; and is

The color information comparison unit uses the color information of the fading determination object as the fading determination reference information.

(4) The image processing apparatus according to (3), wherein the fading information generating unit estimates a level of fading based on a comparison result of the color information comparing unit, and

the comparison area detection unit detects the comparison area according to the level of fading estimated by the fading information generation unit.

(5) The image processing apparatus according to (3), wherein the comparison area detection unit detects the comparison area within a preset image area.

(6) The image processing apparatus according to (2), wherein the image capturing unit is provided with color component pixels provided with color filters and reference pixels in which color fading does not occur,

the color information generating unit generates color information from the image signal generated in the color component pixel and uses the color information as the comparison target information, and

the color information comparison unit uses color information generated based on the signal of the reference pixel as the fading determination criterion information.

(7) The image processing apparatus according to (6), wherein the reference pixel includes a pixel provided with no color filter.

(8) The image processing apparatus according to (6), wherein the reference pixel includes a pixel provided with a spectral filter in place of the color filter.

(9) The image processing apparatus according to any one of (6) to (8), further comprising an image capturing unit provided with the color component pixels and the reference pixels.

(10) The image processing apparatus according to any one of (2) to (9), further comprising a comparison information accumulation unit configured to accumulate comparison information indicating a comparison result of the color information comparison unit,

wherein the fading information generating unit generates the fading information based on the comparison information accumulated by the comparison information accumulating unit.

(11) The image processing apparatus according to any one of (1) to (10), further comprising an illuminance information accumulation unit configured to accumulate illuminance information generated based on the image signal,

wherein the fading information generating unit uses the illuminance information accumulated by the illuminance information accumulating unit as the comparison target information, and uses information indicating a relationship between a fading level of the color filter and an accumulation result of the illuminance information as the fading determination reference information.

(12) The image processing apparatus according to (11), further comprising an environmental information accumulation unit configured to accumulate environmental information at the time of generating the image signal and include the environmental information in the comparison target information,

wherein the fading information generating unit includes the environmental information accumulated by the environmental information accumulating unit in the comparison target information, and uses information indicating a relationship between a fading level of the color filter and an accumulation result of illuminance information and environmental information as the fading determination reference information.

(13) The image processing apparatus according to any one of (1) to (12), wherein the fading information indicates that a fading level of the color filter exceeds a predetermined level, or indicates a fading level of the color filter.

List of reference numerals

10 image capturing system

20, 21 image capturing unit

30, 30-1, 30-2, 30-3 image processing unit

31 comparison area detection unit

32a, 32b, 32c color information generating unit

33a, 33b color information comparing unit

34a, 34b comparison information accumulation unit

35a, 35b, 38 fading information generating unit

36 illuminance information accumulation unit

37 environment information accumulation unit

40 Environment information generating unit

50 fading information utilization unit

60 image utilization unit
