Image processor, image processing method, and image pickup apparatus

Document No.: 1382833    Publication date: 2020-08-14

Description: This technology, "Image processor, image processing method, and image pickup apparatus", was devised by 古闲史彦 and 山口哲司 on 2019-01-29. Its main content is as follows: An image processor according to the present disclosure includes: an image division processing section configured to generate a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other; an interpolation processing section configured to generate a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data; and a synthesis processing section configured to generate third map data by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

1. An image processor comprising:

an image division processing section configured to generate a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

an interpolation processing section configured to generate a plurality of second map data corresponding to the plurality of first map data by: determining a pixel value at a position where no pixel value exists in each of the plurality of first map data by interpolation processing; and

a synthesis processing section configured to generate third map data by: generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

2. The image processor of claim 1, wherein each of the pixel value arrangement patterns is a checkerboard pattern.

3. The image processor of claim 1, further comprising an interpolation controller configured to determine a processing method in the interpolation processing based on the first image map data.

4. The image processor of claim 3, wherein the interpolation controller is configured to determine the processing method by: determining an interpolation direction in the interpolation process based on the first image map data.

5. The image processor of claim 3, wherein the interpolation controller is configured to determine spatial frequency information based on the first image map data and to determine the processing method based on the spatial frequency information.

6. The image processor of claim 3, wherein the interpolation controller is configured to generate synthesized map data based on the first, second, and third image map data, and to determine the processing method in the interpolation processing based on the synthesized map data.

7. The image processor of claim 1, wherein,

the image division processing section is configured to generate a plurality of fourth map data based also on second image map data including a plurality of pixel values, the plurality of fourth map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other,

the interpolation processing section is configured to generate a plurality of fifth map data corresponding to the plurality of fourth map data by: determining, with the interpolation process, a pixel value at a position where no pixel value exists in each of the plurality of fourth map data,

the pixel value arrangement patterns in the plurality of first map data include a first arrangement pattern and a second arrangement pattern, and

the pixel value arrangement patterns in the plurality of fourth map data include the first arrangement pattern and the second arrangement pattern.

8. The image processor of claim 7, wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color, and

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color and a plurality of pixel values of a third color.

9. The image processor of claim 7, wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color, and

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color, a plurality of pixel values of a third color, and a plurality of pixel values of a fourth color.

10. The image processor of claim 7, wherein,

the synthesis processing section is configured to generate sixth map data by: generating, based on the pixel values at positions corresponding to each other in the plurality of fifth map data, pixel values at positions corresponding to those positions,

the image division processing section is configured to generate a plurality of seventh map data based also on third image map data including a plurality of pixel values, the plurality of seventh map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other,

the interpolation processing section is configured to generate a plurality of eighth map data corresponding to the plurality of seventh map data by: determining, with the interpolation process, a pixel value at a position where no pixel value exists in each of the plurality of seventh map data,

the synthesis processing section is configured to generate ninth map data by: generating, based on the pixel values at positions corresponding to each other in the plurality of eighth map data, pixel values at positions corresponding to those positions, and

the pixel value arrangement patterns in the plurality of seventh map data include the first arrangement pattern and the second arrangement pattern.

11. The image processor of claim 10, wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color,

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color, and

the plurality of pixel values in the third image map data comprise a plurality of pixel values of a third color.

12. The image processor of claim 10, wherein a number of the plurality of pixel values in the first image map data is different from a number of the plurality of pixel values in the second image map data.

13. The image processor of claim 12, wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of green, and

two or more pixel values in the first image map data are associated with one pixel value in the second image map data.

14. The image processor of claim 1, further comprising a generator configured to generate the first image map data based on an image signal, wherein the first image map data comprises brightness map data.

15. The image processor of claim 1, further comprising a process controller configured to control whether the image division processing section, the interpolation processing section, and the synthesis processing section perform processing.

16. The image processor of claim 15, further comprising a processing section configured to perform predetermined signal processing based on the first image map data or the third map data,

wherein the process controller is configured to cause the processing section to perform the predetermined signal processing based on the first image map data in a first operation mode, and to perform the predetermined signal processing based on the third map data in a second operation mode.

17. The image processor of claim 16, wherein the process controller is configured to control, based on a parameter, whether the image division processing section, the interpolation processing section, and the synthesis processing section perform processing.

18. The image processor of claim 17, wherein,

the first image map data is supplied from an image pickup section,

the parameter includes a gain value in the image pickup section, and

in a case where the gain value is higher than a predetermined gain value, the process controller performs control to cause the image division processing section, the interpolation processing section, and the synthesis processing section to perform processing.

19. An image processing method comprising:

image division processing: generating a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

interpolation processing: generating a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data; and

synthesis processing: generating third map data by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, pixel values at positions corresponding to those positions.

20. An image pickup apparatus comprising:

an image pickup section that generates first image map data including a plurality of pixel values;

an image division processing section configured to generate a plurality of first map data based on the first image map data, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

an interpolation processing section configured to generate a plurality of second map data corresponding to the plurality of first map data by: determining a pixel value at a position where no pixel value exists in each of the plurality of first map data by interpolation processing; and

a synthesis processing section configured to generate third map data by: generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

Technical Field

The present disclosure relates to an image processor for performing image processing, an image processing method, and an image pickup apparatus including the image processor.

Background

In an image pickup apparatus, a picked-up image is generated based on electric signals converted by photoelectric converters for red, green, and blue. For example, Patent Document 1 discloses red, green, and blue photoelectric converters stacked in one pixel region.

List of references

Patent document

Patent Document 1: Japanese Unexamined Patent Application Publication No. 2011-138927

Disclosure of Invention

Incidentally, an image pickup apparatus is expected to provide a captured image of high image quality, and further improvement in image quality is desired.

It is desirable to provide an image processor, an image processing method, and an image pickup apparatus that can improve the image quality of a captured image.

An image processor according to an embodiment of the present disclosure includes an image division processing section, an interpolation processing section, and a synthesis processing section. The image division processing section is configured to generate a plurality of first map data based on first image map data (image map data) including a plurality of pixel values. The plurality of first map data have pixel value arrangement patterns different from each other and include pixel values located at positions different from each other. The interpolation processing section is configured to generate a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data. The synthesis processing section is configured to generate third map data by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

An image processing method according to an embodiment of the present disclosure includes: image division processing: generating a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other; interpolation processing: generating a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data; and synthesis processing: generating third map data by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, pixel values at positions corresponding to those positions.

An image pickup apparatus according to an embodiment of the present disclosure includes an image pickup section, an image division processing section, an interpolation processing section, and a synthesis processing section. The image pickup section generates first image map data including a plurality of pixel values. The image division processing section is configured to generate a plurality of first map data based on the first image map data. The plurality of first map data have pixel value arrangement patterns different from each other and include pixel values located at positions different from each other. The interpolation processing section is configured to generate a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data. The synthesis processing section is configured to generate third map data by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

Here, the "image pickup apparatus" is not limited to only a so-called image sensor, and includes electronic apparatuses having an image pickup function such as a digital camera and a smartphone.

In the image processor, the image processing method, and the image pickup apparatus according to the embodiments of the present disclosure, a plurality of first map data are generated based on the first image map data by image division processing. The plurality of first map data have pixel value arrangement patterns different from each other and include pixel values located at positions different from each other. Subsequently, a plurality of second map data are generated from the plurality of first map data by interpolation processing, which determines a pixel value at each position where no pixel value exists in each of the plurality of first map data. Subsequently, third map data is generated from the plurality of second map data by synthesis processing, which generates, based on the pixel values at positions corresponding to each other in the plurality of second map data, a pixel value at a position corresponding to those positions.

According to the image processor, the image processing method, and the image pickup apparatus of the embodiments of the present disclosure, a plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other are generated based on the first image map data; a plurality of second map data are generated by determining, with interpolation processing, a pixel value at each position where no pixel value exists in each of the plurality of first map data; and third map data is generated by generating, based on the pixel values at positions corresponding to each other in the plurality of second map data, pixel values at positions corresponding to those positions. This makes it possible to improve the image quality of a captured image. It should be noted that the effects described herein are not necessarily limited, and any of the effects described in the present disclosure may be achieved.

Drawings

Fig. 1 is a block diagram showing a configuration example of an image capturing apparatus according to a first embodiment of the present disclosure.

Fig. 2 is a block diagram showing a configuration example of the image pickup section shown in fig. 1.

Fig. 3 is a schematic diagram showing a configuration example of the image pickup pixel shown in fig. 2.

Fig. 4 is a schematic diagram showing a configuration example of the image pickup pixel shown in fig. 2.

Fig. 5 is a flowchart showing an example of the operation of the image processing section shown in fig. 1.

Fig. 6 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 1.

Fig. 7 is an explanatory diagram showing an example of the image map data shown in fig. 6.

Fig. 8A is another explanatory diagram showing an example of the map data shown in fig. 6.

Fig. 8B is another explanatory diagram showing an example of the map data shown in fig. 6.

Fig. 9A is another explanatory diagram showing an example of the map data shown in fig. 6.

Fig. 9B is another explanatory diagram showing an example of the map data shown in fig. 6.

Fig. 10 is another explanatory diagram showing an example of the map data shown in fig. 6.

Fig. 11 is an explanatory diagram showing an example of the operation of the image processing section according to the modification.

Fig. 12A is another explanatory diagram showing an example of map data according to another modification.

Fig. 12B is another explanatory diagram showing an example of map data according to another modification.

Fig. 13A is another explanatory diagram showing an example of map data according to another modification.

Fig. 13B is another explanatory diagram showing an example of map data according to another modification.

Fig. 14A is another explanatory diagram showing an example of map data according to another modification.

Fig. 14B is another explanatory diagram showing an example of map data according to another modification.

Fig. 15A is another explanatory diagram showing an example of map data according to another modification.

Fig. 15B is another explanatory diagram showing an example of map data according to another modification.

Fig. 16A is another explanatory diagram showing an example of map data according to another modification.

Fig. 16B is another explanatory diagram showing an example of map data according to another modification.

Fig. 17A is another explanatory diagram showing an example of map data according to another modification.

Fig. 17B is another explanatory diagram showing an example of map data according to another modification.

Fig. 18A is another explanatory diagram showing an example of map data according to another modification.

Fig. 18B is another explanatory diagram showing an example of map data according to another modification.

Fig. 18C is another explanatory diagram showing an example of map data according to another modification.

Fig. 19A is another explanatory diagram showing an example of map data according to another modification.

Fig. 19B is another explanatory diagram showing an example of map data according to another modification.

Fig. 19C is another explanatory diagram showing an example of map data according to another modification.

Fig. 20 is another explanatory diagram showing an example of map data according to another modification.

Fig. 21 is a block diagram showing a configuration example of an image capturing apparatus according to another modification.

Fig. 22 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 21.

Fig. 23A is an explanatory diagram illustrating an operation example of the image processing section illustrated in fig. 21.

Fig. 23B is an explanatory diagram illustrating an operation example of the image processing section illustrated in fig. 21.

Fig. 23C is an explanatory diagram illustrating an operation example of the image processing section illustrated in fig. 21.

Fig. 24 is an explanatory diagram showing an example of the operation of the image processing section according to another modification.

Fig. 25 is a block diagram showing a configuration example of an image capturing apparatus according to another modification.

Fig. 26 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 25.

Fig. 27 is a block diagram showing a configuration example of an image capturing apparatus according to the second embodiment.

Fig. 28 is an explanatory diagram illustrating a configuration example of an image pickup pixel in the image pickup portion shown in fig. 27.

Fig. 29 is a schematic diagram illustrating a configuration example of an image pickup pixel in the image pickup portion shown in fig. 27.

Fig. 30 is a schematic diagram showing an example of the operation of the image processing section shown in fig. 27.

Fig. 31 is a block diagram showing a configuration example of an image capturing apparatus according to the third embodiment.

Fig. 32 is an explanatory diagram illustrating a configuration example of image pickup pixels in the image pickup portion shown in fig. 31.

Fig. 33 is a schematic diagram illustrating a configuration example of an image pickup pixel in the image pickup portion shown in fig. 31.

Fig. 34 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 31.

Fig. 35 is a block diagram showing a configuration example of an image capturing apparatus according to a modification.

Fig. 36 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 35.

Fig. 37 is a block diagram showing a configuration example of an image capturing apparatus according to the fourth embodiment.

Fig. 38 is an explanatory diagram illustrating a configuration example of image pickup pixels in the image pickup portion shown in fig. 37.

Fig. 39 is a schematic diagram illustrating a configuration example of an image pickup pixel in the image pickup portion shown in fig. 37.

Fig. 40 is an explanatory diagram showing an example of the operation of the image processing section shown in fig. 37.

Fig. 41 is an explanatory diagram showing a use example of the image pickup apparatus.

Fig. 42 is a block diagram showing a schematic configuration example of the in-vivo information acquisition system.

Fig. 43 is a view showing a schematic configuration example of the endoscopic surgery system.

Fig. 44 is a block diagram showing a functional configuration example of a camera head and a Camera Control Unit (CCU).

Fig. 45 is a block diagram showing a schematic configuration example of the vehicle control system.

Fig. 46 is an explanatory view of assistance in explaining the mounting positions of the vehicle exterior information detecting portion and the image pickup portion.

Detailed Description

Embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be noted that the description is given in the following order:

1. First embodiment

2. Second embodiment

3. Third embodiment

4. Fourth embodiment

5. Use example of image pickup apparatus

6. Application example

<1. first embodiment >

[Configuration example]

Fig. 1 shows a configuration example of an image pickup apparatus 1 including an image processor according to a first embodiment. It should be noted that an image processing method according to an embodiment of the present disclosure is embodied by the present embodiment and is therefore described together. The image pickup apparatus 1 includes an optical system 9, an image pickup section 10, and an image processing section 20.

For example, the optical system 9 includes a lens that forms an image on the image pickup surface S of the image pickup section 10.

The image pickup section 10 picks up an image of an object to generate an image signal DT and a gain signal SGAIN. The image pickup section 10 is configured using, for example, a CMOS (complementary metal oxide semiconductor) image sensor.

Fig. 2 shows a configuration example of the image pickup section 10. The image pickup section 10 includes a pixel array 11, a scanning section 12, a readout section 13, and an image pickup controller 14.

The pixel array 11 includes a plurality of image pickup pixels P arranged in a matrix. The image pickup pixels P each include a photoelectric converter configured to receive red (R) light, a photoelectric converter configured to receive green (G) light, and a photoelectric converter configured to receive blue (B) light.

Fig. 3 schematically shows a cross-sectional configuration of two of the four image pickup pixels P arranged in the region X illustrated in Fig. 2.

The semiconductor substrate 100 includes two photodiodes PDR and PDB formed in a pixel region corresponding to one image pickup pixel P. The photodiode PDR is a photoelectric converter configured to receive red (R) light, and the photodiode PDB is a photoelectric converter configured to receive blue (B) light. The photodiode PDR and the photodiode PDB are formed and stacked in the semiconductor substrate 100 in such a manner that the photodiode PDB is located on the image pickup surface S side. The photodiode PDR and the photodiode PDB perform photoelectric conversion based on red light and blue light, respectively, by using the fact that the absorption coefficient of light in the semiconductor substrate 100 differs depending on the wavelength of light.

An insulating film 101 is formed on the surface of the semiconductor substrate 100 on the image pickup surface S side. The insulating film 101 includes, for example, silicon dioxide (SiO2). A transparent electrode 102, a photoelectric conversion film 103G, and a transparent electrode 104 are formed in this order on the insulating film 101. The transparent electrodes 102 and 104 are electrodes that allow red light, green light, and blue light to pass through. The photoelectric conversion film 103G is configured to receive green (G) light and allow red light and blue light to pass through. The photoelectric conversion film 103G and the transparent electrodes 102 and 104 constitute a photoelectric converter configured to receive green (G) light. An on-chip lens 105 is formed on the transparent electrode 104.

Fig. 4 schematically illustrates the positions of the photoelectric converters in the four image pickup pixels P arranged in the region X illustrated in fig. 2. In the image pickup section 10, a photoelectric converter related to green (G), a photoelectric converter related to blue (B), and a photoelectric converter related to red (R) are formed and stacked in a pixel region corresponding to one image pickup pixel P. Thereby, each image pickup pixel P can generate a pixel signal relating to red, a pixel signal relating to green, and a pixel signal relating to blue in the image pickup section 10.

The scanning section 12 sequentially drives a plurality of image pickup pixels P in the pixel array 11, for example, in units of pixel lines based on an instruction from the image pickup controller 14, and includes, for example, an address decoder.

The readout section 13 performs AD conversion based on an instruction from the image pickup controller 14 and on pixel signals supplied from the respective image pickup pixels P to generate an image signal DT. The image signal DT includes three image map data MPG, MPB, and MPR. The image map data MPG includes pixel values for one frame of the image relating to green (G); the image map data MPB includes pixel values for one frame of the image relating to blue (B); and the image map data MPR includes pixel values for one frame of the image relating to red (R). Each pixel value is represented by a digital code having a plurality of bits.
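For illustration only, the sketch below (not part of the original disclosure) models the image signal DT as three per-color map data, each a two-dimensional array of multi-bit digital codes; the class name, frame size, and 12-bit depth are assumptions made purely for the example.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImageSignalDT:
    """Hypothetical container for the image signal DT: one frame of pixel
    values per color, each value a multi-bit digital code (modeled here
    as 12-bit codes stored in uint16 arrays)."""
    mpg: np.ndarray  # image map data MPG, green (G)
    mpb: np.ndarray  # image map data MPB, blue (B)
    mpr: np.ndarray  # image map data MPR, red (R)

# Example: an 8x8 frame filled with arbitrary 12-bit codes.
rng = np.random.default_rng(0)
dt = ImageSignalDT(*(rng.integers(0, 4096, (8, 8), dtype=np.uint16) for _ in range(3)))
```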

The image pickup controller 14 supplies control signals to the scanning section 12 and the readout section 13 to control the operations of these circuits, thereby controlling the operation of the image pickup section 10. Further, the image pickup controller 14 also has a function for setting a conversion gain GC of the AD conversion performed by the readout section 13. Specifically, in the case where the image pickup section 10 picks up an image of a dark subject, the image pickup controller 14 increases the conversion gain GC of the AD conversion to be performed, and in the case where the image pickup section 10 picks up an image of a bright subject, the image pickup controller 14 decreases the conversion gain GC of the AD conversion to be performed. Thereby, the image capturing apparatus 1 can capture images of subjects having various luminance levels. Further, the image pickup controller 14 has a function of outputting information on the conversion gain GC as a gain signal SGAIN.

The image processing section 20 (Fig. 1) performs image processing based on the image signal DT and the gain signal SGAIN. The image processing section 20 includes a switching section 21, an image division processing section 22, an interpolation processing section 23, a synthesis processing section 24, and a signal processing section 25.

The switching section 21 selectively supplies the image signal DT to the image division processing section 22 or the signal processing section 25 based on the conversion gain GC indicated by the gain signal SGAIN. Specifically, for example, the switching section 21 supplies the image signal DT to the image division processing section 22 in a case where the conversion gain GC is higher than a predetermined threshold Gth, and supplies the image signal DT to the signal processing section 25 in a case where the conversion gain GC is lower than the predetermined threshold Gth. Thus, the image division processing section 22, the interpolation processing section 23, and the synthesis processing section 24 of the image processing section 20 perform processing when the conversion gain GC is higher than the predetermined threshold Gth, and do not perform processing when the conversion gain GC is lower than the predetermined threshold Gth.

The image division processing section 22 performs image division processing A1 based on the three image map data MPG, MPB, and MPR included in the image signal DT supplied from the image pickup section 10 via the switching section 21 to generate six map data MG11, MG12, MB11, MB12, MR11, and MR12. Specifically, the image division processing section 22 generates two map data MG11 and MG12 based on the image map data MPG relating to green (G) included in the image signal DT, the map data MG11 and MG12 having pixel value arrangement patterns PAT different from each other and including pixel values located at positions different from each other, as described later. Similarly, the image division processing section 22 generates two map data MB11 and MB12 based on the image map data MPB relating to blue (B) included in the image signal DT, and generates two map data MR11 and MR12 based on the image map data MPR relating to red (R) included in the image signal DT. Thus, the image division processing section 22 generates the six map data MG11, MG12, MB11, MB12, MR11, and MR12 based on the image signal DT.

The interpolation processing section 23 performs interpolation processing A2 on each of the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image division processing section 22 to generate six map data MG21, MG22, MB21, MB22, MR21, and MR22. Specifically, as described later, the interpolation processing section 23 determines, using the interpolation processing A2, a pixel value at each position where no pixel value exists in the map data MG11 relating to green (G) to generate the map data MG21, and determines, using the interpolation processing A2, a pixel value at each position where no pixel value exists in the map data MG12 relating to green (G) to generate the map data MG22. Similarly, the interpolation processing section 23 performs the interpolation processing A2 on the map data MB11 relating to blue (B) to generate map data MB21, and performs the interpolation processing A2 on the map data MB12 relating to blue (B) to generate map data MB22. Further, the interpolation processing section 23 performs the interpolation processing A2 on the map data MR11 relating to red (R) to generate map data MR21, and performs the interpolation processing A2 on the map data MR12 relating to red (R) to generate map data MR22.

The synthesis processing section 24 performs synthesis processing A3 based on the six map data MG21, MG22, MB21, MB22, MR21, and MR22 supplied from the interpolation processing section 23 to generate three map data MG3, MB3, and MR3. Specifically, as described later, the synthesis processing section 24 generates map data MG3 based on the two map data MG21 and MG22 relating to green (G). Similarly, the synthesis processing section 24 generates map data MB3 based on the two map data MB21 and MB22 relating to blue (B), and generates map data MR3 based on the two map data MR21 and MR22 relating to red (R). The synthesis processing section 24 then supplies the three map data MG3, MB3, and MR3 to the signal processing section 25 as an image signal DT2.

The signal processing section 25 performs predetermined signal processing based on the image signal DT2 supplied from the synthesis processing section 24, or based on the image signal DT supplied from the image pickup section 10 via the switching section 21. The predetermined signal processing includes, for example, white balance adjustment, nonlinear conversion, contour enhancement processing, and image size conversion. The signal processing section 25 then outputs the result of the predetermined signal processing as an image signal DT3.

With this configuration, in a case where the image pickup apparatus 1 captures an image of a dark subject, the conversion gain GC is increased; thus, the image division processing A1, the interpolation processing A2, and the synthesis processing A3 are performed. This makes it possible to increase the signal-to-noise ratio (S/N ratio) of the captured image in the image pickup apparatus 1. In a case where the image pickup apparatus 1 captures an image of a bright subject, the conversion gain GC is reduced; therefore, the image division processing A1, the interpolation processing A2, and the synthesis processing A3 are not performed. This makes it possible to improve the resolution of the captured image in the image pickup apparatus 1.

Here, the image processing section 20 corresponds to a specific example of "image processor" in the present disclosure. The image division processing section 22 corresponds to a specific example of the "image division processing section" in the present disclosure. The interpolation processing section 23 corresponds to a specific example of "interpolation processing section" in the present disclosure. The synthesis processing section 24 corresponds to a specific example of "synthesis processing section" in the present disclosure. The signal processing section 25 corresponds to a specific example of "processing section" in the present disclosure. The switching section 21 corresponds to a specific example of "process controller" in the present disclosure.

[Operation and effects]

Next, the operation and effects of the image pickup apparatus 1 according to the present embodiment are described.

(Overview of overall operation)

First, an overview of the overall operation of the image pickup apparatus 1 is described with reference to Fig. 1. The image pickup section 10 captures an image of a subject to generate an image signal DT and a gain signal SGAIN. The switching section 21 of the image processing section 20 selectively supplies the image signal DT to the image division processing section 22 or the signal processing section 25 based on the conversion gain GC indicated by the gain signal SGAIN. The image division processing section 22 performs image division processing A1 based on the three image map data MPG, MPB, and MPR included in the image signal DT supplied from the image pickup section 10 via the switching section 21 to generate six map data MG11, MG12, MB11, MB12, MR11, and MR12. The interpolation processing section 23 performs interpolation processing A2 on each of the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image division processing section 22 to generate six map data MG21, MG22, MB21, MB22, MR21, and MR22. The synthesis processing section 24 performs synthesis processing A3 based on the six map data MG21, MG22, MB21, MB22, MR21, and MR22 supplied from the interpolation processing section 23 to generate three map data MG3, MB3, and MR3. The synthesis processing section 24 then supplies the three map data MG3, MB3, and MR3 to the signal processing section 25 as an image signal DT2. The signal processing section 25 performs predetermined signal processing based on the image signal DT2 supplied from the synthesis processing section 24, or based on the image signal DT supplied from the image pickup section 10 via the switching section 21, to generate an image signal DT3.

(Detailed operation)

Fig. 5 shows an example of the operation of the image processing section 20. The image processing section 20 determines whether to perform the image division processing A1, the interpolation processing A2, and the synthesis processing A3 based on the conversion gain GC indicated by the gain signal SGAIN. This operation is described in detail below.

First, the switching section 21 compares the conversion gain GC indicated by the gain signal SGAIN with the predetermined threshold Gth (step S101). In a case where the conversion gain GC is lower than the predetermined threshold Gth ("NO" in step S101), the process proceeds to step S105.

In a case where the conversion gain GC is equal to or higher than the predetermined threshold Gth (GC ≥ Gth) ("YES" in step S101), the image division processing section 22 performs the image division processing A1 (step S102), the interpolation processing section 23 performs the interpolation processing A2 (step S103), and the synthesis processing section 24 performs the synthesis processing A3 (step S104).

Subsequently, the signal processing section 25 performs the predetermined signal processing (step S105). That is, the signal processing section 25 performs the predetermined signal processing based on the image signal DT2 generated by the synthesis processing A3 in the case where the conversion gain GC is equal to or higher than the predetermined threshold Gth ("YES" in step S101), and performs the predetermined signal processing based on the image signal DT generated by the image pickup section 10 in the case where the conversion gain GC is lower than the predetermined threshold Gth ("NO" in step S101).

Thus, the flow ends.
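The flow of Fig. 5 can be summarized by the sketch below. It is only a rough illustration of the gain-based switching: the processing functions are placeholders standing in for the image division processing A1, the interpolation processing A2, and the synthesis processing A3 (sketched concretely later in this section), and the threshold value is an arbitrary assumption.

```python
G_TH = 8.0  # predetermined threshold Gth (arbitrary value for illustration)

def process_frame(image_map, conversion_gain,
                  divide, interpolate, synthesize, signal_processing):
    """Sketch of steps S101-S105: run A1/A2/A3 only when the conversion
    gain GC is equal to or higher than the threshold Gth."""
    if conversion_gain >= G_TH:                        # step S101, "YES"
        m11, m12 = divide(image_map)                   # step S102: image division A1
        m21, m22 = interpolate(m11), interpolate(m12)  # step S103: interpolation A2
        m3 = synthesize(m21, m22)                      # step S104: synthesis A3
        return signal_processing(m3)                   # step S105 on image signal DT2
    return signal_processing(image_map)                # step S105 on image signal DT
```

In the actual apparatus the same decision would apply to each of the three color planes MPG, MPB, and MPR; a single plane is shown here for brevity.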

As described above, the image processing section 20 performs the image division processing A1, the interpolation processing A2, and the synthesis processing A3 in a case where the conversion gain GC is high, and does not perform the image division processing A1, the interpolation processing A2, and the synthesis processing A3 in a case where the conversion gain GC is low. As described later, this makes it possible to improve the image quality of a captured image in the image pickup apparatus 1.

Next, the image division processing A1, the interpolation processing A2, and the synthesis processing A3 are described in detail with reference to a specific operation example.

Fig. 6 schematically shows an example of the image division processing A1, the interpolation processing A2, and the synthesis processing A3 in the image processing section 20.

(Image division processing A1)

The image division processing section 22 performs the image division processing A1 based on the three image map data MPG, MPB, and MPR included in the image signal DT supplied from the image pickup section 10 to generate the six map data MG11, MG12, MB11, MB12, MR11, and MR12. As an example, the image division processing A1 on the image map data MPG relating to green (G) is described in detail below.

Fig. 7 schematically shows the image map data MPG relating to green (G). Figs. 8A and 8B schematically show the map data MG11 and MG12 relating to green (G), respectively. In Figs. 8A and 8B, a hatched portion indicates a position where a pixel value exists, and an unshaded portion indicates a position where no pixel value exists.

The image map data MPG (Fig. 7) included in the image signal DT includes pixel values for one frame of the image relating to green (G). The example of the image map data MPG shown in Fig. 6 schematically shows the four pixel values arranged in two rows and two columns in the region X shown in Fig. 7.

The image division processing section 22 generates the two map data MG11 and MG12 (Figs. 8A and 8B) based on such image map data MPG, the two map data MG11 and MG12 having pixel value arrangement patterns PAT different from each other and including pixel values located at positions different from each other. The pixel value arrangement patterns PAT in the map data MG11 and MG12 are checkerboard patterns that are shifted from each other by one pixel in the horizontal direction (lateral direction) and the vertical direction (longitudinal direction). In other words, in the checkerboard patterns of the map data MG11 and MG12, the positions where pixel values exist and the positions where no pixel values exist are reversed with respect to each other, so that the pixel values are arranged at positions different from each other. Specifically, for example, in the map data MG11, as shown in Fig. 8A, pixel values exist at the upper left and the lower right of the region X, and no pixel values exist at the lower left and the upper right of the region X. In contrast, in the map data MG12, as shown in Fig. 8B, pixel values exist at the lower left and the upper right of the region X, and no pixel values exist at the upper left and the lower right of the region X. The example of each of the map data MG11 and MG12 shown in Fig. 6 schematically shows the four pixel values in this region X. The pixel value at each position in the map data MG11 is the same as the pixel value at the corresponding position in the image map data MPG; similarly, the pixel value at each position in the map data MG12 is the same as the pixel value at the corresponding position in the image map data MPG.

The image division processing section 22 performs the image division processing A1 based on the image map data MPG to generate such map data MG11 and MG12. Similarly, the image division processing section 22 performs the image division processing A1 based on the image map data MPB to generate map data MB11 and MB12, and performs the image division processing A1 based on the image map data MPR to generate map data MR11 and MR12. As shown in Fig. 6, the map data MG11, MB11, and MR11 have the same arrangement pattern PAT, and the map data MG12, MB12, and MR12 have the same arrangement pattern PAT.
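As a minimal sketch of the image division processing A1 described above (an illustration under assumptions, not the disclosure itself), the function below splits one image map into two map data with complementary checkerboard arrangement patterns; positions without a pixel value are represented by NaN, and the function name is hypothetical.

```python
import numpy as np

def divide_checkerboard(image_map):
    """Image division processing A1 (sketch): split one image map into two
    map data whose checkerboard patterns are shifted from each other by one
    pixel, so each pixel value appears in exactly one of the two outputs."""
    rows, cols = np.indices(image_map.shape)
    on_even = (rows + cols) % 2 == 0             # upper-left / lower-right positions
    m11 = np.where(on_even, image_map, np.nan)   # MG11-like map data
    m12 = np.where(~on_even, image_map, np.nan)  # MG12-like map data
    return m11, m12

# The same division is applied to MPG, MPB, and MPR, giving six map data in total.
```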

(Interpolation processing A2)

Next, the interpolation processing section 23 performs the interpolation processing A2 on each of the six map data MG11, MG12, MB11, MB12, MR11, and MR12 generated by the image division processing A1 to generate the six map data MG21, MG22, MB21, MB22, MR21, and MR22. As an example, the interpolation processing A2 for the map data MG11 and MG12 (Figs. 8A and 8B) relating to green (G) is described in detail below.

Figs. 9A and 9B schematically show the map data MG21 and MG22 relating to green (G), respectively. In Figs. 9A and 9B, a shaded portion indicates a position where a pixel value exists in the map data MG11 and MG12 before the interpolation processing A2, and an unshaded portion indicates a position where no pixel value exists in the map data MG11 and MG12 before the interpolation processing A2 and where a pixel value is generated by the interpolation processing A2.

The interpolation processing section 23 determines, using the interpolation processing A2, a pixel value at each position where no pixel value exists in the map data MG11 shown in Fig. 8A to generate the map data MG21 shown in Fig. 9A, and determines, using the interpolation processing A2, a pixel value at each position where no pixel value exists in the map data MG12 shown in Fig. 8B to generate the map data MG22 shown in Fig. 9B. Specifically, the interpolation processing section 23 determines the pixel value at a position where no pixel value exists by performing the interpolation processing A2 based on the pixel values located above, below, to the left of, and to the right of that position. That is, in this example, the interpolation method of the interpolation processing A2 uses the pixel values above, below, to the left of, and to the right of the position where no pixel value exists. The interpolation processing section 23 can perform the interpolation processing A2 by, for example, bilinear interpolation using these four pixel values. It should be noted that the interpolation method is not limited to this, and various known interpolation methods such as bicubic interpolation and spline interpolation may be used. For example, in the map data MG21, as shown in Fig. 9A, the interpolation processing section 23 generates a pixel value at the lower left position in the region X by the interpolation processing A2, and generates a pixel value at the upper right position in the region X by the interpolation processing A2. Similarly, in the map data MG22, as shown in Fig. 9B, the interpolation processing section 23 generates a pixel value at the upper left position in the region X by the interpolation processing A2, and generates a pixel value at the lower right position in the region X by the interpolation processing A2. In Figs. 9A and 9B, "G'" denotes a pixel value generated by the interpolation processing A2. The example of each of the map data MG21 and MG22 shown in Fig. 6 schematically shows the four pixel values in this region X.

The interpolation processing section 23 performs the interpolation processing A2 on the map data MG11 to generate such map data MG21, and performs the interpolation processing A2 on the map data MG12 to generate such map data MG22. Similarly, the interpolation processing section 23 performs the interpolation processing A2 on the map data MB11 to generate map data MB21, and performs the interpolation processing A2 on the map data MB12 to generate map data MB22. Further, the interpolation processing section 23 performs the interpolation processing A2 on the map data MR11 to generate map data MR21, and performs the interpolation processing A2 on the map data MR12 to generate map data MR22. The six map data MG21, MG22, MB21, MB22, MR21, and MR22 are generated using the same interpolation method.
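The interpolation processing A2 described above can be sketched as follows, assuming the simple four-neighbour average mentioned in the text (edge positions use only the neighbours that exist). This is merely one possible illustration; bicubic, spline, or other interpolation methods may be used instead.

```python
import numpy as np

def interpolate_checkerboard(map_data):
    """Interpolation processing A2 (sketch): fill each position without a
    pixel value (NaN) with the average of the pixel values located above,
    below, to the left of, and to the right of that position."""
    data = map_data.astype(float)
    padded = np.pad(data, 1, constant_values=np.nan)
    neighbours = np.stack([padded[:-2, 1:-1],   # above
                           padded[2:, 1:-1],    # below
                           padded[1:-1, :-2],   # left
                           padded[1:-1, 2:]])   # right
    valid = ~np.isnan(neighbours)
    # Average only over the neighbours that actually have a pixel value.
    sums = np.where(valid, neighbours, 0.0).sum(axis=0)
    counts = valid.sum(axis=0)
    filled = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
    out = data.copy()
    missing = np.isnan(out)
    out[missing] = filled[missing]
    return out
```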

(Synthesis processing A3)

Next, the synthesis processing section 24 performs the synthesis processing A3 based on the six map data MG21, MG22, MB21, MB22, MR21, and MR22 generated by the interpolation processing A2 to generate the three map data MG3, MB3, and MR3. As an example, the synthesis processing A3 of the map data MG21 and MG22 (Figs. 9A and 9B) relating to green (G) is described in detail below.

Fig. 10 schematically shows the map data MG3 relating to green (G). The synthesis processing section 24 generates, based on the pixel values at positions corresponding to each other in the two map data MG21 and MG22, pixel values at positions corresponding to those positions, thereby generating the map data MG3. Specifically, the synthesis processing section 24 may generate the pixel value at a position in the map data MG3 by summing the pixel values at the corresponding position in the two map data MG21 and MG22. For example, the synthesis processing section 24 generates the upper left pixel value in the region X of the map data MG3 by summing the upper left pixel value in the region X of the map data MG21 shown in Fig. 9A and the upper left pixel value in the region X of the map data MG22 shown in Fig. 9B. Similarly, the synthesis processing section 24 generates the lower left pixel value in the region X of the map data MG3 by summing the lower left pixel values in the region X of the map data MG21 and MG22, generates the upper right pixel value in the region X of the map data MG3 by summing the upper right pixel values in the region X of the map data MG21 and MG22, and generates the lower right pixel value in the region X of the map data MG3 by summing the lower right pixel values in the region X of the map data MG21 and MG22. In Fig. 10, "2G" indicates that each pixel value becomes about twice the corresponding pixel value in the image map data MPG as a result of the synthesis processing A3. The example of the map data MG3 shown in Fig. 6 schematically shows the pixel values in this region X.

The synthesis processing section 24 performs the synthesis processing A3 based on the map data MG21 and MG22 to generate such map data MG3. Similarly, the synthesis processing section 24 performs the synthesis processing A3 based on the map data MB21 and MB22 to generate such map data MB3, and performs the synthesis processing A3 based on the map data MR21 and MR22 to generate such map data MR3. The pixel values in the map data MB3 are about twice the pixel values in the image map data MPB, and the pixel values in the map data MR3 are about twice the pixel values in the image map data MPR.
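The synthesis processing A3 described above amounts to a position-wise sum of the two interpolated map data, as in the sketch below; the 2x2 values standing for the region X are invented purely for illustration.

```python
import numpy as np

def synthesize(m21, m22):
    """Synthesis processing A3 (sketch): sum the pixel values at
    corresponding positions of the two second map data, so each value in
    the third map data is roughly twice the original pixel value."""
    return m21 + m22

# Illustration on a hypothetical 2x2 region X (values are made up):
mg21 = np.array([[100.0, 101.0],   # upper-right value was interpolated (G')
                 [ 99.0, 102.0]])  # lower-left value was interpolated (G')
mg22 = np.array([[100.5, 101.0],   # upper-left value was interpolated (G')
                 [ 99.0, 101.5]])  # lower-right value was interpolated (G')
mg3 = synthesize(mg21, mg22)       # each entry is about "2G"
```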

As described above, the synthesis processing section 24 generates the three map data MG3, MB3, and MR3. Subsequently, the synthesis processing section 24 supplies the three map data MG3, MB3, and MR3 to the signal processing section 25 as the image signal DT2.

Here, the image map data MPG, MPB, and MPR correspond to specific examples of "first image map data", "second image map data", and "third image map data" in the present disclosure, respectively. The map data MG11 and MG12 correspond to a specific example of "a plurality of first map data" in the present disclosure. The map data MG21 and MG22 correspond to a specific example of "a plurality of second map data" in the present disclosure. The map data MG3 corresponds to a specific example of "third map data" in the present disclosure. The map data MB11 and MB12 correspond to a specific example of "a plurality of fourth map data" in the present disclosure. The map data MB21 and MB22 correspond to a specific example of "a plurality of fifth map data" in the present disclosure. The map data MB3 corresponds to a specific example of "sixth map data" in the present disclosure. The map data MR11 and MR12 correspond to a specific example of "a plurality of seventh map data" in the present disclosure. The map data MR21 and MR22 correspond to a specific example of "a plurality of eighth map data" in the present disclosure. The map data MR3 corresponds to a specific example of "ninth map data" in the present disclosure.

As described above, in the image pickup apparatus 1, for example, the image division processing A1 is performed based on the image map data MPG to generate the map data MG11 and MG12, the interpolation processing A2 is performed on the map data MG11 and MG12 to generate the map data MG21 and MG22, respectively, and the synthesis processing A3 is performed based on the map data MG21 and MG22 to generate the map data MG3. The same applies to the image map data MPB and MPR. This makes it possible to improve the signal-to-noise ratios (S/N ratios) of the map data MG3, MB3, and MR3 in the image pickup apparatus 1.

That is, for example, the synthesis processing section 24 determines the upper left pixel value in the region X of the map data MG3 by summing the upper left pixel value in the region X of the map data MG21 and the upper left pixel value in the region X of the map data MG22. Each pixel value contains a signal component and a noise component, the noise component being random noise. Accordingly, when the synthesis processing section 24 sums the upper left pixel value in the region X of the map data MG21 and the upper left pixel value in the region X of the map data MG22, the signal component increases by about 2 times, whereas the noise component increases by only about 1.4 times. That is, because the noise component is random noise, the noise component included in the upper left pixel value in the region X of the map data MG21 and the noise component included in the upper left pixel value in the region X of the map data MG22 are independent of each other; therefore, the noise component increases not by about 2 times but by about 1.4 times (the square root of 2). Thus, in the image pickup apparatus 1, the signal component increases by about 2 times while the noise component increases by about 1.4 times, so that the signal-to-noise ratio (S/N ratio) of the map data MG3 can be increased. The same applies to the map data MB3 and MR3. Accordingly, the image quality of the captured image in the image pickup apparatus 1 can be improved.
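The above argument can be written out as follows, assuming for illustration that each of the two summed pixel values carries the same signal component S and an independent zero-mean noise component of standard deviation σ:

\[
\text{signal after summing} = S + S = 2S, \qquad
\text{noise after summing} = \sqrt{\sigma^{2} + \sigma^{2}} = \sqrt{2}\,\sigma \approx 1.4\,\sigma,
\]
\[
\frac{(\mathrm{S/N})_{\text{after}}}{(\mathrm{S/N})_{\text{before}}}
= \frac{2S/(\sqrt{2}\,\sigma)}{S/\sigma} = \sqrt{2} \approx 1.4 \quad (\text{about } +3\,\mathrm{dB}).
\]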

Further, in the image pickup apparatus 1, the pixel value arrangement patterns PAT in the image division processing A1 are checkerboard patterns. Accordingly, as shown in Figs. 8A and 8B, pixel values exist above, below, to the left of, and to the right of each position where no pixel value exists; therefore, the interpolation processing A2 can determine the pixel value at such a position based on these four pixel values. In this way, in the image pickup apparatus 1, the interpolation processing A2 can be performed based on the pixel values on the upper, lower, left, and right sides, whereby the reduction in resolution in the horizontal direction and the reduction in resolution in the vertical direction can be made substantially equal to each other, and the reduction in resolution can be suppressed. As a result, the image quality of the captured image in the image pickup apparatus 1 can be improved.

Further, in the image pickup apparatus 1, in the image division processing A1, the pixel value arrangement patterns PAT in the map data MG11, MB11, and MR11 are identical to each other, and the pixel value arrangement patterns PAT in the map data MG12, MB12, and MR12 are identical to each other. Thus, the image division processing section 22 can perform the image division processing A1 on the three image map data MPG, MPB, and MPR using the same method, and the circuit configuration of the image division processing section 22 can be simplified as compared with the case where the image division processing A1 is performed on the three image map data MPG, MPB, and MPR using different methods.

Further, in the image pickup apparatus 1, in the interpolation processing A2, the interpolation methods for generating the map data MG21, MG22, MB21, MB22, MR21, and MR22 are the same as each other. Thus, the interpolation processing section 23 can generate the six map data MG21, MG22, MB21, MB22, MR21, and MR22 using the same interpolation method, and the circuit configuration of the interpolation processing section 23 can be simplified as compared with the case where the six map data MG21, MG22, MB21, MB22, MR21, and MR22 are generated using different interpolation methods.

Further, in the image pickup apparatus 1, in the image segmentation processing A1, the pixel value arrangement patterns PAT in the map data MG11, MB11, and MR11 are identical to each other, and the pixel value arrangement patterns PAT in the map data MG12, MB12, and MR12 are identical to each other. Thereby, false colors in the captured image of the image pickup apparatus 1 can be suppressed. That is, if, for example, the interpolation processing A2 were performed based on pixel values in the rows above and below a position where no pixel value exists to generate the map data MG21 and MG22 relating to green (G), while the interpolation processing A2 were performed based on pixel values in the columns to the left and right of a position where no pixel value exists to generate the map data MB21 and MB22 relating to blue (B), the interpolation method in the interpolation processing A2 would differ depending on the color, which may cause local false colors. In contrast, in the image pickup apparatus 1 according to the present embodiment, in the image segmentation processing A1, the pixel value arrangement patterns PAT in the map data MG11, MB11, and MR11 are identical to each other, and the pixel value arrangement patterns PAT in the map data MG12, MB12, and MR12 are identical to each other. Furthermore, in the interpolation processing A2, the interpolation methods for generating the six map data MG21, MG22, MB21, MB22, MR21, and MR22 are the same as one another. Thereby, the possibility of such false colors occurring in the image pickup apparatus 1 can be reduced. The image quality of the captured image in the image pickup apparatus 1 can thus be improved.

Further, in the image pickup apparatus 1, whether or not the image division processing, the interpolation processing, and the synthesis processing are executed can be controlled, so that the image quality of the captured image can be improved. Specifically, the image pickup apparatus 1 controls whether to execute the image division processing, the interpolation processing, and the synthesis processing based on the conversion gain GC in the image pickup section 10: the image division processing, the interpolation processing, and the synthesis processing are performed in the case where the conversion gain GC indicated by the gain signal SGAIN is higher than the predetermined threshold Gth, and are not performed in the case where the conversion gain GC indicated by the gain signal SGAIN is lower than the predetermined threshold Gth. Thus, for example, in the case where the image pickup apparatus 1 captures an image of a dark subject, the conversion gain GC increases; therefore, the image division processing A1, the interpolation processing A2, and the synthesis processing A3 are performed. Thereby, the signal-to-noise ratio (S/N ratio) of the captured image of the image pickup apparatus 1 can be improved. That is, in the case of capturing an image of a dark subject, noise is likely to increase; therefore, by performing the image division processing A1, the interpolation processing A2, and the synthesis processing A3, the signal-to-noise ratio (S/N ratio) of the captured image can be improved. Further, in the case where the image pickup apparatus 1 captures an image of a bright subject, the conversion gain GC decreases; therefore, the image division processing A1, the interpolation processing A2, and the synthesis processing A3 are not performed. Thereby, the resolution of the captured image of the image pickup apparatus 1 can be improved. That is, in the case of capturing an image of a bright subject, less noise is generated, and the resolution can be improved by not performing the image division processing A1, the interpolation processing A2, and the synthesis processing A3. Therefore, the image quality of the captured image of the image pickup apparatus 1 can be improved.
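The gain-based switching described above can be pictured as a simple routing function. This is only an illustrative sketch with assumed names; route, divided_path, direct_path, and the threshold value are not from the original text.

```python
GTH = 8.0  # assumed value for the predetermined threshold Gth

def route(image_signal_dt, conversion_gain_gc, divided_path, direct_path, gth=GTH):
    """Select the processing path according to the conversion gain GC."""
    if conversion_gain_gc > gth:
        # Dark subject: run image division A1, interpolation A2, and synthesis A3
        # to prioritise the S/N ratio.
        return divided_path(image_signal_dt)
    # Bright subject: bypass the three processes to preserve resolution.
    return direct_path(image_signal_dt)
```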

[ Effect ]

As described above, in the present embodiment, the image segmentation processing, the interpolation processing, and the synthesis processing are performed, so that the signal-to-noise ratio in the captured image can be improved. Thereby, the image quality of the captured image can be improved.

In the present embodiment, in the image segmentation processing, the pixel value arrangement pattern is a checkerboard pattern, so that it is possible to make the reduction in resolution in the horizontal direction and the reduction in resolution in the vertical direction substantially the same as each other and suppress the reduction in resolution. Thereby, the image quality of the captured image can be improved.

In the present embodiment, in the image segmentation processing, the pixel value arrangement patterns in the map data MG11, MB11, and MR11 are identical to each other, and the pixel value arrangement patterns in the map data MG12, MB12, and MR12 are identical to each other, so that the circuit configuration of the image segmentation processing section can be simplified.

In the present embodiment, interpolation methods for generating six map data in interpolation processing are identical to each other, so that the circuit configuration of the interpolation processing section can be simplified.

In the present embodiment, in the image segmentation processing, the pixel value arrangement patterns in the map data MG11, MB11, and MR11 are identical to each other, and the pixel value arrangement patterns in the map data MG12, MB12, and MR12 are identical to each other. Further, in the interpolation processing, the interpolation methods for generating the six map data are the same as each other. Thereby, the possibility of false colors occurring can be reduced, and the image quality of the captured image can be improved.

In the present embodiment, whether or not the image segmentation processing, the interpolation processing, and the synthesis processing are executed can be controlled, so that the image quality of the captured image can be improved.

[Modification 1-1]

For example, in the above-described embodiment, the synthesis processing section 24 sums up the pixel values at positions corresponding to each other in the two map data MG21 and MG22 to generate the pixel value at the position corresponding to those positions in the map data MG3; however, this is not restrictive. Alternatively, for example, as in the image pickup apparatus 1A shown in fig. 11, the pixel values at positions corresponding to each other in the two map data MG21 and MG22 may be summed and the sum may then be halved to generate the pixel value at the position corresponding to those positions in the map data MG3. Thus, the pixel values in the map data MG3 can be kept substantially equal in level to the pixel values in the image map data MPG. The same applies to the map data MB3 and MR3. Thereby, the number of bits of the digital codes representing the pixel values in the map data MG3, MB3, and MR3 can be reduced while maintaining the signal-to-noise ratio, which facilitates the design of the dynamic range in the signal processing section 25.
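A sketch of this averaging variant, assuming the behaviour described above (sum the two maps, then halve), which keeps the synthesized pixel values on the same numeric scale as the original image map data while retaining the S/N gain:

```python
import numpy as np

def synthesize_averaging(mg21: np.ndarray, mg22: np.ndarray) -> np.ndarray:
    """Synthesis A3 variant: sum corresponding pixel values and halve the result."""
    return (mg21 + mg22) / 2.0
```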

[Modification 1-2]

In the above-described embodiment, the pixel value arrangement pattern PAT in the image division processing a1 is a checkerboard pattern in units of single pixel values, but this is not limitative. The present modification is described in detail below using some examples. It should be noted that, as an example, the map data relating to green (G) is explained below, but the same applies to the map data relating to blue (B) and the map data relating to red (R).

(other checkerboard pattern)

Fig. 12A and 12B show examples of the map data MG11 and MG12 in the case where the pixel value arrangement pattern is a checkerboard pattern in units of four pixel values arranged in two rows and two columns. The pitch in the horizontal direction (lateral direction) in the arrangement pattern PAT shown in fig. 12A and 12B is twice the pitch in the horizontal direction in the arrangement pattern shown in fig. 8A and 8B. Similarly, the pitch in the vertical direction (longitudinal direction) in the arrangement pattern PAT shown in fig. 12A and 12B is twice the pitch in the vertical direction in the arrangement pattern shown in fig. 8A and 8B. The pixel value arrangement patterns PAT in the map data MG11 and MG12 are offset from each other by two pixels in both the horizontal direction and the vertical direction.

Fig. 13A and 13B show examples of the map data MG21 and MG22 generated by performing the interpolation processing A2 based on the map data MG11 and MG12 shown in fig. 12A and 12B. For example, the interpolation processing section 23 can determine the pixel value at a position where no pixel value exists by performing the interpolation processing A2 based on the pixel values in the two rows above, the two rows below, the two columns to the left, and the two columns to the right of that position.

(stripe pattern)

Fig. 14A and 14B show examples of the map data MG11 and MG12 in the case where the pixel value arrangement pattern PAT is a stripe pattern in which positions where pixel values exist and positions where pixel values do not exist are alternately arranged in the horizontal direction (lateral direction). The pixel value arrangement patterns PAT in the map data MG11 and MG12 are shifted from each other by one pixel in the horizontal direction.

Fig. 15A and 15B show examples of the map data MG21 and MG22 generated by performing the interpolation processing A2 based on the map data MG11 and MG12 shown in fig. 14A and 14B. The interpolation processing section 23 can determine the pixel value at a position where no pixel value exists by performing the interpolation processing A2 based on the pixel values in the column to the left and the column to the right of that position.

Fig. 16A and 16B show examples of the map data MG11 and MG12 in the case where the pixel value arrangement pattern PAT is a stripe pattern in which positions where pixel values exist and positions where pixel values do not exist are alternately arranged in the vertical direction. The pixel value arrangement patterns PAT in the map data MG11 and MG12 are shifted from each other by one pixel in the vertical direction.

Fig. 17A and 17B show examples of the map data MG21 and MG22 generated by performing the interpolation processing A2 based on the map data MG11 and MG12 shown in fig. 16A and 16B. The interpolation processing section 23 can determine the pixel value at a position where no pixel value exists by performing the interpolation processing A2 based on the pixel values in the row above and the row below that position.

(other patterns)

In the above example, for example, the image division processing section 22 generates two map data MG11 and MG12 by performing the image division processing a1 based on one image map data MPG, the interpolation processing section 23 generates two map data MG21 and MG22 by performing the interpolation processing a2 on the two map data MG11 and MG12, and the synthesis processing section 24 generates the map data MG3 by performing the synthesis processing A3 based on the two map data MG21 and MG22, but this is not limitative. Alternatively, for example, the image division processing section 22 may generate three map data MG11, MG12, and MG13 by performing the image division processing a1 based on, for example, one image map data MPG, the interpolation processing section 23 may generate three map data MG21, MG22, and MG23 by performing the interpolation processing a2 on the three map data MG11, MG12, and MG13, and the synthesis processing section 24 may generate the map data MG3 by performing the synthesis processing A3 based on the three map data MG21, MG22, and MG 23.

Fig. 18A, 18B, and 18C show examples of the map data MG11, MG12, and MG13 in the case where the pixel value arrangement pattern PAT is a pattern such as a so-called Bayer array. Specifically, in the map data MG11, as shown in fig. 18A, pixel values exist in the upper left of the region X, and no pixel values exist in the lower left, upper right, and lower right. In the map data MG12, as shown in fig. 18B, pixel values exist in the lower left and upper right of the region X, and no pixel values exist in the upper left and lower right. In the map data MG13, as shown in fig. 18C, pixel values exist in the lower right of the region X, and no pixel values exist in the upper left, lower left, and upper right. The pixel value at each position in the map data MG11 is the same as the pixel value at the corresponding position in the image map data MPG, the pixel value at each position in the map data MG12 is the same as the pixel value at the corresponding position in the image map data MPG, and the pixel value at each position in the map data MG13 is the same as the pixel value at the corresponding position in the image map data MPG.

Fig. 19A, 19B, and 19C show examples of the map data MG21, MG22, and MG23 generated by performing the interpolation processing A2 based on the map data MG11, MG12, and MG13 shown in fig. 18A, 18B, and 18C. For the map data MG11 and MG13 (fig. 18A and 18C), in the case where pixel values exist above and below a position where no pixel value exists, the interpolation processing section 23 performs the interpolation processing A2 based on these two pixel values; in the case where pixel values exist to the left and right of a position where no pixel value exists, it performs the interpolation processing A2 based on these two pixel values; and in the case where pixel values exist at the upper left, lower left, upper right, and lower right of a position where no pixel value exists, it performs the interpolation processing A2 based on these four pixel values. The map data MG21 and MG23 are thereby generated. Further, the interpolation processing section 23 generates the map data MG22 by performing the interpolation processing A2 on the map data MG12 (fig. 18B) based on the pixel values above, below, to the left, and to the right of each position where no pixel value exists.
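The neighbour-selection rules just described can be summarised in one routine. The sketch below is my own reading of those rules, not the disclosed circuit: a missing position is filled from all four up/down/left/right neighbours when they all exist, otherwise from the above/below pair, otherwise from the left/right pair, and otherwise from the diagonal neighbours; NaN marks positions where no pixel value exists.

```python
import numpy as np

def fill_bayer_phase(mg: np.ndarray) -> np.ndarray:
    """Interpolation A2 for one Bayer-phase map (MG11, MG12, or MG13)."""
    p = np.pad(mg, 1, constant_values=np.nan)
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    diag = np.stack([p[:-2, :-2], p[:-2, 2:], p[2:, :-2], p[2:, 2:]])

    vert_ok = ~(np.isnan(up) | np.isnan(down))
    horiz_ok = ~(np.isnan(left) | np.isnan(right))
    cross_ok = vert_ok & horiz_ok

    counts = (~np.isnan(diag)).sum(axis=0)
    est = np.divide(np.nansum(diag, axis=0), counts,
                    out=np.zeros_like(mg), where=counts > 0)       # diagonal fallback
    est = np.where(horiz_ok, (left + right) / 2, est)              # left/right pair
    est = np.where(vert_ok, (up + down) / 2, est)                  # above/below pair
    est = np.where(cross_ok, (up + down + left + right) / 4, est)  # all four neighbours
    return np.where(np.isnan(mg), est, mg)
```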

Fig. 20 shows an example of the map data MG3 generated by performing the synthesis processing A3 based on the map data MG21, MG22, and MG23 shown in fig. 19A, 19B, and 19C. The synthesis processing section 24 sums up the pixel values at positions corresponding to each other in the three map data MG21, MG22, and MG23 to generate the pixel value at the position corresponding to those positions in the map data MG3. In fig. 20, "3G" indicates that the pixel value becomes three times the pixel value in the image map data MPG as a result of the synthesis processing A3.

[Modification 1-3]

In the above-described embodiment, the interpolation processing section 23 executes the interpolation processing a2, but the interpolation method in the interpolation processing a2 can be changed. This modification will be described in detail below.

Fig. 21 shows a configuration example of the image pickup apparatus 2 according to the present modification. The image pickup apparatus 2 includes an image processing section 30. The image processing section 30 includes an interpolation controller 36 and an interpolation processing section 33.

The interpolation controller 36 performs interpolation control processing B1 based on the image map data MPG, MPB, and MPR included in the image signal DT to determine an interpolation method in the interpolation processing a2 of the interpolation processing section 33. The interpolation controller 36 corresponds to a specific example of "interpolation controller" in the present disclosure.

The interpolation processing section 33 performs interpolation processing a2 on the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image segmentation processing section 22, respectively, using an interpolation method instructed by the interpolation controller 36 to generate six map data MG21, MG22, MB21, MB22, MR21, and MR 22.

Fig. 22 schematically shows examples of the image segmentation processing A1, the interpolation control processing B1, the interpolation processing A2, and the synthesis processing A3 of the image processing section 30.

The interpolation controller 36 first performs a synthesis process B2 based on the image map data MPG, MPB, and MPR included in the image signal DT to generate map data MW. In this synthesis process B2, the interpolation controller 36 sums up the pixel values at positions corresponding to each other in the three image map data MPG, MPB, and MPR, thereby generating the pixel value at the position corresponding to those positions in the map data MW.

Next, the interpolation controller 36 executes spatial frequency detection processing B3 based on the map data MW to detect the spatial frequency. In this spatial frequency detection process B3, the interpolation controller 36 divides one frame image into a plurality of image regions, and determines the spatial frequency in each image region based on the map data MW.

Subsequently, the interpolation controller 36 performs interpolation method determination processing B4 based on the spatial frequency determined by the spatial frequency detection processing B3 to determine the interpolation method in the interpolation processing A2.

Fig. 23A, 23B, and 23C show examples of interpolation methods in the interpolation processing A2. These figures show the map data MG21 generated by performing the interpolation processing A2. In the interpolation method shown in fig. 23A, the pixel value at a position where no pixel value exists is determined based on the pixel values in the row above and the row below that position. That is, in the example of fig. 23A, the direction in which the interpolation processing is performed (interpolation direction) is the vertical direction (longitudinal direction). In the interpolation method shown in fig. 23B, the pixel value at a position where no pixel value exists is determined based on the pixel values in the column to the left and the column to the right of that position. That is, in the example shown in fig. 23B, the direction in which the interpolation processing is performed (interpolation direction) is the horizontal direction (lateral direction). Further, in the interpolation method shown in fig. 23C, the pixel value at a position where no pixel value exists is determined based on the pixel values in the row above, the row below, the column to the left, and the column to the right of that position. That is, in the example shown in fig. 23C, the directions in which the interpolation processing is performed (interpolation directions) are the vertical direction and the horizontal direction. It should be noted that fig. 23A to 23C illustrate three examples, but the interpolation method is not limited to these.

In the interpolation method determination process B4, the interpolation controller 36 determines the interpolation method in the interpolation process a2 based on the spatial frequency in each image region. Specifically, in the case where the interpolation controller 36 determines that the image in a specific image area is a vertical stripe pattern based on the spatial frequency in the specific image area, the interpolation controller 36 selects an interpolation method in which the interpolation direction is the vertical direction (fig. 23A). Further, for example, in a case where the interpolation controller 36 determines that the image in a specific image region is a horizontal stripe pattern based on the spatial frequency in the specific image region, the interpolation controller 36 selects an interpolation method in which the interpolation direction is the horizontal direction (fig. 23B). Subsequently, the interpolation controller 36 supplies an instruction on the interpolation method for each image area to the interpolation processing section 33.
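A possible reading of the interpolation control flow (B2, B3, and then B4) is sketched below; the region size, the gradient-based measure used as a stand-in for the spatial frequency detection, and the decision thresholds are all my own illustrative assumptions, and the image size is assumed to be a multiple of the region size.

```python
import numpy as np

def choose_interpolation_directions(mpg, mpb, mpr, region=16):
    """B2: build MW; B3: estimate directionality per region; B4: pick a method."""
    mw = mpg + mpb + mpr                       # synthesis process B2
    h, w = mw.shape
    directions = {}
    for y in range(0, h, region):
        for x in range(0, w, region):
            tile = mw[y:y + region, x:x + region]
            horiz = np.abs(np.diff(tile, axis=1)).mean()  # variation across columns
            vert = np.abs(np.diff(tile, axis=0)).mean()   # variation across rows
            if horiz > 2.0 * vert:
                directions[(y, x)] = "vertical"     # vertical stripes -> fig. 23A
            elif vert > 2.0 * horiz:
                directions[(y, x)] = "horizontal"   # horizontal stripes -> fig. 23B
            else:
                directions[(y, x)] = "both"         # no clear direction -> fig. 23C
    return directions
```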

The interpolation processing section 33 performs the interpolation processing A2 on the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image segmentation processing section 22, respectively, using the interpolation method indicated for each image region by the interpolation controller 36, to generate the six map data MG21, MG22, MB21, MB22, MR21, and MR22. The interpolation methods for generating the six map data MG21, MG22, MB21, MB22, MR21, and MR22 are the same as one another.

As described above, in the image pickup apparatus 2, the interpolation method in the interpolation processing A2 can be changed, so that an interpolation method suited to the captured image can be used. This can improve the image quality of the captured image.

Specifically, in the image capturing apparatus 2, the map data MW is generated by performing the synthesis process B2 based on the image map data MPG, MPB, and MPR, and the spatial frequency is detected based on the map data MW. Thus, the spatial frequency can be detected with high accuracy in the image pickup apparatus 2, and the interpolation process a2 is executed based on the spatial frequency thus obtained, so that the accuracy of the interpolation process a2 can be improved. Therefore, the image pickup apparatus 2 can achieve a higher restoration effect, so that the image quality of the captured image can be improved.

It should be noted that in this example, the map data MW is generated by performing the synthesis process B2 based on the image map data MPG, MPB, and MPR, and the spatial frequency is detected based on the map data MW, but this is not limitative. Alternatively, for example, as in the image processing section 30A shown in fig. 24, the spatial frequency detection process B3 may be performed based on the image map data MPG relating to green (G), and the interpolation method for generating the green (G) map data MG21 and MG22 in the interpolation process a2 may be determined based on the spatial frequency in each image region obtained by this spatial frequency detection process B3. The same applies to the image map data MPB and MPR. Furthermore, this is not limiting. For example, the spatial frequency detection process B3 may be performed based on the image map data MPG relating to green (G), the spatial frequency detection process B3 may be performed based on the image map data MPB relating to blue (B), the spatial frequency detection process B3 may be performed based on the image map data MPR relating to red (R), and the interpolation method for generating the six map data MG21, MG22, MB21, MB22, MR21, and MR22 in the interpolation process a2 may be determined collectively for each image region based on the spatial frequency in each image region obtained by the spatial frequency detection process B3. In this case, interpolation methods for generating six map data MG21, MG22, MB21, MB22, MR21, and MR22 may be the same as one another.

[Modification 1-4]

In the above-described embodiment, the image processing section 20 performs the image segmentation process a1, the interpolation process a2, and the synthesis process A3 based on the image map data MPR relating to red (R), the image map data MPG relating to green (G), and the image map data MPB relating to blue (B), but this is not limitative. Alternatively, for example, the image segmentation process a1, the interpolation process a2, and the synthesis process A3 may be performed based on a luminance signal (luminance signal). This modification will be described in detail below.

Fig. 25 illustrates a configuration example of an image pickup apparatus 1D according to the present modification. The image pickup apparatus 1D includes an image processing section 20D. The image processing section 20D includes a Y/C separation section 29D, an image division processing section 22D, an interpolation processing section 23D, a synthesis processing section 24D, and a signal processing section 25D.

The Y/C separation section 29D separates the RGB signals included in the image signal DT into a luminance (Y) signal and a color (C) signal by performing Y/C separation processing C1, and outputs the luminance signal and the color signal as the image signal DT 11. The image signal DT11 includes map data MY, MCr, and MCb. The map data MY includes pixel values of a frame of an image relating to luminance (Y), the map data MCr includes pixel values of a frame of an image relating to R-Y color difference (Cr), and the map data MCb includes pixel values of a frame of an image relating to B-Y color difference (Cb). Each pixel value is represented by a digital code having a plurality of bits. The Y/C separating section 29D corresponds to a specific example of the "generator" in the present disclosure.

The image division processing section 22D performs the image division processing A1 based on the map data MY included in the image signal DT11 supplied from the Y/C separation section 29D via the switching section 21 to generate two map data MY11 and MY12. Further, the image division processing section 22D outputs the map data MCr and MCb included in the image signal DT11 as they are.

The interpolation processing section 23D performs the interpolation processing A2 on the two map data MY11 and MY12 supplied from the image division processing section 22D, respectively, to generate two map data MY21 and MY22. Further, the interpolation processing section 23D outputs the map data MCr and MCb supplied from the image division processing section 22D as they are.

The synthesis processing section 24D performs synthesis processing A3 based on the two map data MY21 and MY22 supplied from the interpolation processing section 23D to generate one map data MY 3. Subsequently, the synthesis processing section 24D supplies the map data MY3 generated by the synthesis processing a3 and the map data MCr and MCb supplied from the interpolation processing section 23D to the signal processing section 25D as the image signal DT 12.

The signal processing section 25D performs predetermined signal processing based on the image signal DT12 supplied from the synthesis processing section 24D via the switching section 21 or the image signal DT11 supplied from the Y/C separation section 29D. Subsequently, the signal processing section 25D outputs the processing result of the predetermined signal processing as the image signal DT13.

Fig. 26 schematically shows examples of the image segmentation processing a1, interpolation processing a2, and synthesis processing A3 of the image processing section 20D.

The Y/C separating section 29D performs a Y/C separation process C1 to separate the RGB signals included in the image signal DT into a luminance (Y) signal and a color (C) signal. Specifically, the Y/C separating section 29D generates map data MY, MCb, and MCr based on the image map data MPG, MPB, and MPR. The Y/C separating section 29D generates a pixel value relating to luminance (Y) based on pixel values at positions corresponding to each other in the three image map data MPG, MPB, and MPR using the following expression, for example.

VY=VG×0.59+VB×0.11+VR×0.3

In this expression, "VY" is a pixel value relating to luminance (Y), "VG" is a pixel value relating to green (G), "VB" is a pixel value relating to blue (B), and "VR" is a pixel value relating to red (R).

The image division processing section 22D performs the image division processing A1 based on the map data MY generated in this manner to generate two map data MY11 and MY12. The interpolation processing section 23D performs the interpolation processing A2 on the two map data MY11 and MY12, respectively, to generate two map data MY21 and MY22. The synthesis processing section 24D performs the synthesis processing A3 based on the two map data MY21 and MY22 to generate one map data MY3. Specifically, the synthesis processing section 24D sums up and halves the pixel values at positions corresponding to each other in the two map data MY21 and MY22 to generate the pixel value at the position corresponding to those positions in the map data MY3. The map data MY corresponds to a specific example of "first image map data" in the present disclosure. The map data MY11 and MY12 correspond to a specific example of "a plurality of first map data" in the present disclosure. The map data MY21 and MY22 correspond to a specific example of "a plurality of second map data" in the present disclosure. The map data MY3 corresponds to a specific example of "third map data" in the present disclosure.

As described above, in the image pickup apparatus 1D, the signal-to-noise ratio (S/N ratio) of the luminance signal can be increased, so that the image quality of the captured image can be improved. Further, in this example, the image segmentation process a1, the interpolation process a2, and the synthesis process A3 are performed only on the map data MY relating to the luminance (Y), so that the amount of processing can be reduced. For example, power consumption in the image pickup apparatus 1D can thus be reduced.

[ Another modification ]

Furthermore, two or more of these variations may be combined.

<2. Second embodiment>

Next, an image pickup apparatus 3 according to a second embodiment is explained. The present embodiment differs from the first embodiment in the configuration of the blue photoelectric converter and the red photoelectric converter in the image pickup section. It should be noted that substantially the same components as those of the image pickup apparatus 1 according to the above-described first embodiment are denoted by the same reference numerals, and description thereof is appropriately omitted.

Fig. 27 shows a configuration example of the image pickup apparatus 3 according to the present embodiment. The image pickup apparatus 3 includes an image pickup section 40 and an image processing section 50.

Fig. 28 schematically shows a cross-sectional configuration of the image pickup pixel P in the image pickup section 40. The semiconductor substrate 100 includes one photodiode PD formed in a pixel region corresponding to one image pickup pixel P. Unlike the photodiodes PDB and PDR according to the first embodiment described above, the photodiode PD is configured to receive light of various wavelengths. An insulating film 101 is formed on the surface of the semiconductor substrate 100 on the imaging surface S side, and a color filter 111 is formed on the insulating film 101. Specifically, a color filter 111B or a color filter 111R is selectively formed on the insulating film 101. The color filter 111B allows blue (B) light to pass therethrough and blocks red (R) light and green (G) light. The color filter 111R allows red (R) light to pass therethrough and blocks blue (B) light and green (G) light. The color filter 111B and the photodiode PD are included in a photoelectric converter for receiving blue (B) light, and the color filter 111R and the photodiode PD are included in a photoelectric converter for receiving red (R) light. An insulating film 112 is formed on the color filter 111; the insulating film 112 is made of, for example, silicon dioxide (SiO2). Subsequently, the transparent electrode 102, the photoelectric conversion film 103G, the transparent electrode 104, and the on-chip lens 105 are formed in this order on the insulating film 112.

Fig. 29 schematically shows the positions of photoelectric converters in the region X where four image pickup pixels P are arranged. Thus, in the image pickup section 40, the photoelectric converter relating to green (G) and the photoelectric converter relating to blue (B) or red (R) are arranged in the upper layer and the lower layer, respectively, in the pixel region corresponding to one image pickup pixel P. The photoelectric converters associated with blue (B) and red (R) are arranged in a checkerboard pattern. That is, in the image pickup section 40, the color filters 111B and 111R are arranged in a checkerboard pattern. Therefore, each image pickup pixel can generate a pixel signal relating to green and a pixel signal relating to blue or red in the image pickup section 40.

With this configuration, the image pickup section 40 generates the image signal DT21 and the gain signal SGAIN. The image signal DT21 includes two image map data MPG and MPBR. The image map data MPG includes pixel values of one frame of image relating to green (G), and the image map data MPBR includes pixel values of one frame of image relating to blue (B) and red (R). In the image map data MPBR, pixel values related to blue (B) and pixel values related to red (R) are arranged in a checkerboard pattern corresponding to the arrangement of the color filters 111B and 111R.

The image processing section 50 (fig. 27) includes an image division processing section 52, an interpolation controller 56, an interpolation processing section 53, a synthesis processing section 54, and a signal processing section 55.

The image segmentation processing section 52 performs image segmentation processing a1 based on image map data MPG and MPBR included in the image signal DT21 supplied from the image pickup section 40 via the switching section 21 to generate four map data MG11, MG12, MR11, and MB 12.

The interpolation controller 56 performs interpolation control processing B1 based on the image map data MPG included in the image signal DT21 to determine an interpolation method in the interpolation processing a2 of the interpolation processing section 53.

The interpolation processing section 53 performs interpolation processing a2 on the four map data MG11, MG12, MR11, and MB12 supplied from the image segmentation processing section 52 using the interpolation method instructed by the interpolation controller 56, respectively, to generate four map data MG21, MG22, MR21, and MB 22.

The synthesis processing section 54 performs the synthesis processing A3 based on the two map data MG21 and MG22 supplied from the interpolation processing section 53 to generate one map data MG3. Subsequently, the synthesis processing section 54 supplies the map data MG3 generated by the synthesis processing A3 and the map data MR21 and MB22 supplied from the interpolation processing section 53 to the signal processing section 55 as the image signal DT22.

The signal processing section 55 executes predetermined signal processing based on the image signal DT22 supplied from the synthesis processing section 54 via the switching section 21 or the image signal DT21 supplied from the image pickup section 40. Subsequently, the signal processing section 55 outputs the processing result of the predetermined signal processing as the image signal DT 23.

Fig. 30 schematically shows examples of the image segmentation processing a1, the interpolation control processing B1, the interpolation processing a2, and the synthesis processing A3 in the image processing section 50.

The image segmentation processing section 52 performs the image segmentation processing A1 based on the image map data MPG to generate two map data MG11 and MG12. Further, the image segmentation processing section 52 performs the image segmentation processing A1 based on the image map data MPBR to generate two map data MR11 and MB12. In the map data MR11, as shown in fig. 30, pixel values relating to red (R) exist in the upper left and lower right of the region X, and no pixel values exist in the lower left and upper right of the region X. Further, in the map data MB12, pixel values relating to blue (B) exist in the lower left and upper right of the region X, and no pixel values exist in the upper left and lower right of the region X. That is, in this example, the pixel value arrangement pattern in the image segmentation processing A1 is a checkerboard pattern corresponding to the checkerboard arrangement of the color filters 111B and 111R in the image pickup section 40. Accordingly, the pixel values relating to red (R) included in the image map data MPBR are included only in the map data MR11, and the pixel values relating to blue (B) included in the image map data MPBR are included only in the map data MB12. As shown in fig. 30, the map data MG11 and MR11 have the same arrangement pattern PAT, and the map data MG12 and MB12 have the same arrangement pattern PAT.
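A sketch of this division of the image map data MPBR by checkerboard phase (illustrative only; the phase convention, with red on the upper-left and lower-right positions of each region X, follows the description above):

```python
import numpy as np

def split_mpbr(mpbr: np.ndarray):
    """Image division A1 for MPBR: red values go to MR11, blue values to MB12."""
    h, w = mpbr.shape
    yy, xx = np.mgrid[0:h, 0:w]
    red_phase = (yy + xx) % 2 == 0            # assumed: upper-left/lower-right are red
    mr11 = np.where(red_phase, mpbr, np.nan)  # red (R) pixel values only
    mb12 = np.where(~red_phase, mpbr, np.nan) # blue (B) pixel values only
    return mr11, mb12
```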

The interpolation controller 56 performs a spatial frequency detection process B3 based on the image map data MPG relating to green (G) to detect a spatial frequency. Subsequently, the interpolation controller 56 executes interpolation method determination processing B4 to determine the interpolation method in the interpolation processing a2 for each image region based on the spatial frequency determined by the spatial frequency detection processing B3. Next, the interpolation controller 56 supplies an instruction on the interpolation method for each image area to the interpolation processing section 53.

The interpolation processing section 53 performs interpolation processing a2 on the four map data MG11, MG12, MR11, and MB12 supplied from the image segmentation processing section 52 using the interpolation method instructed for each image region by the interpolation controller 56, respectively, to generate four map data MG21, MG22, MR21, and MB 22. Interpolation methods for generating the four map data MG21, MG22, MR21, and MB22 are the same as each other.

The synthesis processing section 54 performs synthesis processing A3 based on the two map data MG21 and MG22 to generate one map data MG 3. Specifically, the synthesis processing section 54 sums up pixel values at positions corresponding to each other in the two map data MG21 and MG22 and halves the sum of the pixel values to generate pixel values at positions corresponding to the above-described positions in the map data MG 3.

The image map data MPG and MPBR correspond to specific examples of "first image map data" and "second image map data" in the present disclosure, respectively. The map data MR11 and MB12 correspond to a specific example of "a plurality of fourth map data" in the present disclosure. The map data MR21 and MB22 correspond to a specific example of "a plurality of fifth map data" in the present disclosure.

As described above, in the image pickup apparatus 3, for example, the image segmentation process a1 is performed based on the image map data MPG to generate map data MG11 and MG12, the interpolation process a2 is performed on the map data MG11 and MG12 to generate map data MG21 and MG22, respectively, and the synthesis process A3 is performed based on the map data MG21 and MG22 to generate map data MG 3. As in the first embodiment described above, it is thereby possible to increase the signal-to-noise ratio (S/N ratio) in the map data MG3 and improve the image quality of the captured image in the image capturing apparatus 3.

Further, in the image pickup apparatus 3, the pixel value arrangement pattern PAT in the image division processing a1 is a checkerboard pattern corresponding to the checkerboard pattern arrangement of the color filters 111B and 111R in the image pickup section 40. The pixel value arrangement patterns PAT in the map data MG11 and MR11 are identical to each other, and the pixel value arrangement patterns PAT in the map data MG12 and MB12 are identical to each other. Thus, the image division processing section 52 can perform the image division processing a1 based on the two atlas data MPG and MPBR by the same method, so that the circuit configuration of the image division processing section 52 can be simplified. Further, as in the case of the image pickup apparatus 1, it is possible to reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

Further, in the image pickup apparatus 3, the interpolation methods for generating the map data MG21, MG22, MR21, and MB22 in the interpolation processing a2 are the same as each other. Thus, the interpolation processing section 53 can generate the four map data MG21, MG22, MR21, and MB22 using the same interpolation method, so that the circuit configuration of the interpolation processing section 53 can be simplified. Further, as in the case of the image pickup apparatus 1, it is possible to reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

Further, in the image pickup apparatus 3, the spatial frequency is detected based on the image map data MPG, in which pixel values relating to green (G) are present over the entire surface, and the interpolation method in the interpolation processing A2 is determined based on the detected spatial frequency. Thereby, the image pickup apparatus 3 can detect the spatial frequency with high accuracy, so that the accuracy of the interpolation processing A2 can be improved. Therefore, the image pickup apparatus 3 can achieve a higher restoration effect, so that the image quality of the captured image can be improved.

Further, in the image pickup apparatus 3, as in the image pickup apparatus 1 according to the above-described first embodiment, whether or not the image division processing, the interpolation processing, and the synthesis processing are executed can be controlled. Accordingly, in the image capturing apparatus 3, for example, in the case of capturing a dark subject image, by performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the signal-to-noise ratio (S/N ratio) of the captured image can be increased. For example, in the case of capturing a bright object image, by not performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the resolution in the captured image can be improved. Therefore, the image pickup apparatus 3 can improve the image quality of the captured image.

As described above, in the present embodiment, the image segmentation processing, the interpolation processing, and the synthesis processing are performed, and the signal-to-noise ratio in the captured image can be improved. This can improve the image quality of the captured image.

In the present embodiment, the pixel value arrangement pattern in the image division processing is a checkered pattern corresponding to the checkered pattern arrangement of the color filters in the image pickup section, so that it is possible to simplify the circuit configuration of the image division processing section, and reduce the possibility of occurrence of false colors and improve the image quality of the picked-up image.

In the present embodiment, in the interpolation process, the interpolation methods for generating the four map data are the same as each other, so that it is possible to simplify the circuit configuration of the interpolation processing section, and to reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

In the present embodiment, the spatial frequency is detected based on the image map data MPG, in which the pixel values relating to green (G) are present over the entire surface, to determine the interpolation method in the interpolation processing, so that the spatial frequency can be detected with high accuracy and the image quality of the captured image can be improved.

In the present embodiment, whether or not the image segmentation processing, the interpolation processing, and the synthesis processing are executed can be controlled, so that the image quality of the captured image can be improved.

<3. Third embodiment>

Next, an image pickup apparatus 4 according to a third embodiment is explained. The present embodiment differs from the first embodiment described above in the arrangement density of the photoelectric converters configured to receive blue (B) light and red (R) light in the image pickup section. It should be noted that substantially the same components as those of the image pickup apparatus 1 according to the above-described first embodiment are denoted by the same reference numerals, and description thereof is omitted as appropriate.

Fig. 31 shows a configuration example of the image pickup apparatus 4 according to the present embodiment. The image pickup apparatus 4 includes an image pickup section 60 and an image processing section 70.

Fig. 32 schematically shows a cross-sectional structure of the image pickup pixel P in the image pickup section 60. Fig. 33 schematically shows the positions of the photoelectric converters in the region X where four image pickup pixels P are arranged. The semiconductor substrate 100 includes photodiodes PDR2 and PDB2 formed in the region X corresponding to the four image pickup pixels P. The photodiode PDR2 is a photoelectric converter configured to receive red (R) light, like the photodiode PDR, and the photodiode PDB2 is a photoelectric converter configured to receive blue (B) light, like the photodiode PDB. The photodiode PDR2 and the photodiode PDB2 are formed and stacked in the semiconductor substrate 100 in the region X corresponding to four image pickup pixels P in such a manner that the photodiode PDB2 is located on the image pickup surface S side. That is, in the image pickup section 10 according to the first embodiment, the photodiodes PDB and PDR are formed and stacked in the pixel region corresponding to one image pickup pixel P, whereas in the image pickup section 60 according to the present embodiment, the photodiodes PDB2 and PDR2 are formed and stacked in the region X corresponding to four image pickup pixels P. Accordingly, in the image pickup section 60, four photoelectric converters relating to green (G), one photoelectric converter relating to blue (B), and one photoelectric converter relating to red (R) are formed and stacked in the region X corresponding to the four image pickup pixels P. In other words, in the image pickup section 60, the arrangement density of the photoelectric converters relating to blue (B) is 1/4 of the arrangement density of the photoelectric converters relating to green (G), and the arrangement density of the photoelectric converters relating to red (R) is 1/4 of the arrangement density of the photoelectric converters relating to green (G). An insulating film 101 is formed on the semiconductor substrate 100, and the transparent electrode 102, the photoelectric conversion film 103G, the transparent electrode 104, and the on-chip lens 105 are formed in this order on the insulating film 101.

With this configuration, the image pickup section 60 generates the image signal DT31 and the gain signal SGAIN. The image signal DT31 includes three image map data MPG, MPB, and MPR. The image map data MPG includes pixel values of a frame of an image related to green (G). The image map data MPB includes pixel values of a frame of image relating to blue (B). The image map data MPR includes pixel values of one frame of image related to red (R). The number of pixel values in the image map data MPB is 1/4 of the number of pixel values in the image map data MPG, and the number of pixel values in the image map data MPR is 1/4 of the number of pixel values in the image map data MPG. Four pixel values in the image map data MPG are associated with one pixel value in the image map data MPB and also associated with one pixel value in the image map data MPR.

The image processing section 70 (fig. 31) includes an image division processing section 72, an interpolation processing section 73, a synthesis processing section 74, and a signal processing section 75.

The image segmentation processing section 72 performs the image segmentation processing A1 based on the image map data MPG, MPB, and MPR included in the image signal DT31 supplied from the image pickup section 60 via the switching section 21 to generate six map data MG11, MG12, MB11, MB12, MR11, and MR12.

The interpolation processing section 73 performs the interpolation processing A2 on the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image segmentation processing section 72, respectively, to generate six map data MG21, MG22, MB21, MB22, MR21, and MR22.

The synthesis processing section 74 performs the synthesis processing A3 based on the six map data MG21, MG22, MB21, MB22, MR21, and MR22 supplied from the interpolation processing section 73 to generate three map data MG3, MB3, and MR3. Subsequently, the synthesis processing section 74 supplies the map data MG3, MB3, and MR3 generated by the synthesis processing A3 to the signal processing section 75 as the image signal DT32.

The signal processing section 75 performs predetermined signal processing based on the image signal DT32 supplied from the synthesis processing section 74 via the switching section 21 or the image signal DT31 supplied from the image pickup section 60. Subsequently, the signal processing section 75 outputs the processing result of the predetermined signal processing as the image signal DT 33.

Fig. 34 schematically shows examples of the image segmentation processing a1, interpolation processing a2, and synthesis processing A3 in the image processing section 70.

The image segmentation processing section 72 performs the image segmentation processing A1 based on the image map data MPG, MPB, and MPR to generate six map data MG11, MG12, MB11, MB12, MR11, and MR12. As shown in fig. 34, the pixel value arrangement pattern PAT in the map data MG11 and MG12 is a checkerboard pattern in units of four pixel values (fig. 12A and 12B). In contrast, the pixel value arrangement pattern PAT in the map data MB11, MB12, MR11, and MR12 is a checkerboard pattern in units of one pixel value (fig. 8A and 8B). That is, the unit of the checkerboard pattern in each of the map data MG11 and MG12 is four times the unit of the checkerboard pattern in each of the map data MB11, MB12, MR11, and MR12, corresponding to the arrangement densities of the photoelectric converters relating to green (G), blue (B), and red (R) in the image pickup section 60.
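The relationship between the two unit sizes can be expressed with one parameterised division routine; this is my own generalisation for illustration, not the disclosed circuit, and the toy array sizes merely reflect that the blue and red image map data carry a quarter as many pixel values as the green one.

```python
import numpy as np

def split_checkerboard_blocks(mp: np.ndarray, block: int):
    """Checkerboard division A1 with a block x block unit."""
    h, w = mp.shape
    yy, xx = np.mgrid[0:h, 0:w]
    phase = ((yy // block) + (xx // block)) % 2
    first = np.where(phase == 0, mp, np.nan)
    second = np.where(phase == 1, mp, np.nan)
    return first, second

mpg = np.zeros((8, 8))                                  # green: full resolution
mpb = np.zeros((4, 4))                                  # blue: 1/4 as many pixel values
mg11, mg12 = split_checkerboard_blocks(mpg, block=2)    # units of 2 x 2 pixel values
mb11, mb12 = split_checkerboard_blocks(mpb, block=1)    # units of one pixel value
```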

The interpolation processing section 73 performs the interpolation processing A2 on the six map data MG11, MG12, MB11, MB12, MR11, and MR12 supplied from the image segmentation processing section 72, respectively, to generate six map data MG21, MG22, MB21, MB22, MR21, and MR22. In the case of performing the interpolation processing A2 on the map data MG11 and MG12, the interpolation processing section 73 may use the interpolation method shown in fig. 13A and 13B. The interpolation methods for generating the map data MB21, MB22, MR21, and MR22 are the same as each other. Further, the interpolation method for generating each of the map data MG21 and MG22 and the interpolation method for generating each of the map data MB21, MB22, MR21, and MR22 may be similar to each other; specifically, for example, the interpolation directions in the two interpolation methods may be the same as each other.

The synthesis processing section 74 performs synthesis processing a3 based on the six map data MG21, MG22, MB21, MB22, MR21, and MR22 to generate three map data MG3, MB3, and MR 3.

As described above, in the image pickup apparatus 4, for example, the image segmentation process a1 is performed based on the image map data MPG to generate map data MG11 and MG12, the interpolation process a2 is performed on the map data MG11 and MG12 to generate map data MG21 and MG22, respectively, and the synthesis process A3 is performed based on the map data MG21 and MG22 to generate map data MG 3. The same applies to MPB and MPR. As in the first embodiment described above, it is thereby possible to increase the signal-to-noise ratio (S/N ratio) of the map data MG3, MB3, and MR3 and improve the image quality of the captured image of the image capturing apparatus 4.

Further, in the image pickup apparatus 4, the unit of the checkerboard pattern in each of the map data MG11 and MG12 is four times the unit of the checkerboard pattern in each of the map data MB11, MB12, MR11, and MR12, corresponding to the arrangement density of the photoelectric converters related to green (G), blue (B), and red (R) in the image pickup section 60. Thus, the image segmentation processing section 72 can perform the image segmentation process a1 based on the three image map data MPG, MPB, and MPR by a similar method, so that the circuit configuration of the image segmentation processing section 72 can be simplified. Further, as in the case of the image pickup apparatus 1, it is possible to reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

In addition, in the image pickup apparatus 4, the interpolation methods for generating the map data MB21, MB22, MR21, and MR22 in the interpolation processing a2 are the same as each other. Thus, the interpolation processing section 73 can generate the four map data MB21, MB22, MR21, and MR22 using the same interpolation method, so that the circuit configuration of the interpolation processing section 73 can be simplified. Further, in the interpolation process a2, the interpolation method for generating each map data MG21 and MG22 is similar to the interpolation method for generating each map data MB21, MB22, MR21, and MR22, so that it is possible to reduce the possibility of occurrence of false color and improve the image quality of a captured image as in the case of the image capturing apparatus 1.

Further, in the image pickup apparatus 4, as with the image pickup apparatus 1 according to the first embodiment described above, whether or not to execute the image division processing, the interpolation processing, and the synthesis processing can be controlled. Accordingly, in the image capturing apparatus 4, for example, in the case of capturing a dark subject image, by performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the signal-to-noise ratio (S/N ratio) of the captured image can be increased. For example, in the case of capturing a bright object image, by not performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the resolution of the captured image can be improved. Therefore, the image pickup apparatus 4 can improve the image quality of the captured image.

As described above, in the present embodiment, by performing the image segmentation processing, the interpolation processing, and the synthesis processing, the signal-to-noise ratio of the captured image can be increased. This can improve the image quality of the captured image.

In the present embodiment, the unit in the checkerboard pattern in each of the map data MG11 and MG12 is four times the unit in the checkerboard pattern in each of the map data MB11, MB12, MR11, and MR12, corresponding to the arrangement density of the photoelectric converters related to green, blue, and red in the image pickup section, so that it is possible to simplify the circuit configuration of the image division processing section, and reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

In the present embodiment, in the interpolation process, the interpolation methods for generating the map data MB21, MB22, MR21, and MR22 are the same as each other, so that it is possible to simplify the circuit configuration of the interpolation processing section and reduce the possibility of occurrence of false colors.

In the present embodiment, in the interpolation process, the interpolation method for generating each of the map data MG21 and MG22 is similar to the interpolation method for generating each of the map data MB21, MB22, MR21, and MR22, so that it is possible to reduce the possibility of occurrence of false colors and improve the image quality of a captured image.

In the present embodiment, whether or not the image segmentation processing, the interpolation processing, and the synthesis processing are executed can be controlled, so that the image quality of the captured image can be improved.

[Modification 3-1]

In the above-described embodiment, the image processing section 70 performs the image segmentation process a1, the interpolation process a2, and the synthesis process A3 based on the image map data MPR relating to red (R), the image map data MPG relating to green (G), and the map data MPB relating to blue (B). Alternatively, as in the case of the image capturing apparatus 1D according to the modification of the first embodiment (fig. 25), the image division processing a1, the interpolation processing a2, and the synthesis processing A3 may be performed based on the luminance signal. This modification will be described in detail below.

Fig. 35 illustrates a configuration example of the image pickup apparatus 4A according to the present modification. The image pickup apparatus 4A includes an image processing section 70A. The image processing section 70A includes a Y/C separating section 79A, an image division processing section 72A, an interpolation processing section 73A, a synthesis processing section 74A, and a signal processing section 75A.

The Y/C separating section 79A separates the RGB signals included in the image signal DT31 into a luminance (Y) signal and a color (C) signal by performing Y/C separation processing C1, and outputs the luminance signal and the color signal as the image signal DT41. The image signal DT41 includes map data MY, MCr, and MCb. The map data MY includes pixel values of a frame of an image relating to luminance (Y), the map data MCr includes pixel values of a frame of an image relating to R-Y color difference (Cr), and the map data MCb includes pixel values of a frame of an image relating to B-Y color difference (Cb). The number of pixel values in the map data MCr is 1/4 of the number of pixel values in the map data MY. Similarly, the number of pixel values in the map data MCb is 1/4 of the number of pixel values in the map data MY.

The image division processing section 72A performs image division processing a1 based on the map data MY included in the image signal DT41 supplied from the Y/C separating section 79A via the switching section 21 to generate two map data MY11 and MY12. Further, the image division processing section 72A outputs the map data MCr and MCb included in the image signal DT41 as they are.

The interpolation processing section 73A performs interpolation processing a2 on the two map data MY11 and MY12 supplied from the image division processing section 72A, respectively, to generate two map data MY21 and MY22. Further, the interpolation processing section 73A outputs the map data MCr and MCb supplied from the image division processing section 72A as they are.

The synthesis processing section 74A performs synthesis processing A3 based on the two map data MY21 and MY22 supplied from the interpolation processing section 73A to generate one map data MY3. Subsequently, the synthesis processing section 74A supplies the map data MY3 generated by the synthesis processing A3 and the map data MCr and MCb supplied from the interpolation processing section 73A to the signal processing section 75A as the image signal DT42.

The signal processing section 75A performs predetermined signal processing based on the image signal DT42 supplied from the synthesis processing section 74A via the switching section 21 or the image signal DT41 supplied from the Y/C separating section 79A. Subsequently, the signal processing section 75A outputs the processing result of the predetermined signal processing as the image signal DT43.

Fig. 36 schematically shows an example of the image segmentation processing a1, interpolation processing a2, and synthesis processing A3 in the image processing section 70A.

The Y/C separating section 79A performs Y/C separation to separate the RGB signals included in the image signal DT31 into a luminance (Y) signal and a color (C) signal. Specifically, the Y/C separating section 79A generates map data MY, MCb, and MCr based on the image map data MPG, MPB, and MPR. For example, the Y/C separating section 79A generates a pixel value relating to luminance (Y) based on pixel values at positions corresponding to each other in the three image map data MPG, MPB, and MPR using the following expression.

VY1=VG1×0.59+VB/4×0.11+VR/4×0.3

VY2=VG2×0.59+VB/4×0.11+VR/4×0.3

VY3=VG3×0.59+VB/4×0.11+VR/4×0.3

VY4=VG4×0.59+VB/4×0.11+VR/4×0.3

In these expressions, "VY1" to "VY4" are each pixel values relating to luminance (Y), "VG1" to "VG4" are each pixel values relating to green (G), "VB" is a pixel value relating to blue (B), and "VR" is a pixel value relating to red (R). "VY1" and "VG1" both represent the upper left pixel value of region X. "VY2" and "VG2" both represent the upper right pixel value of region X. "VY3" and "VG3" both represent the lower left pixel value of region X. "VY4" and "VG4" both represent the lower right pixel value of region X.
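
As a concrete illustration of the expressions above, the following Python sketch computes the four luminance values of one region X from the four green pixel values and the shared blue and red pixel values; the function name is introduced here only for explanation.

def luminance_for_region_x(vg1, vg2, vg3, vg4, vb, vr):
    # VB and VR are shared by the four positions of region X, hence VB/4 and VR/4.
    chroma_part = (vb / 4) * 0.11 + (vr / 4) * 0.3
    vy = [vg * 0.59 + chroma_part for vg in (vg1, vg2, vg3, vg4)]
    return vy  # [VY1, VY2, VY3, VY4]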

The image segmentation processing section 72A performs image segmentation processing a1 based on the thus generated map data MY to generate two map data MY11 and MY12. The interpolation processing section 73A performs interpolation processing a2 on the two map data MY11 and MY12, respectively, to generate two map data MY21 and MY22. The synthesis processing section 74A performs synthesis processing A3 based on the two map data MY21 and MY22 to generate one map data MY3.
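
The luminance path described above can be sketched as follows. This is a rough illustration only: the checkerboard division reproduces the arrangement patterns of the map data MY11 and MY12, but the neighbor-averaging interpolation and the averaging synthesis are stand-ins for the methods of the embodiment, whose interpolation direction may be chosen differently.

import numpy as np

def divide_checkerboard(my):
    # Image division a1 (sketch): split MY into MY11 and MY12 with complementary
    # checkerboard arrangement patterns; missing positions become NaN.
    my = my.astype(float)
    rows, cols = np.indices(my.shape)
    even = (rows + cols) % 2 == 0
    return np.where(even, my, np.nan), np.where(~even, my, np.nan)

def interpolate(m):
    # Interpolation a2 (sketch): fill each missing position with the mean of its
    # existing up/down/left/right neighbors.
    out = m.copy()
    h, w = m.shape
    for i in range(h):
        for j in range(w):
            if np.isnan(m[i, j]):
                nbrs = [m[x, y]
                        for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= x < h and 0 <= y < w and not np.isnan(m[x, y])]
                out[i, j] = sum(nbrs) / len(nbrs) if nbrs else 0.0
    return out

def synthesize(my21, my22):
    # Synthesis A3 (sketch): combine pixel values at corresponding positions,
    # here by simple averaging.
    return (my21 + my22) / 2.0

# Data flow of the luminance path: MY -> (MY11, MY12) -> (MY21, MY22) -> MY3
# my3 = synthesize(*map(interpolate, divide_checkerboard(my)))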

As described above, in the image pickup apparatus 4A, the signal-to-noise ratio (S/N ratio) of the luminance signal can be increased, so that the image quality of the captured image can be improved. Further, in this example, the image segmentation process a1, the interpolation process a2, and the synthesis process A3 are performed only on the map data MY relating to the luminance (Y), so that the amount of processing can be reduced. For example, the power consumption of the image pickup apparatus 4A can thus be reduced.

[ modification 3-2]

Each of the modifications of the first embodiment can be applied to the image pickup apparatus 4 according to the above-described embodiment. Specifically, for example, as with the image pickup apparatus 2 (fig. 21) according to the modification of the above-described first embodiment, the interpolation method in the interpolation process a2 of the interpolation processing section 73 may be controlled by executing the interpolation control process B1 based on the image map data MPG, MPB, and MPR included in the image signal DT31.

<4. fourth embodiment >

Next, an image pickup apparatus 5 according to a fourth embodiment is explained. In the present embodiment, the image pickup section includes a photoelectric converter configured to receive Infrared Rays (IR) in addition to the photoelectric converters configured to receive green (G) light, blue (B) light, and red (R) light. It should be noted that substantially the same components as those of the image pickup apparatus 1 according to the above-described first embodiment are denoted by the same reference numerals, and description thereof is appropriately omitted.

Fig. 37 shows a configuration example of the image pickup apparatus 5 according to the present embodiment. The image pickup apparatus 5 includes an image pickup section 80 and an image processing section 90.

Fig. 38 schematically shows a cross-sectional structure of an imaging pixel P in the imaging section 80. Fig. 39 schematically shows the positions of photoelectric converters in the region X where four image pickup pixels P are arranged. The semiconductor substrate 100 includes a photodiode PD formed in a pixel region corresponding to one image pickup pixel P. The photodiode PD is configured to receive light of various wavelengths corresponding to visible light. An insulating film 101 is formed on the surface of the semiconductor substrate 100 on the imaging plane S side, and a color filter 111 is formed on the insulating film 101. Specifically, in this example, a red (R) color filter 111R, a green (G) color filter 111G, and a blue (B) color filter 111B are formed on the insulating film 101 at the upper left portion, the lower left and upper right portions, and the lower right portion of the region X corresponding to the four image pickup pixels P, respectively. The color filter 111R allows red (R) light to pass therethrough and blocks blue (B) light and green (G) light. The color filter 111G allows green (G) light to pass therethrough and blocks red (R) light and blue (B) light. The color filter 111B allows blue (B) light to pass therethrough and blocks red (R) light and green (G) light. The color filter 111R and the photodiode PD are included in a photoelectric converter for receiving red (R) light. The color filter 111G and the photodiode PD are included in a photoelectric converter for receiving green (G) light. The color filter 111B and the photodiode PD are included in a photoelectric converter for receiving blue (B) light. The color filters 111R, 111G, and 111B are arranged in a so-called bayer array.

An insulating film 112 is formed on the color filter 111. Subsequently, the transparent electrode 102, the photoelectric conversion film 103IR, and the transparent electrode 104 are formed in this order on the insulating film 112. The transparent electrodes 102 and 104 are electrodes that allow red light, green light, blue light, and infrared rays to pass through. The photoelectric conversion film 103IR is a photoelectric conversion film configured to receive infrared rays (IR) and allow red light, green light, and blue light to pass therethrough. The photoelectric conversion film 103IR and the transparent electrodes 102 and 104 are included in a photoelectric converter for receiving Infrared Rays (IR). An on-chip lens 105 is formed on the transparent electrode 104.

As described above, in the image pickup section 80, as shown in fig. 39, the photoelectric converter related to Infrared Rays (IR) and the photoelectric converter related to red (R), green (G), or blue (B) are arranged in the upper layer and the lower layer, respectively, in the pixel region corresponding to one image pickup pixel P. The photoelectric converters related to red (R), green (G), or blue (B) are arranged in a bayer array. Thereby, each image pickup pixel P in the image pickup section 80 can generate a pixel signal relating to infrared rays and a pixel signal relating to red, green, or blue.

With this configuration, the image pickup section 80 generates the image signal DT51 and the gain signal SGAIN. The image signal DT51 includes two image map data MPIR and MPRGB. The image map data MPIR includes pixel values of one frame of an image related to Infrared Rays (IR), and the image map data MPRGB includes pixel values of one frame of an image related to red (R), green (G), and blue (B).

The image processing section 90 (fig. 37) includes an image segmentation processing section 92, an interpolation processing section 93, a synthesis processing section 94, and a signal processing section 95.

The image segmentation processing section 92 performs image segmentation processing a1 to generate three map data MIR12, MIR11, and MIR13 based on the image map data MPIR included in the image signal DT51 supplied from the image pickup section 80 via the switching section 21, and performs image segmentation processing a1 to generate three map data MG12, MR11, and MB13 based on the image map data MPRGB included in the image signal DT51.

The interpolation processing section 93 performs interpolation processing a2 on the six map data MIR12, MIR11, MIR13, MG12, MR11, and MB13 supplied from the image segmentation processing section 92, respectively, to generate six map data MIR22, MIR21, MIR23, MG22, MR21, and MB23.

The synthesis processing section 94 performs synthesis processing A3 based on the three map data MIR22, MIR21, and MIR23 supplied from the interpolation processing section 93 to generate map data MIR3. Next, the synthesis processing section 94 supplies the map data MIR3 generated by the synthesis processing A3 and the map data MG22, MR21, and MB23 supplied from the interpolation processing section 93 to the signal processing section 95 as the image signal DT52.

The signal processing section 95 performs predetermined signal processing based on the image signal DT52 supplied from the synthesis processing section 94 via the switching section 21 or the image signal DT51 supplied from the image pickup section 80. Subsequently, the signal processing section 95 outputs the processing result of the predetermined signal processing as the image signal DT53.

Fig. 40 schematically shows the image segmentation processing a1, interpolation processing a2, and synthesis processing A3 in the image processing section 90.

The image segmentation processing section 92 performs an image segmentation process a1 to generate three map data MIR12, MIR11, and MIR13 based on the image map data MPIR, and performs an image segmentation process a1 to generate three map data MG12, MR11, and MB13 based on the image map data MPRGB. As shown in fig. 40, the pixel value arrangement pattern PAT in the map data MIR12, MIR11, and MIR13 is a pattern corresponding to a bayer array (fig. 18A to 18C). The same applies to the pixel value arrangement pattern PAT in the map data MG12, MR11, and MB13. That is, in this example, the pixel value arrangement pattern PAT in the image division processing a1 is a pattern corresponding to a bayer array representing the arrangement of the color filters 111R, 111G, and 111B in the image pickup section 80. Accordingly, the red (R) pixel value included in the image map data MPRGB is included only in the map data MR11, the green (G) pixel value included in the image map data MPRGB is included only in the map data MG12, and the blue (B) pixel value included in the image map data MPRGB is included only in the map data MB13. As shown in fig. 40, the map data MIR12 and MG12 have the same arrangement pattern PAT, the map data MIR11 and MR11 have the same arrangement pattern PAT, and the map data MIR13 and MB13 have the same arrangement pattern PAT.
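
A minimal sketch of this division is given below. The Bayer phase assumed here (red at even-row/even-column positions, blue at odd-row/odd-column positions, green elsewhere) is an assumption made only for illustration; the actual phase follows the arrangement of the color filters 111R, 111G, and 111B in the image pickup section 80.

import numpy as np

def bayer_masks(shape):
    # Boolean masks of the assumed Bayer phase.
    rows, cols = np.indices(shape)
    r = (rows % 2 == 0) & (cols % 2 == 0)
    b = (rows % 2 == 1) & (cols % 2 == 1)
    g = ~(r | b)
    return r, g, b

def divide_bayer(mp):
    # Image segmentation a1 (sketch) for MPRGB or MPIR: three map data whose
    # arrangement patterns PAT correspond to a bayer array; missing positions are NaN.
    mp = mp.astype(float)
    r, g, b = bayer_masks(mp.shape)
    return (np.where(g, mp, np.nan),   # e.g. MG12 (or MIR12)
            np.where(r, mp, np.nan),   # e.g. MR11 (or MIR11)
            np.where(b, mp, np.nan))   # e.g. MB13 (or MIR13)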

The interpolation processing section 93 performs interpolation processing a2 on the six map data MIR12, MIR11, MIR13, MG12, MR11, and MB13 supplied from the image segmentation processing section 92, respectively, to generate six map data MIR22, MIR21, MIR23, MG22, MR21, and MB23. For example, in the case of performing the interpolation processing a2 on the map data MIR12, MIR11, and MIR13, the interpolation processing section 93 may use the interpolation method shown in fig. 19A to 19C. The same applies to the interpolation processing a2 for the map data MG12, MR11, and MB13. The interpolation methods for generating the respective map data MIR22 and MG22 are identical to each other. The interpolation methods for generating the respective map data MIR21 and MR21 are identical to each other. The interpolation methods for generating the respective map data MIR23 and MB23 are identical to each other.
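
The constraint that each pair of map data uses an identical interpolation method can be expressed, as a sketch only, by routing both members of a pair through one and the same routine; the interpolate argument below is a placeholder for the method of figs. 19A to 19C, which is not reproduced here.

def interpolate_pairs(pairs, interpolate):
    # Apply one shared interpolation routine to each (IR, color) pair so that,
    # for example, MIR12/MG12, MIR11/MR11, and MIR13/MB13 are interpolated identically.
    return [(interpolate(m_ir), interpolate(m_color)) for m_ir, m_color in pairs]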

The synthesis processing section 94 performs synthesis processing A3 based on the three map data MIR22, MIR21, and MIR23 to generate map data MIR3.

The image map data MPIR and MPRGB correspond to specific examples of "first image map data" and "second image map data" in the present disclosure, respectively. The map data MIR12, MIR11, and MIR13 correspond to specific examples of the "plurality of first map data" in the present disclosure. The map data MIR22, MIR21, and MIR23 correspond to specific examples of the "plurality of second map data" in the present disclosure. The map data MIR3 corresponds to a specific example of "third map data" in the present disclosure. The map data MG12, MR11, and MB13 correspond to specific examples of "a plurality of fourth map data" in the present disclosure. The map data MG22, MR21, and MB23 correspond to specific examples of "a plurality of fifth map data" in the present disclosure.

As described above, in the image capturing apparatus 5, for example, the image segmentation process a1 is performed based on the image map data MPIR to generate map data MIR12, MIR11, and MIR13, the interpolation process a2 is performed on the map data MIR12, MIR11, and MIR13 to generate map data MIR22, MIR21, and MIR23, respectively, and the synthesis process A3 is performed based on the map data MIR22, MIR21, and MIR23 to generate map data MIR3. As in the first embodiment described above, it is possible to increase the signal-to-noise ratio (S/N ratio) of the map data MIR3 and improve the image quality of the captured image in the image capturing apparatus 5.

Further, in the image pickup apparatus 5, the pixel value arrangement pattern PAT in the image division processing a1 is a pattern corresponding to a bayer array to correspond to the arrangement of the color filters 111R, 111G, and 111B in the image pickup section 80. The pixel value arrangement patterns PAT in the map data MIR12 and MG12 are identical to each other. The pixel value arrangement patterns PAT in the atlas data MIR11 and MR11 are identical to each other. The pixel value arrangement patterns PAT in the map data MIR13 and MB13 are identical to each other. Thus, the image segmentation processing section 92 can perform the image segmentation process a1 based on the two image map data MPIR and MPRGB by the same method, so that the circuit configuration of the image segmentation processing section 92 can be simplified.

Further, in the image pickup apparatus 5, in the interpolation process a2, the interpolation methods for generating the map data MIR22 and MG22 are the same as each other, the interpolation methods for generating the map data MIR21 and MR21 are the same as each other, and the interpolation methods for generating the map data MIR23 and MB23 are the same as each other, so that the circuit configuration of the interpolation processing section 93 can be simplified.

Further, in the image pickup apparatus 5, as with the image pickup apparatus 1 according to the first embodiment described above, whether or not to execute the image division processing, the interpolation processing, and the synthesis processing can be controlled. Accordingly, in the image capturing apparatus 5, for example, in the case of capturing a dark subject image, by performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the signal-to-noise ratio (S/N ratio) of the captured image can be increased. For example, in the case of capturing a bright object image, by not performing the image segmentation process a1, the interpolation process a2, and the synthesis process A3, the resolution of the captured image can be improved. Therefore, the image pickup apparatus 5 can improve the image quality of the captured image.

As described above, in the present embodiment, the image segmentation processing, the interpolation processing, and the synthesis processing are performed, whereby the signal-to-noise ratio of the captured image can be increased. Thereby, the image quality of the captured image can be improved.

In the present embodiment, the pixel value arrangement pattern in the image division processing is a pattern corresponding to a bayer array to correspond to the arrangement of color filters in the image pickup section, so that the circuit configuration of the image division processing section can be simplified.

In the present embodiment, in the interpolation process, the interpolation methods for generating the map data MIR22 and MG22 are identical to each other, the interpolation methods for generating the map data MIR21 and MR21 are identical to each other, and the interpolation methods for generating the map data MIR23 and MB23 are identical to each other, so that the circuit configuration of the interpolation processing section 93 can be simplified.

In the present embodiment, whether or not the image segmentation processing, the interpolation processing, and the synthesis processing are executed can be controlled, so that the image quality of the captured image can be improved.

[ modification 4-1]

Each of the modifications of the first embodiment can be applied to the image pickup apparatus 5 according to the above-described embodiment. Specifically, for example, as with the image pickup apparatus 2 (fig. 21) according to the modification of the first embodiment described above, the interpolation method in the interpolation process a2 of the interpolation processing section 93 may be controlled by executing the interpolation control process B1 based on the image map data MPRGB included in the image signal DT 51.

<5. use example of image pickup apparatus >

Fig. 41 shows a use example of the image pickup apparatus 1 and the like according to the above-described embodiment. For example, the above-described image pickup apparatus 1 and the like can be used for various situations in which light such as visible light, infrared light, ultraviolet light, and X-rays is sensed.

Devices for taking images for viewing, e.g. digital cameras or mobile devices with a camera function

Devices for traffic, for example onboard sensors that photograph the front, rear, periphery, interior, etc. of the car for safe driving, such as automatic stopping, driver state recognition, etc.; a monitoring camera for monitoring a running vehicle or a road; or distance measuring sensors for measuring the distance between vehicles

Devices for household appliances, e.g. televisions, refrigerators, or air conditioners, which capture a gesture of a user and operate the appliance according to the gesture

Devices for medical care and health care, such as endoscopes or devices for angiography by receiving infrared light

Devices for security, e.g. surveillance cameras for crime prevention or surveillance cameras for personal authentication

Devices for cosmetic care, for example skin measuring devices for imaging the skin or microscopes for imaging the scalp

Devices for sports, e.g. sports cameras or wearable cameras for sports use, etc.

Devices for agriculture, e.g. cameras for monitoring the condition of farmlands or crops

<6. application example >

< example of application of in-vivo information acquisition system >

Further, the technology according to the present disclosure (present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an in-vivo information acquisition system of a patient using a capsule endoscope.

Fig. 42 is a block diagram depicting a schematic configuration example of an in-vivo information acquisition system of a patient using a capsule-type endoscope to which the technology (present technology) according to an embodiment of the present disclosure can be applied.

The in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.

At the time of examination, the patient swallows the capsule type endoscope 10100. The capsule-type endoscope 10100 has an image capturing function and a wireless communication function, and continuously captures images of the inside of an organ such as the stomach and the intestine (hereinafter, also referred to as in-vivo images) at predetermined intervals while moving inside the organ by peristaltic motion or the like within a certain period of time before it is naturally excreted from the patient. Subsequently, the capsule endoscope 10100 continuously transmits information of in-vivo images to the external control device 10200 outside the body by wireless transmission.

The external control device 10200 integrally controls the operation of the in-vivo information acquisition system 10001. Further, the external control device 10200 receives information of the in-vivo image transmitted from the capsule endoscope 10100 to the external control device, and generates image data for displaying the in-vivo image on a display device (not shown) based on the received information of the in-vivo image.

In the in-vivo information acquisition system 10001, an in-vivo image reflecting the state of the inside of the body of the patient can be acquired in this manner at any time during the period from when the capsule type endoscope 10100 is swallowed until it is excreted.

The configurations and functions of the capsule endoscope 10100 and the external control device 10200 will be described in more detail below.

The capsule endoscope 10100 includes a capsule type casing 10101 in which a light source unit 10111, an image capturing unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are accommodated.

The light source unit 10111 includes, for example, a light source such as a Light Emitting Diode (LED), and irradiates light on an image capturing field of view of the image capturing unit 10112.

The image capturing unit 10112 includes an image capturing element and an optical system including a plurality of lenses disposed at a front stage of the image capturing element. Reflected light (hereinafter referred to as observation light) of light irradiated on body tissue as an observation target is collected by an optical system and introduced onto an image capturing element. In the image pickup unit 10112, incident observation light is photoelectrically converted by the image pickup element, thereby generating an image signal corresponding to the observation light. The image signal generated by the image capturing unit 10112 is supplied to the image processing unit 10113.

The image processing unit 10113 includes a processor such as a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and performs various signal processes on the image signal generated from the image capturing unit 10112. The image processing unit 10113 supplies the image signal on which the signal processing has thus been performed to the wireless communication unit 10114 as raw data.

The wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal on which the signal processing has been performed by the image processing unit 10113, and transmits the obtained image signal to the external control device 10200 through the antenna 10114A. The wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 supplies the control signal received from the external control device 10200 to the control unit 10117.

The power feeding unit 10115 includes, for example, an antenna coil for receiving power, a power regeneration circuit for regenerating power from a current generated in the antenna coil, a booster circuit, and the like. The power feeding unit 10115 generates power using a non-contact charging principle.

The power supply unit 10116 includes a secondary battery and stores the power generated by the power feeding unit 10115. In fig. 42, arrow marks or the like indicating the supply destination of electric power from the power supply unit 10116 are omitted to avoid complicated illustration. However, power stored in the power supply unit 10116 is supplied to and used to drive the light source unit 10111, the image capturing unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117.

The control unit 10117 includes a processor such as a CPU, and appropriately controls driving of the light source unit 10111, the image capturing unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power supply unit 10115 according to a control signal transmitted from the external control device 10200 to the control unit.

The external control device 10200 includes a processor such as a CPU or a GPU, a microcomputer, or a control board on which a processor and a storage element such as a memory are mounted together. The external control device 10200 transmits a control signal to the control unit 10117 of the capsule endoscope 10100 through the antenna 10200A to control the operation of the capsule endoscope 10100. In the capsule endoscope 10100, for example, the irradiation conditions of the light source unit 10111 for irradiating light to the observation target can be changed in accordance with a control signal from the external control device 10200. Further, the image capturing conditions (for example, the frame rate, exposure level, and the like of the image capturing unit 10112) can be changed by a control signal from the external control device 10200. Further, the content of processing in the image processing unit 10113 or the condition (e.g., transmission interval, number of images transmitted, etc.) of transmitting an image signal from the wireless communication unit 10114 may be changed according to a control signal from the external control device 10200.

Further, the external control device 10200 performs various image processes on the image signal transmitted from the capsule endoscope 10100 to the external control device to generate image data for displaying the captured in-vivo image on the display device. For the image processing, various signal processing such as development processing (demosaicing processing), image quality improvement processing (band enhancement processing, super-resolution processing, Noise Reduction (NR) processing, and/or image stabilization processing), and/or enlargement processing (electronic scaling processing) may be performed. The external control device 10200 controls the driving of the display device so that the display device displays the captured in-vivo image based on the generated image data. Alternatively, the external control device 10200 may also control a recording device (not shown) to record the generated image data or control a printing device (not shown) to output the generated image data by printing.

The above has explained an example of an in-vivo information acquisition system to which the technique according to the present disclosure can be applied. The technique according to the present disclosure can be applied to, for example, the image capturing unit 10112 among the above-described components. This can improve detection accuracy.

< application example of endoscopic surgery system >

The technology according to the embodiments of the present disclosure (present technology) can be applied to various products. For example, techniques according to the present disclosure may be applied to endoscopic surgical systems.

Fig. 43 is a diagram depicting a schematic configuration example of an endoscopic surgery system to which the technique (present technique) according to an embodiment of the present disclosure can be applied.

In fig. 43, the following state is shown: a surgeon (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic surgical system 11000. As shown, the endoscopic surgical system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy device 11112, a support arm arrangement 11120 for supporting the endoscope 11100 thereon, and a cart 11200 on which various endoscopic surgical devices are mounted.

The endoscope 11100 includes a lens barrel 11101 and a camera head 11102 connected to a proximal end of the lens barrel 11101, and a region of a predetermined length from a distal end of the lens barrel 11101 is inserted into a body cavity of a patient 11132. In the illustrated example, the endoscope 11100 is depicted as including a hard type lens barrel 11101 as a hard type endoscope. However, the endoscope 11100 may include the flexible lens barrel 11101 as a flexible endoscope.

The lens barrel 11101 has an opening at its distal end, in which an objective lens is fitted. The light source device 11203 is connected to the endoscope 11100 so that light generated by the light source device 11203 is introduced into the distal end of the lens barrel 11101 through a light guide extending inside the lens barrel 11101 and irradiated toward an observation target in the body cavity of the patient 11132 through the objective lens. Note that the endoscope 11100 may be a forward-viewing endoscope, or may be an oblique-viewing endoscope or a side-viewing endoscope.

An optical system and an image pickup element are provided inside the camera head 11102 so that reflected light (observation light) from an observation target is condensed on the image pickup element by the optical system. Photoelectric conversion is performed on the observation light by the image pickup element to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image. The image signal is transmitted as raw data to a camera control unit (CCU) 11201.

The CCU11201 includes a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), and the like, and integrally controls the operations of the endoscope 11100 and the display device 11202. In addition, the CCU11201 receives an image signal from the camera head 11102, and performs various image processes such as a development process (demosaicing process) on the image signal to display an image based on the image signal.

Under the control of the CCU11201, the display device 11202 displays an image thereon based on the image signal on which the image processing has been performed by the CCU 11201.

The light source device 11203 includes, for example, a light source such as a Light Emitting Diode (LED), and supplies irradiation light to the endoscope 11100 when imaging a surgical region.

The input device 11204 is an input interface of the endoscopic surgical system 11000. A user can input various types of information or instructions in the endoscopic surgical system 11000 through the input device 11204. For example, the user can input an instruction or the like for changing the image capturing conditions (the type of irradiation light, the magnification factor, the focal length, and the like) of the endoscope 11100.

The treatment tool control device 11205 controls the driving of the energy device 11112 for cauterizing tissue, cutting tissue, sealing blood vessels, and the like. The pneumoperitoneum device 11206 feeds gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, thereby ensuring the field of view of the endoscope 11100 and ensuring the working space for the procedure. The recorder 11207 is a device capable of recording various information relating to the operation. The printer 11208 is a device capable of printing various information related to the operation in various forms such as text, images, and diagrams.

It is to be noted that the light source device 11203 that supplies irradiation light to the endoscope 11100 at the time of imaging the surgical region may include a white light source constituted by, for example, an LED, a laser light source, or a combination thereof. In the case where the white light source includes a combination of red, green, and blue (RGB) laser light sources, since the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, the white balance of a captured image can be adjusted by the light source device 11203. In addition, in this case, by irradiating the observation target with the laser beams from the RGB laser light sources in a time-division manner and controlling the driving of the image pickup element of the camera head 11102 in synchronization with the irradiation timing, images respectively corresponding to R, G, and B can be picked up in a time-division manner. According to this method, a color image can be acquired even if a color filter is not provided for the image pickup element.

In addition, the light source device 11203 may be controlled so that the intensity of the output light is changed every predetermined period of time. By controlling the driving of the image pickup element of the camera head 11102 in synchronization with the change timing of the light intensity so as to acquire images in a time-division manner and synthesizing the images, an image of a high dynamic range free from blocked-up shadows and blown-out highlights can be generated.

Further, the light source device 11203 may be configured to provide light having a predetermined wavelength band for specific light observation. In the specific light observation, for example, by irradiating light of a narrower band than the irradiation light used in normal observation (i.e., white light) and utilizing the wavelength dependence of light absorption in body tissue, a predetermined tissue such as a blood vessel in a surface portion of the mucosa is imaged with high contrast; this is so-called narrow band imaging. Alternatively, in the specific light observation, fluorescence observation for obtaining an image from fluorescence generated by irradiation of excitation light may be performed. In the fluorescence observation, fluorescence from body tissue may be observed by irradiating the body tissue with excitation light (autofluorescence observation), or a fluorescence image may be obtained by locally injecting an agent such as indocyanine green (ICG) into the body tissue and irradiating the body tissue with excitation light corresponding to the fluorescence wavelength of the agent. The light source device 11203 may be configured to provide narrow-band light and/or excitation light suitable for such specific light observation as described above.

Fig. 44 is a block diagram depicting an example of a functional configuration of the camera head 11102 and the CCU11201 depicted in fig. 43.

The camera head 11102 includes a lens unit 11401, an image capturing unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU11201 are connected to communicate with each other via a transmission cable 11400.

The lens unit 11401 is an optical system provided at a position connected to the lens barrel 11101. Observation light acquired from the distal end of the lens barrel 11101 is guided to the camera head 11102 and introduced into the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focus lens.

The number of image capturing elements included in the image capturing unit 11402 may be one (single plate type) or a plurality (multi-plate type). In the case where the image capturing unit 11402 is configured of a multi-panel type, for example, image signals corresponding to R, G and B, respectively, are generated by image capturing elements, and a color image can be obtained by synthesizing these image signals. The image capturing unit 11402 may also be configured to have a pair of image capturing elements for acquiring a right eye image signal and a left eye image signal for three-dimensional (3D) display. If a 3D display is performed, the surgeon 11131 can more accurately sense the depth of the body tissue at the surgical site. Note that in the case where the image capturing unit 11402 is configured as a multi-plate type, a plurality of systems of lens units 11401 may be provided in a manner corresponding to the respective image capturing elements.

In addition, the image capturing unit 11402 is not necessarily provided in the camera head 11102. For example, the image capturing unit 11402 may be disposed immediately after the objective lens inside the lens barrel 11101.

The driving unit 11403 includes an actuator, and the driving unit 11403 moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Accordingly, the magnification and focus of the image captured by the image capturing unit 11402 can be appropriately adjusted.

The communication unit 11404 includes a communication device for transmitting/receiving various information to/from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the image capturing unit 11402 to the CCU11201 as raw data via the transmission cable 11400.

In addition, the communication unit 11404 receives a control signal for controlling the driving of the camera head 11102 from the CCU11201, and supplies the control signal to the camera head control unit 11405. For example, the control signal includes information on image capturing conditions such as information for specifying a frame rate of a captured image and information for specifying an exposure value at the time of image capturing and/or information for specifying a magnification and a focus of the captured image.

It should be noted that image capturing conditions such as a frame rate, an exposure value, a magnification, and a focus may be appropriately specified by a user or automatically set by the control unit 11413 of the CCU11201 based on the obtained image signal. In the latter case, an Auto Exposure (AE) function, an Auto Focus (AF) function, and an Auto White Balance (AWB) function are incorporated in the endoscope 11100.

The camera head control unit 11405 controls driving of the camera head 11102 based on a control signal received from the CCU11201 through the communication unit 11404.

The communication unit 11411 includes a communication device for transmitting/receiving various types of information to/from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 through the transmission cable 11400.

In addition, the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102. The image signal and the control signal may be transmitted by electrical communication, optical communication, or the like.

The image processing unit 11412 performs various image processes on the image signal transmitted in the form of raw data from the camera head 11102.

The control unit 11413 performs various types of control regarding image capturing of an operation area or the like performed by the endoscope 11100 and display of a captured image obtained by image capturing of the operation area or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.

Further, the control unit 11413 controls the display device 11202 to display a captured image for drawing a surgical region or the like based on the image signal on which the image processing has been performed by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, the control unit 11413 may recognize a surgical tool such as forceps, a specific living body part, bleeding, fog when the energy device 11112 is used, or the like by detecting the shape, color, or the like of the edge of the object included in the captured image. When the control unit 11413 controls the display device 11202 to display the photographed image, the control unit 11413 may display various kinds of surgery assistance information in a manner of being superimposed with the image of the surgical region by using the recognition result. By causing the operation assistance information to be displayed and presented to the surgeon 11131 in a superimposed manner, the burden on the surgeon 11131 can be reduced, or the surgeon 11131 can be enabled to perform an operation more reliably.

The transmission cable 11400 connecting the camera head 11102 and the CCU11201 to each other is an electrical signal cable for electrical signal communication, an optical fiber for optical communication, or a composite cable for both electrical communication and optical communication.

Here, in the illustrated example, although the communication is performed by wired communication using the transmission cable 11400, the communication between the camera head 11102 and the CCU11201 may be performed by wireless communication.

Examples of endoscopic surgical systems to which techniques according to the present disclosure may be applied have been described above. The technique according to the present disclosure can be applied to the image capturing unit 11402 and the image processing unit 11412 among the above components. This can improve the image quality of the captured image, thereby allowing the doctor to know the state in the patient's body more accurately.

Note that the endoscopic surgical system is described here as an example, but the technique according to the present disclosure may be applied to other systems, such as a microsurgical system or the like.

< application example of mobile body >

The techniques according to the present disclosure may be applied to various products. For example, the technology according to the embodiments of the present disclosure may be implemented in the form of an apparatus mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, an unmanned aerial vehicle, a ship, a robot, a construction machine, and an agricultural machine (tractor).

Fig. 45 is a block diagram depicting an example of a schematic configuration of a vehicle control system as an example of a mobile body control system to which the technique according to the embodiment of the present disclosure is applicable.

The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the example shown in fig. 45, the vehicle control system 12000 includes a running system control unit 12010, a vehicle body system control unit 12020, an outside-vehicle information detection unit 12030, an inside-vehicle information detection unit 12040, and an integrated control unit 12050. In addition, the microcomputer 12051, the audio/video output section 12052, and the in-vehicle network interface (I/F)12053 are shown as a functional configuration of the integrated control unit 12050.

The running system control unit 12010 controls the operations of the devices related to the running system of the vehicle according to various types of programs. For example, the running system control unit 12010 functions as a control device to control a driving force generation device for generating a driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a brake device for generating a braking force of the vehicle, and the like.

The vehicle body system control unit 12020 controls the operations of various devices provided on the vehicle body according to various types of programs. For example, the vehicle body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a headlamp, a backup lamp, a brake lamp, a turn lamp, or a fog lamp. In this case, a radio wave in place of a key or a signal of various types of switches transmitted from the mobile device may be input into the vehicle body system control unit 12020. The vehicle body system control unit 12020 receives these input radio waves or signals, and controls the door lock device, power window device, lamp, and the like of the vehicle.

The vehicle external information detection unit 12030 detects information about the outside of the vehicle including the vehicle control system 12000. For example, the vehicle external information detection unit 12030 is connected to the imaging unit 12031. The vehicle external information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle, and receives the captured image. Based on the received image, the vehicle external information detection unit 12030 may perform processing for detecting an object such as a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or perform processing for detecting a distance to the vehicle.

The image pickup section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The image pickup section 12031 may output an electric signal as an image, or may output an electric signal as information on a measured distance. In addition, the light received by the image pickup portion 12031 may be visible light or may be invisible light such as infrared light.

The in-vehicle information detection unit 12040 detects information about the interior of the vehicle. For example, the in-vehicle information detection unit 12040 is connected to a driver state detection unit 12041 for detecting the state of the driver. The driver state detection unit 12041 includes, for example, a camera for imaging the driver. Based on the detection information input from the driver state detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue of the driver or the degree of concentration of the driver, or may determine whether the driver is dozing.

The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the brake device based on information about the interior or exterior of the vehicle obtained by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040, and can output a control command to the running system control unit 12010. For example, the microcomputer 12051 may execute cooperative control aimed at realizing Advanced Driver Assistance System (ADAS) functions including vehicle collision avoidance or vehicle collision shock absorption, following travel based on following distance, vehicle speed maintenance travel, vehicle collision warning, vehicle lane departure warning, and the like.

In addition, by controlling the driving force generation device, the steering mechanism, the brake device, or the like based on the information on the inside or outside of the vehicle obtained by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040, the microcomputer 12051 can execute cooperative control intended to achieve autonomous running, which causes the vehicle to run automatically without depending on the operation of the driver.

In addition, the microcomputer 12051 can output a control command to the vehicle body system control unit 12020 based on the vehicle exterior information obtained by the vehicle exterior information detecting unit 12030. For example, the microcomputer 12051 may perform cooperative control intended to prevent glare by controlling headlights and changing high beams to low beams according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle outside information detection unit 12030.

The sound/image output portion 12052 transmits an output signal of at least one of a sound and an image to an output device capable of visually or aurally notifying a passenger of the vehicle or the outside of the vehicle of information. In the example of fig. 45, an audio speaker 12061, a display portion 12062, and a dashboard 12063 are shown as examples of the output device. The display portion 12062 may include, for example, at least one of an in-vehicle display and a heads-up display.

Fig. 46 is a diagram showing an example of the mounting position of the imaging section 12031.

In fig. 46, the image pickup portion 12031 includes image pickup portions 12101, 12102, 12103, 12104, and 12105.

For example, the image pickup portions 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the rear view mirrors, the rear bumper, and the rear door of the vehicle 12100 and an upper portion of the windshield in the vehicle interior. The image pickup portion 12101 provided at the front nose and the image pickup portion 12105 provided at the upper portion of the windshield in the vehicle interior mainly obtain images of the area in front of the vehicle 12100. The image pickup portions 12102 and 12103 provided at the rear view mirrors mainly obtain images of the areas on both sides of the vehicle 12100. The image pickup portion 12104 provided at the rear bumper or the rear door mainly obtains an image of the area behind the vehicle 12100. The image pickup portion 12105 provided at the upper portion of the windshield in the vehicle interior is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.

Note that fig. 46 depicts an example of the imaging ranges of the imaging sections 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the nose. Imaging ranges 12112 and 12113 indicate imaging ranges of the imaging portions 12102 and 12103 provided at the rear view mirror, respectively. The imaging range 12114 indicates the imaging range of the imaging unit 12104 provided at the rear bumper or the rear door. For example, a bird's eye view image of the vehicle 12100 as viewed from above is obtained by superimposing the image data captured by the image capturing sections 12101 to 12104.

At least one of the image pickup portions 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the image pickup portions 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element including pixels for phase difference detection.

For example, the microcomputer 12051 can determine the distance to each three-dimensional (3D) object within the imaging ranges 12111 to 12114 and the temporal change of the distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging sections 12101 to 12104, and thereby extract, as the preceding vehicle, the closest three-dimensional object that exists on the traveling path of the vehicle 12100 and travels at a predetermined speed (e.g., equal to or greater than 0 km/h) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 may set in advance a following distance to be secured to the preceding vehicle, and execute automatic braking control (including following stop control), automatic acceleration control (including following start driving control), and the like. Thus, it is possible to perform cooperative control aimed at achieving automated driving in which the vehicle travels autonomously without depending on the operation of the driver or the like.

For example, the microcomputer 12051 may classify three-dimensional object data of a three-dimensional object into three-dimensional object data of a two-wheeled vehicle, a standard vehicle, a large vehicle, a pedestrian, a utility pole, or other three-dimensional object based on distance information obtained from the image pickup portions 12101 to 12104, extract the classified three-dimensional object data, and automatically avoid an obstacle using the extracted three-dimensional object data. For example, the microcomputer 12051 recognizes obstacles around the vehicle 12100 as obstacles that can be visually recognized by the driver of the vehicle 12100 and obstacles that are difficult for the driver to visually recognize. Then, the microcomputer 12051 determines a collision risk indicating a level of danger of collision with each obstacle. In the case where the risk of collision is equal to or higher than the preset value and thus there is a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display portion 12062, and performs forced deceleration or collision avoidance steering via the running system control unit 12010. Thereby, the microcomputer 12051 can perform driving assistance to avoid a collision.

At least one of the image pickup portions 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured images of the image capturing sections 12101 to 12104. Such identification of a pedestrian is performed, for example, by the following procedure: a process of extracting feature points in the captured images of the image capturing sections 12101 to 12104 as infrared cameras; and a process of determining whether it is a pedestrian by performing pattern matching on a series of feature points representing the outline of the object. In the case where the microcomputer 12051 determines that a pedestrian is present in the captured images of the image capturing portions 12101 to 12104 and thus recognizes the pedestrian, the sound/image output portion 12052 controls the display portion 12062 so that a square contour line is displayed in a superimposed manner on the recognized pedestrian for emphasis. The sound/image output portion 12052 can also control the display portion 12062 to display an icon or the like representing a pedestrian at a desired position.

An example of the vehicle control system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure is applicable to the image pickup portion 12031 among the components described above. This makes it possible to improve the image quality of captured images, allowing the vehicle control system 12000 to grasp, for example, the environment outside the vehicle more accurately. As a result, more accurate driving assistance and the like can be performed.

Although the present technology has been described with reference to some embodiments, modifications, and specific application examples thereof, the present technology is not limited to these embodiments and the like, and may be modified in any manner.

For example, in each of the above-described embodiments, the image pickup apparatus 1 is configured using the image pickup section 10 and the image processing section 20, but this is not limitative. Alternatively, for example, an operation device other than the image pickup apparatus 1 may have the function of the image processing section 20. In this case, the operation device is supplied with an image data file including information on the image map data MPR, MPG, and MPB and the conversion gain CG. This allows the operation device to execute the image division process A1, the interpolation process A2, and the synthesis process A3 based on the image data file. The operation device may be, for example, a personal computer that executes an image processing program.

In addition, in each of the above-described embodiments, for example, the image processing section 20 controls whether or not to execute the image division process A1, the interpolation process A2, and the synthesis process A3 based on the conversion gain CG indicated by the gain signal SGAIN, but this is not limitative. Alternatively, for example, the image pickup section 10 may determine whether or not to execute the image division process A1, the interpolation process A2, and the synthesis process A3, and generate a mode signal indicating the determination result. In this case, the image processing section 20 can operate in accordance with the mode signal.

It should be noted that the effects described herein are merely illustrative and non-limiting and may include other effects.

Note that the present technology can be configured in the following manner.

(1) An image processor comprising:

an image division processing section configured to generate a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

an interpolation processing section configured to generate a plurality of second map data corresponding to the plurality of first map data by: determining a pixel value at a position where no pixel value exists in each of the plurality of first map data by interpolation processing; and

a synthesis processing section configured to generate third map data by: generating a pixel value at a position corresponding to mutually corresponding positions in the plurality of second map data, based on the pixel values at those mutually corresponding positions.

(2) The image processor according to (1), wherein the arrangement pattern is a checkerboard pattern.
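
As an informal illustration of the checkerboard arrangement pattern in (2), the sketch below splits one image map into two map data whose pixel values occupy complementary checkerboard positions; marking missing positions with NaN is an assumption of this sketch.

```python
# Sketch: split an image map into two checkerboard-pattern maps.
# Positions without a pixel value are marked with NaN (assumption).
import numpy as np

def checkerboard_split(image_map: np.ndarray):
    image_map = image_map.astype(float)         # allow NaN markers
    rows, cols = np.indices(image_map.shape)
    even = (rows + cols) % 2 == 0
    map_a = np.where(even, image_map, np.nan)   # values at "even" positions only
    map_b = np.where(~even, image_map, np.nan)  # values at "odd" positions only
    return map_a, map_b
```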

(3) The image processor according to (1) or (2), further comprising an interpolation controller configured to determine a processing method in the interpolation process based on the first image map data.

(4) The image processor according to (3), wherein the interpolation controller is configured to determine the processing method by: determining an interpolation direction in the interpolation process based on the first image map data.
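
One simple way to realize the direction decision in (4) is sketched below: the interpolation direction at a missing position is taken along the axis with the smaller local change. The gradient test is an assumption, not the disclosed criterion.

```python
# Sketch: pick an interpolation direction from local differences (assumption).
import numpy as np

def choose_direction(image_map: np.ndarray, y: int, x: int) -> str:
    """Return 'horizontal' or 'vertical' for the position (y, x)."""
    h, w = image_map.shape
    dh = abs(float(image_map[y, min(x + 1, w - 1)]) -
             float(image_map[y, max(x - 1, 0)]))   # change along the row
    dv = abs(float(image_map[min(y + 1, h - 1), x]) -
             float(image_map[max(y - 1, 0), x]))   # change along the column
    # Interpolate along the direction with the smaller change, i.e. along an edge.
    return "horizontal" if dh <= dv else "vertical"
```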

(5) The image processor according to (3) or (4), wherein the interpolation controller is configured to determine spatial frequency information based on the first image map data, and determine the processing method based on the spatial frequency information.
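
The spatial frequency information in (5) could, for example, be a simple local high-frequency measure, as in the sketch below; the measure and the switching threshold are assumptions made for illustration.

```python
# Sketch: a crude spatial-frequency (local activity) measure per position.
import numpy as np

def local_activity(image_map: np.ndarray) -> np.ndarray:
    """Absolute difference from the 4-neighbour mean; large values indicate
    high spatial frequency (fine detail)."""
    padded = np.pad(image_map.astype(float), 1, mode="edge")
    neighbour_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return np.abs(image_map.astype(float) - neighbour_mean)

def select_interpolation(activity: float, threshold: float = 10.0) -> str:
    # Direction-adaptive interpolation in detailed regions, plain averaging elsewhere.
    return "direction_adaptive" if activity > threshold else "average"
```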

(6) The image processor according to any one of (3) to (5), wherein the interpolation controller is configured to generate synthesized map data based on the first, second, and third image map data, and determine the processing method in the interpolation process based on the synthesized map data.

(7) The image processor according to any one of (1) to (6), wherein,

the image division processing section is configured to generate a plurality of fourth map data based also on second image map data including a plurality of pixel values, the plurality of fourth map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other,

the interpolation processing section is configured to generate a plurality of fifth map data corresponding to the plurality of fourth map data by: determining, with the interpolation process, a pixel value at a position where no pixel value exists in each of the plurality of fourth map data,

the pixel value arrangement patterns in the plurality of first atlas data include a first arrangement pattern and a second arrangement pattern, and

the pixel value arrangement patterns in the plurality of fourth map data include the first arrangement pattern and the second arrangement pattern.

(8) The image processor according to (7), wherein an interpolation method in the interpolation process of the plurality of first map data is the same as an interpolation method in the interpolation process of the plurality of fourth map data.

(9) The image processor according to (7) or (8), wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color,

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color and a plurality of pixel values of a third color.

(10) The image processor according to (7) or (8), wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color, and

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color, a plurality of pixel values of a third color, and a plurality of pixel values of a fourth color.

(11) The image processor according to (7), wherein,

the synthesis processing section is configured to generate sixth map data by: generating pixel values at positions corresponding to mutually corresponding positions in the plurality of fifth map data, based on the pixel values at those mutually corresponding positions,

the image division processing section is configured to generate a plurality of seventh map data based also on third image map data including a plurality of pixel values, the plurality of seventh map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other,

the interpolation processing section is configured to generate a plurality of eighth map data corresponding to the plurality of seventh map data by: determining, with the interpolation process, a pixel value at a position where no pixel value exists in each of the plurality of seventh map data,

the synthesis processing section is configured to generate ninth map data by: generating pixel values at positions corresponding to mutually corresponding positions in the plurality of eighth map data, based on the pixel values at those mutually corresponding positions, and

the pixel value arrangement patterns in the plurality of seventh map data include the first arrangement pattern and the second arrangement pattern.

(12) The image processor according to (11), wherein,

an interpolation method in the interpolation process of the plurality of first map data is the same as an interpolation method in the interpolation process of the plurality of fourth map data and an interpolation method in the interpolation process of the plurality of seventh map data.

(13) The image processor according to (11) or (12), wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of a first color,

the plurality of pixel values in the second image map data comprise a plurality of pixel values of a second color, and

the plurality of pixel values in the third image map data comprise a plurality of pixel values of a third color.

(14) The image processor according to any one of (11) to (13), wherein the number of the plurality of pixel values in the first image map data is different from the number of the plurality of pixel values in the second image map data.

(15) The image processor according to (14), wherein,

the plurality of pixel values in the first image map data comprise a plurality of pixel values of green, and

two or more pixel values in the first image map data are associated with one pixel value in the second image map data.

(16) The image processor according to any one of (1) to (5), further comprising a generator configured to generate the first image map data based on an image signal, wherein the first image map data comprises luminance map data.

(17) The image processor according to any one of (1) to (16), further comprising a process controller configured to control whether or not the image division processing section, the interpolation processing section, and the synthesis processing section execute processing.

(18) The image processor according to (17), further comprising a processing section configured to perform predetermined signal processing based on the first image map data or the third map data,

wherein the process controller is configured to cause the processing section to perform the predetermined signal processing based on the first image map data in a first operation mode and to perform the predetermined signal processing based on the third map data in a second operation mode.

(19) The image processor according to (18), wherein the process controller is configured to control whether or not the image division processing section, the interpolation processing section, and the synthesis processing section execute processing based on a parameter.

(20) The image processor according to (19), wherein,

the first image map data is supplied from an image pickup section,

the parameter includes a gain value in the image pickup section, and

in a case where the gain value is higher than a predetermined gain value, the process controller performs control to cause the image division processing section, the interpolation processing section, and the synthesis processing section to perform processing.
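
A minimal sketch of the parameter-based control in (17) to (20) follows; the numeric gain threshold is an illustrative assumption.

```python
# Sketch: enable the division/interpolation/synthesis path only at high gain.
def run_division_interpolation_synthesis(gain_value: float,
                                         predetermined_gain: float = 8.0) -> bool:
    """True  -> second operation mode: execute the processes, then perform
               signal processing on the third map data.
       False -> first operation mode: perform signal processing directly on
               the first image map data."""
    return gain_value > predetermined_gain
```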

(21) An image processing method comprising:

image division processing: generating a plurality of first map data based on first image map data including a plurality of pixel values, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

interpolation processing: generating a plurality of second map data corresponding to the plurality of first map data by determining, with interpolation processing, a pixel value at a position where no pixel value exists in each of the plurality of first map data; and

synthesis processing: generating third map data by generating, based on the pixel values at mutually corresponding positions in the plurality of second map data, pixel values at positions corresponding to those mutually corresponding positions.
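
The three steps of (21) can be put together as the following self-contained sketch for a single image map, with checkerboard division, 4-neighbour interpolation, and averaging in the synthesis step; all three concrete choices are assumptions made for illustration, not the disclosed implementation.

```python
# Sketch: image division -> interpolation -> synthesis on one image map.
import numpy as np

def divide(image_map: np.ndarray):
    """Split into two checkerboard-pattern maps; NaN marks missing positions."""
    rows, cols = np.indices(image_map.shape)
    even = (rows + cols) % 2 == 0
    return (np.where(even, image_map, np.nan),
            np.where(~even, image_map, np.nan))

def interpolate(sparse_map: np.ndarray) -> np.ndarray:
    """Fill each missing position with the mean of its valid 4-neighbours."""
    padded = np.pad(sparse_map, 1, mode="edge")
    neighbours = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                           padded[1:-1, :-2], padded[1:-1, 2:]])
    return np.where(np.isnan(sparse_map),
                    np.nanmean(neighbours, axis=0), sparse_map)

def synthesize(map_a: np.ndarray, map_b: np.ndarray) -> np.ndarray:
    """Combine pixel values at mutually corresponding positions by averaging."""
    return (map_a + map_b) / 2.0

def process(first_image_map: np.ndarray) -> np.ndarray:
    first_a, first_b = divide(first_image_map.astype(float))   # image division
    second_a, second_b = interpolate(first_a), interpolate(first_b)
    return synthesize(second_a, second_b)                       # third map data
```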

(22) An image pickup apparatus comprising:

an image pickup section that generates first image map data including a plurality of pixel values;

an image division processing section configured to generate a plurality of first map data based on the first image map data, the plurality of first map data having pixel value arrangement patterns different from each other and including pixel values located at positions different from each other;

an interpolation processing section configured to generate a plurality of second map data corresponding to the plurality of first map data by: determining a pixel value at a position where no pixel value exists in each of the plurality of first map data by interpolation processing; and

a synthesis processing section configured to generate third map data by: generating a pixel value at a position corresponding to mutually corresponding positions in the plurality of second map data, based on the pixel values at those mutually corresponding positions.

(23) The image pickup apparatus according to (22), wherein,

the image pickup section includes a plurality of pixels arranged in a predetermined color arrangement, and

each of the arrangement patterns corresponds to the color arrangement.

This application claims the benefit of Japanese Priority Patent Application JP 2018-.

It will be understood by those skilled in the art that various modifications, combinations, sub-combinations and variations may be made in accordance with design requirements and other factors without departing from the scope of the appended claims and their equivalents.
