Image capturing apparatus, image capturing method, and image capturing device

Document No.: 1358576  Publication date: 2020-07-24

Note: This technology, "Image capturing apparatus, image capturing method, and image capturing device", was created by Atsushi Ito and Ilya Reshetouski on 2018-12-07. Abstract: The present invention relates to an imaging device, an imaging method, and an imaging element capable of reducing the influence of diffraction caused by mask openings during lens-less imaging. A band-pass filter that is divided into a plurality of regions, each region transmitting incident light of a different wavelength band, is provided at the front stage of a mask. The mask modulates the incident light of the different wavelength bands that has passed through the band-pass filter, and includes openings of a unit size that minimizes the blur produced by diffraction at each wavelength. This configuration can be applied to a lens-less imaging apparatus.

1. An image capturing apparatus comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band;

a mask that is divided in correspondence with the plurality of regions and modulates incident light in the different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter;

a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal; and

a signal processing section that reconstructs two-dimensional pixel signals captured by the solid-state image capturing device into a final image through signal processing.

2. The image capturing apparatus according to claim 1,

wherein the mask has a mask pattern whose unit size differs for each of the plurality of regions.

3. The image capturing apparatus according to claim 2,

wherein the mask pattern of each of the plurality of regions has a unit size that differs based on the wavelength band of incident light transmitted through the band-pass filter.

4. The image capturing apparatus according to claim 2,

wherein the mask pattern of each of the plurality of regions has a unit size such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image capturing device, the spread of the incident light due to diffraction is substantially minimized.

5. The image capturing apparatus according to claim 1,

wherein the distance from the mask to the image pickup face of the solid-state image pickup device differs for each of the plurality of regions.

6. The image capturing apparatus according to claim 5,

wherein, for each of the plurality of regions, the distance from the mask to the image pickup face of the solid-state image pickup device differs based on the wavelength band of the incident light transmitted through the band-pass filter.

7. The image capturing apparatus according to claim 6,

wherein the distance from the mask to the image pickup face of the solid-state image pickup device is, for each of the plurality of regions, a distance such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image capturing device, the spread of the incident light due to diffraction is substantially minimized.

8. The image capturing apparatus according to claim 5,

wherein the mask has a mask pattern with the same unit size for all of the plurality of regions.

9. The image capturing apparatus according to claim 1, further comprising:

a light-shielding wall that blocks incident light from adjacent regions at boundaries between the plurality of regions of the band-pass filter, the mask, and the solid-state image pickup device.

10. The image capturing apparatus according to claim 1,

wherein the signal processing section includes:

a dividing section that divides the two-dimensional pixel signals captured by the solid-state image capturing device in correspondence with the plurality of regions;

a plurality of image reconstruction sections that each reconstruct one of the divided two-dimensional pixel signals into a final image through signal processing; and

an integrating section that integrates the final images reconstructed by the plurality of image reconstruction sections.

11. The image capturing apparatus according to claim 10,

wherein the integrating section integrates the final images reconstructed by the plurality of image reconstruction sections by superimposing the final images.

12. The image capturing apparatus according to claim 10,

wherein the integrating section performs integration by selecting one of the final images reconstructed by the plurality of image reconstruction sections.

13. The image capturing apparatus according to claim 10,

wherein the integrating section performs integration by selecting at least two of the final images reconstructed by the plurality of image reconstruction sections and superimposing the selected final images.

14. The image capturing apparatus according to claim 1,

wherein a minute gap is formed between the solid-state image pickup device and the mask in the incident direction of the incident light.

15. The image capturing apparatus according to claim 1,

wherein the image capturing apparatus does not include a lens that focuses the incident light onto any of the band-pass filter, the mask, or the solid-state image capturing device.

16. The image capturing apparatus according to claim 1,

wherein the wavelength band of the incident light is about 8 μm to about 14 μm.

17. An image capturing method of an image capturing apparatus, the image capturing apparatus comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band;

a mask that is divided in correspondence with the plurality of regions and modulates incident light in the different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter; and

a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal, the image capturing method comprising:

reconstructing, through signal processing, the two-dimensional pixel signals captured by the solid-state image capturing device into a final image.

18. An image pickup device comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band;

a mask that is divided in correspondence with the plurality of regions and modulates incident light in the different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter; and

a solid-state image pickup device which has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal.

Technical Field

The present disclosure relates to an image capturing apparatus, an image capturing method, and an image capturing device, and particularly relates to an image capturing apparatus, an image capturing method, and an image capturing device that enable capturing of a two-dimensional image with high spatial resolution as a final image even in the case where a wide range of wavelengths is simultaneously acquired in a lens-less image capturing apparatus.

Background

A lens-less image capturing apparatus is an image capturing apparatus that captures an image by using a light modulation mechanism, such as a patterned opening or a diffraction grating, together with a two-dimensional image sensor, without the lens-based image formation used in existing two-dimensional image capturing systems. After capture, signal processing is performed on the pixel signals to reconstruct them into a two-dimensional final image. This makes it possible to reduce the size, weight, and price of the image capturing apparatus, to give it a non-planar form, and so on.

There are several techniques for configuring a lens-less image capturing apparatus.

For example, one proposed technique modulates light entering the image capturing surface of a two-dimensional image sensor through a patterned mask opening, captures an image based on the modulated light, and reconstructs a final image by performing signal processing on the captured image (see Patent Document 1).

In another proposed technique, light entering the image capturing surface of a two-dimensional image sensor is controlled through a Fresnel-structure mask opening, and a final image is reconstructed by performing signal processing based on the Fourier transform (see Non-Patent Document 1).

In still another proposed technique, incident light is modulated by a diffraction grating into a sinusoidal form whose phase differs depending on the angle of incidence, the incident light is captured by a two-dimensional image sensor, and the captured image is reconstructed by signal processing to restore a final image (see Patent Document 2).

Reference list

Patent document

Patent document 1: WO2016/123529A1

Patent document 2: JP 2016-510910A (US 2016003994)

Non-patent document

Non-patent document 1: Lensless Light-field Imaging with Fresnel Zone Aperture, Yusuke Nakamura, Takeshi Shimano, Kazuyuki Tajima, Mayu Sao, Taku Hoshizawa (Hitachi, Ltd.), IWISS2016

Disclosure of Invention

Technical problem

All of the techniques described in Patent Document 1, Patent Document 2, and Non-Patent Document 1 mentioned above are claimed to be scalable methods capable of handling the wavelength of the incident light to be input. Indeed, lens-less image capturing can also be applied to X-ray and γ-ray imaging and is used for astronomical image observation as well. In theory, it can also be applied to millimeter-wave or terahertz-wave imaging.

However, if the optical diffraction phenomenon is considered, the pattern, shape, and the like of the opening that modulates light in each technique need to be optimized for the target wavelength. This is because diffraction is a phenomenon that depends on the wavelength of incident light.

For example, in the technique of Patent Document 1, if the distance between the patterned mask and the image sensor, or the unit size of the aperture pattern, is not optimized for the target wavelength, the amount of blur due to diffraction increases.
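This tradeoff can be illustrated with a simple toy model (an illustrative assumption, not taken from Patent Document 1): for an opening of unit size Δ at distance d from the sensor, the geometric shadow blur grows roughly like Δ, while the diffraction spread grows roughly like λd/Δ, so the total blur is minimized near Δ = √(λd).

```python
import numpy as np

# Toy geometric-plus-diffraction blur model (an assumption for
# illustration).  total_blur combines the geometric blur (~ delta)
# and the diffraction blur (~ wavelength * d / delta).
def total_blur(delta, wavelength, d):
    """Approximate blur width on the sensor (all lengths in meters)."""
    return delta + wavelength * d / delta

wavelength = 10e-6   # 10 um, inside the 8-14 um band mentioned in claim 16
d = 2e-3             # 2 mm mask-to-sensor gap (illustrative value)

# Scan candidate unit sizes and find the one minimizing the total blur.
deltas = np.linspace(1e-6, 1e-3, 100000)
best = deltas[np.argmin(total_blur(deltas, wavelength, d))]
print(best, np.sqrt(wavelength * d))  # both close to ~1.4e-4 m
```

Under this model, the optimal unit size scales with √λ, which is consistent with the claim that each wavelength-band region should use a different unit size.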

In addition, the technique of Non-Patent Document 1 is considered even more susceptible to diffraction: if the modulation period of the mask transmittance is not optimized for the target wavelength, mutual interference due to diffraction occurs.

Further, since the technique of Patent Document 2 exploits diffraction itself, the light reaching the sensor behaves differently depending on the wavelength. If the grating pitch is not designed to produce a phase suitable for the target wavelength, the light inevitably spreads, and spatial blur naturally appears in the image.

As described above, with any of these techniques, when a wide range of wavelengths is acquired simultaneously, the spatial resolution of the two-dimensional image to be reconstructed into a final image may be reduced.

The present disclosure has been made in view of such circumstances, and a specific object of the present disclosure is to enable a two-dimensional image with high spatial resolution to be reconstructed and restored to a final image even in the case where a wide range of wavelengths are simultaneously acquired in a lensless image capturing apparatus.

Solution to the problem

An image capturing apparatus according to an aspect of the present disclosure is an image capturing apparatus including: a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band; a mask which is divided corresponding to the plurality of regions and modulates incident light in the different wavelength bands transmitted through respective regions of the plurality of regions of a band-pass filter; a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal; and a signal processing section that reconstructs the two-dimensional pixel signal captured by the solid-state image capturing device into a final image through signal processing.

The mask may be a mask pattern having a unit size different according to each of the plurality of regions.

The mask pattern for each of the plurality of regions may be a mask pattern having a different unit size based on a wavelength band of incident light transmitted through the band pass filter.

The mask pattern for each of the plurality of regions may have a unit size such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image capturing device, the spread of the incident light due to diffraction is substantially minimized.

The distance from the mask to the image pickup face of the solid-state image pickup device may be different for each of the plurality of regions.

The distance from the mask to the image pickup face of the solid-state image pickup device may be a distance different based on the wavelength band of the incident light transmitted through each of the plurality of regions of the band-pass filter.

The distance from the mask to the image pickup face of the solid-state image pickup device may be, for each of the plurality of regions, a distance such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image capturing device, the spread of the incident light due to diffraction is substantially minimized.

The mask may be a mask pattern having the same unit size for all of the plurality of regions.

A light-shielding wall may also be included that blocks incident light from adjacent regions at boundaries between the plurality of regions of the band-pass filter, the mask, and the solid-state image pickup device.

The signal processing section may include: a dividing section that divides the two-dimensional pixel signals captured by the solid-state image capturing device in correspondence with the plurality of regions; a plurality of image reconstruction sections that each reconstruct one of the divided two-dimensional pixel signals into a final image through signal processing; and an integrating section that integrates the final images reconstructed by the plurality of image reconstruction sections.

The integrating section may integrate the final images reconstructed by the plurality of image reconstruction sections by superimposing them.

The integrating section may perform integration by selecting one of the final images reconstructed by the plurality of image reconstruction sections.

The integrating section may select at least two of the final images reconstructed by the plurality of image reconstruction sections and integrate the selected final images by superimposing them.
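The divide-reconstruct-integrate flow described above can be sketched in a few lines. This is a minimal toy version, assuming an idealized linear modulation per region and averaging as the superposition rule; the region layout, matrices, and values are illustrative, not the patent's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                                   # pixels per region (flattened)
num_regions = 4

# One modulation (coefficient) matrix per region: each region has its
# own mask pattern, hence its own linear forward model.
mats = [rng.standard_normal((n, n)) + n * np.eye(n) for _ in range(num_regions)]
scene = rng.random(n)                    # the final image we hope to recover

# Dividing section: the sensor output, one pixel-signal block per region.
captured = [m @ scene for m in mats]

# Image reconstruction sections: invert each region's modulation.
reconstructed = [np.linalg.solve(m, y) for m, y in zip(mats, captured)]

# Integrating section: superimpose (here, average) the per-region results.
final = np.mean(reconstructed, axis=0)
print(np.allclose(final, scene))  # prints True
```

Selecting a single region's result, or superimposing a chosen subset, corresponds to the other integration variants described above.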

A minute gap may be formed between the solid-state image pickup device and the mask in the incident direction of the incident light.

The image capturing apparatus may have a configuration that does not include a lens that focuses the incident light onto any of the band-pass filter, the mask, or the solid-state image capturing device.

The wavelength band of the incident light may be about 8 μm to about 14 μm.

An image capturing method according to an aspect of the present disclosure is an image capturing method of an image capturing apparatus including: a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band; a mask that is divided in correspondence with the plurality of regions and modulates the incident light in the different wavelength bands transmitted through the band-pass filter; and a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal. The image capturing method includes reconstructing, through signal processing, the two-dimensional pixel signals captured by the solid-state image capturing device into a final image.

An image capturing device according to an aspect of the present disclosure is an image capturing device including: a band-pass filter divided into a plurality of regions, each of the plurality of regions transmitting incident light in a different wavelength band; a mask which is divided in correspondence with the plurality of regions and modulates incident light of different wavelength bands that is transmitted through corresponding regions of the plurality of regions of the band-pass filter, respectively; and a solid-state image pickup device that has an image pickup plane divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal.

The invention has the advantages of

According to an aspect of the present disclosure, in particular, even in the case where a wide range of wavelengths is simultaneously acquired in a lensless image capturing apparatus, a two-dimensional image having a high spatial resolution can be reconstructed and restored to a final image.

Drawings

Fig. 1 is a diagram for explaining an overview of a lens-less image capturing apparatus.

Fig. 2 is a diagram for explaining the image capturing principle in the lens-less image capturing apparatus.

Fig. 3 is a diagram for explaining an overview of the first embodiment of the present disclosure.

Fig. 4 is a block diagram for explaining a configuration example of a lens-less image capturing apparatus of the present disclosure.

Fig. 5 is a diagram for explaining the operation of the band pass filter.

Fig. 6 is a diagram for explaining a change in an image observed when modulated light is captured in the case where the unit size of the mask and the wavelength of incident light are changed.

Fig. 7 is a diagram for explaining a configuration example of the signal processing section in fig. 4.

Fig. 8 is a flowchart for explaining the image capturing process.

Fig. 9 is a diagram for explaining a configuration example of the second embodiment of the present disclosure.

Fig. 10 is a diagram for explaining a change in an image observed when modulated light is captured with a change in the distance from the mask to the solid-state image capturing device and the wavelength of incident light.

Fig. 11 is a diagram for explaining a configuration example of the third embodiment of the present disclosure.

Detailed Description

Hereinafter, suitable embodiments of the present disclosure are described in detail with reference to the accompanying drawings. Note that in the present specification and the drawings, constituent elements having substantially the same functional configuration have the same reference numerals, and therefore, duplicate descriptions of these constituent elements are omitted.

Hereinafter, a mode for carrying out the present technology will be explained. The description is given in the following order.

1. Overview of a lens-less image capture device

2. Overview of the disclosure

3. First embodiment

4. Second embodiment

5. Third embodiment

<1. Overview of the lens-less image capturing apparatus>

Before explaining the configuration of the present disclosure, an overview of the lens-less image capturing apparatus is explained by comparison with the configuration of a typical image capturing apparatus.

An example of the configuration of a typical image capturing apparatus includes a pinhole camera as shown at the lower right portion in fig. 1.

In the case of an image capturing apparatus including a pinhole, as shown in the lower right part of fig. 1, light rays L1 to L3 emitted from mutually different light sources on the object plane pass through the pinhole 21 and are captured as images at pixels I1 to I3 on the solid-state image pickup device 11.

In the case of a pinhole camera, an image is formed on the solid-state image pickup device 11 using, of the light rays L1 to L3 emitted from the respective light sources, only the single ray per pixel that enters the corresponding pixel on the solid-state image pickup device 11.

In view of this, in an apparatus as shown at the upper right part of fig. 1, an image capturing lens 32 is disposed in the middle of the light-shielding film 31. The image capturing lens 32 condenses the light rays L1 to L3, as shown by light rays I11 to I13, and forms the respective images on the solid-state image pickup device 11, where they are captured.

In the case of the upper right part of fig. 1, an image is formed on the solid-state image pickup device 11 with light whose intensity equals the sum of the intensities of all the light rays L1 to L3, so that an image with a sufficient amount of light is captured at each pixel of the solid-state image pickup device 11.

As shown at the upper right part of fig. 1, when the image capturing lens 32 is used, a set of individual point light sources can be regarded as constituting at least a part of the subject. Therefore, in capturing an image of a subject, the light rays emitted from the plurality of point light sources on the object plane are condensed, and the image of the subject thus formed is captured.

As explained with reference to the upper right part of fig. 1, the image capturing lens 32 serves to guide the light rays (i.e., diffused light) exiting from each point light source onto the solid-state image pickup device 11. An image corresponding to the final image is thereby formed on the solid-state image pickup device 11, and the image formed from the detection signals detected at the respective pixels becomes a captured image in which an image of the subject is formed.

However, since the size of an image capturing apparatus (image capturing device) is determined by its image capturing lens and the focal length of that lens, there is a limit to how much the apparatus can be miniaturized.

In view of this, in the apparatus shown at the upper left part of fig. 1, an image of a subject on the object plane is captured by using the solid-state image pickup device 11 and a mask 51, without providing an image capturing lens or a pinhole.

At the upper left part of fig. 1, a mask 51 including opening portions 51a having a plurality of sizes is disposed in front of the solid-state image pickup device 11, and light rays L1 to L3 from the respective light sources are modulated, enter the image capturing plane of the solid-state image pickup device 11, and are received by the respective pixels on the solid-state image pickup device 11.

Here, as shown in the lower left part of fig. 1, the opening portions 51a and light-blocking portions 51b of the mask 51 have sizes set randomly in the horizontal and vertical directions in units of a unit size Δ, thereby forming a mask pattern on the mask 51. The unit size Δ is at least larger than the pixel size. Further, a gap of a minute distance d is provided between the solid-state image pickup device 11 and the mask 51, and, at the lower left part of fig. 1, the pitch between pixels on the solid-state image pickup device 11 is set to w. With this configuration, the light rays L1 to L3 are modulated before entering the solid-state image pickup device 11 in a manner that depends on the unit size Δ and the distance d.

More specifically, for example, as shown at the upper left part of fig. 2, the light sources of the light rays L1 to L3 at the upper left part of fig. 1 may be point light sources PA, PB, and PC, whose rays, after passing through the mask 51, enter positions Pa, Pb, and Pc on the solid-state image pickup device 11 with light intensities a, b, and c, respectively.

In the case of the lens-less image capturing apparatus, as shown at the upper left of fig. 2, incident light is modulated as it passes through the randomly arranged opening portions 51a of the mask 51, which gives the detection sensitivity of each pixel a directivity according to the incident angle of the light. Giving the detection sensitivity incident-angle directivity here means that the photosensitive characteristic with respect to the incident angle of incident light differs depending on the region on the solid-state image pickup device 11.

That is, assuming that the light sources constituting at least a part of the object plane 71 are point light sources, rays of the same intensity emitted from the same point light source enter the solid-state image pickup device 11; however, because the rays are modulated by the mask 51, their incident angles differ between regions on the image capturing surface of the solid-state image pickup device 11. Because of this region-dependent difference in incident angle produced by the mask 51, the photosensitive characteristic differs between regions; that is, the regions have incident-angle directivity. Therefore, with the mask 51 disposed in front of the image capturing surface of the solid-state image pickup device 11, even rays of the same intensity are detected with different sensitivities in different regions, and detection signals with different levels are detected in the respective regions.
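A crude 1-D toy model of this incident-angle directivity (an illustrative assumption, not the patent's geometry): a binary mask at a small distance from the sensor casts a shifted copy of its pattern for each incidence angle, so each sensor position weights each point source differently.

```python
import numpy as np

rng = np.random.default_rng(1)
mask = rng.integers(0, 2, 32)   # random opening (1) / light-blocking (0) pattern
d = 4                           # mask-to-sensor gap, in pixel units for simplicity

def sensitivity(pixel, angle_steps):
    # The mask's shadow shifts with the incidence angle; here the angle is
    # expressed directly as a shift in pixels (a simplifying assumption).
    return mask[(pixel + d * angle_steps) % mask.size]

# Three point sources at different angles -> three coefficient columns.
angles = [0, 1, 2]
coeffs = np.array([[sensitivity(p, a) for a in angles] for p in range(mask.size)])

# Different rows weight the same three sources differently: this is the
# incident-angle directivity that makes reconstruction possible.
print(coeffs[:3])
```

Each row of `coeffs` plays the role of one coefficient set (α, β, γ) in the equations that follow.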

More specifically, as shown at the upper right portion in fig. 2, the detection signal levels DA, DB, and DC of the pixels at the positions Pa, Pb, and Pc on the solid-state image pickup device 11 are represented by the following formulas (1) to (3), respectively.

DA=α1×a+β1×b+γ1×c···(1)

DB=α2×a+β2×b+γ2×c···(2)

DC=α3×a+β3×b+γ3×c···(3)

Here, α1 is a coefficient for the detection signal level a to be restored at the position Pa on the solid-state image pickup device 11, set according to the incident angle of the light ray from the point light source PA on the object plane 71.

In addition, β1 is a coefficient for the detection signal level b to be restored at the position Pa on the solid-state image pickup device 11, set according to the incident angle of the light ray from the point light source PB on the object plane 71.

Further, γ1 is a coefficient for the detection signal level c to be restored at the position Pa on the solid-state image pickup device 11, set according to the incident angle of the light ray from the point light source PC on the object plane 71.

Therefore, (α1 × a) in the detection signal level DA indicates the detection signal level at the position Pa of the light from the point light source PA.

In addition, (β1 × b) in the detection signal level DA indicates the detection signal level at the position Pa of the light from the point light source PB.

Further, (γ1 × c) in the detection signal level DA indicates the detection signal level at the position Pa of the light from the point light source PC.

Therefore, the detection signal level DA is represented as the composite of the components of the point light sources PA, PB, and PC at the position Pa, multiplied by the coefficients α1, β1, and γ1, respectively. Hereinafter, the coefficients α1, β1, and γ1 are collectively referred to as a coefficient set.

Similarly, the coefficient set α2, β2, and γ2 for the detection signal level DB at the position Pb corresponds to the coefficient set α1, β1, and γ1 for the detection signal level DA at the position Pa. In addition, the coefficient set α3, β3, and γ3 for the detection signal level DC at the position Pc likewise corresponds to the coefficient set α1, β1, and γ1 for the detection signal level DA at the position Pa.

It should be noted, however, that the detection signal levels of the pixels at the positions Pa, Pb, and Pc are values represented by sums of products of the light intensities a, b, and c of the rays emitted from the point light sources PA, PB, and PC and the respective coefficients. Therefore, these detection signal levels are mixtures of the light intensities a, b, and c, and thus differ from the detection signal levels that would form an image of the subject.

That is, a system of equations is formed using the coefficient sets α1, β1, and γ1; α2, β2, and γ2; and α3, β3, and γ3, together with the detection signal levels DA, DB, and DC, and the system is solved for the light intensities a, b, and c, thereby obtaining the pixel values at the respective positions Pa, Pb, and Pc as shown at the lower right part of fig. 2.
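Equations (1) to (3) form a 3×3 linear system that can be solved directly. The numeric coefficient values below are made-up illustrative values, not from the document.

```python
import numpy as np

# Coefficient sets for the three sensor positions Pa, Pb, Pc
# (rows) and the three point sources PA, PB, PC (columns).
A = np.array([[0.6, 0.3, 0.1],    # alpha1, beta1, gamma1
              [0.2, 0.7, 0.4],    # alpha2, beta2, gamma2
              [0.5, 0.1, 0.8]])   # alpha3, beta3, gamma3
abc = np.array([1.0, 2.0, 3.0])   # true light intensities a, b, c

D = A @ abc                        # detection signal levels DA, DB, DC (eqs. 1-3)
recovered = np.linalg.solve(A, D)  # pixel values at Pa, Pb, Pc
print(recovered)                   # approximately [1. 2. 3.]
```

In practice the system has one equation per sensor pixel and one unknown per scene point, but the principle is the same: the captured detection signals are a known linear mixture of the unknown pixel values.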

In addition, since the coefficient sets α1, β1, and γ1; α2, β2, and γ2; and α3, β3, and γ3 change when the distance between the solid-state image pickup device 11 and the object plane 71 shown at the upper left of fig. 2 changes, restored images (final images) of object planes at various distances can be reconstructed by changing the coefficient sets.

Therefore, by switching among coefficient sets corresponding to various distances, images of object planes at various distances from the image capturing position can be reconstructed from a single capture.
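This post-capture refocusing can be sketched as follows: one captured signal is reconstructed against several candidate coefficient matrices, one per assumed subject distance. The matrices and distances are illustrative assumptions; only the matrix matching the true distance reconstructs the scene exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
# One coefficient matrix per candidate subject distance (in meters).
coeff_sets = {0.5: rng.standard_normal((n, n)) + n * np.eye(n),
              1.0: rng.standard_normal((n, n)) + n * np.eye(n),
              2.0: rng.standard_normal((n, n)) + n * np.eye(n)}

scene = rng.random(n)
captured = coeff_sets[1.0] @ scene       # subject actually at 1.0 m

# A single capture can be reconstructed against every candidate distance;
# the reconstruction error vanishes only at the correct distance.
errors = {dist: float(np.linalg.norm(np.linalg.solve(M, captured) - scene))
          for dist, M in coeff_sets.items()}
best = min(errors, key=errors.get)
print(best)
```

In a real system the true scene is of course unknown; the point here is only that the same captured signal supports reconstruction under any of the stored coefficient sets.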

Therefore, in image capturing with a lens-less image capturing apparatus, there is no need to worry about the phenomenon generally called defocus, which occurs when an image is captured out of focus with a lens-based image capturing apparatus. As long as the subject to be captured is included in the angle of view, images of object planes at various distances can be reconstructed after capture by switching the coefficient set to the one corresponding to each distance.

Note that the detection signal levels shown at the upper right portion of fig. 1 do not correspond to an image formed of the subject and are therefore not pixel values. In addition, the detection signal levels shown at the lower right portion of fig. 1 are each the signal value of a pixel corresponding to an image formed of the subject, that is, the value of each pixel of the restored image (final image), and are thus pixel values.

With such a configuration, what is commonly referred to as a lens-less image capturing apparatus, which requires neither an image capturing lens nor a pinhole, can be realized. Because an image capturing lens, a pinhole, and the like are not necessary configurations, the height of the image capturing apparatus can be reduced; that is, the thickness in the incident direction of light can be made thin while an image capturing function is still realized. In addition, by changing the coefficient set in various ways, final images (restored images) on the object plane at various distances can be reconstructed and restored.

Note that an image which is captured by the solid-state image capturing device and is not reconstructed is hereinafter simply referred to as a captured image, and an image which is reconstructed and restored by performing signal processing on the captured image is referred to as a final image (restored image). Therefore, from a single captured image, by variously changing the above-mentioned coefficient set, it is possible to reconstruct images on the object plane 71 at various distances into a final image.

However, when the light rays L1 to L3 enter the opening portions 51a provided in the mask 51 of the lens-less image capturing apparatus, diffraction occurs as they exit, so that the incident light is diffused and appears as blur in the image captured by the solid-state image pickup device 11.

Meanwhile, the degree of influence of diffraction can be changed by the wavelength of incident light and the unit size which is the minimum unit for adjusting the size of the opening portion 51 a.

In view of this, in the lens-less image capturing apparatus of the present disclosure, the mask 51 and the solid-state image pickup device 11 are divided into a plurality of regions, and a band-pass filter that limits the wavelength band of incident light is provided in front of the mask 51 for each divided region. Then, the opening portions 51a are formed on the mask 51 with a unit size that matches the wavelength band of light transmitted by the band-pass filter of each region, so that the diffusion of incident light caused by diffraction is reduced.

With such a configuration, the captured image is captured in each region while appropriately reducing the influence of diffraction according to the wavelength band of the incident light set for each region, the final image of each region is reconstructed by performing signal processing, and the final images are integrated. Therefore, it is possible to reduce the influence of blur due to diffraction in a lens-less image capturing apparatus and reconstruct a final image with high spatial resolution in image capturing by using incident light in an incident light wavelength band having a wide range.

<2. overview of the present disclosure >

Next, an overview of an image pickup device of a lens-less image pickup apparatus of the present disclosure will be described with reference to fig. 3. Note that the left part in fig. 3 shows a configuration example of an image pickup device in a typical lens-less image pickup apparatus, the upper left part is a top view of the mask 51, and the lower left part is a perspective view of the mask 51 and the solid-state image pickup device 11 when viewed laterally and obliquely from above. In addition, the right part in fig. 3 shows a configuration example of an image pickup device in the lens-less image pickup apparatus of the present disclosure, the upper right part is a plan view of the mask 102, and the lower right part is a perspective view of the mask 102 and the solid-state image pickup device 101 in a state where the band-pass filter 103 is disposed in front of the mask 102 when viewed laterally and obliquely from above.

As shown in the left part in fig. 3, for example, in an image pickup device of a typical lens-less image pickup apparatus, the unit size of the opening portion 51a of the mask 51 is uniformly set for all the areas, and in the solid-state image pickup device 11, a single image as a whole is picked up by using light that has passed through the mask 51.

In contrast, in the image pickup device of the lens-less image capturing apparatus of the present disclosure, each of the solid-state image pickup device 101, the mask 102, and the band-pass filter 103 is divided into two in each of the horizontal and vertical directions, that is, into four regions in total, and four images of the same subject are captured by using incident light in mutually different wavelength bands entering the respective regions.

More specifically, the mask 102 is divided into a total of four regions, regions 102A to 102D, and in the regions 102A to 102D, opening portions 102a to 102d having different unit sizes are provided, respectively.

In addition, the unit sizes of the opening portions 102a to 102d are set according to the wavelength bands of light transmitted through the respective regions 103A to 103D of the band-pass filter 103 disposed in front of the mask 102.

That is, the unit size of the opening portions 102a of the region 102A is set to a unit size that minimizes the influence of diffraction on light in the wavelength band transmitted through the region 103A of the band-pass filter 103.

In addition, the unit size of the opening portions 102b of the region 102B is set to a unit size that minimizes the influence of diffraction on light in the wavelength band transmitted through the region 103B of the band-pass filter 103.

Further, the unit size of the opening portions 102c of the region 102C is set to a unit size that minimizes the influence of diffraction on light in the wavelength band transmitted through the region 103C of the band-pass filter 103.

In addition, the unit size of the opening portions 102d of the region 102D is set to a unit size that minimizes the influence of diffraction on light in the wavelength band transmitted through the region 103D of the band-pass filter 103.

The solid-state image pickup device 101 is also divided into four regions, regions 101A to 101D, so that these regions correspond to the regions 102A to 102D, and the region 101A of the solid-state image pickup device 101 picks up a picked-up image a formed with light in a wavelength band transmitted through the region 103A of the band-pass filter 103.

The region 101B of the solid-state image pickup device 101 picks up a picked-up image B formed with light in a wavelength band transmitted through the region 103B of the band-pass filter 103.

Further, the region 101C of the solid-state image pickup device 101 picks up a picked-up image C formed with light in a wavelength band transmitted through the region 103C of the band-pass filter 103.

In addition, the region 101D of the solid-state image pickup device 101 picks up a picked-up image D formed with light in a wavelength band transmitted through the region 103D of the band-pass filter 103.

In this way, the captured images a to D, each less affected by blur due to diffraction in its respective wavelength band, can be captured. Then, the final images a to D are generated by performing the signal processing described above for the lens-less image capturing apparatus on the captured images a to D, and the final images a to D are combined and integrated into a single final image, so that a single final image with high spatial resolution can be restored.

<3. first embodiment >

Next, a configuration example of the lens-less image capturing apparatus of the present disclosure is explained with reference to fig. 4.

Fig. 4 shows a configuration example of a transverse cross section of the lens-less image capturing apparatus 111 of the present disclosure. More specifically, the lens-less image capturing apparatus 111 in fig. 4 includes a control section 121, an image pickup device 122, a signal processing section 123, a display section 124, and a storage section 125.

The control section 121 includes a processor and the like, and controls the overall operation of the lens-less image pickup device 111.

The image pickup device 122 has the configuration described with reference to the right part in fig. 3, picks up an image formed with a pixel signal according to the light amount of incident light from the subject indicated by the right arrow in fig. 4 as a picked-up image, and outputs the picked-up image to the signal processing section 123.

More specifically, the image pickup device 122 corresponds to the right part in fig. 3, and includes a band-pass filter 103, a mask 102, and a solid-state image pickup device 101 from the left side of the figure.

As explained with reference to fig. 3, the band-pass filter 103 is divided into two regions in both the horizontal direction and the vertical direction, that is, four regions in total, regions 103A to 103D, and the different regions transmit light in different wavelength bands of incident light.

For example, as shown in fig. 5, the region 103A of the band-pass filter 103 transmits incident light in the wavelength band ZA from the predetermined wavelength λ 1 to λ 1+ α, the region 103B transmits incident light in the wavelength band ZB from λ 1+ α to λ 1+2 α, the region 103C transmits incident light in the wavelength band ZC from λ 1+2 α to λ 1+3 α, and the region 103D transmits incident light in the wavelength band ZD from λ 1+3 α to λ 1+4 α.

Here, λ 1 is a predetermined wavelength of incident light, and α is a predetermined constant.

Therefore, the regions 103A to 103D of the band-pass filter 103 transmit incident light in the wavelength bands ZA to ZD of the incident light that are different from each other. Note that although the wavelength bands ZA to ZD are wavelength bands having almost the same width in fig. 5, they do not have to have the same width, but may have different widths.

As shown in the right part of fig. 3, the mask 102 is divided into regions 102A to 102D so that these regions correspond to the regions 103A to 103D of the band-pass filter 103. In each region of the mask, the unit size Δ (refer to the lower left portion of fig. 1), which determines the interval between the light shielding portions and the opening portions, is set so that the influence of diffraction is minimized for the wavelength of light to be transmitted.

For example, in the case where the interval between the mask 102 and the solid-state image pickup device 101 is a predetermined distance d, the magnitude of diffraction occurring in the case where incident light transmits through the mask 102 varies depending on the wavelength of the transmitted light and the unit size Δ for adjusting the size of the opening portion.

For example, as shown in the uppermost row in fig. 6, when the wavelength of incident light is the wavelength λ 1, and the size of one opening portion (i.e., the unit size Δ) changes from Δ 1 to Δ 1+7 β in increments of a predetermined width β from the left side in the figure, the captured image of incident light passing through one opening portion captured by the solid-state image capturing device 101 changes as shown in captured images F1 to F8.

If diffraction occurs due to the opening portion, an incident light image formed with the incident light does not become a dot-like image on the image pickup surface of the solid-state image pickup device 101 but becomes an image that exhibits scattered and diffused light, and further, the magnitude of the diffusion varies depending on the degree of influence of the diffraction.

That is, in the case where the wavelength of incident light on the uppermost row in fig. 6 is the wavelength λ 1, the photographed image F4 (unit size Δ = Δ 1+3 β) exhibits the brightest spot having the smallest diameter, and the influence of diffraction can be considered to be minimized.

In addition, as shown in the uppermost second row in fig. 6, when the wavelength of incident light is the wavelength λ 1+ α, and the unit size Δ is changed from Δ 1 to Δ 1+7 β in increments of a predetermined width β, the captured images of the incident light passing through one opening portion captured by the solid-state image capturing device 101 are changed as shown in captured images F11 to F18.

That is, in the case where the wavelength of incident light on the second row in fig. 6 is the wavelength λ 1+ α, the photographed image F15 (unit size Δ = Δ 1+4 β) exhibits the brightest spot having the smallest diameter, and the influence of diffraction can be considered to be minimized.

Further, as shown in the uppermost third row in fig. 6, when the wavelength of incident light is the wavelength λ 1+2 α, and the unit size Δ is changed from Δ 1 to Δ 1+7 β in increments of a predetermined width β, the captured images of the incident light passing through one opening portion captured by the solid-state image capturing device 101 are changed as shown in captured images F21 to F28.

That is, in the case where the wavelength of the incident light of the third row in fig. 6 is the wavelength λ 1+2 α, the photographed image F25 (unit size Δ = Δ 1+4 β) exhibits the brightest spot having the smallest diameter, and the influence of diffraction can be considered to be minimized.

In addition, as shown in the uppermost fourth row in fig. 6, when the wavelength of incident light is the wavelength λ 1+3 α, and the unit size Δ is changed from Δ 1 to Δ 1+7 β in increments of a predetermined width β, the captured images of the incident light passing through one opening portion captured by the solid-state image capturing device 101 are changed as shown in captured images F31 to F38.

That is, in the case where the wavelength of incident light on the fourth row in fig. 6 is the wavelength λ 1+3 α, the photographed image F36 (unit size Δ = Δ 1+5 β) exhibits the brightest spot having the smallest diameter, and the influence of diffraction can be considered to be minimized. In addition, as the unit size Δ decreases as shown in the photographed images F35, F34, F33, F32, and F31, or as the unit size Δ increases as shown in the photographed images F37 and F38, the spot near the center spreads, and the influence of diffraction gradually increases.

In view of this, in the case where the regions 103A to 103D of the band-pass filter 103 transmit light in the wavelength bands ZA to ZD shown in fig. 5, respectively, setting the unit sizes Δ of the opening portions 102a to 102d in the corresponding regions 102A to 102D of the mask 102 to Δ 1+3 β, Δ 1+4 β, Δ 1+4 β, and Δ 1+5 β, respectively, makes it possible to modulate the incident light and cause it to enter the solid-state image pickup device 101 in a state in which the influence of diffraction is minimized for each wavelength band.
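The idea that each wavelength band has its own diffraction-minimizing unit size can be sketched with a simplified scalar model (an assumption for illustration, not the computation used in the present disclosure): if the spot width on the sensor is approximated as the opening size plus the single-slit diffraction spread, w(Δ) = Δ + 2λd/Δ, it is minimized at Δ = sqrt(2λd), so longer wavelengths call for larger unit sizes:

```python
import math

def optimal_unit_size(wavelength_m: float, gap_m: float) -> float:
    """Unit size minimizing blur under a simplified scalar model.

    Assumed model (not from the source): spot width on the sensor is
    w(delta) = delta + 2 * wavelength * gap / delta, i.e. the geometric
    opening size plus the single-slit diffraction spread over the gap.
    Setting dw/ddelta = 0 gives delta = sqrt(2 * wavelength * gap).
    """
    return math.sqrt(2.0 * wavelength_m * gap_m)

# With a hypothetical 1 mm mask-to-sensor gap, longer wavelength bands
# get larger unit sizes, matching the ordering shown in fig. 6.
for lam_um in (8.0, 9.5, 11.0, 12.5):
    delta_um = optimal_unit_size(lam_um * 1e-6, 1e-3) * 1e6
    print(f"{lam_um} um -> unit size {delta_um:.1f} um")
```

For 8 μm light and a 1 mm gap this model gives a unit size of roughly 126 μm; the exact constant depends on the blur criterion chosen, which is why the source determines the optimum empirically per band.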

Therefore, the regions 101A to 101D of the solid-state image pickup device 101 can take four taken images of the respective wavelength bands in which the influence of the blur due to diffraction is minimized for each wavelength band.

The solid-state image pickup device 101 is at least partially constituted by a CMOS image sensor or the like, captures an image formed by pixel signals according to the light amount of incident light in each pixel unit, and outputs the captured image to the signal processing section 123. In addition, the solid-state image pickup device 101 is divided into regions 101A to 101D so that these regions correspond to the regions 103A to 103D of the band-pass filter 103 and the regions 102A to 102D of the mask 102, respectively; a total of four captured images a to D of the same spatial range, formed with incident light in different wavelength bands, are captured in the different regions and output to the signal processing section 123. Note that although the single captured image formed across the regions 101A to 101D consists of four images a to D of the same subject between which parallax is generated, the influence of parallax between the captured images a to D is ignored in the description given herein.

For the image signals of the captured images a to D supplied from the image pickup device 122, the signal processing section 123 reconstructs the final images a to D of the regions 101A to 101D by, for example, solving the system of equations as explained using fig. 2 and equations (1) to (3), and integrates them. The signal processing section 123 then outputs the result as a single final image to the display section 124 to be displayed, or causes the final image to be stored in the storage section 125.

Note that the detailed configuration of the signal processing section 123 is mentioned in detail below with reference to fig. 7.

< example of arrangement of Signal processing section >

Next, a detailed configuration example of the signal processing section 123 will be explained with reference to fig. 7.

The signal processing section 123 includes a signal area dividing section 131, an image reconstruction section 132, and an image integration section 133.

The signal area dividing section 131 divides the image supplied from the image pickup device 122 into picked-up images a to D each formed with a pixel signal of a corresponding one of the areas 101A to 101D, and outputs the picked-up images a to D to the image reconstruction processing sections 151A to 151D of the image reconstruction section 132, respectively.
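The division performed by the signal area dividing section 131 can be sketched as a simple quadrant split of the sensor frame, assuming the 2x2 equal-area layout of the regions 101A to 101D described above:

```python
import numpy as np

def divide_into_regions(frame: np.ndarray) -> dict:
    """Split one captured frame into the four quadrant images A to D,
    assuming regions 101A to 101D form a simple 2x2 equal-area layout."""
    h, w = frame.shape
    return {
        "A": frame[: h // 2, : w // 2],  # upper left
        "B": frame[: h // 2, w // 2 :],  # upper right
        "C": frame[h // 2 :, : w // 2],  # lower left
        "D": frame[h // 2 :, w // 2 :],  # lower right
    }

frame = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 sensor frame
regions = divide_into_regions(frame)
print(regions["A"])  # upper-left 2x2 block
```

The quadrant labels and layout are an assumption for illustration; the actual correspondence between sensor regions and filter regions follows the device geometry.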

The image reconstructing section 132 solves the system of equations as explained with reference to fig. 2 and equations (1) to (3) for each of the four captured images a to D divided into the regions 101A to 101D by the signal region dividing section 131, respectively, to thereby reconstruct the captured images a to D into final images a to D, and outputs the reconstructed final images a to D to the image integrating section 133.

More specifically, the image reconstruction section 132 includes image reconstruction processing sections 151A to 151D. The image reconstruction processing sections 151A to 151D reconstruct the final images a to D from the captured images a to D of the areas 101A to 101D divided by the signal area dividing section 131, respectively, and output the final images a to D to the image integrating section 133.

The image integration section 133 integrates the final images a to D, which are reconstructed from the captured images a to D of the regions 101A to 101D of the solid-state image pickup device 101 and supplied from the image reconstruction processing sections 151A to 151D of the image reconstruction section 132, by superimposing them to form a single image. The image integration section 133 then outputs the single image to the display section 124 to be displayed there, or outputs it to the storage section 125 to be stored.
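The integration step can be sketched as a per-pixel superposition; the plain average used here is one simple choice, since the source only requires that the co-registered final images be combined into a single image:

```python
import numpy as np

def integrate_final_images(finals: dict) -> np.ndarray:
    """Integrate the reconstructed final images A to D into one image.

    A per-pixel average is one simple way of superimposing co-registered
    final images of the same spatial range."""
    return np.stack(list(finals.values())).mean(axis=0)

# Toy final images: constant 2x2 images with values 1, 2, 3, 4.
finals = {name: np.full((2, 2), v, dtype=float)
          for name, v in zip("ABCD", (1, 2, 3, 4))}
print(integrate_final_images(finals))  # every pixel is 2.5
```

A weighted average (for example, weighting bands by signal-to-noise ratio) would also satisfy the same role; the equal-weight average is the minimal version.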

< image capturing processing >

Next, an image capturing process performed by the lens-less image capturing apparatus 111 in fig. 4 will be described with reference to the flowchart in fig. 8.

In step S11, each of the regions 103A to 103D of the band-pass filter 103 transmits light in a corresponding one of the wavelength bands ZA to ZD of the incident light explained with reference to fig. 5.

In step S12, each of the regions 102A to 102D of the mask 102 modulates incident light with the opening portions 102a to 102d having the corresponding unit size Δ, and makes the incident light enter the image pickup face of the solid-state image pickup device 101.

In step S13, each of the regions 101A to 101D of the solid-state image pickup device 101 picks up a corresponding one of the picked-up images a to D formed with light modulated by being transmitted through the regions 102A to 102D of the mask 102, and outputs the picked-up images a to D to the signal processing section 123 as a single picked-up image.

In step S14, the signal area dividing section 131 of the signal processing section 123 divides the captured image supplied from the solid-state image pickup device 101 of the image pickup device 122 into the captured images a to D of the areas 101A to 101D, respectively, and outputs the captured images a to D to the image reconstruction processing sections 151A to 151D of the image reconstruction section 132, respectively.

In step S15, the image reconstruction processing sections 151A to 151D reconstruct the captured images a to D captured by the regions 101A to 101D of the solid-state image capturing device 101, respectively, by the processing described with reference to fig. 2 and equations (1) to (3), generate final images a to D, and output the final images a to D to the image integrating section 133.

In step S16, the image integration section 133 integrates final images a to D obtained from the four captured images a to D captured respectively by the regions 101A to 101D in such a manner as to superimpose the final images a to D, and outputs the final images a to D as a single final image.

Through the above-described processing, light is modulated by the opening portions and the light blocking portions having the sizes corresponding to the unit size Δ of the respective regions 102A to 102D of the mask 102 according to the wavelength band of incident light. Thereby, the photographed images a to D of the respective wavelength bands can be photographed in a state where the influence of diffraction is minimized, and the final images a to D can be reconstructed from the photographed images a to D that have been photographed. Accordingly, the influence of blur due to diffraction in the reconstructed final images a to D can be reduced, and a final image with high spatial resolution can also be generated.

In particular, since the influence of diffraction of light in the long wavelength band is large, it is possible to effectively suppress blurring due to the influence of diffraction of light in the long wavelength band.

For example, in far-infrared image sensing (thermal imaging sensing) known to be capable of taking temperature information as an image, the temperature of a target is determined from the peak wavelength of integrated light in a wide range of a long wavelength band. More specifically, the range of the wavelength band to be sensed is generally considered to be about 8 μm to 14 μm. In this case, the longest wavelength of the sensing target is about twice the shortest wavelength of the sensing target.

In addition, in recent years, there are many image capturing systems that capture images from visible light and near-infrared light at the same time, but also in this case, blue light has a shortest wavelength peak of 450nm, near-infrared light has a longest wavelength peak of 800nm, and therefore the longest wavelength of near-infrared light is approximately twice the shortest wavelength of blue light.

In this way, in the case where light having a wide range of wavelength bands is integrated and used for exposure, it becomes difficult to attempt optimization for reducing the influence of blur due to diffraction of the entire region of the wide range of wavelength bands. For example, in the case of a modulation designed to reduce the point spread on the shorter wavelength side, the point spread on the longer wavelength side increases, and the spatial resolution of the image decreases. The same applies similarly to the opposite case, and it becomes difficult to restore an image with high spatial resolution.

In view of this, for example, in the case of performing image sensing capable of capturing temperature information as an image, the wavelength λ 1 mentioned in fig. 5 is set to 8 μm, α is set to 1.5 μm, and the wavelength bands ZA to ZD are set to 8 μm to 9.5 μm, 9.5 μm to 11 μm, 11 μm to 12.5 μm, and 12.5 μm to 14 μm, respectively.
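The band edges in this far-infrared example follow directly from λ 1 = 8 μm and α = 1.5 μm:

```python
# Band edges for the far-infrared example: lambda1 = 8 um, alpha = 1.5 um,
# so the bands ZA to ZD tile the 8 um to 14 um sensing range.
lambda1_um = 8.0
alpha_um = 1.5
bands = {
    name: (lambda1_um + i * alpha_um, lambda1_um + (i + 1) * alpha_um)
    for i, name in enumerate(("ZA", "ZB", "ZC", "ZD"))
}
for name, (lo, hi) in bands.items():
    print(f"{name}: {lo} um to {hi} um")
```

Each band then spans at most a ratio of 14/12.5 ≈ 1.12 between its longest and shortest wavelength, instead of the ratio of about 2 for the full 8 μm to 14 μm range, which is what allows a single unit size per region to work well.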

In addition, the above-mentioned wavelength bands ZA to ZD may also be set to other wavelength bands, and may be set to RGB (red, green, blue visible light) and IR (infrared light), for example.

Further, in the example explained above, each of the band-pass filter 103, the mask 102, and the solid-state image pickup device 101 is divided into four regions according to the wavelength band, the images a to D are captured in the respective regions, and the final images a to D are reconstructed by signal processing from the captured images a to D and integrated into a single final image. However, the number of divided regions may be any number other than four, and the band-pass filter 103, the mask 102, and the solid-state image pickup device 101 may be divided into a larger number of regions to the extent that such division does not cause a significant reduction in resolution. In addition, since the final images a to D are ultimately integrated, it is desirable that the sizes of the respective regions be approximately the same; however, they need not be exactly the same, and need only be large enough that the spatial range required in the final image is covered by the captured images as a whole.

Further, when integrating a plurality of images, the image integrating section 133 may integrate the plurality of images after performing parallax correction according to the positions of the images captured in the plurality of regions in the solid-state image capturing device 101. Thereby, the spatial resolution can be improved.
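A minimal sketch of such a parallax correction, assuming the per-region disparities have already been estimated as integer pixel shifts (how they are estimated is outside this sketch):

```python
import numpy as np

def correct_parallax(image: np.ndarray, shift_rows: int, shift_cols: int) -> np.ndarray:
    """Shift one region's final image by its estimated disparity before
    integration; the integer shifts are hypothetical inputs."""
    return np.roll(image, shift=(shift_rows, shift_cols), axis=(0, 1))

img = np.eye(3)
aligned = correct_parallax(img, 1, 0)  # shift down by one row
print(aligned)
```

Real parallax depends on subject distance and would generally involve sub-pixel interpolation rather than the integer wrap-around shift used here for brevity.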

In addition, although in the example described above the image integration section 133 integrates the plurality of reconstructed final images into a single image by superimposing them, any processing that ultimately generates a single image by using the plurality of final images may be regarded as integration.

Thus, for example, the image integrating section 133 can integrate a plurality of final images by selecting a single final image from among them. The single final image to be selected here may be, for example, the image least affected by diffraction.

In addition, the image integration section 133 may, for example, select some of the final images and integrate them by superimposing the selected images. The final images to be selected here may be, for example, those relatively less affected by diffraction.
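One way to rank the final images by how strongly diffraction blur affects them is a sharpness score; the variance-of-Laplacian metric below is a hypothetical choice for illustration, not a metric specified in the source:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Hypothetical sharpness score: variance of a discrete Laplacian.
    A more diffraction-blurred image scores lower."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

def select_least_diffracted(finals: dict, k: int = 1) -> list:
    """Return the names of the k final images least affected by blur."""
    ranked = sorted(finals, key=lambda name: sharpness(finals[name]), reverse=True)
    return ranked[:k]

sharp = np.zeros((8, 8)); sharp[4, 4] = 1.0       # point-like image
blurry = np.full((8, 8), 1.0 / 64.0)              # fully spread image
print(select_least_diffracted({"A": sharp, "B": blurry}))  # ['A']
```

With k = 1 this realizes the single-image selection described above; with k > 1, the selected subset can then be superimposed.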

<4. second embodiment >

In the example explained above, each of the band-pass filter 103, the mask 102, and the solid-state image pickup device 101 is divided into a plurality of corresponding regions, a wavelength band of incident light to be transmitted by the band-pass filter 103 is set for each of the divided regions, and the incident light is modulated by the mask 102, whose opening portions each have a unit size corresponding to the wavelength band of the corresponding region. Thereby, a captured image is captured in each region with the influence of diffraction reduced, a final image is reconstructed from the captured image of each region, and the final images of the respective regions are integrated, so that a single final image with high spatial resolution is obtained while the influence of blur due to diffraction is reduced.

However, the degree of influence of diffraction may also be adjusted by the distance between the mask 102 and the solid-state image pickup device 101 according to the wavelength band of the incident light. In view of this, the opening portion of the mask 102 may be formed to have the same unit size Δ in all regions, and the distance between the mask 102 and the solid-state image pickup device 101 may be varied for each region according to the wavelength band of incident light to reduce the influence of diffraction.

Fig. 9 shows a configuration example of the image pickup device 122 of the lens-less image pickup apparatus 111, in which the opening portion of the mask 102 is formed to have the same unit size Δ in all regions, and the distance between the mask 102 and the solid-state image pickup device 101 is changed for each region according to the wavelength band of incident light to reduce the influence of diffraction.

Note that configurations in fig. 9 that are functionally the same as those in fig. 3 have the same reference numerals, and description thereof is appropriately omitted. That is, fig. 9 is different from fig. 3 in that regions 102A 'to 102D' and regions 103A 'to 103D' are set instead of regions 102A to 102D of the mask 102 and regions 103A to 103D of the band-pass filter 103. In addition, although the description of the mask patterns for the regions 102A 'and 102B' is omitted in fig. 9, the omission is merely for convenience of explanation, and mask patterns having the same unit size are actually provided to the regions 102A 'to 102D'.

In fig. 9, distances between the regions 101A to 101D of the solid-state image pickup device 101 and the regions 102A 'to 102D' of the mask 102 are respectively different from each other.

That is, in fig. 9, the distance between the region 101A of the solid-state image pickup device 101 and the region 102A 'of the mask 102 is a distance GapA, the distance between the region 101B of the solid-state image pickup device 101 and the region 102B' of the mask 102 is a distance GapB, the distance between the region 101C of the solid-state image pickup device 101 and the region 102C 'of the mask 102 is a distance GapC, the distance between the region 101D of the solid-state image pickup device 101 and the region 102D' of the mask 102 is a distance GapD, and GapA to GapD are distances different from each other.

This is because the degree of influence of diffraction can be adjusted according to the wavelength band of incident light by the distance between the mask 102 and the solid-state image pickup device 101.
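The wavelength-dependent choice of gap can be sketched with a common rule of thumb (an assumption for illustration, not the computation used in the present disclosure): keep the Fresnel number Δ²/(λd) near a target value, which for a fixed unit size Δ gives a gap inversely proportional to the wavelength:

```python
import math

def gap_for_unit_size(unit_size_m: float, wavelength_m: float,
                      fresnel_number: float = 1.0) -> float:
    """Mask-to-sensor gap for a fixed unit size delta.

    Assumed rule of thumb (not from the source): choose the gap so that
    the Fresnel number delta**2 / (wavelength * gap) equals the target,
    i.e. gap = delta**2 / (fresnel_number * wavelength).
    """
    return unit_size_m ** 2 / (fresnel_number * wavelength_m)

delta = 100e-6  # hypothetical common unit size of 100 um for all regions
for lam_um in (8.0, 9.5, 11.0, 12.5):
    gap_mm = gap_for_unit_size(delta, lam_um * 1e-6) * 1e3
    print(f"{lam_um} um -> gap {gap_mm:.2f} mm")
```

This reproduces the qualitative behavior of fig. 10, where each wavelength band has its own gap GapA to GapD at which the spot is sharpest; the actual optimum gaps in the source are determined empirically.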

For example, in the case where the unit size Δ for adjusting the opening portions of the mask 102 is a predetermined same value for all the areas, and the wavelength of incident light is the wavelength λ 1 as shown in the uppermost row in fig. 10, when the distance between the mask 102 and the solid-state image pickup device 101 is changed from G1 to G1+11 γ, the picked-up images of incident light passing through one opening portion picked up by the solid-state image pickup device 101 are changed as shown in picked-up images F101 to F108.

If diffraction occurs due to the opening portion, an incident light image formed with the incident light does not become a dot-like image on the image pickup surface of the solid-state image pickup device 101 but becomes an image that exhibits scattered and diffused light, and further, the magnitude of the diffusion varies depending on the degree of influence of the diffraction.

That is, in the case where the wavelength of incident light in the uppermost row in fig. 10 is λ1, the image F106 (the distance between the mask 102 and the solid-state image pickup device 101 being G1+9γ) exhibits the brightest spot with the smallest diameter, and the influence of diffraction can be considered to be minimized. As the distance between the mask 102 and the solid-state image pickup device 101 decreases, as shown in images F105 to F101, or increases, as shown in image F107, the spot near the center spreads, and the influence of diffraction gradually increases.

In addition, as shown in the second row in fig. 10, in the case where the wavelength of incident light is λ1+α, when the distance between the mask 102 and the solid-state image pickup device 101 is changed from G1 to G1+11γ, the images of incident light passing through one opening portion captured by the solid-state image pickup device 101 change as shown in captured images F111 to F118.

That is, in the case where the wavelength of incident light in the second row in fig. 10 is λ1+α, the captured image F115 (the distance between the mask 102 and the solid-state image pickup device 101 being G1+7γ) exhibits the brightest spot with the smallest diameter, and the influence of diffraction can be considered to be minimized.

In addition, as shown in the third row in fig. 10, in the case where the wavelength of incident light is λ1+2α, when the distance between the mask 102 and the solid-state image pickup device 101 is changed from G1 to G1+11γ, the images of incident light passing through one opening portion captured by the solid-state image pickup device 101 change as shown in captured images F121 to F128.

That is, in the case where the wavelength of incident light in the third row in fig. 10 is λ1+2α, the captured image F124 (the distance between the mask 102 and the solid-state image pickup device 101 being G1+5γ) exhibits the brightest spot with the smallest diameter, and the influence of diffraction can be considered to be minimized.

In addition, as shown in the fourth row in fig. 10, in the case where the wavelength of incident light is λ1+3α, when the distance between the mask 102 and the solid-state image pickup device 101 is changed from G1 to G1+11γ, the images of incident light passing through one opening portion captured by the solid-state image pickup device 101 change as shown in captured images F131 to F138.

That is, in the case where the wavelength of incident light in the fourth row in fig. 10 is λ1+3α, the captured image F133 (the distance between the mask 102 and the solid-state image pickup device 101 being G1+3γ) exhibits the brightest spot with the smallest diameter, and the influence of diffraction can be considered to be minimized.

In view of this, when the regions 103A' to 103D' of the band-pass filter 103 in fig. 9 transmit light in the wavelength bands ZA to ZD shown in fig. 5, respectively, setting the distances GapA to GapD between the regions 102A' to 102D' of the mask 102 and the corresponding regions 101A to 101D of the solid-state image pickup device 101 to G1+9γ, G1+7γ, G1+5γ, and G1+3γ, respectively, makes it possible to modulate the incident light and cause it to enter the solid-state image pickup device 101 in a state in which diffraction is minimized for each wavelength band.
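The selection above amounts to reading, for each wavelength band, the gap whose spot is smallest off the image series in fig. 10. A minimal sketch of that selection, using purely illustrative spot-diameter numbers (not values from the disclosure):

```python
# Hedged sketch (illustrative numbers only): for each wavelength band, pick
# the mask-sensor gap whose measured spot diameter is smallest, mirroring how
# the gaps GapA to GapD are read off the image series F101-F138 in fig. 10.
def pick_gap(measurements):
    """measurements: list of (gap_index, spot_diameter) pairs, where the
    actual gap is G1 + gap_index * gamma; return the gap_index of the
    smallest (least diffracted) spot."""
    return min(measurements, key=lambda m: m[1])[0]

# Hypothetical spot diameters for twelve gaps G1 + n*gamma, n = 0..11.
series = {
    "ZA": list(enumerate([10, 9, 8, 7, 6, 5, 4, 3, 2, 0.5, 1, 2])),
    "ZB": list(enumerate([9, 8, 7, 6, 5, 4, 3, 0.5, 1, 2, 3, 4])),
}
best = {band: pick_gap(s) for band, s in series.items()}
print(best)  # with these made-up numbers: n = 9 for ZA, n = 7 for ZB
```

With these made-up diameters the selection reproduces the G1+9γ and G1+7γ choices for bands ZA and ZB described in the text.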

Therefore, the solid-state image pickup device 101 can capture four images, one for each wavelength band, in each of which the influence of blur due to diffraction is minimized for that band.

Note that since image capturing processing at the lens-less image capturing apparatus 111 by using the image capturing device 122 having the configuration described with reference to fig. 9 is similar to the processing described with reference to the flowchart in fig. 8, description of the image capturing processing is omitted.

In addition, since the band-pass filter 103, the mask 102, and the solid-state image pickup device 101 need only be configured so as to minimize the influence of diffraction for the wavelength band to be transmitted, both the unit size of the mask 102 and the distance between the mask 102 and the solid-state image pickup device 101 may be adjusted for each region that transmits the same wavelength band.

<5. third embodiment >

In the examples explained above, each of the band-pass filter 103, the mask 102, and the solid-state image pickup device 101 is divided into a plurality of regions according to the wavelength band of light transmitted by the band-pass filter 103. In each region, the light is modulated by the mask 102 so that the influence of diffraction is minimized for that region's wavelength band, and is then captured by the solid-state image pickup device 101; a final image is reconstructed from each region's captured image, and the final images are integrated into a single final image.
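The divide-reconstruct-integrate flow described above can be sketched in miniature as follows. All function names, the strip-wise division, the toy "reconstruction" weights, and the pixel values are illustrative assumptions, not the actual reconstruction of the disclosure (which solves for the scene from the mask's modulation):

```python
# Hedged end-to-end sketch of the flow: split the sensor signal per region,
# reconstruct each region with its own pre-calibrated weights (a stand-in for
# solving y = Mx with the region's mask matrix), then integrate the results.
def split_regions(signal, n_regions):
    """Divide a 2-D pixel signal (list of rows) into equal row strips."""
    rows_per = len(signal) // n_regions
    return [signal[i * rows_per:(i + 1) * rows_per] for i in range(n_regions)]

def reconstruct(region, weights):
    """Toy per-region reconstruction: weighted sum of each column's rows."""
    return [sum(w * px for w, px in zip(weights, col))
            for col in zip(*region)]

def integrate(final_images):
    """Integrate per-band final images by superimposing (pixel-wise average)."""
    return [sum(px) / len(final_images) for px in zip(*final_images)]

signal = [[1, 2], [3, 4], [5, 6], [7, 8]]      # 4 rows -> 2 regions of 2 rows
regions = split_regions(signal, 2)
finals = [reconstruct(r, [0.5, 0.5]) for r in regions]
print(integrate(finals))  # prints [4.0, 5.0]
```

The three helpers correspond to the signal region dividing section 131, the image reconstruction sections, and the image integration section 133 in the reference numerals list.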

However, near the boundaries between the regions, light in different wavelength bands inevitably undergoes color mixing, and the influence of diffraction may not be appropriately reduced there.

In view of this, by providing light-shielding walls at the boundaries of the divided regions, color mixing of light in different wavelength bands can be prevented, making it possible to appropriately reduce the influence of diffraction in each wavelength band region.

Fig. 11 shows a configuration example of the image pickup device 122 of the lens-less image pickup apparatus 111 in which a light shielding wall is provided at a boundary between divided regions. Note that configurations in fig. 11 that are functionally the same as those in fig. 3 have the same reference numerals, and a description thereof is omitted.

That is, fig. 11 differs from the image pickup device of the lens-less image pickup apparatus in fig. 3 in that light-shielding walls 201-1 and 201-2 are provided at the boundaries between the regions 103A to 103D of the band-pass filter 103, between the regions 102A to 102D of the mask 102, and between the regions 101A to 101D of the solid-state image pickup device 101.

By providing the light-shielding walls 201-1 and 201-2 at the boundaries between the regions 103A to 103D of the band-pass filter 103, between the regions 102A to 102D of the mask 102, and between the regions 101A to 101D of the solid-state image pickup device 101, color mixing of light from adjacent wavelength bands near the boundaries between the regions can be prevented, and the influence of diffraction in each region can be reliably reduced.

Accordingly, the image captured with incident light in each region's wavelength band can be used with the influence of blur due to diffraction reduced, and thus the spatial resolution of the final image reconstructed from the captured images can be improved.

Note that in this specification, a system refers to a set of a plurality of constituent elements (a device, a module (portion), and the like), and it is not important whether all the constituent elements are contained in a single housing. Therefore, both a plurality of devices accommodated in separate housings and connected via a network and one device in which a plurality of modules are accommodated in one housing are systems.

In addition, the embodiments of the present disclosure are not limited to the above-mentioned embodiments, and may be changed in various ways within a scope not departing from the gist of the present disclosure.

Further, each step illustrated in the above-mentioned flowcharts may be executed not only by one apparatus but also by a plurality of apparatuses by being shared among the plurality of apparatuses.

In addition, in the case where one step includes a plurality of processes, not only the plurality of processes included in one step may be executed by one apparatus, but also the plurality of processes included in one step may be executed by a plurality of apparatuses by sharing among the plurality of apparatuses.

Note that the present disclosure may also have the following configuration.

<1>

An image capturing apparatus comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band;

a mask that is divided in correspondence with the plurality of regions and modulates incident light in the different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter;

a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal; and

a signal processing section that reconstructs the two-dimensional pixel signal captured by the solid-state image capturing device into a final image through signal processing.

<2>

The image photographing apparatus according to <1>, wherein,

the mask is a mask pattern having a unit size different according to each of the plurality of regions.

<3>

The image photographing apparatus according to <2>, wherein,

the mask pattern of each of the plurality of regions is a mask pattern having a different unit size based on a wavelength band of incident light transmitted through the band pass filter.

<4>

The image photographing apparatus according to <2>, wherein,

the mask pattern of each of the plurality of regions is a mask pattern having a unit size such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image pickup device, diffusion of the incident light due to diffraction is substantially minimized.

<5>

The image photographing apparatus according to <1>, wherein,

the distance from the mask to the image pickup face of the solid-state image pickup device is a distance different for each of the plurality of regions.

<6>

The image photographing apparatus according to <5>, wherein,

the distance from the mask to an image pickup face of the solid-state image pickup device is a distance different based on a wavelength band of incident light transmitted through each of the plurality of regions of the band-pass filter.

<7>

The image photographing apparatus according to <6>, wherein,

the distance from the mask to the image pickup face of the solid-state image pickup device is, for each of the plurality of regions, a distance such that, when incident light in the wavelength band transmitted through the band-pass filter is captured by the solid-state image pickup device, diffusion of the incident light due to diffraction is substantially minimized.

<8>

The image photographing apparatus according to <5>, wherein,

the mask is a mask pattern having the same unit size for all of the plurality of regions.

<9>

The image capturing apparatus according to <1>, further comprising:

a light-shielding wall that blocks incident light from an adjacent region at the boundaries between the plurality of regions of the band-pass filter, of the mask, and of the solid-state image pickup device.

<10>

The image photographing apparatus according to <1>, wherein,

the signal processing section includes:

a dividing section that divides two-dimensional pixel signals captured by the solid-state image capturing device in association with the plurality of areas;

a plurality of image reconstruction sections that reconstruct each pixel signal obtained by dividing the two-dimensional pixel signal into a final image by signal processing; and

an integrating unit that integrates the final images reconstructed by the plurality of image reconstructing units.

<11>

The image photographing apparatus according to <10>, wherein,

the integration unit integrates the final images reconstructed by the plurality of image reconstruction units by superimposing the final images.

<12>

The image photographing apparatus according to <10>, wherein,

the integration section performs the integration by selecting one of the final images reconstructed by the plurality of image reconstruction sections.

<13>

The image photographing apparatus according to <10>, wherein,

the integration section selects at least two of the final images reconstructed by the plurality of image reconstruction sections and integrates the selected at least two final images in such a manner that the selected at least two final images are superimposed.

<14>

The image photographing apparatus according to <1>, wherein,

a minute gap is formed between the solid-state image pickup device and the mask in the incident direction of the incident light.

<15>

The image photographing apparatus according to <1>, wherein,

the image capturing apparatus does not include a lens that focuses the incident light onto any of the band-pass filter, the mask, or the solid-state image pickup device.

<16>

The image photographing apparatus according to <1>, wherein,

the wavelength band of the incident light is about 8 μm to about 14 μm.

<17>

An image capturing method of an image capturing apparatus, the image capturing apparatus comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band,

a mask which is divided in correspondence with the plurality of regions and modulates incident light in different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter, and

a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal, the image pickup method including:

signal processing for reconstructing the two-dimensional pixel signals photographed by the solid-state image photographing device into a final image by signal processing.

<18>

An image pickup device comprising:

a band-pass filter divided into a plurality of regions, each region transmitting incident light in a different wavelength band;

a mask that is divided in correspondence with the plurality of regions and modulates incident light in different wavelength bands transmitted through respective ones of the plurality of regions of the band-pass filter; and

a solid-state image pickup device that has an image pickup face divided in correspondence with the plurality of regions, and picks up incident light modulated by each of the plurality of regions through the mask as a two-dimensional pixel signal.

List of reference numerals

101 solid-state image pickup device, 101A to 101D, 101A' to 101D' regions, 102 mask, 102A to 102D, 102A' to 102D' regions, 103 band-pass filter, 103A to 103D, 103A' to 103D' regions, 111 lens-less image pickup apparatus, 121 control section, 122 image pickup device, 123 signal processing section, 124 display section, 125 storage section, 131 signal region dividing section, 132 image reconstruction section, 133 image integration section, 151A to 151D image reconstruction processing sections.
