Image processing device, imaging device, image processing method, and program

Document No.: 1713833    Publication date: 2019-12-13

Reading note: This technique, "Image processing device, imaging device, image processing method, and program," was devised by Shuji Ono on 2018-04-04. Abstract: An object of the present invention is to provide an image processing device, an imaging device, an image processing method, and a program, the image processing device being capable of generating an image in which the influence of reflected light on a water surface is suppressed over a wide range. An image processing device (500) includes: an image acquisition unit (501) that acquires a plurality of frame images of the same scene including a water surface having waves; a luminance value calculation unit (503) that calculates, from the plurality of frame images, luminance values corresponding to the minute regions constituting each frame image; a pixel value extraction unit (505) that extracts, from among the plurality of frame images, the pixel values corresponding to the minute regions of the frame image whose luminance value is smaller than that of the corresponding minute regions of the other frame images; a storage unit (507) that stores the pixel values extracted by the pixel value extraction unit (505); and a synthetic image generation unit (509) that generates a synthetic image corresponding to the scene from the pixel values stored in the storage unit (507).

1. An image processing apparatus comprising:

an image acquisition unit that acquires a plurality of frame images of the same scene including a water surface having waves;

a luminance value calculation unit that calculates, from the plurality of frame images, luminance values corresponding to minute regions constituting each of the frame images;

a pixel value extraction unit that extracts, from among the plurality of frame images, the pixel values corresponding to the minute regions of a frame image whose luminance value is smaller than the luminance value of the corresponding minute region of the other frame images;

a storage unit that stores the pixel values extracted by the pixel value extraction unit; and

a synthetic image generation unit that generates a synthetic image corresponding to the scene from the pixel values stored in the storage unit.

2. An image processing apparatus comprising:

an image acquisition unit that acquires images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is imaged, each frame having a 1st image and a 2nd image, the 1st image being based on a 1st wavelength band and the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band;

a pixel value calculation unit that calculates, for each minute region in each of the frames, the sum of the pixel value of the 1st image and the pixel value of the 2nd image, based on the 1st image and the 2nd image of the plurality of frames;

a ratio or difference calculation unit that calculates the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the sum of the pixel values of the 1st image and the 2nd image in that minute region is smallest among the plurality of frames;

a storage unit that stores the ratio or the difference calculated by the ratio or difference calculation unit;

a water quality data calculation unit that calculates water quality data of the water quality inspection target based on the ratio or the difference stored in the storage unit; and

a water quality distribution image generation unit that generates a water quality distribution image representing a water quality distribution of the water quality inspection target based on the water quality data calculated by the water quality data calculation unit.

3. An image processing apparatus comprising:

an image acquisition unit that acquires images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image, a 2nd image, and a 3rd image, the 1st image being based on a 1st wavelength band, the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band, and the 3rd image being based on a 3rd wavelength band including the 1st wavelength band and the 2nd wavelength band;

a pixel value acquisition unit that acquires, from the 3rd images of the plurality of frames, pixel values corresponding to minute regions constituting each of the 3rd images;

a ratio or difference calculation unit that calculates the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the pixel value of that minute region of the 3rd image is smallest among the plurality of frames;

a storage unit that stores the ratio or the difference calculated by the ratio or difference calculation unit;

a water quality data calculation unit that calculates water quality data of the water quality inspection target based on the ratio or the difference stored in the storage unit; and

a water quality distribution image generation unit that generates a water quality distribution image representing a water quality distribution of the water quality inspection target based on the water quality data calculated by the water quality data calculation unit.

4. The image processing apparatus according to claim 2 or 3,

wherein the 1st wavelength band is a wavelength band including 670 nm and the 2nd wavelength band is a wavelength band including 700 nm,

the water quality data calculation unit calculates a concentration of chlorophyll a as the water quality data of the water quality inspection target from the 1st image and the 2nd image acquired by the image acquisition unit, and

the water quality distribution image generation unit generates a concentration distribution image representing the calculated concentration distribution of chlorophyll a as the water quality distribution image.

5. The image processing apparatus according to claim 3,

wherein the 3rd wavelength band is a wavelength band of visible light.

6. The image processing apparatus according to any one of claims 1 to 5,

wherein the minute region is a region of one pixel.

7. The image processing apparatus according to any one of claims 1 to 6,

wherein the waves are artificially generated waves.

8. An imaging apparatus provided with the image processing apparatus according to any one of claims 1 to 7.

9. An imaging device provided with the image processing device according to claim 2, the imaging device comprising:

an imaging optical system including an imaging lens, and a 1st optical filter and a 2nd optical filter corresponding to a 1st region and a 2nd region of the imaging lens, respectively, the 1st optical filter transmitting light of the 1st wavelength band and the 2nd optical filter transmitting light of the 2nd wavelength band; and

an orientation sensor that has a plurality of pixels including two-dimensionally arranged photoelectric conversion elements, and that pupil-divides and selectively receives the light beams entering through the 1st filter and the 2nd filter of the imaging optical system.

10. An imaging device provided with the image processing device according to claim 3, the imaging device comprising:

an imaging optical system including an imaging lens, and 1st, 2nd, and 3rd optical filters corresponding to 1st, 2nd, and 3rd regions of the imaging lens, respectively, the 1st optical filter transmitting light of the 1st wavelength band, the 2nd optical filter transmitting light of the 2nd wavelength band, and the 3rd optical filter transmitting light of the 3rd wavelength band; and

an orientation sensor that has a plurality of pixels including two-dimensionally arranged photoelectric conversion elements, and that pupil-divides and selectively receives the light beams entering through the 1st filter, the 2nd filter, and the 3rd filter of the imaging optical system, respectively.

11. The imaging apparatus according to any one of claims 8 to 10,

wherein the imaging apparatus is fixed at a fixed point to photograph the same scene including the water surface having the waves.

12. The imaging apparatus according to any one of claims 8 to 11,

wherein the imaging apparatus comprises:

a polarizing filter that passes a light beam from the same scene including the water surface having the waves.

13. An image processing method comprising the steps of:

an image acquisition step of acquiring a plurality of frame images of the same scene including a water surface having waves;

a luminance value calculation step of calculating, from the plurality of frame images, luminance values corresponding to minute regions constituting each of the frame images;

a pixel value extraction step of extracting, from among the plurality of frame images, the pixel values corresponding to the minute regions of a frame image whose luminance value is smaller than the luminance value of the corresponding minute region of the other frame images;

a storage step of storing the pixel values extracted in the pixel value extraction step; and

a synthetic image generation step of generating a synthetic image corresponding to the scene from the pixel values stored in the storage step.

14. An image processing method comprising the steps of:

an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image and a 2nd image, the 1st image being based on a 1st wavelength band and the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band;

a pixel value calculation step of calculating, for each minute region in each of the frames, the sum of the pixel value of the 1st image and the pixel value of the 2nd image, based on the 1st image and the 2nd image of the plurality of frames;

a ratio or difference calculation step of calculating the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the sum of the pixel values of the 1st image and the 2nd image in that minute region is smallest among the plurality of frames;

a storage step of storing the ratio or the difference calculated in the ratio or difference calculation step;

a water quality data calculation step of calculating water quality data of the water quality inspection target based on the ratio or the difference stored in the storage step; and

a water quality distribution image generation step of generating a water quality distribution image representing a water quality distribution of the water quality inspection target from the water quality data calculated in the water quality data calculation step.

15. An image processing method comprising the steps of:

an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image, a 2nd image, and a 3rd image, the 1st image being based on a 1st wavelength band, the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band, and the 3rd image being based on a 3rd wavelength band including the 1st wavelength band and the 2nd wavelength band;

a pixel value acquisition step of acquiring, from the 3rd images of the plurality of frames, pixel values corresponding to minute regions constituting each of the 3rd images;

a ratio or difference calculation step of calculating the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the pixel value of that minute region of the 3rd image is smallest among the plurality of frames;

a storage step of storing the ratio or the difference calculated in the ratio or difference calculation step;

a water quality data calculation step of calculating water quality data of the water quality inspection target based on the ratio or the difference stored in the storage step; and

a water quality distribution image generation step of generating a water quality distribution image representing a water quality distribution of the water quality inspection target from the water quality data calculated in the water quality data calculation step.

16. The image processing method according to any one of claims 13 to 15,

wherein the image processing method comprises a wave generation step of artificially generating the waves.

17. A program for causing a computer to execute image processing comprising the steps of:

an image acquisition step of acquiring a plurality of frame images of the same scene including a water surface having waves;

a luminance value calculation step of calculating, from the plurality of frame images, luminance values corresponding to minute regions constituting each of the frame images;

a pixel value extraction step of extracting, from among the plurality of frame images, the pixel values corresponding to the minute regions of a frame image whose luminance value is smaller than the luminance value of the corresponding minute region of the other frame images;

a storage step of storing the pixel values extracted in the pixel value extraction step; and

a synthetic image generation step of generating a synthetic image corresponding to the scene from the pixel values stored in the storage step.

18. A program for causing a computer to execute image processing comprising the steps of:

an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image and a 2nd image, the 1st image being based on a 1st wavelength band and the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band;

a pixel value calculation step of calculating, for each minute region in each of the frames, the sum of the pixel value of the 1st image and the pixel value of the 2nd image, based on the 1st image and the 2nd image of the plurality of frames;

a ratio or difference calculation step of calculating the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the sum of the pixel values of the 1st image and the 2nd image in that minute region is smallest among the plurality of frames;

a storage step of storing the ratio or the difference calculated in the ratio or difference calculation step;

a water quality data calculation step of calculating water quality data of the water quality inspection target based on the ratio or the difference stored in the storage step; and

a water quality distribution image generation step of generating a water quality distribution image representing a water quality distribution of the water quality inspection target from the water quality data calculated in the water quality data calculation step.

19. A program for causing a computer to execute image processing comprising the steps of:

an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image, a 2nd image, and a 3rd image, the 1st image being based on a 1st wavelength band, the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band, and the 3rd image being based on a 3rd wavelength band including the 1st wavelength band and the 2nd wavelength band;

a pixel value acquisition step of acquiring, from the 3rd images of the plurality of frames, pixel values corresponding to minute regions constituting each of the 3rd images;

a ratio or difference calculation step of calculating the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the pixel value of that minute region of the 3rd image is smallest among the plurality of frames;

a storage step of storing the ratio or the difference calculated in the ratio or difference calculation step;

a water quality data calculation step of calculating water quality data of the water quality inspection target based on the ratio or the difference stored in the storage step; and

a water quality distribution image generation step of generating a water quality distribution image representing a water quality distribution of the water quality inspection target from the water quality data calculated in the water quality data calculation step.

Technical Field

The present invention relates to an image processing apparatus, an imaging apparatus, an image processing method, and a program, and more particularly to a technique for acquiring an image in which the influence of reflected light from a water surface is suppressed.

Background

When an object in water is observed and imaged from land (in the air), the object sometimes cannot be imaged properly because of reflected light on the water surface: light from the object in water is buried in the light reflected at the surface and cannot be captured satisfactorily.

Conventionally, techniques have been proposed that suppress the influence of reflected light on the water surface by using a polarizing filter, exploiting the properties of Brewster's angle.

For example, Patent Document 1 describes a camera provided with a rotatable polarizing filter, in which a drive control unit rotates the polarizing filter so that the level of the video signal of the subject becomes minimum.

Prior Art Documents

Patent Documents

Patent Document 1: Japanese Laid-Open Patent Publication No. 10-145668

Disclosure of the Invention

Technical Problem to Be Solved by the Invention

However, a polarizing filter can effectively reject reflected light only near Brewster's angle. That is, a technique that suppresses reflected light with a polarizing filter alone cannot effectively suppress the influence of light incident at angles larger than Brewster's angle. Therefore, when the angle between the optical axis (the camera's line of sight) and the normal to the water surface becomes large, the polarizing filter no longer functions effectively, and the influence of reflected light on the water surface cannot be suppressed.

Even when a camera provided with a polarizing filter described in patent document 1 captures an image of a water surface, the influence of reflected light may not be suppressed satisfactorily at a position where the angle formed by the optical axis and the normal line of the water surface is large.
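As a rough numerical illustration of this limitation (not part of the patent): Brewster's angle for an air-to-water interface with refractive index n ≈ 1.33 is about 53°, and the unpolarized Fresnel reflectance grows quickly at larger incidence angles, which is exactly the regime a polarizing filter cannot clean up.

```python
import math

def brewster_angle(n1: float, n2: float) -> float:
    """Brewster's angle (degrees) for light passing from medium n1 into n2."""
    return math.degrees(math.atan2(n2, n1))

def fresnel_reflectance(theta_deg: float, n1: float = 1.0, n2: float = 1.33) -> float:
    """Unpolarized Fresnel reflectance at incidence angle theta_deg (degrees)."""
    ti = math.radians(theta_deg)
    tt = math.asin(n1 / n2 * math.sin(ti))  # refraction angle via Snell's law
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    rp = (n1 * math.cos(tt) - n2 * math.cos(ti)) / (n1 * math.cos(tt) + n2 * math.cos(ti))
    return 0.5 * (rs ** 2 + rp ** 2)  # average of s- and p-polarized reflectance

theta_b = brewster_angle(1.0, 1.33)  # about 53 degrees for air-to-water
```

At Brewster's angle the p-polarized component vanishes (which a polarizing filter can exploit), but at steeper grazing views the total reflectance rises and the filter alone no longer suffices.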

The present invention has been made in view of such circumstances, and an object thereof is to provide an image processing apparatus capable of generating an image in which an influence due to reflected light on a water surface is suppressed over a wide range, an imaging apparatus, an image processing method, and a program.

Means for solving the technical problem

An image processing apparatus according to an aspect of the present invention for achieving the above object includes: an image acquisition unit that acquires a plurality of frame images of the same scene including a water surface having waves; a luminance value calculation unit that calculates, from the plurality of frame images, luminance values corresponding to the micro regions constituting each of the frame images; a pixel value extracting unit that extracts pixel values corresponding to minute regions of a frame image having a luminance value smaller than the luminance value of the minute regions of another frame image, among the plurality of frame images; a storage unit that stores the pixel value extracted by the pixel value extraction unit; and a synthetic image generating unit that generates a synthetic image corresponding to the scene based on the pixel values stored in the storage unit.

According to this aspect, the pixel value extraction unit extracts, from the plurality of frame images, the pixel values corresponding to the minute regions of the frame image whose luminance value is smaller than that of the corresponding minute regions of the other frame images; the extracted pixel values are stored in the storage unit, and the synthetic image generation unit generates the synthetic image from the pixel values stored in the storage unit. That is, the synthetic image generated by the synthetic image generation unit is composed, for each minute region, of the pixel values whose luminance is the minimum across the frames.

Thus, in this aspect, an image can be generated in which the influence of reflected light is suppressed over a wide range of the captured water surface. That is, the synthetic image generated in this aspect clearly captures objects on or in the water, with the influence of reflected light on the water surface suppressed over a wide range.
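The per-region minimum-luminance selection described above can be sketched as follows (a minimal NumPy illustration; the Rec.601 luma weights and the array layout are assumptions for the sketch, not specified by the patent, which only requires some luminance value per minute region):

```python
import numpy as np

def composite_min_luminance(frames: np.ndarray) -> np.ndarray:
    """For each pixel, keep the RGB value from the frame with the lowest luminance.

    frames: array of shape (n_frames, height, width, 3).
    Specular glints raise luminance, so each pixel tends to be taken from a
    moment when the local wave slope directed the glint away from the camera.
    """
    # Rec.601 luma as a simple luminance proxy.
    luma = (0.299 * frames[..., 0]
            + 0.587 * frames[..., 1]
            + 0.114 * frames[..., 2])          # shape (n_frames, h, w)
    idx = np.argmin(luma, axis=0)              # frame index of minimum luminance per pixel
    h, w = idx.shape
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    return frames[idx, rows, cols]             # shape (h, w, 3)
```

Because a given pixel is glint-free in at least some frames of a wavy surface, the composite suppresses reflections even far from Brewster's angle.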

An image processing apparatus according to another aspect of the present invention includes: an image acquisition unit that acquires images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1st image and a 2nd image, the 1st image being based on a 1st wavelength band and the 2nd image being based on a 2nd wavelength band different from the 1st wavelength band; a pixel value calculation unit that calculates, for each minute region in each frame, the sum of the pixel value of the 1st image and the pixel value of the 2nd image, based on the 1st image and the 2nd image of the plurality of frames; a ratio or difference calculation unit that calculates the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to the minute region in the frame in which the sum of those pixel values is smallest among the plurality of frames; a storage unit that stores the ratio or difference calculated by the ratio or difference calculation unit; a water quality data calculation unit that calculates water quality data of the water quality inspection target based on the ratio or difference stored in the storage unit; and a water quality distribution image generation unit that generates a water quality distribution image representing the water quality distribution of the water quality inspection target based on the water quality data calculated by the water quality data calculation unit.

According to this aspect, the ratio or difference between the pixel value of the 1 st image and the pixel value of the 2 nd image corresponding to the micro region in which the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro region is the smallest in a plurality of frames is calculated by the ratio or difference calculating unit, and the calculated ratio or difference is stored in the storage unit. Further, according to this aspect, the water quality data calculation unit calculates the water quality data of the water quality inspection target from the ratio or difference stored in the storage unit, and the water quality distribution image generation unit generates a water quality distribution image indicating the water quality distribution of the water quality inspection target from the calculated water quality data.

Thus, in the present embodiment, an image in which the influence of reflected light is suppressed over a wide range of the water surface to be captured can be generated. That is, in the water quality distribution image generated in the present embodiment, the influence of the reflected light on the water surface is suppressed, and an accurate water quality distribution is shown.
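The selection rule of this aspect can be sketched as follows (a minimal NumPy illustration; the array shapes and the division guard are assumptions for the sketch, not from the patent): for each minute region, the frame minimizing the band-1 + band-2 sum is chosen, and the band ratio is read from that frame.

```python
import numpy as np

def ratio_at_min_sum(band1: np.ndarray, band2: np.ndarray) -> np.ndarray:
    """band1, band2: arrays of shape (n_frames, h, w) with pixel values for
    the 1st and 2nd wavelength bands.  Returns an (h, w) map of band2/band1
    taken, at each pixel, from the frame whose band1 + band2 sum is smallest
    there, i.e. the frame least contaminated by surface reflection."""
    total = band1 + band2
    idx = np.argmin(total, axis=0)             # per-pixel frame of minimum sum
    rows, cols = np.meshgrid(np.arange(idx.shape[0]),
                             np.arange(idx.shape[1]), indexing="ij")
    b1 = band1[idx, rows, cols]
    b2 = band2[idx, rows, cols]
    return b2 / np.maximum(b1, 1e-9)           # guard against division by zero
```

Reflected sunlight is broadband and inflates both bands together, so minimizing their sum is a reasonable proxy for selecting the glint-free observation before forming the ratio.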

An image processing apparatus according to another aspect of the present invention includes: an image acquisition unit that acquires images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is photographed, each frame having a 1 st image, a 2 nd image, and a 3 rd image, the 1 st image being an image based on a 1 st wavelength band, the 2 nd image being an image based on a 2 nd wavelength band different from the 1 st wavelength band, and the 3 rd image being an image based on a 3 rd wavelength band including the 1 st wavelength band and the 2 nd wavelength band; a pixel value acquisition unit that acquires, from the 3 rd image of the plurality of frames, pixel values corresponding to the micro regions constituting each of the 3 rd images; a ratio or difference calculation unit that calculates a ratio or difference between a pixel value of the 1 st image and a pixel value of the 2 nd image, the ratio or difference corresponding to a micro region having a smallest pixel value among the micro regions in the plurality of frames; a storage unit for storing the ratio or difference calculated by the ratio or difference calculation unit; a water quality data calculation unit for calculating water quality data of the water quality inspection object based on the ratio or difference stored in the storage unit; and a water quality distribution image generation unit that generates a water quality distribution image representing the water quality distribution of the water quality inspection target based on the water quality data calculated by the water quality data calculation unit.

According to this aspect, the ratio or difference between the pixel value of the 1st image and the pixel value of the 2nd image, taken from the frame in which the pixel value of the corresponding minute region of the 3rd image is smallest among the plurality of frames, is calculated by the ratio or difference calculation unit and stored in the storage unit. Further, according to this aspect, the water quality data calculation unit calculates the water quality data of the water quality inspection target from the ratio or difference stored in the storage unit, and the water quality distribution image generation unit generates the water quality distribution image representing the water quality distribution of the water quality inspection target from the calculated water quality data.

Thus, in the present embodiment, an image in which the influence of reflected light is suppressed over a wide range of the water surface to be captured can be generated. That is, in the water quality distribution image generated in the present embodiment, the influence of the reflected light on the water surface is suppressed, and an accurate water quality distribution is shown.
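The 3rd-image-guided variant can be sketched under the same assumptions as before: the frame selection is keyed to the broadband (3rd) image, and the ratio is then read from the narrow-band images of the selected frame.

```python
import numpy as np

def ratio_at_min_band3(band1: np.ndarray, band2: np.ndarray,
                       band3: np.ndarray) -> np.ndarray:
    """band1, band2, band3: arrays of shape (n_frames, h, w).  For each pixel,
    select the frame in which the broadband 3rd image is darkest (least
    surface glint) and return band2/band1 from that frame."""
    idx = np.argmin(band3, axis=0)             # selection driven by the 3rd image
    rows, cols = np.meshgrid(np.arange(idx.shape[0]),
                             np.arange(idx.shape[1]), indexing="ij")
    return band2[idx, rows, cols] / np.maximum(band1[idx, rows, cols], 1e-9)
```

Using the visible-light 3rd band as the glint detector decouples reflection detection from the narrow measurement bands, which is why this aspect can suppress reflections even when one narrow band happens to be dark for other reasons.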

Preferably, the 1st wavelength band is a wavelength band including 670 nm, the 2nd wavelength band is a wavelength band including 700 nm, the water quality data calculation unit calculates the concentration of chlorophyll a as the water quality data of the water quality inspection target from the 1st image and the 2nd image acquired by the image acquisition unit, and the water quality distribution image generation unit generates a concentration distribution image representing the calculated concentration distribution of chlorophyll a as the water quality distribution image.

According to this aspect, the 1st wavelength band includes 670 nm and the 2nd wavelength band includes 700 nm; the water quality data calculation unit calculates the concentration of chlorophyll a as the water quality data of the water quality inspection target from the 1st image and the 2nd image acquired by the image acquisition unit, and the water quality distribution image generation unit generates a concentration distribution image representing the calculated concentration distribution of chlorophyll a as the water quality distribution image.

As a result, the water quality distribution image generated in this aspect shows an accurate concentration distribution of chlorophyll a while suppressing the influence of reflected light from the water surface.
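How a 670 nm / 700 nm band pair maps to a chlorophyll-a estimate can be sketched as follows. The linear mapping and the coefficients `a` and `b` are hypothetical placeholders for an empirical calibration against in-situ measurements; the patent specifies only the wavelength bands, not a formula.

```python
import numpy as np

def chlorophyll_a_proxy(r670: np.ndarray, r700: np.ndarray,
                        a: float = 1.0, b: float = 0.0) -> np.ndarray:
    """Map the 700 nm / 670 nm band ratio to a chlorophyll-a estimate.

    Chlorophyll a absorbs strongly near 670 nm and water-leaving reflectance
    peaks near 700 nm in turbid waters, so the ratio rises with concentration.
    `a` and `b` are placeholder calibration coefficients (an assumption).
    """
    ratio = r700 / np.maximum(r670, 1e-9)  # guard against division by zero
    return a * ratio + b
```

Applying this function per minute region to the reflection-suppressed ratio map yields the water quality data from which the concentration distribution image is rendered.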

Preferably, the 3 rd wavelength band is a wavelength band of visible light.

According to this aspect, since the 3 rd wavelength band is a wavelength band of visible light, the influence of the reflected light from the water surface can be detected more appropriately, and a water quality distribution image in which the influence of the reflected light is further suppressed can be acquired.

The minute region is preferably a region of one pixel.

According to this aspect, the minute region is a region of one pixel, so the influence of reflected light can be suppressed at the finest possible granularity.

Preferably, the waves are artificially generated waves.

According to this aspect, the waves are artificially generated. Thus, even on a water surface where natural waves do not occur, the influence of reflected light on the water surface can be suppressed by artificially generating waves and then acquiring the synthetic image or the water quality distribution image.

An imaging device according to another aspect of the present invention includes the image processing device.

Preferably, the imaging device is fixed at a fixed point to photograph the same scene including the water surface having waves.

According to this aspect, since the imaging device photographs the same scene including the water surface having waves while fixed at a fixed point, it can accurately capture images of the same scene and generate an image in which blurring of the photographic subject is suppressed.

It is preferable to provide a polarizing filter that passes the light beam from the same scene including the water surface having waves.

According to this aspect, a polarizing filter is provided that passes the light flux from the same scene including the water surface having waves. Because the polarizing filter effectively suppresses the polarized component of light reflected by the water surface, an image in which the influence of reflected light is further suppressed can be obtained.

An imaging device according to another aspect of the present invention includes the image processing device described above, and includes: an imaging optical system having an imaging lens, and a 1 st optical filter and a 2 nd optical filter corresponding to a 1 st region and a 2 nd region of the imaging lens, respectively, the 1 st optical filter transmitting light of a 1 st wavelength band, the 2 nd optical filter transmitting light of a 2 nd wavelength band; and an orientation sensor having a plurality of pixels formed of two-dimensionally arranged photoelectric conversion elements, pupil-dividing and selectively receiving light beams incident via the 1 st filter and the 2 nd filter of the imaging optical system, respectively.

According to this aspect, the 1st image and the 2nd image, which are based on different wavelength bands, are captured using an imaging optical system having an imaging lens, and a 1st optical filter and a 2nd optical filter corresponding to the 1st region and the 2nd region of the imaging lens, respectively, the 1st optical filter transmitting light of the 1st wavelength band and the 2nd optical filter transmitting light of the 2nd wavelength band. This aspect can thereby achieve a reduction in the size and weight of the imaging device.

Further, according to this aspect, since the 1 st and 2 nd images are captured by the single imaging optical system described above, the 1 st and 2 nd images can be properly captured by adjusting that one imaging optical system.

Further, according to this aspect, the 1 st and 2 nd images are captured by the single imaging optical system and the orientation sensor that pupil-divides and selectively receives the light fluxes entering through the 1 st and 2 nd filters of the imaging optical system. This aspect eliminates the need for registration between the 1 st and 2 nd images.

Further, according to this aspect, the 1 st image and the 2 nd image can be acquired simultaneously and as independently separated image data.

An imaging device according to another aspect of the present invention includes the image processing device described above, and includes: an imaging optical system having an imaging lens, and a 1 st, a 2 nd, and a 3 rd optical filters corresponding to a 1 st, a 2 nd, and a 3 rd regions of the imaging lens, respectively, the 1 st optical filter transmitting light of a 1 st wavelength band, the 2 nd optical filter transmitting light of a 2 nd wavelength band, and the 3 rd optical filter transmitting light of a 3 rd wavelength band; and an orientation sensor having a plurality of pixels including photoelectric conversion elements arranged in a two-dimensional manner, pupil-dividing and selectively receiving light beams incident via the 1 st filter, the 2 nd filter, and the 3 rd filter of the imaging optical system, respectively.

According to this aspect, the 1 st, 2 nd, and 3 rd images are captured using an imaging optical system having an imaging lens, and 1 st, 2 nd, and 3 rd optical filters corresponding to the 1 st, 2 nd, and 3 rd regions of the imaging lens, respectively, the 1 st optical filter transmitting light of the 1 st wavelength band, the 2 nd optical filter transmitting light of the 2 nd wavelength band, and the 3 rd optical filter transmitting light of the 3 rd wavelength band. This aspect can thereby achieve a reduction in size and weight of the imaging device.

Further, according to this aspect, since the 1 st, 2 nd, and 3 rd images are captured by the single imaging optical system described above, the 1 st, 2 nd, and 3 rd images can be captured appropriately by adjusting that one imaging optical system.

Further, according to this aspect, the 1 st, 2 nd, and 3 rd images are captured by the single imaging optical system and the orientation sensor that pupil-divides and selectively receives the light fluxes entering through the 1 st, 2 nd, and 3 rd filters of the imaging optical system. This aspect eliminates the need for registration among the 1 st, 2 nd, and 3 rd images.

According to this aspect, the 1 st image, the 2 nd image, and the 3 rd image can be acquired simultaneously and as independently separated image data.

An image processing method according to another aspect of the present invention includes the steps of: an image acquisition step of acquiring a plurality of frame images of the same scene including a water surface having waves; a luminance value calculation step of calculating, from the plurality of frame images, luminance values corresponding to minute regions constituting the respective frame images; a pixel value extraction step of extracting pixel values corresponding to minute regions of a frame image having a luminance value smaller than the luminance value of the minute regions of other frame images among the plurality of frame images; a storage step of storing the pixel values extracted by the pixel value extraction step; and a synthetic image generation step of generating a synthetic image corresponding to the scene from the pixel values stored in the storage step.
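The selection rule of this method (for every micro region, keep the pixel value of the frame whose luminance there is smallest) can be sketched in Python with NumPy; the Rec. 601 luma weights and the frame array layout are illustrative assumptions, not part of the claim:

```python
import numpy as np

def min_luminance_composite(frames):
    """For every micro region (pixel), keep the pixel value from the
    frame whose luminance there is lowest, i.e. the frame closest to
    the Brewster-angle condition with the least surface reflection."""
    stack = np.stack(frames).astype(np.float64)           # (F, H, W, 3)
    # Luminance of each micro region (Rec. 601 luma, an assumed choice)
    luma = (0.299 * stack[..., 0] + 0.587 * stack[..., 1]
            + 0.114 * stack[..., 2])                      # (F, H, W)
    best = np.argmin(luma, axis=0)                        # (H, W)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols].astype(frames[0].dtype)
```

Feeding the frames of, say, a 20 to 100 second clip through such a function would yield the composite of the synthetic image generation step.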

An image processing method according to another aspect of the present invention includes the steps of: an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection object including a water surface having waves is photographed, each frame having a 1 st image and a 2 nd image, the 1 st image being an image based on a 1 st wavelength band, the 2 nd image being an image based on a 2 nd wavelength band different from the 1 st wavelength band; a pixel value calculation step of calculating the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image for each micro region in each frame based on the 1 st image and the 2 nd image in the plurality of frames; a ratio or difference calculation step of calculating a ratio or difference between a pixel value of the 1 st image and a pixel value of the 2 nd image corresponding to a micro region in which the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro region becomes minimum in a plurality of frames; a storage step of storing the ratio or difference calculated by the ratio or difference calculation step; a water quality data calculation step of calculating water quality data of the water quality inspection object based on the ratio or difference stored in the storage step; and a water quality distribution image generation step of generating a water quality distribution image representing the water quality distribution of the water quality inspection object based on the water quality data calculated by the water quality data calculation step.
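The core of this two-band method (select, per micro region, the frame minimizing the sum of the two band values, then form the band ratio from which the water quality data are computed) might be sketched as follows; the function name and NumPy layout are assumptions:

```python
import numpy as np

def band_ratio_map(frames_band1, frames_band2, eps=1e-9):
    """Per micro region: pick the frame where (band1 + band2) is
    smallest (least surface reflection) and return band1 / band2 there.
    The ratio map is the input to the water quality data calculation."""
    b1 = np.stack(frames_band1).astype(np.float64)   # (F, H, W)
    b2 = np.stack(frames_band2).astype(np.float64)   # (F, H, W)
    best = np.argmin(b1 + b2, axis=0)                # (H, W)
    rows, cols = np.indices(best.shape)
    return b1[best, rows, cols] / (b2[best, rows, cols] + eps)
```

A water quality index such as a chlorophyll-a concentration could then be estimated from this ratio map with a calibration curve; the specific bands and calibration are outside this sketch.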

An image processing method according to another aspect of the present invention includes the steps of: an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection object including a water surface having waves is photographed, each frame having a 1 st image, a 2 nd image, and a 3 rd image, the 1 st image being an image based on a 1 st wavelength band, the 2 nd image being an image based on a 2 nd wavelength band different from the 1 st wavelength band, the 3 rd image being an image based on a 3 rd wavelength band including the 1 st wavelength band and the 2 nd wavelength band; a pixel value acquisition step of acquiring pixel values corresponding to micro regions constituting each 3 rd image from the 3 rd images of the plurality of frames; a ratio or difference calculation step of calculating a ratio or difference between a pixel value of the 1 st image and a pixel value of the 2 nd image corresponding to a micro region in which a pixel value of the micro region is the smallest in a plurality of frames; a storage step of storing the ratio or difference calculated by the ratio or difference calculation step; a water quality data calculation step of calculating water quality data of the water quality inspection object based on the ratio or difference stored in the storage step; and a water quality distribution image generation step of generating a water quality distribution image representing the water quality distribution of the water quality inspection object based on the water quality data calculated by the water quality data calculation step.
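This three-band variant differs from the previous method only in the selector: the wide 3 rd wavelength band image serves as the reflection indicator, and the frame minimizing it is chosen per micro region. A sketch under the same assumptions:

```python
import numpy as np

def band_ratio_map_wide(frames_b1, frames_b2, frames_b3, eps=1e-9):
    """Use the wide-band 3rd image as the reflection indicator: per
    micro region, pick the frame whose 3rd-image value is smallest,
    then return band1 / band2 of that frame at that micro region."""
    b1 = np.stack(frames_b1).astype(np.float64)
    b2 = np.stack(frames_b2).astype(np.float64)
    b3 = np.stack(frames_b3).astype(np.float64)
    best = np.argmin(b3, axis=0)                 # (H, W) frame indices
    rows, cols = np.indices(best.shape)
    return b1[best, rows, cols] / (b2[best, rows, cols] + eps)
```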

Preferably comprising a wave generation step in which waves are artificially generated.

According to this aspect, since the waves are artificially generated in the wave generation step, even when an image of the water surface without natural waves is acquired, an image in which the influence of the reflected light from the water surface is suppressed can be generated.

A program according to another aspect of the present invention causes a computer to execute an image processing step including: an image acquisition step of acquiring a plurality of frame images of the same scene including a water surface having waves; a luminance value calculation step of calculating, from the plurality of frame images, luminance values corresponding to minute regions constituting the respective frame images; a pixel value extraction step of extracting pixel values corresponding to minute regions of a frame image having a luminance value smaller than the luminance value of the minute regions of other frame images among the plurality of frame images; a storage step of storing the pixel values extracted by the pixel value extraction step; and a synthetic image generation step of generating a synthetic image corresponding to the scene from the pixel values stored in the storage step.

A program according to another aspect of the present invention causes a computer to execute an image processing step including: an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection object including a water surface having waves is photographed, each frame having a 1 st image and a 2 nd image, the 1 st image being an image based on a 1 st wavelength band, the 2 nd image being an image based on a 2 nd wavelength band different from the 1 st wavelength band; a pixel value calculation step of calculating the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image for each micro region in each frame based on the 1 st image and the 2 nd image in the plurality of frames; a ratio or difference calculation step of calculating a ratio or difference between a pixel value of the 1 st image and a pixel value of the 2 nd image corresponding to a micro region in which the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro region becomes minimum in a plurality of frames; a storage step of storing the ratio or difference calculated by the ratio or difference calculation step; a water quality data calculation step of calculating water quality data of the water quality inspection object based on the ratio or difference stored in the storage step; and a water quality distribution image generation step of generating a water quality distribution image representing the water quality distribution of the water quality inspection object based on the water quality data calculated by the water quality data calculation step.

A program according to another aspect of the present invention causes a computer to execute an image processing step including: an image acquisition step of acquiring images corresponding to a plurality of frames of the same scene in which a water quality inspection object including a water surface having waves is photographed, each frame having a 1 st image, a 2 nd image, and a 3 rd image, the 1 st image being an image based on a 1 st wavelength band, the 2 nd image being an image based on a 2 nd wavelength band different from the 1 st wavelength band, the 3 rd image being an image based on a 3 rd wavelength band including the 1 st wavelength band and the 2 nd wavelength band; a pixel value acquisition step of acquiring pixel values corresponding to micro regions constituting each 3 rd image from the 3 rd images of the plurality of frames; a ratio or difference calculation step of calculating a ratio or difference between a pixel value of the 1 st image and a pixel value of the 2 nd image corresponding to a micro region in which a pixel value of the micro region is the smallest in a plurality of frames; a storage step of storing the ratio or difference calculated by the ratio or difference calculation step; a water quality data calculation step of calculating water quality data of the water quality inspection object based on the ratio or difference stored in the storage step; and a water quality distribution image generation step of generating a water quality distribution image representing the water quality distribution of the water quality inspection object based on the water quality data calculated by the water quality data calculation step.

Effects of the invention

According to the present invention, an image in which the influence of reflected light is suppressed over a wide range of the captured water surface can be generated.

Drawings

Fig. 1 is a graph showing the reflectance of p-polarized light and s-polarized light.

Fig. 2 is a diagram illustrating calculation of the reflectance of the water surface when the imaging device is installed at a remote location to image the water surface.

Fig. 3 is a graph showing the reflectance of the water surface.

Fig. 4 is a graph showing the reflectance of the water surface.

Fig. 5 is a diagram showing an imaging range and a reflectance.

Fig. 6 is a diagram showing an imaging range and a reflectance.

Fig. 7 is a diagram conceptually showing imaging of the same scene containing a water surface with waves.

Fig. 8 is a diagram illustrating an angle formed by the optical axis and the surface normal when waves are present on the water surface.

Fig. 9 is a perspective view showing an embodiment of the imaging device.

Fig. 10 is a rear view showing an embodiment of the imaging device.

Fig. 11 is a block diagram showing an embodiment of the internal configuration of the imaging apparatus.

Fig. 12 is a functional block diagram of the image processing apparatus.

Fig. 13 is a diagram illustrating extraction of pixel values.

Fig. 14 is a conceptual diagram showing an original image and a composite image.

Fig. 15 is a graph showing the reflectance of reflected light on the water surface when there is no wave and the reflectance of reflected light on the water surface when there is a wave.

Fig. 16 is a flowchart showing an operation of the image processing apparatus.

Fig. 17 is a functional block diagram of the image processing apparatus.

Fig. 18 is a diagram for explaining calculation of a ratio or a difference between a pixel of the 1 st image and a pixel of the 2 nd image.

Fig. 19 is a flowchart showing an operation of the image processing apparatus.

Fig. 20 is a diagram schematically showing an example of a frame structure of the imaging apparatus.

Fig. 21 is a schematic view of the light receiving element group viewed from the optical axis direction.

Fig. 22 is a sectional view taken along line A-A of fig. 21 (a).

Fig. 23 is a sectional view taken along line A-A of fig. 21 (a).

Fig. 24 is a functional block diagram of the image processing apparatus.

Fig. 25 is a diagram for explaining the calculation performed by the ratio or difference calculation section.

Fig. 26 is a flowchart showing an operation of the image processing apparatus.

Fig. 27 is a diagram schematically showing an example of a frame configuration of the imaging apparatus.

Fig. 28 is a schematic view of the light receiving element group viewed from the optical axis direction.

Fig. 29 is a sectional view taken along line A-A of fig. 28 (a).

Fig. 30 is a sectional view taken along line A-A of fig. 28 (a).

Detailed Description

Preferred embodiments of an image processing apparatus, an imaging apparatus, an image processing method, and a program according to the present invention will be described below with reference to the accompanying drawings.

Conventionally, as a technique for suppressing the influence of reflected light from a water surface, there is a case where an image pickup device provided with a polarizing filter picks up an image of the water surface. The influence of reflected light is suppressed by utilizing a phenomenon that reflected light on a water surface is polarized into s-polarized light and p-polarized light and a Brewster angle at which the reflectance of p-polarized light becomes 0 (zero).

Fig. 1 is a graph showing the reflectance of p-polarized light and s-polarized light in synthetic quartz (n = 1.458). The reflectance of s-polarized light becomes larger as the incident angle becomes larger. In contrast, for p-polarized light, the reflectance gradually decreases toward 0 as the incident angle increases from 0 to the Brewster angle (57° in the case of synthetic quartz), becomes 0 at the Brewster angle, and thereafter becomes larger as the incident angle becomes larger.

In addition, the Brewster angle is generally expressed by the following formula (1). Since the refractive index of water is 1.33, the Brewster angle at the water surface is about 53°.

[Numerical formula 1]

θB = tan⁻¹(n2/n1) ··· (1)

θB: Brewster angle

n1, n2: refractive index
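As a quick numerical check of formula (1), the Brewster angle for an air-to-water interface (n1 = 1.0, n2 = 1.33) can be computed:

```python
import math

# Formula (1): the Brewster angle satisfies tan(theta_B) = n2 / n1
def brewster_angle_deg(n1, n2):
    return math.degrees(math.atan2(n2, n1))

# Air (n1 = 1.0) to water (n2 = 1.33): about 53 degrees, as stated
print(round(brewster_angle_deg(1.0, 1.33), 1))   # → 53.1
```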

For example, when a polarizing filter that blocks s-polarized light is provided, the s-polarized light is removed by the polarizing filter, and the reflectance of p-polarized light becomes 0 near the Brewster angle. This makes it possible to obtain a captured image in which the influence of reflected light on the water surface is suppressed.

However, while the influence of the reflected light on the water surface can be suppressed as described above when the angle formed by the optical axis and the water surface normal is in the vicinity of the Brewster angle, the reflected p-polarized light component becomes large when that angle exceeds the Brewster angle, and it becomes difficult to suppress the influence of the reflected light. That is, if the angle formed by the optical axis and the water surface normal becomes large, it may be difficult to capture a scene including the water surface with the influence of the reflected light suppressed, even with an imaging device including a polarizing filter. Here, the water surface normal refers to the normal to the water surface that intersects the optical axis.

Next, a specific example of the influence of the reflected light on the water surface will be described.

For example, in the field of remote sensing, an image pickup device is sometimes used to pick up an image of a water surface (for example, a sea surface, a lake surface, a river surface, etc.) and perform a water quality inspection based on the obtained image. In this case, if the influence of the reflected light from the water surface is large, accurate data of the water quality inspection target cannot be acquired, and it is difficult to perform the water quality inspection.

Fig. 2 is a diagram illustrating calculation of the reflectance of the water surface when an imaging device including a polarizing filter for blocking s-polarized light is installed at a remote location to image the water surface. The angle formed by the optical axis and the water surface normal is denoted α, and no waves are generated on the water surface.

Consider the cases where the imaging device is placed at a height h and the optical axis is R1, R2, R3, or R4. With the optical axes R1, R2, R3, and R4, the water surface is photographed at the distances d1, d2, d3, and d4 from the origin O, respectively. The light S1 from the sun is incident on the water surface at the incident angle α and splits into reflected light S3 and transmitted light S2. The reflected light S3 is reflected at the reflection angle α, and the transmitted light S2 is transmitted at the refraction angle β.

For the incident angle α and the refraction angle β, the reflectance of p-polarized light is calculated by the following formulas (2) and (3). In the case shown in fig. 2, the influence of reflected light in a captured image obtained by an imaging device including a polarizing filter for blocking s-polarized light is considered, and therefore only the reflectance of p-polarized light needs to be examined.

[Numerical formula 2]

Rp = tan²(α - β) / tan²(α + β) ··· (2)

[Numerical formula 3]

sin α / sin β = n2/n1 ··· (3)

Rp: reflectance of p-polarized light

n1, n2: refractive index
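Formulas (2) and (3) can be combined into a short function; a sketch assuming an air-to-water interface (n1 = 1.0, n2 = 1.33):

```python
import math

def rp_reflectance(alpha_deg, n1=1.0, n2=1.33):
    """Reflectance of p-polarized light from formulas (2) and (3):
    sin(alpha)/sin(beta) = n2/n1 and Rp = tan^2(a - b)/tan^2(a + b)."""
    a = math.radians(alpha_deg)
    b = math.asin(math.sin(a) * n1 / n2)   # refraction angle beta
    if a + b == 0.0:                       # normal-incidence limit
        return ((n2 - n1) / (n2 + n1)) ** 2
    return (math.tan(a - b) / math.tan(a + b)) ** 2

print(round(rp_reflectance(0.0) * 100, 1))    # → 2.0  (normal incidence)
print(round(rp_reflectance(53.06) * 100, 3))  # → 0.0  (Brewster angle)
```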

Fig. 3 and 4 are diagrams showing the reflectance of a water surface without waves when the imaging device is installed at an altitude of 40 m (h in fig. 2 is 40 m). Fig. 4 is an enlarged view of the range from 0 m to 160 m in fig. 3. The origin O indicates the position at altitude 0 m directly below the point where the imaging device is installed.

As shown in fig. 3 and 4, when the water surface is photographed at the position directly below (0 m) the installed imaging device, the reflectance Rp is about 2%. When the water surface is photographed at a distance of 45 m from the origin, the incident light is at the Brewster angle and the s-polarized light is blocked by the effect of the polarizing filter, so the reflectance becomes zero (see fig. 4).

On the other hand, when the distance of the photographed position from the origin exceeds 45 m, the reflectance gradually increases. For example, the reflectance of the water surface about 150 m away exceeds 10%. The reflectance at a position about 240 m away is 20%, and the reflectance at a position about 600 m away is 50%.
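With the geometry of fig. 2, the incidence angle at a horizontal distance d for a camera at height h over a flat water surface is arctan(d/h), so the distance dependence described above can be reproduced approximately. This is an idealized Fresnel sketch; the exact percentages quoted in the text are the authors' figures:

```python
import math

def rp_at_distance(d, h=40.0, n1=1.0, n2=1.33):
    """Rp for a flat water surface at horizontal distance d from a
    camera at height h; incidence angle alpha = arctan(d / h)."""
    a = math.atan2(d, h)
    if a == 0.0:
        return ((n2 - n1) / (n2 + n1)) ** 2   # directly below: ~2%
    b = math.asin(math.sin(a) * n1 / n2)
    return (math.tan(a - b) / math.tan(a + b)) ** 2

# Reflectance grows steeply past the Brewster distance h * tan(53 deg)
for d in (0.0, 150.0, 240.0, 600.0):
    print(d, round(rp_at_distance(d) * 100, 1))
```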

Here, when the water surface is photographed by the imaging device and the water quality is inspected by remote sensing, accurate remote sensing can be performed by acquiring a captured image in which the reflectance is suppressed to 2% or less, preferably 1% or less.

Fig. 5 and 6 are diagrams showing the imaging range and the reflectance. In fig. 5 and 6, the water surface W of the pond to be imaged is imaged from the imaging point E and remote sensing is performed.

In fig. 5, the imaging device 10 is installed at a point E at an elevation of 40 m to image the water surface W. In the case shown in fig. 5, the reflectance is 2% directly below (0 m) the imaging device 10, the incident light is at the Brewster angle and the reflectance is 0 at the 45 m position, the reflectance exceeds 10% at the 150 m position, is 20% at the 240 m position, and is 50% at the 600 m position. Since the reflectance must be suppressed to 2% or less, preferably 1% or less, to perform a water quality inspection with high accuracy, the imaging device 10 provided at the point E at an elevation of 40 m can image only a narrow range of the water surface W.

Fig. 6 is a diagram showing the reflectance of an image captured when the height of the installation position of the imaging device 10 is increased from an altitude of 40 m to an altitude of 150 m. When the installation position of the imaging device is raised to an elevation of 150 m, the range with a reflectance of 1% or less extends to 300 m, and the water surface W can be imaged.

As shown in fig. 5 and 6, raising the installation position of the imaging device enlarges the imaging range in which the influence of the reflected light on the water surface is effectively suppressed (for example, the range in which the reflectance is 2% or less, preferably 1% or less). However, raising the installation position of the imaging device increases the construction cost, and may spoil the scenery near the installation position or make maintenance of the imaging device more difficult. Further, even when a lake surface is photographed by an unmanned aerial vehicle (for example, a drone) equipped with an imaging device, flying at a higher altitude may increase the operating cost, prevent the imaging frequency from being raised, or make flight impossible in bad weather.

Therefore, the present application proposes a method of acquiring an image in which the influence of reflected light is effectively suppressed without increasing the height of the position where the image pickup device is disposed.

Fig. 7 is a diagram conceptually showing imaging of the same scene including a water surface with waves to which the present invention is applied.

The imaging device 10 is provided at a fixed point on the bank C by a tripod 11. The optical axis L of the imaging device 10 faces the water surface W, and the water surface W is imaged from the bank C. That is, the same scene including the water surface W having waves is photographed by the imaging device 10 fixed to the fixed point.

Light from the sun is incident on the water surface W and reflected by the surface of the water surface W. Waves are generated on the water surface W, and the incident angle and the reflection angle of sunlight also change with the change of the water surface due to the waves.

The imaging device 10 performs shooting of the same scene for a certain time. Specifically, the imaging device 10 acquires a plurality of frames (frame images) in the moving image mode while fixed to the tripod 11.

Here, the imaging time varies depending on the performance of the image processing apparatus 500 (fig. 12) described later and the image quality required by the user. For example, the imaging device 10 performs imaging for 10 seconds to 120 seconds, preferably 20 seconds to 100 seconds. The imaging time of the imaging device 10 may also be determined according to the time required to obtain the composite image and the water quality distribution image generated by the image processing apparatus 500 described later.

Fig. 8 is an enlarged view of region F of fig. 7, and illustrates an angle formed between the optical axis and the surface normal when waves are present on the water surface.

At the imaging positions 701 and 713, the optical axis L meets the trough of the wave, where the water surface is flat. At this time, the angle formed by the optical axis and the water surface normal is α. The imaging position 707 is at the peak of the wave, and there too the angle formed by the optical axis and the water surface normal is α.

At the imaging position 703 and the imaging position 705, the water surface W is tilted by the waves, and therefore the angle formed by the optical axis and the water surface normal line becomes (α - Δ). On the other hand, at the imaging position 709 and the imaging position 711 beyond the apex of the wave (imaging position 707), the angle formed by the optical axis and the water surface normal becomes (α + Δ). Thus, the angle α formed by the optical axis L and the water surface normal changes due to the presence of waves on the water surface W.

In natural environments, a water surface such as the surface of a lake, the sea, or a river is rarely completely flat, and waves are generated. On a water surface where waves are generated, the angle between the water surface normal and the optical axis changes, and the reflectance of the water surface changes in conjunction with that angle. As shown in fig. 8, at the imaging positions 703 and 705, at the moment the water surface tilts toward the imaging device, the change in shape of the water surface W due to the waves acts in the direction in which the angle between the water surface normal and the optical axis does not exceed the Brewster angle. Therefore, when an observation object in the water is imaged by extracting only the moments at which the optical axis and the water surface normal form the Brewster angle or an angle in its vicinity, the image is less affected by the light reflected at the water surface. In particular, in an image captured at the moment when the inclination of the water surface normal due to the waves is large and the angle formed by the optical axis and the water surface normal is equal to or smaller than the Brewster angle, the effect of the polarizing filter is exhibited to the maximum, and the object in the water can be observed and imaged without being affected by the water surface reflected light (without the light from the observation object being buried in the reflected light).
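The geometric point of fig. 8 can be made concrete: at a given distance, the flat-surface angle α exceeds the Brewster angle by a fixed amount, and a wave face tilted toward the camera by that amount momentarily restores the Brewster condition. A sketch using the example heights and distances from the figures above:

```python
import math

def tilt_needed_for_brewster(d, h=40.0, n=1.33):
    """Wave slope, in degrees toward the camera, that brings the angle
    between the optical axis and the water surface normal down to the
    Brewster angle at horizontal distance d and camera height h."""
    alpha = math.degrees(math.atan2(d, h))   # flat-surface angle
    theta_b = math.degrees(math.atan(n))     # Brewster angle, ~53 deg
    return max(0.0, alpha - theta_b)

# 150 m away with the camera 40 m up: alpha is ~75 deg, so a wave face
# tilted ~22 deg toward the camera momentarily meets the condition
print(round(tilt_needed_for_brewster(150.0), 1))   # → 22.0
```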

Not all parts of the observation target satisfy the above-described appropriate condition at the same time. Therefore, a large number of images are captured, and appropriate frames and positions are selected by image processing. That is, since the water surface reflection component decreases as the angle approaches the Brewster angle, the same position on the water surface is observed and compared over a certain period of time. In embodiment 1, the frame in which the luminance value is lowest is determined to be closest to the Brewster angle; in embodiment 2, the frame in which the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image is smallest is determined to be closest to the Brewster angle; and in embodiment 3, the frame in which the pixel value of the 3 rd image is smallest is determined to be closest to the Brewster angle. This processing is performed for all the minute regions (pixels) of the observation screen.

In addition, the waves may be natural waves or artificial waves generated by a wave generator or the like. When artificial waves are generated, it is preferable to generate concentric waves whose center is the installation position of the imaging device 10 (the contact point between the installation position and the water surface). By generating concentric waves centered on the installation position of the imaging device 10, the inclination direction of the waves (the direction of the water surface normal) faces the imaging device 10, and the influence of reflection can be suppressed more effectively.

Fig. 9 and 10 are a perspective view and a rear view, respectively, showing an embodiment of the imaging device 10 of the present invention. The imaging device 10 is a digital camera or a digital video camera in which light passing through a lens is received by an imaging element and converted into a digital signal, and the digital signal is recorded on a recording medium as image data of a still image or a moving image.

As shown in fig. 9, the imaging device 10 has a photographing lens (optical system) 12, a flash 1, and the like disposed on its front surface, and a shutter button 2, a power supply/mode switch 3, a mode dial 4, and the like disposed on its upper surface. On the other hand, as shown in fig. 10, a liquid crystal display (LCD) 30, a zoom button 5, a cross button 6, a MENU/OK button 7, a play button 8, a BACK button 9, and the like are disposed on the back surface of the camera.

The photographing lens 12 is a collapsible zoom lens, and is extended from the camera body when the camera mode is set to the photographing mode by the power supply/mode switch 3. The flash 1 irradiates flash light toward a main subject.

The shutter button 2 is constituted by a so-called two-stage stroke switch including a "half-press (S1 ON)" and a "full-press (S2 ON)", and functions as a shooting preparation instruction section and as an image recording instruction section.

When the moving image shooting mode is selected as the shooting mode and the shutter button 2 is "fully pressed", the imaging device 10 starts recording of a moving image, and when the shutter button 2 is "fully pressed" again, the imaging device 10 stops recording and enters a standby state. When the moving image photographing mode is selected, focus adjustment is continuously performed by performing autofocus through the lens driving unit 36, and exposure control is performed by performing automatic exposure control through the shutter driving unit 33 and the stop driving unit 34.

When the still image shooting mode is selected as the shooting mode and the shutter button 2 is "half-pressed", the imaging device 10 performs a shooting preparation operation for executing AF (Auto Focus) and/or AE (Auto Exposure) control, and when the shutter button 2 is "fully pressed", the imaging device 10 captures and records a still image.

The power supply/mode switch 3 has both a function as a power switch for turning the power of the imaging device 10 ON/OFF and a function as a mode switch for setting the mode of the imaging device 10, and is disposed slidably among an "OFF position", a "playback position", and a "photographing position". The power of the imaging device 10 is turned ON when the power supply/mode switch 3 is slid to the "playback position" or the "photographing position", and is turned OFF when it is slid to the "OFF position". Sliding the power supply/mode switch 3 to the "playback position" sets the "playback mode", and sliding it to the "photographing position" sets the "shooting mode".

The mode dial 4 functions as a shooting mode setting means for setting a shooting mode of the imaging apparatus 10, and the shooting mode of the imaging apparatus 10 is set to various modes by the setting position of the mode dial 4. Examples of the mode include a "still image shooting mode" in which still images are shot, and a "moving image shooting mode" in which moving images are shot. The plurality of frame images in the present invention are acquired by, for example, a moving image photographing mode.

The liquid crystal display 30 functions as a part of a Graphical User Interface (GUI) by displaying a live preview image (live view image) in the shooting mode, displaying a still image or a moving image in the playback mode, and displaying a menu screen.

The zoom button 5 functions as a zoom instruction mechanism for instructing zooming, and is composed of a telephoto button 5T for instructing zooming to the telephoto side and a wide-angle button 5W for instructing zooming to the wide-angle side. In the imaging device 10, when the telephoto button 5T or the wide-angle button 5W is operated in the photographing mode, the focal length of the photographing lens 12 is changed. In the playback mode, operating the telephoto button 5T or the wide-angle button 5W enlarges or reduces the image being played back.

The cross button 6 is a multi-function button for inputting instructions in 4 directions (up, down, left, and right), and functions as a button (cursor movement operation means) for selecting an item from a menu screen or instructing selection of various setting items from each menu. The left/right buttons function as frame advance (forward/reverse) buttons in the playback mode.

The menu/ok button 7 is an operation button having both a function as a menu button for executing an instruction to display a menu on the screen of the liquid crystal display 30 and a function as an ok button for issuing an instruction to specify and execute a selected content.

The play button 8 is a button for switching to a play mode in which a still image or a moving image recorded by photographing is displayed on the liquid crystal display 30.

The return button 9 functions as a button for instructing to cancel an input operation or to return to a previous operation state.

Fig. 11 is a block diagram showing an embodiment of the internal configuration of the imaging apparatus 10. The imaging apparatus 10 records the captured image on the memory card 54, and the operation of the entire apparatus is comprehensively controlled by the Central Processing Unit (CPU) 40.

The image pickup apparatus 10 is provided with the operation unit 38 such as the shutter button 2, the power supply/mode switch 3, the mode dial 4, the telephoto button 5T, the wide-angle button 5W, the cross button 6, the menu/ok button 7, the play button 8, and the return button 9. Signals from the operation unit 38 are input to the CPU40, and the CPU40 controls the circuits of the image pickup apparatus 10 based on the input signals, and performs drive control of the image pickup device (image sensor) 16 (sensor drive unit 32), lens drive control (lens drive unit 36), aperture drive control (aperture control unit 34), photographing operation control, image processing control, image data recording/playback control, display control of the liquid crystal display 30, and the like, for example.

When the power of the imaging apparatus 10 is turned on by the power/mode switch 3, power is supplied to each block from a power supply unit not shown, and the driving of the imaging apparatus 10 is started.

The light beam transmitted through the photographing lens 12, the diaphragm 14, the mechanical shutter (mechanical shutter)15, and the like is focused on an image pickup element 16 which is a CMOS (Complementary Metal-Oxide Semiconductor) type color image sensor. The imaging element 16 is not limited to the CMOS type, and may be an XY address type or a CCD (Charge Coupled Device) type color image sensor.

A polarizing filter 13 is provided in front of the photographing lens 12. The polarizing filter 13 passes the light beam from the object of photographing, that is, the same scene including a water surface having waves. The polarizing filter 13 has at least a function of blocking s-polarized light reflected by the water surface.

The imaging element 16 is configured by a plurality of pixels arranged in a matrix in a predetermined pattern (Bayer arrangement), and each pixel includes a microlens, a color filter CF of red (R), green (G), or blue (B), and a photodiode PD. The imaging element 16 may also include a filter that transmits light in the 1 st, 2 nd, or 3 rd wavelength band, which will be described later.

The CPU40 performs the AF operation and the AE operation at all times during capturing and/or recording of a moving image and during capturing and/or displaying of a live preview image (through image).

The ROM47 is a ROM (Read Only Memory) or an EEPROM (Electrically Erasable Programmable Read Only Memory) that stores various parameters and tables used for a camera control program, defect information of the image pickup device 16, image processing, and the like.

The image data (mosaic image data) of RGB outputted from the image pickup device 16 when a moving image or a still image is picked up is inputted from the image input controller 22 to a Memory (SDRAM) 48 and temporarily stored.

The image data temporarily stored in the memory 48 is appropriately read by the image processing unit 24, and signal processing such as offset processing, gain control processing including white balance correction and sensitivity correction, gamma correction processing, demosaic processing (color interpolation processing), and RGB and/or YC conversion processing is performed.

The image data processed by the image processing section 24 is input to a VRAM (Video RAM) 50. The VRAM50 includes an A region and a B region, each of which records image data representing an image of 1 frame. In the VRAM50, image data representing an image of 1 frame is alternately rewritten in the A region and the B region, and the written image data is read from whichever region is not currently being rewritten.
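The A/B alternation described above is a standard double-buffering scheme. The following sketch (illustrative Python, not the device firmware; the class and method names are invented for this example) shows the essential behavior: the display always reads from the region that is not being rewritten.

```python
class DoubleBuffer:
    """Minimal sketch of the VRAM50's A/B double buffering (hypothetical)."""

    def __init__(self):
        self.regions = {"A": None, "B": None}  # each holds one frame of image data
        self.write_region = "A"                # region currently being rewritten

    def write_frame(self, frame):
        """Rewrite the current write region, then swap roles."""
        self.regions[self.write_region] = frame
        # the freshly written region becomes the read region
        self.write_region = "B" if self.write_region == "A" else "A"

    def read_frame(self):
        """Read from the region NOT currently being rewritten."""
        read_region = "B" if self.write_region == "A" else "A"
        return self.regions[read_region]
```

In this way a partially written frame is never displayed, because reading and rewriting always target different regions.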

The image data read from the VRAM50 is encoded by the video encoder 28 and output to the liquid crystal display 30 provided on the back side of the camera, whereby live preview images are continuously displayed on the display screen of the liquid crystal display 30.

The compression/expansion processing unit 26 performs compression processing on the luminance data Y and the color difference data Cb and Cr, which are processed by the image processing unit 24 and temporarily stored in the memory 48, when a moving image or a still image is recorded. In the case of a moving image, compression is performed in the H.264 format, for example, and in the case of a still image, compression is performed in the JPEG (Joint Photographic Experts Group) format, for example. The compressed image data compressed by the compression/expansion processing section 26 is recorded in the memory card 54 via the media controller 52.

In the playback mode, the compression/expansion processing unit 26 performs expansion processing on the compressed image data obtained from the memory card 54 via the media controller 52. The media controller 52 performs recording and reading of the compressed image data on and from the memory card 54.

[ embodiment 1]

Fig. 12 is a diagram showing a functional block diagram of an image processing apparatus 500 according to the present invention.

The image processing apparatus 500 includes an image acquisition unit 501, a luminance value calculation unit 503, a pixel value extraction unit 505, a storage unit 507, and a synthetic image generation unit 509. The image processing apparatus 500 is provided in the imaging apparatus 10, for example. The image processing apparatus 500 may be provided in a computer, for example, and in this case, a plurality of frame images captured by a camera (for example, the imaging apparatus 10) are input to the computer.

The image acquisition unit 501 acquires a plurality of frame images of the same scene including a water surface having waves. For example, the image acquisition unit 501 acquires image data of a frame image after signal processing by the image processing unit 24. The image acquisition unit 501 is realized by the image processing unit 24, for example.

The luminance value calculation unit 503 calculates luminance values corresponding to the micro regions constituting each frame image from the plurality of frame images. The luminance value calculation unit 503 is realized by, for example, the image processing unit 24, and calculates the luminance values of each frame image. Here, a micro region is a region within the frame image, and regions of various sizes can be used. For example, a micro region is an area of 1 pixel constituting the frame image.
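The document does not specify how a luminance value is derived from the RGB pixel values. As one common convention, a Rec.601-style weighted sum can be used; the sketch below (illustrative Python only) computes the luminance of a single-pixel micro region under that assumption.

```python
def luminance(r, g, b):
    """Luminance of one pixel (a micro region of 1 pixel).

    The Rec.601 weighting below is an assumption -- one common way to
    derive a luminance value Y from RGB pixel values; the patent itself
    does not fix a formula.
    """
    return 0.299 * r + 0.587 * g + 0.114 * b
```

For a larger micro region, the per-pixel luminances would simply be averaged over the region.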

The pixel value extraction unit 505 extracts, from among the plurality of frame images, the pixel values corresponding to the micro regions of the frame image whose luminance value is smaller than the luminance values of the corresponding micro regions of the other frame images. Preferably, for each micro region, the pixel value is extracted from the frame image in which the luminance value of that micro region is the minimum. That is, the pixel value extraction unit 505 acquires the luminance values of the micro regions in each frame image, compares the luminance values at the same position across the plurality of frame images, and extracts the pixel value of the micro region whose luminance value is the smallest. The pixel value here is, for example, information on the color of the pixel.

The storage unit 507 stores the pixel values extracted by the pixel value extraction unit 505. The storage unit 507 is realized by the memory 48, for example. The pixel values extracted by the pixel value extraction unit 505 for the micro regions with the smallest luminance values are stored sequentially.

The synthetic image generating unit 509 generates a synthetic image corresponding to the scene from the pixel values stored in the storage unit 507. The composite image generation unit 509 is realized by the video encoder 28, for example.

Fig. 13 is a diagram illustrating extraction of pixel values by the pixel value extraction unit 505. In fig. 13, for the sake of explanation, 5 frame images are shown as a plurality of frame images, but actually, more frame images are acquired by the image acquisition unit 501. In fig. 13, a frame image composed of 4 × 5 pixels is schematically shown, but actually, the frame image is composed of many more pixels.

The image acquisition unit 501 acquires a 1 st frame image 601, a 2 nd frame image 603, a 3 rd frame image 605, a 4 th frame image 607, and a 5 th frame image 609 in which the same scene is captured.

Then, the luminance value calculation unit 503 calculates the luminance value of the pixel at the position P of the 1 st frame image 601. Then, the luminance value of the pixel at position P in the 1 st frame image 601 is stored as an initial value in the storage unit 507. Then, the luminance values of the pixels located at the positions P of the 2 nd frame image 603, the 3 rd frame image 605, the 4 th frame image 607, and the 5 th frame image 609 are calculated in this order. In addition, the position P in each frame image indicates the same position.

The luminance values are sequentially calculated by the luminance value calculation unit 503, and the pixel value of the frame image in which the luminance value of the pixel at the position P becomes the minimum is extracted by the pixel value extraction unit 505. In the case shown in fig. 13, for example, the pixel value extraction unit 505 determines that the luminance value at the position P of the 5 th frame image 609 is the minimum, and extracts the pixel value at the position P of the 5 th frame image 609. The extracted pixel value is stored in the storage unit 507.
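The sequential comparison described above -- store the 1 st frame's values as initial values, then replace a stored pixel whenever a later frame shows a smaller luminance at the same position -- can be sketched as follows. This is illustrative Python, not the device firmware, and the Rec.601 luminance weighting is an assumption, since the patent does not fix a formula.

```python
def composite_min_luminance(frames):
    """Minimum-luminance composite over frames of the same scene.

    frames: list of 2-D grids of (r, g, b) tuples, all the same size.
    Returns a grid holding, at each position, the pixel value from the
    frame whose luminance at that position was smallest.
    """
    def lum(p):  # Rec.601 luminance, one common choice (an assumption here)
        r, g, b = p
        return 0.299 * r + 0.587 * g + 0.114 * b

    # storage unit: initialised from the 1st frame image
    best = [row[:] for row in frames[0]]
    for frame in frames[1:]:
        for y, row in enumerate(frame):
            for x, pixel in enumerate(row):
                if lum(pixel) < lum(best[y][x]):
                    best[y][x] = pixel  # keep the darker (less reflective) pixel
    return best
```

Because water-surface reflections are bright and move with the waves, each position is eventually observed in some frame without a reflection, and that darker observation survives in the composite.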

Fig. 14 is a conceptual diagram illustrating an original image and a composite image generated by the composite image generating unit 509. Fig. 14(A) shows an original image, Fig. 14(B) shows a composite image, and the two images capture the same scene. The original image is an image subjected to normal image processing, in which the influence of the reflected light on the water surface is not suppressed. When the influence of the reflected light on the water surface is large, as in the original image, the light from the observation object in the water is buried in the reflected light on the water surface, and appropriate observation cannot be performed. In the composite image, on the other hand, the pixel value at the minimum luminance value across the frame images is extracted for each position and the composite image is constructed from the extracted pixel values, so the influence of the reflected light on the water surface is suppressed and the light from the observation object in the water can be captured appropriately.

Fig. 15 is a graph showing the reflectance of reflected light on the water surface when there is no wave and when there is a wave. The imaging device 10 provided with the polarizing filter 13 was set at an altitude of 30m, and the reflectance was measured from images obtained of a water surface having waves with a maximum water surface angle of 15°. The reflectance was measured from the composite image generated by the composite image generating unit 509.

The reflectance of the reflected light on the water surface when there is no wave becomes approximately 0 in the vicinity of a horizontal distance of 40m, but beyond that the reflectance becomes larger as the horizontal distance becomes longer. On the other hand, the reflectance of the reflected light on the water surface when there is a wave does not increase even as the horizontal distance becomes longer. In this way, it is found that, for a water surface having waves, the composite image suppresses the influence of the reflected light.

Fig. 16 is a flowchart showing the operation of the image processing apparatus 500.

First, waves are artificially generated when there are no waves on the water surface. For example, waves are generated on the water surface by a wave generator (wave generating step: step S10). After that, the same scene including the water surface with the waves is photographed by the image pickup device 10 (step S11). Then, the image acquisition unit 501 acquires a plurality of frame images in which the same scene including the water surface having waves is photographed (image acquisition step: step S12).

Subsequently, the luminance value calculation unit 503 calculates the luminance value in each micro region (luminance value calculation step: step S13). Then, the pixel value extracting unit 505 extracts the pixel value corresponding to each micro region of the frame image in which the luminance value of that micro region becomes the minimum among the plurality of frame images (pixel value extracting step: step S14). Then, the extracted pixel values are stored in the storage unit 507 (storage step: step S15). Then, the synthetic image generating unit 509 generates a synthetic image corresponding to the acquired scene based on the pixel values stored in the storage unit 507 (synthetic image generating step: step S16).

In the above-described embodiment, the hardware configuration of the processing units that execute various processes is realized by the following processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.

One processing unit may be constituted by 1 of these various processors, or by 2 or more processors of the same or different kinds (for example, a plurality of FPGAs, or a combination of the CPU40 and an FPGA). A plurality of processing units may also be constituted by 1 processor. As a first example of constituting a plurality of processing units with 1 processor, as represented by a computer such as a client or a server, 1 processor is constituted by a combination of 1 or more CPU40s and software, and this processor functions as a plurality of processing units. As a second example, as represented by a System On Chip (SoC), a processor is used that realizes the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip. In this manner, the various processing units are configured using 1 or more of the various processors described above as a hardware configuration.

More specifically, the hardware configuration of these various processors is a circuit (circuit) in which circuit elements such as semiconductor elements are combined.

The respective structures and functions described above can be implemented by any hardware, software, or a combination of both as appropriate. For example, the present invention can be applied to a program for causing a computer to execute the above-described processing steps (processing steps), a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or a computer on which such a program is installed.

[ 2 nd embodiment ]

Next, embodiment 2 of the present invention will be explained.

Fig. 17 is a diagram showing a functional block diagram of an image processing apparatus 500 according to the present invention. Note that the description of the items already described in fig. 12 is omitted.

The image processing apparatus 500 includes an image acquisition unit 501, a pixel value calculation unit 521, a ratio or difference calculation unit 523, a storage unit 507, a water quality data calculation unit 525, and a water quality distribution image generation unit 527.

The image acquisition unit 501 acquires images corresponding to a plurality of frames of the same scene in which the water quality inspection target including the water surface having waves is imaged. The image acquisition unit 501 acquires a 1 st image and a 2 nd image in each frame, the 1 st image being based on a 1 st wavelength band, and the 2 nd image being based on a 2 nd wavelength band different from the 1 st wavelength band.

The pixel value calculator 521 calculates the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image for each micro area in each frame based on the 1 st image and the 2 nd image in the plurality of frames. That is, the pixel value calculation unit 521 calculates the sum of the 1 st image and the 2 nd image in each frame. The pixel value calculation unit 521 is realized by the image processing unit 24. Here, the pixel values of the 1 st image and the 2 nd image are, for example, output values in the imaging element 16.

The ratio or difference calculation unit 523 calculates the ratio or difference between the pixel value of the 1 st image and the pixel value of the 2 nd image corresponding to the micro region in which the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro region becomes the minimum in the plurality of frames. The ratio or difference calculation unit 523 is realized by the image processing unit 24.

The storage unit 507 stores the ratio or difference calculated by the ratio or difference calculation unit 523.

The water quality data calculation unit 525 calculates water quality data of the water quality inspection target based on the ratio or difference stored in the storage unit 507. The water quality data calculating unit 525 calculates water quality data by a known technique.

For example, when estimating the concentration of chlorophyll a, the water quality data calculation unit 525 performs estimation by the following 2-wavelength band-ratio method.

[ numerical formula 4]

Chl.a ∝ R(λi)/R(λj) … (4)

Chl.a: concentration of chlorophyll a

R(λ): reflectance at wavelength λ

It is known that the 2 wavelengths used differ from one water area to another; for example, 670nm and 720nm are used along the shore. Therefore, by setting the 1 st wavelength band to a wavelength band including 670nm and the 2 nd wavelength band to a wavelength band including 700nm, remote sensing of the concentration of chlorophyll a can be performed by the imaging device 10. For example, the 1 st wavelength band is a wavelength band of 650nm or more and 690nm or less, and the 2 nd wavelength band is a wavelength band of 680nm or more and 720nm or less.
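Formula (4) is only a proportionality, so a concrete estimate needs a calibration factor. The sketch below (illustrative Python) applies the 2-wavelength band ratio method; the assignment of λi and λj to the two bands and the constant k are assumptions, since the true factor must be fitted per water area and is not given in this document.

```python
def chl_a_index(r_i, r_j, k=1.0):
    """Two-wavelength band ratio index for chlorophyll a (formula 4).

    r_i, r_j: reflectances R(lambda_i) and R(lambda_j) in the two bands
              (e.g. around 670nm and 700nm for coastal waters).
    k:        hypothetical calibration constant -- the actual
              proportionality factor must be fitted per water area.
    """
    return k * r_i / r_j
```

A map of this index over the image then serves as the basis for the concentration distribution image generated by the water quality distribution image generator 527.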

The water quality distribution image generator 527 generates a water quality distribution image indicating the water quality distribution of the water quality inspection target based on the water quality data calculated by the water quality data calculator 525. For example, the water quality distribution image generating unit 527 generates, as the water quality distribution image, a concentration distribution image indicating the calculated concentration distribution of chlorophyll a. The water quality data calculation unit 525 calculates the concentration of chlorophyll a as the water quality data of the water quality inspection target from the 1 st image and the 2 nd image acquired by the image acquisition unit 501.

Fig. 18 is a diagram for explaining the calculation performed by the ratio or difference calculating unit 523, shown in the same manner as fig. 13.

The image acquisition unit 501 acquires images corresponding to the 1 st frame 621, the 2 nd frame 623, the 3 rd frame 625, the 4 th frame 627, and the 5 th frame 629. Each frame is composed of the 1 st image and the 2 nd image.

The pixel value calculation unit 521 calculates the sum of the pixel values of the 1 st image 621A and the 2 nd image 621B of the 1 st frame 621. The sum of the pixel values of the 1 st image 621A and the 2 nd image 621B is stored in the storage unit 507 as an initial value. Then, the pixel value calculation unit 521 calculates the sum of the pixel values of the 1 st and 2 nd images 623A and 623B in the 2 nd frame 623, the sum of the pixel values of the 1 st and 2 nd images 625A and 625B in the 3 rd frame 625, the sum of the pixel values of the 1 st and 2 nd images 627A and 627B in the 4 th frame 627, and the sum of the pixel values of the 1 st and 2 nd images 629A and 629B in the 5 th frame 629 in this order.

The ratio or difference calculating unit 523 calculates the ratio or difference between the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro area when the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image in the micro area calculated by the pixel value calculating unit 521 is the minimum.
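The selection described above can be sketched as follows (illustrative Python, not the device firmware): for each micro region, the frame with the smallest band sum -- i.e. the least surface reflection in both bands -- is found, and the ratio of the two band values in that frame is recorded. Using the difference instead of the ratio would change only the final expression.

```python
def min_sum_ratio(frames):
    """Ratio of 1st-image to 2nd-image pixel values, taken per micro
    region from the frame whose band sum is minimal.

    frames: list of (img1, img2) pairs; img1 and img2 are 2-D grids
            of numeric pixel values, all the same size.
    """
    h, w = len(frames[0][0]), len(frames[0][0][0])
    ratios = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # frame in which the sum of the two band values at (y, x) is minimal
            img1, img2 = min(frames, key=lambda f: f[0][y][x] + f[1][y][x])
            ratios[y][x] = img1[y][x] / img2[y][x]
    return ratios
```

The resulting grid of ratios corresponds to what the storage unit 507 accumulates before the water quality data calculation.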

Fig. 19 is a flowchart showing the operation of the image processing apparatus 500.

First, the same scene containing the water surface with waves is photographed by the imaging device 10. Then, the image acquisition unit 501 acquires a plurality of frame images of the same scene including the water surface having waves (step S20: image acquisition step).

Subsequently, the pixel value calculation unit 521 calculates, for each micro region, the sum of the pixel value of the 1 st image and the pixel value of the 2 nd image of each frame (step S21: pixel value calculation step). Then, the ratio or difference calculating unit 523 calculates the ratio or difference corresponding to each micro region of the frame in which the sum of the pixel values of that micro region becomes the minimum among the plurality of frames (step S22: ratio or difference calculating step). Then, the calculated ratio or difference is stored in the storage unit 507 (step S23: storage step). Thereafter, the water quality data calculating unit 525 calculates water quality data from the ratio or difference stored in the storage unit 507 (step S24: water quality data calculating step), and the water quality distribution image generating unit 527 generates a water quality distribution image (step S25: water quality distribution image generating step).

Fig. 20 schematically shows an example of the configuration of the main part of the imaging device 10 used in embodiment 2.

The imaging device 10 includes a photographing lens 12, an imaging element 16, and an image acquisition unit 501.

The photographing lens 12 is a single lens system for image pickup, and has different transmission wavelength characteristics for each region through which incident light passes. The photographing lens 12 includes 1 or more lenses (imaging lenses) 100a and a wavelength separation filter 100b for making the transmission wavelength band different for each incident region of the imaging lens.

The wavelength separation filter 100b has a 1 st filter that transmits light of a 1 st wavelength band and a 2 nd filter that transmits light of a 2 nd wavelength band different from the 1 st wavelength band. That is, the filter includes an a filter 100B-a (1 st filter) for transmitting light of the 1 st wavelength band and a B filter 100B-B (2 nd filter) for transmitting light of the 2 nd wavelength band.

Here, the a filter 100B-a is disposed in correspondence with the pupil area 122a (1 st area) of the exit pupil 120 of the lens 100a, and the B filter 100B-B is disposed in correspondence with the pupil area 122B (2 nd area). Therefore, of the object light passing through the photographing lens 12, the light passing through the pupil area 122a of the exit pupil 120 of the photographing lens 12 becomes light having the 1 st wavelength band, and the light passing through the pupil area 122b becomes light having the 2 nd wavelength band.

The wavelength separation filter 100b of the present embodiment is disposed in the vicinity of the pupil plane of the lens 100a, at the subsequent stage of the lens 100a on the optical path of the object light, but may be disposed at an optically equivalent position. The photographing lens 12 may instead have an optical path that gives different transmission wavelength characteristics over the entire lens system, and the difference in transmission wavelength characteristics need not be provided by a specific optical surface of a specific filter. Further, the wavelength separation filter 100b may have a lens effect.

The subject light having passed through the photographing lens 12 enters the image pickup device 16. The image pickup element 16 separates and receives light passing through a pupil area 122a and light passing through a pupil area 122b of the exit pupil 120 of the photographing lens 12, respectively. The image pickup device 16 supplies a signal based on the light received by being separated to the image acquisition unit 501 as an image signal.

The image acquiring section 501 acquires the 1 st image received via the 1 st filter and the 2 nd image received via the 2 nd filter, respectively. That is, the image acquisition unit 501 acquires 2 images having different wavelengths from each other based on the image signal.

The image pickup element (orientation sensor) 16 has a plurality of pixels each including a two-dimensionally arranged photoelectric conversion element. The image pickup device 16 pupil-divides and selectively receives the light beams incident through the 1 st filter and the 2 nd filter, respectively.

The image pickup element 16 has a plurality of microlenses 152. The microlenses 152 are arranged according to a predetermined rule in a direction perpendicular to the optical axis. A corresponding light receiving element group 161 is disposed for each microlens 152. The light receiving element group 161 is composed of a plurality of light receiving elements 162.

The plurality of light receiving elements 162 are MOS (Metal Oxide Semiconductor) or CMOS type image pickup elements. In addition, a solid-state imaging device such as a CCD type imaging device may be used as the plurality of light receiving elements 162.

Fig. 21(a) is a schematic view of the light receiving element group 161 corresponding to the microlens 152 when viewed from the optical axis direction. As shown in the drawing, in the present embodiment, a light receiving element group 161 in which 4 light receiving elements 162-1a, 162-1b, 162-2a, 162-2b are arranged in 2 rows and 2 columns is provided corresponding to 1 microlens 152.

As shown in fig. 21(b), the light receiving element group corresponding to the microlens may instead be a light receiving element group 1161 in which rectangular light receiving elements 1162-1 and 1162-2 are arranged in correspondence with the microlens 152.

Fig. 22 is a sectional view taken along the broken line a-a of fig. 21(a). As shown in the figure, the light passing through the pupil area 122a of the exit pupil 120 of the photographing lens 12 is received by the light receiving element 162-1a through the microlens 152. Although not shown here, the light passing through the pupil area 122a is likewise received by the light receiving element 162-1b through the microlens 152.

The light passing through the pupil area 122b is received by the light receiving elements 162-2a and 162-2b through the microlens 152. Note that 262 shown in the figure is a light shielding portion provided to prevent interference between adjacent pixels.

As described above, the light passing through the pupil area 122a is the 1 st wavelength band light, and the light passing through the pupil area 122b is the 2 nd wavelength band light. Therefore, the light receiving elements 162-1a and 162-1b receive the light of the 1 st wavelength band, and the light receiving elements 162-2a and 162-2b receive the light of the 2 nd wavelength band.

In this way, the microlenses establish an imaging relationship between the pupil of the photographing lens 12 and the plurality of light receiving elements 162, and the light received by each light receiving element 162 is limited to the light passing through the predetermined pupil area 122 of the exit pupil 120 of the photographing lens 12.

Each light receiving element 162 of the light receiving element group 161 outputs an image pickup signal with an intensity corresponding to the amount of received light to the image acquisition unit 501. The image acquisition unit 501 generates and acquires an image of the object from the image pickup signals of the plurality of light receiving elements 162. Specifically, the image acquisition unit 501 generates image signals representing images of mutually different wavelength bands from the image pickup signals supplied from the light receiving element group 161.

In this example, the image acquisition unit 501 generates an image of the 1 st wavelength band (1 st image) from the image pickup signals of the light receiving elements 162-1a and 162-1b that receive the light passing through the pupil area 122a. It then generates an image of the 2 nd wavelength band (2 nd image) from the image pickup signals of the light receiving elements 162-2a and 162-2b that receive the light passing through the pupil area 122b.
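The demultiplexing just described -- sorting the per-element signals into a 1 st image and a 2 nd image -- can be sketched as follows. This is illustrative Python; the 2×2 layout with band-1 elements on the top row and band-2 elements on the bottom row of each microlens block is an assumption read from the figures, and averaging each element pair down to one pixel per microlens is a simplification.

```python
def split_bands(raw):
    """Split a raw sensor readout into the 1st and 2nd band images.

    raw: 2-D grid of sensor values with even height and width; each 2x2
         block corresponds to one microlens 152 (assumed layout:
         top row = 162-1a, 162-1b; bottom row = 162-2a, 162-2b).
    Returns (img1, img2), one pixel per microlens in each image.
    """
    img1, img2 = [], []
    for y in range(0, len(raw), 2):
        row1, row2 = [], []
        for x in range(0, len(raw[0]), 2):
            row1.append((raw[y][x] + raw[y][x + 1]) / 2)          # 162-1a, 162-1b
            row2.append((raw[y + 1][x] + raw[y + 1][x + 1]) / 2)  # 162-2a, 162-2b
        img1.append(row1)
        img2.append(row2)
    return img1, img2
```

Because both images come from the same readout, they are acquired simultaneously and share the same optical axis, as stated below.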

In this embodiment, an example is shown in which the microlens makes the light passing through 2 regions of the exit pupil enter 2 light receiving elements each in the vertical direction. The 2 regions of the exit pupil covered by the microlens correspond to the respective regions of the imaging optical system that differ in transmission wavelength. Thus, images of 2 different wavelength bands can be obtained simultaneously, independently, and in parallel, and the 2 images acquired by the imaging device 10 are images having the same optical axis. At this time, the plurality of microlenses 152 (microlens array) function as a pupil division portion.

In addition, instead of associating the plurality of light receiving elements 162 with each microlens 152, a light-shielding mask can be made to function as the pupil division portion by associating 1 light receiving element 162 with each microlens 152. Fig. 23 illustrates, in the same cross-sectional view as fig. 22, the case where the light-shielding mask functions as the pupil division portion. In this case, 1 light receiving element 162 is provided for 1 microlens 152, and a part of each light receiving element 162 is shielded from light by the light shielding masks 2262-1 and 2262-2. Thus, the light passing through the pupil area 122a of the exit pupil 120 of the photographing lens 12 is received by the light receiving element 162-1a through the microlens 152, and the light passing through the pupil area 122b is received by the light receiving elements 162-2a and 162-2b through the corresponding microlenses 152.

[ embodiment 3]

Next, embodiment 3 of the present invention will be explained.

Fig. 24 is a functional block diagram of the image processing apparatus 500 according to the present invention. Description of the items already described with reference to fig. 17 is omitted.

The image processing apparatus 500 includes an image acquisition unit 501, a pixel value acquisition unit 531, a ratio or difference calculation unit 523, a storage unit 507, a water quality data calculation unit 525, and a water quality distribution image generation unit 527.

The image acquisition unit 501 acquires images corresponding to a plurality of frames of the same scene in which a water quality inspection target including a water surface having waves is imaged. Each frame acquired by the image acquisition unit 501 includes a 1st image, a 2nd image, and a 3rd image: the 1st image is based on a 1st wavelength band, the 2nd image is based on a 2nd wavelength band different from the 1st wavelength band, and the 3rd image is based on a 3rd wavelength band that includes both the 1st wavelength band and the 2nd wavelength band. For example, the 3rd wavelength band is the wavelength band of visible light.

The pixel value acquisition unit 531 acquires, from the 3rd images of the plurality of frames, pixel values corresponding to the minute regions constituting each 3rd image. The pixel value acquisition unit 531 is realized by the image processing unit 24. Here, the pixel value of the 3rd image is, for example, an output value of the image pickup device 16.

The ratio or difference calculation unit 523 calculates the ratio or the difference between the pixel value of the 1st image and the pixel value of the 2nd image corresponding to a minute region at the time when the pixel value of that minute region becomes the smallest across the plurality of frames.

Fig. 25 is a diagram for explaining the calculation performed by the ratio or difference calculation unit 523, and is similar to fig. 13.

The image acquisition unit 501 acquires images corresponding to the 1 st frame 621, the 2 nd frame 623, the 3 rd frame 625, the 4 th frame 627, and the 5 th frame 629. Each frame is composed of the 1 st image, the 2 nd image, and the 3 rd image.

The pixel value acquisition unit 531 acquires the pixel value of the 3rd image 621C of the 1st frame 621, and this pixel value is set as an initial value. Thereafter, the pixel value of the 3rd image 623C of the 2nd frame 623, the pixel value of the 3rd image 625C of the 3rd frame 625, the pixel value of the 3rd image 627C of the 4th frame 627, and the pixel value of the 3rd image 629C of the 5th frame 629 are acquired in this order.

The ratio or difference calculation unit 523 calculates the ratio or the difference between the pixel value of the 1st image and the pixel value of the 2nd image in a minute region at the time when the pixel value of the 3rd image in that minute region, acquired by the pixel value acquisition unit 531, is the minimum. For example, when the 3rd wavelength band is the wavelength band of visible light, the pixel value of the 3rd image can be output with good sensitivity, so the timing of the minimum can be detected more accurately and a more accurate ratio or difference can be calculated.
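The selection rule above amounts to a running per-region minimum. A minimal numpy sketch follows; the function name, the tuple-of-arrays frame format, and the epsilon guard are assumptions for illustration, not part of the embodiment.

```python
import numpy as np

# Sketch of the ratio calculation of unit 523: for every minute region
# (here, a pixel), keep the frame where the 3rd image's value is
# smallest (least surface reflection) and record the 1st/2nd band
# ratio at that moment.
def reflection_free_ratio(frames):
    """frames: iterable of (img1, img2, img3) arrays of identical shape."""
    best_v3 = None
    best_ratio = None
    eps = 1e-12  # guard against division by zero
    for img1, img2, img3 in frames:
        ratio = img1 / (img2 + eps)
        if best_v3 is None:
            best_v3, best_ratio = img3.copy(), ratio
            continue
        mask = img3 < best_v3  # regions darker than in any frame so far
        best_v3 = np.where(mask, img3, best_v3)
        best_ratio = np.where(mask, ratio, best_ratio)
    return best_ratio

# Two synthetic 1x2 frames: the 3rd-image minimum occurs in frame 2
# for the left pixel and in frame 1 for the right pixel.
frames = [
    (np.array([[10., 20.]]), np.array([[2., 4.]]), np.array([[5., 1.]])),
    (np.array([[6., 9.]]), np.array([[3., 3.]]), np.array([[2., 3.]])),
]
ratio_map = reflection_free_ratio(frames)
```

The same loop with `img1 - img2` in place of the division would yield the difference instead of the ratio.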

Fig. 26 is a flowchart showing the operation of the image processing apparatus 500.

First, the same scene including the water surface having waves is photographed by the imaging device 10. Then, the image acquisition unit 501 acquires a plurality of frame images of the same scene including the water surface having waves (step S30: image acquisition step).

Subsequently, the pixel value of the 3rd image in each minute region is acquired by the pixel value acquisition unit 531 (step S31: pixel value acquisition step). Then, the ratio or difference calculation unit 523 calculates the ratio or difference corresponding to each minute region at the time when the pixel value of that minute region becomes the minimum across the plurality of frame images (step S32: ratio or difference calculation step). The calculated ratio or difference is stored in the storage unit 507 (step S33: storage step). Then, the water quality data calculation unit 525 calculates water quality data from the ratio or difference stored in the storage unit 507 (step S34: water quality data calculation step), and the water quality distribution image generation unit 527 generates a water quality distribution image (step S35: water quality distribution image generation step).
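Steps S30 to S35 can be strung together roughly as follows. Note that the mapping from the band ratio to water quality data (a linear calibration `a * ratio + b`) is a placeholder assumption; the actual relation used by the water quality data calculation unit 525 is not specified in this passage.

```python
import numpy as np

# Hedged end-to-end sketch of the flow of Fig. 26 (steps S30-S35).
# The linear ratio-to-quality calibration is an assumption.
def water_quality_distribution(frames, a=1.0, b=0.0):
    # S30-S32: per minute region, take the 1st/2nd band ratio at the
    # frame where the 3rd image's pixel value is minimal.
    best_v3 = np.full(frames[0][2].shape, np.inf)
    best_ratio = np.zeros_like(best_v3)
    for img1, img2, img3 in frames:
        mask = img3 < best_v3
        best_v3 = np.where(mask, img3, best_v3)
        best_ratio = np.where(mask, img1 / (img2 + 1e-12), best_ratio)
    # S33: "storage" is simply keeping best_ratio in memory here.
    # S34: convert the stored ratio into water quality data.
    # S35: in this sketch, the quality map itself serves as the
    # water quality distribution image.
    return a * best_ratio + b

frames = [
    (np.array([[10., 20.]]), np.array([[2., 4.]]), np.array([[5., 1.]])),
    (np.array([[6., 9.]]), np.array([[3., 3.]]), np.array([[2., 3.]])),
]
quality_map = water_quality_distribution(frames, a=2.0, b=1.0)
```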

Fig. 27 is a diagram schematically showing an example of the frame structure of the main part of the imaging apparatus 10 used in embodiment 3.

The imaging apparatus 10 includes the photographing lens 12, the imaging element 16, and the image acquisition unit 501.

The photographing lens 12 is a single-lens-system imaging lens having different transmission wavelength characteristics for each region through which incident light passes. The photographing lens 12 includes 1 or more lenses 100a and a wavelength separation filter 100b that makes the transmission wavelength band different for each incidence region of the imaging lens.

The wavelength separation filter 100b is divided into regions for 3 wavelength bands. That is, it includes a C filter 100B-C that transmits light of the 3rd wavelength band, a B filter 100B-B that transmits light of the 2nd wavelength band, and an A filter 100B-A that transmits light of the 1st wavelength band.

Here, the A filter 100B-A is disposed in correspondence with the pupil area 122a of the exit pupil 120 of the lens 100a, the B filter 100B-B is disposed in correspondence with the pupil area 122b, and the C filter 100B-C is disposed in correspondence with the pupil area 122c. Therefore, of the object light passing through the photographing lens 12, the light passing through the pupil area 122a of the exit pupil 120 becomes light of the 1st wavelength band, the light passing through the pupil area 122b becomes light of the 2nd wavelength band, and the light passing through the pupil area 122c becomes light of the 3rd wavelength band.

The wavelength separation filter 100b is disposed near the pupil plane of the lens 100a, at the rear stage of the lens 100a on the optical path of the subject light, but it may be disposed at an optically equivalent position. It is sufficient that the photographing lens 12 gives different transmission wavelength characteristics to the corresponding optical paths through the lens system as a whole; the difference in transmission wavelength characteristics need not be provided by a specific optical surface of a specific filter. Further, the wavelength separation filter 100b may have a lens effect.

The subject light having passed through the photographing lens 12 enters the image pickup device 16. The image pickup element 16 separates and receives the light passing through the pupil area 122a, the light passing through the pupil area 122b, and the light passing through the pupil area 122c of the exit pupil 120 of the photographing lens 12. The image pickup device 16 supplies signals based on the separately received light to the image acquisition unit 501 as image signals, and the image acquisition unit 501 generates images of the different wavelength bands from these image signals.

The image pickup element 16 has a plurality of microlenses 152. The microlenses 152 are arranged according to a predetermined rule in a direction perpendicular to the optical axis. A corresponding light receiving element group 161 is disposed for each microlens 152, and each light receiving element group 161 is composed of a plurality of light receiving elements 162.

The plurality of light receiving elements 162 may form a MOS type image pickup element, or may form another solid-state image sensor such as a CCD image sensor.

Fig. 28(a) is a schematic view of the light receiving element group 161 corresponding to the microlens 152 when viewed from the optical axis direction. As shown in the drawing, in the present embodiment, a light receiving element group 161 in which 9 light receiving elements 162-1a, 162-1b, 162-1c, 162-2a, 162-2b, 162-2c, 162-3a, 162-3b, and 162-3c are arranged in 3 rows and 3 columns is provided corresponding to 1 microlens 152.

As shown in fig. 28(b), a light receiving element group 1161 in which rectangular light receiving elements 1162-1, 1162-2, and 1162-3 are arranged may correspond to the microlens 152, and as shown in fig. 28(c), a light receiving element group 2161 in which light receiving elements 2162-1, 2162-2, and 2162-3 are arranged may correspond to an elongated microlens.

Fig. 29 is a sectional view taken along the broken line A-A of fig. 28(a). As shown in the figure, the light passing through the pupil area 122a of the exit pupil 120 of the photographing lens 12 is received by the light receiving element 162-1a through the microlens 152, the light passing through the pupil area 122b is received by the light receiving element 162-2a through the microlens 152, and the light passing through the pupil area 122c is received by the light receiving element 162-3a through the microlens 152.

Note that reference numeral 262 shown in the figure denotes a light shielding portion provided to prevent interference between adjacent pixels.

As described above, the light passing through the pupil area 122a is the 1 st wavelength band light, the light passing through the pupil area 122b is the 2 nd wavelength band light, and the light passing through the pupil area 122c is the 3 rd wavelength band light. Therefore, the light receiving elements 162-1a, 162-1b, and 162-1c receive the light of the 1 st wavelength band, the light receiving elements 162-2a, 162-2b, and 162-2c receive the light of the 2 nd wavelength band, and the light receiving elements 162-3a, 162-3b, and 162-3c receive the light of the 3 rd wavelength band.

In this way, the microlens places the pupil of the photographing lens 12 and the plurality of light receiving elements 162 in an imaging (conjugate) relationship, so that the light received by each light receiving element 162 is limited to the light passing through the pupil area 122 set in advance in the exit pupil 120 of the photographing lens 12.

Each light receiving element 162 of the light receiving element group 161 outputs an image pickup signal having an intensity corresponding to the amount of received light to the image acquisition unit 501. The image acquisition unit 501 generates an image of an object from the image pickup signals of the plurality of light receiving elements 162. Specifically, the image acquisition unit 501 generates image signals representing images of different wavelength bands from the image pickup signals supplied from the light receiving element group 161.

In this example, the image acquisition unit 501 generates an image of the 1st wavelength band (1st image) from the image pickup signals of the light receiving elements 162-1a, 162-1b, and 162-1c that receive the light passing through the pupil area 122a. Then, an image of the 2nd wavelength band (2nd image) is generated from the image pickup signals of the light receiving elements 162-2a, 162-2b, and 162-2c that receive the light passing through the pupil area 122b. Similarly, an image of the 3rd wavelength band (3rd image) is generated from the image pickup signals of the light receiving elements 162-3a, 162-3b, and 162-3c that receive the light passing through the pupil area 122c.
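Assuming the 9 light receiving elements of each group in fig. 28(a) are read out as a contiguous 3x3 tile of the raw array, the three band images could be separated as below. The tiling, and the averaging of the 3 same-band elements of each row into one pixel, are assumptions for illustration.

```python
import numpy as np

# Sketch of how the image acquisition unit 501 could form the 1st,
# 2nd, and 3rd images from 3x3 element groups: row i of each group
# carries band i (elements 162-1x, 162-2x, 162-3x in the text). The
# raw tiling and per-row averaging are illustrative assumptions.
def split_three_bands(raw: np.ndarray):
    h, w = raw.shape
    assert h % 3 == 0 and w % 3 == 0, "raw must tile into 3x3 groups"
    groups = raw.reshape(h // 3, 3, w // 3, 3)
    image_1 = groups[:, 0, :, :].mean(axis=-1)  # pupil area 122a -> 1st band
    image_2 = groups[:, 1, :, :].mean(axis=-1)  # pupil area 122b -> 2nd band
    image_3 = groups[:, 2, :, :].mean(axis=-1)  # pupil area 122c -> 3rd band
    return image_1, image_2, image_3

# Demonstration on a 6x6 raw array (a 2x2 grid of 3x3 groups).
raw = np.arange(36, dtype=np.float64).reshape(6, 6)
i1, i2, i3 = split_three_bands(raw)
```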

In this embodiment, an example is shown in which the microlens makes the light passing through 3 regions of the exit pupil enter 3 light receiving elements arranged in the vertical direction. The 3 regions of the exit pupil toward which the microlenses direct light correspond to the regions of the imaging optical system having different transmission wavelengths. Thus, images of 3 different wavelength bands can be obtained simultaneously, independently, and in parallel.

Fig. 30 is a diagram schematically showing an example of the light receiving unit. The image pickup device 16 of the present embodiment includes 1 light receiving element corresponding to 1 microlens. In the example of fig. 30, the light receiving element 162a is arranged corresponding to the microlens 952a, the light receiving element 162b corresponding to the microlens 952b, and the light receiving element 162c corresponding to the microlens 952c.

Light passing through substantially the entire surface of the exit pupil 120 enters each microlens 952. The microlenses 952 have such a refractive power that each light receiving element 162 receives light passing through a partial region of the exit pupil 120. Therefore, the size of the light beam that can be received by the light receiving element 162 is limited to the size of a range that passes through a part of the exit pupil 120.

In the image pickup device 16 of the present embodiment, the optical axis of the microlens 952 is provided so as to be offset from the center position of the light receiving element 162 in a plane perpendicular to the optical axis of the photographing lens 12. Here, the center position of the light receiving element 162 is set as the center position of a region through which light that is received by the light receiving element 162 and can be used for photoelectric conversion passes. The center position of the light receiving element 162 may be the center of the light receiving opening formed in the light shielding portion 262 located near the light receiving element 162.

The amount of bias of each microlens 952 is designed so that the corresponding light receiving element 162 receives the light passing through the pupil area 122 set in advance. The light beam that the light receiving element 162 can receive is limited, by the refractive power and the bias of the microlens 952, to a light beam passing through a partial region of the exit pupil 120.

In the present embodiment, the microlens 952a limits the light that the light receiving element 162a can receive through its light receiving opening to the light passing through the pupil region 122a. Similarly, the microlenses 952b and 952c restrict the light that can be received by the corresponding light receiving elements 162b and 162c through their light receiving openings to the light passing through the pupil regions 122b and 122c, respectively.

Therefore, the light receiving element 162a receives the light of the 1 st wavelength band, the light receiving element 162b receives the light of the 2 nd wavelength band, and the light receiving element 162c receives the light of the 3 rd wavelength band.

In this way, the optical axes of the plurality of microlenses 952 are set so as to be offset with respect to the light receiving openings of the light receiving elements 162, so that each corresponding light receiving element 162 receives the object light having passed through its predetermined pupil region 122. As a result, each light receiving element 162 receives light of a different transmission wavelength band, and the image acquisition unit 501 can obtain the 1st image, the 2nd image, and the 3rd image from the image pickup signals of the light receiving elements 162.
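The bias described above follows from simple similar-triangle geometry between the exit pupil and the sensor plane. The following sketch illustrates this; the function name, the thin-geometry model, and the example distances are all assumptions, not values from the embodiment.

```python
# Back-of-envelope sketch of the microlens bias: to make a light
# receiving element see only one pupil region, shift the microlens
# axis so the chief ray from that region's centre lands on the element
# centre. Similar triangles give
#   offset / element_distance = region_offset / pupil_distance.
# All quantities must share one length unit.
def microlens_offset(region_offset: float, pupil_distance: float,
                     element_distance: float) -> float:
    """Lateral bias of the microlens optical axis (same unit as inputs)."""
    return region_offset * element_distance / pupil_distance

# e.g. a pupil region centre 2 mm off axis, exit pupil 100 mm away,
# microlens-to-element gap 0.01 mm (hypothetical numbers):
bias = microlens_offset(2.0, 100.0, 0.01)
```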

While the present invention has been described with reference to the above examples, the present invention is not limited to the above embodiments, and various modifications may be made without departing from the spirit of the present invention.

Description of the symbols

1-flash, 2-shutter button, 3-power/mode switch, 4-mode dial, 5-zoom button, 6-cross button, 7-menu/ok button, 8-play button, 9-return button, 10-camera, 11-tripod, 12-camera lens, 13-polarizing filter, 14-iris, 15-mechanical shutter, 16-camera element, 22-image input controller, 24-image processing section, 26-compression/expansion processing section, 28-video encoder, 30-liquid crystal display, 38-operation section, 40-CPU, 47-ROM, 48-memory, 50-VRAM, 52-media controller, 54-memory card, 500-image processing apparatus, 501-image acquisition section, 503-luminance value calculation section, 505-pixel value extraction section, 507-storage section, 509-synthetic image generation section, 521-pixel value calculation section, 523-ratio or difference calculation section, 525-water quality data calculation section, 527-water quality distribution image generation section, 531-pixel value acquisition section, steps S10-S16-processing steps of image processing method, steps S20-S25-processing steps of image processing method, steps S30-S35-processing steps of image processing method.
