Interchangeable lens, information processing device, information processing method, and program

Document No.: 261221    Publication date: 2021-11-16

This technology, "Interchangeable lens, information processing device, information processing method, and program", was devised by 早坂健吾, 栉田英功, and 井藤功久 on 2020-04-14. Its main content is as follows: The present technology relates to an interchangeable lens, an information processing apparatus, an information processing method, and a program capable of executing appropriate processing. A multi-eye interchangeable lens has a lens barrel, a movable portion, a plurality of monocular lenses, and one or more light sources. The movable portion is configured to be movable along an optical axis with respect to the lens barrel. The plurality of monocular lenses are movable integrally with the movable portion, and are arranged such that emission positions of imaging light emitted through the respective monocular lenses do not overlap each other. The one or more light sources are configured to be movable integrally with the plurality of monocular lenses and the movable portion along the optical axis, and are arranged such that an emission position of irradiation light that irradiates an image sensor provided in the camera body does not overlap with the emission positions of the imaging light from the respective monocular lenses. The present technology is applicable to, for example, a camera system in which an interchangeable lens or a plurality of monocular lenses are fed out.

1. An interchangeable lens, comprising:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses.

2. The interchangeable lens of claim 1, wherein

the one or more light sources emit non-parallel light.

3. The interchangeable lens of claim 2, wherein

the image sensor is located between a condensing point at which the non-parallel light is condensed in a case where the movable unit is fed out to a minimum feeding state of a minimum extent, and a condensing point at which the non-parallel light is condensed in a case where the movable unit is fed out to a maximum feeding state of a maximum extent.

4. The interchangeable lens of claim 2, wherein

in a case where the movable unit is fed out to the maximum feeding state of the maximum extent, the condensing point at which the non-parallel light is condensed is located on one of the front side and the depth side, including the position of the image sensor.

5. The interchangeable lens of claim 2, wherein

the light source is arranged at a position different from the optical axis center of the movable unit, and emits the non-parallel light in an oblique direction inclined toward the optical axis center.

6. The interchangeable lens of claim 1, comprising:

a plurality of the light sources.

7. The interchangeable lens of claim 1, further comprising:

a storage unit configured to store spot light position information indicating a position, on the image sensor, of the irradiation light emitted from the light source, and monocular image position information indicating an emission position, on the image sensor, of the imaging light emitted from each of the plurality of monocular lenses.

8. An information processing apparatus comprising:

a detection unit configured to detect a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

a processing unit configured to perform processing according to a detection result of the detection unit.

9. The information processing apparatus according to claim 8, wherein

the detection unit detects a size of the light image in the captured image.

10. The information processing apparatus according to claim 9, wherein

the processing unit detects a feed amount of the movable unit according to the size of the light image.

11. The information processing apparatus according to claim 8, wherein

the detection unit detects a detected light image position, which is a position of the light image in the captured image.

12. The information processing apparatus according to claim 11, wherein

the processing unit detects the feed amount of the movable unit on the basis of the detected light image position.

13. The information processing apparatus according to claim 11, wherein

the processing unit specifies, on the basis of the detected light image position, an imaged monocular image position, which is a position, in the captured image, of a monocular image having the position of the monocular lens as a viewpoint.

14. The information processing apparatus according to claim 13, further comprising:

a storage unit configured to store a stored light image position indicating a position, on the image sensor, of the irradiation light emitted from the light source, and a stored monocular image position indicating an emission position, on the image sensor, of the imaging light emitted from each of the plurality of monocular lenses, wherein

the processing unit specifies the imaged monocular image position on the basis of a relationship between the stored light image position and the detected light image position.

15. The information processing apparatus according to claim 14, wherein

the processing unit specifies the imaged monocular image position by correcting the stored monocular image position on the basis of the relationship between the stored light image position and the detected light image position.

16. The information processing apparatus according to claim 13, further comprising:

an association unit configured to associate the captured image with the imaged monocular image position.

17. The information processing apparatus according to claim 14, further comprising:

an association unit configured to associate the stored light image position, the detected light image position, and the stored monocular image position with the captured image.

18. The information processing apparatus according to claim 13, further comprising:

an association unit configured to associate the stored light image position, a difference between the stored light image position and the detected light image position, and the stored monocular image position with the captured image.

19. An information processing method comprising:

a detection step of detecting a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

a processing step of performing processing according to a detection result of the detection step.

20. A program for causing a computer to function as:

a detection unit configured to detect a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

a processing unit configured to perform processing according to a detection result of the detection unit.

Technical Field

The present technology relates to an interchangeable lens, an information processing apparatus, an information processing method, and a program, and particularly relates to, for example, an interchangeable lens, an information processing apparatus, an information processing method, and a program for enabling appropriate processing.

Background

A technique for improving the convenience of a service that uses a multi-viewpoint image including a plurality of images having viewpoints different from each other has been proposed (for example, see Patent Document 1).

CITATION LIST

Patent document

Patent Document 1: International Publication No. 2015/037472

Disclosure of Invention

Problems to be solved by the invention

For example, a multi-viewpoint image can be captured by a camera system including a plurality of monocular lenses, that is, lenses arranged so as not to overlap each other when viewed in the optical axis direction.

However, in such a camera system, in a case where the plurality of monocular lenses are fed out in the optical axis direction to adjust the focus or the like, the region of each monocular image on the captured image captured by the image sensor, that is, the region corresponding to the image formed by the light flux condensed by the monocular lens, may change before and after the feeding out.

When the area of the monocular image changes, it may be difficult to appropriately perform processing in the camera system.

The present technology is proposed in view of the above circumstances, and enables appropriate processing.

Solution to the problem

The interchangeable lens of the present technology is an interchangeable lens including: a lens barrel; a movable unit configured to be movable along an optical axis with respect to the lens barrel; a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses.

In the interchangeable lens of the present technology, the movable unit is configured to be movable along the optical axis with respect to the lens barrel, the plurality of monocular lenses are configured to be movable integrally with the movable unit, and emission positions of the imaging light emitted by the respective monocular lenses are arranged so as not to overlap each other. One or more light sources are configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and are arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses.

An information processing apparatus or a program of the present technology is an information processing apparatus including: a detection unit configured to detect a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including: a lens barrel; a movable unit configured to be movable along an optical axis with respect to the lens barrel; a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and a processing unit configured to perform processing according to a detection result of the detection unit, or a program for causing a computer to function as such an information processing apparatus.

An information processing method of the present technology is an information processing method including: a detection step of detecting a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including: a lens barrel; a movable unit configured to be movable along an optical axis with respect to the lens barrel; a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and a processing step of performing processing according to a detection result of the detecting step.

In the information processing apparatus, the information processing method, and the program of the present technology, a light image on a captured image captured by an image sensor is detected, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including: a lens barrel; a movable unit configured to be movable along an optical axis with respect to the lens barrel; a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged so that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and processing is performed according to a result of the detection.

Note that the information processing apparatus may be a stand-alone apparatus, or may be an internal block configuring one apparatus.

Further, the program may be provided by being transmitted via a transmission medium or by being recorded on a recording medium.

Drawings

Fig. 1 is a perspective view showing a configuration example of an embodiment of a camera system to which the present technology is applied.

Fig. 2 is a block diagram showing an example of the electrical configuration of the camera system 1.

Fig. 3 is a diagram for describing an outline of imaging of a captured image performed using the multi-eye interchangeable lens 20.

Fig. 4 is a view showing an arrangement example of the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R in the multi-eye interchangeable lens 20, and a view of a captured image captured using the multi-eye interchangeable lens 20.

Fig. 5 is a diagram for describing an attachment error when the multi-eye interchangeable lens 20 is attached (mounted) to the camera body 10.

Fig. 6 is a diagram for describing a calculation method of obtaining the relative optical axis center positions (dx1', dy1') to (dx4', dy4') as the mounting error reflection position information.

Fig. 7 is a block diagram showing a configuration example of the image processing unit 53.

Fig. 8 is a diagram for describing calibration performed by the camera system 1.

Fig. 9 is a diagram for describing generation of calibration data of a plurality of feed amounts corresponding to a plurality of reference focus positions.

Fig. 10 is a diagram for describing general imaging performed by the camera system 1.

Fig. 11 is a diagram for describing generation of calibration data for an imaging feed amount corresponding to the imaging focus position P4 by interpolation.

Fig. 12 is a sectional view showing a configuration example of the light sources 32L and 32R.

Fig. 13 is a sectional view showing a configuration example of the multi-eye interchangeable lens 20.

Fig. 14 is a diagram for describing a first detection method of detecting the feed amount of the feed unit 23.

Fig. 15 is a diagram showing an example of a change in the dot size of non-parallel light as dot light.

Fig. 16 is a flowchart for describing an example of processing of detecting the feed amount by the first detection method.

Fig. 17 is a diagram for describing a second detection method of detecting the feed amount of the feed unit 23.

Fig. 18 is a flowchart for describing an example of processing of detecting the feed amount by the second detection method.

Fig. 19 is a diagram for describing a third detection method of detecting the feed amount of the feed unit 23.

Fig. 20 is a diagram for describing a state in which a condensing point at which non-parallel light as spot light is condensed is located on one of the front side and the depth side including the image sensor 51 when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state.

Fig. 21 is a diagram for describing a fourth detection method of detecting the feed amount of the feed unit 23.

Fig. 22 is a diagram showing the irradiation position of the spot light in the case where the feeding unit 23 is in the minimum feeding state and the irradiation position of the spot light in the case where the feeding unit 23 is in the maximum feeding state.

Fig. 23 is a diagram showing an example of a captured image in which the spot light images PL' and PR' appear in a case where the feeding unit 23 is in the minimum feeding state, and a captured image in which the spot light images PL'' and PR'' appear in a case where the feeding unit 23 is in the maximum feeding state.

Fig. 24 is a flowchart for describing an example of processing of detecting the feed amount by the fourth detection method.

Fig. 25 is a diagram showing another configuration example of the multi-eye interchangeable lens 20.

Fig. 26 is a sectional view showing another configuration example of the light sources 32L and 32R.

Fig. 27 is a diagram showing a state in which the position of the spot light image changes with the lens tilt.

Fig. 28 is a block diagram showing another electrical configuration example of the camera system 1.

Fig. 29 is a block diagram showing a configuration example of a post-processing device that performs post-processing on associated information.

Fig. 30 is a block diagram showing another configuration example of a post-processing device that performs post-processing on associated information.

Fig. 31 is a block diagram showing an example of an electrical configuration of a first alternative embodiment of the camera system to which the present technology is applied.

Fig. 32 is a block diagram showing an example of an electrical configuration of a second alternative embodiment of the camera system to which the present technology is applied.

Fig. 33 is a block diagram showing a configuration example of an embodiment of a computer to which the present technology is applied.

Detailed Description

< embodiment of Camera System to which the present technology is applied >

Fig. 1 is a perspective view showing a configuration example of an embodiment of a camera system (imaging apparatus) to which the present technology is applied.

The camera system 1 includes a camera body 10 and a multi-eye interchangeable lens 20 (lens unit).

The multi-eye interchangeable lens 20 is attachable to and detachable from the camera body 10. That is, the camera body 10 includes the camera mount 11, and (the lens mount 22 of) the multi-eye interchangeable lens 20 is fixed (attached) to the camera mount 11 so that the multi-eye interchangeable lens 20 is mounted on the camera body 10. Note that general interchangeable lenses other than the multi-eye interchangeable lens 20 may be attached to and detached from the camera body 10.

The camera body 10 includes an image sensor 51. The image sensor 51 is, for example, a Complementary Metal Oxide Semiconductor (CMOS) image sensor, and captures images by receiving light beams condensed by the multi-eye interchangeable lens 20 or another interchangeable lens mounted on (the camera mount 11 of) the camera body 10 and performing photoelectric conversion. Hereinafter, an image obtained by imaging by the image sensor 51 is also referred to as a captured image.

The multi-eye interchangeable lens 20 includes a lens barrel 21, a lens mount 22, and a feeding unit 23. The feeding unit 23 is a movable unit configured to be movable along the optical axis of the lens barrel 21 relative to the lens barrel 21. The feeding unit 23 includes five monocular lenses 31₀, 31₁, 31₂, 31₃, and 31₄ as a plurality of monocular lenses. The plurality of monocular lenses 31ᵢ are configured to be movable integrally with the feeding unit 23, and are arranged such that the emission positions of the imaging light emitted through the respective monocular lenses 31ᵢ do not overlap each other. Further, the feeding unit 23 includes light sources 32L and 32R. The light sources 32L and 32R are configured to be movable along the optical axis of the lens barrel 21 integrally with the feeding unit 23 and the plurality of monocular lenses 31ᵢ, and are arranged such that the emission position of the irradiation light emitted to the image sensor 51 provided in the camera body 10 does not overlap with the emission position of the imaging light of each of the plurality of monocular lenses 31ᵢ.

The lens barrel 21 has a substantially cylindrical shape, and a lens mount 22 is formed on one bottom surface side of the cylindrical shape.

When the multi-eye interchangeable lens 20 is mounted on the camera body 10, the lens mount 22 is fixed (attached) to the camera mount 11 of the camera body 10.

The feeding unit 23 has a substantially cylindrical shape, and is accommodated in the cylindrical barrel 21.

The feeding unit 23 is provided with five lenses, namely, the monocular lenses 31₀, 31₁, 31₂, 31₃, and 31₄, as a plurality of monocular lenses, which are arranged so as not to overlap one another when viewed in the direction of the optical axis of the entire lens barrel 21 (the barrel optical axis). In Fig. 1, the five monocular lenses 31₀ to 31₄ are provided in the feeding unit 23 in such a form that, on a two-dimensional plane orthogonal to the barrel optical axis (parallel to the light receiving surface (imaging surface) of the image sensor 51), the other four monocular lenses 31₁ to 31₄ form the vertices of a square centered (as the center of gravity) on the monocular lens 31₀.

When the multi-eye interchangeable lens 20 is mounted on the camera body 10, the monocular lenses 31₀ to 31₄ condense the light beams from the subject onto the image sensor 51 of the camera body 10.

Note that, here, the camera body 10 is a so-called single-plate camera including one image sensor 51. However, as the camera body 10, a camera including a plurality of image sensors, for example, a so-called three-plate camera including three image sensors for red, green, and blue (RGB), may be employed. In a three-plate camera, the light beams emitted from the monocular lenses 31₀ to 31₄ are focused on the three image sensors, respectively, using an optical system such as a prism. Note that the number of image sensors is not limited to three; for example, a two-plate camera may also be employed. Further, the image sensors are not limited to image sensors for RGB. All of the image sensors may be monochrome, or may include color filters such as a Bayer array.

In addition to the five monocular lenses 31₀ to 31₄, the feeding unit 23 is provided with two light sources 32L and 32R as a plurality of light sources. When the multi-eye interchangeable lens 20 is viewed from the front, the light sources 32L and 32R are provided at the right end and the left end of the feeding unit 23, respectively.

The light sources 32L and 32R are configured by, for example, light emitting diodes (LEDs), lasers, or the like, and emit spot light from the front side (the side on which the light beam is incident) of the multi-eye interchangeable lens 20 toward the rear side.

Therefore, in a case where the multi-eye interchangeable lens 20 is mounted on the camera body 10, the spot light emitted by the light sources 32L and 32R is received by the image sensor 51 of the camera body 10.

As described above, the feeding unit 23 is provided with the light sources 32L and 32R and the monocular lenses 31₀ to 31₄.

The feeding unit 23 is configured to be movable (slidable) in the optical axis direction of the barrel optical axis within the cylindrical barrel 21 so as to be sent out to the front side (retracted to the rear side) within the barrel 21.

Accordingly, the multi-eye interchangeable lens 20 is configured such that the light sources 32L and 32R and the monocular lenses 31₀ to 31₄ provided in the feeding unit 23 are fed out integrally.

As described above, since the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R are fed out integrally, the camera system 1 can perform appropriate processing.

That is, a spot light image, which is an image of the spot light emitted by the light sources 32L and 32R, appears in the captured image captured by the image sensor 51, and the attachment error of the multi-eye interchangeable lens 20 can be obtained using the spot light image, as described below.

In the multi-eye interchangeable lens 20, the monocular lenses 31₀ to 31₄ are provided in the feeding unit 23 and are thus fed out together with the feeding unit 23, so that, for example, focus adjustment for telephoto shooting or macro shooting can be performed.

In this case, if the light sources 32L and 32R were provided in a portion of the multi-eye interchangeable lens 20 other than the feeding unit 23, the light sources 32L and 32R would not be fed out even when the monocular lenses 31₀ to 31₄ are fed out. With the spot light images of the spot light emitted by such light sources 32L and 32R, it would then be difficult to accurately obtain the attachment error that changes with the feeding out of the monocular lenses 31₀ to 31₄.

In contrast, since the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R are fed out integrally, the attachment error that changes with the feeding out of the monocular lenses 31₀ to 31₄ can be accurately obtained using the spot light images.

Furthermore, even in a case where the regions of the monocular images, which correspond to the images formed by the light beams condensed by the monocular lenses 31₀ to 31₄, change with the feeding out of the monocular lenses 31₀ to 31₄ at the time of imaging, appropriate processing of accurately specifying each region of the monocular images can be executed.

Further, calibration data for suppressing the influence of the lens distortion of the monocular lenses 31₀ to 31₄, which changes with the feeding out, can be obtained, and appropriate processing of obtaining parallax information in which the influence of the lens distortion is suppressed can be executed using such calibration data.

Note that, in Fig. 1, the multi-eye interchangeable lens 20 is provided with the five monocular lenses 31₀ to 31₄. However, the number of monocular lenses provided in the multi-eye interchangeable lens 20 is not limited to five, and any number such as two, three, six, or more may be employed.

Further, the plurality of monocular lenses provided in the multi-eye interchangeable lens 20 may be arranged at any positions on the two-dimensional plane, and are not limited to the center and vertex positions of a square.

Further, as the plurality of monocular lenses provided in the multi-eye interchangeable lens 20, a plurality of lenses having different focal lengths, f-numbers, or other specifications may be employed. Note that here, in order to simplify the description, a plurality of lenses having the same specification are employed.

Further, in fig. 1, two light sources 32L and 32R are provided in the multi-eye interchangeable lens 20. However, the number of light sources provided in the multi-eye interchangeable lens 20 is not limited to two, and any number such as one or three or more may be employed as necessary.

Further, in a case where the two light sources 32L and 32R are provided as the plurality of light sources in the multi-eye interchangeable lens 20, the two light sources 32L and 32R can be arranged, for example, on a line connecting the two farthest points on the plane on which the five monocular lenses 31₀ to 31₄ are arranged, that is, on the circle of the substantially cylindrical feeding unit 23 of the multi-eye interchangeable lens 20 when viewed from the front in Fig. 1. In this case, when the feeding unit 23 is viewed from the front, the light sources 32L and 32R are arranged on a line passing through the center of the circle. As described below, it is desirable that the light sources 32L and 32R be arranged as far apart as possible, and by arranging them on a line passing through the center of the circle when the feeding unit 23 is viewed from the front, the light sources 32L and 32R can be separated from each other to the greatest extent.

In the multi-eye interchangeable lens 20, the five monocular lenses 31₀ to 31₄ as the plurality of monocular lenses are arranged such that, when the multi-eye interchangeable lens 20 is mounted on the camera body 10, the optical axis (monocular optical axis) of each monocular lens 31ᵢ is orthogonal to the light receiving surface of the image sensor 51.

In the camera system 1 in which the multi-eye interchangeable lens 20 is mounted on the camera body 10, the image sensor 51 captures an image corresponding to the images formed on the light receiving surface of the image sensor 51 by the light fluxes condensed by the respective five monocular lenses 31₀ to 31₄.

Now, an image corresponding to an image formed by the light beams condensed by the monocular lens 31ᵢ (where i = 0, 1, 2, 3, 4) is referred to as a monocular image; a captured image captured by the single image sensor 51 includes the five monocular images respectively corresponding to the monocular lenses 31₀ to 31₄ (the images corresponding to the images formed by the light beams condensed by the respective monocular lenses 31₀ to 31₄).

The monocular image for the monocular lens 31ᵢ is an image whose viewpoint is the position of the monocular lens 31ᵢ, and thus the monocular images for the monocular lenses 31₀ to 31₄ are images of different viewpoints.

Further, the captured image includes a spot light image (image formed by spot light) which is an image corresponding to the spot light emitted by the two light sources 32L and 32R, respectively.

Here, the camera system 1 in fig. 1 includes a camera body 10 and a multi-eye interchangeable lens 20 attachable to and detachable from the camera body 10, but the present technology can also be applied to a so-called lens-integrated camera system in which the multi-eye interchangeable lens 20 is fixed to the camera body 10. That is, the present technology is applicable to, for example, a lens-integrated camera.

Further, one monocular lens 31ᵢ may be configured not by a single lens but by a plurality of lenses arranged in the optical axis direction of the barrel optical axis.

Further, some or all of the processes of the area specifying unit 52, the image processing unit 53, the position calculation unit 57, the spot light image detection unit 62, and the feed amount detection unit 64 of the camera body 10, which will be described below, may be performed by, for example, a cloud server, a reproduction-dedicated device, or the like, in addition to the camera body 10.

Further, in addition to the adjustment of the focus by the feeding of the feeding unit 23 of the multi-eye interchangeable lens 20, the zoom magnification may be adjusted. Hereinafter, for the sake of simplifying the description, it is assumed that the focus is adjusted by the feeding of the feeding unit 23.

Note that, with respect to the camera body 10, the surface of the side on which the multi-eye interchangeable lens 20 is mounted, that is, the surface on which the camera mount 11 is located, is defined as the front surface.

< example of Electrical configuration of Camera System 1>

Fig. 2 is a block diagram showing an electrical configuration example of the camera system 1 of fig. 1.

In the camera system 1, the multi-eye interchangeable lens 20 includes a storage unit 41, a communication unit 42, and a control unit 43.

The storage unit 41 stores lens information as information on the multi-eye interchangeable lens 20. The lens information includes individual difference reflection position information (known reference position).

The individual difference reflection position information is position information on a position, corresponding to a predetermined light beam, on the monocular image for each monocular lens 31ᵢ appearing on a known captured image in which a predetermined subject located at a known distance appears, the known captured image being captured by the image sensor 51, for example, in a state where the multi-eye interchangeable lens 20 is mounted on the camera body 10. The individual difference reflection position information is, so to speak, position information on the incident position of the predetermined light beam on the image sensor 51, which deviates from the design position by a different amount for each individual of the multi-eye interchangeable lens 20 due to a manufacturing error (manufacturing variation) during the manufacture of the multi-eye interchangeable lens 20, and which includes the manufacturing error that differs for each individual (the deviation, caused by the manufacturing error, of the emission position of the imaging light emitted from each monocular lens 31ᵢ). As the individual difference reflection position information, the position itself, corresponding to the predetermined light beam, on the monocular image for each monocular lens 31ᵢ on the known captured image in which the predetermined subject located at the known distance appears may be employed, the known captured image being captured by the image sensor 51 when the multi-eye interchangeable lens 20 is mounted on the camera body 10.

Here, the position of the image formed by the light beam passing along the monocular optical axis of the monocular lens 31ᵢ is referred to as the optical axis center position. Note that the monocular optical axis is designed to be parallel to the optical axis of the entire lens barrel 21 (the barrel optical axis) and to be arranged at a fixed distance from it, but deviations from this arrangement occur.

Now, assume that, for example, the light beam passing along the monocular optical axis of the monocular lens 31ᵢ is adopted as the predetermined light beam. In this case, the individual difference reflection position information of the monocular image for the monocular lens 31ᵢ is the optical axis center position of that monocular image.

Note that the predetermined light beam is not limited to a light beam passing along the monocular optical axis of the monocular lens 31ᵢ. That is, as the predetermined light beam, for example, a light beam passing through the monocular lens 31ᵢ at a position separated from the monocular optical axis by a predetermined distance and parallel to the monocular optical axis may be adopted.

In addition to the individual difference reflection position information of the monocular image for each monocular lens 31ᵢ on the known captured image, the lens information includes individual difference spot light position information (known light positions) regarding the positions of the spot light images of the spot light of the light sources 32L and 32R on the known captured image. As the individual difference spot light position information, the positions themselves of the spot light images of the spot light of the light sources 32L and 32R on the known captured image may be employed. The individual difference spot light position information is, similarly to the individual difference reflection position information, position information including the manufacturing error that differs for each individual during the manufacture of the multi-eye interchangeable lens 20.

Here, a unique lens identification (ID) may be assigned to the multi-eye interchangeable lens 20, and the lens ID of the multi-eye interchangeable lens 20 may be adopted as the lens information to be stored in the storage unit 41. Further, in this case, a database may be prepared in which the lens ID serving as the lens information is associated with the individual difference reflection position information, the individual difference spot light position information, and other lens information of the multi-eye interchangeable lens 20 specified by the lens ID. In this case, the individual difference reflection position information, the individual difference spot light position information, and the like of the multi-eye interchangeable lens 20 associated with a lens ID can be acquired by searching the database using the lens ID as a key.
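
As a rough sketch of such a database lookup, assuming a simple in-memory mapping keyed by lens ID (the ID string, field names, and coordinate values below are hypothetical placeholders, not values from the present technology):

```python
# Hypothetical in-memory database associating a lens ID with the lens information
# (individual difference reflection position information and individual difference
# spot light position information) of the individual specified by that lens ID.
# All IDs and coordinates below are placeholder values for illustration only.
LENS_DATABASE = {
    "ML20-000123": {
        "individual_difference_positions": [      # optical axis center positions (x, y)
            (1024.0, 768.0), (1536.0, 256.0), (512.0, 256.0),
            (512.0, 1280.0), (1536.0, 1280.0),
        ],
        "individual_difference_spot_positions": [  # spot light image positions PL, PR
            (96.0, 768.0), (1952.0, 768.0),
        ],
    },
}

def lookup_lens_information(lens_id: str) -> dict:
    """Search the database using the lens ID as a key and return the lens information."""
    return LENS_DATABASE[lens_id]
```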

The communication unit 42 performs wired or wireless communication with a communication unit 56 of the camera body 10, and the communication unit 56 will be described below. Note that the communication unit 42 may also communicate with a server on the internet, a Personal Computer (PC) on a wired or wireless Local Area Network (LAN), or other external devices by any communication method as necessary.

The communication unit 42 communicates with the communication unit 56 of the camera body 10 to transmit the lens information stored in the storage unit 41 to the communication unit 56 in, for example, the following cases: the multi-eye interchangeable lens 20 is mounted on the camera body 10 or the power is turned on in a state where the multi-eye interchangeable lens 20 is mounted on the camera body 10.

Further, the communication unit 42 receives a command and other information transmitted from the communication unit 56, and supplies the command and other information to the control unit 43. The control unit 43 controls the multi-eye interchangeable lens 20, such as focus adjustment, by feeding (moving) the feeding unit 23 according to information from the communication unit 42.

The camera body 10 includes an image sensor 51, an area specifying unit 52, an image processing unit 53, a display unit 54, a storage unit 55, a communication unit 56, a position calculation unit 57, a control unit 61, a spot light image detection unit 62, a feed amount information storage unit 63, and a feed amount detection unit 64.

The image sensor 51 is, for example, a CMOS image sensor as described with reference to Fig. 1, and the light receiving surface of the image sensor 51 is irradiated with the light beams condensed by the monocular lenses 31₀ to 31₄ of the multi-eye interchangeable lens 20 mounted on the camera body 10 and with the light beams serving as the spot light emitted by the light sources 32L and 32R.

The image sensor 51 receives the light beams condensed by the monocular lenses 31₀ to 31₄ and the light beams serving as the spot light emitted by the light sources 32L and 32R, performs photoelectric conversion, and thereby captures and outputs a captured image including the monocular images for the monocular lenses 31₀ to 31₄ (the monocular images corresponding to the images formed by the light beams condensed by the monocular lenses 31₀ to 31₄) and the spot light images of the spot light of the light sources 32L and 32R. The captured image (another captured image) output by the image sensor 51 is supplied to the area specifying unit 52, the position calculation unit 57, and the spot light image detection unit 62.

In addition to the captured image output by the image sensor 51, the area specifying unit 52 is supplied from the position calculation unit 57 with mounting error reflection position information (unknown reference positions) as position information on the monocular images included in the captured image output by the image sensor 51.

The mounting error reflection position information is position information on a position, corresponding to the predetermined light beam, on the monocular image for each monocular lens 31ᵢ on a captured image (another captured image) obtained, for example, by imaging an arbitrary subject with the image sensor 51 in a state where the multi-eye interchangeable lens 20 is mounted on the camera body 10 (regardless of whether the distance to the subject is known). The mounting error reflection position information is, so to speak, position information on the incident position of the predetermined light beam on the image sensor 51, which deviates due to the mounting error occurring when the multi-eye interchangeable lens 20 is mounted, and which includes the mounting error present when the multi-eye interchangeable lens 20 is used (the deviation, caused by the mounting error, of the emission position of the imaging light emitted from each monocular lens 31ᵢ).

The mounting error represents a deviation of an attachment position (mounting position) of the multi-eye interchangeable lens 20 due to the fact that the multi-eye interchangeable lens 20 is attachable to and detachable from the camera body 10. For example, the mounting error may change each time the multi-eye interchangeable lens 20 is mounted. Further, for example, when an impact is applied to the camera system 1 having the multi-eye interchangeable lens 20 mounted on the camera body 10, the mounting error may change.

The mounting error reflection position information is position information including the manufacturing error in addition to the mounting error (position information including the deviation, caused by the manufacturing error and the mounting error, of the emission position of the imaging light emitted from each monocular lens 31ᵢ).

Here, in a case where the optical axis center position on the monocular image included in the known captured image, which is obtained by imaging a subject at a known distance, is employed as the individual difference reflection position information, the optical axis center position on the monocular image included in the captured image (another captured image) obtained by imaging an arbitrary subject (regardless of whether the distance to the subject is known) can be employed as the mounting error reflection position information, for example.

The area specifying unit 52 specifies, on the basis of the mounting error reflection position information from the position calculation unit 57, the regions of the monocular images for the monocular lenses 31₀ to 31₄ on the captured image from the image sensor 51, and outputs area specification result information indicating the specification result of each region.

That is, the area specifying unit 52 specifies, as the region of each monocular image, for example, a rectangular region of a predetermined size centered (as the center of gravity) on the mounting error reflection position information in the captured image from the image sensor 51.

Here, the area specifying unit 52 may output, as the area specification result information, for example, a set of the entire captured image and region information indicating each region of the monocular images on the entire captured image. Further, the area specifying unit 52 may extract (cut out) the monocular images from the captured image and output the monocular images as the area specification result information. Note that the monocular images may be output as a set together with the region information.

Hereinafter, to simplify the description, it is assumed that the area specifying unit 52 outputs the monocular images extracted from the captured image (the monocular images for the monocular lenses 31₀ to 31₄) as the area specification result information.

The monocular images for the monocular lenses 31₀ to 31₄ output from the area specifying unit 52 are supplied to the image processing unit 53.
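
As a rough illustration of the region specification and extraction described above, the sketch below cuts out a fixed-size rectangle around each optical axis center position (mounting error reflection position). The array layout, the region size, and the function name are assumptions made for illustration, not the actual implementation of the area specifying unit 52:

```python
import numpy as np

def extract_monocular_images(captured_image: np.ndarray,
                             optical_axis_centers: list[tuple[float, float]],
                             region_size: tuple[int, int] = (512, 512)) -> list[np.ndarray]:
    """Cut out, for each monocular lens, a rectangular region of a predetermined size
    centered on its mounting-error-reflected optical axis center position (x, y)."""
    half_h, half_w = region_size[0] // 2, region_size[1] // 2
    monocular_images = []
    for cx, cy in optical_axis_centers:
        x0 = int(round(cx)) - half_w
        y0 = int(round(cy)) - half_h
        monocular_images.append(captured_image[y0:y0 + region_size[0],
                                               x0:x0 + region_size[1]])
    return monocular_images
```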

The image processing unit 53 is a part of a processing unit that performs processing according to the detection result of the spot light image detection unit 62 described below. Using the monocular images for the monocular lenses 31₀ to 31₄ from the area specifying unit 52 (that is, monocular images of different viewpoints whose viewpoints are the positions of the monocular lenses 31₀ to 31₄) and the feed amount of the feeding unit 23 supplied from the feed amount detection unit 64, the image processing unit 53 performs image processing such as generation of parallax information and refocusing, that is, generation (reconstruction) of an image focused on an arbitrary subject, and supplies a processing result image obtained as a result of the image processing to the display unit 54 and the storage unit 55.

Note that the image processing unit 53 may also perform general image processing such as defect correction and noise reduction. Further, the image processing unit 53 may perform image processing on both the image to be saved (stored) in the storage unit 55 and the image to be displayed as a so-called through image on the display unit 54.

The display unit 54 includes, for example, a liquid crystal panel, an organic Electroluminescence (EL) panel, or the like, and is disposed on the rear surface of the camera body 10. The display unit 54 displays, for example, the processing result image or the like supplied from the image processing unit 53 as a through image. As the through image, in addition to the processing result image, a part of the entire captured image captured by the image sensor 51 or a monocular image extracted from the captured image may be displayed. In addition, the display unit 54 may display information such as menus and settings of the camera body 10, for example.

The storage unit 55 includes a memory card (not shown) or the like, and stores a processing result image supplied from the image processing unit 53 in accordance with a user operation or the like, for example.

The communication unit 56 performs wired or wireless communication with the communication unit 42 of the multi-eye interchangeable lens 20 and the like. Note that the communication unit 56 can also communicate with a server on the internet, a PC on a wired or wireless LAN, or other external devices by any communication method as needed.

For example, when the multi-eye interchangeable lens 20 is mounted on the camera body 10, the communication unit 56 communicates with the communication unit 42 of the multi-eye interchangeable lens 20 to receive the lens information of the multi-eye interchangeable lens 20 transmitted from the communication unit 42, and supplies the lens information to the position calculation unit 57 and the spot light image detection unit 62.

Further, the communication unit 56 transmits information specifying a focus (position) to the communication unit 42, for example, from the control unit 61.

The position calculation unit 57 obtains, from the individual difference reflection position information included in the lens information from the communication unit 56, the mounting error reflection position information of the monocular image for each monocular lens 31ᵢ included in the captured image supplied from the image sensor 51, and supplies the mounting error reflection position information to the area specifying unit 52.

Note that, in Fig. 2, in obtaining the mounting error reflection position information as the optical axis center position on the monocular image included in the captured image supplied from the image sensor 51, the position calculation unit 57 uses the individual difference spot light position information in addition to the individual difference reflection position information included in the lens information.
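
One way to picture this use of the spot light positions is to shift each stored (individual difference) optical axis center position by the offset between the spot light image positions detected on the current captured image and the stored (individual difference) spot light positions. The sketch below assumes a purely translational mounting error and ignores the rotational component; the function name and signature are illustrative only:

```python
def correct_optical_axis_centers(stored_centers, stored_spots, detected_spots):
    """Shift the stored per-lens optical axis center positions by the average
    translation between the stored and detected spot light image positions.
    Rotation of the multi-eye interchangeable lens is not handled in this sketch."""
    dxs = [d[0] - s[0] for s, d in zip(stored_spots, detected_spots)]
    dys = [d[1] - s[1] for s, d in zip(stored_spots, detected_spots)]
    dx, dy = sum(dxs) / len(dxs), sum(dys) / len(dys)
    return [(cx + dx, cy + dy) for cx, cy in stored_centers]
```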

The control unit 61 controls the focus and the like according to a user operation for adjusting the focus and the like. For example, the control unit 61 provides information for specifying a focus to the communication unit 56 according to an operation by the user.

The spot light image detection unit 62 detects the incidence range of the spot light emitted from the light sources 32L and 32R on the image sensor 51, and supplies the detection result to the feed amount detection unit 64.

That is, the spot light image detection unit 62 detects the spot light images on the captured image from the image sensor 51 based on the individual difference spot light position information included in the lens information from the communication unit 56. Further, the spot light image detection unit 62 detects (generates) spot light image information on the spot light images, such as the (spot) size and the position (the detected light image position) of each spot light image, and outputs the spot light image information as the detection result of the spot light images. The spot light image information output by the spot light image detection unit 62 is supplied to the feed amount detection unit 64, which is another part of the processing unit that performs processing according to the detection result of the spot light image detection unit 62. As the spot light image information, in addition to information directly indicating the size and position of the spot light image on the captured image (for example, the size and position themselves), information indirectly indicating the size and position of the spot light image on the captured image (for example, an image in which the spot light image appears in a state where its size and position on the captured image are maintained) may be employed.
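
As a hedged sketch of how such spot light image information might be obtained, the following searches a window around the known (individual difference) spot light position on a single-channel captured image, thresholds the window, and takes the intensity centroid and an equivalent-circle diameter as the position and size. The window size, threshold, and function name are assumptions, not the actual detection method:

```python
import numpy as np

def detect_spot_light_image(captured_image: np.ndarray,
                            known_spot_xy: tuple[float, float],
                            window: int = 64,
                            threshold: float = 0.5) -> tuple[tuple[float, float], float]:
    """Return the detected light image position (centroid) and spot size
    (equivalent-circle diameter in pixels) near the known spot light position."""
    x0 = max(int(known_spot_xy[0]) - window, 0)
    y0 = max(int(known_spot_xy[1]) - window, 0)
    patch = captured_image[y0:y0 + 2 * window, x0:x0 + 2 * window].astype(float)
    mask = patch >= threshold * patch.max()        # bright pixels belonging to the spot
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean() + x0, ys.mean() + y0        # centroid in image coordinates
    size = 2.0 * np.sqrt(mask.sum() / np.pi)       # diameter of an equal-area circle
    return (cx, cy), size
```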

The feed amount information storage unit 63 stores feed amount information. The feed amount information is information in which the feed amount of the feed unit 23 is associated with the spot light image information on the spot light image when the feed unit 23 is fed out by the feed amount. The feed amount information may be generated in advance and stored in the feed amount information storage unit 63. Further, for example, before shipping the multi-eye interchangeable lens 20 from the factory, the feed amount information may be generated and stored in the storage unit 41 in advance as a part of the lens information. In the case where the feed amount information is stored in the storage unit 41 as a part of the lens information, the communication unit 56 communicates with the communication unit 42 to acquire the lens information stored in the storage unit 41, and supplies the feed amount information included in the lens information to the feed amount information storage unit 63 and stores the information therein.

The feed amount detection unit 64 detects, from the feed amount information stored in the feed amount information storage unit 63, the feed amount of the feeding unit 23 associated with the spot light image information from the spot light image detection unit 62, and supplies the feed amount to the image processing unit 53.
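
A minimal sketch of this lookup, assuming the feed amount information is stored as a table of (spot size, feed amount) pairs and that the feed amount for an intermediate spot size is obtained by linear interpolation between neighboring entries (the table format is an assumption; the detection methods described later may also use the spot light position instead of the size):

```python
def detect_feed_amount(spot_size: float,
                       feed_amount_table: list[tuple[float, float]]) -> float:
    """Look up the feed amount associated with the detected spot size, interpolating
    linearly between the stored (spot_size, feed_amount) entries."""
    table = sorted(feed_amount_table)              # sort by spot size
    if spot_size <= table[0][0]:
        return table[0][1]
    if spot_size >= table[-1][0]:
        return table[-1][1]
    for (s0, f0), (s1, f1) in zip(table, table[1:]):
        if s0 <= spot_size <= s1:
            t = (spot_size - s0) / (s1 - s0)
            return f0 + t * (f1 - f0)
    return table[-1][1]  # fallback; not reached for a well-formed table
```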

< overview of imaging performed using the multi-eye interchangeable lens 20>

Fig. 3 is a diagram for describing an outline of imaging of a captured image performed using the multi-eye interchangeable lens 20.

The image sensor 51 of the camera body 10 on which the multi-eye interchangeable lens 20 is mounted captures a photographed image including the monocular images corresponding to the images formed by the light beams condensed by the respective monocular lenses 31ᵢ, and the spot light images of the spot light emitted by the light sources 32L and 32R.

Here, in the present specification, of the optical axis directions of the monocular optical axis of the monocular lens 31ᵢ, the direction from the rear surface side toward the front surface side of the camera body 10 is defined as the z direction (axis), the direction from left to right when facing the z direction is defined as the x direction, and the direction from bottom to top is defined as the y direction.

Further, in order to match the left and right of a subject appearing in an image with the left and right of the subject in real space, and to match the left and right of the position of a monocular lens 31ᵢ with the left and right, on the captured image, of the monocular image for that monocular lens 31ᵢ, hereinafter, unless otherwise specified, positions on the captured image, the positions of the monocular lenses 31ᵢ, the left and right of a subject, and the like will be described with reference to the z direction, that is, with reference to a state of facing, from the rear surface side of the camera body 10, the imaging direction in which the subject to be captured exists.

Note that a straight line or a line segment connecting the monocular optical axes of one monocular lens 31ᵢ and another monocular lens 31ⱼ (i ≠ j) is also referred to as a baseline, and the distance between those monocular optical axes is also referred to as a baseline length. Further, the angle indicating the direction of the baseline is also referred to as a baseline angle. Here, as the baseline angle, for example, the angle formed by the x axis and the baseline (the angle of the epipolar line) is employed.
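
For reference, the baseline length and baseline angle between two monocular lenses follow directly from the positions of their monocular optical axes on the plane; a short sketch (the coordinate convention and function name are assumptions):

```python
import math

def baseline_length_and_angle(center_i: tuple[float, float],
                              center_j: tuple[float, float]) -> tuple[float, float]:
    """Baseline length and baseline angle (angle between the x axis and the line
    connecting the monocular optical axes of lenses 31_i and 31_j), in degrees."""
    dx = center_j[0] - center_i[0]
    dy = center_j[1] - center_i[1]
    length = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx))       # angle of the epipolar line
    return length, angle
```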

Further, in the present specification (and claims), the feeding out of the feeding unit 23 broadly means that the feeding unit 23 moves in the optical axis direction of the optical axis of the lens barrel. Therefore, the feeding out of the feeding unit 23 includes both the movement of the feeding unit toward the front side and the movement of the feeding unit toward the depth side.

Fig. 4 is a view showing an arrangement example of the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R in the multi-eye interchangeable lens 20, and an example of a captured image captured using the multi-eye interchangeable lens 20.

A of Fig. 4 is a rear view showing an arrangement example of the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R in the multi-eye interchangeable lens 20.

In A of Fig. 4, as in Fig. 1, the monocular lenses 31₀ to 31₄ are arranged such that, on a two-dimensional plane parallel to the light receiving surface of the image sensor 51, the other four monocular lenses 31₁ to 31₄ form the vertices of a square centered on the monocular lens 31₀.

That is, in Fig. 4, for example, with the monocular lens 31₀ among the monocular lenses 31₀ to 31₄ as a reference, the monocular lens 31₁ is arranged at the upper right of the monocular lens 31₀, and the monocular lens 31₂ is arranged at the upper left of the monocular lens 31₀. Further, the monocular lens 31₃ is arranged at the lower left of the monocular lens 31₀, and the monocular lens 31₄ is arranged at the lower right of the monocular lens 31₀.

Further, in A of Fig. 4, the light source 32L is arranged at the left end position of the multi-eye interchangeable lens 20, which has a substantially circular plane, and the light source 32R is arranged at the right end position on the opposite side of the light source 32L with respect to the center of the multi-eye interchangeable lens 20.

Note that the light sources 32L and 32R may be arranged at any different positions of (the feeding unit 23 of) the multi-eye interchangeable lens 20.

Note that the light sources 32L and 32R may be arranged such that the spot light images PL and PR of the spot light emitted by the respective light sources 32L and 32R on the captured image captured by the image sensor 51 are located outside the regions of the monocular images included in the captured image (outside the ranges irradiated with the light passing through the monocular lenses 31_i). In this case, it is possible to prevent the spot light images PL and PR from overlapping the monocular images and deteriorating the image quality of the monocular images.

B of Fig. 4 is a view showing an example of a captured image captured by the image sensor 51 of the camera body 10 on which the multi-eye interchangeable lens 20, with the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R arranged as shown in A of Fig. 4, is mounted.

The captured image captured by the image sensor 51 of the camera body 10, to which the multi-eye interchangeable lens 20 provided with the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R is attached, includes the monocular images E0, E1, E2, E3, and E4 of the images formed by the light beams condensed by the respective monocular lenses 31_0 to 31_4, and the spot light images PL and PR of the spot light of the respective light sources 32L and 32R.

Based on the optical axis center position, which is the mounting error reflection position information of each monocular image E#i obtained by the position calculation unit 57, the region specifying unit 52 (Fig. 2) specifies, as the region of the monocular image E#i for each monocular lens 31_i, a rectangular region of a predetermined size centered on that optical axis center position within the region of the captured image irradiated with the light beam that has passed through the monocular lens 31_i.

Thus, the monocular image E#i for the monocular lens 31_i becomes an image similar to a captured image obtained by performing imaging with an independent camera or an independent image sensor from the position of the monocular lens 31_i, that is, an image obtained by imaging with the position of the monocular lens 31_i as a viewpoint.

Therefore, a parallax occurs between any two monocular images E#i and E#j among the monocular images E0 to E4 of the monocular lenses 31_0 to 31_4. That is, the same subject captured in the monocular images E#i and E#j appears at positions shifted by the parallax.

< attachment error of the multi-eye interchangeable lens 20 >

Fig. 5 is a diagram for describing an attachment error when the multi-eye interchangeable lens 20 is attached (mounted) to the camera body 10.

That is, fig. 5 shows an example of a captured image captured by the camera system 1 in which the multi-eye interchangeable lens 20 is attached to the camera body 10.

In the case where the multi-eye interchangeable lens 20 is attached to the camera body 10, the attachment position of the multi-eye interchangeable lens 20 with respect to the light receiving surface of the image sensor 51 of the camera body 10 may be shifted mainly in the horizontal direction (x direction), the vertical direction (y direction), and the rotational direction, and particularly in the rotational direction. To begin with, the attachment position of the multi-eye interchangeable lens 20 is offset by a different amount for each individual unit due to manufacturing errors. Further, the attachment position of the multi-eye interchangeable lens 20 changes when the multi-eye interchangeable lens 20 is attached to the camera body 10, or when an impact is applied to the camera system 1 in which the multi-eye interchangeable lens 20 is attached to the camera body 10 while the multi-eye interchangeable lens 20 is in use.

Now, for example, an error of the actual attachment position with respect to a predetermined position (e.g., the design attachment position of the multi-eye interchangeable lens 20) is referred to as an attachment error. The attachment error with respect to the design attachment position is caused by manufacturing variations or the like, and changes when the multi-eye interchangeable lens 20 is attached to the camera body 10 or when an impact is applied to the camera system 1 having the multi-eye interchangeable lens 20 attached to the camera body 10.

The attachment error is an error of the actual attachment position of the multi-eye interchangeable lens 20, and suitably includes a manufacturing error and a mounting error. For example, in the case of using the designed attachment position of the multi-eye interchangeable lens 20 as a reference, the attachment error includes both a manufacturing error and a mounting error. Further, for example, in the case of using a position shifted from the design attachment position of the multi-eye interchangeable lens 20 by the manufacturing error as a reference, the attachment error does not include the manufacturing error, and includes a mounting error.

As described with reference to Fig. 4, the monocular image E#i is an image obtained by imaging with the position of the monocular lens 31_i as a viewpoint, and therefore the monocular images E0 to E4 are images having different viewpoints.

In the case of obtaining parallax information using the monocular images E0 to E4, which are images having different viewpoints, the baseline lengths and baseline angles described with reference to Fig. 3 are required for the monocular lenses 31_0 to 31_4.

Since the monocular lenses 31_0 to 31_4 are fixed to the multi-eye interchangeable lens 20, the baseline length is a fixed value that does not change due to a mounting error and can be measured in advance after the multi-eye interchangeable lens 20 is manufactured.

Meanwhile, the base line angle varies due to an attachment error (mounting error) in the rotational direction of the multi-eye interchangeable lens 20. Therefore, in order to obtain accurate parallax information using the monocular images E0 to E4, it is necessary to deal with an attachment error in the rotational direction.

Here, in a case where the image distortion caused by the lens aberration of the monocular lens 31_i is small and negligible, attachment errors in the horizontal direction and the vertical direction pose no problem. However, in a case where the image distortion caused by the lens aberration is large and distortion correction is required, it is necessary to accurately grasp the optical axis center position of the monocular image E#i in order to perform appropriate distortion correction. In order to accurately grasp the optical axis center position of the monocular image E#i, it is necessary to grasp the attachment errors (mounting errors) in the horizontal direction and the vertical direction.

Now, as shown in fig. 5, (coordinates of) optical axis center positions of the monocular images E0 to E4 are represented as (x0, y0), (x1, y1), (x2, y2), (x3, y3), and (x4, y4) in a specific xy coordinate system (two-dimensional coordinate system).

Furthermore, among the monocular lenses 31_0 to 31_4, the monocular image E0 of the monocular lens 31_0 located at the center is referred to as the center image E0, and the monocular images E1 to E4 of the monocular lenses 31_1 to 31_4 located at the periphery are referred to as the peripheral images E1 to E4.

The optical axis center positions of the peripheral images E1 to E4 relative to one monocular image among the monocular images E0 to E4, for example, the center image E0 (hereinafter also referred to as relative optical axis center positions) (dx1, dy1), (dx2, dy2), (dx3, dy3), and (dx4, dy4) can be obtained by equation (1).

[ mathematical formula 1]

(dx1,dy1)=(x1-x0,y1-y0)

(dx2,dy2)=(x2-x0,y2-y0)

(dx3,dy3)=(x3-x0,y3-y0)

(dx4,dy4)=(x4-x0,y4-y0)...(1)

In the case where the optical axis center position (x0, y0) of the center image E0 is set as the origin of the xy coordinate system, the relative optical axis center positions (dx1, dy1), (dx2, dy2), (dx3, dy3), and (dx4, dy4) are equal to the optical axis center positions (x1, y1), (x2, y2), (x3, y3), and (x4, y4) of the peripheral images E1 to E4.

The relative optical axis center position (dx#i, dy#i) (where i = 1, 2, 3, 4) can be regarded as a vector in the direction of the baseline connecting the optical axis center position (x0, y0) of the center image E0 and the optical axis center position (x#i, y#i) of the peripheral image E#i. From the relative optical axis center position (dx#i, dy#i), the baseline angle tan^-1((y#i - y0)/(x#i - x0)) = tan^-1(dy#i/dx#i), which represents the direction of the baseline L0#i connecting the optical axis center position (x0, y0) of the center image E0 and the optical axis center position (x#i, y#i) of the peripheral image E#i, can be obtained.

Therefore, if the relative optical axis center position (dx # i, dy # i) can be obtained, the base line angle representing the direction of the base line L0# i at that time can be obtained, and accurate parallax information that is not affected by the attachment error in the rotational direction can be obtained using the base line angle.
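As a minimal illustration of equation (1) and the baseline angle derived from it, the following Python sketch computes the relative optical axis center positions and the baseline lengths and angles from absolute optical axis center positions; all coordinate values are hypothetical examples, not values from this specification.

```python
import math

# Hypothetical absolute optical axis center positions (x#i, y#i) in pixels;
# E0 is the center image, E1 to E4 are the peripheral images.
optical_axis_centers = {
    0: (1000.0, 750.0),   # center image E0
    1: (1400.0, 350.0),   # E1 (upper right)
    2: (600.0, 350.0),    # E2 (upper left)
    3: (600.0, 1150.0),   # E3 (lower left)
    4: (1400.0, 1150.0),  # E4 (lower right)
}

x0, y0 = optical_axis_centers[0]

# Equation (1): relative optical axis center positions (dx#i, dy#i)
# with the optical axis center of the center image E0 as the origin.
relative_centers = {
    i: (x - x0, y - y0)
    for i, (x, y) in optical_axis_centers.items()
    if i != 0
}

for i, (dx, dy) in relative_centers.items():
    # Baseline angle of baseline L0#i: tan^-1(dy#i / dx#i), measured from the x axis.
    baseline_angle = math.atan2(dy, dx)
    baseline_length = math.hypot(dx, dy)
    print(f"E{i}: (dx, dy)=({dx:+.1f}, {dy:+.1f}), "
          f"length={baseline_length:.1f} px, "
          f"angle={math.degrees(baseline_angle):.1f} deg")
```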

In the present technology, the optical axis center positions (x0, y0) to (x4, y4) of the respective monocular images E0 to E4 on a known captured image, in which a predetermined subject located at a known distance appears and which is captured by the image sensor 51, are obtained, that is, the relative optical axis center positions (dx1, dy1) to (dx4, dy4) of the respective monocular images E1 to E4 in the case where the optical axis center position (x0, y0) of the center image E0 is set as the origin are obtained as the individual difference reflection position information. Further, in the present technology, using the individual difference reflection position information ((x0, y0) to (x4, y4) or (dx1, dy1) to (dx4, dy4)), the optical axis center positions (x0', y0') to (x4', y4') of the respective monocular images E0 to E4 on a captured image at the time of imaging that captured image, that is, the relative optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 in the case where the optical axis center position (x0', y0') of the center image E0 is set as the origin, are obtained as the mounting error reflection position information.

If the relative optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 on the captured image are obtained as the mounting error reflection position information, the baseline angle at the time of imaging the captured image can be obtained, and using that baseline angle, accurate parallax information that is not affected by the attachment error in the rotational direction can be obtained.

The position calculation unit 57 in fig. 2 obtains the relative optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 on the captured image as the mounting error reflection position information using the relative optical axis center positions (dx1, dy1) to (dx4, dy4) as the individual difference reflection position information.

< calculation method for obtaining the relative optical axis center position (dx#i', dy#i') of the monocular image E#i on the captured image as the mounting error reflection position information >

Fig. 6 is a diagram for describing a calculation method for obtaining the relative optical axis center positions (dx1', dy1') to (dx4', dy4') as the mounting error reflection position information.

Hereinafter, for simplification of description, an xy coordinate system with the optical axis center position (x0, y0) of the center image E0 as the origin is used. In this case, as described above, the relative optical axis center positions (dx1, dy1), (dx2, dy2), (dx3, dy3), and (dx4, dy4) are equal to the optical axis center positions (x1, y1), (x2, y2), (x3, y3), and (x4, y 4).

A of fig. 6 shows an example of a known photographic image obtained by imaging a predetermined subject in the camera system 1 in which the multi-eye interchangeable lens 20 is attached to the camera body 10.

An object appearing in the known captured image is, for example, a chart image in which a predetermined chart is drawn (for example, a circle divided into four equal parts by line segments passing through its center). The known captured image is captured by arranging the chart image at a known distance on the monocular optical axis of the monocular lens 31_0 such that the center of the circle of the chart appears at a predetermined point on the center image E0, for example, at the optical axis center position (x0, y0) = (0, 0) of the center image E0. Therefore, the known captured image is an image obtained by imaging, at a known distance, the chart image in which the predetermined chart is drawn.

Since the known captured image is captured as described above, a chart image in which the center of the circle as the chart is located at the optical axis center position (x0, y0) = (0, 0) appears in the center image E0 on the known captured image. Further, the chart image appears in the peripheral image E#i similarly to the center image E0. Note that, in the peripheral image E#i, the position of the circle as the chart is shifted from the position of the circle as the chart appearing in the center image E0 according to the parallax with respect to the center image E0.

Therefore, in the center image E0 on the known captured image, the center of the circle as the chart is located at the optical axis center position (x0, y0) = (0, 0), but in the peripheral image E#i, the center of the circle as the chart is shifted from the optical axis center position (x#i, y#i) according to the parallax with respect to the center image E0.

Since the chart image is placed at a known distance, when the known captured image is captured, the parallax between the peripheral image E#i and the center image E0 can be obtained from that known distance and from the baseline length and baseline angle between the monocular lens 31_i and the monocular lens 31_0.

Here, imaging of the known captured image can be performed, for example, before the multi-eye interchangeable lens 20 is shipped from the factory. Therefore, the baseline angle at the time of imaging the known captured image can be measured at that time. Alternatively, at the time of imaging the known captured image, the attachment (fixing) of the monocular lenses 31_i to the lens barrel 21 may be adjusted so that the baseline angle becomes a predetermined value such as a design value.

Since the optical axis center position (x#i, y#i) of the peripheral image E#i is a position shifted from the center of the circle as the chart appearing in the peripheral image E#i according to the parallax with the center image E0, the optical axis center position can be obtained from the center position of the circle as the chart appearing in the peripheral image E#i and the parallax with the center image E0.

Further, since the center of the circle of the chart image appears at the optical axis center position (x0, y0) = (0, 0) of the center image E0 on the known captured image, the optical axis center position (x0, y0) of the center image E0 can be obtained by detecting the position of the center of the circle of the chart from the center image E0.

As described above, the optical axis center positions (x1, y1) to (x4, y4) of the peripheral images E1 to E4 and the optical axis center position (x0, y0) of the center image E0 on the known captured image can be obtained from the known captured image.
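The following sketch illustrates, under stated assumptions, how the optical axis center positions of the peripheral images might be recovered from detected chart-circle centers and the parallax with respect to the center image E0. The pinhole relation d = f_pix * B / Z, the focal length in pixels, the sign convention, and all numeric values are assumptions for illustration only, not values or formulas given in this specification.

```python
import math

Z_MM = 1000.0   # known distance to the calibration chart (assumed)
F_PIX = 2000.0  # assumed focal length expressed in pixels

# (baseline length mm, baseline angle deg) for peripheral lenses E1..E4 (hypothetical).
baselines = {1: (20.0, 45.0), 2: (20.0, 135.0), 3: (20.0, 225.0), 4: (20.0, 315.0)}

# Detected chart-circle centers (pixels) in each peripheral image, in an xy system
# whose origin is the optical axis center (x0, y0) = (0, 0) of the center image E0.
detected_chart_centers = {1: (37.0, 41.0), 2: (-43.0, 39.0),
                          3: (-41.0, -38.0), 4: (39.0, -42.0)}

for i, (b_mm, angle_deg) in baselines.items():
    # Parallax magnitude between peripheral image E#i and center image E0,
    # projected along the baseline direction (pinhole-model assumption).
    d = F_PIX * b_mm / Z_MM
    ang = math.radians(angle_deg)
    parallax = (d * math.cos(ang), d * math.sin(ang))
    cx, cy = detected_chart_centers[i]
    # The optical axis center (x#i, y#i) is taken as the detected chart center
    # shifted back by the parallax (illustrative sign convention).
    x_i, y_i = cx - parallax[0], cy - parallax[1]
    print(f"E{i}: optical axis center (x{i}, y{i}) = ({x_i:.1f}, {y_i:.1f})")
```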

From the individual difference reflection position information that is the optical axis center position (x0, y0) of the center image E0 on the known captured image and the individual difference reflection position information that is the optical axis center position (x#i, y#i) of the peripheral image E#i, the relative optical axis center position (dx#i, dy#i) can be obtained as individual difference reflection relative position information, that is, as the relative individual difference reflection position information of the peripheral image E#i with respect to the individual difference reflection position information (x0, y0) of the center image E0, and this individual difference reflection relative position information is stored as lens information in the storage unit 41 of Fig. 2. Here, in contrast to the individual difference reflection relative position information (dx#i, dy#i), the individual difference reflection position information (x#i, y#i) is also referred to as individual difference reflection absolute position information.

Note that, as the lens information, the individual difference reflection absolute position information (optical axis center positions) (x#i, y#i) (i = 0, 1, 2, 3, 4) may be adopted instead of the individual difference reflection relative position information (relative optical axis center positions) (dx#i, dy#i) (i = 1, 2, 3, 4). This is because the individual difference reflection relative position information (dx#i, dy#i) can be obtained from the individual difference reflection absolute position information (x#i, y#i) according to equation (1), and is (substantially) equivalent information to the individual difference reflection absolute position information (x#i, y#i).

When the relative optical axis center positions (dx1', dy1') to (dx4', dy4') are obtained as the mounting error reflection position information, in addition to the individual difference reflection relative position information (dx#i, dy#i) (or the individual difference reflection absolute position information (x#i, y#i)), individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) indicating the positions of the spot light images PL and PR of the spot light of the light sources 32L and 32R on the known captured image is obtained in advance.

For example, the barycentric position of the spot light image PL on the known captured image can be adopted as the individual difference spot light position information (X_L, Y_L) of the spot light image PL. Similarly, the barycentric position of the spot light image PR on the known captured image can be adopted as the individual difference spot light position information (X_R, Y_R) of the spot light image PR.

From the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R), their midpoint (X_C, Y_C) is obtained, and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) and the midpoint (X_C, Y_C) are stored as lens information in the storage unit 41 of Fig. 2.
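A minimal sketch of how the lens information described here could be organized in code; the class name, field names, and structure are hypothetical and are not the actual format stored in the storage unit 41.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Point = Tuple[float, float]

@dataclass
class LensInfo:
    """Hypothetical container for the lens information described in the text."""
    # Individual difference reflection relative position information (dx#i, dy#i), i = 1..4.
    relative_optical_axis_centers: Dict[int, Point]
    # Individual difference spot light position information (X_L, Y_L) and (X_R, Y_R).
    spot_light_left: Point
    spot_light_right: Point
    # Midpoint (X_C, Y_C); recomputed here, since it can always be derived
    # from the two spot light positions.
    spot_light_midpoint: Point = field(init=False)

    def __post_init__(self) -> None:
        xl, yl = self.spot_light_left
        xr, yr = self.spot_light_right
        self.spot_light_midpoint = ((xl + xr) / 2.0, (yl + yr) / 2.0)

# Hypothetical usage with made-up pixel coordinates.
info = LensInfo(
    relative_optical_axis_centers={1: (400.0, -400.0), 2: (-400.0, -400.0),
                                   3: (-400.0, 400.0), 4: (400.0, 400.0)},
    spot_light_left=(100.0, 800.0),
    spot_light_right=(1900.0, 800.0),
)
print(info.spot_light_midpoint)  # (1000.0, 800.0)
```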

Note that the midpoint (X_C, Y_C) between the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) may be excluded from the lens information. This is because the midpoint (X_C, Y_C) can be obtained from the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R).

The position calculation unit 57 obtains the (relative) optical axis center positions (dx1', dy1') to (dx4', dy4') on the unknown captured image as the mounting error reflection position information, using the relative optical axis center positions (hereinafter also simply referred to as optical axis center positions) (dx#i, dy#i) as the individual difference reflection relative position information and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) (hereinafter both also simply referred to as individual difference reflection position information).

B of fig. 6 shows an example of an unknown captured image captured in the camera system 1 in which the multi-eye interchangeable lens 20 is attached to the camera body 10.

The unknown captured image is an image captured, by the camera system 1 in which the multi-eye interchangeable lens 20 is attached to the camera body 10, without the restrictions imposed when capturing the known captured image (such as the restriction that the subject be at a known distance).

When the unknown captured image is imaged, an attachment error (mounting error) in the rotational direction that differs from that at the time of imaging the known captured image may occur.

In an xy coordinate system with the optical axis center position (x0', y0') of the center image E0 on the unknown captured image as the origin (0, 0), the optical axis center position (x#i', y#i') (i = 1, 2, 3, 4) of the peripheral image E#i on the unknown captured image is equal to the relative optical axis center position (dx#i', dy#i') = (x#i', y#i') - (x0', y0') of the peripheral image E#i with respect to the optical axis center position (x0', y0') of the center image E0.

Here, position information regarding the positions of the spot light images PL and PR of the spot light of the respective light sources 32L and 32R on the unknown captured image is also referred to as mounting error spot light position information (unknown light position). The mounting error spot light position information is position information including a manufacturing error and a mounting error of the multi-eye interchangeable lens 20, similarly to the mounting error reflection position information. As the mounting error spot light position information, the positions themselves of the spot light images PL and PR of the spot light of the respective light sources 32L and 32R on the unknown captured image can be adopted. Note that the positions of the spot light images PL and PR on the unknown captured image are represented as (X_L', Y_L') and (X_R', Y_R'), respectively.

The mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') can be obtained from the spot light images PL and PR on the unknown captured image, similarly to the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R).

Further, the midpoint between the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') is denoted by (X_C', Y_C').

Now, let the relative rotation error θ_Error denote the attachment error in the rotational direction at the time of imaging the unknown captured image, with the attachment error in the rotational direction at the time of imaging the known captured image as a reference. The relative rotation error θ_Error can be obtained according to equation (2), using the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) included in the lens information and the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') obtained from the unknown captured image.

[ mathematical formula 2]

θ_Error = tan^-1((Y_R' - Y_L')/(X_R' - X_L')) - tan^-1((Y_R - Y_L)/(X_R - X_L))

(Note X_R > X_L, X_R' > X_L')...(2)

The relative rotation error θ_Error according to equation (2) is the angle of the direction of the straight line connecting the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R'), with reference to the direction of the straight line connecting the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R). Its accuracy becomes higher as the spacing between the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) (and the spacing between the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R')) becomes larger. Therefore, by arranging the light sources 32L and 32R as far apart as possible, the relative rotation error θ_Error can be obtained accurately.
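A small sketch of the rotation-error computation of equation (2); the spot light coordinates below are hypothetical pixel positions, not measured values.

```python
import math

def rotation_error(known_left, known_right, unknown_left, unknown_right):
    """Relative rotation error theta_Error per equation (2): the angle of the line
    joining the mounting error spot light positions minus the angle of the line
    joining the individual difference spot light positions."""
    angle_known = math.atan2(known_right[1] - known_left[1],
                             known_right[0] - known_left[0])
    angle_unknown = math.atan2(unknown_right[1] - unknown_left[1],
                               unknown_right[0] - unknown_left[0])
    return angle_unknown - angle_known

# Hypothetical spot light image positions (pixels).
theta_error = rotation_error((100.0, 800.0), (1900.0, 800.0),   # (X_L, Y_L), (X_R, Y_R)
                             (103.0, 785.0), (1902.0, 818.0))   # (X_L', Y_L'), (X_R', Y_R')
print(f"theta_Error = {math.degrees(theta_error):.2f} deg")
```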

Note that, in the case where three or more light sources are provided in the multi-eye interchangeable lens 20, the rotation error θ_Error is obtained according to equation (2) for each pair of two light sources among the three or more light sources, and the final rotation error θ_Error can be determined from the rotation errors θ_Error obtained for the respective pairs (for example, by averaging them).

The relative rotation error θ_Error is the rotation angle between the mounting error spot light position information (X_L', Y_L') (or (X_R', Y_R')) and the individual difference spot light position information (X_L, Y_L) (or (X_R, Y_R)). By rotating the optical axis center position (dx#i, dy#i), which is the individual difference reflection relative position information, by the relative rotation error θ_Error according to equation (3), the relative optical axis center position (dx#i', dy#i') on the unknown captured image, which is the mounting error reflection position information in which the relative rotation error θ_Error has occurred, can be obtained.

[ mathematical formula 3]

(dx#i', dy#i') = (dx#i·cos θ_Error - dy#i·sin θ_Error, dx#i·sin θ_Error + dy#i·cos θ_Error)...(3)
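Applying the rotation of equation (3) to the stored relative optical axis center positions might then look like the following sketch; the stored positions and the rotation error of 1 degree are hypothetical.

```python
import math

def apply_rotation_error(relative_centers, theta_error):
    """Rotate each (dx#i, dy#i) by theta_Error per equation (3) to obtain the
    mounting error reflection position information (dx#i', dy#i')."""
    c, s = math.cos(theta_error), math.sin(theta_error)
    return {i: (dx * c - dy * s, dx * s + dy * c)
            for i, (dx, dy) in relative_centers.items()}

# Hypothetical individual difference reflection relative position information.
stored = {1: (400.0, -400.0), 2: (-400.0, -400.0), 3: (-400.0, 400.0), 4: (400.0, 400.0)}
corrected = apply_rotation_error(stored, math.radians(1.0))
print(corrected)
```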
In the case where the optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 on the unknown captured image are obtained as the mounting error reflection position information according to equations (2) and (3), the attachment errors in the horizontal direction and the vertical direction can be obtained as the amount of translation between the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') of the spot light images PL and PR of the respective light sources 32L and 32R on the unknown captured image and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) of the spot light images PL and PR of the respective light sources 32L and 32R on the known captured image.

That is, for example, the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction can be obtained according to equation (4).

[ mathematical formula 4]

(X_Error, Y_Error) = (X_C' - X_C, Y_C' - Y_C)...(4)
Note that, in equation (4), the amount of translation between the midpoint (X_C', Y_C') of the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') of the spot light images PL and PR of the respective light sources 32L and 32R on the unknown captured image and the midpoint (X_C, Y_C) of the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) of the spot light images PL and PR of the respective light sources 32L and 32R on the known captured image is obtained as the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction. Alternatively, as the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction, the amount of translation between the mounting error spot light position information (X_L', Y_L') and the individual difference spot light position information (X_L, Y_L), or the amount of translation between the mounting error spot light position information (X_R', Y_R') and the individual difference spot light position information (X_R, Y_R), may be obtained.
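Correspondingly, a sketch of the translation error of equation (4), computed from the midpoints of hypothetical spot light positions:

```python
def translation_error(known_left, known_right, unknown_left, unknown_right):
    """Attachment errors X_Error, Y_Error per equation (4): translation between the
    midpoint of the mounting error spot light positions and the midpoint of the
    individual difference spot light positions."""
    xc = (known_left[0] + known_right[0]) / 2.0
    yc = (known_left[1] + known_right[1]) / 2.0
    xc_dash = (unknown_left[0] + unknown_right[0]) / 2.0
    yc_dash = (unknown_left[1] + unknown_right[1]) / 2.0
    return xc_dash - xc, yc_dash - yc

# Hypothetical spot light image positions (pixels).
x_err, y_err = translation_error((100.0, 800.0), (1900.0, 800.0),
                                 (103.0, 785.0), (1902.0, 818.0))
print(f"X_Error = {x_err:.1f} px, Y_Error = {y_err:.1f} px")
```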

In the camera system 1, before the optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 on the unknown captured image are obtained as the mounting error reflection position information, a process of acquiring the individual difference reflection position information and the like is performed first, in which the optical axis center positions (dx#i, dy#i) and the like, which are the individual difference reflection relative position information required when obtaining the optical axis center positions (dx#i', dy#i'), are obtained.

The process of acquiring the individual difference reflection position information and the like may be performed by the camera body 10, a computer described below, and the like. For convenience, an apparatus that performs processing of acquiring individual difference reflection position information and the like is referred to as an acquisition processing apparatus.

The acquisition processing apparatus acquires a known captured image obtained by the camera system 1, in which the multi-eye interchangeable lens 20 is attached to the camera body 10, imaging the chart image as a predetermined subject placed at a position of a known distance on the monocular optical axis of the monocular lens 31_0.

The acquisition processing apparatus detects, from each monocular image E#i included in the known captured image, the position of a predetermined point of the chart image as the predetermined subject, for example, the position of the center of the circle of the chart.

For each of the monocular images (peripheral images) E1 to E4, using the distance to the chart image as the subject and the baseline length and baseline angle of the multi-eye interchangeable lens 20, the acquisition processing apparatus obtains the parallax between the predetermined point (the center of the circle) of the chart image appearing in the monocular image E#i and the predetermined point of the chart image appearing in the monocular image (center image) E0.

Further, for each of the monocular images E1 to E4, the acquisition processing apparatus obtains the optical axis center position (x#i, y#i) of the monocular image E#i on the known captured image, which is located at a position shifted by the parallax from the position of the center of the circle as the predetermined point of the chart image appearing in the monocular image E#i, as the individual difference reflection absolute position information (x#i, y#i) of the monocular image E#i. Further, the acquisition processing apparatus obtains the optical axis center position (x0, y0), which is the position of the center of the circle of the chart image appearing in the monocular image E0, as the individual difference reflection absolute position information (x0, y0) of the monocular image E0.

Then, for each of the monocular images E1 to E4, the acquisition processing apparatus obtains the individual difference reflection relative position information (dx#i, dy#i) of the monocular image E#i, with the individual difference reflection absolute position information (x0, y0) of the monocular image E0 as a reference, from the individual difference reflection absolute position information (x#i, y#i) according to equation (1).

Further, the acquisition processing apparatus obtains the barycentric positions of the spot light images PL and PR of the spot light of the light sources 32L and 32R on the known captured image as the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R), respectively.

The individual difference reflection relative position information (dx#i, dy#i) and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) obtained in the above-described process of acquiring the individual difference reflection position information and the like are stored as part of the lens information in the storage unit 41 of Fig. 2.

When the camera system 1 having the multi-eye interchangeable lens 20 mounted on the camera body 10 is used, a mounting error reflection position information calculation process is performed in the camera body 10, in which the relative optical axis center positions (dx1', dy1') to (dx4', dy4') of the respective monocular images E1 to E4 on the unknown captured image and the like are obtained as the mounting error reflection position information, using the individual difference reflection relative position information (dx#i, dy#i) and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R).

That is, in the camera body 10 (fig. 2), when the multi-eye interchangeable lens 20 is mounted, the communication unit 56 communicates with the communication unit 42 of the multi-eye interchangeable lens 20, receives lens information of the multi-eye interchangeable lens 20 transmitted from the communication unit 42, and supplies the lens information to the position calculation unit 57. As described above, the position calculation unit 57 acquires the lens information supplied from the communication unit 56.

The position calculation unit 57 waits for an unknown captured image to be captured, and acquires an unknown captured image, which is a captured image in which an arbitrary object appears. That is, in the camera system 1 having the multi-eye interchangeable lens 20 attached to the camera body 10, the position calculation unit 57 acquires the captured image captured by the image sensor 51 as an unknown captured image.

The position calculation unit 57 detects the spot light images PL and PR of the spot light of the light sources 32L and 32R included in the unknown captured image, and further detects the respective positions (detected light image positions) of the spot light images PL and PR, for example, their barycentric positions, as the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R'). The position calculation unit 57 is part of a processing unit that performs processing based on the detection results of the spot light images PL and PR, and specifies the imaged monocular image positions, which are the positions of the monocular images in the unknown captured image, based on the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') as the detection results of the spot light images PL and PR.

That is, the position calculation unit 57 specifies (calculates) the imaged monocular image positions based on the mounting error spot light position information (X_L', Y_L') (or (X_R', Y_R')) as the detected light image positions and the individual difference spot light position information (X_L, Y_L) (or (X_R, Y_R)) included in the lens information (the stored light image positions indicating the positions of the spot light images PL and PR of the spot light emitted by the light sources 32L and 32R onto the image sensor 51).

For example, the position calculation unit 57 specifies the imaged monocular image positions, that is, the (relative) optical axis center positions (dx#i', dy#i') as the mounting error reflection position information of the respective monocular images E1 to E4 included in the unknown captured image, by correcting, based on the mounting error spot light position information (X_L', Y_L') (or (X_R', Y_R')) as the detected light image positions and the individual difference spot light position information (X_L, Y_L) (or (X_R, Y_R)) as the stored light image positions, the relative optical axis center positions (dx#i, dy#i) as the individual difference reflection relative position information included in the lens information (the stored monocular image positions representing the emission positions, on the image sensor 51, of the imaging light emitted through the respective monocular lenses 31_i).

Specifically, first, the position calculation unit 57 obtains, as the (relative) rotation error θ_Error, the rotation angle between the mounting error spot light position information (X_L', Y_L') (or (X_R', Y_R')) and the individual difference spot light position information (X_L, Y_L) (or (X_R, Y_R)) included in the lens information.

For example, the position calculation unit 57 obtains, according to equation (2), the relative angle of the direction of the line segment connecting the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R'), with reference to the direction of the line segment connecting the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) included in the lens information, as the rotation error θ_Error.

The position calculation unit 57 rotates the relative optical axis center positions (dx#i, dy#i) as the individual difference reflection relative position information included in the lens information by the rotation error θ_Error obtained according to equation (2), thereby obtaining, according to equation (3), the (relative) optical axis center positions (dx#i', dy#i') as the mounting error reflection position information of the respective monocular images E1 to E4 included in the unknown captured image in which the rotation error θ_Error has occurred.

Further, as necessary, the position calculation unit 57 obtains the amount of translation between the individual difference spot light position information (X_L, Y_L) or (X_R, Y_R) included in the lens information and the mounting error spot light position information (X_L', Y_L') or (X_R', Y_R') of the spot light images PL and PR of the light sources 32L and 32R on the unknown captured image, as the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction.

That is, for example, the position calculation unit 57 obtains, according to equation (4), the amount of translation between the midpoint (X_C', Y_C') of the mounting error spot light position information (X_L', Y_L') and (X_R', Y_R') and the midpoint (X_C, Y_C) of the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R), as the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction.

In the camera body 10 (Fig. 2), the region specifying unit 52 can perform a region specifying process of specifying the region of each monocular image E#i on the unknown captured image, using the relative optical axis center positions (dx1', dy1') to (dx4', dy4') of the monocular images E1 to E4 on the unknown captured image obtained in the above-described mounting error reflection position information calculation process as the mounting error reflection position information.

In the region specifying process, the region specifying unit 52 acquires the relative optical axis center positions (dx#i', dy#i') as the mounting error reflection position information of the monocular images E1 to E4 included in the unknown captured image, supplied from the position calculation unit 57.

Then, the region specifying unit 52 specifies the regions of the monocular images E1 to E4 on the unknown captured image according to the optical axis center positions (dx#i', dy#i') as the mounting error reflection position information. That is, for example, in the xy coordinate system used when the individual difference reflection relative position information (dx#i, dy#i) was obtained, the region specifying unit 52 specifies, as the region of the monocular image E#i (i = 1, 2, 3, 4), a rectangular region of a predetermined size centered on the mounting error reflection position information (dx#i', dy#i') on the unknown captured image. Further, the region specifying unit 52 specifies, as the region of the monocular image E0, a rectangular region of a predetermined size centered on the origin on the unknown captured image.

Thereafter, the region specifying unit 52 extracts the monocular images E0 to E4 from the captured image, and outputs the extracted images as region specification result information.
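As a rough sketch of this extraction step (the array layout, the region size, and the conversion of the E0-relative positions (dx#i', dy#i') to absolute pixel coordinates are all assumptions for illustration), the region specification could look like the following:

```python
import numpy as np

def extract_monocular_images(captured, centers_xy, size=512):
    """Crop a size x size region around each corrected optical axis center.
    `captured` is an H x W (x C) array; `centers_xy` maps an image index i to the
    center position (x, y) in absolute pixel coordinates of the captured image."""
    half = size // 2
    crops = {}
    h, w = captured.shape[:2]
    for i, (x, y) in centers_xy.items():
        x0 = int(round(x)) - half
        y0 = int(round(y)) - half
        # Clamp so the crop stays inside the sensor area.
        x0 = max(0, min(x0, w - size))
        y0 = max(0, min(y0, h - size))
        crops[i] = captured[y0:y0 + size, x0:x0 + size]
    return crops

# Hypothetical usage: a dummy captured image and centers already converted
# from (dx#i', dy#i') to absolute pixel coordinates.
dummy = np.zeros((3000, 4000, 3), dtype=np.uint8)
centers = {0: (2000.0, 1500.0), 1: (2400.0, 1100.0), 2: (1600.0, 1100.0),
           3: (1600.0, 1900.0), 4: (2400.0, 1900.0)}
monocular_images = extract_monocular_images(dummy, centers)
print({i: img.shape for i, img in monocular_images.items()})
```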

As described above, in the camera system 1 to which the multi-eye interchangeable lens 20 is mounted, the multi-eye interchangeable lens 20 includes the feeding unit 23 provided with the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R arranged so as not to overlap as viewed in the direction of the lens barrel optical axis, and, for each monocular image E#i on the unknown captured image, the optical axis center position (dx#i', dy#i') is obtained as the mounting error reflection position information of the monocular image E#i.

Therefore, regardless of the feeding state of the feeding unit 23, images of a plurality of viewpoints, that is, the monocular images E#i whose viewpoints are the positions of the monocular lenses 31_i, can be easily obtained from the unknown captured image.

Further, the baseline angle tan^-1(dy#i/dx#i) indicating the direction of the baseline L0#i (Fig. 5) can be obtained from the optical axis center position (dx#i', dy#i') as the mounting error reflection position information, and accurate parallax information that is not affected by the attachment error in the rotational direction of the multi-eye interchangeable lens 20 can be obtained using this baseline angle.

Here, for example, suppose that an xy coordinate system with the optical axis center position (x0, y0) of the monocular image E0 on the known captured image as the origin is adopted, and that the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction are 0. In this case, in the unknown captured image, the optical axis center position, which is the mounting error reflection position information (x0', y0') of the monocular image E0, is the origin, and the region of the monocular image E0 is a region centered on the origin.

Meanwhile, in the case where the attachment error X_Error in the horizontal direction or the attachment error Y_Error in the vertical direction is not 0, the optical axis center position of the monocular image E0 in the unknown captured image is shifted from the origin by the attachment error X_Error in the horizontal direction and the attachment error Y_Error in the vertical direction.

In this case, assuming that the optical axis center position as the mounting error reflection position information (x0', y0') of the monocular image E0 is the origin, when a rectangular region having a predetermined size centered on the origin is specified as the region of the monocular image E0, the optical axis center position as the actual mounting error reflection position information (x0', y0') is shifted from the origin, and therefore, a rectangular region having a predetermined size centered on the position shifted from the actual optical axis center position of the monocular image E0 on the unknown captured image is specified as the region of the monocular image E0.

As a result, for each of the other monocular images E1 to E4 as well, a rectangular region on the unknown captured image of a predetermined size centered on a position shifted from the optical axis center position (x#i', y#i') of the monocular image E#i is specified as the region of the monocular image E#i.

That is, in the case where the attachment error X_Error in the horizontal direction or the attachment error Y_Error in the vertical direction is not 0, a rectangular region on the unknown captured image of a predetermined size centered on a position shifted by the same amount of translation from the optical axis center position (x#i', y#i') of the monocular image E#i is specified as the region of the monocular image E#i, for each of the monocular images E0 to E4.

Note that, both in the case where the attachment error X_Error in the horizontal direction or the attachment error Y_Error in the vertical direction is not 0 and in the case where the attachment errors X_Error and Y_Error are both 0, the baseline angle tan^-1(dy#i/dx#i) at the time of imaging the unknown captured image is obtained from the optical axis center position (dx#i', dy#i'), which is the relative mounting error reflection position information with the position of the monocular image E0 as a reference.

Therefore, the baseline angle obtained for the monocular image E#i centered on a position shifted by the same amount of translation from the optical axis center position (x#i', y#i') is the same as the baseline angle obtained for the monocular image E#i centered on the optical axis center position (x#i', y#i') on the unknown captured image.

That is, even in the case where the attachment error X_Error in the horizontal direction or the attachment error Y_Error in the vertical direction is not 0, the same baseline angle as in the case where the attachment errors X_Error and Y_Error are 0 can be obtained. Then, using this baseline angle, accurate parallax information that is not affected by the attachment error of the multi-eye interchangeable lens 20 can be obtained from the monocular images E#i whose regions are specified on the unknown captured image.

Note that, in the present embodiment, the individual difference reflection position information and the individual difference spot light position information are obtained for each individual multi-eye interchangeable lens 20 and are included in and stored as the lens information. However, common values of the individual difference reflection position information and the individual difference spot light position information may be adopted for each model of the multi-eye interchangeable lens 20. In the case of adopting common individual difference reflection position information and individual difference spot light position information for each model of the multi-eye interchangeable lens 20, the individual difference reflection relative position information (dx#i, dy#i) and the individual difference spot light position information (X_L, Y_L) and (X_R, Y_R) for each model are incorporated into equations (2) and (3), so that the camera body 10 can obtain the rotation error θ_Error of equation (2) and finally the optical axis center positions (dx#i', dy#i') as the mounting error reflection position information of equation (3), as long as the model of the multi-eye interchangeable lens 20 can be identified.

< example of configuration of the image processing unit 53 >

Fig. 7 is a block diagram showing a configuration example of the image processing unit 53 in fig. 2.

Fig. 7 shows, for example, a configuration example of a part of the image processing unit 53 that performs image processing for obtaining parallax information.

In fig. 7, the image processing unit 53 includes a calibration data generation unit 101, a calibration data storage unit 102, an interpolation unit 103, and a parallax information generation unit 104.

The image processing unit 53 generates parallax information about parallax by using the monocular image supplied from the region specifying unit 52. Here, the parallax information includes a parallax value representing parallax by the number of pixels, a distance in the depth direction corresponding to the parallax, and the like.

Parallax information obtained using the monocular images is affected by the positions of the monocular lenses 31_i, lens distortion, and the like. Therefore, in order to remove the influence of the positions of the monocular lenses 31_i, lens distortion, and the like, the image processing unit 53 performs calibration to generate, as calibration data, information regarding the positions of the monocular lenses 31_i, lens distortion, and the like.

In the calibration, for example, calibration data is generated from a calibration image that is a monocular image on a captured image obtained by imaging a plane chart for calibration (hereinafter also referred to as a calibration chart) as a known object.

That is, in calibration, the focus position of the camera system 1 is controlled to (the position of) a certain predetermined distance, and a calibration chart installed at the focus position is photographed. Then, calibration data for the focus position of the camera system 1 in which the focus position is controlled to a certain predetermined distance is generated using the calibration image that is the monocular image obtained by imaging the calibration chart.

The calibration data generated as described above is calibration data of the focal position of the camera system 1 when the calibration chart is captured. Therefore, for example, in the case where the focus position in the case of imaging a general subject is different from the focus position in the case of imaging a calibration chart, if the parallax information calculation using the monocular image appearing in the captured image obtained by imaging the general subject is performed using the calibration data of the focus position of the camera system 1 at the time of capturing the calibration chart, the accuracy of the parallax information is lowered.

Therefore, the image processing unit 53 can obtain high-precision parallax information by interpolating the calibration data.

The calibration data generation unit 101 generates calibration data for each of a plurality of feed amounts from the feed amount detection unit 64 by using the monocular image supplied from the area specifying unit 52, and supplies the calibration data to the calibration data storage unit 102.

That is, the calibration data generation unit 101 generates calibration data of a plurality of feed amounts corresponding to a plurality of focus positions from the monocular image as calibration images at a plurality of focus positions obtained by imaging the calibration chart at a plurality of focus positions corresponding to a plurality of feed amounts of the feed unit 23 in calibration, and supplies the calibration data to the calibration data storage unit 102.

The calibration data storage unit 102 stores calibration data of a plurality of feed amounts supplied from the calibration data generation unit 101.

The interpolation unit 103 generates calibration data for the feed amount supplied from the feed amount detection unit 64 at the time of imaging a general subject (a subject other than the calibration chart; such imaging is hereinafter also referred to as general imaging) by performing interpolation using the calibration data of the plurality of feed amounts and the like stored in the calibration data storage unit 102, and supplies the generated calibration data to the parallax information generation unit 104.

The parallax information generating unit 104 generates parallax information by using the calibration data from the interpolation unit 103 and a monocular image on a captured image captured by general imaging (hereinafter also referred to as a general captured image) supplied from the region specifying unit 52. The general photographed image is equivalent to the unknown photographed image.

According to the camera system 1 including the image processing unit 53 as described above, for example, an RGB image as a monocular image having an image quality similar to that of a general monocular camera and parallax information as depth information can be obtained at the same time. The parallax information can be used for, for example, image processing such as refocusing to generate (reconstruct) an image focused on an arbitrary object, generating an obstacle removed image in which an obstacle for a desired object desired by a user has been removed, lens simulation for simulating a lens having an arbitrary characteristic, and synthesis in consideration of a CG depth and an actual shot.

With the camera system 1, for example, after the camera system 1 is manufactured, the operation mode of the camera system 1 is set to a calibration mode for performing calibration, and calibration is performed in the factory or the like of the camera system 1.

That is, in the camera system 1, the focus of the multi-eye interchangeable lens 20 is controlled to a focus position (hereinafter also referred to as a reference focus position) at which calibration data is generated, and a calibration chart is mounted at the reference focus position and photographed. Then, in the camera system 1, the calibration data generation unit 101 generates calibration data of the feed amount of the feeding unit 23 (hereinafter also referred to as a reference feed amount) corresponding to the reference focus position from the calibration image obtained by imaging the calibration chart, and causes the calibration data storage unit 102 to store the calibration data. Calibration is performed for a plurality of reference focus positions, thereby generating calibration data of a plurality of reference feed amounts corresponding to the plurality of reference focus positions.

When calibration is performed and the camera system 1 is shipped from the factory, the operation mode of the camera system 1 is set to a general imaging mode in which general imaging is performed. In the general imaging mode, a feed amount (hereinafter also referred to as an imaging feed amount) of the feed unit 23 corresponding to a focus position (hereinafter also referred to as an imaging focus position) at the time of capturing a general captured image is supplied from the feed amount detection unit 64 to the interpolation unit 103.

The interpolation unit 103 generates calibration data of an imaging feed amount corresponding to the imaging focus position by interpolation using calibration data of a plurality of reference feed amounts corresponding to a plurality of reference focus positions stored in the calibration data storage unit 102, and supplies the generated calibration data to the parallax information generation unit 104. Note that in the case where there is calibration data for all positions, interpolation is not required.
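A minimal sketch of this interpolation step, assuming for illustration that the calibration data of each reference feed amount can be represented as an array of values (for example, positional deviations per grid point) and that linear interpolation between the two bracketing reference feed amounts is sufficient; neither representation choice is stated in the text.

```python
import numpy as np

def interpolate_calibration(reference: dict, feed_amount: float) -> np.ndarray:
    """Linearly interpolate calibration data between the two reference feed
    amounts that bracket `feed_amount`. `reference` maps a feed amount to an
    array of calibration values (e.g., positional deviations per grid point)."""
    feeds = sorted(reference)
    if feed_amount <= feeds[0]:
        return reference[feeds[0]]
    if feed_amount >= feeds[-1]:
        return reference[feeds[-1]]
    hi = next(f for f in feeds if f >= feed_amount)
    lo = feeds[feeds.index(hi) - 1]
    t = (feed_amount - lo) / (hi - lo)
    return (1.0 - t) * reference[lo] + t * reference[hi]

# Hypothetical calibration data for three reference feed amounts.
reference_data = {10.0: np.full((5, 5, 2), 0.2),
                  20.0: np.full((5, 5, 2), 0.5),
                  30.0: np.full((5, 5, 2), 0.9)}
print(interpolate_calibration(reference_data, 14.0)[0, 0])
```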

The parallax information generation unit 104 generates parallax information from the monocular images on a general captured image captured in a state where the feeding unit 23 is fed out by the imaging feed amount, using the calibration data for the imaging feed amount from the interpolation unit 103. As a result, accurate parallax information in which the influence of lens distortion and the like is suppressed is generated.

Fig. 8 is a diagram for describing calibration performed by the camera system 1.

In calibration, a position P at a predetermined distance Z mm from the camera system 1 is set as a reference focus position P, a calibration chart is mounted at the reference focus position P, and the camera system 1 images the calibration chart.

The calibration chart shown in fig. 8 is, for example, an object on which a grid-like pattern is drawn, but any object having a known positional relationship or the like may be employed as the calibration chart.

In the calibration, calibration data of the feed amount corresponding to the reference focus position P is generated from a calibration image obtained by imaging the calibration chart at the reference focus position P.

In the calibration image obtained by the camera system 1 imaging the calibration chart, a positional deviation occurs, for example due to the lens distortion of the monocular lens 31_i or the like, between the true position where the subject should appear in the calibration image (the position where the subject would originally appear if there were no lens distortion or the like) and the position where the subject actually appears in the calibration image.

Since the calibration image is a monocular image on the captured image obtained by mounting and imaging the calibration chart (which is a known object) at the focal position P (which is a known position), the true position where the object should appear, that is, the true position where each part (e.g., grid point) of the calibration chart should appear on the calibration image, can be obtained in advance by calculation.

Furthermore, the actual position of the object appearing in the calibration image may be obtained from the calibration image.

The calibration data generation unit 101 obtains the actual position where the object (for example, the grid points of the calibration chart) appears from the calibration image. Then, the calibration data generation unit 101 generates, as calibration data, information on a positional deviation between the actual position where the object appears and the true position where the object should appear, which can be obtained in advance by calculation.

Note that as the calibration data, so-called internal parameters and external parameters of the camera system 1 may be generated, but here, for the sake of simplifying the description, information on positional deviation of an object in a calibration image obtained by imaging the calibration chart by the camera system 1 is generated as the calibration data.
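A sketch of how the positional deviation used here as calibration data could be computed from detected versus true grid-point positions; the grid-point coordinates are hypothetical.

```python
import numpy as np

def positional_deviations(true_points: np.ndarray, detected_points: np.ndarray) -> np.ndarray:
    """Calibration data as described above: the deviation between where each
    grid point of the calibration chart actually appears in the calibration
    image and where it should appear without lens distortion or the like."""
    return detected_points - true_points

# Hypothetical grid points (N x 2 arrays of pixel coordinates).
true_pts = np.array([[100.0, 100.0], [200.0, 100.0], [100.0, 200.0]])
detected_pts = np.array([[101.2, 99.4], [201.9, 100.7], [100.8, 201.5]])
print(positional_deviations(true_pts, detected_pts))
```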

Fig. 9 is a diagram for describing generation of calibration data of a plurality of feed amounts corresponding to a plurality of reference focus positions.

In generating calibration data of a plurality of feed amounts corresponding to a plurality of reference focus positions, first, a position P1 at a predetermined distance Z1 mm from the camera system 1 is set as a reference focus position P1, a calibration chart is installed at the reference focus position P1, and the camera system 1 images the calibration chart.

Then, a position P2 at a distance Z2 mm different from the distance Z1 mm from the camera system 1 is set as a reference focus position P2, the calibration chart is mounted at the reference focus position P2, and the camera system 1 images the calibration chart.

Further, a position P3 at a distance Z3 mm different from the distances Z1 and Z2 mm from the camera system 1 is set as a reference focus position P3, a calibration chart is installed at the reference focus position P3, and the camera system 1 images the calibration chart.

Note that, in fig. 9, the distances Z1 mm, Z2 mm, and Z3 mm have the relationship represented by the expression Z1 mm < Z2 mm < Z3 mm.

The camera system 1 controls the focal point position to the position P1, and images the calibration chart installed at the position P1. By this imaging, a calibration image at the reference focus position P1, that is, a calibration image having the position P1 as the reference focus position is obtained.

Similarly, the camera system 1 controls the focal position to the positions P2 and P3, and images the calibration chart installed at the positions P2 and P3. By the imaging, calibration images at the reference focus positions P2 and P3, that is, calibration images having the positions P2 and P3 as reference focus positions are obtained.

Note that the imaging sequence of the calibration chart mounted at the reference focus positions P1, P2, and P3 is not particularly limited.

Further, here, three different positions P1, P2, and P3 are employed as the reference focus positions, but two different positions, or four or more different positions, may be employed as the reference focus positions instead.

As described above, in the camera system 1, calibration data of a plurality of feed amounts corresponding to a plurality of reference focus positions (reference focus positions P1, P2, P3) is generated from a calibration image obtained by imaging a calibration chart by the camera system 1 that controls the focus position to a plurality of positions (reference focus positions P1, P2, and P3).

Fig. 10 is a diagram for describing general imaging performed by the camera system 1.

In general imaging, the camera system 1 sets a position P4 at an arbitrary distance Z4 mm from the camera system 1 as an imaging focus position P4, and images a general subject present at the imaging focus position P4.

Note that, in fig. 10, the position P4 does not match any of the positions P1, P2, and P3, and is a position farther than the position P1 and closer than the position P2.

In the camera system 1, image processing is performed on a general captured image obtained by imaging a subject in the camera system 1 in which the focus position is controlled to the imaging focus position P4.

Here, in the camera system 1, in the case where the imaging focus position at the time of performing the general imaging does not match any of the plurality of reference focus positions at the time of calibration, if the image processing is performed on the general captured image using the calibration data of any of the plurality of reference feed amounts corresponding to the plurality of reference focus positions, inappropriate image processing may be performed.

In fig. 10, the imaging focus position P4 does not match any of the reference focus positions P1, P2, and P3. Therefore, if image processing is performed, using the calibration data of any of the reference feed amounts P1, P2, and P3 corresponding to the reference focus positions P1, P2, and P3, on a general captured image captured by the camera system 1 in which the focus position is controlled to the imaging focus position P4, that is, in which the imaging feed amount is set to the feed amount corresponding to the imaging focus position P4, inappropriate image processing may be performed.

Therefore, in the camera system 1, in order to perform appropriate image processing on a general captured image, the interpolation unit 153 (fig. 7) generates calibration data of the imaging feed amount P4 corresponding to the imaging focus position P4 by interpolation using the calibration data of the reference feed amounts P1, P2, and P3 corresponding to the reference focus positions P1, P2, and P3.

That is, in the case where the focal position (i.e., the feed amount of the feed unit 23) is different, the lens condition (lens state) of the camera system 1 is different. Therefore, even in the same pixel of a general captured image, the positional deviation (amount) differs between the case where the feed amount of the feed unit 23 is a feed amount corresponding to a certain focus position and the case where the feed amount is a feed amount corresponding to another focus position.

Therefore, if the parallax information is generated for a general captured image using the calibration data of the reference feed amount that does not match the imaging feed amount, the error of the parallax information caused by the above positional deviation is improperly corrected, and accurate parallax information may not be obtained.

In order to perform appropriate image processing on a general captured image, that is, in order to obtain accurate parallax information, for example, the camera system 1 images a calibration chart at a plurality of (different) reference focus positions, and generates calibration data of a plurality of reference feed amounts corresponding to the plurality of reference focus positions from calibration images at the plurality of reference focus positions obtained by the imaging.

Further, the camera system 1 generates calibration data of an imaging feed amount corresponding to the imaging focus position by interpolating calibration data using a plurality of reference feed amounts corresponding to a plurality of reference focus positions, and performs image processing such as generation of parallax information using the calibration data for the imaging feed amount.

Fig. 11 is a diagram for describing generation by interpolating calibration data of the imaging feed amount corresponding to the imaging focus position P4.

In fig. 11, the vertical direction (vertical axis) represents calibration data, and the horizontal direction (horizontal axis) represents the feed amount of the feed unit 23.

In fig. 11, calibration data of each of the reference feed amounts P1, P2, and P3 corresponding to the reference focus positions P1, P2, and P3 is represented by a circle.

The interpolation unit 103 generates calibration data of the imaging feed amount P4 corresponding to the imaging focus position P4 (the portion indicated by a triangle in the drawing) by performing linear interpolation or another interpolation using at least two of the calibration data of the reference feed amounts P1, P2, and P3.

As described above, the camera system 1 generates calibration data of the imaging feed amount P4 by interpolation using calibration data of the plurality of feed amounts P1, P2, and P3.

Therefore, it is possible to perform appropriate image processing on the general captured image at the imaging focus position P4 using the calibration data of the imaging feed amount P4 in the case where the feed amount of the feed unit 23 is the imaging feed amount P4.
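The following sketch (Python, with hypothetical names such as interpolate_calibration_data; not the actual implementation of the interpolation unit) illustrates linear interpolation of calibration data between the two reference feed amounts that bracket the imaging feed amount, assuming the calibration data can be represented as a numeric array.

```python
import numpy as np

def interpolate_calibration_data(ref_feed_amounts, ref_calib_data, imaging_feed_amount):
    """Linearly interpolate calibration data for an imaging feed amount.

    ref_feed_amounts:    1-D array of reference feed amounts (e.g. for P1, P2, P3),
                         sorted in ascending order.
    ref_calib_data:      array of shape (len(ref_feed_amounts), ...) holding the
                         calibration data (e.g. per-point positional deviations)
                         generated for each reference feed amount.
    imaging_feed_amount: feed amount corresponding to the imaging focus position.
    """
    ref_feed_amounts = np.asarray(ref_feed_amounts, dtype=float)
    ref_calib_data = np.asarray(ref_calib_data, dtype=float)
    # Find the two reference feed amounts that bracket the imaging feed amount
    # (outside the reference range, this extrapolates linearly).
    i = np.clip(np.searchsorted(ref_feed_amounts, imaging_feed_amount),
                1, len(ref_feed_amounts) - 1)
    f0, f1 = ref_feed_amounts[i - 1], ref_feed_amounts[i]
    t = (imaging_feed_amount - f0) / (f1 - f0)
    # Linear interpolation between the two calibration data sets.
    return (1.0 - t) * ref_calib_data[i - 1] + t * ref_calib_data[i]

# Hypothetical example: scalar calibration data for three reference feed amounts.
print(interpolate_calibration_data([1.0, 2.0, 3.0], [0.10, 0.25, 0.45], 1.4))
```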

For example, in the case where there is nonlinear distortion in a monocular image extracted from a captured image (in an object appearing in the monocular image), image processing using calibration data in the image processing unit 53 is useful. According to the image processing using the calibration data in the image processing unit 53, image processing in which distortion of a monocular image is corrected can be performed. Note that in the calibration, the information on the positional deviation from the true position where the object should appear, as the calibration data, may be generated not for every pixel of the monocular image but only for some pixels, and the information for the other pixels may be generated by interpolation in the spatial direction at the time of general imaging.

<Configuration example of the light sources 32L and 32R>

Fig. 12 is a sectional view showing a configuration example of the light sources 32L and 32R.

Here, hereinafter, the light sources 32L and 32R will be referred to as the light source 32 unless it is necessary to distinguish them.

In fig. 12, the light source 32 includes a housing 121, an LED 122, and a lens 123.

The housing 121 is, for example, an elongated cylindrical housing, and accommodates therein the LED 122 and the lens 123.

The LED 122 emits light as spot light.

The lens 123 collects point light emitted from the LED 122.

In the light source 32 configured as described above, the point light emitted from the LED 122 is condensed by the lens 123.

Therefore, the point light emitted from the light source 32 in fig. 12 (ideally) is condensed at one point and then diffused, and thus is non-parallel light (light that is not parallel light).

Here, a point at which non-parallel light, which is point light emitted from the light source 32 in fig. 12, is condensed is also referred to as a condensing point.

Hereinafter, assuming that the light source 32 emits non-parallel light as spot light, a method of detecting the feeding amount of the feeding unit 23 will be described.

<Configuration example of the multi-eye interchangeable lens 20>

Fig. 13 is a sectional view showing a configuration example of the multi-eye interchangeable lens 20.

As described with reference to fig. 1, the feeding unit 23 is provided with the monocular lenses 31i and the light sources 32. The feeding unit 23 is configured to be movable within the cylindrical lens barrel 21 in the direction of the barrel optical axis (the up-down direction in the figure), and can be fed out from the frontmost side (the image sensor 51 side) toward the depth side. The feeding unit 23 moves the monocular lenses 31i to adjust the focus.

< first method for detecting feed amount >

Fig. 14 is a diagram for describing a first detection method of detecting the feed amount of the feed unit 23.

A of fig. 14 shows a cross section of the multi-eye interchangeable lens 20 in the minimum feed state (wide end) where the feed unit 23 is fed out to the minimum extent (at the foremost side). B of fig. 14 shows a cross section of the multi-eye interchangeable lens 20 in the maximum feed state (distal end) where the feed unit is fed out to the maximum extent (on the deepest side).

In the case of adopting the first detection method, the camera system 1 is configured such that the image sensor 51 is located between the condensing point at which the non-parallel light as the spot light is condensed when the feeding unit 23 is in the minimum feeding state and the condensing point at which the non-parallel light as the spot light is condensed when the feeding unit 23 is in the maximum feeding state.

Further, the light source 32 emits non-parallel light as spot light in the same direction as the barrel optical axis.

As described above, in the case where the image sensor 51 is located between the condensing point at which the spot light is condensed when the feeding unit 23 is in the minimum feeding state and the condensing point at which the spot light is condensed when the feeding unit 23 is in the maximum feeding state, the difference between the size of the spot light image in the minimum feeding state (hereinafter also referred to as the spot size) (for example, the radius of the spot light image, etc.) and the spot size in the maximum feeding state is minimized.

In the case where the light source 32 emits non-parallel light as spot light, the spot size varies according to the feeding amount of the feeding unit 23, and therefore, the feeding amount of the feeding unit 23 can be detected according to the spot size.

Fig. 15 is a (cross-sectional) view showing an example of a change in the spot size of non-parallel light as spot light.

As shown in fig. 15, the dot size at the convergence point is minimized, and the dot size increases as the distance from the convergence point increases in the lens barrel optical axis direction.
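As a simple geometric approximation, assuming the non-parallel light forms an ideal cone with half-angle θ (a symbol introduced here only for illustration and not used in the present disclosure), the spot radius can be modeled as

```latex
r(z) \approx r_{0} + |z - z_{c}| \tan\theta
```

where z is the position along the lens barrel optical axis, z_c is the position of the condensing point, and r_0 is the minimum spot radius at the condensing point; the radius thus grows with the distance from the condensing point on either side.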

In the case where the image sensor 51 is located between the condensing point where the spot light is condensed when the feeding unit 23 is in the minimum feeding state and the condensing point where the spot light is condensed when the feeding unit 23 is in the maximum feeding state, there may be two feeding amounts where the spot sizes become the same.

That is, for example, as shown in fig. 15, two positions located at the same distance from the convergence point in the lens barrel optical axis direction have different feed amounts but have the same dot size.

The existence of two feed amounts for a particular spot size as described above is also referred to as binary uncertainty.

In the case where there are two feed amounts for a certain spot size, it is necessary to specify, from the two feed amounts, the true feed amount, that is, the feed amount in the state where the spot light image of that spot size is actually obtained, thereby resolving the binary uncertainty.

Examples of a method of specifying the true feed amount from the two feed amounts in a case where there are two feed amounts for a certain spot size include a method of performing template matching of the spot light image, and a method of slightly increasing or decreasing the feed amount and detecting whether the spot size increases or decreases in response to the change direction of the feed amount.

Fig. 16 is a flowchart for describing an example of processing of detecting the feed amount by the first detection method.

In the process according to the flowchart of fig. 16, the binary uncertainty is solved by performing template matching of the spot light image.

In step S111, the spot light image detection unit 62 acquires lens information from the multi-eye interchangeable lens 20 via the communication unit 56, and the process proceeds to step S112.

In step S112, the control unit 61 waits for the user to perform an operation of adjusting the focus, and transmits information for specifying the focus to the multi-eye interchangeable lens 20 via the communication unit 56 according to the operation. In the multi-eye interchangeable lens 20, the control unit 43 moves (feeds) the feeding unit 23 according to the information for specifying the focus sent via the communication unit 56, and the processing proceeds to step S113. Note that the control unit 61 can automatically adjust the focus by an autofocus function or the like without user operation.

In step S113, the image sensor 51 captures an image and supplies the captured image to the area specifying unit 52, the position calculating unit 57, and the spot light image detecting unit 62, and the processing proceeds to step S114.

In step S114, the spot light image detection unit 62 detects a spot light image from (the vicinity of) the captured image from the image sensor 51 based on the individual difference spot light position information included in the lens information acquired from the multi-eye interchangeable lens 20, and supplies the image as a spot light image to the feed amount detection unit 64, and the processing proceeds to step S115.

Here, as described with reference to fig. 6, a value common to each model of the multi-eye interchangeable lens 20 may be adopted as the individual difference point light position information. In the case of employing common individual difference point light position information for each model of the multi-eye interchangeable lens 20, the individual difference point light position information for each model is stored in the camera body 10 in advance, so that the camera body 10 does not need to acquire the individual difference point light position information (lens information including the individual difference point light position information) from the multi-eye interchangeable lens 20. This similarly applies to the embodiments described below.

In step S115, the feed amount detection unit 64 acquires the spot light image information that has not been matched with the spot light image from the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63, and the process proceeds to step S116.

Here, in the case of employing the first detection method, the feed amount information storage unit 63 stores feed amount information in which each of a plurality of feed amounts is associated with the spot light image information at the time when the feed unit 23 is fed out by the feed amount, wherein an image as a spot light image is used as the spot light image information.

In step S116, the feed amount detection unit 64 performs (template) matching between the spot light image detected from the captured image (hereinafter also referred to as a detected spot light image) in step S114 and the spot light image information acquired in the immediately preceding step S115, and the processing proceeds to step S117.

In step S117, the feed amount detection unit 64 determines whether matching with the detected spot light images is performed for all the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63.

In step S117, in a case where it is determined that all the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63 has not been completed, the process returns to step S115.

Then, in step S115, the point light image information that has not been matched with the point light image is acquired from the point light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63, and the similar processing is repeated thereafter.

Further, in step S117, in a case where it is determined that all the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63 has been completed, the process proceeds to step S118.

In step S118, the point light image information that most matches the detected point light image is selected from the point light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63, and the feed amount associated with the point light image information is detected as the feed amount of the feeding unit 23 and supplied to the image processing unit 53, and the processing is terminated. Here, for example, in a case where the current focus position as the focus position after the focus adjustment in step S112 is not used as the focus position at the time of generating the feed amount information, the current focus position and the focus position at the time of generating the feed amount information with the feed amount detected in step S118 do not completely match each other, and the accuracy of the feed amount detected in step S118 may deteriorate. Therefore, as the feed amount information, in addition to the feed amount and the point light image information, information in which the focus position when the feeding unit 23 is fed out by the feed amount is associated may be employed. In this case, in step S112, when the current focus position does not match any of the focus positions included in the feed amount information, the current focus position may be drawn (readjusted) to the focus position closest to the current focus position among the focus positions included in the feed amount information. As a result, in step S118, an accurate feed amount can be detected.

Note that in fig. 16, the feeding amount is detected using an image as a spot light image as spot light image information and performing image matching. Alternatively, for example, the feed amount may be detected using a one-dimensional intensity distribution or a two-dimensional intensity distribution of the spot light image as spot light image information and performing matching of the one-dimensional intensity distribution or the two-dimensional intensity distribution.

Further, in the case where one or more light sources 32 are provided, the first detection method may be employed.
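A minimal sketch of this matching (Python, with hypothetical names such as detect_feed_amount_by_matching and a simple sum-of-squared-differences score; the actual matching criterion used by the feed amount detection unit 64 is not specified here) is shown below. Because the whole stored spot light image, not only its size, is compared, the binary uncertainty does not remain.

```python
import numpy as np

def detect_feed_amount_by_matching(detected_spot_image, feed_amount_info):
    """Return the feed amount whose stored spot light image best matches
    the detected spot light image (first detection method).

    detected_spot_image: 2-D array, the spot light image cut out of the
                         captured image around the known spot light position.
    feed_amount_info:    list of (feed_amount, stored_spot_image) pairs held
                         as feed amount information.
    """
    detected = np.asarray(detected_spot_image, dtype=float)
    best_feed, best_score = None, np.inf
    for feed_amount, stored in feed_amount_info:
        stored = np.asarray(stored, dtype=float)
        # Sum of squared differences as a simple template-matching score;
        # the smaller the score, the better the match.
        score = np.sum((detected - stored) ** 2)
        if score < best_score:
            best_feed, best_score = feed_amount, score
    return best_feed
```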

< second method for detecting feed amount >

Fig. 17 is a diagram for describing a second detection method of detecting the feed amount of the feed unit 23.

In the case where the second detection method is adopted, the camera system 1 is configured such that the image sensor 51 is located between the condensing point where the non-parallel light as the spot light is condensed when the feeding unit 23 is in the minimum feeding state and the condensing point where the non-parallel light as the spot light is condensed when the feeding unit 23 is in the maximum feeding state, similarly to the first detection method.

Further, similarly to the first detection method, the light sources 32L and 32R emit non-parallel light as spot light in the same direction as the barrel optical axis.

Note that, in the case of adopting the second detection method, the camera system 1 is configured such that, for the same feed amount, the condensing point of the spot light emitted by the light source 32L and the condensing point of the spot light emitted by the light source 32R are located at points having different distances from the image sensor 51.

In fig. 17, the distance from the image sensor 51 to the condensing point of the spot light emitted from the light source 32L with a certain feed amount is different from the distance from the image sensor 51 to the condensing point of the spot light emitted from the light source 32R. That is, in fig. 17, the image sensor 51 exists on the lower side in the figure, and the condensing point of the spot light emitted by the light source 32L is closer to the image sensor 51 than the condensing point of the spot light emitted by the light source 32R.

In this case, the binary uncertainty cannot be solved only by the dot size of the point light emitted by one of the light sources 32L and 32R, but the binary uncertainty can be solved by a combination of the dot sizes of the point lights emitted by the light sources 32L and 32R.

Fig. 18 is a flowchart for describing an example of processing of detecting the feed amount by the second detection method.

In the process according to the flowchart of fig. 18, the binary uncertainty is solved by using a combination of the dot sizes of the dot lights emitted by the light sources 32L and 32R.

In steps S121 to S123, similar processing to that in steps S111 to S113 in fig. 16 is performed.

Then, in step S124, the point light image detection unit 62 detects (as images) a point light image PL of point light emitted from the light source 32L and a point light image PR of point light emitted from the light source 32R from the captured image from the image sensor 51 on the basis of individual difference point light position information included in the lens information acquired from the multi-eye interchangeable lens 20, and the processing proceeds to step S125.

In step S125, the dot light image detection unit 62 detects the dot sizes of the dot light images PL and PR and supplies the dot sizes to the feed amount detection unit 64, and the process advances to step S126.

In step S126, the feed amount detection unit 64 selects, from the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63, spot light image information that matches the combination of the dot sizes of the spot light images PL and PR detected in step S125 (hereinafter also referred to as a combination of detected dot sizes). Further, in step S126, the feed amount detection unit 64 detects the feed amount associated with the combined spot light image information matching the detected point size as the feed amount of the feed unit 23, supplies the feed amount to the image processing unit 53, and terminates the processing.

Here, in the case of employing the second detection method, the feed amount information storage unit 63 stores feed amount information in which each of a plurality of feed amounts is associated with the spot light image information when the feed unit 23 is fed out by the feed amount, wherein a combination of the spot sizes of the spot light images PL and PR is used as the spot light image information.

In the second detection method, in the feed amount information in which each of the plurality of feed amounts is associated with a combination of dot sizes of the dot light images PL and PR as the dot light image information when the feeding unit 23 is fed out by the feed amount, the feed amount associated with a combination of dot sizes of the dot light images PL and PR that matches the combination of detected dot sizes is detected as the feed amount of the feeding unit 23.

According to the second detection method, the binary uncertainty is solved, and the feed amount of the feed unit 23 can be detected from (a combination of) the size of the detection point.

Note that in the case where two or more light sources 32 are provided, the second detection method may be employed.
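The following sketch (Python, hypothetical names) illustrates the lookup by a combination of spot sizes: the pair of detected sizes is compared with the stored pairs, and the feed amount of the closest pair is returned. Because the two condensing points are at different distances from the image sensor 51, the pair is assumed to be unique for each feed amount.

```python
import numpy as np

def detect_feed_amount_by_size_pair(size_l, size_r, feed_amount_info):
    """Return the feed amount whose stored pair of spot sizes is closest to
    the detected pair (second detection method).

    size_l, size_r:   detected spot sizes of the spot light images PL and PR.
    feed_amount_info: list of (feed_amount, stored_size_l, stored_size_r)
                      tuples held as feed amount information.
    """
    detected = np.array([size_l, size_r], dtype=float)
    feeds = np.array([f for f, _, _ in feed_amount_info], dtype=float)
    stored = np.array([[sl, sr] for _, sl, sr in feed_amount_info], dtype=float)
    # The pair (size_l, size_r) differs for each feed amount, which resolves
    # the binary uncertainty that a single size would leave.
    idx = np.argmin(np.linalg.norm(stored - detected, axis=1))
    return float(feeds[idx])
```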

< third method for detecting feed amount >

Fig. 19 is a diagram for describing a third detection method of detecting the feed amount of the feed unit 23.

A of fig. 19 shows a cross section of the multi-eye interchangeable lens 20 in a minimum feed state where the feed unit 23 is fed out by the minimum degree (at the forefront). B of fig. 19 shows a cross section of the multi-eye interchangeable lens 20 in a maximum feed state where the feed unit is fed out to the maximum extent (on the deepest side).

If the third detection method is adopted, the camera system 1 is configured such that when the feeding unit 23 is moved from the minimum feeding state to the maximum feeding state, the condensing point at which the non-parallel light as the point light is condensed is located on one of the depth side and the front side including the image sensor 51.

Further, the light source 32 emits non-parallel light as spot light in the same direction as the barrel optical axis.

As described above, in the case where the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, when the condensing point at which the non-parallel light as the point light is condensed is located on one of the depth side and the front side including the image sensor 51, the difference between the dot size of the point light image in the minimum feeding state and the dot size in the maximum feeding state is maximized. Further, in the case where the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the dot size monotonically decreases or monotonically increases. Thus, no binary uncertainty occurs.

Fig. 20 is a diagram for describing a state in which a condensing point at which non-parallel light as spot light is condensed is located on one of the rear side and the front side including the image sensor 51 when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state.

A of fig. 20 shows a state in which, when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the condensing point where the non-parallel light as the spot light is condensed is located on the front side including the image sensor 51 (the side opposite to the side where the monocular lens 31i is provided).

In a of fig. 20, when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the dot size monotonically decreases.

B of fig. 20 shows a state in which a condensing point at which non-parallel light as spot light is condensed is located on the depth side (the side where the monocular lens 31i is provided) including the image sensor 51 when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state.

In B of fig. 20, when the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the dot size monotonically increases.

In the case where the third detection method is adopted, the feed amount information storage unit 63 stores feed amount information in which each of a plurality of feed amounts is associated with the spot light image information when the feed unit 23 is fed out by the feed amount, wherein the spot size of the spot light image is used as the spot light image information.

Then, in the third detection method, the spot light image detection unit 62 detects the spot size of the spot light image, similarly to the second detection method.

Further, in the feed amount detection unit 64, in the feed amount information in which each of the plurality of feed amounts is associated with the dot size of the dot light image as the dot light image information when the feed unit 23 is fed out by the feed amount, the feed amount associated with the dot size as the dot light image information matching the dot size detected by the dot light image detection unit 62 is detected as the feed amount of the feed unit 23.

According to the third detection method, binary uncertainty does not occur, and the feed amount of the feed unit 23 can be detected according to the dot size.

Note that in the case where one or more light sources 32 are provided, the third detection method may be employed.
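Because the spot size changes monotonically in the third detection method, the feed amount can, for example, be obtained by interpolating the stored feed amount information with respect to the detected spot size. The sketch below (Python, hypothetical names) assumes the stored sizes are strictly monotonic over the feeding range.

```python
import numpy as np

def detect_feed_amount_by_size(spot_size, stored_feed_amounts, stored_spot_sizes):
    """Return the feed amount corresponding to a detected spot size
    (third detection method).

    Since the condensing point stays on one side of the image sensor over the
    whole feeding range, the spot size maps one-to-one onto the feed amount.
    """
    sizes = np.asarray(stored_spot_sizes, dtype=float)
    feeds = np.asarray(stored_feed_amounts, dtype=float)
    order = np.argsort(sizes)              # np.interp needs increasing x values
    return float(np.interp(spot_size, sizes[order], feeds[order]))
```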

< fourth method for detecting feed amount >

Fig. 21 is a diagram for explaining a fourth detection method of detecting the feed amount of the feed unit 23. Note that the fourth detection method can be applied not only to the case where the spot light is non-parallel light but also to the case where the spot light is parallel light, as described below.

That is, fig. 21 shows a cross section of the multi-eye interchangeable lens 20.

In the case where the fourth detection method is adopted, the camera system 1 is configured such that the light source 32 emits point light in an oblique direction inclined from the lens barrel optical axis direction.

In fig. 21, the light source 32 emits point light in a direction from the peripheral portion toward the central portion of the image sensor 51.

Fig. 22 is a diagram showing the irradiation position of the spot light in the case where the feeding unit 23 is in the minimum feeding state and the irradiation position of the spot light in the case where the feeding unit 23 is in the maximum feeding state.

A of fig. 22 shows the irradiation position of the spot light in the case where the feeding unit 23 is in the minimum feeding state, and B of fig. 22 shows the irradiation position of the spot light in the case where the feeding unit 23 is in the maximum feeding state.

In fig. 22, the irradiation position of the spot light in the case where the feeding unit 23 is in the minimum feeding state (i.e., the positions of the spot light images PL' and PR' of the spot light emitted by the light sources 32L and 32R) is the position closest to the outer peripheral side of the image sensor 51 (the captured image) within the movable range of the spot light images PL and PR of the spot light.

When the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the point light images PL and PR move toward the center of the image sensor 51.

Then, the irradiation position of the spot light in the case where the feeding unit 23 is in the maximum feeding state (i.e., the positions of the spot light images PL" and PR" of the spot light emitted by the light sources 32L and 32R) is the position closest to the center side of the image sensor 51 within the movable range of the spot light images PL and PR of the spot light.

Fig. 23 is a diagram showing an example of a captured image in which the spot light images PL' and PR' appear in the case where the feeding unit 23 is in the minimum feeding state, and a captured image in which the spot light images PL" and PR" appear in the case where the feeding unit 23 is in the maximum feeding state.

In fig. 23, the spot light images PL' and PR' in the case where the feeding unit 23 is in the minimum feeding state are located on the outermost peripheral side of the captured image.

When the feeding unit 23 moves from the minimum feeding state to the maximum feeding state, the spot light images PL and PR move toward the center of the photographed image.

Then, the spot light images PL" and PR" in the case where the feeding unit 23 is in the maximum feeding state are located on the most central side of the captured image.

As described above, in the case where the light sources 32L and 32R emit the point light in the oblique direction, the positions of the point light images PL and PR vary depending on the feeding amount of the feeding unit 23.

Further, in the case where the light sources 32L and 32R emit the spot light in an oblique direction, for example, a direction from the peripheral portion toward the central portion of the image sensor 51, the distance between (the positions of) the spot light images PL and PR also changes depending on the feeding amount of the feeding unit 23, in addition to the positions of the spot light images PL and PR themselves. In fig. 23, the distance between the spot light images PL' and PR' when the feeding unit 23 is in the minimum feeding state is the maximum value of the distance between the spot light images PL and PR. Further, the distance between the spot light images PL" and PR" when the feeding unit 23 is in the maximum feeding state is the minimum value of the distance between the spot light images PL and PR.

In the fourth detection method, the feeding amount of the feeding unit 23 is detected from the positions of (one or both of) the spot light images PL and PR or the distance between the spot light images PL and PR obtained from these positions.

Fig. 24 is a flowchart for describing an example of processing of detecting the feed amount by the fourth detection method.

In steps S131 to S134, similar processing to that in steps S121 to S124 in fig. 18 is performed.

Then, in step S135, the spot light image detecting unit 62 detects the positions of the spot light images PL and PR (detects the light image positions), and detects the distance between the light images, which is the distance between the positions. The spot light image detecting unit 62 supplies the distance between the light images to the feed amount detecting unit 64, and the processing proceeds from step S135 to step S136.

In step S136, the feed amount detection unit 64 selects, from the spot light image information associated with the feed amount among the feed amount information stored in the feed amount information storage unit 63, spot light image information that matches the distance between the light images detected in step S135 (hereinafter also referred to as the detection distance between the light images). Further, in step S136, the feed amount detection unit 64 detects the feed amount associated with the spot light image information matching the distance between the detected light images as the feed amount of the feeding unit 23, supplies the feed amount to the image processing unit 53, and terminates the processing.

Here, in the case of adopting the fourth detection method, the feed amount information storage unit 63 stores feed amount information in which each of a plurality of feed amounts is associated with the spot light image information when the feed unit 23 is fed out by the feed amount, wherein the distance between the light images is used as the spot light image information.

In the fourth detection method, in the feeding amount information in which each of the plurality of feeding amounts is associated with the distance between the light images as the point light image information when the feeding unit 23 is fed out by the feeding amount, the feeding amount associated with the distance between the light images matching the distance between the detected light images is detected as the feeding amount of the feeding unit 23.

According to the fourth detection method, the feeding amount of the feeding unit 23 can be detected from the distance between the detection light images without causing binary uncertainty.

Note that in the fourth detection method, the position of the spot light image (the detection light image position) may be employed as the spot light image information, instead of the distance between the light images. In the case of adopting the position of the spot light image as the spot light image information, the feed amount information storage unit 63 stores feed amount information in which each of a plurality of feed amounts is associated with the position of the spot light image as the spot light image information when the feed unit 23 is fed out by the feed amount. Further, in this case, the spot light image detection unit 62 detects the position of the spot light image.

Then, the feed amount detecting unit 64 detects the feed amount associated with the position of the spot light image matching the position of the spot light image detected in the spot light image detecting unit 62 as the feed amount of the feeding unit 23 in the feed amount information stored in the feed amount information storing unit 63.

In addition, in the fourth detection method, similarly to the first detection method, an image as a spot light image may be adopted as the spot light image information, and similarly to the second and third detection methods, a spot size may be adopted as the spot light image information.

Here, in the fourth detection method, in the case of employing the distance between light images as spot light image information, it is necessary to provide two or more light sources 32. However, not all of the two or more light sources 32 need to emit point light in an oblique direction, but at least one light source 32 needs to emit point light in an oblique direction.

Further, in the fourth detection method, in the case of adopting the position of the spot light image as the spot light image information, it is necessary to provide one or more light sources 32.
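A minimal sketch of the fourth detection method (Python, hypothetical names; the distance between the light images is assumed to vary monotonically with the feed amount, as in fig. 23) that looks up the feed amount from the detected distance between the spot light images PL and PR:

```python
import numpy as np

def detect_feed_amount_by_distance(pos_pl, pos_pr,
                                   stored_feed_amounts, stored_distances):
    """Return the feed amount corresponding to the detected distance between
    the spot light images PL and PR (fourth detection method).

    pos_pl, pos_pr:      detected positions (x, y) of the spot light images
                         PL and PR on the captured image.
    stored_feed_amounts: feed amounts held as feed amount information.
    stored_distances:    distance between the light images stored for each
                         of those feed amounts.
    """
    distance = float(np.linalg.norm(np.subtract(pos_pl, pos_pr)))
    dists = np.asarray(stored_distances, dtype=float)
    feeds = np.asarray(stored_feed_amounts, dtype=float)
    order = np.argsort(dists)   # the distance changes monotonically with feed
    return float(np.interp(distance, dists[order], feeds[order]))

# Hypothetical example: the distance shrinks as the feed amount increases.
print(detect_feed_amount_by_distance((120, 500), (880, 500),
                                     [0.0, 1.0, 2.0], [800.0, 770.0, 740.0]))
```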

< Another configuration example of the multi-eye interchangeable lens 20>

Fig. 25 is a diagram showing another configuration example of the multi-eye interchangeable lens 20.

Note that in the drawings, portions corresponding to those in fig. 4, 13, and the like are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

The configuration of the multi-eye interchangeable lens 20 in fig. 25 is similar to the case of fig. 4 and 13 and the like except that the light sources 32U and 32D configured similarly to the light sources 32L and 32R are newly provided.

The multi-eye interchangeable lens 20 of fig. 25 has a configuration in which two light sources 32U and 32D as a plurality of light sources are disposed on a straight line (for example, orthogonal straight lines in a plan view) that is not parallel to a straight line connecting the light sources 32L and 32R.

In the case where the multi-eye interchangeable lens 20 having the two light sources 32U and 32D disposed on the straight line orthogonal to the above-described straight line connecting the light sources 32L and 32R is mounted on the camera body 10, when there is a lens tilt about the x-axis or a lens tilt about the y-axis of the multi-eye interchangeable lens 20, the dot size and the position of the dot light image of the dot light emitted by the light sources 32L, 32R, 32U, and 32D are changed as shown in fig. 25, relative to the case where there is no lens tilt.

Therefore, the lens tilt about the x-axis and the lens tilt about the y-axis of the multi-eye interchangeable lens 20 can be detected from (the spot size and the position of) the spot light image.

In this case, when a lens tilt of an amount exceeding the allowable amount is detected, the user may be urged to reattach the multi-eye interchangeable lens 20. Further, the lens tilt amount may be detected, and the parallax information may be obtained or the area of the monocular image may be specified in such a way as to eliminate the influence of the lens tilt amount.

< Another configuration example of light sources 32L and 32R >

Fig. 26 is a sectional view showing another configuration example of the light sources 32L and 32R.

Note that in the drawings, portions corresponding to those in fig. 12 are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

In fig. 26, the light source 32 includes a housing 121, an LED 122, and lenses 123 and 124.

Therefore, the light source 32 of fig. 26 is the same as the case of fig. 12 in terms of including the housing 121 to the lens 123, and is different from the case of fig. 12 in terms of newly including the lens 124.

The lens 124 is disposed on the image sensor 51 side of the lens 123, and converts the point light condensed by the lens 123 into parallel light and emits the parallel light.

Therefore, the point light emitted from the light source 32 in fig. 26 is parallel light. The light source 32 that emits parallel light as point light is hereinafter also referred to as parallel light source 32.

The parallel light source 32 may be provided in the feeding unit 23 of the multi-eye interchangeable lens 20. In the case where the parallel light source 32 is provided in the feeding unit 23, the spot size is constant regardless of the feeding out of the feeding unit 23. Therefore, by keeping the spot size small, the calculation error and the amount of calculation when the center of gravity of the spot light image is obtained as the position of the spot light image are reduced compared with the case of adopting, as the spot light, non-parallel light whose spot size changes. Therefore, the attachment error and the feed amount can be obtained with higher accuracy, and the calculation load when obtaining the attachment error and the feed amount can be reduced.
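For illustration, the following sketch (Python, hypothetical names such as spot_light_image_position) computes the center of gravity of a spot light image as its position; with a small, constant-size spot from the parallel light source 32, such a centroid estimate involves fewer pixels and is less sensitive to the feeding state.

```python
import numpy as np

def spot_light_image_position(image, threshold=0.0):
    """Center of gravity of a spot light image, used as its position.

    image:     2-D array containing (a region around) the spot light image.
    threshold: pixel values at or below this level are treated as background.
    """
    img = np.asarray(image, dtype=float)
    img = np.where(img > threshold, img - threshold, 0.0)
    total = img.sum()
    if total == 0.0:
        raise ValueError("no spot light image found above the threshold")
    ys, xs = np.indices(img.shape)
    # Intensity-weighted centroid returned as (x, y).
    return (xs * img).sum() / total, (ys * img).sum() / total

# Hypothetical example: a tiny synthetic spot.
spot = np.zeros((5, 5))
spot[2, 3] = 4.0
spot[2, 2] = 2.0
print(spot_light_image_position(spot))
```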

A parallel light source 32 may be provided in the feeding unit 23 so that the spot light becomes parallel to the barrel optical axis. Note that in this case, the attachment error can be obtained using the spot light (spot light image), but the feed amount cannot be detected.

The parallel light source 32 is provided in the feeding unit 23 so that the point light is emitted in an oblique direction inclined from the lens barrel optical axis direction, whereby the feeding amount can be detected by the fourth detection method described with reference to fig. 21 to 24.

Further, even in the case of employing the parallel light source 32, the two parallel light sources 32U and 32D as a plurality of parallel light sources may be disposed on a line not parallel to the line connecting the parallel light sources 32L and 32R, for example, on an orthogonal line in a plan view, for example, as in the case of fig. 25.

In this case, when the multi-eye interchangeable lens 20 is mounted on the camera body 10, the position of the spot light image of the spot light emitted by the parallel light sources 32L, 32R, 32U, and 32D changes according to the lens tilt of the multi-eye interchangeable lens 20 about the x-axis or the lens tilt about the y-axis.

Fig. 27 is a diagram showing a state in which the position of the spot light image of the parallel light emitted by the parallel light source 32 changes according to the lens inclination.

In the case where the multi-eye interchangeable lens 20 having the two light sources 32U and 32D disposed on the straight line orthogonal to the straight line connecting the parallel light sources 32L and 32R is mounted on the camera body 10, the position of the spot light image of the parallel light emitted by the parallel light source 32 is changed according to the lens tilt about the x-axis or the lens tilt about the y-axis of the multi-eye interchangeable lens 20.

Therefore, the lens tilt about the x-axis and the lens tilt about the y-axis of the multi-eye interchangeable lens 20 can be detected from the position of the spot light image.

In this case, when a lens tilt of an amount exceeding the allowable amount is detected, the user may be urged to reattach the multi-eye interchangeable lens 20. Further, the lens tilt amount may be detected, and the parallax information may be obtained or the area of the monocular image may be specified in such a way as to eliminate the influence of the lens tilt amount.

< Another example of the Electrical configuration of the Camera System 1>

Fig. 28 is a block diagram showing another electrical configuration example of the camera system 1 of fig. 1.

Note that in the drawings, portions corresponding to those in fig. 2 are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

Here, the camera system 1 (or a lens-integrated camera system) to which the present technology is applied holds monocular image position information for specifying the position of each monocular image on the entire image (captured image) captured by the image sensor 51, that is, monocular image position information indicating, on the image sensor 51, the emission positions of the imaging light emitted through the plurality of monocular lenses 31i. Further, the camera system 1 holds spot light image position information for specifying the position of the spot light image of the spot light of the light source 32.

Here, the entire image refers to the entire captured image captured by the image sensor 51, or an image obtained by deleting a part or all of the outside of all monocular images included in the captured image from the entire captured image.

Note that the monocular image position information and the spot light image position information may be information calculated for each camera system 1, or may be information calculated for each model.

Further, the monocular image position information may be information of the absolute positions of the monocular images, or may be information of the absolute position of the monocular image for one predetermined monocular lens 31i serving as a reference lens, together with information of the relative positions of the other monocular images with respect to the position of the monocular image for the reference lens.

For example, the monocular image position information and the point light image position information held by the camera system 1 may be values corresponding to the individual difference reflection position information (known reference position) and the individual difference point light position information (known light position), respectively, but are not limited thereto.

The camera system 1 corrects monocular image position information by using the position of a spot light image (detected spot light image position information) detected from the entire image at the time of actual imaging (at the time of imaging a general captured image (unknown captured image)).

In the multi-eye interchangeable lens 20 of the camera system 1, the monocular lenses 31i and the light sources 32 are provided integrally. Therefore, even if the monocular lenses 31i are fed out to adjust the focus (or zoom), the detected spot light image position information of the light sources 32, which move integrally with the monocular lenses 31i, can be used to accurately modify (correct) the monocular image position information.

That is, when the monocular lenses 31i are fed out, the amount of positional deviation differs depending on the feed amount due to "various causes". However, even when the monocular lenses 31i are fed out, the positional relationship between the monocular image and the light source 32 (i.e., the positional relationship between the monocular image and the spot light image) does not change. Therefore, by detecting the position of the spot light image on the entire image at the time of actual imaging (the detected spot light image position information) and grasping its deviation from the spot light image position information held by the camera system 1, the monocular image position information of the monocular image can be accurately corrected.

The "various causes" include, for example, tilting of the group of monocular lenses 31i that move integrally at the time of feeding (the monocular lens unit), rotation of the monocular lens unit at the time of feeding, and, even with the same feed amount, a rotation error, a tilt error, and the like at the time of attaching the multi-eye interchangeable lens 20 due to an attachment error.

Since the camera system 1 can accurately modify the monocular image position information of the monocular image, the camera system 1 can accurately extract (cut out) a predetermined range centered on the optical axis center position from the entire image as the monocular image using the modified monocular image position information (modified monocular image position information), and can perform processing (generation of parallax information or the like) that suppresses the influence of lens distortion or the like.

Note that the detected spot light image position information detected from the entire image at the time of actual imaging may be, but is not limited to, a value corresponding to, for example, mounting error spot light position information (unknown light position). In addition, the modified monocular image positional information obtained by modifying the monocular image positional information using the detected spot light image positional information may be, but is not limited to, a value corresponding to, for example, mounting error reflection positional information (unknown reference position).

In fig. 28, the area specifying unit 52 includes a monocular image position information modifying unit 211 and a monocular image extracting unit 212.

The spot light image position information and the monocular image position information stored as (a part of) the lens information of the storage unit 41 of the multi-eye interchangeable lens 20 are supplied from the communication unit 56 to the monocular image position information modifying unit 211. Further, the detected spot light image position information detected from the entire image (captured image) captured by the image sensor 51 is supplied from the spot light image detecting unit 62 to the monocular image position information modifying unit 211.

The monocular image position information modifying unit 211 modifies the monocular image position information from the communication unit 56 using the spot light image position information from the communication unit 56 and the detected spot light image position information from the spot light image detecting unit 62, and supplies the resulting modified monocular image position information to the monocular image extracting unit 212 and the associating unit 221. In the monocular image position information modifying unit 211, modification of the monocular image position information is performed similarly to the process of obtaining the mounting error reflection position information in the position calculating unit 57 (fig. 2).

As described above, the modified monocular image position information is supplied from the monocular image position information modifying unit 211, and the entire image (captured image) is supplied from the image sensor 51 to the monocular image extracting unit 212.

The monocular image extracting unit 212 obtains, using the modified monocular image position information from the monocular image position information modifying unit 211, area information indicating the areas of the monocular images of the monocular lenses 310 to 314 on the entire image from the image sensor 51. For example, the monocular image extracting unit 212 obtains information indicating a rectangular region centered on the modified monocular image position information as the region information.

Then, the monocular image extracting unit 212 extracts the region indicated by the region information from the entire image from the image sensor 51 as a monocular image, and supplies the extracted monocular image to the display unit 54 and the associating unit 221 as necessary.
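The following sketch (Python, hypothetical names such as modify_monocular_positions) illustrates the modification and extraction described above under the simplifying assumption that the deviation between the detected and stored spot light image positions can be treated as a pure translation; handling of rotation or tilt is omitted.

```python
import numpy as np

def modify_monocular_positions(stored_monocular_positions,
                               stored_spot_positions,
                               detected_spot_positions):
    """Shift the stored monocular image positions by the deviation between the
    detected and stored spot light image positions (modified monocular image
    position information).

    All arguments are arrays of (x, y) coordinates on the entire image.
    Since the monocular lenses and the light sources move integrally, the
    same deviation is applied to every monocular image position.
    """
    deviation = (np.asarray(detected_spot_positions, dtype=float)
                 - np.asarray(stored_spot_positions, dtype=float)).mean(axis=0)
    return np.asarray(stored_monocular_positions, dtype=float) + deviation

def extract_monocular_image(whole_image, center_xy, width, height):
    """Cut out a rectangular monocular image centered on a modified position."""
    cx, cy = int(round(center_xy[0])), int(round(center_xy[1]))
    x0, y0 = cx - width // 2, cy - height // 2
    return whole_image[y0:y0 + height, x0:x0 + width]
```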

Further, the monocular image extracting unit 212 supplies the entire image from the image sensor 51 to the display unit 54 and the associating unit 221 as necessary.

The display unit 54 displays the entire image and the monocular image from the monocular image extracting unit 212.

The associating unit 221 associates the modified monocular image position information from the monocular image position information modifying unit 211 with the monocular image or the entire image from the monocular image extracting unit 212.

The associating unit 221 associates, for example, monocular images extracted from the same entire image from the monocular image extracting unit 212. Further, the association unit 221 associates, for example, a monocular image extracted from the same entire image and modified monocular image position information for extracting a monocular image from the monocular image extracting unit 212. Further, the associating unit 221 associates the entire image (captured image) from the monocular image extracting unit 212 with the modified monocular image position information (imaged monocular image position) obtained by modifying the monocular image position information using the detected spot light image position information detected from the entire image from the monocular image position information modifying unit 211, for example.

For example, the association may be performed by recording the associated objects in the same recording medium or by assigning the same Identification (ID) to the associated objects. Alternatively, the association may be performed using, for example, metadata of an association object (metadata of a monocular image or an entire image associated with the modified monocular image position information).

The association unit 221 may collectively record or transmit information (association information) associated by association.

Note that the association unit 221 may, for example, associate the spot light image position information (included in the lens information) with the detected spot light image position information detected from the entire image.

Further, the association information obtained in the association unit 221 may be an object for post-processing in the camera system 1 or the external device. In the post-processing, for example, the monocular image may be extracted from the entire image included in the related information in which the entire image is associated with the modified monocular image position information, using the modified monocular image position information included in the related information.

Further, the association unit 221 may associate the entire image (captured image), the detected spot light image position information (detected light image position) detected from the entire image, the spot light image position information (stored light image position), and the monocular image position information (stored monocular image position). In this case, in the post-processing of the association information in which the entire image, the detected spot light image position information, the spot light image position information, and the monocular image position information are associated with each other, the monocular image position information is modified using the detected spot light image position information and the spot light image position information, and the monocular image can be extracted from the entire image using the resulting modified monocular image position information.

Further, the association unit 221 may associate, for example, spot light image position information (storing light image position), a difference between the spot light image position information and the detected spot light image position information (storing difference between light image position and detected light image position), monocular image position information (storing monocular image position), and the entire image (captured image).

Further, the association unit 221 may employ any association capable of specifying the position of the monocular image on the captured image.

As the target image to be associated, in addition to the entire image and the monocular image extracted from the entire image, one synthesized image in which the monocular image extracted from the entire image is arranged may be employed.

Further, as the target information to be associated with the target image, any information capable of specifying the position of the monocular image on the captured image may be employed in addition to the modified monocular image position information.

As the target information, for example, a set of monocular image position information, spot light image position information, and detected spot light image position information may be employed.

In the association of the target image and the target information, the target image and the target information may be associated with each other and stored in a storage medium, transmitted via a transmission medium, or made into one file.

Here, "associated" means, for example, that one data is made available (linkable) when another data is processed. The form of the target image and the target information as data (file) is arbitrary. For example, the target image and the target information may be collected as one data (file), or they may be set as data (file) separately. For example, the target information associated with the target image may be transmitted on a different transmission path than the transmission path of the target image. Further, for example, the target information associated with the target image may be recorded in a recording medium different from the target image, or may be recorded in a different recording area in the same recording medium. The target image and the target information may be combined into one stream data or one file.

The target image may be a still image or a moving image. In the case of a moving image, the target information and the target image of each frame may be associated with each other.

The "association" may be performed on partial data (file) rather than the entire data of the target image. For example, in the case where the target image is a moving image including a plurality of frames, the target information may be associated with the target image in an arbitrary unit (for example, a plurality of frames, one frame, or a part of frames).

Note that, in the case where the object image and the object information are separate data (files), the object image and the object information can be associated with each other by assigning the same ID (identification number) to both the object image and the object information. Further, for example, in the case where the target image and the target information are put together in one file, the target information may be added to a header or the like of the file.

< post-processing device >

Fig. 29 is a block diagram showing a configuration example of a post-processing device that performs post-processing on associated information.

In fig. 29, the post-processing apparatus 230 includes an area specifying unit 231, an image processing unit 232, a display unit 233, a recording unit 234, and a transmitting unit 235. The region specifying unit 231 includes a monocular image position information modifying unit 241 and a monocular image extracting unit 242.

The associated information in which the entire image, the detected spot light image position information, the spot light image position information, and the monocular image position information are associated is supplied from a recording medium or a transmission medium (not shown) to the post-processing device 230.

Note that there are various choices as to which types of information are associated and how the information is modified. For example, if spot light image position deviation information indicating the positional deviation of the spot light image is calculated in advance from the detected spot light image position information and the spot light image position information, and the entire image or the monocular image is associated with the spot light image position deviation information, it is not necessary to associate the detected spot light image position information and the spot light image position information with the entire image or the like. Further, the monocular image position information may be modified in advance, and the modified monocular image position information may be associated with the entire image or the monocular image. In this case, the monocular image position information does not need to be modified in post-processing. Further, as the detected spot light image position information, for example, in addition to the position itself of the spot light image on the entire image, information of an image portion of the area where the spot light image appears in the entire image may be employed. In the case where information of an image portion of the area where the spot light image appears in the entire image is employed as the detected spot light image position information, the position of the spot light image on the entire image can, for example, be obtained from that information.

The monocular image position information, the detected spot light image position information, and the spot light image position information included in the correlation information are supplied to the monocular image position information modifying unit 241. The entire image included in the associated information is supplied to the monocular image extracting unit 242.

The monocular image position information modifying unit 241 modifies the monocular image position information included in the related information by using the spot light image position information included in the related information and the detected spot light image position information, and supplies the resultant modified monocular image position information to the monocular image extracting unit 242. The monocular image positional information modifying unit 241 modifies the monocular image positional information similarly to the monocular image positional information modifying unit 211 (fig. 28).
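A minimal sketch of this modification step, assuming the deviation is a pure translation estimated as the average difference between detected and stored spot light image positions (the function and variable names are hypothetical, not taken from this embodiment):

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def modify_monocular_positions(stored_positions: List[Point],
                               stored_spot: Dict[str, Point],
                               detected_spot: Dict[str, Point]) -> List[Point]:
    """Shift the stored monocular image positions by the average deviation
    between the detected and the stored spot light image positions."""
    dxs = [detected_spot[k][0] - stored_spot[k][0] for k in stored_spot]
    dys = [detected_spot[k][1] - stored_spot[k][1] for k in stored_spot]
    dx, dy = sum(dxs) / len(dxs), sum(dys) / len(dys)
    return [(x + dx, y + dy) for (x, y) in stored_positions]


if __name__ == "__main__":
    stored = [(400.0, 300.0), (250.0, 300.0), (550.0, 300.0)]
    spots_stored = {"L": (100.0, 300.0), "R": (700.0, 300.0)}
    spots_detected = {"L": (103.0, 298.0), "R": (703.0, 298.0)}
    print(modify_monocular_positions(stored, spots_stored, spots_detected))
```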

The monocular image extracting unit 242 uses the modified monocular image position information from the monocular image position information modifying unit 241 to obtain, on the entire image included in the related information, area information indicating the areas of the monocular images of the monocular lenses 31₀ to 31₄.

Then, the monocular image extracting unit 242 extracts the region indicated by the region information from the entire image as a monocular image, and supplies the extracted monocular image to the image processing unit 232, the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary.
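The extraction itself can be illustrated as a simple crop, assuming that the modified monocular image position information gives a center position and a resolution for each monocular image region (the function name and array layout are assumptions for illustration only):

```python
import numpy as np


def extract_monocular_image(whole_image: np.ndarray,
                            center_x: int, center_y: int,
                            width: int, height: int) -> np.ndarray:
    """Cut out the region described by the (modified) monocular image
    position information from the entire image."""
    x0 = int(center_x - width // 2)
    y0 = int(center_y - height // 2)
    return whole_image[y0:y0 + height, x0:x0 + width].copy()


if __name__ == "__main__":
    whole = np.zeros((600, 800, 3), dtype=np.uint8)
    mono = extract_monocular_image(whole, center_x=400, center_y=300,
                                   width=200, height=200)
    print(mono.shape)   # (200, 200, 3)
```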

Note that, in addition to the monocular image, the monocular image extracting unit 242 may also supply the entire image and the modified monocular image position information to the image processing unit 232, the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary.

The image processing unit 232 performs image processing on the monocular image from the monocular image extracting unit 242, and supplies the result of the image processing to the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary. For example, the image processing unit 232 may generate parallax information using the monocular image from the monocular image extracting unit 242 and the modified monocular image position information, and may perform refocusing using the parallax information and the monocular image.

The display unit 233 displays the entire image, the monocular image, the modified monocular image position information from the monocular image extracting unit 242, and the result of the image processing by the image processing unit 232 as necessary. The recording unit 234 records the entire image, the monocular image, the modified monocular image position information from the monocular image extracting unit 242, and the result of the image processing by the image processing unit 232 in a recording medium (not shown) as necessary. The transmission unit 235 transmits the entire image, the monocular image, the modified monocular image position information from the monocular image extracting unit 242, and the result of the image processing by the image processing unit 232 via a transmission medium (not shown) as necessary.

As the post-processing, the post-processing device 230 may perform a process of modifying the monocular image position information using the detected spot light image position information and the spot light image position information, and a process of extracting the monocular image from the entire image using the modified monocular image position information obtained by modifying the monocular image position information.

The post-processing device 230 as described above may be provided in a device that reproduces, displays, and performs image processing for a monocular image.

Fig. 30 is a block diagram showing another configuration example of a post-processing device that performs post-processing on associated information.

Note that in the drawings, components corresponding to the post-processing apparatus 230 in fig. 29 are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

In fig. 30, the post-processing apparatus 250 includes an area specifying unit 231 to a transmitting unit 235. The area specifying unit 231 includes a monocular image position information modifying unit 241.

Therefore, the post-processing apparatus 250 is the same as the post-processing apparatus 230 in fig. 29 in terms of including the area specifying unit 231 to the transmitting unit 235. However, the post-processing apparatus 250 is different from the post-processing apparatus 230 in that the region specifying unit 231 does not include the monocular image extracting unit 242.

The association information in which the plurality of monocular images, the detected spot light image position information, the spot light image position information, and the plurality of monocular image position information respectively corresponding to the plurality of monocular images are associated is supplied from a recording medium or a transmission medium (not shown) to the post-processing device 250.

The monocular image position information, the detected spot light image position information, and the spot light image position information included in the correlation information are supplied to the monocular image position information modifying unit 241. The monocular image included in the related information is supplied to the image processing unit 232, the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary.

As described with reference to fig. 29, the monocular image position information modifying unit 241 modifies the monocular image position information included in the related information by using the spot light image position information included in the related information and the detected spot light image position information. The monocular image position information modifying unit 241 supplies the modified monocular image position information obtained by modifying the monocular image position information to the image processing unit 232, the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary.

The image processing unit 232 performs image processing on the monocular image, and supplies the result of the image processing to the display unit 233, the recording unit 234, and the transmitting unit 235 as necessary. For example, similar to the case of fig. 29, the image processing unit 232 may generate parallax information using the monocular image and the modified monocular image position information, and may perform refocusing using the parallax information and the monocular image.

The display unit 233 displays the monocular image, the modified monocular image position information obtained by the monocular image position information modifying unit 241, and the result of the image processing by the image processing unit 232, as needed. The recording unit 234 records the monocular image, the modified monocular image position information, and the result of the image processing performed by the image processing unit 232 in a recording medium (not shown), as needed. The transmitting unit 235 transmits the monocular image, the modified monocular image position information, and the result of the image processing performed by the image processing unit 232 via a transmission medium (not shown), as needed.

The post-processing device 250 may perform processing of modifying the monocular image position information using the detected spot light image position information and the spot light image position information as post-processing. Note that in fig. 30, the entire image may be included in the associated information. The entire image may be provided to the image processing unit 232 to the transmitting unit 235 and may be processed.

The post-processing device 250 as described above may be provided in a device that reproduces, displays, and performs image processing for a monocular image. Note that, as described above, the post-processing device 230 and the post-processing device 250 may be provided in the camera system 1 as a post-processing function.

< other embodiments of Camera System to which the present technology is applied >

< first another embodiment of Camera System >

Fig. 31 is a block diagram showing an example of an electrical configuration of the first another embodiment of the camera system to which the present technology is applied.

Note that in the drawings, portions corresponding to those in fig. 2 are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

In fig. 31, the camera system 300 is a lens-integrated camera system. The camera system 300 includes a lens unit 320, an image sensor 351, a RAW signal processing unit 352, a region extraction unit 353, a camera signal processing unit 354, a through image generation unit 355, a region specifying unit 356, an image reconfiguration processing unit 357, a bus 360, a display unit 361, a storage unit 362, a communication unit 364, a filing unit 365, a control unit 381, a storage unit 382, and an optical system control unit 384.

The lens unit 320 includes a feeding unit 23. As described with reference to fig. 1 and 2, the feeding unit 23 includes the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R.

The feeding unit 23 moves in the direction of the lens barrel optical axis within the lens barrel 21 (fig. 1), which is not shown in fig. 31. As the feeding unit 23 moves, the monocular lenses 31₀ to 31₄ and the light sources 32L and 32R included in the feeding unit 23 also move integrally with it.

The monocular lenses 31ᵢ are configured such that the optical paths of light passing through the respective lenses are independent of each other. That is, light having passed through each monocular lens 31ᵢ is emitted to a different position on the light receiving surface (e.g., effective pixel region) of the image sensor 351 without entering another monocular lens 31ⱼ. The optical axes of the monocular lenses 31ᵢ are located at positions different from each other on the light receiving surface of the image sensor 351, and at least part of the light having passed through each monocular lens 31ᵢ is emitted to positions different from each other on the light receiving surface of the image sensor 351.

Therefore, in the captured image generated by the image sensor 351 (the entire image output by the image sensor 351), the images of the subject formed via the respective monocular lenses 31ᵢ are formed at positions different from each other. In other words, from the captured image, images having viewpoints at the positions of the respective monocular lenses 31ᵢ (monocular images) are obtained.

The image sensor 351 is, for example, a complementary metal oxide semiconductor (CMOS) image sensor that, similarly to the image sensor 51, images a subject to generate a captured image. The light receiving surface of the image sensor 351 receives the light beams focused by the monocular lenses 31₀ to 31₄. The image sensor 351 receives the light beams from the monocular lenses 31₀ to 31₄ and performs photoelectric conversion, thereby generating a captured image including the monocular images having viewpoints at the positions of the monocular lenses 31ᵢ.

The image sensor 351 may be a monochrome (so-called black-and-white) image sensor similar to the image sensor 51, or may be a color image sensor in which, for example, color filters of a Bayer array are arranged in pixel groups. That is, the captured image output by the image sensor 351 may be a monochrome image or a color image. Hereinafter, a description will be given on the assumption that the image sensor 351 is a color image sensor and generates and outputs a captured image in the RAW format.

Note that, in the present embodiment, an image in the RAW format refers to an image that maintains the positional relationship of the color filter arrangement of the image sensor 351, and may include an image obtained by applying signal processing such as image size conversion processing, noise reduction processing, or defect correction processing, as well as compression encoding, to the image output from the image sensor 351.

The image sensor 351 can output a photographed image (entire image) in a RAW format, which is generated by photoelectrically converting illumination light. For example, the image sensor 351 may supply a captured image (entire image) in a RAW format to at least one of the bus 360, the RAW signal processing unit 352, the area extraction unit 353, or the area specification unit 356.

For example, the image sensor 351 can supply a captured image (entire image) in a RAW format to the storage unit 362 via the bus 360 and store the captured image in the storage medium 363. Further, the image sensor 351 can supply a captured image (entire image) in a RAW format to the communication unit 364 via the bus 360 to transmit the captured image to the outside of the camera system 300. Further, the image sensor 351 can supply a captured image (entire image) in a RAW format to the filing unit 365 via the bus 360 to convert the captured image into a file. Further, the image sensor 351 can supply a captured image (entire image) in a RAW format to the image reconfiguration processing unit 357 via the bus 360 to perform image reconfiguration processing.

Note that the image sensor 351 may be a single-board image sensor, or may be a set of image sensors (also referred to as a multi-board image sensor) including a plurality of image sensors (e.g., a three-board image sensor).

Note that in the case of a multi-board image sensor, the image sensor is not limited to an image sensor for RGB. All the image sensors may be monochrome, or all may include color filters of a Bayer array or the like. Note that in the case where all the color filters are Bayer arrays, noise reduction can be performed by making all the arrays the same and aligning the positional relationship of the corresponding pixels, and image quality can also be improved by shifting the positional relationship of the RGB image sensors so as to utilize the effect of so-called spatial pixel shift.

Even in the case of such a multi-board imaging apparatus, a plurality of monocular images or a plurality of viewpoint images are included in the captured image output from each image sensor (i.e., from one image sensor).

The RAW signal processing unit 352 performs processing related to signal processing on the image in the RAW format. For example, the RAW signal processing unit 352 may acquire a captured image (entire image) in a RAW format supplied from the image sensor 351. Further, the RAW signal processing unit 352 may perform predetermined signal processing on the acquired captured image. The content of such signal processing is arbitrary. For example, the signal processing may be defect correction, noise reduction, compression (encoding), or the like, or other signal processing. Of course, the RAW signal processing unit 352 may also perform various types of signal processing on the captured image. Note that the signal processing that can be performed on the image in the RAW format is limited to signal processing in which the image after the signal processing is an image in a state of maintaining the positional relationship of the arrangement of the color filters of the image sensor 351 as described above (in the case of a multi-plate imaging device, the image maintains a state of an R image, a G image, and a B image).

The RAW signal processing unit 352 can supply the captured image in the RAW format (RAW') or the compressed (encoded) captured image (compressed RAW) to which the signal processing is applied to the storage unit 362 via the bus 360, and store the captured image in the storage medium 363. Further, the RAW signal processing unit 352 may supply the captured image in the RAW format (RAW') or the compressed (encoded) captured image (compressed RAW) to which the signal processing is applied to the communication unit 364 via the bus 360 to transmit the captured image. Further, the RAW signal processing unit 352 may supply the captured image in the RAW format (RAW') or the compressed (encoded) captured image (compressed RAW) to which the signal processing is applied to the filing unit 365 via the bus 360 to convert the captured image into a file. Further, the RAW signal processing unit 352 may supply the captured image in the RAW format (RAW') or the compressed (encoded) captured image (compressed RAW) to which the signal processing is applied to the image reconfiguration processing unit 357 via the bus 360 to perform the image reconfiguration processing. Note that in the case where there is no need to distinguish and describe the RAW, the RAW', and the compressed RAW, they are referred to as RAW images.

The region extraction unit 353 performs processing related to extracting a partial region (cut-out partial image) from the captured image in the RAW format. For example, the region extraction unit 353 can acquire a captured image (entire image) in the RAW format from the image sensor 351. Further, the region extraction unit 353 may acquire information indicating a region extracted from the captured image supplied from the region specification unit 356 (also referred to as extraction region information). Then, the region extraction unit 353 may extract a partial region (cut out partial image) from the captured image based on the extraction region information.

For example, the region extraction unit 353 may cut out, from the captured image (entire image), the monocular image having a viewpoint at the position of each monocular lens 31ᵢ. In the captured image, a region from which a monocular image is cut out (a region corresponding to the monocular image) is also referred to as a monocular image region. For example, the region extraction unit 353 may acquire, as the extraction region information, viewpoint-related information that is supplied from the region specifying unit 356 and is used to designate the monocular image regions, and may extract each monocular image region indicated in the viewpoint-related information from the captured image, i.e., cut out each monocular image. Then, the region extraction unit 353 may supply the cut-out monocular images (RAW format) to the camera signal processing unit 354.

The viewpoint-related information may be, for example, a value corresponding to the individual difference reflection position information or the mounting error reflection position information, but is not limited thereto, and does not need to be set for distinguishing or correcting such individual differences, mounting errors, or the like. Even information indicating only the relationship between the area of the monocular image on the captured image and the position of the spot light may be modified in consideration of various errors including the above-described error.

The region extraction unit 353 may combine the monocular images cut out from the captured image (entire image) to generate a composite image. The composite image is obtained by combining the single-eye images into one data or one image. For example, the region extraction unit 353 may generate one image in which the monocular image is arranged in a plane as a composite image. The region extraction unit 353 may supply the generated synthesized image (RAW format) to the camera signal processing unit 354.
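A minimal sketch of composite image generation, assuming all monocular images have been cut out at the same resolution and are simply arranged side by side in one plane (the function name is hypothetical):

```python
import numpy as np


def make_composite(monocular_images):
    """Arrange the cut-out monocular images side by side in one plane,
    producing a single composite image."""
    heights = {img.shape[0] for img in monocular_images}
    widths = {img.shape[1] for img in monocular_images}
    assert len(heights) == 1 and len(widths) == 1, "equal-sized inputs assumed"
    return np.concatenate(monocular_images, axis=1)


if __name__ == "__main__":
    monos = [np.full((100, 100, 3), i * 40, dtype=np.uint8) for i in range(5)]
    composite = make_composite(monos)
    print(composite.shape)   # (100, 500, 3)
```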

Further, for example, the region extraction unit 353 may supply the entire image to the camera signal processing unit 354. For example, the region extraction unit 353 may extract a partial region including all the monocular images from the acquired captured image (i.e., cut out a partial image including all the monocular images), and supply the cut-out partial image (i.e., an image obtained by deleting a part or all of the region other than all the monocular images included in the captured image) as the entire image to the camera signal processing unit 354 in the RAW format. The position (range) of the region to be extracted in this case may be predetermined in the region extraction unit 353, or may be specified by the viewpoint-related information supplied from the region specification unit 356.

Further, the area extraction unit 353 may also supply the acquired captured image (i.e., not the cut-out partial image including all the monocular images but the entire captured image) as an entire image to the camera signal processing unit 354 in the RAW format.

Note that the region extraction unit 353 can supply the partial image in the RAW format (whole image, monocular image, or composite image) cut out from the captured image as described above to the storage unit 362, the communication unit 364, the filing unit 365, the image reconfiguration processing unit 357, and the like via the bus 360, similarly to the case of the image sensor 351.

Further, the region extraction unit 353 may supply the partial image (entire image, monocular image, or composite image) in the RAW format to the RAW signal processing unit 352 to perform predetermined signal processing or compression (encoding). Even in this case, the RAW signal processing unit 352 can supply the captured image (RAW') in the RAW format or the compressed (encoded) captured image (compressed RAW) to which the signal processing is applied to the storage unit 362, the communication unit 364, the filing unit 365, the image reconfiguration processing unit 357, and the like via the bus 360.

That is, at least one of the entire image, the monocular image, or the synthesized image may be a RAW image.

The camera signal processing unit 354 performs processing related to camera signal processing on the image. For example, the camera signal processing unit 354 may acquire an image (entire image, monocular image, or synthesized image) supplied from the region extraction unit 353. Further, the camera signal processing unit 354 may apply camera signal processing (camera processing) to the acquired image. For example, the camera signal processing unit 354 may perform color separation processing of separating each color of RGB in the image to be processed to generate an R image, a G image, and a B image, each having the same number of pixels as the image to be processed (demosaicing processing in the case of using a mosaic color filter such as a Bayer array), YC conversion processing of converting the color space of the image after color separation from RGB into YC (luminance/color difference), and the like. Further, the camera signal processing unit 354 may perform processing such as defect correction, noise reduction, Automatic White Balance (AWB), or gamma correction on the image to be processed. Further, the camera signal processing unit 354 may also compress (encode) the image to be processed. Of course, the camera signal processing unit 354 may perform a plurality of camera signal processes on the image to be processed, and may also perform camera signal processes other than the above-described examples.
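
As an illustration of the YC conversion step only, the following sketch converts an already color-separated (demosaiced) RGB image into luminance/color-difference form, assuming BT.601-style coefficients; the camera processing of this embodiment is not limited to these particular values.

```python
import numpy as np


def rgb_to_yc(rgb: np.ndarray) -> np.ndarray:
    """Convert a color-separated RGB image to the YC (luminance / color
    difference) color space using BT.601-style coefficients."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y) + 128.0
    cr = 0.713 * (r - y) + 128.0
    return np.stack([y, cb, cr], axis=-1)


if __name__ == "__main__":
    demosaiced = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
    print(rgb_to_yc(demosaiced).shape)   # (240, 320, 3)
```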

Note that in the following description, it is assumed that the camera signal processing unit 354 acquires an image in the RAW format, performs color separation processing or YC conversion on the image, and outputs an image in the YC format (YC). The image may be a whole image, a monocular image, or a composite image. Further, the image in YC format (YC) may or may not be encoded. That is, the data output from the camera signal processing unit 354 may be encoded data or unencoded image data.

That is, at least one of the entire image, the monocular image, or the composite image may be an image in YC format (also referred to as a YC image).

Further, the image output by the camera signal processing unit 354 may be an image in YC format (YC) to which the full development processing has not been applied, that is, an image to which some or all of the processes related to irreversible image quality adjustment (color adjustment), such as gamma correction and color matrix processing, have not been applied. In this case, the image in YC format (YC) can be returned substantially without degradation to an image in the RAW format in a subsequent stage, during reproduction, or the like.

The camera signal processing unit 354 may supply an image (YC) in YC format to which camera signal processing has been applied to the storage unit 362 via the bus 360, and store the image (YC) in the storage medium 363. Further, the camera signal processing unit 354 may supply an image (YC) in a YC format to which camera signal processing has been applied to the communication unit 364 via the bus 360 to transmit the image to the outside. Further, the camera signal processing unit 354 may supply an image (YC) in YC format to which camera signal processing has been applied to the filing unit 365 via the bus 360 to convert the image into a file. Further, the camera signal processing unit 354 may supply an image (YC) in the YC format to which the camera signal processing has been applied to the image reconfiguration processing unit 357 via the bus 360 to perform the image reconfiguration processing.

Further, for example, the camera signal processing unit 354 may supply an image in YC format (YC) to the through-image generating unit 355.

Note that in the case where an image in the RAW format (a whole image, a monocular image, or a composite image) is stored in the storage medium 363, the camera signal processing unit 354 may read the image in the RAW format from the storage medium 363 and perform signal processing. Even in this case, the camera signal processing unit 354 can supply an image (YC) in the YC format to which camera signal processing has been applied to the display unit 361, the storage unit 362, the communication unit 364, the filing unit 365, the image reconfiguration processing unit 357, and the like via the bus 360.

Further, the camera signal processing unit 354 may perform camera signal processing on the captured image in the RAW format (entire image) output from the image sensor 351, and the region extraction unit 353 may extract a partial region from the captured image after the camera signal processing (entire image).

The through-image generation unit 355 performs processing related to generation of a through image. The through image is an image displayed to the user for checking the image to be captured during shooting or during the shooting preparation period (non-recording period). The through image is also referred to as a live view image or an electronic-to-electronic (EE) image. Note that when a still image is captured, the through image is an image before shooting. However, when a moving image is captured, a through image corresponding not only to the image during preparation for shooting but also to the image during shooting (recording) is displayed.

For example, the through-image generation unit 355 may acquire an image (entire image, monocular image, or composite image) supplied from the camera signal processing unit 354. Further, for example, the through-image generating unit 355 may generate a through-image, which is an image for display, by performing image size (resolution) conversion of converting the acquired image into an image size corresponding to the resolution of the display unit 361. The through-image generating unit 355 may provide the generated through-image to the display unit 361 via the bus 360 to display the through-image.
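A rough sketch of the image size (resolution) conversion used to generate a through image, assuming a simple nearest-neighbor resampling to the display resolution; a real implementation would typically filter, and the names here are hypothetical:

```python
import numpy as np


def make_through_image(src: np.ndarray, disp_h: int, disp_w: int) -> np.ndarray:
    """Resize a captured image to the display resolution with nearest-neighbor
    sampling to produce a through (live view) image."""
    ys = np.arange(disp_h) * src.shape[0] // disp_h
    xs = np.arange(disp_w) * src.shape[1] // disp_w
    return src[ys[:, None], xs[None, :]]


if __name__ == "__main__":
    captured = np.zeros((1080, 1920, 3), dtype=np.uint8)
    print(make_through_image(captured, 480, 640).shape)   # (480, 640, 3)
```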

The region specifying unit 356 performs processing related to specification (setting) of the region to be extracted from the captured image by the region extraction unit 353. For example, the region specifying unit 356 acquires the viewpoint-related information VI for specifying the region to be extracted from the captured image, and supplies the viewpoint-related information VI to the region extraction unit 353 as the extraction region information.

The viewpoint-related information VI may be, for example, a value corresponding to the individual difference reflection position information described above, but is not limited thereto. The viewpoint-related information VI indicates, for example, a design position of a monocular image in the captured image, a position when a known captured image is imaged, and the like.

The viewpoint-related information VI includes, for example, monocular area information indicating a monocular image area in the captured image. The monocular area information may represent the monocular image area in any manner. For example, the monocular image area may be represented by coordinates (also referred to as center coordinates of the monocular image area) indicating a position (optical axis center position) corresponding to the optical axis of the monocular lens 31 in the captured image and the resolution (number of pixels) of the monocular image (monocular image area). That is, the monocular area information may include the center coordinates of the monocular image area in the captured image and the resolution of the monocular image area. In this case, the position of the monocular image area in the entire image may be specified in accordance with the center coordinates of the monocular image area and the resolution (number of pixels) of the monocular image area.

Note that monocular image region information is set for each monocular image region. That is, in the case where the captured image includes a plurality of monocular images, the viewpoint-related information VI may include, for each monocular image (each monocular image region), viewpoint identification information (e.g., ID) for identifying the monocular image (region) and monocular region information.

Furthermore, the viewpoint-related information VI may include other arbitrary information. For example, the viewpoint-related information VI may include viewpoint time information indicating a photographing time of a photographed image from which the monocular image is extracted.
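One possible in-memory representation of the viewpoint-related information VI described above, assuming it holds, per monocular image region, an identifier, the center coordinates, and the resolution, plus optional time information; the class and field names are assumptions for illustration only:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class MonocularRegion:
    viewpoint_id: int       # identifies the monocular image (region)
    center_x: float         # optical-axis center position in the captured image
    center_y: float
    width: int              # resolution (number of pixels) of the region
    height: int

    def bounds(self) -> Tuple[int, int, int, int]:
        """Return (left, top, right, bottom) of the monocular image region."""
        left = int(round(self.center_x - self.width / 2))
        top = int(round(self.center_y - self.height / 2))
        return left, top, left + self.width, top + self.height


@dataclass
class ViewpointRelatedInfo:
    regions: List[MonocularRegion] = field(default_factory=list)
    capture_time: str = ""   # optional viewpoint time information


if __name__ == "__main__":
    vi = ViewpointRelatedInfo(
        regions=[MonocularRegion(i, 400 + 120 * i, 300, 200, 200) for i in range(5)])
    print(vi.regions[0].bounds())   # (300, 200, 500, 400)
```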

The region specifying unit 356 may provide the viewpoint-related information VI to the bus 360. For example, the area specifying unit 356 may supply the viewpoint-related information VI to the storage unit 362 via the bus 360, and store the viewpoint-related information VI in the storage medium 363. Further, the region specifying unit 356 may provide the viewpoint-related information VI to the communication unit 364 via the bus 360 to transmit the viewpoint-related information VI. Further, the area specifying unit 356 may provide the viewpoint-related information VI to the filing unit 365 via the bus 360 to convert the viewpoint-related information VI into a file. In addition, the region specifying unit 356 may supply the viewpoint-related information VI to the image reconfiguration processing unit 357 via the bus 360 for image reconfiguration processing.

For example, the area specifying unit 356 may acquire the viewpoint-related information VI from the control unit 381, and supply the viewpoint-related information VI to the area extracting unit 353 and the bus 360. In this case, the control unit 381 reads the viewpoint-related information VI stored in the storage medium 383 via the storage unit 382, and supplies the viewpoint-related information VI to the area designation unit 356. The area specifying unit 356 supplies the viewpoint-related information VI to the area extracting unit 353 and the bus 360.

The viewpoint-related information VI supplied to the storage unit 362, the communication unit 364, or the filing unit 365 via the bus 360 in this manner is associated with an image (whole image, monocular image, or composite image) therein. For example, the storage unit 362 may store the supplied viewpoint-related information VI and an image (entire image, monocular image, or composite image) in the storage medium 363 in association with each other. Further, the communication unit 364 may transmit the provided viewpoint-related information VI to the outside in association with an image (entire image, monocular image, or composite image). Further, the filing unit 365 may generate a file including the supplied viewpoint-related information VI and an image (entire image, monocular image, or composite image) in association with each other.

Further, the region specifying unit 356 may acquire the captured image in the RAW format supplied from the image sensor 351, generate viewpoint-related information VI' based on the captured image, and supply the generated viewpoint-related information VI' to the region extraction unit 353 and the bus 360. In this case, the region specifying unit 356 specifies each monocular image region from the captured image, and generates viewpoint-related information VI' indicating the monocular image regions (for example, each monocular image region is indicated by the center coordinates of the monocular image region in the captured image and the resolution of the monocular image region). Then, the region specifying unit 356 supplies the generated viewpoint-related information VI' to the region extraction unit 353 and the bus 360. Note that the spot light information SI' generated by the region specifying unit 356 based on the captured image may be provided together with the viewpoint-related information VI'.

The spot light information is information on a spot light image, and may be, but is not limited to, a value corresponding to, for example, the above-described individual difference spot light position information or the mounting error spot light position information.

The region specifying unit 356 may acquire the viewpoint-related information VI from the control unit 381, acquire the captured image in the RAW format supplied from the image sensor 351, generate the spot light information SI' based on the captured image, add the spot light information SI' to the viewpoint-related information VI, and supply the result to the region extraction unit 353 and the bus 360. In this case, the control unit 381 reads the viewpoint-related information VI stored in the storage medium 383 via the storage unit 382, and supplies the viewpoint-related information VI to the region specifying unit 356. The region specifying unit 356 adds the spot light information SI' to the viewpoint-related information VI, and supplies the viewpoint-related information VI with the spot light information SI' added to the region extraction unit 353 and the bus 360.

Further, the region specifying unit 356 may acquire the viewpoint-related information VI from the control unit 381, acquire the captured image in the RAW format supplied from the image sensor 351, generate the spot light information SI' based on the captured image, correct the viewpoint-related information VI using the spot light information SI', and supply the corrected viewpoint-related information VI' to the region extraction unit 353 and the bus 360. In this case, the control unit 381 reads the viewpoint-related information VI stored in the storage medium 383 via the storage unit 382, and supplies the viewpoint-related information VI to the region specifying unit 356. The region specifying unit 356 corrects the viewpoint-related information VI using the spot light information SI' to generate the viewpoint-related information VI'. The region specifying unit 356 supplies the viewpoint-related information VI' to the region extraction unit 353 and the bus 360.

For example, the spot light information SI' may be, but is not limited to, a value corresponding to the above-described mounting error spot light position information or spot light image information. The spot light information SI' indicates, for example, the position and/or the spot size of the spot light image appearing in the captured image.

Here, due to various deviations associated with the movement of the feeding unit 23 (for example, a deviation between the direction perpendicular to the light receiving surface of the image sensor 351 and the moving direction of the feeding unit 23, and a rotational shift of the monocular lenses 31ᵢ associated with the movement of the feeding unit 23), the position of the monocular image in the captured image may shift as the feeding unit 23 moves.

In the case where the position of the monocular image in the captured image shifts with the movement of the feeding unit 23, when an image is cut out (extracted) from the position on the captured image indicated by the viewpoint-related information VI, the image of a region shifted from the (original) monocular image region is cut out as the monocular image.

Therefore, the region specifying unit 356 may detect the positional deviation (amount) of the monocular image in the captured image using the position and/or the spot size of the spot light image indicated by the spot light information SI' generated from the captured image.

Then, the region specifying unit 356 may obtain information for modifying the position at which the monocular image is cut out from the captured image from the positional deviation of the monocular image, and supply the information to the region extracting unit 353.

That is, the region specifying unit 356 corrects the viewpoint-related information VI in accordance with the positional deviation of the monocular image so as to represent the position of the monocular image after the positional deviation, and supplies the viewpoint-related information VI' obtained by the correction to the region extracting unit 353.

Here, the storage medium 383 stores, for example, the viewpoint-related information VI and the spot light information SI. The spot light information SI may be, but is not limited to, a value corresponding to the above-described individual difference spot light position information, for example. The spot light information SI indicates, for example, a design position and/or spot size of the spot light image in the captured image, a position and/or spot size during imaging of a known captured image, and the like.

The viewpoint-related information VI and the spot light information SI are information obtained at the same timing. For example, when the viewpoint-related information VI indicates information such as the design position (viewpoint) of the monocular image, the spot light information SI also indicates information such as the design position of the spot light image. Further, for example, in the case where the viewpoint-related information VI is information indicating the position or the like of the monocular image detected when imaging a known captured image, the spot light information SI is also information indicating the position or the like of the spot light image detected when imaging the known captured image.

For example, the region specifying unit 356 may detect a difference between the spot light information SI and the spot light information SI' generated from the captured image as a positional deviation of the monocular image in the captured image. Then, the region specifying unit 356 may correct the viewpoint-related information VI using the positional deviation of the monocular image in the captured image, that is, the difference between the spot light information SI and the spot light information SI 'generated from the captured image, and generate the viewpoint-related information VI' in which the positional deviation of the monocular image in the captured image has been corrected (modified).
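A sketch of how the viewpoint-related information VI' might be generated from VI, SI, and SI', assuming two spot light images (for example, from the light sources 32L and 32R) and modelling the deviation as a translation plus a rotation estimated from the two spot light positions; the function name and this modelling choice are illustrative assumptions, not the method of this embodiment:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]


def correct_viewpoint_info(centers_vi: List[Point],
                           spots_si: Tuple[Point, Point],
                           spots_si_dash: Tuple[Point, Point]) -> List[Point]:
    """Estimate translation and rotation from the deviation between the stored
    spot light positions (SI) and the detected positions (SI'), and apply it to
    the stored monocular image centers (VI) to obtain corrected centers (VI')."""
    (pl, pr), (ql, qr) = spots_si, spots_si_dash
    # Rotation: angle between the stored and detected L-R spot baselines.
    a0 = math.atan2(pr[1] - pl[1], pr[0] - pl[0])
    a1 = math.atan2(qr[1] - ql[1], qr[0] - ql[0])
    theta = a1 - a0
    # Translation: deviation of the baseline midpoints.
    mid_p = ((pl[0] + pr[0]) / 2, (pl[1] + pr[1]) / 2)
    mid_q = ((ql[0] + qr[0]) / 2, (ql[1] + qr[1]) / 2)

    cos_t, sin_t = math.cos(theta), math.sin(theta)
    corrected = []
    for (x, y) in centers_vi:
        # Rotate about the stored midpoint, then move to the detected midpoint.
        dx, dy = x - mid_p[0], y - mid_p[1]
        corrected.append((mid_q[0] + cos_t * dx - sin_t * dy,
                          mid_q[1] + sin_t * dx + cos_t * dy))
    return corrected


if __name__ == "__main__":
    vi = [(400.0, 300.0), (250.0, 300.0), (550.0, 300.0)]
    si = ((100.0, 300.0), (700.0, 300.0))
    si_dash = ((102.0, 301.0), (702.0, 305.0))
    print(correct_viewpoint_info(vi, si, si_dash))
```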

In addition, the region specifying unit 356 may detect the feeding amount of the feeding unit 23 using the spot light information SI' generated from the captured image.

The image reconfiguration processing unit 357 performs processing related to image reconfiguration. For example, the image reconfiguration processing unit 357 may acquire an image in YC format (entire image, monocular image, or composite image) from the camera signal processing unit 354 or the storage unit 362 via the bus 360. Further, the image reconfiguration processing unit 357 may acquire the viewpoint-related information from the region specifying unit 356 or the storage unit 362 via the bus 360.

Further, the image reconfiguration processing unit 357 may perform image processing, for example, generation of depth information and refocusing for generating (reconfiguring) an image focused on an arbitrary object, using the acquired image and viewpoint-related information associated with the acquired image. For example, in the case where monocular images are to be processed, the image reconfiguration processing unit 357 performs processing such as generation and refocusing of depth information using each monocular image. Further, in the case where the captured image or the synthesized image is to be processed, the image reconfiguration processing unit 357 extracts each monocular image from the captured image or the synthesized image, and performs processing such as generation of depth information and refocusing using the extracted monocular image.
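As a simplified illustration of refocusing from monocular images, the sketch below performs naive shift-and-add: each monocular image is shifted according to its viewpoint offset and a chosen disparity, and the results are averaged. The offsets, the scaling parameter, and the function name are hypothetical, and real refocusing would use the generated depth/parallax information.

```python
import numpy as np


def refocus(monocular_images, viewpoint_offsets, shift_per_pixel_disparity):
    """Naive shift-and-add refocusing: shift each monocular image according to
    its viewpoint offset and a chosen disparity, then average the results."""
    acc = np.zeros_like(monocular_images[0], dtype=np.float64)
    for img, (ox, oy) in zip(monocular_images, viewpoint_offsets):
        dx = int(round(ox * shift_per_pixel_disparity))
        dy = int(round(oy * shift_per_pixel_disparity))
        acc += np.roll(np.roll(img.astype(np.float64), dy, axis=0), dx, axis=1)
    return (acc / len(monocular_images)).astype(np.uint8)


if __name__ == "__main__":
    monos = [np.random.randint(0, 256, (100, 100, 3), dtype=np.uint8)
             for _ in range(5)]
    offsets = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]   # relative viewpoints
    print(refocus(monos, offsets, shift_per_pixel_disparity=2.0).shape)
```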

The image reconfiguration processing unit 357 may supply the generated depth information and the refocused image as a processing result to the storage unit 362 via the bus 360, and store the processing result in the storage medium 363. Further, the image reconfiguration processing unit 357 may supply the generated depth information and the refocused image as a processing result to the communication unit 364 via the bus 360 to transmit the processing result to the outside. Further, the image reconfiguration processing unit 357 may supply the generated depth information and the refocused image as a processing result to the filing unit 365 via the bus 360 to convert the processing result into a file.

The image sensor 351, the RAW signal processing unit 352, the region extraction unit 353, the camera signal processing unit 354, the through image generation unit 355, the region specification unit 356, the image reconfiguration processing unit 357, the display unit 361, the storage unit 362, the communication unit 364, and the filing unit 365 are connected to the bus 360. The bus 360 serves as a transmission medium (transmission path) for various data exchanged between these blocks. Note that bus 360 can be implemented by wire or wirelessly.

The display unit 361 includes, for example, a liquid crystal panel, an organic Electroluminescence (EL) panel, and the like, and is provided integrally with or separately from a housing of the camera system 300. For example, the display unit 361 may be disposed on a back surface (a surface opposite to a surface on which the lens unit 320 is disposed) of a housing of the camera system 300.

The display unit 361 performs processing related to image display. For example, the display unit 361 may acquire the YC-format through image supplied from the through image generation unit 355, convert the through image into an RGB format, and display an RGB-format image. In addition, the display unit 361 may also display information such as menus and settings of the camera system 300, for example.

Further, the display unit 361 can acquire and display an image in YC format (whole image, monocular image, or composite image) supplied from the storage unit 362. Further, the display unit 361 may acquire and display a thumbnail image in the YC format supplied from the storage unit 362. Further, the display unit 361 may acquire and display an image (entire image, monocular image, or composite image) in YC format supplied from the camera signal processing unit 354.

The storage unit 362 controls storage of a storage medium 363 including, for example, a semiconductor memory or the like. The storage medium 363 may be a removable storage medium or a storage medium built in the camera system 300. For example, the storage unit 362 may store an image (entire image, monocular image, or composite image) supplied via the bus 360 in the storage medium 363 according to the control unit 381 or an operation of the user or the like.

For example, the storage unit 362 may acquire an image in a RAW format (whole image, monocular image, or composite image) supplied from the image sensor 351 or the region extraction unit 353, and store the image in the storage medium 363. Further, the storage unit 362 may acquire an image in the RAW format (whole image, monocular image, or composite image) to which signal processing is applied or a compressed (encoded) image in the RAW format (whole image, monocular image, or composite image) supplied from the RAW signal processing unit 352, and store the image in the storage medium 363. Further, the storage unit 362 may acquire an image in the YC format (entire image, monocular image, or composite image) supplied from the camera signal processing unit 354 and store the image in the storage medium 363.

At this time, the storage unit 362 may acquire the viewpoint-related information supplied from the area specifying unit 356 and associate the viewpoint-related information with the above-described image (whole image, monocular image, or composite image). That is, the storage unit 362 may store the image (entire image, monocular image, or composite image) and the viewpoint-related information in the storage medium 363 in association with each other. That is, the storage unit 362 functions as an associating unit that associates at least one of the entire image, the monocular image, or the synthetic image with the viewpoint-related information.

Further, for example, the storage unit 362 may acquire the depth information and the refocused image supplied from the image reconfiguration processing unit 357 and store the information in the storage medium 363. Further, the storage unit 362 may acquire a file supplied from the filing unit 365 and store the file in the storage medium 363. The file includes, for example, an image (whole image, monocular image, or composite image) and viewpoint-related information. That is, in this document, an image (entire image, monocular image, or composite image) and viewpoint-related information are associated with each other.

Further, for example, the storage unit 362 may read data, files, and the like stored in the storage medium 363 according to the control unit 381, user operations, and the like, and supply the read data, files, and the like to the camera signal processing unit 354, the display unit 361, the communication unit 364, the filing unit 365, the image reconfiguration processing unit 357, and the like via the bus 360. For example, the storage unit 362 can read an image in the YC format (entire image, monocular image, or composite image) from the storage medium 363, supply the image to the display unit 361, and display the image. Further, the storage unit 362 may read an image in a RAW format (whole image, monocular image, or composite image) from the storage medium 363, supply the image to the camera signal processing unit 354, and apply camera signal processing to the image.

Further, the storage unit 362 may read data or a file of the image (entire image, monocular image, or composite image) and the viewpoint-related information stored in the storage medium 363 in association with each other, and supply the data or the file to other processing units. For example, the storage unit 362 may read an image (entire image, monocular image, or composite image) and viewpoint-related information associated with each other from the storage medium 363, and supply the image and viewpoint-related information to the image reconfiguration processing unit 357 to perform processing such as generation and refocusing of depth information. Further, the storage unit 362 may read an image (entire image, monocular image, or composite image) and viewpoint-related information associated with each other from the storage medium 363, and supply the image and viewpoint-related information to the communication unit 364 to transmit the image and viewpoint-related information. Further, the storage unit 362 may read an image (entire image, monocular image, or composite image) and viewpoint-related information associated with each other from the storage medium 363, and supply the image and viewpoint-related information to the filing unit 365 to convert the image and viewpoint-related information into a file.

Note that the storage medium 363 may be a Read Only Memory (ROM) or a rewritable memory such as a Random Access Memory (RAM) or a flash memory. In the case of a rewritable memory, the storage medium 363 may store any information.

The communication unit 364 communicates with a server on the internet, a PC on a wired or wireless LAN, other external devices, and the like by an arbitrary communication method. For example, the communication unit 364 may transmit data or a file of an image (entire image, monocular image, or composite image) and viewpoint-related information to a communication partner (external device) by a streaming method, an upload method, or the like through communication according to the control unit 381, an operation by the user, or the like.

For example, the communication unit 364 may acquire and transmit an image in RAW format (whole image, monocular image, or composite image) supplied from the image sensor 351 or the region extraction unit 353. Further, the communication unit 364 may acquire and transmit an image in the RAW format (whole image, monocular image, or composite image) to which signal processing is applied or a compressed (encoded) image (whole image, monocular image, or composite image) supplied from the RAW signal processing unit 352. Further, the communication unit 364 can acquire and transmit an image (entire image, monocular image, or composite image) in YC format supplied from the camera signal processing unit 354.

At this time, the communication unit 364 may acquire the viewpoint-related information supplied from the area specifying unit 356 and associate the viewpoint-related information with the above-described image (whole image, monocular image, or composite image). That is, the communication unit 364 may transmit the image (entire image, monocular image, or composite image) and the viewpoint-related information in association with each other. For example, in the case of transmitting an image by a streaming method, the communication unit 364 repeats processing of acquiring an image to be transmitted (entire image, monocular image, or composite image) from the processing unit that provides the image and transmitting the viewpoint-related information and the image provided from the area specifying unit 356 in association with each other. That is, the communication unit 364 functions as an associating unit that associates at least one of the entire image, the monocular image, or the synthetic image with the viewpoint-related information.

Further, for example, the communication unit 364 may acquire and transmit the depth information and the refocused image supplied from the image reconfiguration processing unit 357. Further, the communication unit 364 may acquire and transmit the file provided from the filing unit 365. The file includes, for example, an image (whole image, monocular image, or composite image) and viewpoint-related information. That is, in this document, an image (entire image, monocular image, or composite image) and viewpoint-related information are associated with each other.

The filing unit 365 performs processing related to file generation. For example, the filing unit 365 may acquire an image (whole image, monocular image, or composite image) in the RAW format supplied from the image sensor 351 or the region extraction unit 353. Further, the filing unit 365 may acquire an image in the RAW format (whole image, monocular image, or composite image) to which signal processing has been applied, or a compressed (encoded) image in the RAW format (whole image, monocular image, or composite image) supplied from the RAW signal processing unit 352. Further, the filing unit 365 can acquire an image in YC format (whole image, monocular image, or composite image) supplied from the camera signal processing unit 354. Further, for example, the filing unit 365 may acquire the viewpoint-related information supplied from the region specifying unit 356.

The filing unit 365 can associate a plurality of acquired pieces of data with one another by converting them into a file and generating one file including the plurality of pieces of data. For example, the filing unit 365 may associate the above-described image (whole image, monocular image, or composite image) and the viewpoint-related information with each other by generating one file from the image and the viewpoint-related information. That is, the filing unit 365 functions as an associating unit that associates at least one of the entire image, the monocular image, or the composite image with the viewpoint-related information.
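
As a minimal sketch of what "one file including the plurality of data" could look like, the following Python example packs an image and its viewpoint-related information into a single ZIP-based container; the container format and entry names are assumptions for illustration only, not a format defined in this description.

```python
# Minimal sketch of how the filing unit 365 might bundle an image and its
# viewpoint-related information into a single file.  The ZIP container and the
# entry names are illustrative assumptions.
import json
import zipfile

def make_file(path, image_bytes, viewpoint_info):
    """Write one container file holding the image data and its metadata together."""
    with zipfile.ZipFile(path, "w") as container:
        container.writestr("image.raw", image_bytes)
        container.writestr("viewpoint_related_information.json", json.dumps(viewpoint_info))
```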

Further, for example, the filing unit 365 may acquire the depth information and the refocused image supplied from the image reconfiguration processing unit 357 and convert the information into a file. Further, the filing unit 365 may generate one file from the image (entire image, monocular image, or synthetic image) and the viewpoint-related information supplied from the storage unit 362 in association with each other.

Note that the filing unit 365 may generate a thumbnail image of an image to be filed (for example, a monocular image) and include the thumbnail image in the generated file. That is, the filing unit 365 can associate the thumbnail image with the image (entire image, monocular image, or composite image) and the viewpoint-related information by generating a file.

The filing unit 365 can supply the generated file (the image and the viewpoint-related information associated with each other) to the storage unit 362 via the bus 360, for example, and store the file in the storage medium 363. Further, for example, the filing unit 365 may supply the generated file (the image and the viewpoint-related information associated with each other) to the communication unit 364 via the bus 360 to transmit the file.

The storage unit 362, the communication unit 364, and the filing unit 365 are also collectively referred to as the associating unit 370. The associating unit 370 associates an image (entire image, monocular image, or composite image) with the viewpoint-related information. For example, the storage unit 362 may cause the storage medium 363 to store at least one of the entire image, the monocular image, or the composite image in association with the viewpoint-related information. Further, the communication unit 364 may transmit at least one of the entire image, the monocular image, or the composite image in association with the viewpoint-related information. Further, the filing unit 365 may associate the entire image, the monocular image, or the composite image with the viewpoint-related information by generating one file from the viewpoint-related information and at least one of the entire image, the monocular image, or the composite image.

In addition to the viewpoint-related information, the associating unit 370 may also associate the spot light information with the image (whole image, monocular image, or composite image).

The control unit 381 performs control processing regarding the camera system 300. That is, the control unit 381 may cause each unit of the camera system 300 to execute processing. For example, the control unit 381 may control the lens unit 320 (each monocular lens 31_i) via the optical system control unit 384 and set the imaging optical system, such as the aperture and the focal position. Further, the control unit 381 may control the image sensor 351 to cause the image sensor 351 to perform imaging (photoelectric conversion) and generate a captured image.

Further, the control unit 381 may supply the viewpoint-related information VI and, furthermore, the spot light information SI to the area specifying unit 356 to specify an area to be extracted from the captured image. The control unit 381 may read the viewpoint-related information VI and the spot light information SI stored in the storage medium 383 via the storage unit 382 and supply the read information to the area specifying unit 356.

Further, the control unit 381 may acquire an image via the bus 360, and control the aperture via the optical system control unit 384 based on the brightness of the image. Further, the control unit 381 may control the focus via the optical system control unit 384 based on the sharpness of the image. Further, the control unit 381 may control the camera signal processing unit 354 to control a white balance gain based on the RGB ratio of the image.
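
The following Python sketch illustrates, in a simplified form, the kind of image-based control described above (aperture from brightness, focus from sharpness, white balance gains from the RGB ratio); the control-object methods, the target brightness value, and the gray-world gain rule are assumptions for illustration, not part of this description.

```python
# Rough sketch of the image-based control described above.  Thresholds and the
# control interfaces are hypothetical.
import numpy as np

def auto_adjust(image_rgb, optical_system_control, camera_signal_processing,
                target_brightness=0.45):
    """image_rgb: float array in [0, 1] with shape (H, W, 3)."""
    brightness = image_rgb.mean()
    if brightness < target_brightness:                      # too dark: open the aperture
        optical_system_control.open_aperture_one_step()
    elif brightness > target_brightness:                    # too bright: close it
        optical_system_control.close_aperture_one_step()

    gray = image_rgb.mean(axis=2)
    sharpness = np.abs(np.diff(gray, axis=1)).mean()        # simple gradient-based sharpness
    optical_system_control.adjust_focus(sharpness)          # e.g. hill-climbing on sharpness

    r, g, b = image_rgb.reshape(-1, 3).mean(axis=0)
    camera_signal_processing.set_white_balance_gains(g / (r + 1e-6), 1.0, g / (b + 1e-6))
```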

The storage unit 382 controls storage of the storage medium 383 including, for example, a semiconductor memory or the like. The storage medium 383 may be a removable storage medium or a built-in memory. The storage medium 383 stores, for example, the viewpoint-related information VI. The viewpoint-related information VI is information corresponding to the lens unit 320 (each monocular lens 31_i) and the image sensor 351. That is, the viewpoint-related information VI is information on the monocular images each having a viewpoint at the position of the corresponding monocular lens 31_i of the lens unit 320, and is information for specifying the monocular image areas. The storage medium 383 may further store the spot light information SI.
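
As an illustration only, the viewpoint-related information VI and the spot light information SI might be represented by structures such as the following; the concrete fields are assumptions, since the description only requires that VI can specify each monocular image area and that SI can specify the spot light image positions.

```python
# Illustrative data structures for the viewpoint-related information VI and the
# spot light information SI stored in the storage medium 383.  Field names and
# layouts are assumptions for illustration.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ViewpointRelatedInformation:       # "VI"
    # (x, y) of the top-left corner of each monocular image area in the whole image,
    # one entry per monocular lens 31_i, plus a common area size.
    region_origins: List[Tuple[int, int]]
    region_size: Tuple[int, int]

@dataclass
class SpotLightInformation:              # "SI"
    # (x, y) position of the spot light image formed by each light source (e.g. 32L, 32R).
    spot_positions: List[Tuple[float, float]]
```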

For example, the storage unit 382 may read the viewpoint-related information VI and the spot light information SI stored in the storage medium 383, under the control of the control unit 381 or in response to a user operation, and supply the read information to the control unit 381.

Note that the storage medium 383 may be a ROM or a rewritable memory such as a RAM or a flash memory. In the case of a rewritable memory, the storage medium 383 may store arbitrary information.

Further, the storage unit 382 and the storage medium 383 may be replaced with the storage unit 362 and the storage medium 363. That is, information (viewpoint-related information VI, etc.) to be stored in the storage medium 383 may be stored in the storage medium 363. In this case, the storage unit 382 and the storage medium 383 may be omitted.

The optical system control unit 384 controls the lens unit 320 (the feeding unit 23, each monocular lens 31_i, and the like) under the control of the control unit 381. For example, the optical system control unit 384 may control the focus and the aperture of each monocular lens 31_i, thereby controlling the focal length and the f-number of each monocular lens 31_i. Note that, in the case where the camera system 300 has an electric focus adjustment function, the optical system control unit 384 may control the focal point (focal length) of the lens unit 320 (each monocular lens 31_i). Further, the optical system control unit 384 can control the aperture diameter (F value) of each monocular lens 31_i.

Note that the camera system 300 may include a mechanism (physical configuration) for controlling the focal length by manually operating a focus ring provided in the lens barrel, instead of having such an electric focus adjustment function. In this case, the optical system control unit 384 may be omitted.

< association of viewpoint-related information and the like >

As described above, in the camera system 300, a subject is imaged by the image sensor 351 via the lens unit 320 (the plurality of monocular lenses 31_i), and a captured image including a monocular image corresponding to each monocular lens 31_i is generated. The camera system 300 extracts some or all of the monocular images from the captured image, thereby generating monocular images each having a viewpoint at the position of the corresponding monocular lens 31_i. Since the plurality of monocular images extracted from one captured image are images of mutually different viewpoints, processing such as depth estimation by multi-eye matching and correction for suppressing attachment errors of the multi-eye lens can be performed using these monocular images, for example. However, in order to perform such processing, information such as the relative positions between the monocular images is necessary.

Accordingly, the camera system 300 associates the viewpoint-related information, which is information for specifying the areas of the plurality of monocular images in the captured image, with a captured image generated by imaging a subject with the image sensor 351 as one imaging element via the plurality of monocular lenses 31_i whose optical paths are independent of one another, with monocular images having viewpoints at the positions of the respective monocular lenses 31_i extracted from the captured image, or with a composite image obtained by combining a plurality of monocular images.

For example, the associating unit 370 acquires the viewpoint-related information corresponding to the image (whole image, monocular image, or composite image) from the region specifying unit 356, and associates the image with the viewpoint-related information. For example, the storage unit 362 causes the storage medium 363 to store at least one of the entire image, the monocular image, or the composite image in association with the viewpoint-related information. Further, the communication unit 364 transmits at least one of the entire image, the monocular image, or the composite image in association with the viewpoint-related information. Further, the filing unit 365 associates the entire image, the monocular image, or the composite image with the viewpoint-related information by generating one file from the viewpoint-related information and at least one of the entire image, the monocular image, or the composite image.
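
The following sketch shows how a system that receives the associated data could use the viewpoint-related information to cut the monocular images out of the whole image; it relies on the illustrative VI layout assumed in the earlier sketch.

```python
# Sketch of how a downstream system could cut monocular images out of the whole image
# using the (assumed) VI structure sketched earlier.
import numpy as np

def extract_monocular_images(whole_image: np.ndarray, vi) -> list:
    """Return one cropped monocular image per monocular lens 31_i."""
    w, h = vi.region_size
    crops = []
    for (x, y) in vi.region_origins:
        crops.append(whole_image[y:y + h, x:x + w].copy())   # viewpoint at the lens position
    return crops
```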

With the above-described association, not only the camera system 300 but also systems other than the camera system 300 can perform high-precision image processing on a monocular image or the like using the viewpoint-related information.

In association, a monocular image or the like may be associated with the viewpoint-related information VI. Further, in the association, a monocular image or the like may be associated with corrected viewpoint-related information VI' obtained by correcting the viewpoint-related information VI. Further, in the association, a monocular image or the like may be associated with the viewpoint-related information VI, the spot light information SI, and the spot light information SI'. Further, in the association, a monocular image or the like may be associated with the viewpoint-related information VI and the difference between the spot light information SI and the spot light information SI'.

In any of these cases, that is, associating the monocular image or the like with the corrected viewpoint-related information VI'; with the viewpoint-related information VI, the spot light information SI, and the spot light information SI'; or with the viewpoint-related information VI and the difference between the spot light information SI and the spot light information SI', the position of each monocular image (the position of its viewpoint) in the captured image can be accurately identified even if the monocular image shifts in the captured image with the movement of the feeding unit 23.

< processing of deviation of attachment position of lens unit 320 >

In the lens-integrated camera system 300, the attachment position of the lens unit 320 may deviate due to a manufacturing error. Further, the attachment position of the lens unit 320 may be shifted with the movement of the feeding unit 23. When the attachment position of the lens unit 320 is shifted and an attachment error occurs at the attachment position, the accuracy of the process of cutting out a monocular image from a captured image and calculating parallax information using the monocular image is reduced.

Therefore, the area specifying unit 356 can detect an attachment error as a deviation (amount) of the attachment position of the lens unit 320 using a spot light image appearing in the captured image.

For example, the region specifying unit 356 can detect, from the captured image, the incident range of the spot light emitted from the light sources 32L and 32R to the image sensor 351, that is, the spot light image appearing in the captured image, and can generate (detect) the spot light information SI' related to the spot light image.
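
One possible way to detect the spot light images and build SI' is sketched below: threshold the neighborhood of each expected spot position and take the brightness-weighted centroid. The window size, the threshold, and the use of the stored spot positions as search centers are illustrative assumptions, not details given in this description.

```python
# Sketch of a simple spot light image detector for building SI'.
import numpy as np

def detect_spot_positions(whole_image: np.ndarray, expected_positions, window=64, thresh=0.8):
    """whole_image: grayscale float array in [0, 1]; returns detected (x, y) per light source."""
    detected = []
    for (ex, ey) in expected_positions:                       # e.g. positions stored in SI
        x0, y0 = int(ex) - window // 2, int(ey) - window // 2
        patch = whole_image[y0:y0 + window, x0:x0 + window]
        mask = patch >= thresh * patch.max()                  # keep only the bright spot
        ys, xs = np.nonzero(mask)
        weights = patch[ys, xs]
        cx = x0 + float((xs * weights).sum() / weights.sum())
        cy = y0 + float((ys * weights).sum() / weights.sum())
        detected.append((cx, cy))                             # brightness-weighted centroid
    return detected
```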

Further, the area specifying unit 356 may detect a difference between the spot light information SI' and the spot light information SI, for example, a difference between the position of the spot light image represented by the spot light information SI' and the position of the spot light image represented by the spot light information SI, as the attachment error.

Then, the region specifying unit 356 may correct the viewpoint-related information VI using the attachment error and generate the viewpoint-related information VI'. For example, according to the spot light information SI and the spot light information SI', the region specifying unit 356 may correct the position of the monocular image represented by the viewpoint-related information VI by the difference between the position of the spot light image represented by the spot light information SI' and the position of the spot light image represented by the spot light information SI, and generate the viewpoint-related information VI' as information for specifying the position of the monocular image shifted by the attachment error.
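
The correction itself can be sketched as follows: the average displacement between the detected spot positions (SI') and the stored spot positions (SI) is treated as the attachment error and applied to every monocular image position in VI. The data layouts follow the illustrative structures assumed earlier and are not prescribed by this description.

```python
# Sketch of correcting VI into VI' from the stored (SI) and detected (SI') spot positions.
def correct_viewpoint_information(vi, si_positions, si_dash_positions):
    """Return corrected monocular image origins (VI') given SI and SI' spot positions."""
    dxs = [xd - xs for (xs, _), (xd, _) in zip(si_positions, si_dash_positions)]
    dys = [yd - ys for (_, ys), (_, yd) in zip(si_positions, si_dash_positions)]
    dx, dy = sum(dxs) / len(dxs), sum(dys) / len(dys)          # attachment error (x, y)
    return [(int(round(x + dx)), int(round(y + dy))) for (x, y) in vi.region_origins]
```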

The region extraction unit 353 can accurately cut out a monocular image from the captured image by using the viewpoint-related information VI'. Further, the region extraction unit 353 can accurately specify the position of the viewpoint of each monocular image using the viewpoint-related information VI', and can accurately obtain the parallax information using the monocular images and the positions of the viewpoints.

Here, the region specifying unit 356 that generates the spot light information SI' from the captured image can be said to be a detection unit that detects the incident range of the spot light emitted from the light sources 32L and 32R to the image sensor 351.

Further, the area specifying unit 356 that generates the viewpoint-related information VI' by correcting the viewpoint-related information VI using the difference between the spot light information SI' and the spot light information SI as the attachment error may also be said to be a processing unit that performs processing of correcting the viewpoint-related information VI according to the detection result of the detection unit (that is, the light image on the captured image).

< second other embodiment of Camera System >

Fig. 32 is a block diagram showing an example of an electrical configuration of a second other embodiment of a camera system to which the present technology is applied.

Note that in the drawings, portions corresponding to those in fig. 2 and 31 are given the same reference numerals, and hereinafter, description thereof will be appropriately omitted.

In fig. 32, the camera system 400 is a lens-interchangeable camera system. The camera system 400 includes a camera body 410 and a multi-eye interchangeable lens 420 (lens unit). The camera system 400 has a configuration substantially similar to that of the camera system 300 in a state where the multi-eye interchangeable lens 420 is mounted on the camera body 410, and substantially performs similar processing. That is, the camera system 400 functions as an imaging device that images a subject and generates image data of a captured image similar to the camera system 300.

The camera body 410 has a configuration in which the multi-eye interchangeable lens 420 and other general interchangeable lenses can be attached and detached, similarly to the camera body 10 having a configuration in which the multi-eye interchangeable lens 20 and the like can be attached or detached.

The multi-eye interchangeable lens 420 includes the feeding unit 23. As described with reference to fig. 1 and 2, the feeding unit 23 includes the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R.

The feeding unit 23 moves in the direction of the lens barrel optical axis within the lens barrel 21 (fig. 1), which is not shown in fig. 32. As the feeding unit 23 moves, the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R included in the feeding unit 23 also move integrally.

In fig. 32, similarly to the case of the camera system 300, the monocular lenses 31_i are configured such that the optical paths of light passing through the respective lenses are independent of one another. That is, light that has passed through each monocular lens 31_i is emitted onto mutually different positions on the light receiving surface (for example, the effective pixel region) of the image sensor 351 without entering any other monocular lens 31_i. The monocular lenses 31_i are arranged at mutually different positions, and at least part of the light that has passed through each monocular lens 31_i is emitted onto mutually different positions on the light receiving surface of the image sensor 351.

Therefore, in the camera system 400, similarly to the case of the camera system 300, the images of the subject formed via the respective monocular lenses 31_i are formed at mutually different positions in the captured image generated by the image sensor 351 (the entire image output by the image sensor 351). In other words, monocular images each having a viewpoint at the position of the corresponding monocular lens 31_i are obtained from the captured image. That is, by mounting the multi-eye interchangeable lens 420 on the camera body 410 and imaging a subject, a plurality of monocular images can be obtained.

In the camera system 400, the camera body 410 includes an image sensor 351, a RAW signal processing unit 352, an area extraction unit 353, a camera signal processing unit 354, a through image generation unit 355, an area specification unit 356, an image reconfiguration processing unit 357, a bus 360, a display unit 361, a storage unit 362, a communication unit 364, an archiving unit 365, a control unit 381, and a storage unit 382. That is, the camera body 410 has a configuration other than the lens unit 320 and the optical system control unit 384 of the camera system 300.

Note that, in addition to the above configuration, the camera body 410 includes a communication unit 441. The communication unit 441 communicates with (the communication unit 451 of) the multi-eye interchangeable lens 420 properly mounted on the camera body 410, and exchanges information and the like. The communication unit 441 may communicate with the multi-eye interchangeable lens 420 by any communication method. The communication may be wired communication or wireless communication.

For example, the communication unit 441 is controlled by the control unit 381, communicates with (the communication unit 451 of) the multi-eye interchangeable lens 420, and acquires information supplied from the multi-eye interchangeable lens 420. Further, for example, the communication unit 441 provides information supplied from the control unit 381 to the multi-eye interchangeable lens 420 by communicating with (the communication unit 451 of) the multi-eye interchangeable lens 420. The information exchanged between the communication unit 441 and the multi-eye interchangeable lens 420 is arbitrary. For example, the information may be data or control information such as commands and control parameters.

In the camera system 400, the multi-eye interchangeable lens 420 further includes an optical system control unit 384, a communication unit 451, and a storage unit 452. The communication unit 451 communicates with the communication unit 441 of the camera body 410 in a state where the multi-eye interchangeable lens 420 is properly mounted on the camera body 410. This communication enables information exchange between the camera body 410 and the multi-eye interchangeable lens 420. The communication method of the communication unit 451 is arbitrary, and may be wired communication or wireless communication. Further, the information exchanged through the communication may be data or control information such as commands and control parameters.

For example, the communication unit 451 acquires control information and other various types of information transmitted from the camera body 410 via the communication unit 441. The communication unit 451 supplies the information acquired in this manner to the optical system control unit 384 as needed, and the information can be used to control the feeding unit 23, the monocular lenses 31_i, and the like.

Further, the communication unit 451 may supply the acquired information to the storage unit 452 and store the information in the storage medium 453. Further, the communication unit 451 may read information stored in the storage medium 453 via the storage unit 452 and transmit the read information to the camera body 410 (communication unit 441).

In the camera system 400, the storage positions of the viewpoint-related information VI and the spot light information SI corresponding to the multi-eye interchangeable lens 420 are arbitrary. For example, the viewpoint-related information VI and the spot light information SI may be stored in the storage medium 453 of the multi-eye interchangeable lens 420. Then, for example, the control unit 381 of the camera body 410 may access the storage unit 452 via the communication unit 441 and the communication unit 451 to read the viewpoint-related information VI and the spot light information SI from the storage medium 453. Then, after acquiring the viewpoint-related information VI and the spot light information SI, the control unit 381 may supply them to the area specifying unit 356 and set them therein.

For example, such processing may be performed at an arbitrary timing or in response to an arbitrary trigger prior to imaging, such as when the multi-eye interchangeable lens 420 is properly mounted on the camera body 410, when the camera system 400 is powered on, or when the driving mode of the camera system 400 is switched to an imaging mode in which a subject can be imaged.

In this way, the camera body 410 can perform image processing on the captured image and the monocular images using the viewpoint-related information VI and the spot light information SI corresponding to the multi-eye interchangeable lens 420.
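
The start-up exchange described above might look roughly like the following; the method names on the body-side and lens-side objects are hypothetical, and only the direction of the data flow (lens storage, through the communication units, to the area specifying unit 356) follows the text.

```python
# Sketch of the start-up exchange between the camera body 410 and the multi-eye
# interchangeable lens 420.  All method names are hypothetical placeholders.
def on_lens_mounted(body_comm_441, area_specifying_unit_356):
    """Read VI and SI from the lens through the communication units and set them in the body."""
    vi = body_comm_441.request("viewpoint_related_information")   # via lens comm unit 451 / storage 452
    si = body_comm_441.request("spot_light_information")
    area_specifying_unit_356.set_viewpoint_related_information(vi)
    area_specifying_unit_356.set_spot_light_information(si)
    return vi, si
```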

Further, the control unit 381 may supply the viewpoint-related information VI and the spot light information SI acquired from the multi-eye interchangeable lens 420 to the storage unit 382 together with the ID of the multi-eye interchangeable lens 420 so as to store them. In this case, the storage unit 382 stores the supplied ID, viewpoint-related information VI, and spot light information SI in the storage medium 383 in association with one another. That is, the camera body 410 can manage the viewpoint-related information VI and the spot light information SI together with the ID of the multi-eye interchangeable lens 420, and can thus manage the viewpoint-related information VI and the spot light information SI of a plurality of multi-eye interchangeable lenses 420.

In this way, the next time the multi-eye interchangeable lens 420 is mounted, the control unit 381 can acquire its ID and read the viewpoint-related information VI and the spot light information SI corresponding to that ID from the storage unit 382 (storage medium 383). That is, the control unit 381 can easily acquire the viewpoint-related information VI and the spot light information SI corresponding to the multi-eye interchangeable lens 420.

Further, the storage medium 383 may store, in advance, the viewpoint-related information VI and the spot light information SI of each of a plurality of multi-eye interchangeable lenses 420 in association with the ID of each multi-eye interchangeable lens 420. That is, in this case, the camera body 410 manages the viewpoint-related information VI and the spot light information SI of the plurality of multi-eye interchangeable lenses 420 in advance.

In this way, by using the ID of the multi-eye interchangeable lens 420 properly mounted on the camera body 410, the control unit 381 can easily read the viewpoint-related information VI and the spot light information SI corresponding to the ID from the storage unit 382 (storage medium 383).
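
A simple sketch of the ID-keyed management described above is given below; the cache layout is an assumption for illustration.

```python
# Sketch of ID-keyed management of VI and SI: a lens already seen (or registered
# in advance) does not have to be queried again.
class LensInformationCache:
    def __init__(self):
        self._by_id = {}                       # lens ID -> (VI, SI)

    def store(self, lens_id, vi, si):
        self._by_id[lens_id] = (vi, si)

    def lookup(self, lens_id):
        """Return (VI, SI) for a known lens ID, or None if the lens is not registered."""
        return self._by_id.get(lens_id)
```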

< processing of attachment position deviation of the multi-eye interchangeable lens 420 >

In the lens-interchangeable camera system 400, similarly to the lens-integrated camera system 300, the attachment position of the multi-eye interchangeable lens 420 may deviate due to a manufacturing error or the movement of the feeding unit 23. Further, in the lens-interchangeable camera system 400, the attachment position of the multi-eye interchangeable lens 420 may also deviate due to a mounting error. When the attachment position of the multi-eye interchangeable lens 420 deviates and an attachment error occurs, the accuracy of the process of cutting out a monocular image from a captured image and of calculating parallax information using the monocular image is lowered.

Therefore, the area specifying unit 356 can detect the attachment error as the deviation (amount) of the attachment position of the multi-eye interchangeable lens 420 using the spot light image appearing in the captured image.

For example, the region specifying unit 356 may generate the spot light information SI' about the spot light image appearing in the captured image from the captured image, and detect the difference between the spot light information SI' and the spot light information SI as the attachment error.

Further, the region specifying unit 356 may correct the viewpoint-related information VI using the attachment error, and generate the viewpoint-related information VI' as information for specifying the position of the monocular image shifted by the attachment error.

Then, the region extraction unit 353 may accurately cut out a monocular image from the captured image using the viewpoint-related information VI'. Further, the region extraction unit 353 accurately specifies the position of the viewpoint of the monocular image using the viewpoint-related information VI', and can accurately obtain the parallax information using the monocular image and the position of the viewpoint.

As described above, in the camera system 300 or 400, the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R are fed out integrally, so that appropriate processing can be performed. That is, when a combination of an attachment error, a mounting error, and various other errors is referred to as a monocular image position error, the monocular lenses 31_0 to 31_4 and the light sources 32L and 32R move integrally with the feeding unit 23, so that the positional deviation between the viewpoint-related information and the detected position of the spot light can be detected, and the accurate position of each monocular image in the entire image can be specified at any feeding position, regardless of the cause of the monocular image position error.

< description of computer to which the present technology is applied >

Next, a series of processes of the area specifying unit 52, the image processing unit 53, the position calculating unit 57, the spot light image detecting unit 62, the feed amount detecting unit 64, the area extracting unit 353, the area specifying unit 356, and the like may be performed by hardware or software. In the case where a series of processes are executed by software, a program configuring the software is installed in a general-purpose computer or the like.

Fig. 33 is a block diagram showing a configuration example of an embodiment of a computer in which a program for executing the series of processes described above is installed.

The program may be recorded in advance in the hard disk 905 or the ROM 903 as a recording medium built in the computer.

Alternatively, the program may be stored (recorded) in a removable recording medium 911 driven by a drive 909. Such a removable recording medium 911 may be provided as a so-called software package. Here, examples of the removable recording medium 911 include a flexible disk, a compact disc read only memory (CD-ROM), a magneto-optical (MO) disk, a Digital Versatile Disc (DVD), a magnetic disk, a semiconductor memory, and the like.

Note that, in addition to installing the program from the removable recording medium 911 to the computer as described above, the program may be downloaded to the computer via a communication network or a broadcast network and installed in the built-in hard disk 905. In other words, for example, the program may be transferred to the computer from a download site in a wireless manner via an artificial satellite for digital satellite broadcasting, or transferred to the computer in a wired manner via a network such as a Local Area Network (LAN) or the internet.

The computer includes a Central Processing Unit (CPU)902, and an input/output interface 910 is connected to the CPU 902 via a bus 901.

When a user operating the input unit 907 or the like inputs a command through the input/output interface 910, the CPU 902 executes a program stored in the Read Only Memory (ROM)903 according to the command. Alternatively, the CPU 902 loads a program stored in the hard disk 905 into a Random Access Memory (RAM)904 and executes the program.

As a result, the CPU 902 executes the processing according to the above-described flowcharts or the processing of the above-described block diagrams. Then, as necessary, the CPU 902 causes the output unit 906 to output the processing result via the input/output interface 910, causes the communication unit 908 to transmit the processing result, or causes the hard disk 905 to record the processing result, for example.

Note that the input unit 907 is configured by a keyboard, a mouse, a microphone, and the like. Further, the output unit 906 is constituted by a Liquid Crystal Display (LCD), a speaker, and the like.

Here, in this specification, the processing executed by the computer according to the program does not necessarily have to be executed chronologically in the order described as the flowchart. In other words, the processing performed by the computer according to the program also includes processing performed in parallel or individually (for example, parallel processing or processing performed by an object).

Further, the program may be processed by one computer (processor), or may be processed in a distributed manner by a plurality of computers. Further, the program may be transferred to a remote computer and executed.

Further, in the present specification, the term "system" refers to a group of a plurality of configuration elements (devices, modules (components), etc.), and whether all the configuration elements are in the same housing is irrelevant. Therefore, a plurality of devices accommodated in a single housing and connected via a network and one device accommodating a plurality of modules in one housing are both systems.

Note that the embodiments of the present technology are not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present technology.

For example, in the present technology, a configuration of cloud computing may be adopted in which one function is shared and processed cooperatively by a plurality of devices through a network.

Further, the steps described in the above flowcharts may be executed by one device, or may be shared and executed by a plurality of devices.

Further, in the case where a plurality of processes are included in one step, the plurality of processes included in one step may be executed by one device, or may be shared and executed by a plurality of devices.

Further, the effects described in this specification are merely examples and are not limited, and other effects may be exhibited.

Note that the present technology can adopt the following configuration.

<1> an interchangeable lens comprising:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses.

<2> the interchangeable lens according to <1>, wherein

The one or more light sources emit non-parallel light.

<3> the interchangeable lens according to <2>, wherein

The image sensor is located between a condensing point where the non-parallel light is condensed in a case where the movable unit is sent out to a minimum feeding state of a minimum extent and a condensing point where the non-parallel light is condensed in a case where the movable unit is sent out to a maximum feeding state of a maximum extent.

<4> the interchangeable lens according to <2>, wherein

In the case where the movable unit is sent out to the maximum feeding state of the maximum extent, the condensing point where the non-parallel light is condensed is located at one of the front side and the depth side including the image sensor.

<5> the interchangeable lens according to any one of <2> to <4>, wherein

The light source is arranged at a position different from the center of the optical axis of the movable unit, and emits non-parallel light in an oblique direction inclined to the center of the optical axis.

<6> the interchangeable lens according to any one of <1> to <5>, comprising:

a plurality of the light sources.

<7> the interchangeable lens according to any one of <1> to <6>, further comprising:

a storage unit configured to store spot light position information indicating a position of a light source illuminating the image sensor and monocular image position information indicating an emission position in the image sensor, the emission position being an emission position of each imaging light emitted from the plurality of monocular lenses.

<8> an information processing apparatus comprising:

a detection unit configured to detect a light image on a captured image captured by an image sensor, the light image being a light image of illumination light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

a processing unit configured to perform processing according to a detection result of the detecting unit.

<9> the information processing apparatus according to <8>, wherein

The detection unit detects a size of a light image in the captured image.

<10> the information processing apparatus according to <9>, wherein

The processing unit detects a feeding amount of the movable unit according to a size of the light image.

<11> the information processing apparatus according to <8>, wherein

The detection unit detects a detected light image position, which is a position of the light image in the captured image.

<12> the information processing apparatus according to <11>, wherein

The processing unit detects the amount of feed of the movable unit based on the detected light image position.

<13> the information processing apparatus according to <11> or <12>, wherein

The processing unit specifies an imaged monocular image position, which is a position of a monocular image with the position of the monocular lens as a viewpoint in the captured image, from the detected light image position.

<14> the information processing apparatus according to <13>, further comprising:

a storage unit configured to store a storage light image position indicating a position of a light source illuminating the image sensor, and a storage monocular image position indicating an emission position of each imaging light emitted from the plurality of monocular lenses in the image sensor, wherein

The processing unit specifies the imaged monocular image position based on the relationship between the stored light image position and the detected light image position.

<15> the information processing apparatus according to <14>, wherein

The processing unit specifies the imaged monocular image position by correcting the stored monocular image position based on the relationship between the stored light image position and the detected light image position.

<16> the information processing apparatus according to any one of <13> to <15>, further comprising:

an association unit configured to associate the captured image with the imaged monocular image position.

<17> the information processing apparatus according to <14> or <15>, further comprising:

an association unit configured to associate the stored light image position, the detected light image position, and the stored monocular image position with the captured image.

<18> the information processing apparatus according to any one of <13> to <15>, further comprising:

an association unit configured to associate the stored light image position, the difference between the stored light image position and the detected light image position, and the stored monocular image position with the captured image.

<19> an information processing method comprising:

a detection step of detecting a light image on a captured image captured by an image sensor, the light image being a light image of irradiation light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

and a processing step of performing processing according to the detection result of the detection step.

<20> a program for causing a computer to function as:

a detection unit configured to detect a light image on a captured image captured by an image sensor, the light image being a light image of illumination light emitted from a light source of a lens unit, the lens unit including:

a lens barrel;

a movable unit configured to be movable along an optical axis with respect to the lens barrel;

a plurality of monocular lenses configured to be movable integrally with the movable unit and arranged such that emission positions of imaging light emitted by the respective monocular lenses do not overlap each other; and

one or more light sources configured to be movable along an optical axis integrally with the movable unit and the plurality of monocular lenses, and arranged such that an emission position of irradiation light emitted to an image sensor provided in the camera body does not overlap with an emission position of imaging light of each of the plurality of monocular lenses; and

a processing unit configured to perform processing according to a detection result of the detecting unit.

List of identifiers

10 Camera body

11 camera mount

20 multi-eye interchangeable lens

21 lens barrel

22 lens mount

23 feed unit

31_0 to 31_4 monocular lenses

32L, 32R, 32U, 32D light source

41 storage unit

42 communication unit

43 control unit

51 image sensor

52 area specifying unit

53 image processing unit

54 display unit

55 storage unit

56 communication unit

57 position calculating unit

61 control unit

62 spot light image detection unit

63 feed amount information storage unit

64 feed amount detecting unit

101 calibration data generating unit

102 calibration data storage unit

103 interpolation unit

104 parallax information generating unit

121 outer casing

122 LED

123, 124 lens

211 monocular image position information modifying unit

212 monocular image extraction unit

221 association unit

230 post-processing equipment

231 region specifying unit

232 image processing unit

233 display unit

234 recording unit

235 sending unit

241 monocular image position information modifying unit

242 monocular image extraction unit

250 post-processing equipment

300 camera system

320 lens unit

351 image sensor

352 RAW signal processing unit

353 region extraction unit

354 camera signal processing unit

355 through image generating unit

356 region specifying Unit

357 image reconfiguration processing unit

360 bus

361 display unit

362 storage unit

363 storage medium

364 communication unit

365 filing unit

370 association unit

381 control unit

382 storage unit

383 storage medium

384 optical system control unit

400 camera system

410 Camera body

420 multiocular interchangeable lens

441. 451 communication unit

452 storage unit

453 storage medium

901 bus

902 CPU

903 read-only memory

904 RAM

905 hard disk

906 output unit

907 input unit

908 communication unit

909 driver

910 input/output interface

911 removable recording medium
