Image processing apparatus and method

Document number: 1581032    Publication date: 2020-01-31

Reading note: This technology, "Image processing apparatus and method", was designed and created by 勝木祐伍 and 小林直树 on 2018-05-25. Its main content is as follows: The present disclosure relates to an image processing apparatus and method capable of suppressing a decrease in the accuracy of corresponding point detection. A homography transform is applied to a captured pattern image obtained as a result of an imaging unit capturing an image of a predetermined structured light pattern projected by a projection unit, and corresponding points between the projection image projected by the projection unit and the captured image captured by the imaging unit are detected using the captured pattern image to which the homography transform is applied. The present disclosure can be applied to, for example, an image processing apparatus, an image projection apparatus, a control apparatus, an information processing apparatus, a projection imaging system, an image processing method, a program, and the like.

1. An image processing apparatus, comprising:

a corresponding point detection unit that applies a homography transform to a captured pattern image obtained as a result of an imaging unit capturing an image of a predetermined structured light pattern projected by a projection unit, and detects corresponding points between a projection image projected by the projection unit and a captured image captured by the imaging unit using the captured pattern image to which the homography transform is applied.

2. The image processing apparatus according to claim 1, wherein

The corresponding point detection unit applies the homography transform based on design values of the projection unit and the imaging unit so as to convert the captured pattern image into a coordinate system as viewed from the front, and detects the corresponding points by using the captured pattern image converted into the coordinate system as viewed from the front.

3. The image processing apparatus according to claim 2, wherein

The corresponding point detection unit converts the coordinates of the four corners of the projection image projected by the projection unit into a coordinate system of the imaging unit based on the design values, and applies the homography transform to the captured pattern image using the converted coordinates of the four corners.

4. The image processing apparatus according to claim 2, wherein

The corresponding point detection unit applies an inverse homography transform, which is an inverse transform of the homography transform, to the detected corresponding points.

5. The image processing apparatus according to claim 1, wherein

The corresponding point detection unit

applies the homography transform based on design values of the projection unit and the imaging unit so as to convert the captured pattern image into a coordinate system as viewed from the front, and detects temporary corresponding points by using the captured pattern image converted into the coordinate system as viewed from the front, and

also applies a homography transform based on the detected temporary corresponding points so as to convert the captured pattern image, which has been converted into the coordinate system as viewed from the front, into a coordinate system of the projection image projected by the projection unit, and detects the corresponding points by using the captured pattern image converted into the coordinate system of the projection image.

6. The image processing apparatus according to claim 5, wherein

The corresponding point detection unit applies an inverse homography transform, which is an inverse transform of the homography transform, to the detected corresponding points.

7. The image processing apparatus according to claim 1, wherein

The captured pattern image is an image obtained by capturing an image of the structured light pattern projected superimposed onto another image.

8. The image processing apparatus according to claim 7, wherein

The captured pattern image is a difference image of the respective captured images of two projection images including the structured light pattern, the two projection images having the same shape as each other and mutually opposite luminance change directions.

9. The image processing apparatus according to claim 8, wherein

The captured pattern image is a difference image between composite images including the structured light pattern, the composite images having mutually opposite luminance change directions, each of the composite images being obtained by adding together the respective captured images of a plurality of projection images including the structured light pattern, the plurality of projection images having mutually the same luminance change direction.

10. The image processing apparatus according to claim 1, wherein

The structured light pattern includes two elliptical patterns having mutually opposite luminance change directions.

11. The image processing apparatus according to claim 10, wherein

The structured light pattern includes a plurality of the elliptical patterns having mutually different longitudinal directions.

12. The image processing device of claim 1, further comprising:

the projection unit.

13. The image processing apparatus according to claim 12, wherein

The projection unit is positioned proximate to the projection plane.

14. The image processing apparatus according to claim 12, wherein

The projection unit projects the same structured light pattern multiple times.

15. The image processing apparatus according to claim 12, wherein

A plurality of the projection units are provided, and

Each projection unit sequentially projects the structured light pattern.

16. The image processing device of claim 1, further comprising:

the imaging unit.

17. The image processing apparatus according to claim 16, wherein

The imaging unit is positioned proximate to the projection plane.

18. The image processing apparatus according to claim 16, wherein

The imaging unit captures multiple projection images of the same structured light pattern.

19. The image processing apparatus according to claim 16, wherein

A plurality of the imaging units are provided, and

Each imaging unit captures an image of the projected image of the structured light pattern.

20. An image processing method, comprising:

applying a homography transform to a captured pattern image obtained as a result of an imaging unit capturing an image of a predetermined structured light pattern projected by a projection unit, and detecting corresponding points between a projection image projected by the projection unit and a captured image captured by the imaging unit using the captured pattern image to which the homography transform is applied.

Technical Field

The present disclosure relates to image processing apparatuses and methods, and more particularly to image processing apparatuses and methods capable of suppressing a decrease in accuracy of corresponding point detection.

Background

In the related art, in order to reduce distortion of a projection image projected by a projector and to align the projection images from a plurality of projectors with each other, there are methods of capturing the projection image with a camera and using the captured image to perform geometric correction on the image to be projected according to the position and posture of the projector(s), the shape of the projection plane, and the like.

For example, an Invisible Structured Light (ISL) method that embeds a pattern image into a content image for projection has been conceived as a technique, also referred to as online sensing, for calculating corresponding points while the content image is being projected (for example, see Patent Document 1). With the ISL method, invisibility of the pattern is achieved by embedding two pattern images having the same pattern and mutually opposite luminance change directions into successive frames of the content image and projecting them.

Meanwhile, in recent years, ultra-short focus projectors have been developed which, compared with general projectors, can project a large image even when installed at a position very close to the projection plane. In the case where distortion correction is performed by the ISL method using such an ultra-short focus projector, it is conceivable to incorporate a camera into the ultra-short focus projector to make the work easier.

CITATION LIST

Patent document

Patent document 1: Japanese Patent Application Laid-Open No. 2013-192098

Disclosure of Invention

Problems to be solved by the invention

However, in this case, the camera captures an image of the projected image at a view angle looking, for example, from below upward in the vicinity of the projection plane, so pattern distortion in the captured image increases, and there is a concern that the accuracy of corresponding point detection will be reduced.

The present disclosure is designed in view of such a situation, and can suppress a decrease in accuracy of the corresponding point detection.

Problem solving scheme

An image processing apparatus according to an aspect of the present technology includes: a corresponding point detection unit that applies a homography transform to a captured pattern image obtained as a result of the imaging unit capturing an image of the predetermined structured light pattern projected by the projection unit, and detects corresponding points between the projection image projected by the projection unit and the captured image captured by the imaging unit using the captured pattern image to which the homography transform is applied.

An image processing method according to an aspect of the present technology includes: applying a homography transform to a captured pattern image obtained as a result of an imaging unit capturing an image of a predetermined structured light pattern projected by a projection unit, and detecting corresponding points between a projection image projected by the projection unit and a captured image captured by the imaging unit using the captured pattern image to which the homography transform is applied.

In the image processing apparatus and method according to aspects of the present technology, a homography transform is applied to a captured pattern image obtained as a result of an imaging unit capturing an image of a predetermined structured light pattern projected by a projection unit, and corresponding points between the projection image projected by the projection unit and the captured image captured by the imaging unit are detected using the captured pattern image to which the homography transform is applied.

Effects of the invention

According to the present disclosure, images can be processed. In particular, a decrease in the accuracy of corresponding point detection can be suppressed.

Drawings

Fig. 1 is a schematic diagram illustrating an example of a manner in which geometric correction is performed.

Fig. 2 is a schematic diagram illustrating an example of a manner in which geometric correction is performed.

Fig. 3 is a schematic diagram illustrating an example of a manner in which geometric correction is performed.

Fig. 4 is a schematic diagram illustrating an example of a manner of detecting the corresponding point.

Fig. 5 is a schematic diagram illustrating an example of a manner of detecting the corresponding point.

Fig. 6 is a schematic diagram illustrating an example of a manner of detecting the corresponding point.

Fig. 7 is a schematic diagram illustrating an example of the ISL method.

Fig. 8 is a schematic diagram illustrating an example of a structured light pattern.

Fig. 9 is a schematic diagram illustrating an example of a positive image and a negative image of a structured light pattern.

Fig. 10 is a schematic diagram illustrating an example of a manner in which an ultra-short focus projector projects an image.

Fig. 11 is a schematic diagram illustrating examples of captured pattern images.

Fig. 12 is a diagram illustrating an example of a manner in which a homography transformation is performed.

Fig. 13 is a diagram illustrating an example of design values.

Fig. 14 is a block diagram illustrating an exemplary main configuration of a projection imaging system.

Fig. 15 is a block diagram illustrating an exemplary main configuration of the control apparatus.

Fig. 16 is a functional block diagram illustrating exemplary functions implemented by the control apparatus.

Fig. 17 is a functional block diagram illustrating exemplary functions implemented by the projection imaging processing unit.

Fig. 18 is a functional block diagram illustrating exemplary functions implemented by the corresponding point detection processing unit.

Fig. 19 is a schematic diagram illustrating an example of a housing of the projection imaging apparatus.

Fig. 20 is a block diagram illustrating an exemplary main configuration of a projection imaging apparatus.

Fig. 21 is a block diagram illustrating an exemplary main configuration of the projection unit.

Fig. 22 is a schematic diagram illustrating an example of laser beam scanning.

Fig. 23 is a flowchart illustrating an example of the flow of the geometry correction processing.

Fig. 24 is a flowchart illustrating an example of the flow of the projection imaging process.

Fig. 25 is a flowchart illustrating an example of the flow of the corresponding point detection processing.

Fig. 26 is a schematic diagram illustrating an example of the pattern center of gravity detection result.

Fig. 27 is a diagram illustrating an example of a homography transformation error.

Fig. 28 is a diagram illustrating an example of a homography transformation error.

Fig. 29 is a schematic diagram illustrating an example of a housing of the projection imaging apparatus.

Fig. 30 is a block diagram illustrating another exemplary configuration of a projection imaging system.

Fig. 31 is a block diagram illustrating exemplary main configurations of a projection imaging system and a projection imaging apparatus.

Detailed Description

Hereinafter, embodiments for implementing the present disclosure (hereinafter, embodiments) will be described. Note that description will be made in the following order.

1. ISL method and corresponding point detection

2. First embodiment (projection imaging system)

3. Second embodiment (projection imaging system/projection imaging apparatus)

4. Others

<1. ISL method and corresponding point detection>

<Corresponding point detection and geometric correction>

Depending on the pose (such as the position and orientation) of a projection plane (such as a screen or wall) with respect to the projector, the shape of the projection plane, and the like, the projected image (also referred to as a projection image) may be distorted and difficult to see, as in A of Fig. 1, for example. In such a case, the distortion of the projection image can be reduced and the image made easier to see by performing geometric correction (such as distortion correction) on the image projected by the projector, as in the example of B of Fig. 1.

Also, as in the example of Fig. 2, there are systems that project images with a plurality of projectors to form a single projection image. For example, as in A of Fig. 2, there is a method of increasing the contrast ratio and achieving a high dynamic range by projecting images from a plurality of projectors onto the same position. As another example, as in B of Fig. 2, there is a method of achieving a projection image larger than that projected by a single projector (or a projection image of higher resolution) by arranging the projection images projected from the individual projectors. In the case of these methods, if the positional relationship between the projection images projected from the projectors is inappropriate, the projection images may be misaligned and superimposed on each other, or an unwanted gap may be generated, and there is a concern that the image quality of the image as a whole will be reduced.

By performing geometric correction on the images in this way, even in the case where images are projected onto a curved projection plane from a plurality of projectors as in the example in Fig. 3, the images can be projected so as to look like a single image. Note that in the case where a plurality of projection images are arranged to form a larger projection image, as in the examples in B of Fig. 2 and Fig. 3, alignment can be performed more easily by partially superimposing (overlapping) adjacent projection images on each other, as in the example in Fig. 3.

Accordingly, methods of capturing an image of a projected image projected by a projector using a camera and performing geometric correction using the captured image have been conceived.

For example, as in the example in Fig. 4, a standard light pattern 12 of a predetermined design is projected onto a screen 13 by a projector 11, and the projected standard light pattern 12 is imaged by a camera 14 to obtain a captured image 15. Subsequently, corresponding points between the standard light pattern 12 and the captured image 15 are calculated based on the design of the standard light pattern 12; the posture (positional relationship) between the projector 11 and the camera 14, the shape of the screen 13, and the like are calculated by triangulation or the like based on the corresponding points; and geometric correction is performed based on the result. By performing the processing in this manner, geometric correction can be performed more easily than in the manual case.
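For illustration only, the triangulation step can be sketched as follows. This is a minimal example using OpenCV, in which the projection matrices, the assumed baseline, and the pixel coordinates are hypothetical placeholders rather than values from this disclosure.

```python
import numpy as np
import cv2

# Hypothetical 3x4 projection matrices of the projector and the camera
# (intrinsics times extrinsics); in practice these follow from
# calibration or from design values.
P_projector = np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.eye(3)
t = np.array([[0.2], [0.0], [0.0]])  # assumed 0.2 m baseline
P_camera = np.hstack([R, t])

# One corresponding point: a projector pixel and the camera pixel that
# observe the same position on the projection plane (2xN arrays, N = 1).
pt_projector = np.array([[320.0], [240.0]])
pt_camera = np.array([[352.0], [243.0]])

# Triangulate to homogeneous coordinates, then dehomogenize to obtain
# the 3D position of the point on the projection plane.
X_h = cv2.triangulatePoints(P_projector, P_camera, pt_projector, pt_camera)
X = (X_h[:3] / X_h[3]).ravel()
print("3D point on the projection plane:", X)
```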

In the case where geometric correction is performed using a camera in this manner, it is necessary to calculate corresponding points between the projection image (or the image to be projected) and the captured image, that is, pixels in the projection image and the captured image that correspond to the same position on the projection plane. In other words, it is necessary to calculate the correspondence between the pixels of the camera 14 (captured image 15) and the pixels of the projector 11 (standard light pattern 12).

Moreover, in the case where a plurality of projectors are used, as in the examples in Fig. 2 and Fig. 3, it is also necessary to calculate the positional relationship of the projected images with each other.

For example, as in the example in Fig. 5, suppose that an image is to be projected through cooperation between the projection imaging device 20-1, which includes the projection unit 21-1 (projector) and the imaging unit 22-1 (camera), and the projection imaging device 20-2, which includes the projection unit 21-2 (projector) and the imaging unit 22-2 (camera). Hereinafter, in the case where it is not necessary to distinguish the projection imaging device 20-1 and the projection imaging device 20-2 in the description, they will be referred to as the projection imaging device(s) 20. Similarly, the projection unit 21-1 and the projection unit 21-2 will be referred to as the projection unit(s) 21, and the imaging unit 22-1 and the imaging unit 22-2 will be referred to as the imaging unit(s) 22.

As illustrated in Fig. 5, the projection area (range of the projection image) of the projection unit 21-1 of the projection imaging device 20-1 on the projection plane 23 is the range from P0L to P0R. Further, the projection area of the projection unit 21-2 of the projection imaging device 20-2 on the projection plane 23 is the range from P1L to P1R. In other words, the range indicated by the double arrow 24 (the range from P1L to P0R) becomes an overlapping region where the projection images are superimposed on each other.

Note that the imaging area (the range included in the captured image) in the projection plane 23 of the imaging unit 22-1 of the projection imaging apparatus 20-1 is a range from C0L to C0R. Also, the imaging area (the range included in the captured image) of the imaging unit 22-2 of the projection imaging device 20-2 in the projection plane 23 is a range from C1L to C1R.

In the case of such a system, as described above, in order to align the projection images with each other, it is necessary to calculate not only the corresponding points between the projection unit 21 and the imaging unit 22 in each projection imaging device 20 but also the corresponding points between the projection unit 21 and the imaging unit 22 of different projection imaging devices 20. Therefore, as in Fig. 6, for example, light is radiated from a specific pixel of the projection unit 21-1 (arrow 27), reflected at point X on the projection plane 23, and received by a pixel of the imaging unit 22-2 (arrow 28), whereby the pixel correspondence is calculated; a similar pixel correspondence relationship is also calculated between the projection unit 21-2 and the imaging unit 22-1.

In this way, by calculating the corresponding points between all the projection units 21 and the imaging units 22 for which the corresponding points can be calculated, the alignment of the overlapping regions (the range illustrated by the double arrow 24) can be performed by geometric correction.

<Online sensing>

Although it is conceivable to perform such corresponding point detection for geometric correction before starting projection of the visual image, there is a concern that the corresponding points will shift after the initial installation due to external disturbances or the like (such as temperature and vibration during projection of the visual image). If the corresponding points shift, the geometric correction becomes inappropriate, and there is a concern that the projected image will be distorted and misaligned.

In such a case, it is necessary to detect the corresponding points again, but interrupting the projection of the visual image for that purpose is undesirable for a user who is viewing it (there is a concern that user satisfaction will be lowered). Therefore, a method of detecting corresponding points while continuing to project the visual image (online sensing) has been conceived.

For example, a method using invisible light (such as infrared light), a method using image features (such as SIFT), the Invisible Structured Light (ISL) method, and the like have been conceived as online sensing techniques. In the case of a method using invisible light such as infrared light, a projector that projects the invisible light (for example, an infrared light projector) is additionally necessary, so there is a concern that the cost will increase. Also, in the case of using image features (such as SIFT), since the detection accuracy and detection density of the corresponding points depend on the image content to be projected, it is difficult to perform corresponding point detection with stable accuracy.

In contrast, since the ISL method uses visible light, an increase in the structural elements of the system (i.e., an increase in cost) can be suppressed. Also, corresponding point detection can be performed with stable accuracy without depending on the image to be projected.

<ISL method>

The ISL method is a technique in which a predetermined pattern image (i.e., a structured light pattern) is embedded into the projection as a positive image and a negative image, and projected so that the pattern image is not perceived by humans.

As illustrated in Fig. 7, by adding a predetermined structured light pattern to a certain frame of the input image, the projector generates a frame image in which a positive image of the structured light pattern is synthesized with the input image (content image); and by subtracting the structured light pattern from the next frame of the input image, the projector generates a frame image in which a negative image of the structured light pattern is synthesized with the input image.

Meanwhile, the camera captures images of the projected images of these frames, and extracts only the structured light pattern included in the captured images by calculating the difference between the captured images of the two frames. Corresponding point detection is performed using the extracted structured light pattern.

In this way, with the ISL method, since the structured light pattern can be extracted easily by simply calculating the difference between the captured images, corresponding point detection can ideally be performed with stable accuracy without depending on the image to be projected.
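To make the positive/negative mechanism concrete, the following is a minimal numpy sketch (an illustration added here, assuming an ideal camera that returns the projected frames unchanged; real captures add noise, which is addressed by the averaging discussed later).

```python
import numpy as np

def embed_isl_frames(content, pattern):
    """Build two consecutive frames: content + pattern (positive image)
    and content - pattern (negative image), clipped to 8-bit range."""
    c = content.astype(np.int16)
    pos = np.clip(c + pattern, 0, 255).astype(np.uint8)
    neg = np.clip(c - pattern, 0, 255).astype(np.uint8)
    return pos, neg

def extract_pattern(captured_pos, captured_neg):
    """Difference the two captured frames: the content component cancels
    and the pattern component is doubled, with its sign preserved."""
    return captured_pos.astype(np.int16) - captured_neg.astype(np.int16)

content = np.full((480, 640), 128, dtype=np.uint8)  # stand-in content frame
pattern = np.zeros((480, 640), dtype=np.int16)
pattern[200:220, 300:330] = 20  # one low-amplitude blob of the pattern
pos, neg = embed_isl_frames(content, pattern)
diff = extract_pattern(pos, neg)  # ~2x pattern wherever clipping did not occur
```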

<Structure of the structured light pattern>

A specific example of a structured light pattern is illustrated in Fig. 8. The pattern image 100 illustrated in Fig. 8 is a structured light pattern that is projected superimposed onto a content image in the ISL method. The pattern image 100 is used to detect corresponding points between the projection image projected by the projection unit and the captured image captured by the imaging unit (i.e., the pixel correspondence between the projection unit and the imaging unit), and, as illustrated in Fig. 8, includes a plurality of elliptical luminance distribution patterns 101 whose luminance values differ from their surroundings. In other words, in the pattern image 100, a plurality of patterns 101 having luminance values different from their surroundings are arranged (formed).

In fig. 8, a white elliptical pattern 101 illustrates an example of a pattern in which the luminance change direction is a positive direction, and a black elliptical pattern 101 illustrates an example of a pattern in which the luminance change direction is a negative direction. Each pattern 101 may have any size, and the sizes of the patterns 101 may be the same as each other, or may include patterns 101 having different sizes. Also, the patterns 101 may have the same luminance distribution as each other, or may include patterns 101 having different luminance distributions.

In the case of the ISL method, the pattern image 100 having such a configuration is projected superimposed onto another image (e.g., a content image). At this time, similarly to the case described with reference to Fig. 7, the luminance values of the pattern image 100 are added to a certain frame of the content image before projection, and subtracted from the next frame before projection. In other words, as illustrated in Fig. 9, the pattern image 100 is superimposed onto the content image as a positive image 100-1 and a negative image 100-2, where the negative image 100-2 is an image obtained by inverting the sign of the luminance values of the positive image 100-1.

By projecting such a positive image 100-1 and negative image 100-2 superimposed on two consecutive frames, the integration effect makes it harder for a user viewing the projected images to perceive the pattern image 100 (i.e., it promotes the invisibility of the pattern image 100).

<Ultra-short focus projector>

Meanwhile, there are ultra-short focus projectors capable of projecting a large image even when installed at a position very close to the projection plane, compared with a general projector. For example, as illustrated in Fig. 10, the ultra-short focus projector 111 is installed near a wall 113, such as on top of a table 112, and projects an image (projection image 114) onto the wall 113. In other words, the ultra-short focus projector 111 performs image projection from the vicinity of the projection plane, for example as if viewing the projection image 114 from below upward.

Also, if it is assumed that the projector and the camera required for the above-described ISL method are formed as separate devices that can each be installed at any position, the relative positions of these devices need to be calculated in order to correctly perform triangulation in corresponding point detection (distortion correction). By providing (integrating) the projector and the camera in a single housing, the relative positions can be treated as known information (the work of calculating them becomes unnecessary), and therefore corresponding point detection (distortion correction) can be made easier (simplified).

However, if a camera is incorporated into the ultra-short focus projector 111, the camera will capture an image of the projected image at an angle looking, for example, from below upward near the projection plane, pattern distortion in the captured image will increase, and there is a concern that the accuracy of detecting corresponding points will decrease. The captured pattern image 121 in A of Fig. 11 is a captured image of a pattern image obtained by capturing an image of the projected image from the front; by contrast, the captured pattern image 122 in B of Fig. 11, captured from the vicinity of the projection plane, shows markedly greater pattern distortion.

<Application of homography transform to pattern image>

Accordingly, a homography transform is applied to a captured pattern image obtained as a result of the imaging unit capturing an image of the predetermined structured light pattern projected by the projection unit, and corresponding points between the projection image projected by the projection unit and the captured image captured by the imaging unit are detected by using the captured pattern image to which the homography transform is applied.

For example, the plane in which the patterns are arranged in the detected captured pattern image 122 (as illustrated in B of Fig. 11) is projected, by a homography transform (projective transform), onto the plane as viewed from the front of the projection plane. By converting the patterns into the state of the projection plane as viewed from the front in this way, pattern distortion, size variation, and the like can be suppressed (in other words, the patterns become closer to the shape of the patterns in the image to be projected). Therefore, by detecting the corresponding points using the captured pattern image after the homography transform, a decrease in the accuracy of corresponding point detection can be suppressed.

<System homography transform>

As the homography transform, for example, a homography transform based on the known design information (design values) of the projection unit (for example, a projector) and the imaging unit (for example, a camera) may be applied. Such a homography transform based on design values is also referred to as a system homography transform.

For example, as illustrated in Fig. 12, (each coordinate of) the plane in which the patterns are arranged in the captured pattern image 122 in B of Fig. 11 is projected onto the plane (coordinate system) as viewed from the front of the projection plane by using a system homography matrix Hs calculated from the design values. By applying the system homography transform to the captured pattern image 122 in this manner, a captured pattern image 123 is obtained. In other words, pattern distortion, size variation, and the like can be suppressed. Therefore, by detecting the corresponding points using the captured pattern image 123 after the system homography transform, a decrease in the accuracy of corresponding point detection can be suppressed.

The system homography matrix Hs may be calculated in any manner, but may be calculated, for example, by using the four corner points of the projection image. For example, the world coordinates of the four corner points of the projection image on the projection plane are calculated (P1, P2, P3, and P4). As illustrated in A of Fig. 13, assuming that the origin of the world coordinates is set at the center of the projection image, the size in the horizontal direction is a (mm), the size in the vertical direction is b (mm), and 1 (mm) = 1 for both the x-coordinate and the y-coordinate, the world coordinates of the four corners of the projection image are P1(a/2, b/2), P2(a/2, -b/2), P3(-a/2, -b/2), and P4(-a/2, b/2).

Next, the world coordinates (P1 to P4) of the four corners are transformed into the camera coordinate system by using the approximately known internal parameters of the camera (imaging unit). In other words, the positions (coordinates) in the captured image at which the four corner points of the projection image projected onto the projection plane appear (i.e., the correspondence between the projection plane and the captured image) are specified by using information such as the position of the imaging unit, the image capturing direction, and the angle of view. For example, as illustrated in B of Fig. 13, if this information is known, the correspondence between the projection plane (world coordinates) and the captured image (camera coordinate system), i.e., the system homography matrix Hs, can be easily calculated.
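As a sketch of this computation (the projection image size and the camera-side pixel coordinates below are invented placeholders; in the actual system they follow from the design values):

```python
import numpy as np
import cv2

a, b = 1600.0, 900.0  # assumed projection image size in mm (design values)

# World coordinates of the four corners, origin at the projection center.
P = np.array([[ a / 2,  b / 2],   # P1
              [ a / 2, -b / 2],   # P2
              [-a / 2, -b / 2],   # P3
              [-a / 2,  b / 2]],  # P4
             dtype=np.float32)

# The same corners as they appear in the captured image (camera
# coordinate system); placeholder pixel values.
corners_cam = np.array([[1180, 95], [1240, 660], [40, 690], [95, 80]],
                       dtype=np.float32)

# Express the front view as image pixels: shift the origin to the
# top-left corner and flip the y-axis so the warp output fits the viewport.
corners_front = np.stack([P[:, 0] + a / 2, b / 2 - P[:, 1]], axis=1)

# Hs maps the camera plane onto the front-view plane.
Hs = cv2.getPerspectiveTransform(corners_cam, corners_front)

# Warp the captured pattern image into the coordinate system as viewed
# from the front before detecting the patterns.
captured = np.zeros((720, 1280), dtype=np.uint8)  # stand-in captured image
front_view = cv2.warpPerspective(captured, Hs, (int(a), int(b)))
```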

In other words, by applying the system homography transform as the homography transform, it is possible to more easily suppress the reduction in accuracy of the corresponding point detection.

Note that in order to restore corresponding points detected in the coordinate system after the homography transform back to the original coordinate system (the coordinate system of the captured pattern image 122), it is sufficient to apply the inverse transform of the homography transform (also referred to as an inverse homography transform) to the corresponding points. For example, to restore corresponding points detected after the system homography transform back to the original coordinate system, it is sufficient to apply the inverse of the system homography transform described above (also referred to as an inverse system homography transform) to the obtained corresponding points P, that is, Hs⁻¹P, as illustrated in Fig. 12. In other words, in this case, the inverse matrix Hs⁻¹ of the system homography matrix Hs is used as the homography matrix.
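Restoring points detected in the front view back to the camera coordinate system is then a single perspective transform with the inverse matrix, sketched under the same assumptions:

```python
import numpy as np
import cv2

Hs = np.eye(3)  # stand-in; use the Hs computed in the previous sketch

# Corresponding points P detected in the front-view coordinate system,
# shaped (N, 1, 2) as cv2.perspectiveTransform expects.
pts_front = np.array([[[800.0, 450.0]], [[400.0, 225.0]]], dtype=np.float32)

# Hs^-1 maps front-view points back to the coordinate system of the
# original captured pattern image.
pts_camera = cv2.perspectiveTransform(pts_front, np.linalg.inv(Hs))
```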

However, the system homography transform is derived under several constraints, such as the projection unit (projector) and the projection plane being parallel and the internal parameters of the imaging unit (camera) being known to some extent, and therefore errors may be generated during actual operation.

<Algorithmic homography transform>

Therefore, as illustrated in Fig. 12, in addition to the system homography transform, a homography transform into the coordinate system of the image projected by the projection unit, based on information on the corresponding points detected by using the captured pattern image 123 after the system homography transform, can be applied to the captured pattern image 123. Such a homography transform based on corresponding points is also referred to as an algorithmic homography transform.

For example, as illustrated in Fig. 12, corresponding points between the image projected by the projection unit and the captured image captured by the imaging unit are first calculated by using the captured pattern image 123. The plane in which the patterns are arranged in the captured pattern image 123 is then projected onto the coordinate system (plane) of the image projected by the projection unit, using an algorithmic homography matrix Ha calculated from these corresponding points. By additionally applying the algorithmic homography transform to the captured pattern image 123 in this way, a captured pattern image 124 is obtained.

Note that in order to restore corresponding points detected in the coordinate system after the algorithmic homography transform back to the original coordinate system (the coordinate system of the captured pattern image 123), it is sufficient to apply the inverse of the algorithmic homography transform described above (also referred to as an inverse algorithmic homography transform) to the obtained corresponding points P, that is, Ha⁻¹P, as illustrated in Fig. 12. In other words, in this case, the inverse matrix Ha⁻¹ of the algorithmic homography matrix Ha is used as the homography matrix. Note that by additionally applying the inverse system homography transform, the corresponding points can be restored back to the coordinate system of the captured pattern image 122.
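One way to realize the algorithmic stage, sketched with cv2.findHomography standing in for however the matrix Ha is actually estimated (the point sets here are synthetic stand-ins):

```python
import numpy as np
import cv2

# Temporary corresponding points: pattern centers detected in the
# front-view image 123 and the matching centers in the image that the
# projector actually projected (synthetic values for illustration).
pts_front = (np.random.rand(20, 1, 2) * 500).astype(np.float32)
pts_projector = pts_front + np.random.randn(20, 1, 2).astype(np.float32)

# Ha maps the front-view plane onto the projector's image plane;
# RANSAC rejects badly detected pattern centers.
Ha, inlier_mask = cv2.findHomography(pts_front, pts_projector,
                                     cv2.RANSAC, 3.0)

# After corresponding points are redetected in the projector coordinate
# system (image 124), chaining the inverses (first Ha^-1, then Hs^-1)
# returns them to the camera coordinate system.
Hs = np.eye(3)  # stand-in; the system homography from the earlier sketch
back_to_camera = np.linalg.inv(Hs) @ np.linalg.inv(Ha)
```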

<2. First embodiment>

<Projection imaging system>

In Fig. 14, the projection imaging system 300 is a system capable of projecting an image by the ISL method, capturing an image of the projected image, and performing corresponding point detection according to the methods of the present technology described in <1. ISL method and corresponding point detection>.

As illustrated in Fig. 14, the projection imaging system 300 includes a control device 301 and projection imaging devices 302-1 to 302-N (where N is any natural number). The projection imaging devices 302-1 to 302-N are connected to the control device 301 via cables 303-1 to 303-N, respectively.

Hereinafter, in the case where it is not necessary to distinguish the projection imaging devices 302-1 to 302-N in the description, the projection imaging devices 302-1 to 302-N will be referred to as projection imaging device(s) 302. Also, in the case where it is not necessary to distinguish the cables 303-1 to 303-N in the description, the cables 303-1 to 303-N will be referred to as a cable(s) 303.

The control device 301 controls each projection imaging device 302 through the cable 303. For example, the control device 301 may supply an image to be projected and cause each projection imaging device 302 to project the image. As another example, the control device 301 may instruct each projection imaging device 302 to capture an image of the projected image or the like and obtain the captured image. As yet another example, the control device 301 may detect corresponding points between the projected image and the captured image, and perform geometric correction on the image to be projected by each projection imaging device 302 based on the calculated corresponding points.

The projection imaging devices 302-1 to 302-N include projection units 311-1 to 311-N that project images and imaging units 312-1 to 312-N that capture images of objects, respectively. Hereinafter, in the case where the projection units 311-1 to 311-N are not necessarily distinguished in the description, the projection units 311-1 to 311-N will be referred to as a projection unit(s) 311. Also, in the case where it is not necessary to distinguish the imaging units 312-1 to 312-N in the description, the imaging units 312-1 to 312-N will be referred to as the imaging unit(s) 312.

The projection unit 311 has a function of a so-called projector. In other words, the projection imaging device 302 can be driven as a projector by using the projection unit 311. For example, the projection imaging apparatus 302 may use the projection unit 311 to project an image supplied from the control apparatus 301 onto any projection surface.

The imaging unit 312 has a function of a so-called camera. In other words, the projection imaging device 302 can be driven as a camera by using the imaging unit 312. For example, the projection imaging device 302 may use the imaging unit 312 to capture an image of a projection plane onto which the projection unit 311 projects the image, and supply obtained data of the captured image to the control device 301.

In other words, the projection imaging device 302 has the function of a so-called projector and the function of a so-called camera, and is capable of, for example, projecting an image onto a projection plane and capturing an image of the projection plane. Further, as a projector, the projection imaging device 302 has the function of a so-called ultra-short focus projector: compared with a general projector, the projection imaging device 302 can project a large image even when installed at a position very close to the projection plane. In other words, for example, as illustrated in Fig. 10, the projection imaging device 302 is installed near the wall 113 serving as the projection plane, and projects an image and captures an image of the projection plane from that position.

There may be any number of projection imaging devices 302, whether a single device or multiple devices. In the case where there are a plurality of projection imaging devices 302, the projection imaging devices 302 may cooperate with each other under the control of the control device 301 and project images as described with reference to Figs. 2 and 3. In other words, the projection imaging system 300 in this case is a so-called multi-projection system, and is capable of realizing so-called projection mapping.

Note that the projection direction and magnification in which the projection unit 311 projects an image, distortion correction of a projected image, and the like may also be controllable. To achieve such control, for example, the position and posture of an optical system included in the projection unit 311 or the entire projection unit 311 may be controllable.

In addition, the image capturing direction and angle of view in which the imaging unit 312 captures an image, distortion correction of the captured image, and the like may also be controllable. To achieve such control, for example, the position and posture of an optical system included in the imaging unit 312 or the entire imaging unit 312 may be controllable.

Further, such control of the projection unit 311 and control of the imaging unit 312 may be performed independently of each other. Also, the position and attitude of the projection imaging device 302 may be controllable. Note that such control of the projection unit 311, the imaging unit 312, and the projection imaging device 302 may be performed by the control device 301 or a device other than the control device 301.

The cable 303 is an electrical communication cable of any communication standard by which a communication channel between the control apparatus 301 and the projection imaging apparatus 302 can be formed. Note that it is sufficient that the control device 301 and the projection imaging device 302 can communicate with each other, and for example, the control device 301 and the projection imaging device 302 may also be connected by wireless communication. In this case, the cable 303 may be omitted.

In such a projection imaging system 300, in order to perform geometric correction on an image, the control device 301 performs corresponding point detection between each projection unit 311 and each imaging unit 312. For example, the control device 301 may perform corresponding point detection according to the ISL method, a form of online sensing. At this time, the control device 301 may perform corresponding point detection to which the present technique is applied.

<Control device>

Fig. 15 is a block diagram illustrating an exemplary main configuration of the control device 301, which is an embodiment of an image processing apparatus to which the present technology is applied.

As illustrated in Fig. 15, the control device 301 includes a Central Processing Unit (CPU) 321, a Read Only Memory (ROM) 322, a Random Access Memory (RAM) 323, a bus 324, an input/output interface 330, an input unit 331, an output unit 332, a storage unit 333, a communication unit 334, and a drive 335.

The CPU 321, the ROM 322, and the RAM 323 are connected to each other via the bus 324. In addition, the input/output interface 330 is also connected to the bus 324. The input unit 331, the output unit 332, the storage unit 333, the communication unit 334, and the drive 335 are connected to the input/output interface 330.

The input unit 331 includes an input device that receives external information such as user input. For example, the input unit 331 may include a keyboard, a mouse, operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like. In addition, various sensors (such as an acceleration sensor, an optical sensor, and a temperature sensor) and an input device (such as a barcode reader) may also be included in the input unit 331. The output unit 332 includes an output device that outputs information (such as images and sounds). For example, the output unit 332 may include a display, a speaker, an output terminal, and the like.

The storage unit 333 includes a storage medium storing information such as programs and data. For example, the storage unit 333 may include a hard disk, a RAM disk, a nonvolatile memory, and the like. The communication unit 334 includes a communication device that performs communication by exchanging information (such as programs and data) with an external device via a predetermined communication medium (any network such as, for example, the internet). For example, the communication unit 334 may include a network interface. For example, the communication unit 334 communicates (exchanges programs and data) with a device external to the control device 301. Note that the communication unit 334 may have a wired communication function, a wireless communication function, or both.

The drive 335 reads out information (such as programs and data) stored in a removable medium 341 (such as, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory) loaded into the drive 335 itself. The drive 335 supplies the information read out from the removable medium 341 to the CPU 321, the RAM 323, and the like. Also, in the case where the writable removable medium 341 is loaded into the drive 335 itself, the drive 335 can cause information (such as programs and data) supplied from the CPU 321, the RAM 323, and the like to be stored in the removable medium 341.

For example, the CPU 321 loads a program stored in the storage unit 333 into the RAM 323 through the input/output interface 330 and the bus 324 and executes the program to perform various processes. The RAM 323 also appropriately stores data necessary for the CPU 321 to execute various processes and the like.

By executing programs and the like in this manner, the CPU 321 can execute processes related to detecting corresponding points, such as, for example, the processes described in <1. ISL method and corresponding point detection>.

<Functional blocks of the control device>

Fig. 16 is a functional block diagram illustrating an example of functions realized by the control device 301 executing a program or the like. As illustrated in Fig. 16, by executing a program, the control device 301 functions as, for example, a projection imaging processing unit 351, a corresponding point detection processing unit 352, and a geometric correction processing unit 353.

The projection imaging processing unit 351 performs processing related to image projection and image capturing. For example, the projection imaging processing unit 351 performs image processing or the like on an image to be projected by the projection unit 311. Also, the projection imaging processing unit 351 controls the projection unit 311 to perform processing related to controlling image projection. Further, the projection imaging processing unit 351 controls the imaging unit 312 to perform processing related to controlling image capturing.

More specifically, for example, the projection imaging processing unit 351 synthesizes the pattern image with the content image as described in <ISL method> and elsewhere in <1. ISL method and corresponding point detection>, controls the projection of the synthesized image, controls the image capturing of the projected image, and so on. Obviously, the projection imaging processing unit 351 may perform any processing, and is not limited to the processing described above.

The corresponding point detection processing unit 352 performs processing related to detecting corresponding points based on the captured images obtained under the control of the projection imaging processing unit 351. For example, the corresponding point detection processing unit 352 performs processes like those described in <ISL method>, <Application of homography transform to pattern image>, <System homography transform>, <Algorithmic homography transform>, and elsewhere in <1. ISL method and corresponding point detection>.

More specifically, for example, the corresponding point detection processing unit 352 performs processes such as generating a pattern difference image from captured pattern images having a composition like that of the captured pattern image 122 (Fig. 12), the system homography transform, the algorithmic homography transform, corresponding point detection, and the inverse homography transform.

The geometric correction processing unit 353 performs processing related to geometric correction of an image to be projected. For example, the geometric correction processing unit 353 performs processing such as pose estimation for a projection unit or the like, reconfiguration of a screen (projection plane), and geometric correction for an image to be projected, based on the corresponding point detected by the corresponding point detection processing unit 352. Obviously, the geometry correction processing unit 353 may perform any processing, not limited to the above-described processing.

Note that the blocks can exchange information (such as, for example, commands and data) with each other as needed.

<Projection imaging processing unit>

An example of the functions included in the projection imaging processing unit 351 is illustrated in Fig. 17. In Fig. 17, the projection imaging processing unit 351 includes functions represented by functional blocks such as, for example, a processing control unit 361, a projection control unit 362, and an imaging control unit 363.

The process control unit 361 performs a process related to controlling the projection imaging process. For example, the processing control unit 361 performs processing such as selecting a projection unit to be processed and managing a processing count. Obviously, the processing control unit 361 may perform any processing, not limited to the above-described processing.

The projection control unit 362 performs processing related to controlling image projection by the projection unit 311. For example, the projection control unit 362 may superimpose (synthesize) a pattern image (a positive or negative image of the structured light pattern) on another image (such as, for example, a content image), supply the synthesized image (superimposed image) to the projection unit 311, and control the projection of the synthesized image by controlling the projection unit 311. For example, using the pattern image 100, which includes elliptical patterns 101 with a plurality of luminance change directions and longitudinal directions as illustrated in Figs. 8 and 9, the projection control unit 362 projects the pattern image according to the ISL method as described with reference to Fig. 7 and the like.

The imaging control unit 363 performs processing related to controlling the image capturing of the projection image projected onto the projection plane by the projection unit 311. For example, the imaging control unit 363 controls the imaging unit 312 to capture an image of the projection image at a timing corresponding to the image projection by the projection unit 311, which is controlled by the projection control unit 362. That is, the imaging control unit 363 performs image capturing corresponding to the projection of the pattern image according to the ISL method as described with reference to Fig. 7 and the like.

As described with reference to Figs. 10 to 12 and the like, since the projection imaging device 302 is installed near the projection plane, the imaging unit 312 captures images in a direction looking, for example, from below upward near the projection plane. In other words, under the control of the imaging control unit 363, a captured pattern image having a composition like that of the captured pattern image 122 illustrated in Fig. 12 is generated, for example.

Note that the blocks can exchange information (such as, for example, commands and data) with each other as needed.

<Corresponding point detection processing unit>

An example of the functions included in the corresponding point detection processing unit 352 is illustrated in Fig. 18. In Fig. 18, the corresponding point detection processing unit 352 includes functions represented by functional blocks such as, for example, a control unit 371, a noise reduction unit 372, a pattern difference image generation unit 373, a system homography transform unit 374, a corresponding point detection unit 375, an algorithmic homography transform unit 376, a corresponding point detection unit 377, and an inverse homography transform unit 378.

The control unit 371 performs processing related to controlling corresponding point detection. For example, the control unit 371 performs processes such as selecting a captured pattern image to be processed. Obviously, the control unit 371 may perform any processing, and is not limited to the processing described above.

The noise reduction unit 372 performs processing related to noise reduction. For example, the noise reduction unit 372 reduces noise in (improves the S/N ratio of) a captured pattern image (e.g., a captured pattern image including the positive image or a captured pattern image including the negative image) by adding together a plurality of captured pattern images of the same type, each obtained by the imaging unit 312 capturing an image of the projected image of a composite image (superimposed image) in which the same type of pattern image is synthesized with (superimposed on) the content image and projected by the projection unit 311.

The pattern difference image generation unit 373 performs processing related to detecting the patterns 101. For example, the pattern difference image generation unit 373 generates a pattern difference image by calculating the difference between captured pattern images obtained by capturing images of the projected images of composite images (superimposed images) in which different types of pattern images are synthesized with (superimposed on) the content image (for example, by subtracting a captured pattern image including the negative image from a captured pattern image including the positive image). In other words, the pattern difference image is a difference image of the respective captured images of two projection images including the structured light pattern, the two projection images having the same shape as each other and mutually opposite luminance change directions.

Due to this difference, in the pattern difference image, the components of the content image included in the captured pattern images cancel out and are suppressed, whereas the components of the patterns 101 are combined so that their luminance change directions become the same, and are thereby emphasized. That is, by this processing, the patterns 101 are detected from the captured pattern images. In other words, the pattern difference image is an image including the detected patterns 101. Obviously, the pattern difference image generation unit 373 may perform any processing, and is not limited to the processing described above.
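A compact sketch of what these two units might compute, under the assumption of one 8-bit camera frame per projected frame and several captures per pattern polarity:

```python
import numpy as np

def reduce_noise(captures):
    """Average repeated captures of the same projected composite image
    to improve the S/N ratio (the role of the noise reduction unit 372)."""
    return np.mean(np.stack(captures).astype(np.float32), axis=0)

def pattern_difference(pos_avg, neg_avg):
    """Subtract the averaged negative capture from the averaged positive
    capture: the content cancels while the pattern's luminance changes
    reinforce (the role of the pattern difference image generation
    unit 373)."""
    return pos_avg - neg_avg

pos_caps = [np.random.randint(0, 256, (480, 640), dtype=np.uint8)
            for _ in range(4)]
neg_caps = [np.random.randint(0, 256, (480, 640), dtype=np.uint8)
            for _ in range(4)]
diff = pattern_difference(reduce_noise(pos_caps), reduce_noise(neg_caps))
```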

The system homography transform unit 374 performs processing related to the homography transform based on design values. For example, the system homography transform unit 374 performs processes like those described in <System homography transform> and elsewhere in <1. ISL method and corresponding point detection>.

The corresponding point detection unit 375 performs processing related to detecting corresponding points. For example, the corresponding point detection unit 375 performs processes like those described in <System homography transform> and elsewhere in <1. ISL method and corresponding point detection>. For example, the corresponding point detection unit 375 detects corresponding points between the projection image and the captured image (in other words, the correspondence between the pixels of the projection unit 311 and the pixels of the imaging unit 312) using the patterns 101 in the pattern difference image after the system homography transform.

The algorithmic homography transform unit 376 performs processing related to the homography transform based on corresponding points. For example, the algorithmic homography transform unit 376 performs processes like those described in <Algorithmic homography transform> and elsewhere in <1. ISL method and corresponding point detection>.

The corresponding point detection unit 377 performs processing related to detecting corresponding points. For example, the corresponding point detection unit 377 performs processes like those described in <Algorithmic homography transform> and elsewhere in <1. ISL method and corresponding point detection>. For example, the corresponding point detection unit 377 detects corresponding points between the projection image and the captured image (in other words, the correspondence between the pixels of the projection unit 311 and the pixels of the imaging unit 312) using the patterns 101 in the pattern difference image after the algorithmic homography transform.

The inverse homography transform unit 378 performs processing related to the inverse homography transform. For example, the inverse homography transform unit 378 performs processes like those described in < system homography transform >, < algorithm homography transform >, and the like in <1.ISL method and corresponding point detection >. For example, the inverse homography transform unit 378 applies the inverse algorithm homography transform and the inverse system homography transform to the corresponding points detected by the corresponding point detecting unit 377, so as to restore them to the coordinate system of the original pattern difference image.

In other words, these processing units perform processes like those described with reference to fig. 12, fig. 13, and the like. Note that the blocks can exchange information (such as commands and data) with each other as needed.

< projection imaging apparatus >

Fig. 19 is a perspective view illustrating an appearance state of the projection imaging apparatus 302. As illustrated in fig. 19, the projection unit 311 and the imaging unit 312 are securely disposed at predetermined positions in the housing of the projection imaging device 302. The projection unit 311 is formed to project at a predetermined angle with respect to the housing, and the imaging unit 312 is formed to take an image at a predetermined angle with respect to the housing.

With this arrangement, the relative positions of the projection unit 311 and the imaging unit 312, the relative angles of projection and image capturing, the angle of view, and the like can be regarded as preset known information. Therefore, system homography transformation can be easily implemented. Also, since the baseline between the projection unit 311 and the imaging unit 312 can be fixed, distortion of the projected image can be corrected with only the housing of the single projection imaging device 302.

In other words, the shot pattern image obtained by the imaging unit 312 becomes an image having a composition like that of the shot pattern image 122 in fig. 12.

Fig. 20 is a block diagram illustrating an exemplary main configuration of the projection imaging apparatus 302. As illustrated in fig. 20, the projection imaging apparatus 302 includes a control unit 401, the projection unit 311, the imaging unit 312, an input unit 411, an output unit 412, a storage unit 413, a communication unit 414, and a drive 415.

The control unit 401 includes, for example, a CPU, a ROM, a RAM, and the like, and controls each processing unit within the apparatus and executes various processing required for the control, such as, for example, image processing. For example, the control unit 401 executes these processes based on the control of the control device 301.

The projection unit 311 is controlled by the control unit 401 to perform processing related to projecting an image. For example, the projection unit 311 projects the image supplied from the control unit 401 to the outside of the projection imaging device 302 (such as, for example, onto a projection plane). The projection unit 311 projects an image by using a laser beam as a light source and by scanning the laser beam using a Micro Electro Mechanical System (MEMS). Obviously, the projection unit 311 may have any light source, not limited to the laser beam. For example, the light source may be a Light Emitting Diode (LED), xenon, or the like.

The imaging unit 312 is controlled by the control unit 401 to capture an image of an object (such as, for example, a projection plane) outside the apparatus, generate a captured image, and supply the captured image to the control unit 401. For example, the imaging unit 312 captures an image of a projection image projected onto a projection plane by the projection unit 311. The imaging unit 312 includes, for example, an image sensor using a Complementary Metal Oxide Semiconductor (CMOS), an image sensor using a Charge Coupled Device (CCD), or the like, and photoelectrically converts light from a subject and generates an electric signal (data) of a captured image using the image sensor.

The input unit 411 includes an input device that receives external information (such as user input). The input unit 411 includes, for example, operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like. In addition, various sensors (such as an optical sensor and a temperature sensor) may also be included in the input unit 411. The output unit 412 includes an output device that outputs information (such as images and sounds). For example, the output unit 412 includes a display, a speaker, an output terminal, and the like.

The storage unit 413 includes, for example, a hard disk, a RAM disk, a nonvolatile memory, and the like. For example, the communication unit 414 includes a network interface. For example, the communication unit 414 is connected to the communication cable 303 and is capable of communicating with the control device 301 connected through the communication cable 303. Note that the communication unit 414 may have a wired communication function, a wireless communication function, or both. The drive 415 drives a removable medium 421 such as, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.

< projection Unit >

Fig. 21 is a block diagram illustrating an exemplary main configuration of the projection unit 311. As illustrated in fig. 21, the projection unit 311 includes a video processor 431, a laser driver 432, a laser output unit 433-1, a laser output unit 433-2, a laser output unit 433-3, a mirror 434-1, a mirror 434-2, a mirror 434-3, a MEMS driver 435, and a MEMS mirror 436.

The video processor 431 retains the image supplied from the control unit 401 and performs necessary image processing on the image. The video processor 431 provides the image to be projected to the laser driver 432 and the MEMS driver 435.

The laser driver 432 controls the laser output units 433-1 to 433-3 to project an image provided from the video processor 431. For example, the laser output units 433-1 to 433-3 output laser beams (wavelength bands) of mutually different colors, such as, for example, red, blue, and green. In other words, the laser driver 432 controls the output of each color of laser light to project an image provided from the video processor 431. Note that in the case where it is not necessary to distinguish the laser output units 433-1 to 433-3 in the description, the laser output units 433-1 to 433-3 will be referred to as the laser output unit(s) 433.

The mirror 434-1 reflects the laser beam output from the laser output unit 433-1 and guides the laser beam to the MEMS mirror 436. The mirror 434-2 reflects the laser beam output from the laser output unit 433-2 and guides the laser beam to the MEMS mirror 436. The mirror 434-3 reflects the laser beam output from the laser output unit 433-3 and guides the laser beam to the MEMS mirror 436. Note that in the case where the mirrors 434-1 to 434-3 do not have to be distinguished in the description, the mirrors 434-1 to 434-3 will be referred to as the mirror(s) 434.

The MEMS driver 435 controls the driving of the mirror in the MEMS mirror 436 to project the image provided from the video processor 431. For example, the MEMS mirror 436 scans the laser beam of each color, as in the example of fig. 22, by driving the mirror attached to the MEMS according to the control of the MEMS driver 435. The laser beams are output to the outside of the device through, for example, a projection aperture and radiated onto the projection plane. With this arrangement, the image provided from the video processor 431 is projected onto the projection plane.

Note that the example of fig. 21 is described as being provided with three laser output units 433 that output laser beams of three colors, but there may be any number of laser beams (or any number of colors). For example, there may be four or more laser output units 433, or there may be two or less laser output units 433. In other words, the number of laser beams output from the projection imaging device 302 (projection unit 311) may be two or less or four or more. Further, the number of colors of the laser beam output from the projection imaging device 302 (projection unit 311) may be two or less or four or more. Also, the mirror 434 and the MEMS mirror 436 may be configured in any manner and are not limited to the example in fig. 21. Obviously, any laser beam scanning pattern may be used.

< flow of geometry correction processing >

Next, processes performed in the projection imaging system 300 having such a configuration will be described. As described above, in the projection imaging system 300, the control device 301 controls each projection imaging device 302, performs corresponding point detection between each projection unit 311 and each imaging unit 312 by online sensing according to the ISL method while an image is being projected, estimates, based on the corresponding points, the postures and the like of each projection unit 311 and each imaging unit 312, performs projection plane formation and the like, and performs geometric correction on the images to be projected.

An example of the flow of the geometry correction processing executed in the control device 301 to implement the above-described processing will be described with reference to the flowchart in fig. 23.

When the geometry correction processing is started, in step S101, the projection imaging processing unit 351 of the control device 301 executes projection imaging processing, performing processing related to the control of projection and image capturing. For example, the projection imaging processing unit 351 causes the projection imaging devices 302 to project a structured light pattern and to capture images of the projected image. These processes related to projecting the structured light pattern and capturing images of the projected image will be described in detail later, but include, for example, processes like those described with reference to fig. 7 and the like.

In step S102, the corresponding point detection processing unit 352 executes corresponding point detection processing, performing processing related to corresponding point detection. For example, the corresponding point detection processing unit 352 detects corresponding points based on the captured images obtained by the processing in step S101. The corresponding point detection processing will be described in detail later, but includes, for example, processes like those described in < system homography transform >, < algorithm homography transform >, and the like in <1.ISL method and corresponding point detection >.

In step S103, the geometry correction processing unit 353 estimates the postures of each projection unit 311 and each imaging unit 312 (or each projection imaging device 302) using the detected corresponding points and performs projection screen reconfiguration. The projection screen reconfiguration refers to a process of estimating the shape of the projection screen serving as a projection plane.
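
Note that the following is a hedged sketch, in Python with OpenCV, of this kind of pose estimation and screen reconstruction from corresponding points; the function names, and the use of the essential-matrix decomposition and triangulation in particular, are illustrative assumptions rather than the exact procedure of the present disclosure.

```python
# Hedged sketch: relative pose of projection unit and imaging unit, plus
# triangulated 3D points on the projection plane, from corresponding points.
import cv2
import numpy as np

def normalize(pts, K):
    """Pixel coordinates -> normalized camera coordinates for intrinsics K."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return (np.linalg.inv(K) @ pts_h.T).T[:, :2]

def estimate_pose_and_screen(pts_proj, pts_cam, K_proj, K_cam):
    n_proj = normalize(pts_proj.astype(np.float64), K_proj)
    n_cam = normalize(pts_cam.astype(np.float64), K_cam)
    # Relative pose (rotation R, translation direction t) between the units.
    E, _ = cv2.findEssentialMat(n_proj, n_cam, np.eye(3),
                                method=cv2.RANSAC, threshold=1e-3)
    _, R, t, _ = cv2.recoverPose(E, n_proj, n_cam, np.eye(3))
    # Projection screen reconfiguration: triangulate 3D points on the
    # projection plane (up to scale, since t is a unit vector).
    P0 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P1 = np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P0, P1, n_proj.T, n_cam.T)
    return R, t, (pts4d[:3] / pts4d[3]).T
```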

In step S104, based on the processing results of the pose estimation and the projection screen reconfiguration in step S103, the geometric correction processing unit 353 performs geometric correction on the image to be projected from each projection unit 311 as necessary.

When the geometric correction ends, the geometric correction processing ends. The control device 301 performs this geometric correction process on all combinations of the projection unit(s) 311 and the imaging unit(s) 312.

< flow of projection imaging processing >

Next, an example of the flow of the projection imaging process performed in step S101 of fig. 23 will be described with reference to the flowchart in fig. 24.

When the projection imaging process is started, in step S121, the processing control unit 361 selects the projection unit 311 to be processed from among the unprocessed projection units 311.

In step S122, the projection control unit 362 performs processing related to the projection unit 311 to be processed projecting the positive image of the structured light pattern. For example, the projection control unit 362 acquires a positive image of the structured light pattern as illustrated in fig. 8 or 9 and a content image serving as an input image. Subsequently, for example, as illustrated in fig. 7, the projection control unit 362 superimposes the positive image on the content image to generate a superimposed image. Also, the projection control unit 362 supplies the superimposed image to the projection unit 311 to be processed, which has been selected in step S121, through the communication unit 334, the cable 303, and the like, and, for example, causes the superimposed image to be projected as illustrated in fig. 7. After the control, the projection unit 311 to be processed acquires the superimposed image or the like supplied from the control device 301 (projection control unit 362) through the communication unit 414, and projects the superimposed image to the projection plane at predetermined timing.
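
Note that the following is a minimal sketch, in Python, of generating such a superimposed image, assuming the pattern is held as a signed luminance offset (the positive image adds +delta and the negative image adds -delta); the names are illustrative.

```python
# Minimal sketch: superimpose the structured light pattern onto the content
# image, clipping to the displayable 8-bit range.
import numpy as np

def superimpose(content: np.ndarray, pattern_delta: np.ndarray) -> np.ndarray:
    out = content.astype(np.int16) + pattern_delta.astype(np.int16)
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage: the positive frame uses +pattern_delta and the negative frame uses
# -pattern_delta, so consecutive frames average back to the content image.
```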

In step S123, the imaging control unit 363 executes processing relating to capturing an image of the projected image by each imaging unit 312. For example, the imaging control unit 363 controls each imaging unit 312, and, for example, as illustrated in fig. 7, causes each imaging unit 312 to capture a projection image (a projection image of a superimposed image of a positive image of the structured light pattern and a content image) projected from the projection unit 311 to be processed according to the processing in step S122. After the control, each imaging unit 312 captures an image of the projected image and generates a captured pattern image. Also, each imaging unit 312 supplies the generated shot pattern image to the control device 301 (imaging control unit 363) through the communication unit 414, the cable 303, and the like. The imaging control unit 363 acquires each shot pattern image through the communication unit 334.

In step S124, the projection control unit 362 performs a process similar to the process in step S122 on the negative image of the structured light pattern. For example, the projection control unit 362 acquires a negative image of the structured light pattern as illustrated in fig. 8 or 9 and a content image serving as an input image. Subsequently, for example, as illustrated in fig. 7, the projection control unit 362 superimposes a negative image on the content image to generate a superimposed image. Also, the projection control unit 362 supplies the superimposed image to the projection unit 311 to be processed, which has been selected in step S121, through the communication unit 334, the cable 303, and the like, and, for example, causes the superimposed image to be projected as illustrated in fig. 7. After the control, the projection unit 311 to be processed acquires the superimposed image or the like supplied from the control device 301 (projection control unit 362) through the communication unit 414, and projects the superimposed image to the projection plane at predetermined timing.

In step S125, similar to the processing in step S123, the imaging control unit 363 performs processing related to each imaging unit 312 capturing an image of the projected image. For example, the imaging control unit 363 controls each imaging unit 312 and, for example, as illustrated in fig. 7, causes each imaging unit 312 to capture the projection image (the projection image of the superimposed image of the negative image of the structured light pattern and the content image) projected from the projection unit 311 to be processed according to the processing in step S124. After the control, each imaging unit 312 captures an image of the projected image and generates a captured pattern image. Also, each imaging unit 312 supplies the generated shot pattern image to the control device 301 (imaging control unit 363) through the communication unit 414, the cable 303, and the like. The imaging control unit 363 acquires each shot pattern image through the communication unit 334.

In step S126, the processing control unit 361 determines whether projection and image capturing (each process from step S122 to step S125) have been repeated a predetermined number of times. In order to reduce noise in the captured image (improve the S/N ratio), the processing control unit 361 causes the above-described projection and image capturing to be performed a plurality of times to obtain a plurality of captured pattern images including the same type of structured light pattern. Therefore, the processing control unit 361 makes the determination as described above in step S126. Subsequently, in a case where it is determined that the predetermined number of times has not been reached, the process returns to step S122 and is repeated from this point.

In the case where the processing from step S122 to step S126 is repeatedly performed as described above and it is determined in step S126 that the processing has been repeated a predetermined number of times, the processing proceeds to step S127.

In step S127, the processing control unit 361 determines whether each process from step S122 to step S125 has been performed for all the projection units 311. The processing control unit 361 causes each processing from step S122 to step S125 to be performed on all the projection units 311. Therefore, the processing control unit 361 makes the determination as described above in step S127. Subsequently, in a case where it is determined that there is an unprocessed projection unit 311, the process returns to step S121. When the process returns to step S121, in step S121, a new projection unit 311 is selected as the projection unit 311 to be processed, and the processes from step S122 to step S127 are performed on the newly selected projection unit 311.

In other words, in the case where there are a plurality of projection units 311 (or projection imaging devices 302), the processing from step S121 to step S127 is repeatedly performed as described above, and the image of the structured light pattern is projected in turn from each projection unit 311. Further, in the case where there are a plurality of imaging units 312 (or projection imaging devices 302), each imaging unit 312 captures an image of the projected image projected from each projection unit 311 (in other words, the plurality of imaging units 312 capture images of the same projected image). In the case where it is determined in step S127 that the processing has been performed for all the projection units 311, the projection imaging processing ends, and the processing returns to fig. 23.

< flow of corresponding Point detection processing >

Next, an example of the flow of the corresponding point detection processing executed in step S102 of fig. 23 will be described with reference to the flowchart in fig. 25.

When the corresponding point detection processing is started, the control unit 371 selects a captured pattern image to be processed from the unprocessed captured pattern images in step S141.

In step S142, the noise reduction unit 372 adds the captured pattern image to be processed, selected in step S141, to the other captured images of the projected image of the composite image (superimposed image) in which a pattern image of the same type as the one included in that captured pattern image is synthesized with (superimposed on) the content image (that is, adds it to the captured pattern images including the same type of pattern image), thereby reducing noise in the captured image (improving the S/N ratio).
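
Note that the following is a minimal sketch, in Python, of this noise-reduction step, assuming `captures` is a list of captured pattern images that all contain the same type of pattern (all positive or all negative); for uncorrelated sensor noise, combining N captures improves the S/N ratio by roughly a factor of sqrt(N).

```python
# Minimal sketch: combine repeated captures of the same pattern type to
# suppress sensor noise before the difference image is computed.
import numpy as np

def reduce_noise(captures):
    stack = np.stack([c.astype(np.float32) for c in captures])
    return stack.mean(axis=0)  # summing works equally; only the scale differs
```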

In step S143, the pattern difference image generation unit 373 generates a pattern difference image between the captured pattern images whose noise has been reduced by the processing in step S142 and which include mutually different types of pattern images (positive and negative).

In step S144, for example, as described in < system homography transform > and the like in <1.ISL method and corresponding point detection >, the system homography transform unit 374 applies homography transform (system homography transform) to the pattern difference image obtained by the processing in step S143 based on the design values of the projection unit 311 and the imaging unit 312. For example, the system homography transformation unit 374 calculates the system homography matrix Hs from the four corner points of the projection image using the design values of the projection unit 311 and the imaging unit 312. Subsequently, the system homography transformation unit 374 performs system homography transformation on the pattern difference image obtained by the processing in step S143 using the system homography matrix Hs.
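
Note that the following is a hedged sketch, in Python with OpenCV, of computing and applying such a system homography matrix Hs from four corner correspondences; the corner coordinates, target resolution, and variable names are illustrative placeholders, not design values from the present disclosure.

```python
# Hedged sketch: system homography transform from the four corners of the
# projection image, as predicted in the imaging unit's coordinate system.
import cv2
import numpy as np

# Illustrative corner positions (pixels) of the projection image within the
# captured image; real values follow from the units' design values.
corners_in_capture = np.float32([[112, 80], [1180, 64], [1210, 660], [90, 700]])
W, H = 1280, 720  # assumed front-view resolution
corners_front = np.float32([[0, 0], [W, 0], [W, H], [0, H]])

Hs = cv2.getPerspectiveTransform(corners_in_capture, corners_front)
pattern_diff_image = np.zeros((720, 1280), np.float32)  # placeholder input
front_view = cv2.warpPerspective(pattern_diff_image, Hs, (W, H))
```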

In step S145, for example, as described in < system homography transform > and the like in <1.ISL method and corresponding point detection >, the corresponding point detecting unit 375 detects corresponding points between the pixels of the projection unit 311 and the pixels of the imaging unit 312 by using the pattern in the system-homography-transformed pattern difference image obtained by the processing in step S144.

In step S146, for example, as described in < algorithm homography transform > and the like in <1.ISL method and corresponding point detection >, the algorithm homography transform unit 376 calculates a homography transform from the corresponding points detected by the processing in step S145. For example, the algorithm homography transform unit 376 calculates the algorithm homography matrix Ha using the corresponding points detected by the processing in step S145.

In step S147, for example, as described in < algorithm homography transform > and the like in <1.ISL method and corresponding point detection >, the algorithm homography transform unit 376 applies the homography transform based on the corresponding points (algorithm homography transform) to the system-homography-transformed pattern difference image obtained by the processing in step S144. For example, the algorithm homography transform unit 376 performs the algorithm homography transform on that pattern difference image using the algorithm homography matrix Ha obtained through the processing in step S146.
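
Note that the following is a hedged sketch, in Python with OpenCV, of this re-estimation step: the homography is computed from the provisional corresponding points themselves rather than from design values, and is then applied to the difference image. The function name and the use of RANSAC are illustrative assumptions.

```python
# Hedged sketch: algorithm homography transform estimated from provisional
# corresponding points, then applied to the (front-view) difference image.
import cv2
import numpy as np

def algorithm_homography(pts_capture, pts_projector, diff_image, proj_size):
    # pts_capture/pts_projector: Nx2 provisional corresponding points in the
    # front-view captured frame and in the projector frame, respectively.
    Ha, inliers = cv2.findHomography(pts_capture, pts_projector,
                                     cv2.RANSAC, 3.0)
    # Warp so each pattern appears with (nearly) its original shape and size.
    return Ha, cv2.warpPerspective(diff_image, Ha, proj_size)
```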

In step S148, for example, as described in < algorithm homography transform > and the like in <1.ISL method and corresponding point detection >, the corresponding point detecting unit 377 detects corresponding points between the pixels of the projection unit 311 and the pixels of the imaging unit 312 by using the pattern in the algorithm-homography-transformed pattern difference image obtained by the processing in step S147.

In step S149, for example, as described in < system homography transform >, < algorithm homography transform >, and the like in <1.ISL method and corresponding point detection >, the inverse homography transform unit 378 applies an inverse homography transform, which is the inverse of the homography transforms described above, to the corresponding points calculated by the processing in step S148. For example, the inverse homography transform unit 378 applies, to the corresponding points calculated by the processing in step S148, the inverse algorithm homography transform that is the inverse of the processing in step S147 and the inverse system homography transform that is the inverse of the processing in step S144.
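
Note that the following is a minimal sketch, in Python with OpenCV, of restoring detected points by applying the two inverse transforms; `Hs` and `Ha` denote the matrices from the two forward transforms sketched above, and the names are illustrative.

```python
# Minimal sketch: undo the algorithm and system homography transforms so the
# detected points are expressed in the original captured-image frame.
import cv2
import numpy as np

def restore_points(points, Hs, Ha):
    pts = np.asarray(points, np.float64).reshape(-1, 1, 2)
    pts = cv2.perspectiveTransform(pts, np.linalg.inv(Ha))  # inverse algorithm H
    pts = cv2.perspectiveTransform(pts, np.linalg.inv(Hs))  # inverse system H
    return pts.reshape(-1, 2)
```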

In step S150, the control unit 371 determines whether all shot pattern images have been processed. In the case where it is determined that there is an unprocessed captured pattern image, the process returns to step S141. Subsequently, in step S141, a new unprocessed captured pattern image is selected as the captured pattern image to be processed. Further, the processing from step S142 to step S150 is performed on the newly selected shot pattern image to be processed.

In this way, each process from step S141 to step S150 is repeatedly executed, and in the case where it is determined in step S150 that all shot pattern images have been processed, the corresponding point detection processing ends and the processing returns to fig. 23. In other words, each processing unit of the corresponding point detection processing unit 352 executes processes like those described with reference to fig. 12, fig. 13, and the like.

As described in <1.ISL method and corresponding point detection >, by performing each process as described above, it is possible to suppress a decrease in accuracy of corresponding point detection.

< comparison of the number of detected corresponding points >

Next, the influence of the homography transform on the number of detected corresponding points will be described in more detail. For example, in the case where a simulation of detecting corresponding points is performed using the shot pattern image before the homography transform is applied (for example, the shot pattern image 122 in fig. 12), the number of successfully detected corresponding points is 415. In the case where the system homography transform is applied to the shot pattern image and the simulation is performed using the shot pattern image after the system homography transform (for example, the shot pattern image 123 in fig. 12), the number of successfully detected corresponding points increases to 461. Further, in the case where the algorithm homography transform is additionally applied and the simulation is performed using the shot pattern image after the algorithm homography transform (for example, the shot pattern image 124 in fig. 12), the number of successfully detected corresponding points increases to 735.

In other words, by applying the homography transform to the shot pattern image and performing the corresponding point detection as described above, it is possible to suppress a reduction in the number of detected corresponding points. In general, increasing the number of detected corresponding points makes it possible to perform geometric correction by using more accurate corresponding points or based on more information, and therefore, the accuracy of geometric correction can be improved. Since the accuracy of the geometric correction can be improved, this is equivalent to being able to improve the accuracy of the detection of the corresponding points. In other words, by applying the homography transform to the shot pattern image and performing the corresponding point detection as described above, it is possible to suppress the accuracy of the corresponding point detection from being lowered.

< comparison of detection accuracy of corresponding Point >

Next, the influence of the homography transformation on the accuracy of the corresponding point detection will be described more specifically. A of fig. 26 is a schematic diagram illustrating an example of the corresponding point detection result and the accuracy thereof in the case where the corresponding point is detected using the shot pattern image before the homography conversion is performed (for example, the shot pattern image 122 in fig. 12). B of fig. 26 is a schematic diagram illustrating an example of the corresponding point detection result and the accuracy thereof in the case where the corresponding point is detected using the shot pattern image (for example, the shot pattern image 124 in fig. 12) after the system homography transform and the algorithm homography transform are performed as the homography transform.

In A of fig. 26 and B of fig. 26, each circle represents a corresponding point detected in the coordinates of the image to be projected. Also, the hue of each circle represents the magnitude of the error of the corresponding point detection, with darker hues indicating larger errors. As is clear from the comparison between A of fig. 26 and B of fig. 26, the case where corresponding points are detected using the shot pattern image 124 (B of fig. 26) has less error than the case where corresponding points are detected using the shot pattern image 122 (A of fig. 26). In other words, by applying the homography transform to the shot pattern image and performing corresponding point detection as described above, it is possible to suppress a decrease in the accuracy of corresponding point detection.

< comparison of detection accuracy of corresponding Point >

Next, the accuracy of corresponding point detection is compared between the case where the imaging unit 312 has an ultra-short focus and corresponding point detection is performed according to the method described above (that is, the case where an image is captured from the vicinity of the projection plane) and the case where the imaging unit 312 has a long focus (that is, the case where an image is captured from the front of the projection plane).

An example of the corresponding point detection result and its accuracy in the case where an image is taken from the vicinity of the projection plane is illustrated in fig. 27. Also, an example of the corresponding point detection result and its accuracy in the case of taking an image from the front of the projection plane is illustrated in fig. 28. As clearly shown in fig. 27 and 28, the accuracy of the corresponding point detection does not greatly vary between the two cases. In other words, by applying the present technology as described above, even if the imaging unit 312 is set to have an ultra-short focus, it is possible to obtain substantially the same corresponding point detection accuracy as in the case where the imaging unit 312 is set to have a long focus. In other words, by applying the present technology, it is possible to suppress a decrease in the accuracy of the corresponding point detection.

< projection imaging apparatus >

Note that, in fig. 19, the projection unit 311 and the imaging unit 312 are described as being provided at mutually different positions in the housing of the projection imaging device 302, but the configuration is not limited thereto, and the projection unit 311 and the imaging unit 312 may also be provided coaxially. In the case of the example in fig. 29, the projection imaging unit 451 is provided in the housing of the projection imaging apparatus 302. The projection imaging unit 451 includes a projection unit 311 and an imaging unit 312 provided coaxially with the optical system of the projection unit 311.

By adopting such a configuration, no additional optical system needs to be added, and the housing of the projection imaging apparatus 302 can be made more compact than in the case of fig. 19. Also, by using the housings of a plurality of projection imaging devices 302, a baseline exists between their optical systems, so distortion of the projected image can be corrected.

Note that in the housing of the projection imaging device 302, the positions, postures, angles of view, and the like of the projection unit 311 and the imaging unit 312 may also be variable. However, in order to make it possible to easily implement the system homography transformation, it is preferable that the above information is known information or a measurement function capable of easily determining the above information is provided.

< Pattern image >

Note that although it is described above that the pattern image 100 as illustrated in fig. 8 and 9 is used, the pattern image may be any image without being limited to these examples. For example, the shape, size, position, longitudinal direction, luminance change direction, and the like of the pattern may be set in any manner. Also, any number of pattern images may be used for corresponding point detection. The corresponding point may be detected from a single pattern image, or may be detected by using three or more pattern images. In addition, for example, a pattern image to be used may be adaptively selected from a plurality of candidates including mutually different types of pattern images according to a content image or the like. Alternatively, the existing pattern image may be adaptively modified according to the content image or the like. Further, a new pattern image can be adaptively generated according to a content image or the like.
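
Note that the following is a hedged sketch, in Python, of generating one such pattern-image pair: two soft elliptical blobs with different longitudinal directions and opposite luminance offsets, emitted as a positive/negative pair. All sizes, positions, and the offset `delta` are illustrative assumptions.

```python
# Hedged sketch: a structured light pattern built from elliptical blobs with
# mutually opposite luminance change directions, plus its negative image.
import numpy as np

def elliptical_blob(h, w, cy, cx, ry, rx, angle_deg):
    # Soft elliptical blob in [0, 1] with the given center, radii, and
    # longitudinal direction.
    y, x = np.mgrid[0:h, 0:w]
    a = np.deg2rad(angle_deg)
    u = (x - cx) * np.cos(a) + (y - cy) * np.sin(a)
    v = -(x - cx) * np.sin(a) + (y - cy) * np.cos(a)
    return np.exp(-0.5 * ((u / rx) ** 2 + (v / ry) ** 2))

h, w, delta = 720, 1280, 8  # delta: peak luminance offset (assumed)
pattern = delta * (elliptical_blob(h, w, 200, 300, 12, 30, 0)      # brighter
                   - elliptical_blob(h, w, 200, 400, 12, 30, 90))  # darker
positive, negative = pattern, -pattern  # opposite luminance change directions
```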

< method for detecting corresponding Point >

In addition, although it is described above that the ISL method is used, the corresponding point detection method may be any method as long as the method involves using the pattern image, and is not limited to the ISL method.

In addition, it is not necessary to superimpose the pattern image on a content image. In other words, the shot pattern image may also be obtained by capturing an image of the projected pattern image alone, without superimposing the pattern image on a content image. The shot pattern image in this case includes the pattern image but does not include a content image; even to such a shot pattern image, the homography transforms can be applied similarly to the cases described earlier.

<3. second embodiment >

< other exemplary configurations of projection imaging System and projection imaging apparatus >

Note that the exemplary configuration of the projection imaging system to which the present technology is applied is not limited to the examples described above. For example, like the projection imaging system 500 illustrated in A of fig. 30, the control device 301 and each projection imaging device 302 may also be connected to each other through a network 501.

For example, the network 501 may include any communication network or communication channel according to any communication standard, such as the internet, a public telephone network, a wide-area communication network for wireless mobile stations (such as a so-called 3G or 4G network), a WAN (wide area network), a LAN (local area network), a wireless communication network that performs communication conforming to the Bluetooth (registered trademark) standard, a communication channel for short-range wireless communication such as Near Field Communication (NFC), a communication channel for infrared communication, or a communication network for wired communication conforming to a standard such as High-Definition Multimedia Interface (HDMI (registered trademark)) or Universal Serial Bus (USB).

The control device 301 and each projection imaging device 302 are communicably connected to the network 501. Note that the connection may be wired (i.e., a connection through wired communication), wireless (i.e., a connection through wireless communication), or both. Note that the number of each of the devices, the shape and size of the housing, the arrangement position, and the like may be set in any manner.

The control device 301 and each projection imaging device 302 can communicate (exchange information and the like) with each other through the network 501. In other words, the control apparatus 301 and each projection imaging apparatus 302 may also be communicably connected to each other through other devices (apparatus, transmission channel, etc.).

Even in the case where the projection imaging system 500 has such a configuration, the present technology can be applied similarly to the case of the projection imaging system 300 described in the first embodiment, and the effects described earlier can be exhibited.

In addition, the projection unit 311 and the imaging unit 312 may also be configured as devices different from each other, for example, as in the projection imaging system 510 illustrated in B of fig. 30. Instead of projection imaging device 302, projection imaging system 510 includes projection devices 511-1 to 511-N (where N is any natural number) and imaging devices 512-1 to 512-M (where M is any natural number). The projection devices 511-1 to 511-N include projection units 311 (projection units 311-1 to 311-N), respectively, and project images, respectively. The imaging devices 512-1 to 512-M respectively include the imaging units 312 (imaging units 312-1 to 312-M), and respectively take images of projection planes (projection images projected by the projection unit 311).

In the case where it is not necessary to distinguish the projection devices 511-1 to 511-N in the description, the projection devices 511-1 to 511-N will be referred to as a projection device(s) 511. In the case where it is not necessary to distinguish the imaging devices 512-1 to 512-M in the description, the imaging devices 512-1 to 512-M will be referred to as the imaging device(s) 512.

Each projection device 511 and each imaging device 512 are respectively communicably connected to the control device 301, and can communicate (exchange information) with the control device 301 by wired communication, wireless communication, or both. Note that each projection device 511 and each imaging device 512 may also be configured to communicate with other projection devices 511, other imaging devices 512, or both, through the control device 301.

Also, the number of each device, the shape and size of the housing, the arrangement position, and the like may be set in any manner. Also, as in the example in A of fig. 30, each apparatus may be communicably connected to the others through other devices (apparatuses or transmission channels), like the network 501.

Even in the case where the projection imaging system 510 has such a configuration, the present technology can be applied similarly to the case of the projection imaging system 300 described in the first embodiment, and the effects described earlier can be exhibited.

Further, the control device 301 may also be omitted, for example, as in the projection imaging system 520 illustrated in A of fig. 31. As illustrated in A of fig. 31, the projection imaging system 520 includes projection imaging devices 521-1 to 521-N (where N is any natural number). In the case where it is not necessary to distinguish the projection imaging devices 521-1 to 521-N in the description, the projection imaging devices 521-1 to 521-N will be referred to as the projection imaging device(s) 521. The projection imaging devices 521 may be communicably connected to each other by a communication cable 522, or may be communicably connected to each other by wireless communication.

The projection imaging devices 521-1 to 521-N include control units 523-1 to 523-N, respectively. In the case where it is not necessary to distinguish the control units 523-1 to 523-N in the description, the control units 523-1 to 523-N will be referred to as control unit(s) 523. The control unit 523 has a similar function to the control device 301, and can perform a similar process.

In other words, in the case of the projection imaging system 520, the processing performed in the control apparatus 301 described above is performed in (the control unit 523 of) the projection imaging apparatus 521. Note that (the control unit 523 of) any projection imaging apparatus 521 may be configured to execute all of the processes executed in the control apparatus 301, or (the control unit 523 of) a plurality of projection imaging apparatuses 521 may be configured to cooperatively execute the processes by exchanging information and the like with each other.

Even in the case where the projection imaging system 520 has such a configuration, the present technology can be applied similarly to the case of the projection imaging system 300 described in the first embodiment, and the effects described earlier can be exhibited.

Further, for example, as illustrated in B of fig. 31, the projection imaging system 300 may also be configured as a single apparatus. The projection imaging apparatus 530 illustrated in B of fig. 31 includes a projection unit 311 (projection units 311-1 to 311-N (where N is any natural number)), an imaging unit 312 (imaging units 312-1 to 312-M (where M is any natural number)), and a control unit 523.

In the projection imaging device 530, by executing the processing executed in the control device 301 described above, the control unit 523 controls each projection unit 311 and each imaging unit 312 to detect the corresponding point and the like.

Therefore, even in the case where the projection imaging device 530 has such a configuration, the present technology can be applied similarly to the case of the projection imaging system 300 described in the first embodiment, and the effects described earlier can be exhibited.

<4. others >

< software >

The series of processes described above may be executed by hardware, and may also be executed by software. Moreover, some processes may be executed by hardware while the other processes are executed by software. In the case where the series of processes described above is executed by software, the programs, data, and the like forming the software are installed through a network or from a recording medium.

For example, in the case of the control device 301 in fig. 15, a recording medium is configured as a removable medium 341 separate from the main body of the device; the removable medium 341 has the program, data, or the like recorded thereon and is distributed to deliver the program, data, or the like to the user. In this case, for example, by loading the removable medium 341 into the drive 335, the program, data, or the like stored in the removable medium 341 can be read out and installed in the storage unit 333.

As another example, in the case of the projection imaging apparatus 302 in fig. 20, a recording medium is configured as a removable medium 421 separate from the main body of the apparatus; the removable medium 421 has the program, data, or the like recorded thereon and is distributed to deliver the program, data, or the like to the user. In this case, for example, by loading the removable medium 421 into the drive 415, the program, data, or the like stored in the removable medium 421 can be read out and installed in the storage unit 413.

Further, the programs, data, and the like may also be provided via a wired or wireless transmission medium (such as a local area network, the internet, or digital satellite broadcasting). For example, in the case of the control apparatus 301 in fig. 15, the programs, data, and the like may be received by the communication unit 334 and installed in the storage unit 333. As another example, in the case of the projection imaging apparatus 302 in fig. 20, the programs, data, and the like may be received by the communication unit 414 and installed in the storage unit 413.

Alternatively, the programs, data, and the like may be installed in advance. For example, in the case of the control device 301 in fig. 15, they may be installed in advance in the storage unit 333, the ROM 322, and the like. As another example, in the case of the projection imaging device 302 in fig. 20, they may be installed in advance in the storage unit 413, a ROM (not shown) built into the control unit 401, and the like.

< supplement >

Embodiments of the present technology are not limited to the embodiments described above, and various changes may be made without departing from the scope of the present technology.

For example, the present technology may also be implemented as any configuration constituting part of an apparatus or a system: for example, as a processor such as a system Large Scale Integration (LSI), a module using a plurality of processors or the like, a unit using a plurality of modules or the like, or a set in which other functions are further added to a unit (that is, as a partial configuration of a device).

Note that in this specification, a system refers to a set of a plurality of constituent elements (e.g., devices or modules (parts)), regardless of whether all of the constituent elements are in the same casing. Therefore, a plurality of devices housed in separate casings and connected via a network, and a single device with a plurality of modules housed in one casing, are both systems.

Also, each of the processing units described above may be implemented by any configuration as long as the configuration has the functions described for that processing unit. For example, a processing unit may be configured by using any type of circuit, LSI, system LSI, processor, module, unit, set, apparatus, device, system, or the like. In addition, a plurality of these may be used in combination. For example, the same type of configuration, such as a plurality of circuits or a plurality of processors, may be combined, or different types of configurations, such as a circuit and an LSI, may be combined.

Further, for example, elements described above as a single device (or processing unit) may be divided and configured as a plurality of devices (or processing units). Conversely, elements described above as a plurality of devices (or processing units) may be collectively configured as a single device (or processing unit). Furthermore, elements other than those described above may be added to the configuration of each device (or each processing unit). In addition, part of the configuration of a given device (or processing unit) may be included in the configuration of another device (or another processing unit), as long as the configuration and operation of the system as a whole remain substantially the same.

In addition, for example, the present technology may employ a cloud computing configuration in which a single function is shared and processed jointly by a plurality of devices through a network.

In addition, for example, the above-described program may be executed in any apparatus. In this case, it is sufficient if the apparatus has necessary functions (function blocks, etc.) and can obtain necessary information.

In addition, in the case where a plurality of processes are included in a single step, the plurality of processes included in that step may be executed by a single apparatus or may be shared and executed by a plurality of apparatuses.

In a program executed by a computer, the processes in the steps describing the program may be executed chronologically in the order described in this specification, or may be executed in parallel, or may be executed individually at necessary timing (such as when called).

For example, part or all of the present technology described in any embodiment may be implemented in combination with part or all of the present technology described in another embodiment. In addition, part or all of any of the present technology described above may be implemented in combination with other technology not described above.

Further, the present technology can also be configured as follows.

(1)

An image processing apparatus including:

a corresponding point detection unit that applies a homography transform to a captured pattern image obtained as a result of the imaging unit capturing an image of the predetermined structured light pattern projected by the projection unit, and detects a corresponding point between the projection image projected by the projection unit and the captured image captured by the imaging unit using the captured pattern image to which the homography transform is applied.

(2)

The image processing apparatus according to (1), wherein

The corresponding point detecting unit applies homography transformation based on design values of the projecting unit and the imaging unit so as to convert the captured pattern image into a coordinate system as seen from the front, and detects the corresponding point by using the captured pattern image converted into the coordinate system as seen from the front.

(3)

The image processing apparatus according to (2), wherein

The corresponding point detecting unit converts coordinates of four corners of the projection image projected by the projecting unit into a coordinate system of the imaging unit based on the design value, and applies homography transformation to the shot pattern image using the converted coordinates of the four corners.

(4)

The image processing apparatus according to (2) or (3), wherein

The corresponding point detecting unit applies an inverse homography transform that is an inverse transform of the homography transform to the detected corresponding point.

(5)

The image processing apparatus according to (1), wherein

The corresponding point detection unit

applies the homography transform based on design values of the projection unit and the imaging unit so as to convert the shot pattern image into a coordinate system as seen from the front, and detects temporary corresponding points by using the shot pattern image converted into the coordinate system as seen from the front, and

also applies a homography transform based on the detected temporary corresponding points so as to convert the shot pattern image, converted into the coordinate system as seen from the front, into the coordinate system of the projection image projected by the projection unit, and detects the corresponding points by using the shot pattern image converted into the coordinate system of the projection image.

(6)

The image processing apparatus according to (5), wherein

The corresponding point detecting unit applies an inverse homography transform that is an inverse transform of the homography transform to the detected corresponding point.

(7)

The image processing apparatus according to any one of (1) to (6), wherein

The captured pattern image is an image obtained by using a captured image of the structured light pattern projected while superimposed on another image.

(8)

The image processing apparatus according to (7), wherein

The shot pattern image is a differential image of respective shot images including two projected images of the structured light pattern, the two projected images having the same shape as each other and also having luminance change directions opposite to each other.

(9)

The image processing apparatus according to (8), wherein

The shot pattern images are differential images between composite images that include the structured light pattern and have mutually opposite luminance change directions, each of the composite images being obtained by adding together the corresponding shot images of a plurality of projected images that include the structured light pattern with mutually the same luminance change direction.

(10)

The image processing apparatus according to any one of (1) to (9), wherein

The structured light pattern comprises two elliptical patterns having mutually opposite directions of brightness variation.

(11)

The image processing apparatus according to (10), wherein

The structured light pattern includes a plurality of patterns having elliptical shapes with different longitudinal directions.

(12)

The image processing apparatus according to any one of (1) to (11), further comprising:

a projection unit.

(13)

The image processing apparatus according to (12), wherein

The projection unit is positioned proximate to the projection plane.

(14)

The image processing apparatus according to (12) or (13), wherein

The projection unit projects the same structured light pattern multiple times.

(15)

The image processing apparatus according to (12), wherein

a plurality of the projection units are provided, and

each of the projection units projects the structured light pattern in turn.

(16)

The image processing apparatus according to any one of (1) to (15), further comprising:

an imaging unit.

(17)

The image processing apparatus according to (16), wherein

The imaging unit is positioned proximate to the projection plane.

(18)

The image processing apparatus according to (16) or (17), wherein

The imaging unit captures multiple projected images of the same structured light pattern.

(19)

The image processing apparatus according to any one of (16) to (18), wherein

a plurality of the imaging units are provided, and

each of the imaging units captures an image of the same projected image of the structured light pattern.

(20)

An image processing method including:

the method includes applying a homography transform to a shot pattern image obtained as a result of an imaging unit shooting an image of a predetermined structured light pattern projected by a projection unit, and detecting a corresponding point between a projection image projected by the projection unit and the shot image shot by the imaging unit using the shot pattern image to which the homography transform is applied.

List of reference numerals

100 pattern image

101 pattern

300 projection imaging system

301 control device

302 projection imaging device

311 projection unit

312 imaging unit

351 projection imaging processing unit

352 corresponding point detection processing unit

353 geometric correction processing unit

361 process control unit

362 projection control unit

363 imaging control unit

371 control unit

372 noise reduction unit

373 pattern differential image generating unit

374 system homography transform unit

375 corresponding point detecting unit

376 algorithm homography transform unit

377 corresponding point detecting unit

378 inverse homography transform unit

401 control unit

500 projection imaging system

501 network

510 projection imaging system

511 projection device

512 imaging device

520 projection imaging system

521 projection imaging device

523 control unit

530 projecting the imaging device.
