Three-dimensional measurement device, three-dimensional measurement method, and program

Document No.: 1256220    Publication date: 2020-08-21

Note: This technology, "Three-dimensional measurement device, three-dimensional measurement method, and program," was created by Shinya Matsumoto (松本慎也) and Yasuhiro Onishi (大西康裕) on 2019-02-06. Abstract: The invention provides a three-dimensional measurement device and the like that can improve robustness against variation in imaging conditions and measure the three-dimensional shape of an object with higher resolution. The three-dimensional measurement device includes: a light projecting unit that projects a pattern obtained by encoding data onto an object; an imaging unit that captures an image of the object on which the pattern is projected; and a calculation unit that calculates the position of a three-dimensional point group based on the positions of feature points and the decoded data. The pattern includes a plurality of unit patterns that each represent at least two bits and are used for calculating the position of the three-dimensional point group; each unit pattern includes a first region and a second region having an area larger than that of the first region, and the area ratio between the first region and the second region is 0.3 or more and 0.9 or less.

1. A three-dimensional measurement apparatus, comprising:

a light projection unit that projects a pattern obtained by encoding data with a two-dimensional structure onto an object;

an image pickup unit that picks up an image of the object on which the pattern is projected; and

a calculation unit that extracts feature points of the pattern and calculates positions of a three-dimensional point group representing a three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of minimum-unit patterns, each of which represents at least two bits, includes one of the feature points, and is used for calculating the positions of the three-dimensional point group,

the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region,

an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less.

2. The three-dimensional measurement apparatus according to claim 1,

the unit pattern is a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image.

3. The three-dimensional measurement apparatus according to claim 1 or 2,

an area ratio obtained by dividing the area of the first region by the area of the second region on the image is 0.3 or more and 0.9 or less.

4. The three-dimensional measurement apparatus according to any one of claims 1 to 3, further comprising:

a setting unit that sets a range of the area ratio of the pattern projected by the light projection unit, based on the number of pixels of the short side of the unit pattern on the image.

5. The three-dimensional measurement apparatus according to claim 4,

the setting unit sets the range of the area ratio to be narrower as the number of pixels of the short side of the unit pattern on the image decreases.

6. The three-dimensional measurement apparatus according to any one of claims 1 to 5,

the first region and the second region are distinguished by the brightness of light projected by the light projection unit.

7. The three-dimensional measurement apparatus according to any one of claims 1 to 5,

the first region and the second region are distinguished by a wavelength band of light projected by the light projection unit.

8. The three-dimensional measurement apparatus according to any one of claims 1 to 5,

the first region and the second region are distinguished by polarization of light projected by the light projection unit.

9. The three-dimensional measurement apparatus according to any one of claims 1 to 8,

the unit pattern includes a two-dimensional shape in which the first region is continuous without being separated.

10. The three-dimensional measurement apparatus according to any one of claims 1 to 8,

the unit pattern includes a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

11. The three-dimensional measurement apparatus according to any one of claims 1 to 8,

the pattern includes a unit pattern having a two-dimensional shape in which the first region is continuous without being separated, and a unit pattern having a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

12. The three-dimensional measurement apparatus according to claim 10 or 11,

in the unit pattern, the first region is separated into two parts with the second region interposed therebetween.

13. The three-dimensional measurement apparatus according to claim 10 or 11,

in the unit pattern, the first region is separated into three or more parts with the second region interposed therebetween.

14. The three-dimensional measurement apparatus according to any one of claims 1 to 13,

the light projection unit includes a modulation element that modulates a size of the projected pattern.

15. A three-dimensional measurement method, comprising:

projecting a pattern obtained by encoding data with a two-dimensional structure onto an object;

capturing an image of the object on which the pattern is projected; and

extracting feature points of the pattern, and calculating positions of a three-dimensional point group representing a three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of minimum-unit patterns, each of which represents at least two bits, includes one of the feature points, and is used for calculating the positions of the three-dimensional point group,

the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region,

an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less.

16. A three-dimensional measurement program for causing a computer included in a three-dimensional measurement device to function as a calculation unit, the three-dimensional measurement device comprising: a light projection unit that projects a pattern obtained by encoding data with a two-dimensional structure onto an object; and an image pickup unit that picks up an image of the object on which the pattern is projected,

the calculation unit extracts feature points of the pattern and calculates positions of a three-dimensional point group representing a three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of minimum-unit patterns, each of which represents at least two bits, includes one of the feature points, and is used for calculating the positions of the three-dimensional point group,

the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region,

an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less.

Technical Field

The present invention relates to a three-dimensional measurement device, a three-dimensional measurement method, and a program.

Background

In recent years, systems have come into use that measure the three-dimensional shape of an object by projecting a coded pattern or a random dot pattern onto the object, capturing an image of the object on which the pattern is projected, and analyzing the captured image.

Patent document 1 listed below describes a method of determining the three-dimensional shape of an object by acquiring an image obtained by projecting a pattern onto the object, selecting a reference component from the image, obtaining the relative coordinates of the pattern components other than the reference component, and determining the relative depth of the position of the object by a linear transformation based on the relative coordinates of a geometric model.

Patent document 2 describes a method in which a first image and a second image of an object on which an encoded pattern is projected are captured, and a pixel region of the first image is searched for along an epipolar line in the second image, thereby acquiring distance data from the two-dimensional images of the object.

Disclosure of Invention

Problems to be solved by the invention

Patent documents 1 and 2 describe techniques for measuring the three-dimensional shape of a relatively large object such as a person. When measuring the three-dimensional shape of a relatively small object, however, the object must be measured with higher resolution, so it is conceivable to project the pattern onto the object at a higher density. The inventors found, however, that when measuring the three-dimensional shape of relatively small objects such as bulk components, the pattern may be projected onto a slope of the object and become distorted, the contrast of the pattern may be reduced by the influence of ambient light or of the object's texture, or the pattern may be deformed by irregularities on the object's surface, so that simply making the pattern finer can make the three-dimensional shape difficult to measure. That is, simply making the pattern finer impairs robustness against variation in imaging conditions.

Accordingly, the present invention provides a three-dimensional measurement device, a three-dimensional measurement method, and a program that can measure the three-dimensional shape of an object with higher resolution while improving robustness against changes in imaging conditions.

Means for solving the problems

A three-dimensional measurement device according to an aspect of the present disclosure includes: a light projecting unit that projects a pattern obtained by encoding data with a two-dimensional structure onto an object; an imaging unit that captures an image of the object on which the pattern is projected; and a calculation unit that extracts feature points of the pattern and calculates the positions of a three-dimensional point group representing the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data. The pattern includes a plurality of minimum-unit patterns, each of which represents at least two bits, includes a feature point, and is used for calculating the positions of the three-dimensional point group; the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region, and an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less. Here, the object may be any object, for example a bulk component. The two-dimensional structure of the pattern may be any structure; for example, unit patterns may be laid out in a lattice shape. The feature point of the pattern may be, for example, the center point of the unit pattern. The unit pattern may have any pattern represented by a first region and a second region inside, for example, a square, and may instead be rectangular or parallelogram-shaped. The first region and the second region may be distinguished from each other by the pixel value of each pixel of the image captured by the imaging unit. For example, the first region and the second region may be distinguished by the luminance values of pixels, according to whether or not the luminance value is equal to or greater than a threshold value.

According to the above aspect, since the area ratio of the first region to the second region included in the unit pattern is 0.3 or more and 0.9 or less, even when the imaging condition varies, the first region and the second region can be distinguished, the feature point of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.

In the above aspect, the unit pattern may be a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image. Here, the short side of the unit pattern may be the shortest side among the four sides of the unit pattern.

According to the above aspect, the area ratio of the first region and the second region included in the unit pattern is 0.3 or more and 0.9 or less, and the unit pattern is a quadrangle whose short side is 3 pixels or more on the image, whereby the first region and the second region can be recognized even when the imaging condition varies, the feature points of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Further, the area ratio of the first region and the second region included in the unit pattern is 0.3 or more and 0.9 or less, and the unit pattern is a quadrangle having a short side of 10 pixels or less on the image, whereby a two-dimensional structure in which data is encoded can be arranged at high density, and the density of extracted feature points can be increased. Therefore, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.

In the above aspect, an area ratio of the area of the first region divided by the area of the second region in the image may be 0.3 or more and 0.9 or less.

According to the above aspect, not only the area ratio of the area of the first region divided by the area of the second region of the unit pattern projected by the light projection unit is 0.3 or more and 0.9 or less, but also the area ratio of the area of the first region divided by the area of the second region in the unit pattern imaged by the imaging unit is 0.3 or more and 0.9 or less, whereby the robustness against the variation of imaging conditions can be improved and the three-dimensional shape of the object can be measured with higher resolution.

The above aspect may further include: a setting unit that sets a range of the area ratio of the pattern projected by the light projecting unit, based on the number of pixels of the short side of the unit pattern on the image.

According to the above aspect, by setting the range of the area ratio between the first region and the second region included in the unit pattern according to the number of pixels on the short side of the unit pattern on the image, the range of the area ratio between the first region and the second region can be set according to the density of the unit pattern on the image, and the balance between the resolution of the three-dimensional shape of the measurement target and the robustness against the variation in the imaging conditions can be adjusted.

In the above aspect, the setting unit may be configured to narrow the range of the area ratio as the number of pixels of the short side of the unit pattern on the image decreases.

According to the above aspect, the range of the area ratio between the first region and the second region can be set to be narrowed as the density of the unit pattern on the image becomes higher, and the range of the area ratio between the first region and the second region can be limited to a range in which robustness against a variation in imaging conditions can be secured as the resolution of the three-dimensional shape of the measurement target becomes higher. Therefore, the balance between the resolution of the three-dimensional shape of the measurement target and the robustness against the variation of the imaging conditions can be adjusted.

In the above aspect, the first region and the second region may be distinguished by the brightness of the light projected by the light projection unit.

According to the above aspect, since the first region and the second region are distinguished by the brightness of the light, the pattern can be recognized even by an imaging unit that captures monochrome images, the configuration of the imaging unit can be simplified, and the cost of the three-dimensional measurement device can be reduced.

In the above aspect, the first region and the second region may be distinguished by the wavelength band of the light projected by the light projection unit.

According to the above aspect, since the first region and the second region are distinguished by the wavelength band of light, even when white light such as ambient light is irradiated on the object, the difference between the first region and the second region is not easily changed, and the first region and the second region are easily recognized.

In the above aspect, the first region and the second region may be distinguished by the polarization of the light projected by the light projection unit.

According to the above aspect, since the first region and the second region are distinguished by polarization of light, even when the object is irradiated with light such as ambient light, a difference between the first region and the second region is not easily changed, and the first region and the second region are easily recognized.

In the above aspect, the unit pattern may have a two-dimensional shape in which the first region is continuous without being separated.

According to the above configuration, the two-dimensional structure of the unit pattern can be simplified, and the first region and the second region can be easily recognized.

In the above aspect, the unit pattern may have a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

According to the above aspect, a variety of two-dimensional structures can be formed by the unit pattern, and the density of data encoded into the unit pattern can be increased. This reduces the number of unit patterns that need to be decoded to identify a pattern column, enables matching of the pattern on the image against the projected pattern with fewer computations, and thereby reduces the computational load of image recognition for measuring the three-dimensional shape of the object.

In the above aspect, the pattern may include a unit pattern having a two-dimensional shape in which the first region is continuous without being separated, and a unit pattern having a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

According to this aspect, by increasing the variation in the two-dimensional structure of the unit pattern, the number of bits that can be represented by a unit pattern increases, and the density of data encoded by the pattern can be increased. Therefore, the number of unit patterns to be decoded for specifying a pattern column can be reduced, matching between the pattern on the image and the projected pattern can be performed with fewer calculations, and the computational load of image recognition for measuring the three-dimensional shape of the object can be reduced.

In the above aspect, the first region may be separated into two parts with the second region interposed therebetween in the unit pattern.

According to the above aspect, the two-dimensional structure of the unit pattern remains relatively simple, so the first region and the second region are easily distinguished, while the amount of data that can be expressed by the unit pattern increases, so the density of data encoded by the pattern can be increased. Therefore, the number of unit patterns to be decoded for specifying a pattern column can be reduced, matching between the pattern on the image and the projected pattern can be performed with fewer calculations, and the computational load of image recognition for measuring the three-dimensional shape of the object can be reduced.

In the above aspect, the first region may be separated into three or more parts in the unit pattern with the second region interposed therebetween.

According to the above aspect, although the two-dimensional structure of the unit pattern becomes relatively complicated, the number of bits that can be expressed by the unit pattern increases and the position of the unit pattern is more easily determined. Therefore, the processing time of window matching can be reduced while robustness against variation in imaging conditions is ensured, and the computational load of image recognition for measuring the three-dimensional shape of the object can be reduced.

In this aspect, the light projecting section may include a modulation element that modulates the size of the projected pattern.

According to the above aspect, the size of the pattern can be modulated according to the unevenness or inclination of the object onto which the pattern is projected, and robustness against variation in imaging conditions can be secured. By modulating the pattern to be larger, the number of pixels of the short side on the image increases and the number of points in the three-dimensional point group decreases, which reduces the computational load of image recognition for measuring the three-dimensional shape of the object.

A three-dimensional measurement method according to another aspect of the present disclosure includes: projecting a pattern obtained by encoding data with a two-dimensional structure onto an object; capturing an image of the object on which the pattern is projected; and extracting feature points of the pattern and calculating the positions of a three-dimensional point group representing the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data. The pattern includes a plurality of unit patterns, each of which represents at least two bits, includes a feature point, and is used for calculating the positions of the three-dimensional point group, and each unit pattern is a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image.

According to the above aspect, the unit pattern is a quadrangle having a shorter side of 3 pixels or more on the image, and thus even when the imaging condition varies, the feature points of the unit pattern can be extracted and the data represented by the unit pattern can be decoded. Further, the unit pattern is a quadrangle having a short side of 10 pixels or less on the image, and thus a two-dimensional structure obtained by encoding data can be arranged at high density, and the density of extracted feature points can be increased. Therefore, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.

A three-dimensional measurement program according to another aspect of the present disclosure causes a computer included in a three-dimensional measurement device to function as a calculation unit, the three-dimensional measurement device including: a light projecting unit that projects a pattern obtained by encoding data with a two-dimensional structure onto an object; and an imaging unit that captures an image of the object on which the pattern is projected. The calculation unit extracts feature points of the pattern and calculates the positions of a three-dimensional point group representing the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data. The pattern includes a plurality of unit patterns, each of which represents at least two bits, includes a feature point, and is used for calculating the positions of the three-dimensional point group, and each unit pattern is a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image.

According to the above aspect, the unit pattern is a quadrangle having a shorter side of 3 pixels or more on the image, and thus even when the imaging condition varies, the feature points of the unit pattern can be extracted and the data represented by the unit pattern can be decoded. Further, the unit pattern is a quadrangle having a short side of 10 pixels or less on the image, and thus a two-dimensional structure obtained by encoding data can be arranged at high density, and the density of extracted feature points can be increased. Therefore, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.

Advantageous Effects of Invention

According to the present invention, it is possible to provide a three-dimensional measurement device, a three-dimensional measurement method, and a program that can measure the three-dimensional shape of an object with higher resolution while improving robustness against changes in imaging conditions.

Drawings

Fig. 1 is a functional block diagram of a three-dimensional measurement device according to an embodiment of the present disclosure.

Fig. 2 is a diagram showing the physical configuration of the control unit and the recognition unit of the three-dimensional measurement device according to the present embodiment.

Fig. 3 is a diagram showing an example of a pattern projected by the light projecting section of the three-dimensional measurement device according to the present embodiment.

Fig. 4 is a diagram showing an example of a unit pattern projected by the light projecting section of the three-dimensional measurement device according to the present embodiment.

Fig. 5 is a diagram showing an example of patterns having different area ratios of the first region and the second region, which are projected by the light projecting section of the three-dimensional measurement device according to the present embodiment.

Fig. 6 is a diagram showing an example of a pattern disturbed by a change in imaging conditions.

Fig. 7 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for a pattern disturbed in the first form.

Fig. 8 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for a pattern disturbed in the second form.

Fig. 9 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for a pattern disturbed in the third form.

Fig. 10 is a diagram showing an example of a unit pattern formed by a combination of patterns having different code patterns, which is projected by the light projection unit of the three-dimensional measurement device according to the present embodiment.

Fig. 11 is a diagram showing another example of a pattern which is projected by the light projecting section of the three-dimensional measurement device of the present embodiment, is composed of a combination of patterns having different code patterns, and has different area ratios of the first region and the second region.

Fig. 12 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for another example of a pattern disturbed in the first form.

Fig. 13 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for another example of a pattern disturbed in the second form.

Fig. 14 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for another example of a pattern disturbed in the third form.

Fig. 15 is a diagram illustrating an example of a pattern which is projected by the light projecting section of the three-dimensional measurement device of the present embodiment, is composed of a combination of patterns having different code patterns, and has different area ratios of the first region and the second region.

Fig. 16 is a flowchart of a process of measuring the three-dimensional shape of the object, which is executed by the three-dimensional measuring apparatus according to the present embodiment.

Fig. 17 is a diagram showing an example of a unit pattern projected by a light projecting section of a three-dimensional measurement device according to a first modification of the present embodiment.

Fig. 18 is a diagram showing an example of a unit pattern projected by a light projecting section of a three-dimensional measurement device according to a second modification of the present embodiment.

Fig. 19 is a diagram showing an example of a pattern projected by a light projecting section of a three-dimensional measurement device according to a third modification of the present embodiment.

Fig. 20 is a diagram showing an example of a pattern projected by a light projecting section of a three-dimensional measurement device according to a fourth modification of the present embodiment.

Fig. 21 is a diagram showing an example of a unit pattern projected by a light projecting section of a three-dimensional measurement device according to a fifth modification of the present embodiment.

Detailed Description

Hereinafter, an embodiment (hereinafter, referred to as "the present embodiment") according to one aspect of the present invention will be described with reference to the drawings. In the drawings, the same or similar components are denoted by the same reference numerals.

§1 Application example

First, an example of a scenario to which the present invention is applied will be described with reference to fig. 1. Fig. 1 is a functional block diagram of a three-dimensional measurement device 10 according to an embodiment of the present disclosure. The three-dimensional measurement device 10 of the present embodiment includes: a light projecting unit 20 that projects a pattern obtained by encoding data with a two-dimensional structure onto an object; an imaging unit 30 that captures an image of the object on which the pattern is projected; a control unit 40 that controls the light projecting unit 20 and the imaging unit 30 and outputs a three-dimensional point group representing the three-dimensional shape of the object based on the captured image; and a recognition unit 50 that recognizes the three-dimensional shape of the object based on the three-dimensional point group. The three-dimensional measurement device 10 need not necessarily include the recognition unit 50; the recognition unit 50 may instead be included in another device capable of communicating with the three-dimensional measurement device 10. The object may be any object, such as a bulk component or a flat component.

The light projecting unit 20 can project onto the object a pattern including a plurality of unit patterns, each of which represents at least two bits, includes a feature point, and is used for calculating the position of the three-dimensional point group. The light projecting unit 20 can, for example, project onto the object a pattern in which square unit patterns are laid out in a grid. However, the pattern may include unit patterns of arbitrary shape, including at least one of a circular, curved, random-dot, grid, or wavy shape. The unit patterns may be arranged such that the columns of the lattice can be determined from the arrangement order of the unit patterns. A quadrangular unit pattern may have an arbitrary pattern inside, for example, a square, and may instead be rectangular or parallelogram-shaped.

The image pickup unit 30 is disposed at a predetermined distance and angle from the light projecting unit 20, and picks up an image of the object having the pattern projected thereon. In the image, a pattern that is deformed according to the state of the position, orientation, or the like of the object is captured. The imaging unit 30 may be one unit, and may take one image to measure the three-dimensional shape of the object. However, the three-dimensional measurement device 10 may include a plurality of imaging units. The imaging unit 30 may capture an image of one object or may capture a plurality of images.

The control unit 40 extracts feature points of the pattern, and calculates the position of a three-dimensional point group indicating the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data. In the present embodiment, the control unit 40 extracts the center of the unit pattern as a feature point, and decodes the data represented by the unit pattern based on the two-dimensional shapes of the first region and the second region included in the unit pattern. Here, the first region and the second region of the unit pattern are regions distinguishable from each other, and may be regions distinguished by the brightness of the projected light, for example. The first area and the second area on the image can be distinguished according to the pixel value of each pixel of the image captured by the imaging unit 30. For example, the first region and the second region may be distinguished by a luminance value of a pixel, and the first region and the second region may be distinguished according to whether or not the luminance value is equal to or greater than a threshold value. In this embodiment, the second region is defined as a region having a larger area than the first region. In addition, the definition of the first region and the second region may be reversed, and the first region may be defined as a region having a larger area than the second region. The control unit 40 may extract feature points for a pattern projected within the depth of field of the imaging unit 30, or may not extract feature points for a pattern projected outside the depth of field and captured in a blurred manner. That is, when the distinction between the first region and the second region becomes unclear due to the influence of the blur, the unit pattern may not be used for calculating the position of the three-dimensional point group.

The control unit 40 decodes the data indicated by the unit pattern containing a pixel of interest on the image together with the adjacent unit patterns, and determines, from the arrangement order of the decoded data, which column of the pattern projected by the light projecting unit 20 the pixel of interest belongs to. Once the column of the projected pattern has been specified, a plane is determined that passes through the specified column and the light source of the light projecting unit 20 and extends toward the object. Likewise, a straight line is determined that passes through the pixel of interest on the image and the unit pattern corresponding to that pixel in the pattern projected on the object. Here, if the distance and angle between the light projecting unit 20 and the imaging unit 30 are known, the distance from the intersection of the plane through the light projecting unit 20 and the straight line through the imaging unit 30 to the three-dimensional measurement device 10, that is, the distance to the object, can be calculated by triangulation. In this way, the control unit 40 can calculate the position of one point of the three-dimensional point group representing the three-dimensional shape of the object from one unit pattern. The control unit 40 can specify a plurality of unit patterns included in the pattern by window matching and calculate the position of a three-dimensional point for each of the specified unit patterns. The control unit 40 may also determine not only the column but also the row of the pattern projected by the light projecting unit 20 by using the epipolar constraint.
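To make the triangulation step concrete, the sketch below intersects the light plane through the identified pattern column with the viewing ray through the pixel of interest. This is a minimal illustration rather than the device's prescribed method: the pinhole camera model, the intrinsic matrix K, and the numeric plane parameters are all hypothetical, and in practice the light plane would be derived from the projector calibration.

```python
import numpy as np

def triangulate_point(pixel, K, plane_n, plane_d):
    """Intersect the camera ray through `pixel` with the projector light plane.

    plane_n, plane_d: the plane through the identified pattern column and the
    light source of the light projecting unit, in camera coordinates, written
    as plane_n . X = plane_d.
    K: 3x3 pinhole intrinsic matrix of the imaging unit.
    Returns one point of the three-dimensional point group (camera coordinates).
    """
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-projected ray direction
    t = plane_d / (plane_n @ ray)                   # solve plane_n.(t*ray) = plane_d
    return t * ray

# Hypothetical calibration values, for illustration only.
K = np.array([[1400.0, 0.0, 640.0],
              [0.0, 1400.0, 480.0],
              [0.0, 0.0, 1.0]])
plane_n = np.array([0.9487, 0.0, 0.3162])  # unit normal of one light plane
plane_d = 47.4                             # plane offset (e.g. in mm)
print(triangulate_point((712.0, 495.0), K, plane_n, plane_d))
```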

In order to measure the three-dimensional shape of an object with higher resolution, it is conceivable to arrange unit patterns obtained by encoding data at higher density and project the pattern onto the object. The size of the unit pattern is limited by the resolution of the imaging unit 30, that is, by the density of the light-receiving elements included in the imaging unit 30. If unit patterns obtained by encoding data are arranged at higher density and the short side of a unit pattern on the image falls below 3 pixels, it becomes difficult to extract the feature point of the unit pattern from the image or to distinguish the first region from the second region, and hence difficult to measure the three-dimensional shape. Therefore, the short side of the unit pattern is desirably 3 pixels or more on the image. If the short side of the unit pattern is 3 pixels or more on the image, the first region and the second region can be distinguished by, for example, the brightness of the pixels, and the feature point of the unit pattern can be extracted and the data represented by the unit pattern can be decoded. The short side may be the shortest of the four sides of the unit pattern.

Further, by setting the size of the unit pattern on the image to be 10 pixels or less on the short side, the unit patterns obtained by encoding data can be arranged at high density, and the three-dimensional shape of the object can be measured with higher resolution. The size of the unit pattern in the image may be set to 9 pixels or less or 8 pixels or less on the short side.

In short, the image captured by the imaging unit 30 of the three-dimensional measurement device 10 according to the present embodiment may be an image obtained by projecting onto the object a pattern including a plurality of unit patterns, each of which is a quadrangle whose short side is 3 pixels or more and 10 pixels or less on the image.
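As a rough sanity check of these pixel bounds, the short side of a unit pattern on the image can be estimated from the imaging geometry. All numbers below are assumptions for illustration; the requirement is only that the result fall between 3 and 10 pixels.

```python
focal_px = 1400.0     # camera focal length expressed in pixels (assumed)
distance_mm = 500.0   # camera-to-object distance (assumed)
unit_mm = 2.0         # physical short side of one projected unit pattern (assumed)

short_side_px = focal_px * unit_mm / distance_mm
print(short_side_px)             # 5.6 pixels
print(3 <= short_side_px <= 10)  # True: the two regions stay distinguishable
```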

The inventors found that when the pattern is simply made finer by merely increasing the arrangement density of the unit patterns, it becomes difficult to measure the three-dimensional shape of the object when the contrast of the pattern is lowered by the influence of ambient light or the like, or when the pattern is distorted by being projected onto a slope. That is, simply making the pattern finer impairs robustness against variation in imaging conditions. The inventors therefore verified the robustness against variation in imaging conditions while varying the shapes of the first region and the second region included in the unit pattern in various ways. They found that if the area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less, the feature points can be extracted from the unit pattern and the data represented by the unit pattern can be decoded even when the imaging conditions vary. Here, the area ratio obtained by dividing the area of the first region by the area of the second region of the unit pattern may be 0.3 or more and 0.9 or less in the pattern projected by the light projecting unit 20, or 0.3 or more and 0.9 or less in the image captured by the imaging unit 30. Even if noise is added to the first region or the second region of the projected pattern and the area ratio there falls slightly outside the range of 0.3 to 0.9, the feature points can still be extracted and the data decoded under varying imaging conditions as long as the area ratio on the captured image is 0.3 or more and 0.9 or less. Here, the noise may be a minute second region contained within the first region of the projected pattern, too small to be resolved as even one pixel in the captured image; conversely, it may be a similarly minute first region contained within the second region of the projected pattern.
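The area ratio on the image can be checked directly from a binarized patch covering one unit pattern. The following is a minimal sketch assuming the two regions have already been separated into a boolean mask (for example by brightness thresholding); the 6 × 6 patch is hypothetical.

```python
import numpy as np

def area_ratio(first_mask):
    """Area of the first region divided by the area of the second region.

    first_mask: 2D boolean array covering one unit pattern on the image,
    True where a pixel belongs to the first region. Since the second region
    is defined as the larger one, the ratio is at most 1.
    """
    first = np.count_nonzero(first_mask)
    second = first_mask.size - first
    if first > second:                 # swap if the definitions are reversed
        first, second = second, first
    return first / second

patch = np.zeros((6, 6), dtype=bool)   # hypothetical 6x6-pixel unit pattern
patch[1:4, 1:5] = True                 # 12 first-region px, 24 second-region px
r = area_ratio(patch)
print(r, 0.3 <= r <= 0.9)              # 0.5 True: within the robust range
```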

According to the three-dimensional measurement device 10 of the present embodiment, the area ratio of the first region and the second region included in the unit pattern is 0.3 or more and 0.9 or less, and the unit pattern is a quadrangle whose short side is 3 pixels or more on the image, so that the first region and the second region can be distinguished even when the imaging conditions vary, the feature points of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Further, the area ratio of the first region and the second region included in the unit pattern is 0.3 or more and 0.9 or less, and the unit pattern is a quadrangle having a short side of 10 pixels or less on the image, whereby a two-dimensional structure in which data is encoded can be arranged at high density, and the density of extracted feature points and the amount of encoded data can be increased. Therefore, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.

§2 Configuration example

[Functional configuration]

< light projection part >

The light projector 20 projects a pattern obtained by encoding data with a two-dimensional structure onto an object, and any pattern can be used as the projected pattern. Specific examples of the pattern will be described in detail below using fig. 3 and the like.

The pattern projected by the light projecting unit 20 may be a pattern in which a plurality of square unit patterns are laid out, and each unit pattern may include a first region and a second region. Here, the first region and the second region can be distinguished by the brightness of the light projected by the light projecting unit 20. For example, the first region may be defined as a bright region that is irradiated with light and the second region as a dark region that is not irradiated with light, or vice versa. By distinguishing the first region and the second region according to the brightness of the light, the pattern can be recognized even by an imaging unit 30 that captures monochrome images, the configuration of the imaging unit 30 can be simplified, and the cost of the three-dimensional measurement device 10 can be reduced.
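A minimal sketch of such a brightness-based distinction is shown below, assuming a simple per-patch threshold. The midpoint default is an illustrative choice; a real system might instead derive a threshold per image, for example with Otsu's method.

```python
import numpy as np

def split_by_brightness(patch_gray, threshold=None):
    """Return boolean masks for the bright and dark regions of a patch.

    patch_gray: 2D array of gray levels for one unit pattern. Pixels at or
    above the threshold are treated as the light-irradiated region.
    """
    if threshold is None:  # default: midpoint between darkest and brightest
        threshold = (int(patch_gray.min()) + int(patch_gray.max())) / 2
    bright = patch_gray >= threshold
    return bright, ~bright

patch = np.array([[ 30,  40, 210,  35],
                  [ 25, 205, 220,  30],
                  [ 40, 215, 200,  45]])
bright, dark = split_by_brightness(patch)
print(bright.sum(), dark.sum())  # 5 bright pixels, 7 dark pixels
```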

The first region and the second region of the unit pattern may be distinguished by the wavelength band of the light projected by the light projecting unit 20. For example, the first region may be defined as a region irradiated with blue light having a wavelength band of about 450 nm, and the second region as a region irradiated with red light having a wavelength band of about 650 nm, or vice versa. As another example, the first region may be defined as a region irradiated with blue light having a wavelength band of about 450 nm, and the second region as a region irradiated with yellow light having a wavelength band of about 580 nm, or vice versa. As yet another example, the first region may be defined as a region irradiated with blue light having a wavelength band of about 450 nm, and the second region as a region irradiated with infrared light having a wavelength band of about 1 μm, or vice versa. By thus distinguishing the first region and the second region according to the wavelength band of the irradiated light, even when white light such as ambient light is irradiated on the object, the difference between the first region and the second region is not easily changed, and the first region and the second region are easily recognized. In addition, a combination of wavelength bands in which the difference between the first region and the second region of the unit pattern becomes significant may be selected according to the absorbance of the object, and the first region and the second region may be distinguished accordingly. Further, such a combination of wavelength bands may be selected according to the spectral sensitivity of the imaging element of the imaging unit 30.

The first region and the second region of the unit pattern may be distinguished by the polarization of the light projected by the light projecting unit 20. For example, the first region may be defined as a region irradiated with light linearly polarized in a first direction, and the second region as a region irradiated with light linearly polarized in a second direction orthogonal to the first direction. As another example, the first region may be defined as a region irradiated with light circularly polarized clockwise with respect to the traveling direction of the light, and the second region as a region irradiated with light circularly polarized counterclockwise with respect to the traveling direction of the light. Further, a combination of polarizations in which the difference between the first region and the second region of the unit pattern becomes significant may be selected according to how strongly the object absorbs each polarization. By thus distinguishing the first region and the second region according to the polarization of light, for example for an object such as a black body or a transparent body, even when the object is irradiated with light such as ambient light, the difference between the first region and the second region is not easily changed, and the first region and the second region are easily recognized.

The light projecting unit 20 may be a projector that projects an arbitrary fixed pattern, or a projector that projects one fixed pattern per unit time by means of Micro Electro-Mechanical Systems (MEMS) or the like. The light projecting unit 20 may include a modulation element for modulating the size of the projected pattern; for example, the size of the projected pattern may be modulated according to the unevenness or inclination of the object. By modulating the size of the pattern according to the state of the object, robustness against variation in imaging conditions can be ensured. By modulating the pattern to be larger, the number of pixels of the short side on the image increases and the number of points in the three-dimensional point group decreases, which reduces the computational load of image recognition for measuring the three-dimensional shape of the object.
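This trade-off can be quantified roughly: the number of feature points, and hence of points in the three-dimensional point group, scales with how many unit patterns fit on the sensor. The image size and unit-pattern sizes below are assumptions for illustration.

```python
width_px, height_px = 1280, 960   # image size (assumed)
for side_px in (4, 8):            # unit-pattern short side before/after enlarging
    n_points = (width_px // side_px) * (height_px // side_px)
    print(side_px, n_points)      # 4 -> 76800 points, 8 -> 19200 points
```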

< imaging part >

The image pickup unit 30 picks up an image of the object having the projected pattern, and outputs the picked-up image to the control unit 40 or to another device. The imaging unit 30 may be disposed at a predetermined distance and angle from the light projecting unit 20, and for example, the direction in which the light projecting unit 20 projects light is substantially the same as the imaging direction of the imaging unit 30, and may be disposed on the same plane at a predetermined distance.

< control part >

The control unit 40 includes an image input unit 41, an image recording unit 42, a calculation unit 43, a three-dimensional point group output unit 44, and a setting unit 45. The image input unit 41 acquires an image captured by the imaging unit 30 from the imaging unit 30, and inputs the image to the image recording unit 42. The image recording unit 42 records the image captured by the imaging unit 30 in the memory.

The calculation unit 43 extracts feature points of the captured pattern. The calculation unit 43 may extract a feature point for each unit pattern included in the pattern, for example, the center of the unit pattern as a feature point. The calculation unit 43 decodes the data represented by the unit pattern based on the two-dimensional shapes of the first region and the second region included in the unit pattern. Further, the calculating unit 43 calculates the position of the three-dimensional point group indicating the three-dimensional shape of the object based on the position of the feature point in the image and the decoded data. More specifically, the data represented by the unit pattern is decoded for the unit pattern of interest and the unit patterns adjacent to the unit pattern of interest, and the column to which the unit pattern of interest belongs is determined according to the order of arrangement of the data. Next, the distance to the object is calculated by triangulation based on the position on the image of the feature point extracted from the unit pattern of interest and the specified column. The calculation unit 43 may also determine not only the column to which the unit pattern of interest belongs but also the row to which the unit pattern of interest belongs, using the epipolar constraint. In this way, the positions of a plurality of points representing the three-dimensional shape of the object can be calculated for a plurality of unit patterns, and the three-dimensional shape of the object can be represented.
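How the arrangement order of the decoded data identifies the column can be illustrated with a lookup table. The encoding below is hypothetical and not the one prescribed by this document: it assumes each unit pattern encodes 2 bits and that the projector emits a column sequence in which every window of four consecutive codes is unique.

```python
# Column sequence of 2-bit codes emitted by the projector (hypothetical).
PROJECTED_CODES = [0, 1, 3, 2, 0, 2, 1, 3, 1, 0, 3, 0, 2, 3, 3, 1]
WINDOW = 4  # adjacent unit patterns decoded around the unit pattern of interest
COLUMN_OF = {tuple(PROJECTED_CODES[i:i + WINDOW]): i
             for i in range(len(PROJECTED_CODES) - WINDOW + 1)}

def find_column(decoded):
    """Map codes decoded from adjacent unit patterns to a pattern column,
    or None if decoding failed (e.g. a disturbed unit pattern)."""
    return COLUMN_OF.get(tuple(decoded))

print(find_column([0, 2, 1, 3]))  # 4: the window starting at the fifth column
```

Once the column is found, the distance to the object follows by triangulation, as in the sketch given earlier.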

The three-dimensional point group output unit 44 outputs the calculated data of the three-dimensional point group to the recognition unit 50. The three-dimensional point group output unit 44 may output the data of the three-dimensional point group to the display unit or to a device other than the three-dimensional measurement device 10.

The setting unit 45 can set the pattern projected by the light projecting unit 20, the aperture of the imaging unit 30, the exposure time, and the like. The setting unit 45 may set the range of the area ratio between the first region and the second region of the unit patterns included in the pattern projected by the light projecting unit 20, based on the number of pixels of the short side of the unit pattern on the image. That is, the setting unit 45 may set the area ratio between the first region and the second region to the range of 0.3 or more and 0.9 or less, or to a narrower range within it, depending on the number of pixels of the short side of the unit pattern on the image. By setting the range of the area ratio between the first region and the second region included in the unit pattern according to the number of pixels of the short side of the unit pattern on the image in this manner, the range of the area ratio can be set according to the density of the unit patterns on the image, and the balance between the resolution of the three-dimensional shape of the measurement object and the robustness against variation in imaging conditions can be adjusted.

The setting unit 45 may set the range of the area ratio between the first region and the second region to be narrower as the number of pixels of the short side of the unit pattern on the image decreases. For example, the setting may be such that when the short side of the unit pattern on the image is 10 pixels, the area ratio between the first region and the second region is set to the range of 0.3 to 0.9, and the range is made narrower than 0.3 to 0.9 as the short side falls below 10 pixels. In this way, the range of the area ratio between the first region and the second region is narrowed as the density of the unit patterns on the image increases, and as the resolution of the three-dimensional shape of the measurement object increases, the range of the area ratio can be limited to one in which robustness against variation in imaging conditions can be secured. Therefore, the balance between the resolution of the three-dimensional shape of the measurement object and the robustness against variation in imaging conditions can be adjusted.
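One possible rule for the setting unit 45 is sketched below. The linear narrowing toward the midpoint of the range is an assumption for illustration; the document states only that the range of 0.3 to 0.9 is narrowed as the short side of the unit pattern becomes smaller.

```python
def area_ratio_range(short_side_px):
    """Permissible range of the first/second area ratio for a given
    short-side length in pixels (hypothetical linear rule)."""
    short_side_px = max(3, min(10, short_side_px))
    scale = (short_side_px - 3) / 7.0     # 0.0 at 3 px .. 1.0 at 10 px
    half_width = 0.15 + 0.15 * scale      # band half-width around 0.6
    return 0.6 - half_width, 0.6 + half_width

print(area_ratio_range(10))  # ~ (0.3, 0.9): the full documented range
print(area_ratio_range(3))   # ~ (0.45, 0.75): narrowed for small unit patterns
```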

< identification part >

The recognition unit 50 includes a Computer-Aided Design (CAD) model storage unit 51, a CAD matching calculation unit 52, and a CAD matching output unit 53. The CAD model storage unit 51 can store a three-dimensional CAD model of the object. The CAD matching calculation unit 52 can match the three-dimensional point group acquired from the three-dimensional point group output unit 44 against the three-dimensional CAD model of the object stored in the CAD model storage unit 51. The matching of the three-dimensional point group to the three-dimensional CAD model can be performed by an arbitrary algorithm. The CAD matching output unit 53 may output the matching result calculated by the CAD matching calculation unit 52 to a display unit or another device.
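As one example of a matching algorithm the CAD matching calculation unit 52 could use (the document leaves the algorithm open), the sketch below implements a single iteration of point-to-point ICP using the Kabsch algorithm; iterating it aligns the measured three-dimensional point group with points sampled from the CAD model.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(points, model_points):
    """One ICP iteration: returns a rigid transform (R, t) moving `points`
    toward their nearest neighbours in `model_points` (both Nx3 arrays)."""
    nearest = model_points[cKDTree(model_points).query(points)[1]]
    p_c, m_c = points.mean(axis=0), nearest.mean(axis=0)
    H = (points - p_c).T @ (nearest - m_c)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = m_c - R @ p_c
    return R, t

# Toy usage: recover a small translation of a 4-point cloud.
model = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
measured = model + np.array([0.05, -0.02, 0.01])
R, t = icp_step(measured, model)
print(np.allclose(R, np.eye(3), atol=1e-6), t)   # True, ~(-0.05, 0.02, -0.01)
```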

[Hardware configuration]

< light projection part >

Next, an example of the hardware configuration of the three-dimensional measurement device 10 according to the present embodiment will be described. The light projecting unit 20 may include a light source and a photomask for generating light having a pattern, and may include, for example, a laser light source and a diffractive optical element. The light projecting unit 20 may also be a projector having an optical element for forming a fixed pattern, or one having a light modulation element such as a Digital Light Processing (DLP) element, a liquid crystal display (LCD), liquid crystal on silicon (LCOS), or Micro Electro-Mechanical Systems (MEMS), and may include a modulation element for modulating the size of the projected pattern. The light projecting unit 20 can cause light from the laser light source to enter the diffractive optical element and generate light having a two-dimensional structure by means of the diffraction pattern formed on the surface of the element. The light projecting unit 20 may include any optical member such as a lens, and the wavelength band of the light emitted from the light source is not limited to the visible region; it may be in the infrared or ultraviolet region.

< imaging part >

The imaging unit 30 may be a camera including at least a light receiving element that detects light projected by the light projecting unit 20. The imaging unit 30 may include any optical member such as a filter for separating wavelengths, a filter for separating polarized light, or other lenses.

< control part and recognition part >

Fig. 2 is a diagram showing the physical configuration of the control unit 40 and the recognition unit 50 of the three-dimensional measurement device 10 according to the present embodiment. The three-dimensional measurement device 10 includes a Central Processing Unit (CPU) 10a corresponding to a calculation unit, a Random Access Memory (RAM) 10b corresponding to a storage unit, a Read Only Memory (ROM) 10c corresponding to a storage unit, a communication unit 10d, an input unit 10e, and a display unit 10f. These components are connected to one another via a bus so as to be able to transmit and receive data. Although this example describes a case where the three-dimensional measurement device 10 consists of one computer, the three-dimensional measurement device 10 may be implemented using a plurality of computers.

The CPU 10a is a control unit that performs control related to the execution of programs stored in the RAM 10b or the ROM 10c and the computation and processing of data. The CPU 10a is also the calculation unit that executes the program (the three-dimensional measurement program) that calculates the positions of the three-dimensional point group representing the three-dimensional shape of the object based on the image of the object on which the pattern is projected, and matches the three-dimensional point group against the three-dimensional CAD model. The CPU 10a receives various input data from the input unit 10e or the communication unit 10d, displays the calculation results on the display unit 10f, and stores them in the RAM 10b or the ROM 10c. Although not shown in fig. 2, computations for machine learning related to the measurement of the three-dimensional shape may be performed by parallel arithmetic processing on a processor such as a Graphics Processing Unit (GPU) or an Application-Specific Integrated Circuit (ASIC) such as a Tensor Processing Unit (TPU).

The RAM 10b is a memory unit in which data can be rewritten, and includes, for example, a semiconductor memory element. The RAM 10b can store data such as a three-dimensional measurement program executed by the CPU10a, an image of the object acquired from the imaging unit 30, data relating to the calculated three-dimensional point group, and a three-dimensional CAD model of the object. Note that these are examples, and the RAM 10b may store data other than these, or may not store a part of these.

The ROM 10c is a storage unit from which data can be read out, and includes, for example, a semiconductor memory element. The ROM 10c can store, for example, the three-dimensional measurement program or data that is not rewritten.

The communication unit 10d is an interface for connecting the three-dimensional measurement device 10 to other devices. The communication unit 10d may be connected to the light projecting unit 20 and the imaging unit 30 via, for example, a Local Area Network (LAN), and transmits information on the pattern settings to the light projecting unit 20 and information on settings such as the aperture, exposure time, and shutter speed to the imaging unit 30. The communication unit 10d can receive the image of the object from the imaging unit 30. The communication unit 10d may also be connected to a communication network such as the Internet. Further, the light projection time of the light projecting unit 20, the light modulation element of the light projecting unit 20, or the exposure time or shutter speed of the imaging unit 30 may be controlled by a Field-Programmable Gate Array (FPGA).

The input unit 10e receives data input from a user, and includes, for example, a keyboard, a mouse, and a touch panel.

The display unit 10f visually displays the calculation results obtained by the CPU 10a, and includes, for example, a Liquid Crystal Display (LCD). The display unit 10f can display, for example, the image of the object captured by the imaging unit 30, the calculated three-dimensional point group, or the three-dimensional CAD model matched to the three-dimensional point group.

The three-dimensional measurement program may be stored in a computer-readable storage medium such as the RAM 10b or the ROM 10c, or may be provided via a communication network connected through the communication unit 10d. The three-dimensional measurement device 10 realizes the various operations described with reference to fig. 1 by the CPU 10a executing the three-dimensional measurement program. These physical configurations are examples and need not be independent components. For example, the three-dimensional measurement device 10 may include a Large-Scale Integration (LSI) chip in which the CPU 10a is integrated with the RAM 10b or the ROM 10c.

3. Operation example

Fig. 3 is a diagram showing an example of a pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the present embodiment. In this figure, a part of the pattern projected by the light projector 20 is shown enlarged. The pattern projected by the light projecting section 20 includes unit patterns U arranged in an N × M lattice (N and M are arbitrary natural numbers), and includes coding regions A, each formed of unit patterns U arranged in an n × m lattice (n is a natural number smaller than N, and m is a natural number smaller than M). In the pattern of this example, one coding region A is formed of unit patterns U arranged in a 2 × 2 lattice, and one piece of data is decoded from one coding region A. Adjacent coding regions A overlap, sharing unit patterns U.
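
The overlapping-window structure described above lends itself to a short illustration. The following is a minimal sketch, not the implementation used by the device: it assumes that each unit pattern has already been decoded to a 2-bit symbol, and it packs each 2 × 2 coding region into one 8-bit codeword; the function name and the packing order are illustrative assumptions.

```python
import numpy as np

def coding_region_codewords(symbols: np.ndarray) -> np.ndarray:
    """symbols: (N, M) array of 2-bit values 0..3, one per unit pattern.
    Returns an (N-1, M-1) array of codewords, one per 2 x 2 window."""
    n, m = symbols.shape
    codes = np.empty((n - 1, m - 1), dtype=np.uint8)
    for i in range(n - 1):
        for j in range(m - 1):
            code = 0
            for s in symbols[i:i + 2, j:j + 2].ravel():
                code = (code << 2) | int(s)  # pack four 2-bit symbols
            codes[i, j] = code
    return codes

# Adjacent windows share unit patterns, so one unit pattern contributes
# to up to four codewords; this is what allows a position in the pattern
# to be identified from a small image neighborhood.
symbols = np.random.randint(0, 4, size=(8, 8))
print(coding_region_codewords(symbols).shape)  # (7, 7)
```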

Fig. 4 is a diagram showing an example of the unit pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the present embodiment. In this figure, the first, second, third and fourth unit patterns U1, U2, U3 and U4 are illustrated. The first unit pattern U1, the second unit pattern U2, the third unit pattern U3, and the fourth unit pattern U4 are patterns including the first region S1 and the second region S2, and have a two-dimensional structure in which one relatively small square is arranged at the center of a lattice pattern formed of relatively large squares.

The first unit pattern U1 includes a white first region S1 at the lower left and upper right and a black second region S2 from the upper left to the lower right. The second unit pattern U2 has a white first region S1 at the upper left and lower right and a black second region S2 from the lower left to the upper right. The third unit pattern U3 has a black first region S1 at the upper left and lower right and a white second region S2 from the lower left to the upper right. The fourth unit pattern U4 has a black first region S1 at the lower left and upper right and a white second region S2 from the upper left to the lower right.

Here, the area represented in white may be a bright area irradiated with light, and the area represented in black may be a dark area not irradiated with light. However, the area indicated by white may be a dark area not irradiated with light, and the area indicated by black may be a bright area irradiated with light. Further, the area represented in white may be, for example, an area irradiated with blue light, the area represented in black may be an area irradiated with red light, or the area represented in white may be an area irradiated with light linearly polarized in a first direction, and the area represented in black may be an area irradiated with light linearly polarized in a second direction orthogonal to the first direction.

The first, second, third, and fourth unit patterns U1, U2, U3, and U4 each contain a two-dimensional shape in which the first region S1 is separated with the second region S2 interposed therebetween. The first unit pattern U1 and the second unit pattern U2 are related to each other by a 90° rotation, as are the third unit pattern U3 and the fourth unit pattern U4; the first unit pattern U1 and the fourth unit pattern U4 are black-and-white inversions of each other, as are the second unit pattern U2 and the third unit pattern U3.

By separating the first region S1 with the second region S2 interposed therebetween, the unit pattern can form a variety of two-dimensional structures, and the density of the data encoded in the unit pattern can be increased. This reduces the number of unit patterns that must be decoded to identify the pattern sequence, allows the pattern on the image to be matched to the projected pattern with fewer calculations, and thereby reduces the computational load of the image recognition used to measure the three-dimensional shape of the object.

In the first unit pattern U1, the second unit pattern U2, the third unit pattern U3, and the fourth unit pattern U4, the first region S1 is separated into two regions with the second region S2 interposed therebetween. In other words, the first region S1 is separated by the second region S2, but not into three or more regions. By limiting the number of separations of the first region S1 to two, the two-dimensional structure of the unit pattern remains relatively simple, and the first region S1 and the second region S2 can be recognized easily. At the same time, the amount of data that a unit pattern can represent is increased, raising the density of the data encoded by the pattern. Therefore, the number of unit patterns that must be decoded to identify the pattern sequence can be reduced, the pattern on the image can be matched to the projected pattern with fewer calculations, and the computational load of the image recognition used to measure the three-dimensional shape of the object can be reduced.
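
As a rough sketch of how two bits might be read from one such unit pattern, the following classifies a grayscale crop by comparing its corner and overall brightness. This is an illustrative assumption, not the decoding method of the device: a real decoder would first rectify the crop, estimate the local black and white levels, and treat the small central square explicitly.

```python
import numpy as np

def decode_unit_pattern(crop: np.ndarray) -> int:
    """crop: (h, w) grayscale crop of one unit pattern, roughly axis-aligned.
    Returns a 2-bit symbol 0..3."""
    h, w = crop.shape
    q = max(1, min(h, w) // 4)
    ul = crop[:q, :q].mean();  ur = crop[:q, -q:].mean()
    ll = crop[-q:, :q].mean(); lr = crop[-q:, -q:].mean()
    # Bit 0: whether the dark corners lie on the main diagonal (U1/U3)
    # or on the anti-diagonal (U2/U4).
    dark_on_main_diagonal = (ul + lr) < (ur + ll)
    # Bit 1: whether the larger second region is dark (U1/U2) or bright
    # (U3/U4), judged by whether the crop's mean brightness sits closer
    # to its darkest or its brightest corner.
    dark, bright = min(ul, ur, ll, lr), max(ul, ur, ll, lr)
    larger_region_dark = abs(crop.mean() - dark) < abs(crop.mean() - bright)
    return (int(larger_region_dark) << 1) | int(dark_on_main_diagonal)
```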

Fig. 5 is a diagram showing an example of patterns having different area ratios of the first region and the second region, which are projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the present embodiment. In this figure, the first pattern P1, the second pattern P2, the third pattern P3, the fourth pattern P4, and the fifth pattern P5 are illustrated, which include the first unit pattern U1, the second unit pattern U2, the third unit pattern U3, and the fourth unit pattern U4 shown in fig. 4, and the area ratio of the first region and the second region is changed without changing the lengths of the four sides of the unit patterns. In this example, the unit pattern has a square outer shape, and the four sides of the unit pattern are equal in length. The first pattern P1 shows the unit pattern U and the feature points F extracted from the unit pattern U. In addition, the first pattern P1, the second pattern P2, the third pattern P3, the fourth pattern P4, and the fifth pattern P5 are illustrated with a part of the patterns being enlarged. The unit pattern U and the feature point F are illustrated in the first pattern P1, but the second pattern P2, the third pattern P3, the fourth pattern P4, and the fifth pattern P5 similarly include unit patterns, and feature points are extracted from the unit patterns.

The first pattern P1 is an example in which the area ratio obtained by dividing the area of the first region by the area of the second region in the unit pattern U is 4/5 = 0.8. The second pattern P2 is an example in which this area ratio is 21/29 ≈ 0.724. The third pattern P3 is an example in which it is 3/5 = 0.6. The fourth pattern P4 is an example in which it is 5/13 ≈ 0.385. The fifth pattern P5 is an example in which it is 0.2.
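
The fractions above come from counting the cells occupied by each region. Assuming a binary mask of one unit pattern in which True marks the first region S1, the ratio used for the 0.3-0.9 criterion can be computed as in this small sketch:

```python
import numpy as np

def area_ratio(mask_s1: np.ndarray) -> float:
    """Area of S1 divided by area of S2 for one unit pattern."""
    s1 = int(np.count_nonzero(mask_s1))
    s2 = mask_s1.size - s1
    return s1 / s2

# Toy 3 x 3 cell example: 4 cells of S1 against 5 cells of S2 gives
# 4/5 = 0.8, the same ratio as P1 (the actual geometry of P1 differs).
mask = np.array([[1, 0, 1],
                 [0, 0, 0],
                 [1, 0, 1]], dtype=bool)
print(area_ratio(mask))  # 0.8
```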

Fig. 6 is a diagram showing examples of patterns disturbed by changes in imaging conditions. The figure shows patterns whose unit pattern has a short side of 4 pixels, 5 pixels, or 6 pixels on the image; variations in imaging conditions are reproduced by changing the contrast of the image, adding noise to the image, smoothing the image, and applying an affine transformation to the image. Here, the affine transformation includes enlargement, reduction, and shear deformation of the image in a specific direction. The figure shows examples in which the area ratio obtained by dividing the area of the first region by the area of the second region in the unit pattern is 3/5 = 0.6.

In the first row of fig. 6, which shows "no variation", an example of a pattern in the case where there is no variation in the imaging conditions is shown for each of the cases where the length of the short side of the unit pattern on the image is 4 pixels, 5 pixels, and 6 pixels. The pattern shown in the first row includes four kinds of unit patterns shown in fig. 4.

In the second row of fig. 6, labeled "first condition", an example of a pattern disturbed under the first condition is shown for each of the cases where the length of the short side of the unit pattern on the image is 4 pixels, 5 pixels, and 6 pixels. Here, the first condition is that, when the luminance of the brightest pixel is denoted by M and that of the darkest pixel by L, the contrast is reduced so that M - L becomes 30. The patterns in the second row confirm that, because the image is disturbed under the first condition, the first region and the second region are harder to distinguish than in the no-variation case shown in the first row.

In the third row of fig. 6, labeled "second condition", an example of a pattern disturbed under the second condition is shown for each of the cases where the length of the short side of the unit pattern on the image is 4 pixels, 5 pixels, and 6 pixels. Here, the second condition is that, when the luminance of the brightest pixel is denoted by M and that of the darkest pixel by L, the contrast is reduced so that M - L becomes 30, noise is added to the image based on a Gaussian distribution with a standard deviation of 5, and the image is smoothed based on a Gaussian distribution with a standard deviation of 1. The patterns in the third row confirm that, because the image is disturbed under the second condition, the first region and the second region are harder to distinguish than in the no-variation case shown in the first row.

In the fourth row of fig. 6, labeled "third condition", an example of a pattern disturbed under the third condition is shown for each of the cases where the length of the short side of the unit pattern on the image is 4 pixels, 5 pixels, and 6 pixels. Here, the third condition is that noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is enlarged by 20% in the x-axis direction, and a 30° shear deformation is applied in the direction in which the pattern stretches. The patterns in the fourth row confirm that, because the image is disturbed under the third condition, the first region and the second region are harder to distinguish than in the no-variation case shown in the first row.

In the fifth row of fig. 6, labeled "fourth condition", an example of a pattern disturbed under the fourth condition is shown for each of the cases where the length of the short side of the unit pattern on the image is 4 pixels, 5 pixels, and 6 pixels. Here, the fourth condition is that noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is reduced by 20% in the x-axis direction, and a 30° shear deformation is applied in the direction in which the pattern contracts. The patterns in the fifth row confirm that, because the image is disturbed under the fourth condition, the first region and the second region are harder to distinguish than in the no-variation case shown in the first row.
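
The four conditions can be reproduced numerically. The sketch below follows the parameters stated in the text (M - L = 30, noise with a standard deviation of 5, smoothing with a standard deviation of 1, 20% scaling, and shear); the function name and the exact affine formulation are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import affine_transform, gaussian_filter

def disturb(img, contrast=False, noise=False, smooth=False,
            scale_x=1.0, shear_deg=0.0):
    out = img.astype(np.float64)
    if contrast:  # compress so that brightest minus darkest becomes 30
        lo, hi = out.min(), out.max()
        out = (out - lo) / max(hi - lo, 1e-9) * 30.0 + lo
    if noise:     # Gaussian noise, standard deviation 5
        out = out + np.random.normal(0.0, 5.0, out.shape)
    if smooth:    # Gaussian smoothing, standard deviation 1
        out = gaussian_filter(out, sigma=1.0)
    if scale_x != 1.0 or shear_deg != 0.0:
        shear = np.tan(np.radians(shear_deg))
        fwd = np.array([[1.0, 0.0],          # forward map on (row, col):
                        [shear, scale_x]])   # x' = shear*y + scale_x*x
        out = affine_transform(out, np.linalg.inv(fwd))
    return np.clip(out, 0.0, 255.0)

# e.g. the third condition: noise + smoothing + 20% enlargement in x
# with a 30 degree shear in the stretching direction.
# degraded = disturb(pattern_img, noise=True, smooth=True,
#                    scale_x=1.2, shear_deg=30.0)
```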

Fig. 7 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for the pattern disturbed in the first form. In this figure, the solid-line graph G11 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G12 the case of 8 pixels, the one-dot chain graph G13 the case of 6 pixels, the two-dot chain graph G14 the case of 5 pixels, and the dotted-line graph G15 the case of 4 pixels.

When the luminance of the brightest pixel is denoted by M and that of the darkest pixel by L, the pattern disturbed in the first form is one in which the contrast is reduced so that M - L becomes 30, noise is added to the image based on a Gaussian distribution with a standard deviation of 5, and the image is smoothed based on a Gaussian distribution with a standard deviation of 1. That is, the pattern disturbed in the first form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like and noise arises in the image pickup element of the imaging unit 30.

As is apparent from graphs G11 to G15, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.3. It is also found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, if the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.8, the decoding success rate is 80% or more. It is also found that, when the length of the short side of the unit pattern is 6 to 10 pixels, the decoding success rate is 90% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.9.
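
Curves such as G11 to G15 can in principle be reproduced by a Monte-Carlo experiment: render many unit patterns at a given short-side size and area ratio, disturb them, decode them, and record the fraction decoded correctly. The sketch below reuses disturb and decode_unit_pattern from the earlier sketches; render_unit_pattern is a hypothetical rasterizer, not a function of the device.

```python
import numpy as np

def decode_success_rate(ratio, short_side_px, trials=1000):
    ok = 0
    for _ in range(trials):
        symbol = np.random.randint(0, 4)
        img = render_unit_pattern(symbol, short_side_px, ratio)  # assumed
        degraded = disturb(img, contrast=True, noise=True, smooth=True)
        ok += int(decode_unit_pattern(degraded) == symbol)
    return ok / trials
```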

The setting unit 45 may set the range of the area ratio of the first region and the second region of the unit pattern to approximately 0.4 to 0.9 when the length of the short side of the unit pattern is 6 pixels to 10 pixels, and set the range of the area ratio of the first region and the second region of the unit pattern to approximately 0.4 to 0.8 by narrowing the range when the length of the short side of the unit pattern is 4 pixels or 5 pixels, for example.
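
Read as code, this behavior of the setting unit 45 amounts to something like the following (the thresholds are the approximate values quoted above from fig. 7; this is one possible reading, not the device's actual logic):

```python
def area_ratio_range(short_side_px: int) -> tuple[float, float]:
    if short_side_px >= 6:
        return (0.4, 0.9)  # 6 to 10 pixels: the wider range is tolerable
    return (0.4, 0.8)      # 4 or 5 pixels: the range is narrowed
```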

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle whose short side is 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is lowered by ambient light or the like and noise is applied to the image pickup element of the image pickup unit 30, the first region and the second region can be distinguished, the feature points of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

Fig. 8 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for the pattern disturbed in the second form. In this figure, the solid-line graph G21 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G22 the case of 8 pixels, the one-dot chain graph G23 the case of 6 pixels, the two-dot chain graph G24 the case of 5 pixels, and the dotted-line graph G25 the case of 4 pixels.

The pattern disturbed in the second form reproduces the following: noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is enlarged by 20% in the x-axis direction, and a 20° shear deformation is applied in the direction in which the pattern stretches, thereby disturbing the image. That is, the pattern disturbed in the second form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like, noise arises in the image pickup element of the imaging unit 30, and the pattern is distorted by being projected onto an inclined surface of the object or the background.

As is apparent from graphs G21 to G25, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.3. It is also found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, if the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.9, the decoding success rate is 70% or more. It is also found that, when the length of the short side of the unit pattern is 5 to 10 pixels, the decoding success rate is 80% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.9.

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle having a shorter side of 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is reduced by ambient light or the like, noise is applied to the image pickup element of the image pickup unit 30, and the pattern is distorted by projecting the pattern onto the slope of the object or the background, the first region and the second region can be recognized, the feature point of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

Fig. 9 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern, for the pattern disturbed in the third form. In this figure, the solid-line graph G31 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G32 the case of 8 pixels, the one-dot chain graph G33 the case of 6 pixels, the two-dot chain graph G34 the case of 5 pixels, and the dotted-line graph G35 the case of 4 pixels.

The pattern disturbed in the third form reproduces the following: noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is reduced by 20% in the x-axis direction, and a 20° shear deformation is applied in the direction in which the pattern contracts, thereby disturbing the image. That is, the pattern disturbed in the third form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like, noise arises in the image pickup element of the imaging unit 30, and the pattern is distorted by being projected onto an inclined surface of the object or the background.

As is apparent from graphs G31 to G35, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.35. It was found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, the decoding success rate is 90% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.6 to 0.7. It is found that, when the length of the short side of the unit pattern is 5 to 10 pixels, the decoding success rate is 90% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.8.

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle having a shorter side of 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is reduced by ambient light or the like, noise is applied to the image pickup element of the image pickup unit 30, and the pattern is distorted by projecting the pattern onto the slope of the object or the background, the first region and the second region can be recognized, the feature point of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

Fig. 10 is a diagram showing examples of unit patterns, projected by the light projecting unit 20 of the three-dimensional measurement device 10 according to the present embodiment, that are formed of combinations of figures of different shapes. In this figure, the fifth, sixth, seventh, and eighth unit patterns U11, U12, U13, and U14 are illustrated. The fifth unit pattern U11, the sixth unit pattern U12, the seventh unit pattern U13, and the eighth unit pattern U14 are patterns including the first region S1 and the second region S2, and have a two-dimensional structure in which a rectangular lattice pattern is combined with a circle, so that each unit pattern is formed of a combination of figures of different shapes.

The fifth unit pattern U11 has a white first region S1 at the lower left and upper right and a black second region S2 from the upper left to the lower right. The sixth unit pattern U12 has a white first region S1 at the upper left and lower right and a black second region S2 from the lower left to the upper right. The seventh unit pattern U13 has a black first region S1 at the upper left and lower right and a white second region S2 from the lower left to the upper right. The eighth unit pattern U14 has a black first region S1 at the lower left and upper right and a white second region S2 from the upper left to the lower right.

Here, the area represented in white may be a bright area irradiated with light, and the area represented in black may be a dark area not irradiated with light. However, the area indicated by white may be a dark area not irradiated with light, and the area indicated by black may be a bright area irradiated with light. Further, the area represented in white may be, for example, an area irradiated with blue light, the area represented in black may be an area irradiated with red light, or the area represented in white may be an area irradiated with light linearly polarized in a first direction, and the area represented in black may be an area irradiated with light linearly polarized in a second direction orthogonal to the first direction.

Fig. 11 is a diagram showing examples of patterns, projected by the light projecting section 20 of the three-dimensional measurement apparatus 10 according to the present embodiment, that are formed of combinations of figures of different shapes and have different area ratios of the first region to the second region. In this figure, the seventh pattern P11, the eighth pattern P12, the ninth pattern P13, and the tenth pattern P14 are illustrated; they include the fifth unit pattern U11, the sixth unit pattern U12, the seventh unit pattern U13, and the eighth unit pattern U14 shown in fig. 10, and the area ratio of the first region to the second region is changed without changing the lengths of the four sides of the unit pattern. The seventh pattern P11 shows the unit pattern U and the feature point F extracted from the unit pattern U. In addition, the seventh pattern P11, the eighth pattern P12, the ninth pattern P13, and the tenth pattern P14 are each illustrated with a part of the pattern enlarged. The unit pattern U and the feature point F are labeled only in the seventh pattern P11, but the eighth pattern P12, the ninth pattern P13, and the tenth pattern P14 similarly include unit patterns, and feature points are extracted from them.

The seventh pattern P11 is an example in which the area ratio obtained by dividing the area of the first region by the area of the second region in the unit pattern U is 4/5 = 0.8. The eighth pattern P12 is an example in which this area ratio is 21/29 ≈ 0.724. The ninth pattern P13 is an example in which it is 3/5 = 0.6. The tenth pattern P14 is an example in which it is 5/13 ≈ 0.385.

Fig. 12 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern when the pattern of the other example (a pattern including a plurality of unit patterns whose two-dimensional structure combines a rectangular lattice pattern with a circle, i.e., a combination of figures of different shapes) is disturbed in the first form. In this figure, the solid-line graph G41 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G42 the case of 8 pixels, the one-dot chain graph G43 the case of 6 pixels, the two-dot chain graph G44 the case of 5 pixels, and the dotted-line graph G45 the case of 4 pixels.

When the luminance of the brightest pixel is denoted by M and that of the darkest pixel by L, the pattern disturbed in the first form is one in which the contrast is reduced so that M - L becomes 30, noise is added to the image based on a Gaussian distribution with a standard deviation of 5, and the image is smoothed based on a Gaussian distribution with a standard deviation of 1. That is, the pattern disturbed in the first form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like and noise arises in the image pickup element of the imaging unit 30.

As is apparent from graphs G41 to G45, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.3. It is also found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, if the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.7, the decoding success rate is 80% or more. It is found that, when the length of the short side of the unit pattern is 5 to 10 pixels, the decoding success rate is 90% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.8.

For example, the setting unit 45 may set the range of the area ratio of the first region and the second region of the unit pattern to approximately 0.4 to 0.8 when the length of the short side of the unit pattern is 5 pixels to 10 pixels, and set the range of the area ratio of the first region and the second region of the unit pattern to approximately 0.4 to 0.7 by narrowing the range when the length of the short side of the unit pattern is 4 pixels.

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle whose short side is 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is reduced by ambient light or the like and noise is applied to the image pickup element of the image pickup unit 30, the first region and the second region can be distinguished, the feature points of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

Fig. 13 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern when the pattern of the other example (a pattern including a plurality of unit patterns whose two-dimensional structure combines a rectangular lattice pattern with a circle, i.e., a combination of figures of different shapes) is disturbed in the second form. In this figure, the solid-line graph G51 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G52 the case of 8 pixels, the one-dot chain graph G53 the case of 6 pixels, the two-dot chain graph G54 the case of 5 pixels, and the dotted-line graph G55 the case of 4 pixels.

The pattern disturbed in the second form reproduces the following: noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is enlarged by 20% in the x-axis direction, and a 20° shear deformation is applied in the direction in which the pattern stretches, thereby disturbing the image. That is, the pattern disturbed in the second form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like, noise arises in the image pickup element of the imaging unit 30, and the pattern is distorted by being projected onto an inclined surface of the object or the background.

As is apparent from graphs G51 to G55, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.3. It was found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, the decoding success rate is almost 100% when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.7. It was found that, when the length of the short side of the unit pattern is 5 to 10 pixels, the decoding success rate is almost 100% when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.9.

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle having a shorter side of 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is reduced by ambient light or the like, noise is applied to the image pickup element of the image pickup unit 30, and the pattern is distorted by projecting the pattern onto the slope of the object or the background, the first region and the second region can be recognized, the feature point of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

Fig. 14 is a graph showing the relationship between the area ratio of the first region to the second region and the success rate of decoding data from the pattern when the pattern of the other example (a pattern including a plurality of unit patterns whose two-dimensional structure combines a rectangular lattice pattern with a circle, i.e., a combination of figures of different shapes) is disturbed in the third form. In this figure, the solid-line graph G61 shows the case where the length of the short side of the unit pattern on the image is 10 pixels, the broken-line graph G62 the case of 8 pixels, the one-dot chain graph G63 the case of 6 pixels, the two-dot chain graph G64 the case of 5 pixels, and the dotted-line graph G65 the case of 4 pixels.

The pattern disturbed in the third form reproduces the following: noise is added to the image based on a Gaussian distribution with a standard deviation of 5, the image is smoothed based on a Gaussian distribution with a standard deviation of 1, the image is reduced by 20% in the x-axis direction, and a 20° shear deformation is applied in the direction in which the pattern contracts, thereby disturbing the image. That is, the pattern disturbed in the third form reproduces image disturbance in which the contrast of the image is reduced by ambient light or the like, noise arises in the image pickup element of the imaging unit 30, and the pattern is distorted by being projected onto an inclined surface of the object or the background.

As is apparent from graphs G61 to G65, when the length of the short side of the unit pattern is any one of 4 to 10 pixels, the decoding success rate is 50% or more when the area ratio of the first region and the second region of the unit pattern is 0.3. It was found that, when the length of the short side of the unit pattern is any one of 4 pixels to 10 pixels, the decoding success rate is 90% or more when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.7. It was found that, when the length of the short side of the unit pattern is 5 to 10 pixels, the decoding success rate is almost 100% when the area ratio of the first region and the second region of the unit pattern is in the range of approximately 0.4 to 0.8.

In this way, by setting the area ratio of the first region and the second region included in the unit pattern to 0.3 or more and 0.9 or less and setting the unit pattern to a quadrangle having a shorter side of 3 pixels or more and 10 pixels or less on the image, even when the contrast of the image is reduced by ambient light or the like, noise is applied to the image pickup element of the image pickup unit 30, and the pattern is distorted by projecting the pattern onto the slope of the object or the background, the first region and the second region can be recognized, the feature point of the unit pattern can be extracted, and the data represented by the unit pattern can be decoded. Therefore, according to the three-dimensional measurement device 10 of the present embodiment, it is possible to measure the three-dimensional shape of the object with higher resolution while improving robustness against the variation in the imaging conditions.

According to fig. 7 to 9 and fig. 12 to 14, the success rate of decoding data can be made sufficiently high both for the pattern shown in fig. 4 (which includes a plurality of unit patterns whose two-dimensional structure places one relatively small square at the center of a lattice pattern formed of relatively large squares) and for the pattern of the other example shown in fig. 11 (which includes a plurality of unit patterns whose two-dimensional structure combines a rectangular lattice pattern with a circle), as long as the area ratio of the first region to the second region in the unit pattern is 0.3 or more and 0.9 or less and the unit pattern is a quadrangle whose short side is 3 pixels or more and 10 pixels or less on the image. The success rate of decoding data from a pattern is therefore not governed by the particular two-dimensional structure of the unit pattern; what matters is the area ratio of the first region to the second region in the unit pattern and the number of pixels of the short side of the unit pattern on the image.

Fig. 15 is a diagram illustrating examples of patterns, projected by the light projecting section 20 of the three-dimensional measurement apparatus 10 according to the present embodiment, that are formed of combinations of figures of different shapes and have different area ratios of the first region to the second region. For each of the cases where the length of the short side of the unit pattern on the image is 6 pixels, 5 pixels, and 4 pixels, the figure shows examples in which the area ratio S1/S2 obtained by dividing the area of the first region by the area of the second region is varied over 4/5, 21/29, 3/5, and 5/13 when the figure disposed at the center of the unit pattern is a square, and likewise when the figure disposed at the center of the unit pattern is a circle.

In each of the patterns shown in fig. 15, the area ratio S1/S2 obtained by dividing the area of the first region by the area of the second region of the unit pattern is 0.3 or more and 0.9 or less, and the unit pattern is a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image. By using such a pattern, the three-dimensional shape of the object can be measured with higher resolution while improving robustness against variation in imaging conditions.
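
The two design criteria can be combined into a simple validity check, as in this sketch (names are illustrative):

```python
def pattern_is_valid(area_ratio: float, short_side_px: int) -> bool:
    """True if the unit pattern satisfies both criteria stated above."""
    return 0.3 <= area_ratio <= 0.9 and 3 <= short_side_px <= 10
```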

Fig. 16 is a flowchart of the process of measuring the three-dimensional shape of the object executed by the three-dimensional measurement device 10 according to the present embodiment. Before the measurement process, calibration of the light projecting unit 20 and the imaging unit 30 may be performed: a test pattern may be projected by the light projecting unit 20 and captured by the imaging unit 30, the size on the image of the unit patterns included in the test pattern may be checked, and the amount of light projected by the light projecting unit 20 or the aperture and exposure time of the imaging unit 30 may be adjusted.

The three-dimensional measurement device 10 sets a range of an area ratio obtained by dividing the area of the first region by the area of the second region of the unit pattern, based on the number of pixels on the short side of the unit pattern in the image (S10). Here, the range of the area ratio may be set so that the smaller the number of pixels on the short side of the unit pattern on the image, the narrower the range.

The three-dimensional measurement device 10 causes the light projecting section 20 to project onto the object a pattern having a two-dimensional structure in which unit patterns, each including a first region and a second region with an area ratio within the set range, are spread in a lattice (S11). The three-dimensional measurement device 10 then captures an image of the object onto which the pattern is projected, using the imaging unit 30 (S12). Here, the imaging unit 30 may capture a single image of the object onto which the pattern is projected.

The three-dimensional measurement device 10 extracts feature points for each unit pattern included in the pattern, and decodes data encoded by the two-dimensional structure of the unit pattern (S13). Next, the three-dimensional measurement device 10 calculates the position of the three-dimensional point group based on the position of the feature point in the image and the decoded data (S14).

Then, the three-dimensional measurement device 10 matches the three-dimensional point group with the CAD model (S15). Finally, the three-dimensional measurement device 10 outputs the matching result (S16). The process of measuring the three-dimensional shape is completed in the above manner. The matching between the three-dimensional point group and the CAD model and the output of the result thereof may be omitted, and the three-dimensional measurement device 10 may output the calculated position of the three-dimensional point group to end the measurement process of the three-dimensional shape.
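
As a high-level sketch, the flow S10 to S16 can be written as follows. Every function name here is a placeholder standing in for a step described in the text, not an actual API of the device; area_ratio_range is the sketch given earlier.

```python
def measure_three_dimensional_shape(projector, camera, cad_model=None):
    short_side_px = estimate_unit_pattern_size(projector, camera)  # calibration
    lo, hi = area_ratio_range(short_side_px)                       # S10
    pattern = generate_pattern(area_ratio=(lo + hi) / 2.0)         # assumed
    projector.project(pattern)                                     # S11
    image = camera.capture()                                       # S12, one shot
    features, data = extract_and_decode(image)                     # S13
    points = triangulate(features, data)                           # S14
    if cad_model is not None:
        result = match_to_cad(points, cad_model)                   # S15
        return result                                              # S16
    return points
```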

4. Modification examples

<4.1>

Fig. 17 is a diagram showing an example of a unit pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the first modification of the present embodiment. In this figure, a first unit pattern group U20 including an example of eight unit patterns and a second unit pattern group U30 including an example of eight unit patterns are shown.

For the example unit pattern shown at the upper left of the first unit pattern group U20, the first region S1 and the second region S2 are labeled. As illustrated, the first region S1 may be separated into three or more regions within a unit pattern, with the second region S2 interposed therebetween. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

Likewise, for the example unit pattern shown at the upper left of the second unit pattern group U30, the first region S1 and the second region S2 are labeled. As illustrated, the first region S1 may be separated into three or more regions within a unit pattern, with the second region S2 interposed therebetween. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

Separating the first region S1 into three or more regions within the unit pattern, with the second region S2 interposed therebetween, makes the two-dimensional structure of the unit pattern relatively complex, increases the number of bits that the unit pattern can represent, and makes the position of the unit pattern easier to determine. Therefore, the number of unit patterns that must be decoded to identify the pattern sequence can be reduced while ensuring robustness against variations in imaging conditions, the processing time for window matching can be shortened, and the computational load of the image recognition used to measure the three-dimensional shape of the object can be reduced.

<4.2>

Fig. 18 is a diagram showing an example of a unit pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the second modification of the present embodiment. In this figure, a third unit pattern group U40 including thirteen example unit patterns is shown.

For the example unit pattern shown at the upper left of the third unit pattern group U40, the first region S1 and the second region S2 are labeled. As illustrated, the unit pattern may contain a two-dimensional shape in which the first region S1 is continuous without being separated. In this example, the first region S1 is a region surrounded by the second region S2. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

Since the first region S1 has a two-dimensional shape that is continuous without being separated, the two-dimensional structure of the unit pattern can be simplified, and the first region S1 and the second region S2 can be easily recognized.

In addition, for the example unit pattern shown at the lower left of the third unit pattern group U40, the first region S1 and the second region S2 are labeled. As illustrated, the first region S1 may be separated into three or more regions within a unit pattern, with the second region S2 interposed therebetween. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

The first regions S1 are separated into three or more regions in the unit pattern with the second regions S2 interposed therebetween, so that the two-dimensional structure of the unit pattern can be made relatively complicated, the number of bits that can be represented by the unit pattern can be increased, and the density of data encoded by the pattern can be further increased. Therefore, it is possible to reduce the number of unit patterns that need to be decoded in order to specify the pattern sequence while ensuring robustness against the variation in imaging conditions, and to reduce the computational load of image recognition for measuring the three-dimensional shape of the object.

As exemplified by the third unit pattern group U40, the pattern projected onto the object by the light projection unit 20 may include both unit patterns containing a two-dimensional shape in which the first region is continuous without being separated and unit patterns containing a two-dimensional shape in which the first region is separated with the second region interposed therebetween. This increases the variety of two-dimensional structures available to the unit patterns, increases the number of bits that a unit pattern can represent, and raises the density of the data encoded by the pattern. Therefore, the number of unit patterns that must be decoded to identify the pattern sequence can be reduced, the pattern on the image can be matched to the projected pattern with fewer calculations, and the computational load of the image recognition used to measure the three-dimensional shape of the object can be reduced.
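
The gain from enlarging the set of unit patterns can be quantified: a unit pattern drawn from an alphabet of k distinguishable patterns carries log2(k) bits, so a 2 × 2 coding region carries four times that. The counts below are illustrative (4 as in fig. 4; 8 and 13 as in the unit pattern groups of the modifications):

```python
import math

for alphabet in (4, 8, 13):
    bits = math.log2(alphabet)
    print(f"{alphabet} patterns: {bits:.2f} bits/unit, "
          f"{4 * bits:.1f} bits per 2x2 coding region")
```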

<4.3>

Fig. 19 is a diagram showing an example of a pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the third modification of the present embodiment. In this figure, four example coding regions A are shown; each coding region A is configured by arranging the unit patterns U51, U52, and U53 in a 2 × 2 lattice, or by arranging the unit patterns U54, U55, and U56 in a 2 × 2 lattice. One piece of data can be decoded from one coding region A, and, as shown in this example, four bits of information can be represented by four coding regions A.

For the unit pattern U51 shown at the upper left of the coding region A, the first region S1 and the second region S2 are labeled. As illustrated, the unit pattern may contain a two-dimensional shape in which the first region S1 is continuous without being separated. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

<4.4>

Fig. 20 is a diagram showing an example of a pattern projected by the light projecting section 20 of the three-dimensional measurement device 10 according to the fourth modification of the present embodiment. In this figure, four example coding regions A are shown; each coding region A is configured by arranging, in a 2 × 2 lattice, the unit patterns U61, U62, and U63, the unit patterns U64, U62, and U63, the unit patterns U65, U62, and U63, or the unit patterns U66, U62, and U63. One piece of data can be decoded from one coding region A, and, as shown in this example, four bits of information can be represented by four coding regions A.

For the unit pattern U61 shown at the upper left of the coding region A, the first region S1 and the second region S2 are labeled. As illustrated, the unit pattern may contain a two-dimensional shape in which the first region S1 is continuous without being separated. Further, the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2 may be 0.3 or more and 0.9 or less.

<4.5>

Fig. 21 is a diagram showing examples of unit patterns projected by the light projecting unit 20 of the three-dimensional measurement device 10 according to the fifth modification of the present embodiment. In this figure, examples of the unit pattern U70 and the unit pattern U71 are shown. The unit pattern U70 and the unit pattern U71 each include a grid-like lattice pattern, and the thicknesses of their lattice lines differ from each other. In the unit patterns U70 and U71, changing the thickness of the lattice lines changes the area ratio obtained by dividing the area of the first region S1 by the area of the second region S2.

In the unit pattern U70 and the unit pattern U71, the first region S1 is separated into three or more, specifically nine regions with the second region S2 interposed therebetween. In this example, the area ratio of the area of the first region S1 divided by the area of the second region S2 may be 0.3 to 0.9.
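
For such lattice unit patterns, the area ratio follows directly from the line width. The closed form below assumes a simple model (an assumption, not taken from the text): a square unit pattern of side a containing a 3 × 3 array of S1 cells of side c, separated and bordered by lattice lines of width w, so that a = 3c + 4w.

```python
def lattice_area_ratio(a: float, w: float) -> float:
    """S1/S2 for a 3 x 3 cell lattice unit pattern of side a, line width w."""
    c = (a - 4.0 * w) / 3.0   # side of one S1 cell
    s1 = 9.0 * c * c
    s2 = a * a - s1
    return s1 / s2

# Thicker lattice lines shrink the ratio; w must be chosen so that the
# ratio lands in the 0.3 to 0.9 window.
for w in (0.5, 1.0, 1.5):
    print(w, lattice_area_ratio(10.0, w))
```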

The first regions S1 are separated into three or more regions in the unit pattern with the second regions S2 interposed therebetween, so that the two-dimensional structure of the unit pattern can be made relatively complicated, the number of bits that can be represented by the unit pattern can be increased, and the density of data encoded by the pattern can be further increased. Therefore, it is possible to reduce the number of unit patterns that need to be decoded in order to specify the pattern sequence while ensuring robustness against the variation in imaging conditions, and to reduce the computational load of image recognition for measuring the three-dimensional shape of the object.

The embodiments described above are for easy understanding of the present invention, and are not intended to be restrictive. The elements included in the embodiments, their arrangement, materials, conditions, shapes, sizes, and the like are not limited to the examples, and may be appropriately changed. Also, the configurations shown in the different embodiments may be partially replaced or combined with each other.

[Supplementary note 1]

A three-dimensional assay device (10), comprising:

a light projection unit (20) that projects a pattern, which is obtained by encoding data by a two-dimensional structure, onto an object;

an image pickup unit (30) for picking up an image of the object on which the pattern is projected; and

a calculation unit (43) that extracts feature points of the pattern, calculates the position of a three-dimensional point group representing the three-dimensional shape of the object on the basis of the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of unit patterns of minimum units, each unit pattern of minimum units representing at least two bits, including the feature points, for calculating the position of the three-dimensional point group,

the unit pattern includes a first region and a second region which is distinguished from the first region and has an area larger than the first region,

an area ratio of an area of the first region divided by an area of the second region is 0.3 or more and 0.9 or less.

[Supplementary note 2]

The three-dimensional measurement device (10) according to supplementary note 1,

the unit pattern is a quadrangle having a short side of 3 pixels or more and 10 pixels or less on the image.

[Supplementary note 3]

The three-dimensional measurement device (10) according to supplementary note 1 or 2, wherein,

an area ratio of an area of the first region divided by an area of the second region on the image is 0.3 or more and 0.9 or less.

[Appendix 4]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 3, further comprising:

a setting unit (45) that sets the range of the area ratio of the pattern projected by the light projection unit (20) in accordance with the number of pixels on the short side of the unit pattern on the image.

[Appendix 5]

The three-dimensional measurement device (10) according to Appendix 4, wherein

the setting unit (45) sets the range of the area ratio to be narrower as the number of pixels on the short side of the unit pattern on the image becomes smaller.
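One possible reading of Appendices 4 and 5, sketched in Python: the patent states only that the range narrows as the short side spans fewer pixels, so the linear rule, the 0.6 midpoint, and the 0.05 slope below are invented for illustration.

```python
def area_ratio_range(short_side_px: int) -> tuple[float, float]:
    """Hypothetical setting rule: the allowed S1/S2 area-ratio range
    narrows toward a midpoint of 0.6 as the unit pattern's short side
    spans fewer pixels on the image (bounds stay within 0.3-0.9)."""
    margin = max(0.0, min(0.3, 0.05 * (short_side_px - 3)))
    return 0.6 - margin, 0.6 + margin

for px in (3, 5, 7, 10):
    lo, hi = area_ratio_range(px)
    print(px, round(lo, 2), round(hi, 2))
# 3 -> (0.6, 0.6), 5 -> (0.5, 0.7), 7 -> (0.4, 0.8), 10 -> (0.3, 0.9)
```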

[Appendix 6]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 5, wherein

the first region and the second region are distinguished by the brightness of light projected by the light projection unit (20).
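As a minimal sketch of how a decoder might separate the two regions by brightness and then apply the area-ratio condition of Appendix 1 (the mean-brightness threshold is an illustrative choice; the patent does not prescribe a binarization method):

```python
import numpy as np

def check_unit_pattern(patch: np.ndarray) -> bool:
    """patch: grayscale crop covering one unit pattern on the image.
    Splits pixels into the two regions by brightness and tests whether
    the S1/S2 area ratio lies in the claimed 0.3-0.9 range."""
    mask = patch > patch.mean()     # brighter pixels -> first region S1
    s1 = int(mask.sum())            # S1 area in pixels
    s2 = int((~mask).sum())         # S2 area in pixels
    return s2 > 0 and 0.3 <= s1 / s2 <= 0.9
```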

[Appendix 7]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 5, wherein

the first region and the second region are distinguished by the wavelength band of the light projected by the light projection unit (20).

[Appendix 8]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 5, wherein

the first region and the second region are distinguished by the polarization of the light projected by the light projection unit (20).

[Appendix 9]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 8, wherein

the unit pattern includes a two-dimensional shape in which the first region is continuous without being separated.

[Appendix 10]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 8, wherein

the unit pattern includes a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

[Appendix 11]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 8, wherein

the pattern includes: a unit pattern including a two-dimensional shape in which the first region is continuous without being separated, and a unit pattern including a two-dimensional shape in which the first region is separated with the second region interposed therebetween.

[Appendix 12]

The three-dimensional measurement device (10) according to Appendix 10 or 11, wherein

the first region is separated into two regions in the unit pattern with the second region interposed therebetween.

[Appendix 13]

The three-dimensional measurement device (10) according to Appendix 10 or 11, wherein

the first region is separated into three or more regions in the unit pattern with the second region interposed therebetween.

[Appendix 14]

The three-dimensional measurement device (10) according to any one of Appendices 1 to 13, wherein

the light projection unit (20) includes a modulation element that modulates the size of the projected pattern.

[Appendix 15]

A three-dimensional measurement method, comprising:

projecting a pattern obtained by encoding data by a two-dimensional structure onto an object;

capturing an image of the object on which the pattern is projected; and

extracting feature points of the pattern, and calculating the positions of a three-dimensional point group representing the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of unit patterns, each being a minimum unit that represents at least two bits, includes the feature points, and is used for calculating the position of the three-dimensional point group,

the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region,

an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less.
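Read as a processing pipeline, the steps of this method compose as in the skeleton below; every callable is a hypothetical placeholder for a stage named in Appendix 15, not a real API.

```python
def measure_3d(project, capture, extract_features, decode, triangulate):
    """Skeleton of the measurement flow in Appendix 15; all five
    callables are hypothetical placeholders."""
    project()                            # project the encoded pattern
    image = capture()                    # image the object with the pattern
    features = extract_features(image)   # feature points of unit patterns
    data = decode(image, features)       # decode the data at each feature
    return triangulate(features, data)   # positions of the 3-D point group
```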

[Appendix 16]

A three-dimensional measurement program for causing a computing unit included in a three-dimensional measurement device (10) to function as the calculation unit (43), the three-dimensional measurement device (10) comprising: a light projection unit (20) that projects a pattern, which is obtained by encoding data by a two-dimensional structure, onto an object; and an imaging unit (30) that captures an image of the object on which the pattern is projected,

wherein the calculation unit (43) extracts feature points of the pattern and calculates the positions of a three-dimensional point group representing the three-dimensional shape of the object based on the positions of the feature points in the image and the decoded data,

the pattern includes a plurality of unit patterns, each being a minimum unit that represents at least two bits, includes the feature points, and is used for calculating the position of the three-dimensional point group,

the unit pattern includes a first region and a second region that is distinguished from the first region and has an area larger than that of the first region,

an area ratio obtained by dividing the area of the first region by the area of the second region is 0.3 or more and 0.9 or less.
