Three-dimensional measurement system and three-dimensional measurement method

Document No.: 1256217; Publication date: 2020-08-21

Description: This technique, "Three-dimensional measurement system and three-dimensional measurement method", was devised by 大西康裕, 清水隆史, and 松本慎也 on 2019-02-14. Abstract: In order to provide an image processing system and method capable of improving measurement resolution and realizing high-speed processing, the image processing system of the present invention includes: an image pickup unit that has a first image pickup unit and a second image pickup unit disposed apart from each other, and that picks up mutually different images of an object; a first calculation unit that calculates the parallax of a first feature point by using at least one of the first image pickup unit and the second image pickup unit, and by using distance information of a three-dimensional measurement method different from the stereo camera method, or information for calculating a distance; and a second calculation unit that calculates the parallax of a second feature point based on a search result for a corresponding point of the second feature point by the stereo camera method using the first image pickup unit and the second image pickup unit, and that determines the three-dimensional shape of the object from the parallax of the first feature point and the parallax of the second feature point, the second calculation unit setting the search range based on the parallax of the first feature point.

1. A three-dimensional measurement system comprising:

an image pickup unit that has a first image pickup unit and a second image pickup unit disposed apart from each other, and that picks up mutually different images of an object;

a first calculation unit that calculates a parallax of a first feature point in the images by using at least one of the first image pickup unit and the second image pickup unit, and by using distance information of a three-dimensional measurement method different from a stereo camera method, or information for calculating a distance; and

a second calculation unit that searches for a corresponding point for a second feature point by a stereo camera method using the first image pickup unit and the second image pickup unit, calculates a parallax of the second feature point based on the search result, and specifies a three-dimensional shape of the object from the parallax of the first feature point and the parallax of the second feature point,

the second calculation unit sets a search range for the corresponding point of the second feature point based on the parallax of the first feature point.

2. The three-dimensional measurement system of claim 1, comprising:

a projecting unit that projects measurement light for determining the three-dimensional shape onto the object.

3. The three-dimensional measurement system of claim 1 or 2,

the first feature point and the second feature point are the same point or are located in the vicinity of each other.

4. The three-dimensional measurement system of any one of claims 1 to 3,

the three-dimensional measurement method different from the stereo camera method obtains three-dimensional information of the object from a single shot captured by the image pickup unit.

5. The three-dimensional measurement system of any one of claims 1 to 4,

the second calculation unit lowers a threshold value of a matching-degree index of the stereo camera method, as compared with a case where the search range is not set based on the parallax of the first feature point.

6. The three-dimensional measurement system of any one of claims 1 to 5,

the first calculation unit restores a three-dimensional point group representing a three-dimensional position of the first feature point,

when the three-dimensional point group is not restored for some of the first feature points, the second calculation unit sets the search range based on the parallax of the first feature point for a second feature point corresponding to a region of first feature points where the three-dimensional point group is restored, and sets the search range to a predetermined range for a second feature point corresponding to a region of first feature points where the three-dimensional point group is not restored.

7. The three-dimensional measurement system of any one of claims 1 to 6,

the first calculation unit includes: a first image processing unit that restores a three-dimensional point group indicating a three-dimensional position of the first feature point; and a second image processing unit that two-dimensionally projects the three-dimensional coordinates of each first feature point in the restored three-dimensional point group onto the image, obtains the two-dimensional coordinates of each first feature point, and calculates the parallax of each first feature point from the two-dimensional coordinates,

the second calculation unit includes: a third image processing unit that calculates a parallax of the second feature point by obtaining an estimated value of a parallax of the second feature point from a parallax of the first feature point, setting a search region of the corresponding point based on the estimated value of the parallax of the second feature point, and performing stereo matching between the second feature point and the corresponding point in the search region; and a fourth image processing unit configured to determine a three-dimensional shape of the object based on the parallax of the first feature point and the parallax of the second feature point.

8. The three-dimensional measurement system of any one of claims 2 to 7,

the distance between the optical axis of the first image pickup unit and the optical axis of the second image pickup unit is equal to the distance between the optical axis of the projection unit and the optical axis of the first image pickup unit or the optical axis of the second image pickup unit.

9. The three-dimensional measurement system of any one of claims 2 to 7,

the distance between the optical axis of the first image pickup unit and the optical axis of the second image pickup unit is longer than the distance between the optical axis of the projection unit and the optical axis of the first image pickup unit or the optical axis of the second image pickup unit.

10. The three-dimensional measurement system of any one of claims 2 to 9,

the optical axis of the projecting unit, the optical axis of the first image pickup unit, and the optical axis of the second image pickup unit are disposed on the same plane.

11. The three-dimensional measurement system of any one of claims 2 to 9,

the optical axis of the first image pickup unit and the optical axis of the second image pickup unit are arranged on the same plane, and the optical axis of the projection unit is not arranged on that plane.

12. The three-dimensional measurement system of any one of claims 2 to 11,

the projecting unit projects normal illumination light different from the measurement light onto the object.

13. A three-dimensional measurement method using a three-dimensional measurement system comprising: an image pickup unit having a first image pickup unit and a second image pickup unit disposed apart from each other; a first calculation unit; and a second calculation unit, the three-dimensional measurement method including the steps of:

the image pickup unit picks up mutually different images of the object;

the first calculation unit calculates a parallax of a first feature point in the image by using at least one of the first image pickup unit and the second image pickup unit and using distance information of a three-dimensional measurement method different from a stereo camera method or information for calculating a distance; and

the second calculation unit searches for a corresponding point for a second feature point by a stereo camera method using the first image pickup unit and the second image pickup unit, calculates a parallax of the second feature point based on the search result, and specifies a three-dimensional shape of the object from the parallax of the first feature point and the parallax of the second feature point,

in the step of determining the three-dimensional shape of the object, the second calculation unit sets a search range for the corresponding point of the second feature point based on the parallax of the first feature point.

Technical Field

The present disclosure relates to a three-dimensional measurement system and a three-dimensional measurement method.

Background

Conventionally, various methods for three-dimensional measurement of an object have been known. Considering the properties of light, these methods are roughly classified into methods using the straightness of light and methods using the speed of light. Among these, the methods using the straightness of light are further classified into active measurement and passive measurement, while the methods using the speed of light are classified as active measurement. As a specific example of the spatial coding pattern projection method, which is one active measurement method, non-patent document 1 describes a method using a so-called active one-shot scheme: pattern light containing a single, spatially coded pattern image is projected onto an object, the object onto which the pattern is projected is captured by an imaging device, and a distance is calculated to acquire the three-dimensional shape.

As an example of the passive measurement method, a method using the so-called stereo camera scheme is known, in which the three-dimensional shape of an object is obtained using two imaging devices. In the stereo camera method, imaging devices such as cameras are arranged on the left and right to capture the object simultaneously, pairs of corresponding pixels (that is, a feature point in a reference image and its corresponding point in a comparison image) are searched for in the obtained left and right images, the parallax (positional difference) between the feature point and the corresponding point in the left and right images is obtained, and the three-dimensional position of each pixel is calculated from the parallax, thereby specifying the three-dimensional shape of the object.
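To make the triangulation relationship concrete, the following is a minimal sketch (not taken from the cited literature) of disparity-to-depth conversion for a rectified stereo pair; the focal length and baseline values are assumed examples:

```python
# Minimal sketch of stereo triangulation on a rectified image pair.
# f_px and baseline_m are assumed example values, not from this disclosure.
f_px = 1200.0      # focal length in pixels
baseline_m = 0.05  # distance between the two camera centers [m]

def depth_from_disparity(disparity_px: float) -> float:
    """Depth Z of a matched pixel pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return f_px * baseline_m / disparity_px

# A feature point at x = 640 in the left image matched at x = 615 in the
# right image has disparity 25 px, giving Z = 1200 * 0.05 / 25 = 2.4 m.
print(depth_from_disparity(25.0))
```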

Disclosure of Invention

Problems to be solved by the invention

However, although the conventional stereo camera method can improve the measurement resolution down to the pixel unit of the imaging device, it has the problem that the measurement time is inherently long because of its principle, which requires searching for corresponding points over a wide range.

Accordingly, the present disclosure has been made in view of the above circumstances, and an object thereof is to provide a three-dimensional measurement technique that has high measurement resolution and can realize high-speed processing by narrowing the search range.

Means for solving the problems

In order to solve the problem, the present disclosure adopts the following structure.

That is, in summary, an example of the three-dimensional measurement system of the present disclosure acquires the parallax of first feature points as three-dimensional information of the object by a three-dimensional measurement method different from the stereo camera method, then acquires the parallax of second feature points as three-dimensional information of the object by the stereo camera method, performing stereo matching within a limited search range set based on that result, and specifies the three-dimensional shape of the object using both the acquired parallax of the first feature points and the acquired parallax of the second feature points. In other words, in this example, three-dimensional measurement by the stereo camera method is performed after three-dimensional measurement by a three-dimensional measurement method different from the stereo camera method.

In this configuration, the search range for the subsequent three-dimensional measurement by the stereo camera method can be made narrower than that of the conventional stereo camera method, based on the three-dimensional information obtained by the three-dimensional measurement method different from the stereo camera method; the processing can therefore be faster than the conventional approach of performing three-dimensional measurement by the stereo camera method alone. Moreover, since three-dimensional information acquired by the stereo camera method is used, high measurement resolution can be achieved.

[1] Specifically, an example of the three-dimensional measurement system of the present disclosure includes: an image pickup unit that has a first image pickup unit and a second image pickup unit disposed apart from each other, and that picks up mutually different images of the object; a first calculation unit that calculates a parallax of a first feature point in the images by using at least one of the first image pickup unit and the second image pickup unit, and by using distance information of a three-dimensional measurement method different from a stereo camera method, or information for calculating a distance; and a second calculation unit that searches for a corresponding point for a second feature point by the stereo camera method using the first image pickup unit and the second image pickup unit, calculates a parallax of the second feature point based on the search result, and specifies a three-dimensional shape of the object from the parallax of the first feature point and the parallax of the second feature point. Further, the second calculation unit sets a search range for the corresponding point of the second feature point based on the parallax of the first feature point.

In this configuration, by using at least one of the first image pickup unit and the second image pickup unit and calculating the parallax of the first feature points from the distance information of a three-dimensional measurement method different from the stereo camera method, or from information for calculating a distance, a parallax map having a relatively coarse density distribution corresponding to the spacing of the first feature points can be obtained. Then, by the stereo camera method, a search for the corresponding point of each second feature point is performed using the first image pickup unit and the second image pickup unit, and the parallax of each second feature point is calculated from the search result, whereby a parallax map having a relatively fine density distribution, in pixel units of the captured images, can be obtained. The three-dimensional shape of the object is then determined from these two parallax maps.

Here, in conventional image processing using only the stereo camera method, corresponding points must generally be searched for by stereo matching over a wide range (measurement distance) in the captured image, so the processing time inevitably becomes long. In contrast, in the above configuration, the search by the stereo camera method is performed based on the parallax of the first feature points obtained using distance information of a three-dimensional measurement method other than the stereo camera method, or information for calculating a distance (for example, phase information, wavelength information, or defocus information). Thus, the search range for the corresponding point of each second feature point can be limited to a far narrower range than in the conventional stereo camera method. As a result, the search time for stereo matching is greatly shortened and high-speed processing becomes possible. At the same time, the search for the second feature points and their corresponding points in stereo matching can be performed in pixel units of the captured images, so high measurement resolution can be achieved.

In the above configuration, the search range for obtaining the parallax of the second feature point may be set based on one of the parallaxes obtained for each of the first feature points, or the search range for obtaining the parallax of the second feature point may be set based on a plurality of parallaxes obtained for each of the first feature points. In the case where the search range is set based on a plurality of parallaxes, the search region can be more accurately defined than in the case where the search range is set based on one parallax, and as a result, further reduction in processing time and/or further improvement in stereo matching accuracy can be achieved.
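As an illustration of this idea (a sketch under assumed parameters, not the disclosed implementation), the search range for a second feature point can be derived from the disparities of one or more neighboring first feature points, falling back to the full range when none are available:

```python
import numpy as np

def disparity_search_range(neighbor_disparities, margin_px=3.0,
                           full_range=(0.0, 256.0)):
    """Set the corresponding-point search range for a second feature point.

    neighbor_disparities: disparities of first feature points near the
    second feature point (one value or several). If none are available,
    fall back to the full search range, as in a conventional stereo search.
    margin_px is an assumed tolerance, not a value from the disclosure.
    """
    d = np.atleast_1d(np.asarray(neighbor_disparities, dtype=float))
    if d.size == 0:
        return full_range
    # With several neighboring disparities the window can be bounded more
    # tightly than with a single estimate (cf. the discussion above).
    return (float(d.min()) - margin_px, float(d.max()) + margin_px)

print(disparity_search_range([24.8, 25.3, 25.1]))  # ~ (21.8, 28.3)
print(disparity_search_range([]))                  # full (0, 256) fallback
```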

[2] More specifically, the above structure may further include a projecting unit that projects measurement light for determining the three-dimensional shape onto the object.

Here, the "measurement light" is not particularly limited as long as it is projection light or illumination light used in various three-dimensional measurement methods, and examples thereof include pattern light having a predetermined fixed dot pattern, pattern light having a random dot pattern, and slit light. The structure is useful in the case where: as a three-dimensional measurement method different from the stereo camera method, an active measurement method is used in particular to project predetermined measurement light.

[3] In the above structure, the first feature point and the second feature point may be the same point or different points. In particular, if the first feature point and the second feature point are the same point or are located near each other, the search range for the second feature point can be defined more accurately, so the processing time can be further shortened and/or the stereo matching accuracy further improved.

As active measurement schemes using the straightness of light among the three-dimensional measurement schemes for an object, there are generally mentioned, for example: the spatial coding pattern projection method, the temporal coding pattern projection method, the moire topography (contour line) method, and the illuminance difference stereo method (varying irradiation direction; photometric stereo), which use triangulation as their basic principle; and the illuminance difference method (single irradiation / inverse square law) combined with a regression forest (random forest), the laser confocal method, the white-light confocal method, and the interference method, which use coaxial ranging as their basic principle. As passive measurement schemes using the straightness of light, there are, for example: the stereo camera method (including multi-baseline stereo), the volume intersection method (shape from silhouette), the factorization method, the depth from motion estimation (structure from motion) method, and the depth from shading method, which are based on the principle of triangulation; and the depth from focusing method, the depth from defocusing method, and the depth from zooming method, which are based on the principle of coaxial ranging. Further, as active measurement schemes using the speed of light, there are, for example, time-of-flight (TOF) methods based on simultaneous ranging, such as the laser-scanning TOF method, the single-shot TOF method, and TOF methods using radio waves, sound waves, or microwaves.

[4] In addition, the "three-dimensional measurement method different from the stereo camera method" as an example of the three-dimensional measurement system of the present disclosure may be applied without any limitation as long as it is a method other than the stereo camera method among the above-described methods, and among these methods, a method of obtaining three-dimensional information of the object by one shot by the image pickup unit may be used as a three-dimensional measurement method different from the stereo matching.

With this configuration, the time required for imaging in the three-dimensional measurement method different from the stereo camera method, which is performed before the three-dimensional measurement by the stereo camera method, can be shortened, and the processing time of the three-dimensional measurement system as a whole can be reduced.

[5] In the above configuration, the second calculation unit may lower the threshold value of the matching-degree index used in stereo matching by the stereo camera method, as compared with a case where the search range is not set based on the parallax of the first feature point.

With this configuration, because the search range has already been narrowed to a reliable region, the matching threshold can be relaxed without increasing false matches, and correct correspondences can still be found even where mutual reflection alters the appearance of the object. Excellent robustness against mutual reflection, which cannot be achieved by the conventional stereo camera method, can thus be realized.

[6] In the above configuration, the first calculation unit may restore a three-dimensional point group indicating the three-dimensional positions of the first feature points, and, when the three-dimensional point group is not restored for some of the first feature points, the second calculation unit may set the search range based on the parallax of the first feature point for a second feature point corresponding to a region of first feature points where the three-dimensional point group is restored, and set the search range to a predetermined range for a second feature point corresponding to a region of first feature points where the three-dimensional point group is not restored.

According to this configuration, even when the first three-dimensional measurement (three-dimensional measurement by a method different from the stereo camera method), performed before the three-dimensional measurement by the stereo camera method, fails to restore a three-dimensional point group for part of the object, it is not necessary to perform stereo matching with an expanded search range over the entire captured image, and the processing can still be accelerated.

[7] In the configuration, the first calculation portion may have: a first image processing unit that restores a three-dimensional point group indicating a three-dimensional position of the first feature point; and a second image processing unit that two-dimensionally projects the three-dimensional coordinates of each first feature point in the restored three-dimensional point group onto the image, obtains the two-dimensional coordinates of each first feature point, and calculates the parallax of each first feature point from the two-dimensional coordinates. Further, the second calculation portion may have: a third image processing unit that calculates a parallax of the second feature point by obtaining an estimated value of a parallax of the second feature point from a parallax of the first feature point, setting a search region of the corresponding point based on the estimated value of the parallax of the second feature point, and performing stereo matching between the second feature point and the corresponding point in the search region; and a fourth image processing unit configured to determine a three-dimensional shape of the object based on the parallax of the first feature point and the parallax of the second feature point.

In the above configuration, the calculation of the parallax of the first feature point and the calculation of the parallax of the second feature point can be appropriately performed, whereby high measurement resolution and high-speed processing of three-dimensional measurement can be more reliably realized.
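For orientation only, a hypothetical Python skeleton of the four-stage division of labor described in [7]; all class and method names are illustrative and not taken from the disclosure:

```python
# Hypothetical skeleton of the four-stage pipeline described above.
# Class and method names are illustrative, not from the disclosure.
class FirstCalculationUnit:
    def restore_point_cloud(self, img1, img2):        # first image processing
        """Restore 3D points of the first feature points (e.g. one-shot)."""
        ...

    def project_and_compute_disparity(self, points):  # second image processing
        """Project each 3D point into both images; disparity = u_L - u_R."""
        ...

class SecondCalculationUnit:
    def match_in_limited_range(self, img1, img2, coarse_map):  # third
        """Estimate each second feature point's disparity from nearby
        first feature points, then stereo-match inside that narrow range."""
        ...

    def merge_and_convert(self, coarse_map, fine_map):         # fourth
        """Merge the two disparity maps and convert disparity to depth."""
        ...
```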

[8] In the above configuration, the distance between the optical axis of the first imaging unit and the optical axis of the second imaging unit may be equal to the distance between the optical axis of the projecting unit (the optical axis of the measurement light) and the optical axis of the first imaging unit or the second imaging unit. Here, the "optical axis of the imaging unit" means, regardless of the configuration of the imaging unit's optical system, the optical path of a ray perpendicularly incident on the center of the imaging plane defined by the imaging element (in other words, the direction perpendicular to the imaging plane and passing through its center). Likewise, the "optical axis of the projection unit" means, regardless of the configuration of the projection unit's optical system, the optical path of a ray emitted perpendicularly from the center of the projection plane defined by the light source or light-emitting element (in other words, the direction perpendicular to the projection plane and passing through its center, or the direction in which the intensity of the projected light is maximized).

In the above configuration, the base line length of the first imaging unit and the second imaging unit can be equal to the base line length of the projecting unit and the first imaging unit or the second imaging unit, and therefore, the measurement accuracy of the three-dimensional measurement can be improved.

[9] In the above configuration, a distance between the optical axis of the first imaging unit and the optical axis of the second imaging unit may be longer than a distance between the optical axis of the projecting unit (optical axis of the measurement light) and the optical axis of the first imaging unit or the optical axis of the second imaging unit.

In the above configuration, the base line length of the first imaging unit and the second imaging unit can be larger than the base line length of the projection unit and the first imaging unit or the second imaging unit, and thus the measurement accuracy of the three-dimensional measurement can be improved.

[10] In the above configuration, the optical axis of the projecting unit (optical axis of the measurement light), the optical axis of the first imaging unit, and the optical axis of the second imaging unit may be arranged on the same plane.

This configuration can adopt either the arrangement of [8] or that of [9], and when the projecting unit is integrated with the first imaging unit and the second imaging unit to configure, for example, a sensor unit, the baseline length between the first imaging unit and the second imaging unit can be made relatively large, so the measurement accuracy of the stereo camera method can be further improved.

[11] In the above configuration, the optical axis of the first image pickup unit and the optical axis of the second image pickup unit may be arranged on the same plane, and the optical axis of the projection unit (the optical axis of the measurement light) may not be arranged on the plane.

This configuration can likewise adopt either the arrangement of [8] or that of [9], and when the projecting unit is integrated with the first imaging unit and the second imaging unit to configure, for example, a sensor unit, the footprint of the sensor unit can be made relatively small, reducing the installation area of the system.

[12] In the above configuration, the projection unit may project normal illumination light different from the measurement light to the object. For convenience, in the embodiments described below, a component that projects measurement light onto an object may be referred to as a "first projection unit", and a component that projects normal illumination light onto an object may be referred to as a "second projection unit".

In this configuration, since normal illumination light different from the measurement light can be used, for example, as general inspection illumination, three-dimensional measurement can be performed appropriately even when the object is in a dark ambient environment. Further, by comparing an image of the object illuminated with the normal illumination against shape design data (computer-aided design (CAD) model data) of the object that is set or held in advance (so-called CAD matching), the position and orientation of the object can be grasped more accurately, for example.

[13] An example of the three-dimensional measurement method of the present disclosure is a method that can be effectively implemented by an example of the three-dimensional measurement system having the above configuration, and includes the following steps. That is, the method uses a three-dimensional measurement system comprising: an image pickup unit having a first image pickup unit and a second image pickup unit disposed apart from each other; a first calculation unit; and a second calculation unit, and includes the steps of: the image pickup unit picks up mutually different images of the object; the first calculation unit calculates a parallax of a first feature point in the images by using at least one of the first image pickup unit and the second image pickup unit, and by using distance information of a three-dimensional measurement method different from a stereo camera method, or information for calculating a distance; and the second calculation unit searches for a corresponding point for a second feature point by the stereo camera method using the first image pickup unit and the second image pickup unit, calculates a parallax of the second feature point based on the search result, and determines a three-dimensional shape of the object from the parallax of the first feature point and the parallax of the second feature point. In the step of determining the three-dimensional shape of the object, the second calculation unit sets a search range for the corresponding point of the second feature point based on the parallax of the first feature point.

In the present disclosure, the terms "section", "part", "device", and "system" do not merely mean physical components; they also include cases where the functions of the "section", "part", "device", or "system" are realized by software. Further, the function of one "section", "part", "device", or "system" may be realized by two or more physical components or devices, and the functions of two or more "sections", "parts", "devices", or "systems" may be realized by one physical component or device.

Advantageous Effects of Invention

According to the present disclosure, the measurement resolution of three-dimensional measurement can be improved, and high-speed processing can be achieved by narrowing the search range. Further, by the high-speed processing, the processing speed of the entire system can be increased, the memory capacity can be saved, the data amount of communication can be reduced, and the processing reliability can be improved.

Drawings

Fig. 1 is a schematic plan view schematically showing an example of an application scenario of the three-dimensional measurement system according to the embodiment.

Fig. 2 is a plan view schematically showing an example of a hardware configuration of the three-dimensional measurement system according to the embodiment.

Fig. 3 is a plan view schematically showing an example of a functional configuration of the three-dimensional measurement system according to the embodiment.

Fig. 4 is a flowchart showing an example of a processing procedure of the three-dimensional measurement system according to the embodiment.

Fig. 5 is a timing chart showing an example of a processing procedure of the three-dimensional measurement system according to the embodiment.

Fig. 6(A) and 6(B) show, respectively, an image obtained by imaging an example of an object with the three-dimensional measurement system according to the embodiment, and a three-dimensional point group image restored by performing the first image processing using that image.

Fig. 7(A) shows a parallelized image of an image obtained by imaging an example of an object with the three-dimensional measurement system of the embodiment using the first camera, together with a partially enlarged view of an image in which the three-dimensional coordinates of the restored three-dimensional points are two-dimensionally projected onto that parallelized image. Fig. 7(B) shows the corresponding parallelized image obtained with the second camera, together with a partially enlarged view of the same two-dimensional projection.

Fig. 8 is a conceptual diagram schematically showing a parallax map having a relatively coarse density distribution corresponding to the intervals of a plurality of first feature points.

Fig. 9(a) and (B) are an image showing an example of a merged parallax map obtained by the three-dimensional measurement system according to the embodiment for an example of an object, and a three-dimensional point group image showing a three-dimensional shape restored by using the image, respectively.

Fig. 10 shows a list of images of parallax maps obtained as results of three-dimensional measurement by various methods for various objects.

Fig. 11(a) to (D) are perspective views schematically showing first to fourth configuration examples of the sensor unit according to the embodiment, respectively.

Fig. 12(a) and (B) are perspective views schematically showing a fifth configuration example and a sixth configuration example of the sensor unit according to the embodiment, respectively.

Fig. 13(a) to (C) are plan views schematically showing ninth structural examples to eleventh structural examples of the sensor unit of the embodiment, respectively.

Detailed Description

Hereinafter, an embodiment (hereinafter also referred to as "embodiment") of the present disclosure will be described with reference to the drawings. However, the embodiments described below are merely examples, and are not intended to exclude the application of various modifications or techniques not explicitly described below. That is, an example of the present disclosure can be implemented by being variously modified within a scope not departing from the gist thereof. In the following description of the drawings, the same or similar portions are denoted by the same or similar reference numerals, and the drawings are schematic and do not necessarily correspond to actual dimensions, ratios, and the like. Further, the drawings may include portions having different dimensional relationships or ratios from each other.

§1 Application example

First, an example of a scenario to which an example of the present disclosure is applied will be described with reference to fig. 1. Fig. 1 is a schematic plan view schematically showing an example of an application scenario of a three-dimensional measurement system 100 according to the present embodiment. The three-dimensional measurement system 100 of the present embodiment is a system for measuring the three-dimensional shape of an object OB to be measured.

In the example of fig. 1, the three-dimensional measurement system 100 includes a sensor unit 200 disposed to face the object OB, and a computer 300 connected to the sensor unit 200. The sensor unit 200 is configured by integrating a three-dimensional (3D) projector 110 with, for example, a first camera 210 and a second camera 220 arranged so as to sandwich the 3D projector 110. The 3D projector 110, the first camera 210, and the second camera 220 need not all be integrated as the sensor unit 200; they may be provided separately or only partially integrated.

The 3D projector 110 projects illumination including measurement light (for example, pattern light) for three-dimensional measurement of the object OB (hereinafter also referred to as "3D illumination") onto the object OB over the projection region S110. The first camera 210 and the second camera 220 each include, for example, a camera equipped with an ordinary optical sensor, and image the object OB, onto which the 3D illumination is projected, at angles of view S210 and S220, respectively.

The computer 300 controls the projection process by the 3D projector 110 and the image capturing process by the first camera 210 and the second camera 220, executes image processing of images Img1 and Img2 captured by the first camera 210 and the second camera 220, respectively, and specifies the three-dimensional shape of the object OB.

More specifically, the computer 300 performs the first to fourth image processing shown in (1) to (4) below as the image processing of the images Img1 and Img2.

(1) First image processing

The first image processing is, for example, three-dimensional measurement by the active one-shot method: from at least one of the images Img1 and Img2, it restores a three-dimensional point group indicating the three-dimensional positions of a plurality of target pixels corresponding to the pattern contained in the measurement light (pattern light). In this case, the target pixels are, for example, distributed at a density coarser than the pixel pitch of the images Img1 and Img2. When a passive measurement method is used as the three-dimensional measurement method different from the stereo camera method, the projection of the measurement light may be omitted, that is, the 3D projector 110 need not be included.

(2) Second image processing

Three-dimensional coordinates of each point (restored three-dimensional point) in a three-dimensional point group restored from a plurality of first feature points (target pixels) are two-dimensionally projected onto the images Img1 and Img2 (including images obtained by subjecting these images to appropriate processing as necessary, the same applies hereinafter), two-dimensional coordinates of each first feature point in each of the images Img1 and Img2 are obtained, and the parallax of each first feature point is calculated from the two-dimensional coordinates. Thus, a parallax map having a relatively coarse density distribution corresponding to the intervals of the plurality of first feature points can be obtained.
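A minimal sketch of this projection step, assuming pinhole projection matrices P1 and P2 for the two (parallelized) cameras; the random values exist only to keep the example self-contained:

```python
import numpy as np

def project_points(P, X):
    """Project Nx3 points X to 2D with a 3x4 pinhole projection matrix P."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])  # homogeneous coordinates
    uvw = (P @ Xh.T).T
    return uvw[:, :2] / uvw[:, 2:3]                # perspective divide

rng = np.random.default_rng(0)
P1 = rng.standard_normal((3, 4))  # stand-ins for the calibrated matrices
P2 = rng.standard_normal((3, 4))
X = rng.standard_normal((5, 3))   # restored 3D points (first feature points)

u1 = project_points(P1, X)        # 2D coordinates in image Img1
u2 = project_points(P2, X)        # 2D coordinates in image Img2
coarse_disparity = u1[:, 0] - u2[:, 0]  # horizontal parallax per point
print(coarse_disparity)
```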

(3) Third image processing

This third image processing is three-dimensional measurement by the stereo camera method, and performs stereo matching using one of the images Img1 and Img2 as a reference image and the other as a comparison image. First, in the reference image, an estimated value of the parallax between a second feature point and its corresponding point is obtained based on the parallax of first feature points existing at predetermined positions in the vicinity of the second feature point. Then, based on this estimated parallax, the search region for the corresponding point of the second feature point is set in a limited manner (i.e., the search region is restricted to a narrow range). Stereo matching is then performed in that search region, and the parallax of the second feature point is calculated. In this way, a parallax map having a relatively fine density distribution, in pixel units of the images Img1 and Img2, is obtained so as to complement the relatively coarse parallax map obtained in (2) above. The search range for obtaining the parallax of a second feature point may be set based on one of the parallaxes obtained for the first feature points, or based on a plurality of those parallaxes (see also the description of fig. 8 below).
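A compact sketch of such a limited-range block match, here using zero-mean normalized cross-correlation (ZNCC) as the matching-degree index; the patch size and tolerance are assumed values, and the disclosure does not fix a particular similarity measure:

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equally sized patches."""
    a = a - a.mean(); b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else -1.0

def match_limited(ref, cmp_, x, y, d_est, d_tol=3, half=3):
    """Search the comparison image only within d_est +/- d_tol pixels
    instead of the whole epipolar line; returns (best disparity, score).
    d_est is an integer disparity estimate; x, y are assumed to lie at
    least `half` pixels inside the image border (sketch-level checks only).
    """
    patch = ref[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_s = None, -1.0
    for d in range(max(0, d_est - d_tol), d_est + d_tol + 1):
        xc = x - d                       # candidate corresponding column
        if xc - half < 0 or xc + half >= cmp_.shape[1]:
            continue
        cand = cmp_[y - half:y + half + 1, xc - half:xc + half + 1]
        s = zncc(patch, cand)
        if s > best_s:
            best_d, best_s = d, s
    return best_d, best_s
```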

(4) Fourth image processing

The three-dimensional shape of the object OB is determined by merging the parallax maps obtained in (2) and (3) above to create a merged parallax map, applying post-processing such as appropriate filtering to the merged parallax map as necessary, and converting the parallax of each pixel for which a parallax has been obtained (the first feature points and the second feature points) into a distance in the depth direction (so-called parallax-depth conversion).
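A possible sketch of the merge and parallax-depth conversion, assuming NaN marks pixels without a parallax value; f_px and baseline are placeholders, not calibration values from the disclosure:

```python
import numpy as np

def merge_and_convert(coarse, fine, f_px, baseline):
    """Merge two parallax maps (NaN = no value) and convert to depth."""
    merged = np.where(np.isnan(fine), coarse, fine)  # fine map takes priority
    with np.errstate(divide="ignore", invalid="ignore"):
        depth = f_px * baseline / merged             # parallax-depth conversion
    depth = np.where(np.isfinite(depth), depth, np.nan)
    return merged, depth

coarse = np.array([[25.0, np.nan], [24.5, np.nan]])  # from (2), sparse
fine = np.array([[np.nan, 25.2], [24.6, 25.4]])      # from (3), dense
merged, depth = merge_and_convert(coarse, fine, f_px=1200.0, baseline=0.05)
print(merged); print(depth)
```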

As described above, the sensor unit 200 corresponds to an example of a unit integrating the "projecting part" ("first projecting part") and the "image pickup part" of the present disclosure, the 3D projector 110 corresponds to an example of the "projecting part" ("first projecting part"), and the first camera 210 and the second camera 220 correspond to examples of the "first image pickup part" and the "second image pickup part", respectively. The images Img1 and Img2 correspond to an example of the "mutually different images" of the present disclosure. Further, the part of the computer 300 that performs the image processing (for example, an image processing unit 350 described later) corresponds to an example of the "first calculation unit" (comprising the "first image processing unit" and the "second image processing unit") and the "second calculation unit" (comprising the "third image processing unit" and the "fourth image processing unit") of the present disclosure.

As described above, the present embodiment can be regarded as an example of a hybrid three-dimensional measurement system, and a corresponding method, that fuses three-dimensional measurement by a three-dimensional measurement method different from the stereo camera method (for example, but not limited to, an active one-shot method that projects pattern light onto the object as measurement light), which can obtain the three-dimensional shape of the object OB, with three-dimensional measurement by the stereo camera method, which obtains the three-dimensional shape of the object OB by stereo matching. However, as the problems of the prior art described above show, simply combining the two conventional schemes cannot both improve the measurement resolution and realize high-speed processing.

In contrast, the present embodiment does not simply combine the stereo camera method with a three-dimensional measurement method different from it; rather, by using the three-dimensional information (the parallax of the first feature points) obtained by the three-dimensional measurement method different from the stereo camera method, the search region for the corresponding points of the second feature points in stereo matching can be limited to a far narrower range than in the normal stereo camera method.

In other words, according to the present embodiment, first, parallax information on other pixels (second feature points and their corresponding points) between the first feature points for which parallax information is obtained in a three-dimensional measurement method different from the stereo camera method can be supplemented by the stereo camera method. In addition, by limiting the search area for stereo matching at this time to a reliable narrow range, an extremely short processing time can be achieved. As a result, the search time and processing time for stereo matching can be greatly shortened while maintaining the high measurement resolution obtained by the stereo camera system. Further, by defining the search region in this way, excellent robustness against mutual reflection can be achieved, and also mismatching between pixels in stereo matching can be reduced.

§2 Configuration example

[Hardware configuration]

Next, an example of the hardware configuration of the three-dimensional measurement system 100 according to the present embodiment will be described with reference to fig. 2. Fig. 2 is a plan view schematically showing an example of the hardware configuration of the three-dimensional measurement system 100 according to the present embodiment.

In the example of fig. 2, the three-dimensional measurement system 100 includes: a sensor unit 200 integrally configured with the 3D projector 110 and the first and second cameras 210 and 220, which are also illustrated in fig. 1; and a computer 300 connected to the sensor unit 200.

The 3D projector 110 includes, for example, a laser light source 111, a pattern mask 112, and a lens 113. Light emitted from the laser light source 111 is converted by the pattern mask 112 into measurement light (pattern light) having a predetermined pattern, and is projected to the outside through the lens 113. The wavelength of the laser light generated by the laser light source 111 is not particularly limited; for example, infrared light, visible light, or ultraviolet light can be used. The pattern mask 112 has the predetermined pattern formed in it. The 3D projector 110 is not limited to the above configuration, and a general-purpose projector may be used instead; in that case, data of the predetermined pattern may be stored in the projector body or the like. The configuration of the 3D projector 110 is thus not restricted to the above, and any projection apparatus including an appropriate light source and other optical components used in the various three-dimensional measurement methods can be applied.

The computer 300 includes a control arithmetic unit 301, a communication Interface (I/F) unit 302, a storage unit 303, an input unit 304, and an output unit 305, which are communicably connected to each other via a bus 306.

The control arithmetic unit 301 includes a central processing unit (CPU), a random access memory (RAM), a read only memory (ROM), and the like, and controls each component and performs various arithmetic operations in accordance with the information processing.

The communication I/F unit 302 is, for example, a communication module for communicating with other devices by wire or wirelessly. The communication method used by the communication I/F unit 302 to communicate with other devices is arbitrary; examples include a local area network (LAN) and a universal serial bus (USB). In particular, the 3D projector 110, the first camera 210, and the second camera 220 of the sensor unit 200 may be provided so as to be communicable with the control arithmetic unit 301 and the like via the communication I/F unit 302.

The storage unit 303 is an auxiliary storage device such as a hard disk drive (HDD) or a solid state drive (SSD), and stores data such as the various programs executed by the control arithmetic unit 301 (a control program for hardware such as the sensor unit 200, an image processing program, and the like), the measurement light (for example, pattern light having a predetermined pattern), and the captured images Img1 and Img2. CAD model data of the object OB and the like may also be stored in the storage unit 303. Here, the image processing program includes a program for executing the first to fourth image processing described in the above application example, and the control arithmetic unit 301 realizes the image processing functions of the functional configuration example described later by executing the image processing program.

The input unit 304 is an interface device that can be realized by, for example, a mouse, a keyboard, a touch panel, or the like, and receives various input operations from a user. The output unit 305 is an interface device such as a display, a speaker, or a printer for notifying a user of the three-dimensional measurement system 100 of various information by display, sound, printing, or the like.

[Functional configuration]

Next, an example of the functional configuration of the three-dimensional measurement system 100 according to the present embodiment will be described with reference to fig. 3. Fig. 3 is a plan view schematically showing an example of a functional configuration of the three-dimensional measurement system 100 according to the present embodiment.

The control arithmetic unit 301 of the three-dimensional measurement system 100 shown in fig. 2 expands various programs (control programs, image processing programs, and the like) stored in the storage unit 303 in the RAM. Next, the control arithmetic unit 301 interprets and executes various programs developed in the RAM by the CPU to control the respective components. Thus, as illustrated in fig. 3, the three-dimensional measurement system 100 according to the present embodiment can realize a configuration including the control unit 310, the image acquisition unit 320, the image recording unit 330, the image output unit 340, and the image processing unit 350.

The control unit 310 controls projection of 3D illumination onto the object OB from the 3D projector 110 of the sensor unit 200 and imaging of the object OB with the projected 3D illumination by the first camera 210 and the second camera 220, among other things. The image acquisition unit 320 acquires the images Img1 and Img2 of the object OB captured by the first camera 210 and the second camera 220. The image recording unit 330 holds the image Img1 of the object OB acquired by the image acquisition unit 320, the image Img2, a three-dimensional point group image representing the three-dimensional shape of the object OB finally obtained by the image processing in the image processing unit 350, and the like. The image output unit 340 outputs the three-dimensional point group image representing the three-dimensional shape of the object OB obtained in this way to a display, a printer, or the like so as to be visible to the user of the three-dimensional measurement system 100.

The image processing unit 350 includes a first image processing unit 351, a second image processing unit 352, a third image processing unit 353, and a fourth image processing unit 354. The first image processing unit 351, the second image processing unit 352, the third image processing unit 353, and the fourth image processing unit 354 perform the first image processing to the fourth image processing shown in (1) to (4), respectively, and obtain a three-dimensional point group image representing the three-dimensional shape of the object OB.

In the present embodiment, an example has been described in which each function realized by the computer 300 included in the three-dimensional measurement system 100 is realized by a general-purpose CPU; however, some or all of these functions may be realized by one or more dedicated processors. Note that components of the functional configuration of the computer 300 may be omitted, replaced, or added as appropriate depending on the embodiment. The term "computer" here means a general information processing apparatus.

§3 Operation example

Next, an example of the operation of the three-dimensional measurement system 100 will be described with reference to fig. 4 to 8. Fig. 4 is a flowchart showing an example of the processing procedure of the three-dimensional measurement system 100 according to the present embodiment, which is also an example of the processing procedure of a three-dimensional measurement method using the three-dimensional measurement system 100. Fig. 5 is a timing chart showing an example of the same processing procedure. The processing procedure described below is only an example, and each step may be modified to the extent possible; steps may also be omitted, replaced, or added as appropriate depending on the embodiment. Each "time t" mentioned below indicates a start or end timing of the processing shown in fig. 5.

(Start)

First, the user of the three-dimensional measurement system 100 starts up the three-dimensional measurement system 100, and causes the started-up three-dimensional measurement system 100 to execute various programs (a control program, an image processing program, and the like). Then, the controller 310 of the computer 300 controls the operations of the sensor unit 200 and the computer 300 in accordance with the following processing procedure, and performs image processing of the image Img1 and the image Img2 of the object OB.

(step S10)

First, in step S10, the relative arrangement of the object OB and the sensor unit 200 is adjusted as necessary, and the conditions for projecting the 3D illumination from the 3D projector 110 and the imaging conditions of the first camera 210 and the second camera 220 are set. Then, based on an appropriate trigger (timing signal) at time t1, the 3D illumination including the pattern light is projected from the 3D projector 110 onto the object OB from time t1 to time t2.

(step S20)

Next, in step S20, the first camera 210 and the second camera 220 capture the object OB between time t1 and time t2, while the 3D illumination is projected, and the images Img1 and Img2 obtained by the first camera 210 and the second camera 220 are read out from time t1 to time t3. From time t3 to time t5, the images Img1 and Img2 are each transmitted to the computer 300, sequentially or in parallel. Fig. 5 illustrates the case where the image Img1 is transferred between time t3 and time t4, and the image Img2 between time t4 and time t5.

(step S30)

Next, in step S30, the first image processing of (1) above (e.g., three-dimensional measurement by the active one-shot method) is performed from time t4 to time t6 using at least one of the images Img1 and Img2, and a three-dimensional point group indicating the three-dimensional positions of the plurality of first feature points corresponding to the pattern contained in the measurement light (pattern light) is restored from that image. Fig. 4 and 5 illustrate the case of using the image Img1, but the image Img2, or both images, may be used instead. Fig. 6(A) and 6(B) show, respectively, an image Img1 obtained by imaging an example of the object OB (a metal workpiece) with the three-dimensional measurement system 100, and a three-dimensional point cloud image Img3D1 restored by performing the first image processing of step S30 using the image Img1.

(step S40)

Next, in step S40, the following processing is performed from time t6 to time t7. First, standard parallelization (rectification) processing of the images Img1 and Img2 is performed, and the second image processing of (2) above (projection of the restored three-dimensional point group) is carried out using the parallelized images (Img1' and Img2'). In this way, the three-dimensional coordinates of each first feature point (restored three-dimensional point) in the restored three-dimensional point group (i.e., the three-dimensional coordinates of each point in fig. 6(B)) are two-dimensionally projected onto the images Img1 and Img2. The two-dimensional projection onto the image Img1 may also be performed substantially simultaneously with the processing of the image Img1 in step S30, between time t4 and time t6.

Fig. 7(A) shows a partially enlarged view of the parallelized image Img1' (substantially equivalent to fig. 6(A)) of the image Img1 obtained by imaging an example of the object OB (a metal workpiece) with the three-dimensional measurement system 100, and of an image Img10 obtained by two-dimensionally projecting the three-dimensional coordinates of the restored three-dimensional points onto the parallelized image Img1'. Similarly, fig. 7(B) shows a partially enlarged view of the parallelized image Img2' of the image Img2, and of an image Img20 obtained by two-dimensionally projecting the three-dimensional coordinates of the restored three-dimensional points onto the parallelized image Img2'. Each "+" mark in the images Img10 and Img20 corresponds to the two-dimensional projection of a restored three-dimensional point (target pixel).

Next, the parallax of each restored three-dimensional point between the images Img10 and Img20 is calculated as the parallax of each of the plurality of first feature points (target pixels) between the images Img1 and Img2. Here, the target pixels G10 to G13 marked in the image Img10 of fig. 7(A) correspond to the target pixels G20 to G23 marked in the image Img20 of fig. 7(B), respectively, and the parallax of each first feature point (target pixel) is calculated from the two-dimensional coordinates of the corresponding pair of target pixels. A parallax map having a relatively coarse density distribution corresponding to the intervals of the plurality of first feature points is thereby obtained.

Here, a more specific example of the parallelization processing of the image Img1 and the image Img2 and the numerical processing for obtaining two-dimensional coordinates by two-dimensionally projecting three-dimensional coordinates of restored three-dimensional points on the parallelized image will be described below.

First, in the parallelization processing, the origin coordinates of the first camera 210 are set as the reference coordinates. Next, assuming that the lens distortion of the first camera 210 and the second camera 220 has been removed from the images Img1 and Img2 in advance so that a linear model applies, the relationship between a three-dimensional point X_A obtained by the active single shot method, the corresponding two-dimensional coordinate point U_L in the image Img1 obtained by the first camera 210, and the corresponding two-dimensional coordinate point U_R in the image Img2 obtained by the second camera 220 can be modeled by the following formulas (1) and (2).

[ number 1]

U_L = K_L (I | 0) X_A … (1)

U_R = K_R (R_2 | t_2) X_A … (2)

In the formulas, K_L represents the intrinsic matrix of the first camera 210, K_R represents the intrinsic matrix of the second camera 220, and the rotation matrix R_2 and the translation vector t_2 represent the pose of the second camera 220 with respect to the first camera 210.

Next, so that corresponding pixels in the images Img1 and Img2 appear at the same vertical position when the first camera 210 and the second camera 220 are disposed horizontally, the two cameras are virtually rotated so as to be ideally parallel to each other. In the parallelization processing, using a rotation matrix R_rect satisfying the condition expressed by the following formula (3) and a virtual camera matrix K_rect, the first camera 210 is virtually rotated by R_rect·R_2 and the second camera 220 is virtually rotated by R_rect.

[ number 2]

Then, from the relationships expressed by the following formulas (4) and (5), the corresponding two-dimensional coordinates U'_L in the parallelized image Img1' and the corresponding two-dimensional coordinates U'_R in the parallelized image Img2' can be obtained for a restored three-dimensional point X_A obtained by a three-dimensional measurement method (for example, the active single shot method) different from the stereo camera method.

[ number 3]

U'_L = K_rect R_rect R_2 (I | 0) X_A … (4)

U'_R = K_rect R_rect (R_2 | t_2) X_A … (5)
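For reference, formulas (4) and (5) can be sketched numerically as follows (Python with numpy; a minimal illustration, not part of the claimed configuration, assuming the calibration quantities K_rect, R_rect, R_2, and t_2 are known, and with hypothetical function names). Since the rows of the parallelized images coincide, the coarse parallax of each first feature point is simply the difference of the horizontal coordinates of its two projections.

    import numpy as np

    def project(K, R, t, X):
        # Project a 3D point X (shape (3,)) with rotation R, translation t
        # and camera matrix K; returns inhomogeneous pixel coordinates (u, v).
        p = K @ (R @ X + t)
        return p[:2] / p[2]

    def disparities_of_restored_points(points_3d, K_rect, R_rect, R2, t2):
        # For each restored 3D point X_A (given in the first camera's frame),
        # evaluate formulas (4) and (5) and take the horizontal difference as
        # the coarse parallax of that first feature point.
        out = []
        for X_A in points_3d:
            uL = project(K_rect, R_rect @ R2, np.zeros(3), X_A)   # formula (4)
            uR = project(K_rect, R_rect @ R2, R_rect @ t2, X_A)   # formula (5)
            out.append((uL, uR, uL[0] - uR[0]))
        return out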

(step S50)

Next, in step S50, using the image Img1 and the image Img2, the third image processing described in (3) above (three-dimensional measurement by the stereo camera method) is performed from time t6 to time t7, the same period as step S40. Stereo matching between an arbitrary second feature point in a reference image and the corresponding point for that second feature point in a comparison image is performed, with one of the image Img1 and the image Img2 (the image Img1' and the image Img2' when the parallelization processing is performed) as the reference image and the other as the comparison image. The image Img2 may also be used as the reference image, with the image Img1 as the comparison image.

First, a second feature point in the reference image (for example, the image Img1') is extracted, and an estimated value of the parallax between the second feature point and its corresponding point is calculated based on the parallaxes of the first feature points existing at positions in a predetermined vicinity of the second feature point.

Here, fig. 8 is a conceptual diagram schematically showing the parallax map obtained in step S40, whose density distribution is relatively coarse, corresponding to the intervals between the plurality of first feature points. In fig. 8, each region divided into rows and columns corresponds to a unit pixel in the reference image, and an arbitrarily extracted pixel GA serving as a second feature point is shown together with, for example, the target pixels G10 to G13 serving as first feature points that exist in its vicinity and whose parallaxes in the parallax map are known (see fig. 7(A)). An example of a method of calculating an estimated value of the parallax between the pixel GA serving as the second feature point and its corresponding point is described below; however, the estimation method is not limited to this example. In addition, although the case where the first feature points and the second feature point are different points existing in the vicinity of one another is described here as an example, a first feature point and a second feature point may be the same point, and even when they are different points, they need not exist in the vicinity of each other. Furthermore, when a first feature point and a second feature point are different points and the accuracy of the distance information of the first feature point obtained by the three-dimensional measurement method other than the stereo camera method is sufficient, the calculation can be sped up by omitting the process of obtaining the distance information of the first feature point again by the stereo camera method.

That is, when the parallax of a pixel serving as a first feature point existing in the vicinity of the pixel GA serving as the second feature point is d, the parallax d_GA of the pixel GA can be estimated, for example, to fall within a range satisfying the following formula (6). In formula (6), Δd represents a margin that can be set as appropriate (the same applies hereinafter).

[ number 4]

d - Δd ≦ d_GA ≦ d + Δd … (6)

For example, as shown in fig. 8, when a plurality of target pixels G10 to G13 serving as first feature points whose parallaxes have been obtained exist in the vicinity of the pixel GA serving as the second feature point, the idea of formula (6) can be extended, and the parallax d_GA of the pixel GA can be estimated, for example, to fall within a range satisfying the following formula (7). In formula (7), min(d_n) denotes the operation of selecting the smallest d_n, max(d_n) denotes the operation of selecting the largest d_n, and d_0 to d_3 denote the respective parallaxes of the pixels G10 to G13 serving as the first feature points.

[ number 5]

min(d_0, d_1, d_2, d_3) - Δd ≦ d_GA ≦ max(d_0, d_1, d_2, d_3) + Δd … (7)
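As an illustration of formula (7), the following minimal sketch (Python; the helper name is hypothetical) computes the restricted search range of the parallax d_GA from the parallaxes of the neighboring first feature points:

    import numpy as np

    def disparity_search_range(neighbor_disparities, margin):
        # Formula (7): bound the unknown parallax d_GA of a second feature
        # point by the min/max parallax of nearby first feature points
        # (e.g. d_0..d_3 of the pixels G10..G13) plus the margin Δd.
        d = np.asarray(neighbor_disparities, dtype=float)
        return d.min() - margin, d.max() + margin

    # e.g. neighbors with parallaxes 41, 43, 42, 44 px and Δd = 2 px:
    d_lo, d_hi = disparity_search_range([41, 43, 42, 44], margin=2.0)
    # the corresponding point is then searched only over parallaxes 39..46 px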

Based on the parallax d_GA of the pixel GA (second feature point) estimated in this way, the search region for the corresponding point of the second feature point is set restrictively (that is, the search region is limited to a narrow range). Next, stereo matching between the parallelized image Img1' and the parallelized image Img2' is performed within the search region, and the true parallax of the pixel GA serving as the second feature point is calculated. Before the stereo matching, preprocessing such as appropriate filtering may be applied to the parallelized images Img1' and Img2' as necessary. Here, the search range for obtaining the parallax of the second feature point is set based on a plurality of (four) parallaxes among the parallaxes obtained for the first feature points, but the present invention is not limited to this; the search range may also be set based on one, two, three, or five or more of those parallaxes.

By performing the above processing on a plurality of second feature points (for example, all pixels other than the pixels serving as first feature points in fig. 8), a parallax map whose density distribution is relatively fine, in units of pixels of the images Img1 and Img2, is obtained so as to complement the relatively coarse parallax map of fig. 8 (for example, the parallaxes of all pixels other than the pixels G10 to G13 serving as first feature points in fig. 8 are calculated).
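The embodiment does not fix a particular matching cost; as one hedged illustration, the following sketch performs SAD (sum of absolute differences) block matching on the parallelized images, restricted to the estimated range [d_lo, d_hi] from formula (7). All names are illustrative.

    import numpy as np

    def match_in_range(imgL, imgR, x, y, d_lo, d_hi, half=3):
        # Search the comparison image only over columns x - d with d in
        # [d_lo, d_hi]; thanks to the parallelization, the search stays on
        # the single row y. Returns the best integer parallax for the
        # reference pixel (x, y), or None near the image border.
        h, w = imgL.shape
        if not (half <= y < h - half and half <= x < w - half):
            return None
        ref = imgL[y-half:y+half+1, x-half:x+half+1].astype(np.int32)
        best_d, best_cost = None, np.inf
        for d in range(int(np.floor(d_lo)), int(np.ceil(d_hi)) + 1):
            xr = x - d                      # candidate column in the comparison image
            if xr - half < 0 or xr + half >= w:
                continue
            cand = imgR[y-half:y+half+1, xr-half:xr+half+1].astype(np.int32)
            cost = np.abs(ref - cand).sum() # SAD matching cost
            if cost < best_cost:
                best_d, best_cost = d, cost
        return best_d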

(step S60)

Then, in step S60, from time t7 to time t8, the parallax information obtained in steps S30 and S50 is merged; that is, the parallax map having the relatively coarse density distribution obtained through steps S30 and S40 and the parallax map having the relatively fine density distribution obtained in step S50 are merged to obtain a merged parallax map (for example, a map in which the parallaxes of all the pixels shown in fig. 8 are known). Next, after post-processing such as appropriate filtering is applied to the merged parallax map as necessary, the three-dimensional shape of the object OB is determined by converting the parallaxes of all the pixels, first and second feature points alike, into distances in the depth direction (so-called disparity-depth conversion).
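The disparity-depth conversion itself follows the standard relation for a parallelized (rectified) stereo pair, Z = f·B/d, with focal length f (in pixels) and baseline B between the two cameras; a minimal sketch, assuming f and B are known from the calibration:

    import numpy as np

    def disparity_to_depth(disparity_map, f_px, baseline):
        # Z = f * B / d per pixel; pixels without a valid parallax
        # (d <= 0) are marked NaN. The depth is returned in the same
        # units as the baseline (e.g. millimetres).
        d = np.asarray(disparity_map, dtype=float)
        depth = np.full_like(d, np.nan)
        valid = d > 0
        depth[valid] = f_px * baseline / d[valid]
        return depth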

Then, the obtained three-dimensional point group image representing the three-dimensional shape of the object OB is output to a display, a printer, or the like as needed so as to be visible to the user of the three-dimensional measurement system 100, and the series of processes ends. Here, fig. 9(A) and fig. 9(B) respectively show an image ImgM representing an example of the merged parallax map obtained by the three-dimensional measurement system 100 for the example of the object OB (a metal workpiece; the magnitude of the parallax is schematically shown in gray scale), and a three-dimensional point group image Img3DM representing the three-dimensional shape restored using the image ImgM.

4 Actions and effects

As described above, the example of the three-dimensional measurement system and the three-dimensional measurement method according to the present embodiment provides a hybrid three-dimensional measurement system, and a corresponding method, in which three-dimensional measurement by a three-dimensional measurement method different from the stereo camera method is merged with three-dimensional measurement by the stereo camera method, in which the three-dimensional position of the object can be obtained by stereo matching.

Moreover, in the example of the present embodiment, the three-dimensional measurement method different from the stereo camera method and the stereo camera method are not simply combined; rather, by using the three-dimensional information (the parallaxes of the first feature points) obtained by the method different from the stereo camera method, the search region for the corresponding points of the second feature points in stereo matching can be limited to a particularly narrow range compared with the normal stereo camera method.

In other words, according to the example of the present embodiment, attention can be focused on the pixels lying between the first feature points obtained by the three-dimensional measurement method different from the stereo camera method, the parallax information for those pixels (the second feature points and their corresponding points) can be supplemented by the stereo camera method, and the search region for the stereo matching performed at that time can be limited to a reliable narrow range, so that an extremely short processing time is achieved. As a result, the search time for stereo matching by the stereo camera method, which has a high measurement resolution in units of pixels, and the total processing time can be significantly shortened (as shown in fig. 5, the processing of steps S40 and S50 takes only the very short time from time t6 to time t7). Further, by restricting the search region in this way, excellent robustness against mutual reflection can be achieved, and mismatching between pixels in the stereo matching can also be reduced.

Here, fig. 10 shows, in table form, a list of images of the parallax maps obtained as a result of three-dimensional measurement by various methods for various objects OB having different surface properties. In every case, the sensor unit 200 of the three-dimensional measurement system 100 according to the present embodiment is used as the sensor unit, and the two captured images Img1 and Img2 (with the pattern removed) are also shown in fig. 10. For convenience, the result of image processing using only the active single shot method and the result of image processing using only the stereo camera method are referred to as "comparative example 1" and "comparative example 2", respectively, and the result of image processing by the three-dimensional measurement system 100 according to the present embodiment is referred to as "example 1". As the objects OB, an object (a) without texture, an object (b) with a shape edge, an object (c) with texture, and a workpiece (d) with regular reflection were selected. In fig. 10, in the column of the parallax-map image obtained by each method for each object OB, an "o" mark is given to an image showing a good result, and an "x" mark is given to an image not showing a good result.

As shown in fig. 10, in comparative example 1 (image processing by the active single shot method only), good results were obtained for the object (a) without texture and the workpiece (d) with regular reflection, but not for the object (b) with a shape edge and the object (c) with texture. In contrast, in comparative example 2 (image processing by the stereo camera method only), good results were obtained for the object (b) with a shape edge and the object (c) with texture, but not for the object (a) without texture and the workpiece (d) with regular reflection. By comparison, in example 1 (image processing by the three-dimensional measurement system 100 according to the present embodiment), good results were obtained for every object OB. These results illustrate the superiority of the three-dimensional measurement system 100 according to the present embodiment over the other methods (in particular, its high robustness against differences in the surface properties of the object OB). As described above, it has been confirmed that the example of the present embodiment effectively combines the differences in measurable regions that arise from the different measurement principles of the three-dimensional measurement method different from the stereo camera method (for example, the active single shot method) and the stereo camera method, and thereby achieves advantageous effects compared with the conventional methods.

5 Modification examples

While embodiments serving as examples of the present disclosure have been described in detail, the foregoing description is in every respect merely illustrative of the present disclosure, and it goes without saying that various improvements and modifications can be made without departing from the scope of the present disclosure. For example, the modifications described below are possible. In the following, the same reference numerals are used for components identical to those of the above embodiment, and their descriptions are omitted as appropriate. The following modifications may also be combined as appropriate.

<5.1>

For example, in the first image processing of step S30 in the operation example of the above embodiment, the case where only the image Img1 of the images Img1 and Img2 is used was described, but the first image processing of (1) may instead be performed using only the image Img2 of the images Img1 and Img2, or using both the image Img1 and the image Img2.

In particular, with the configuration using both the image Img1 and the image Img2, two parallax maps having relatively coarse density distributions can be obtained, so the accuracy and/or precision of the estimation of the unknown parallax of a feature point can be improved. With the configuration that selectively uses whichever of the two images is relatively better, the accuracy of the relatively coarse parallax map itself can be improved, so in this case as well the accuracy and/or precision of the estimation of the unknown parallax of a feature point can be improved.

The cameras included in the sensor unit 200 are not limited to one first camera 210 and one second camera 220 (two cameras in total); at least one of the first camera 210 and the second camera 220 may be plural (three or more cameras in total). Likewise, the images Img1 and Img2 captured by the cameras are not limited to one image each and may be plural images. With these configurations, the range of choices of images for obtaining the relatively coarse parallax map is widened, so the accuracy and/or precision of the estimation of the unknown parallax of a feature point can be further improved, and the choice of the two images used in the stereo camera method is also widened, so the accuracy and/or precision of the finally determined three-dimensional shape can be further improved.

<5.2>

Here, configuration examples of the relative geometric arrangement of the 3D projector 110 and the first and second cameras 210 and 220 in the sensor unit 200 will be described with reference to fig. 11(A) to (D).

<5.2.1>

Fig. 11(A) is a perspective view schematically showing a first configuration example of the sensor unit 200. In the first configuration example, the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 are arranged on the same plane P1, and the optical axis P11 of the 3D projector 110 is not arranged on the plane P1. In other words, in the first configuration example, the optical axis P11 of the 3D projector 110 is arranged at a position off the virtual plane P1 defined by the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220. In the first configuration example, the distance between the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 is equal to both the distance between the optical axis P11 of the 3D projector 110 and the optical axis P21 of the first camera 210 and the distance between the optical axis P11 of the 3D projector 110 and the optical axis P22 of the second camera 220.

In the first configuration example, the baseline length (distance) between the first camera 210 and the second camera 220 is equal to each of the baseline length between the 3D projector 110 and the first camera 210 and the baseline length between the 3D projector 110 and the second camera 220, so the measurement accuracy of the three-dimensional measurement can be improved. In this case, the measurement accuracy obtained by the active single shot method using the first camera 210 can be made equal to that obtained by the active single shot method using the second camera 220, which is useful when three-dimensional measurement is performed by the active single shot method using both cameras. Further, the area occupied by the sensor unit 200 can be kept relatively small, so the installation area of the three-dimensional measurement system can be reduced.

<5.2.2>

Fig. 11(B) is a perspective view schematically showing a second configuration example of the sensor unit 200. In the second configuration example, the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 are arranged on the same plane P1, and the optical axis P11 of the 3D projector 110 is not arranged on the plane P1. In other words, the second configuration example is also configured such that the optical axis P11 of the 3D projector 110 is disposed at a position different from the position on the virtual plane P1 defined by the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220. In the second configuration example, the distance between the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 is longer than the distance between the optical axis P11 of the 3D projector 110 and the optical axis P21 of the first camera 210.

In the second configuration example, the baseline length between the first camera 210 and the second camera 220 is larger than the baseline length between the 3D projector 110 and the first camera 210, so the measurement accuracy of the three-dimensional measurement can be improved. In this case, the measurement accuracy by the active single shot method using the second camera 220 can be made higher than in the first configuration example (fig. 11(A)), which is useful when three-dimensional measurement is performed by the active single shot method using only one of the cameras. Further, the area occupied by the sensor unit 200 can be kept relatively small, so the installation area of the three-dimensional measurement system can be reduced.

<5.2.3>

Fig. 11(C) is a perspective view schematically showing a third configuration example of the sensor unit 200. In the third configuration example, the optical axis P11 of the 3D projector 110 is arranged on the virtual plane P2 defined by the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 (that is, all of the optical axes are arranged on the same plane). In the third configuration example, the distance between the optical axis P11 of the 3D projector 110 and the optical axis P21 of the first camera 210 is equal to the distance between the optical axis P11 of the 3D projector 110 and the optical axis P22 of the second camera 220.

In the third configuration example, the baseline length between the first camera 210 and the second camera 220 is larger than both the baseline length between the 3D projector 110 and the first camera 210 and the baseline length between the 3D projector 110 and the second camera 220, so the measurement accuracy of the three-dimensional measurement can be improved. Further, since the baseline length between the 3D projector 110 and the first camera 210 is equal to that between the 3D projector 110 and the second camera 220, the measurement accuracy by the active single shot method using the first camera 210 can be made equal to that using the second camera 220, which is useful when three-dimensional measurement is performed by the active single shot method using both cameras. Moreover, since the baseline length between the first camera 210 and the second camera 220 can be made larger than in, for example, the first configuration example (fig. 11(A)) or the second configuration example (fig. 11(B)), the measurement accuracy by the stereo camera method can be further improved provided the other measurement parameters and algorithms are the same, which is useful when the measurement accuracy by the stereo camera method is to be raised even at the cost of a larger installation area for the sensor unit 200.

<5.2.4>

Fig. 11(D) is a perspective view schematically showing a fourth configuration example of the sensor unit 200. The fourth configuration example is also configured such that the optical axis P11 of the 3D projector 110 is disposed on a virtual plane P2 defined by the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 (that is, all of the optical axes are disposed on the same plane P2). In the fourth configuration example, the distance between the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 is equal to the distance between the optical axis P11 of the 3D projector 110 and the optical axis P21 of the first camera 210.

In the fourth configuration example, the baseline length between the first camera 210 and the second camera 220 is equal to the baseline length between the 3D projector 110 and the first camera 210, so the measurement accuracy of the three-dimensional measurement can be improved. In this case, the measurement accuracy by the active single shot method using the second camera 220 can be made higher than in the third configuration example (fig. 11(C)), which is useful when three-dimensional measurement is performed by the active single shot method using only one of the cameras and the measurement accuracy of the active single shot method is to be further improved.

<5.3>

Next, configuration examples in which a two-dimensional (2D) projector 120 for projecting normal illumination light onto the object OB is additionally arranged in the sensor unit 200 will be described with reference to fig. 12(A) and (B). As described above, the 2D projector 120 corresponds to an example of the "projecting part" ("second projecting part") of the present disclosure, the 3D projector 110 and the 2D projector 120 together correspond to an example of the "projecting part" of the present disclosure, and the sensor unit 200 having the 2D projector 120 also corresponds to an example of the "projecting part" of the present disclosure.

<5.3.1>

Fig. 12(A) is a perspective view schematically showing a fifth configuration example of the sensor unit 200. The fifth configuration example adds a 2D projector 120 to the second configuration example shown in fig. 11(B). In the fifth configuration example, the optical axis P21 of the first camera 210 and the optical axis P22 of the second camera 220 are arranged on the same plane P1, the optical axis P11 of the 3D projector 110 and the optical axis P12 of the 2D projector 120 are arranged on the same plane P3 different from the plane P1, and the plane P1 is parallel to the plane P3.

In the fifth configuration example, the 2D projector 120 can be used, for example, as general illumination for inspection, so the three-dimensional shape can be measured appropriately even when the object OB is in a dark surrounding environment. Furthermore, by acquiring an image of the object OB onto which the normal illumination is projected from the 2D projector 120 and comparing it with shape design data (CAD model data) of the object OB set in advance or held in, for example, the storage unit 303 or the image recording unit 330 of the computer 300 (so-called CAD matching), the position and orientation of the object OB can be grasped more accurately.

In the fifth configuration example, the configuration of the sensor unit 200 other than the 2D projector 120 is the same as in the second configuration example, so the measurement accuracy of the three-dimensional measurement can be improved as in the second configuration example. In this case, the measurement accuracy by the active single shot method using the second camera 220 can be made higher than in the first configuration example (fig. 11(A)), which is useful when three-dimensional measurement is performed by the active single shot method using only one of the cameras. Further, the area occupied by the sensor unit 200 can be kept relatively small, so the installation area of the three-dimensional measurement system can be reduced.

<5.3.2>

Fig. 12(B) is a perspective view schematically showing a sixth configuration example of the sensor unit 200. The sixth configuration example has a configuration in which a 2D projector 120 is additionally provided to the third configuration example shown in fig. 11 (C). In the sixth configuration example, the optical axis P12 of the 2D projector 120 is arranged at a position on a virtual plane P4 defined by the optical axis P11 of the 3D projector 110, the optical axis P21 of the first camera 210, and the optical axis P22 of the second camera 220 (that is, all of them are arranged on the same plane P4), and the 2D projector 120 is arranged between the 3D projector 110 and the second camera 220.

In the sixth configuration example, as in the fifth configuration example, the 2D projector 120 can be used, for example, as general illumination for inspection, so the three-dimensional shape can be measured appropriately even when the object OB is in a dark surrounding environment. Furthermore, by acquiring an image of the object OB onto which the normal illumination is projected from the 2D projector 120 and comparing it with shape design data (CAD model data) of the object OB set in advance or held in, for example, the storage unit 303 or the image recording unit 330 of the computer 300 (so-called CAD matching), the position and orientation of the object OB can be grasped more accurately.

In the sixth configuration example, the baseline length between the first camera 210 and the second camera 220 is larger than both the baseline length between the 3D projector 110 and the first camera 210 and the baseline length between the 3D projector 110 and the second camera 220, so the measurement accuracy of the three-dimensional measurement can be improved. In this case, the measurement accuracy by the active single shot method using the second camera 220 can be made higher than in the third configuration example (fig. 11(C)), which is useful when three-dimensional measurement is performed by the active single shot method using only one of the cameras and the measurement accuracy of the active single shot method is to be further improved. Moreover, since the baseline length between the first camera 210 and the second camera 220 can be made even larger than in, for example, the third configuration example (fig. 11(C)), the measurement accuracy by the stereo camera method can be further improved provided the other measurement parameters and algorithms are the same, which is useful when the measurement accuracy by the stereo camera method is to be raised even at the cost of a still larger installation area for the sensor unit 200.

<5.4>

The arrangements of the sensor unit 200 in the first to sixth configuration examples are illustrated above as structures useful in the case where measurement information is obtained by a measurement method based on triangulation as the three-dimensional measurement method different from the stereo camera method, and stereo matching by the stereo camera method is performed based on that measurement information.

On the other hand, as an embodiment using another three-dimensional measurement method, there is a seventh configuration example in which measurement information is obtained by a measurement method based on the principle of coaxial ranging (various TOF measurement methods and the like) as the three-dimensional measurement method different from the stereo camera method, and stereo matching by the stereo camera method is performed based on that measurement information. In the seventh configuration example, any of the sensor units 200 shown in fig. 11(A) to (D) and fig. 12(A) and (B) may be used; among these, the arrangements shown in fig. 11(C) and fig. 12(B), in which the baseline between the 3D projector 110 (whose illumination light is used for the time measurement) and the first camera 210 is extremely short (they are close to each other) and the baseline between the first camera 210 and the second camera 220 is extremely long (they are far from each other), can further improve the measurement accuracy, and the arrangement of fig. 12(B) is particularly useful.

<5.5>

In the above embodiment, a configuration was described in which, for example, the distance is measured by the active single shot method using one (a single one) of the first camera 210 and the second camera 220, and the parallax of the first feature point is then obtained. On the other hand, as an embodiment using another three-dimensional measurement method, there is an eighth configuration example in which, in a state where the pattern light is projected onto the object OB, a spatial code is specified using both the first camera 210 and the second camera 220, and the parallax of the first feature point is thereby obtained. The measurement according to the eighth configuration example can be implemented by a spatial coding pattern projection method and/or a temporal coding pattern projection method.
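As a hedged illustration of the idea behind the eighth configuration example (in its temporal coding variant), the following sketch decodes a per-pixel code from a sequence of binary pattern images captured by each camera and matches equal codes along a parallelized row; pattern design and robust thresholding are omitted, and all names are hypothetical.

    import numpy as np

    def decode_temporal_code(frames, threshold=128):
        # Each captured frame contributes one bit per pixel; the stack of
        # bits forms an integer code identifying the projected stripe.
        code = np.zeros(np.asarray(frames[0]).shape, dtype=np.int32)
        for f in frames:
            bit = (np.asarray(f) > threshold).astype(np.int32)
            code = (code << 1) | bit
        return code

    def disparities_from_codes(codeL, codeR, y):
        # On one parallelized row y, pixels carrying the same code in both
        # cameras correspond; their column difference is the parallax.
        # (Duplicate codes are naively overwritten here; a real system
        # must handle such ambiguities.)
        rowL, rowR = codeL[y], codeR[y]
        pos = {int(c): x for x, c in enumerate(rowR)}
        return [(x, x - pos[int(c)]) for x, c in enumerate(rowL) if int(c) in pos]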

<5.6>

Further, configuration examples of the projection area S110 of the 3D illumination projected from the 3D projector 110 of the sensor unit 200 will be described with reference to fig. 13(A) to (C).

<5.6.1>

Fig. 13(A) is a plan view schematically showing a ninth configuration example of the sensor unit 200. The ninth configuration example includes a 3D projector 110 whose projection area S110 of the 3D illumination covers the overlapping portion (common field of view) of the angle of view S210 of the first camera 210 and the angle of view S220 of the second camera 220. The ninth configuration example is particularly suitable for the case where, in the three-dimensional measurement system 100 of the embodiment, the images Img1 and Img2 are captured during a single projection of the 3D illumination and the hybrid three-dimensional measurement is performed using the images Img1 and Img2.

<5.6.2>

Fig. 13(B) is a plan view schematically showing a tenth configuration example of the sensor unit 200. The tenth configuration example includes a 3D projector 110 whose projection area S110 of the 3D illumination covers the entirety of either the angle of view S210 of the first camera 210 or the angle of view S220 of the second camera 220 (single field of view). The tenth configuration example is useful, in the three-dimensional measurement system 100 according to the embodiment, for capturing the image used in the active single shot method; in this case, the image used in the stereo camera method may be captured under other illumination (for example, a 2D projector) with a different projection area.

<5.6.3>

Fig. 13(C) is a plan view schematically showing an eleventh configuration example of the sensor unit 200. The eleventh configuration example includes a 3D projector 110 whose projection area S110 of the 3D illumination covers the entirety of both the angle of view S210 of the first camera 210 and the angle of view S220 of the second camera 220 (multiple fields of view). The eleventh configuration example can suitably handle both of the following cases: the case where, in the three-dimensional measurement system 100 of the embodiment, the images Img1 and Img2 are captured during a single projection of the 3D illumination and the hybrid three-dimensional measurement is performed using them; and the case where the image used in the active single shot method and the images used in the stereo camera method are captured separately.

<5.7>

In the above embodiment, in step S30, a three-dimensional point group indicating the three-dimensional positions of the plurality of first feature points corresponding to the pattern included in the measurement light (pattern light) is restored using at least one of the image Img1 and the image Img2; however, depending on the shape of the object OB and the measurement conditions, it is conceivable that the three-dimensional point group cannot be restored for some of the first feature points. For such a case there is, for example, a twelfth configuration example, in which the search range for stereo matching is set to a narrow range, based on the parallaxes of the first feature points, for the second feature points corresponding to regions of first feature points for which the three-dimensional point group was restored, and is set to a predetermined range for the second feature points corresponding to regions of first feature points for which the three-dimensional point group was not restored. The hardware configuration of the twelfth configuration example may be the same as those of the other embodiments and configuration examples.

According to the twelfth configuration example, even when the three-dimensional point group cannot be restored for a part of the object OB by the first three-dimensional measurement performed before the three-dimensional measurement by the stereo camera method (that is, the three-dimensional measurement by the three-dimensional measurement method different from the stereo camera method, for example, the active single shot method), high-speed processing can be realized without performing stereo matching over an expanded search range covering the entirety of the images Img1 and Img2.
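A minimal sketch of this selection logic (Python; the predetermined full range would be a system parameter, and all names are illustrative):

    def search_range_for(neighbor_disparities, full_range, margin):
        # Twelfth configuration example: if first feature points with a
        # restored 3D point exist near the second feature point, restrict
        # the search to their parallax range as in formula (7); otherwise
        # fall back to the predetermined range, e.g. (0, max_disparity).
        if neighbor_disparities:
            return (min(neighbor_disparities) - margin,
                    max(neighbor_disparities) + margin)
        return full_range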

6 Additional notes

The embodiments described above are for the purpose of facilitating understanding of the present disclosure, and are not intended to be restrictive. The elements included in the embodiments, their arrangement, materials, conditions, shapes, sizes, and the like are not limited to the examples, and may be appropriately changed. Also, the structures shown in different embodiments may be partially replaced or combined with each other.

(attached note 1)

A three-dimensional measurement system (100), comprising:

an image pickup unit (200) which has a first image pickup unit (210) and a second image pickup unit (220) disposed apart from each other and which picks up images (Img1, Img2) of the Object (OB) that are different from each other;

a first calculation unit (350) that calculates the parallaxes (d_0 to d_3) of first feature points in the images by using at least one of the first imaging unit (210) and the second imaging unit (220), and using distance information of a three-dimensional measurement method different from the stereo camera method or information for calculating a distance; and

a second calculation unit (350) that searches for a corresponding point for a second feature point by the stereo camera method using the first image pickup unit (210) and the second image pickup unit (220), calculates the parallax of the second feature point based on the search result, and determines the three-dimensional shape of the Object (OB) from the parallaxes (d_0 to d_3) of the first feature points and the parallax of the second feature point,

wherein the second calculation unit (350) sets a search range for the corresponding point of the second feature point based on the parallaxes (d_0 to d_3) of the first feature points.

(attached note 2)

The three-dimensional measurement system (100) according to supplementary note 1, comprising:

and a projection unit (200, 110) that projects measurement light onto the Object (OB) in order to determine the three-dimensional shape.

(attached note 3)

The three-dimensional measurement system (100) according to supplementary note 1 or 2, wherein,

the first feature point and the second feature point are the same point or exist at positions near each other.

(attached note 4)

The three-dimensional measurement system (100) according to any one of supplementary notes 1 to 3, wherein,

a three-dimensional measurement method different from the stereo camera method obtains three-dimensional information of the Object (OB) by one-time shooting by the image pickup unit (200).

(attached note 5)

The three-dimensional measurement system (100) according to any one of supplementary notes 1 to 4, wherein,

the second calculation unit (350) reduces the threshold value of the index of the degree of matching of the stereo camera method, as compared with a case where the search range is not set based on the parallaxes (d_0 to d_3) of the first feature points.

(attached note 6)

The three-dimensional measurement system according to any one of supplementary notes 1 to 5, wherein,

the first calculation unit (350) restores a three-dimensional point group representing the three-dimensional position of the first feature point,

the second calculation unit (350) is configured so that, when the three-dimensional point group is not restored for a part of the first feature points, the search range is set to a narrow range based on the parallaxes (d_0 to d_3) of the first feature points for the second feature points corresponding to the regions of the first feature points for which the three-dimensional point group was restored, and the search range is set to a predetermined range for the second feature points corresponding to the regions of the first feature points for which the three-dimensional point group was not restored.

(attached note 7)

The three-dimensional measurement system (100) according to any one of supplementary notes 1 to 6, wherein,

the first calculation unit (350) has: a first image processing unit (351) that restores a three-dimensional point group representing the three-dimensional positions of the first feature points; and a second image processing unit (352) that two-dimensionally projects the three-dimensional coordinates of each first feature point in the restored three-dimensional point group onto the images, obtains the two-dimensional coordinates of each first feature point, and calculates the parallaxes (d_0 to d_3) of the first feature points from those two-dimensional coordinates,

and the second calculation unit (350) has: a third image processing unit (353) that obtains an estimated value of the parallax of the second feature point based on the parallaxes (d_0 to d_3) of the first feature points, sets a search region for the corresponding point based on the estimated value, and calculates the parallax of the second feature point by performing stereo matching between the second feature point and the corresponding point within the search region; and a fourth image processing unit (354) that determines the three-dimensional shape of the Object (OB) from the parallaxes (d_0 to d_3) of the first feature points and the parallax of the second feature point.

(attached note 8)

The three-dimensional measurement system (100) according to any one of supplementary notes 2 to 7, wherein,

the distance between the optical axis (P21) of the first image pickup unit (210) and the optical axis (P22) of the second image pickup unit (220) is equal to the distance between the optical axis (P11) of the projection unit (110) and the optical axis (P21) of the first image pickup unit (210) or the optical axis (P22) of the second image pickup unit (220).

(attached note 9)

The three-dimensional measurement system (100) according to any one of supplementary notes 2 to 7, wherein,

the distance between the optical axis (P21) of the first imaging unit (210) and the optical axis (P22) of the second imaging unit (220) is longer than the distance between the optical axis (P11) of the projection unit (110) and the optical axis (P21) of the first imaging unit (210) or the optical axis (P22) of the second imaging unit (220).

(attached note 10)

The three-dimensional measurement system (100) according to any one of supplementary notes 2 to 9, wherein,

an optical axis (P11) of the projecting unit (110), an optical axis (P21) of the first imaging unit (210), and an optical axis (P22) of the second imaging unit (220) are arranged on the same plane (P2).

(attached note 11)

The three-dimensional measurement system (100) according to any one of supplementary notes 2 to 9, wherein,

an optical axis (P21) of the first imaging unit (210) and an optical axis (P22) of the second imaging unit (220) are arranged on the same plane (P1), and an optical axis (P11) of the projection unit (110) is not arranged on the plane (P1).

(attached note 12)

The three-dimensional measurement system (100) according to any one of supplementary notes 2 to 11, wherein,

the projection units (200, 120) project normal illumination light different from the measurement light to the Object (OB).

(attached note 13)

A three-dimensional measurement method using a three-dimensional measurement system (100), the three-dimensional measurement system (100) comprising: an imaging unit (200) having a first imaging unit (210) and a second imaging unit (220) which are disposed apart from each other; a first calculation unit (350); and a second calculation unit (350), wherein the three-dimensional measurement method comprises the following steps:

the imaging unit (200) captures images (Img1, Img2) of the Object (OB) that are different from each other;

the first calculation unit (350) calculates the parallaxes (d_0 to d_3) of first feature points in the images by using at least one of the first imaging unit (210) and the second imaging unit (220), and using distance information of a three-dimensional measurement method different from the stereo camera method or information for calculating a distance; and

the second calculation unit (350) searches for a corresponding point for a second feature point by the stereo camera method using the first image pickup unit (210) and the second image pickup unit (220), calculates the parallax of the second feature point based on the search result, and determines the three-dimensional shape of the Object (OB) from the parallaxes (d_0 to d_3) of the first feature points and the parallax of the second feature point,

wherein, in the step of determining the three-dimensional shape of the Object (OB), the second calculation unit (350) sets a search range for the corresponding point of the second feature point based on the parallaxes (d_0 to d_3) of the first feature points.

Description of the symbols

100: three-dimensional measuring system

110: projector for 3D

111: laser light source

112: pattern mask

113: lens and lens assembly

120: 2D projector

200: sensor unit

210: first camera

220: second camera

300: computer with a memory card

301: control arithmetic unit

302: communication I/F section

303: storage unit

304: input unit

305: output unit

306: bus line

310: control unit

320: image acquisition unit

330: image recording unit

340: image output unit

350: image processing unit

351: first image processing unit

352: second image processing unit

353: third image processing section

354: fourth image processing unit

OB: object

d_0 to d_3: parallaxes of the first feature points

d_GA: parallax (estimated value) of the pixel GA (second feature point)

G10 to G13: pixels as first feature points

G20 to G23: pixels as first feature points

GA: pixel (second feature point)

Img1, Img2: images

Img1', Img2': parallelized images

Img10, Img20: two-dimensionally projected images

Img3D1, Img3DM: three-dimensional point group images

ImgM: image (merged parallax map)

P1, P2, P3, P4: virtual planes

P11, P12, P21, P22: optical axes

S10 to S60: steps

S110: projection area

S210, S220: angles of view

t1 to t8: times
