Optoelectronic device for collecting three-dimensional data
Reading note: This technology, "Optoelectronic device for collecting three-dimensional data," was created by 库图·芙萝音, dated 2018-03-12. Its main content: the present disclosure sets forth optoelectronic devices for collecting three-dimensional data without determining disparity. This approach is significantly more flexible than state-of-the-art methods for collecting three-dimensional data. For example, the illumination (e.g., a pattern) produced by the disclosed apparatus need not exhibit the degree of complexity or randomness typically required by structured-light methods. Moreover, in certain examples, the devices set forth in this disclosure exhibit optimal resolution over a wide range of distances. The optoelectronic devices described herein are operable to generate illumination comprising a plurality of high-intensity features that exhibit distance-dependent changes when imaged by an imager incorporated into the optoelectronic device.
1. An optoelectronic device for collecting three-dimensional data, comprising:
a first emitter comprising a first array of light emitting elements and a first emitter optical assembly aligned with the first array of light emitting elements, the first emitter characterized by a first emitter optical axis;
the first array of light emitting elements is operable to emit first light onto the first emitter optical assembly, and the first emitter optical assembly is operable to direct the first light within a first illumination field;
a second emitter comprising a second array of light-emitting elements and a second emitter optical assembly aligned with the second array of light-emitting elements, the second emitter characterized by a second emitter optical axis;
the second array of light-emitting elements is operable to emit second light onto the second emitter optical assembly, and the second emitter optical assembly is operable to direct the second light within a second illumination field;
the light directed within the first and second illumination fields forms illumination comprising a plurality of high intensity features, wherein the high intensity features are characterized by a feature density;
an imager comprising an array of photosensitive intensity elements and an imager optical assembly aligned with the array of photosensitive intensity elements, the imager characterized by an imager optical axis;
the imager optical assembly is operable to direct the reflected portion of the illumination onto the array of photosensitive intensity elements over a field of view, thereby producing an image of the reflected portion of the illumination;
wherein the reflected portion of the illumination originates from one or more objects within the field of view, the one or more objects respectively disposed at one or more distances from the optoelectronic device, the one or more distances depicting three-dimensional data; and
the first emitter, the second emitter, and the imager are configured such that the feature density is substantially constant in the generated image.
2. The optoelectronic device of claim 1, further comprising a processor communicatively coupled to the imager, the processor operable to extract the three-dimensional data from the image.
3. The optoelectronic device of claim 1, wherein the first emitter, the second emitter, and the imager are configured such that the high-intensity features exhibit distance-dependent changes in the generated image.
4. The optoelectronic device of claim 2, wherein the first emitter, the second emitter and the imager are configured such that the high-intensity features exhibit distance-dependent changes in the generated image, and the processor is operable to extract three-dimensional data in dependence on the distance-dependent changes.
5. The optoelectronic device of claim 3, wherein the first emitter optical axis, the second emitter optical axis, and the imager optical axis are all skewed relative to one another such that the high intensity feature density is substantially constant in the generated image.
6. The optoelectronic device of claim 3, wherein the distance-dependent change in the plurality of high intensity features includes distortion of at least a portion of the high intensity features, and the distortion is a function of the distance from the optoelectronic device.
7. The optoelectronic device of claim 6, wherein the distance-dependent change in the plurality of high-intensity features includes at least one additional distortion of at least one additional portion of the high-intensity features, and the at least one additional distortion is another function of distance from the optoelectronic device.
8. The optoelectronic device of claim 5, wherein the first emitter, the second emitter, and the imager are configured such that the first emitter optical axis, the second emitter optical axis, and the imager optical axis are skewed between 0.1° and 10° with respect to one another.
9. The optoelectronic device of claim 3, wherein the field of view is characterized by a field of view angle, the first illumination field is characterized by a first illumination field angle, the second illumination field is characterized by a second illumination field angle, the first illumination field angle is not equal to the second illumination field angle, and the field of view angle is not equal to the first illumination field angle or the second illumination field angle.
10. The optoelectronic device of claim 9, wherein the first illumination field angle and/or the second illumination field angle is between 0.1° and 10° greater than the field of view angle, and neither the first illumination field angle nor the second illumination field angle is between 0.1° and 10° greater than the other.
11. The optoelectronic device of claim 3, wherein the first light directed within the first illumination field and the second light directed within the second illumination field each form a quasi-regular grid of high intensity features.
12. The optoelectronic device of claim 3, wherein the illumination comprises a moiré pattern.
Background
Triangulation methods such as structured light (or encoded light) are sometimes used to collect three-dimensional data. Such approaches present several challenges. For example, triangulation methods generally require determining disparity, which can be resource intensive. Structured-light methods require illuminating a three-dimensional object with encoded emissions (i.e., structured light). Encoded emissions require a degree of complexity or randomness (e.g., pseudo-randomness) that can be difficult to achieve, as they can require expensive or complicated optical elements and other components. Furthermore, triangulation methods generally require precise alignment of components in order to accurately calculate three-dimensional data. In addition, state-of-the-art methods for collecting three-dimensional data that require illuminating a three-dimensional object sometimes exhibit optimal resolution over only a narrow range of distances.
Disclosure of Invention
This disclosure sets forth an apparatus for collecting three-dimensional data without the need to determine disparity. This approach affords significantly more flexibility than state-of-the-art methods for collecting three-dimensional data. For example, the illumination (e.g., pattern) produced by the disclosed apparatus need not exhibit the same degree of complexity or randomness as is typically required by structured-light methods. In some examples, the illumination produced by the disclosed apparatus may be regular (e.g., a rectangular grid of points or lines). In some instances, such illumination may be produced with simple, inexpensive optical elements. Furthermore, in certain examples, the apparatus set forth in the present disclosure may exhibit optimal resolution over a wide range of distances.
In one aspect, for example, an optoelectronic device for collecting three-dimensional data includes an emitter. The emitter includes an array of light-emitting elements and an emitter optical assembly aligned with the array of light-emitting elements. The emitter is characterized by an emitter optical axis. The array of light-emitting elements is operable to emit light onto the emitter optical assembly, and the emitter optical assembly is operable to direct the light within an illumination field. The light directed within the illumination field forms an illumination. The illumination includes high intensity features characterized by a high intensity feature density.
The optoelectronic device further comprises an imager. The imager includes an array of photosensitive intensity elements and imager optics aligned with the array of photosensitive intensity elements. The imager is characterized by an imager optical axis. The imager optical assembly is operable to direct the reflected portions of the illumination onto the array of photosensitive intensity elements over a field-of-view (FOV) to produce an image of the reflected portions of the illumination. The reflected portions of the illumination originate from one or more objects within the field of view, the one or more objects being respectively disposed at one or more distances from the optoelectronic device. The one or more distances depict three-dimensional data. Furthermore, the emitter and the imager are configured such that high intensity feature density is substantially constant in the generated image.
In certain embodiments, an optoelectronic device comprises a processor communicatively coupled to the imager operable to generate an image. The processor is operable to extract three-dimensional data from the image.
In certain implementations, an optoelectronic device includes an emitter and an imager configured such that a plurality of high intensity features included in illumination exhibit distance-dependent changes in an image produced by the imager.
In certain implementations, an optoelectronic device includes an emitter and an imager configured such that a plurality of high intensity features included in illumination exhibit distance-dependent changes in an image produced by the imager. The optoelectronic device further comprises a processor operable to extract three-dimensional data in accordance with the distance-dependent change.
In certain implementations, an optoelectronic device includes an emitter having an emitter optical axis and an imager having an imager optical axis, where the two axes are skewed relative to each other such that a high intensity feature density is substantially constant in an image produced by the imager.
In certain implementations, an optoelectronic device includes a plurality of high intensity features characterized by distance-dependent changes. The distance dependent change comprises a general projective transformation, a portion of which is a lateral translation of at least a portion of the high intensity features. The lateral translation is a function of distance from the optoelectronic device.
In certain implementations, the optoelectronic device includes high intensity features characterized by a distance-dependent change. The distance-dependent change comprises a general projective transformation, a portion of which is a lateral translation of at least a portion of the high intensity features. The lateral translation is a function of distance from the optoelectronic device. Moreover, the distance-dependent change further comprises at least one additional lateral translation of at least one additional portion of the high intensity features, wherein the at least one additional lateral translation is another function of the distance from the optoelectronic device.
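The patent does not give a numerical model for this distance dependence, but a toy pinhole sketch can illustrate why a lateral translation encodes distance when the emitter and imager axes are skewed. Everything below is an assumption for illustration: the function names, the geometry, and the parameter values (`baseline`, `tilt_deg`, `focal`) are hypothetical and are not the patent's geometry.

```python
import math

# Toy pinhole sketch (hypothetical geometry, not the patent's): an emitter whose
# optical axis is skewed by tilt_deg relative to the imager projects a feature
# along the ray x(z) = baseline + z * tan(tilt). A pinhole imager with focal
# length focal images that feature at u(z) = focal * x(z) / z, so the imaged
# position shifts monotonically with distance z, with no disparity search.

def feature_image_position(z, baseline=0.02, tilt_deg=1.0, focal=0.004):
    """Image-plane coordinate (metres) of a feature reflected at distance z."""
    tilt = math.radians(tilt_deg)
    return focal * (baseline + z * math.tan(tilt)) / z

def distance_from_position(u, baseline=0.02, tilt_deg=1.0, focal=0.004):
    """Invert the toy model: recover distance z from the imaged position u."""
    tilt = math.radians(tilt_deg)
    return focal * baseline / (u - focal * math.tan(tilt))

for z in (0.5, 1.0, 2.0):
    u = feature_image_position(z)
    print(f"z={z:4.1f} m  ->  u={u * 1e6:8.2f} um  ->  z'={distance_from_position(u):.3f} m")
```

In this toy model the shift term `focal * baseline / z` plays the role of the distance-dependent lateral translation described above.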
In certain implementations, an optoelectronic device includes a field of view characterized by a field of view angle and an illumination field characterized by an illumination field angle. The field of view angle is not equal to the illumination field angle.
In certain implementations, an optoelectronic device includes an emitter characterized by an emitter optical axis and an imager characterized by an imager optical axis, the emitter and the imager configured such that the emitter optical axis and the imager optical axis are skewed between 0.1° and 10° with respect to each other.
In certain implementations, the optoelectronic device is characterized by an illumination field angle that is between 0.1° and 10° greater than the field of view angle.
In certain implementations, the light directed within the illumination field forms a quasi-regular grid of high intensity features.
In another aspect, an optoelectronic device for collecting three-dimensional data includes a first emitter. The first emitter includes a first array of light emitting elements and a first emitter optical assembly aligned with the first array of light emitting elements. The first emitter is characterized by a first emitter optical axis. The first array of light emitting elements is operable to emit first light onto the first emitter optical assembly, and the first emitter optical assembly is operable to direct the first light within a first illumination field.
The optoelectronic device further includes a second emitter including a second array of light-emitting elements and a second emitter optical assembly aligned with the second array of light-emitting elements. The second emitter is characterized by a second emitter optical axis. The second array of light-emitting elements is operable to emit second light onto the second emitter optical assembly, and the second emitter optical assembly is operable to direct the second light within a second illumination field. Light directed within the first illumination field and the second illumination field forms illumination. The illumination contains high intensity features characterized by a feature density.
The optoelectronic device further comprises an imager. The imager includes an array of photosensitive intensity elements and imager optics aligned with the array of photosensitive intensity elements. The imager is characterized by an imager optical axis. The imager optical assembly is operable to direct the reflected portions of the illumination onto the array of photosensitive intensity elements within a field of view to produce an image of the reflected portions of the illumination. The reflected portions of the illumination originate from one or more objects within the field of view. The one or more objects are respectively positioned at one or more distances from the optoelectronic device. The one or more distances depict three-dimensional data. The first emitter, the second emitter, and the imager are configured such that the feature density is substantially constant in the generated image.
In certain implementations, an optoelectronic device includes a processor communicatively coupled to an imager operable to generate an image. The processor is operable to extract three-dimensional data from the image.
In certain embodiments, an optoelectronic device includes a first emitter, a second emitter, and an imager configured such that a plurality of high intensity features included in illumination exhibit distance dependent changes in an image produced by the imager.
In certain embodiments, an optoelectronic device includes a first emitter, a second emitter, and an imager configured such that a plurality of high intensity features included in illumination exhibit distance dependent changes in an image produced by the imager. The optoelectronic device further comprises a processor operable to extract three-dimensional data in accordance with the distance-dependent change.
In certain implementations, the optoelectronic device includes a first emitter characterized by a first emitter optical axis, a second emitter characterized by a second emitter optical axis, and an imager characterized by an imager optical axis, wherein all axes are skewed relative to each other such that the high intensity feature density is substantially constant in the resulting image.
In certain embodiments, the optoelectronic device is operable to generate illumination. The illumination includes high intensity features that exhibit distance dependent changes. The distance dependent change includes distortion of at least a portion of the high intensity features. The distortion is a function of the distance from the optoelectronic device. In some examples, the distance-dependent change in the high-intensity features includes at least one additional distortion of at least one additional portion of the high-intensity features. The at least one additional distortion is another function of distance from the optoelectronic device.
In certain implementations, the optoelectronic device includes a first emitter, a second emitter, and an imager characterized by a first emitter optical axis, a second emitter optical axis, and an imager optical axis, respectively. The first emitter, the second emitter, and the imager are configured such that the first emitter optical axis, the second emitter optical axis, and the imager optical axis are skewed between 0.1° and 10° with respect to each other.
In certain implementations, an optoelectronic device includes a first emitter characterized by a first illumination field, a second emitter characterized by a second illumination field, and an imager characterized by a field of view. The first illumination field is characterized by a first illumination field angle, the second illumination field is characterized by a second illumination field angle, and the field of view is characterized by a field of view angle. The first illumination field angle is not equal to the second illumination field angle, and the field of view angle is not equal to the first illumination field angle or the second illumination field angle.
In some examples, the first illumination field angle and/or the second illumination field angle is between 0.1° and 10° greater than the field of view angle, and neither the first illumination field angle nor the second illumination field angle is between 0.1° and 10° greater than the other.
In certain implementations, an optoelectronic device is operable to generate a first light directed within a first illumination field and a second light directed within a second illumination field, each of the first light and the second light forming a quasi-regular grid of high intensity features.
In certain implementations, the optoelectronic device is operable to generate illumination that includes a moiré pattern.
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
illuminating one or more calibration objects with the first emitter and the second emitter;
collecting a set of calibration images at different distances, each calibration image being associated with a distance value; and
identifying a set of training images from the set of calibration images, the training images being distinguishable from each other and forming a depth-ordered sequence of training images.
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
collecting a test image of one or more objects;
extracting a first test block having a first set of coordinates from the test image;
extracting, from one or more training images, one or more corresponding training blocks having the same first set of coordinates as the first test block;
comparing the first test block with the one or more training blocks extracted from the one or more corresponding training images;
identifying, from among the one or more training blocks, a training block matching the first test block;
correlating the first test block with a first distance value associated with the matching training block, the matching training block being associated with a corresponding training image within the depth-ordered training image sequence; and
storing the first distance value.
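The operations above amount to a per-block nearest-match lookup through a depth-ordered training sequence. The sketch below is a minimal illustration, not the patent's implementation: the comparison metric (sum of squared differences), the block size, and the function name are all assumptions.

```python
import numpy as np

# Hedged sketch of the claimed block-matching pipeline: each training image in
# a depth-ordered sequence carries the distance at which it was captured; a
# test block is compared against the training blocks that share its image
# coordinates, and the best match yields the distance estimate.

def match_block_distance(test_image, training_images, distances, y, x, size=16):
    """Return the distance whose training block best matches the test block."""
    test_block = test_image[y:y + size, x:x + size].astype(np.float64)
    scores = []
    for train in training_images:                  # depth-ordered sequence
        train_block = train[y:y + size, x:x + size].astype(np.float64)
        scores.append(np.sum((test_block - train_block) ** 2))  # SSD score
    best = int(np.argmin(scores))                  # matching training block
    return distances[best]
```

In practice the same lookup would be repeated for every test block in the image, as described in the following paragraphs.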
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
extracting one or more additional test blocks from the test image, each additional test block having a set of coordinates within the test image;
for each of the one or more additional test blocks, extracting, from one or more training images, one or more corresponding training blocks having the same set of coordinates as the additional test block;
for each of the one or more additional test blocks, comparing the additional test block with the one or more training blocks extracted from the one or more corresponding training images;
for each of the one or more additional test blocks, identifying, from among the one or more training blocks, a training block matching the additional test block;
for each of the one or more additional test blocks, correlating the additional test block with a distance value associated with the matching training block, the matching training block being associated with a corresponding training image within the depth-ordered training image sequence; and
storing the one or more distance values associated with the one or more additional test blocks.
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a three-dimensional representation of the one or more objects is constructed from the first distance value and the one or more additional distance values.
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a refined first distance value is determined by interpolating distance values associated with training images adjacent to a training image having a matching training block with the same first set of coordinates.
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a refined distance value for each of one or more additional distance values is determined by interpolating distance values associated with training images adjacent to the training image having the matching training block with the same set of coordinates as the additional test block. In some examples, interpolating the distance values includes quadratic interpolation.
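One common realization of quadratic interpolation over a depth-ordered sequence is a parabolic fit through the match score at the best training image and its two depth-neighbours. The sketch below illustrates that idea; the score convention (lower is better) and the function name are assumptions, not the patent's specification.

```python
# Hedged sketch of the quadratic (parabolic) refinement step: fit a parabola
# through the match score at the best-matching training image and its two
# depth-neighbours, and take the parabola's minimum as the refined distance.

def refine_distance(distances, scores, best):
    """Refine distances[best] using the scores of the two neighbouring depths."""
    if best == 0 or best == len(scores) - 1:
        return distances[best]                    # no neighbour on one side
    s_prev, s_best, s_next = scores[best - 1], scores[best], scores[best + 1]
    denom = s_prev - 2.0 * s_best + s_next
    if denom == 0:
        return distances[best]                    # degenerate (flat) parabola
    offset = 0.5 * (s_prev - s_next) / denom      # sub-step of the minimum
    step = distances[best + 1] - distances[best]
    return distances[best] + offset * step
```

With symmetric neighbour scores the refinement leaves the distance unchanged; an asymmetry pulls the estimate toward the lower-scoring neighbour.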
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a depth-ordered training image sequence is summarized with a plurality of eigen-images.
In some embodiments, each training image is a linear combination of eigen-images. The eigen-images are orthogonal to each other, forming an eigenspace coordinate system that expresses the training images, and the weights of the linear combination are the eigenspace coordinates of the training image in eigenspace.
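An orthogonal eigen-image basis of this kind can be computed with a singular value decomposition of the stacked training images; each image is then the mean plus a weighted sum of the basis images. This is a minimal sketch under that assumption; the patent does not prescribe the decomposition, and the function names are hypothetical.

```python
import numpy as np

# Hedged sketch of summarizing a depth-ordered training sequence with
# eigen-images: stack the training images as rows, compute an orthonormal
# basis with the SVD, and express each image by its coordinates in that basis.

def eigen_decompose(training_images, k):
    """Return (mean, k eigen-images, per-image eigenspace coordinates)."""
    stack = np.stack([im.ravel().astype(np.float64) for im in training_images])
    mean = stack.mean(axis=0)
    centered = stack - mean
    # rows of vt are orthonormal eigen-images
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    eigen_images = vt[:k]
    coords = centered @ eigen_images.T            # eigenspace coordinates
    return mean, eigen_images, coords

def reconstruct(mean, eigen_images, coord, shape):
    """Rebuild an image as mean plus a linear combination of eigen-images."""
    return (mean + coord @ eigen_images).reshape(shape)
```

Keeping `k` smaller than the number of training images gives a compressed summary at the cost of an approximate reconstruction.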
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
extracting, from one or more training images, corresponding training blocks, each training block having the same set of training coordinates; and
summarizing the training blocks with corresponding block eigen-images.
In some embodiments, each training block is a linear combination of block eigen-images that are orthogonal to each other, forming a block eigenspace coordinate system that expresses the training blocks, and the weights of the linear combination are the block eigenspace coordinates of the training block in block eigenspace.
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
collecting a test image of one or more objects;
extracting a first test block from the test image, the first test block having a first set of coordinates;
projecting the first test block onto a block eigenspace constructed from training blocks having a set of training coordinates identical to the first set of coordinates, and deriving the block eigenspace coordinates of the test block from the projection;
comparing the set of block eigenspace coordinates of the first test block with the set of block eigenspace coordinates associated with each training block;
identifying, from among the sets of block eigenspace coordinates associated with the training blocks, a set that matches the set of block eigenspace coordinates of the first test block; and
correlating the matched set of block eigenspace coordinates with a distance value associated with a training image within the depth-ordered sequence of training images.
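The operations above can be sketched as a projection followed by a nearest-neighbour search in block eigenspace. As before this is an illustration under stated assumptions: the SVD basis, the Euclidean distance between coordinate sets, and the function names are all choices not fixed by the patent.

```python
import numpy as np

# Hedged sketch of matching in block eigenspace: training blocks sharing one
# set of image coordinates are summarized by block eigen-images; a test block
# from the same coordinates is projected into that eigenspace, and the nearest
# set of training coordinates identifies the matching depth.

def build_block_eigenspace(training_blocks, k):
    """Return (mean, k block eigen-images, per-block eigenspace coordinates)."""
    stack = np.stack([b.ravel().astype(np.float64) for b in training_blocks])
    mean = stack.mean(axis=0)
    _, _, vt = np.linalg.svd(stack - mean, full_matrices=False)
    basis = vt[:k]                                # block eigen-images
    coords = (stack - mean) @ basis.T             # block eigenspace coordinates
    return mean, basis, coords

def match_in_eigenspace(test_block, mean, basis, train_coords, distances):
    """Project the test block and return the distance of the nearest match."""
    c = (test_block.ravel().astype(np.float64) - mean) @ basis.T
    best = int(np.argmin(np.linalg.norm(train_coords - c, axis=1)))
    return distances[best]
```

Comparing low-dimensional coordinate sets instead of full pixel blocks reduces both storage and per-match cost.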
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: binarizing a plurality of training images. In some examples, binarizing the plurality of training images includes an adaptive thresholding method.
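One common adaptive thresholding scheme compares each pixel with the mean of its local window, which keeps high intensity features visible under uneven scene reflectance. The sketch below illustrates that scheme with an integral image; the window size, offset, and function name are assumptions, not the patent's method.

```python
import numpy as np

# Hedged sketch of adaptive-threshold binarization: each pixel is compared with
# the mean of its local window (computed via an integral image), so the
# high-intensity features survive slow variations in background brightness.

def binarize_adaptive(image, window=15, offset=2.0):
    """Return a uint8 mask: 1 where pixel > local mean + offset, else 0."""
    img = image.astype(np.float64)
    pad = window // 2
    padded = np.pad(img, pad, mode="edge")
    # integral image (with a leading zero row/column) gives O(1) window sums
    integral = np.pad(padded.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = img.shape
    s = (integral[window:window + h, window:window + w]
         - integral[:h, window:window + w]
         - integral[window:window + h, :w]
         + integral[:h, :w])
    local_mean = s / (window * window)
    return (img > local_mean + offset).astype(np.uint8)
```

A bright feature on a flat background is preserved as a single 1, while the background and its surroundings threshold to 0.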
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a set of cluster centers is determined by applying a clustering technique to a set of training patches. Each cluster center in the set of cluster centers represents one or more training blocks within the set of training blocks. In some examples, the clustering technique comprises hierarchical k-means clustering.
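Hierarchical k-means of the kind named above can be sketched as plain k-means (Lloyd's algorithm) applied recursively to each cluster. The branching factor, depth, initialization, and function names below are illustrative assumptions; the patent does not fix them.

```python
import numpy as np

# Hedged sketch of determining cluster centers for the training blocks with a
# small hierarchical k-means: each level runs plain k-means (Lloyd's algorithm)
# and recurses into every cluster until the tree reaches the requested depth.

def kmeans(data, k, iters=20, seed=0):
    """Plain Lloyd's algorithm: return (centers, per-point labels)."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return centers, labels

def hierarchical_kmeans(data, k=2, depth=2):
    """Return leaf cluster centers, each representing a subset of the blocks."""
    if depth == 0 or len(data) <= k:
        return [data.mean(axis=0)]
    _, labels = kmeans(data, k)
    leaves = []
    for j in range(k):
        members = data[labels == j]
        if len(members):
            leaves.extend(hierarchical_kmeans(members, k, depth - 1))
    return leaves
```

Each leaf center then represents the training blocks assigned to its branch, which is the structure the two-stage lookup below relies on.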
In certain implementations, an optoelectronic device includes a non-transitory computer-readable medium including instructions stored thereon that, when executed on a processor, perform operations comprising:
collecting a test image of one or more objects;
extracting a first test block from the test image, the first test block having a first set of coordinates substantially equal to the training block set of coordinates;
comparing the first test block with the set of cluster centers;
identifying a matching cluster center within the set of cluster centers;
comparing the first test block with the training blocks represented by the matching cluster center; and
correlating the first test block with a distance value associated with the matching training block, the matching training block being associated with a training image within the set of training images.
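The operations above form a two-stage lookup: first find the nearest cluster center, then search only the training blocks that center represents. The sketch below assumes Euclidean distance and a simple index structure; those choices and the function name are illustrative, not the patent's.

```python
import numpy as np

# Hedged sketch of the two-stage lookup: the test block is first compared with
# the cluster centers, then only with the training blocks represented by the
# matching center, which prunes most of the full training set.

def match_via_clusters(test_block, cluster_centers, cluster_members, distances):
    """cluster_members[i] lists (index, block) pairs under cluster center i."""
    t = test_block.ravel().astype(np.float64)
    # stage 1: nearest cluster center
    c = int(np.argmin([np.linalg.norm(t - cc.ravel()) for cc in cluster_centers]))
    # stage 2: nearest training block within that cluster
    best, best_score = None, np.inf
    for idx, block in cluster_members[c]:
        score = np.linalg.norm(t - block.ravel().astype(np.float64))
        if score < best_score:
            best, best_score = idx, score
    return distances[best]
```

With balanced clusters this reduces the number of block comparisons roughly from the training-set size to the cluster size plus the number of centers.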
In certain embodiments, an optoelectronic device includes a non-transitory computer-readable medium comprising instructions stored thereon that, when executed on a processor, perform operations comprising: a three-dimensional representation of the one or more objects is constructed from the distance values.
While the foregoing apparatus and techniques may be particularly advantageous for collecting three-dimensional data without determining disparity, in certain instances they may be used in conjunction with determining disparity. Thus, the devices and techniques for collecting three-dimensional data described herein may also be used in applications in which disparity is determined.
Other aspects, features, and advantages will be apparent from the following detailed description, the accompanying drawings, and the claims.
Drawings
Fig. 1 depicts an exemplary optoelectronic device operable to collect three-dimensional data. The device is characterized by a single emitter.
FIG. 2 depicts another exemplary optoelectronic device operable to collect three-dimensional data. The apparatus is characterized by a plurality of emitters.
FIG. 3 illustrates an example of steps encoded in non-transitory media and executed on a processor incorporated into the example optoelectronic device depicted in FIG. 2.
FIG. 4 illustrates another example of steps encoded in non-transitory media and executed on a processor incorporated into the exemplary optoelectronic device depicted in FIG. 2.
Detailed Description
Fig. 1 depicts an example of an
The array of light-emitting
The transmitter
The
The
The imager
The reflected
The
The
In some examples, the emitter
In some examples, the
Fig. 2 depicts another exemplary
The first array of
The first emitter
The
The second array of light-emitting
The second emitter
In some examples, the
In some examples, the
Alternatively or in addition to the foregoing, a single channel may also contain other elements configured such that the
The illumination 215 contains a plurality of high intensity features 216, such as regular or quasi-regular patterns of dots, circles, ellipses, or grid lines, and so forth. The plurality of high-intensity features 216 is characterized by a feature density (i.e., a number of high-intensity features per unit area). In certain examples, the
The
The imager optical assembly 222 may include a cover glass, a refractive lens, a diffractive lens, a microlens array, a diffuser, a spectral filter, an aperture, or a plurality of refractive lenses, diffractive lenses, microlens arrays, diffusers, spectral filters, or apertures, or any combination thereof. The imager optical assembly 222 may be operable to direct the reflected
The reflected
The
The
In some examples, the first emitter
In some examples, the
Fig. 3 illustrates an example of
The one or more calibration objects are positioned at different distances. Calibration images are captured by the
In some examples, a large number of calibration images are collected by
The
At 310, a first test block having a first set of coordinates is extracted from the test image.
At 312, one or more corresponding training blocks having the same first set of coordinates as the first test block are extracted from one or more training images.
At 314, the first test block is compared to the one or more training blocks extracted from the one or more corresponding training images.
At 316, a training block matching the first test block is identified from among the one or more training blocks.
At 318, the first test block is correlated with a first distance value associated with the matching training block, and the matching training block is associated with a corresponding training image within the depth-ordered training image sequence.
At 320, the first distance value is stored, for example, on the non-transitory computer-readable medium.
At 324, for each of the one or more additional test blocks, one or more respective training blocks having the same set of coordinates as the additional test block are extracted from one or more training images.
At 326, for each of the one or more additional test blocks, the additional test block is compared to the one or more training blocks extracted from the one or more respective training images.
At 328, for each of the one or more additional test blocks, a training block matching the additional test block is identified from among the one or more training blocks.
At 330, for each of the one or more additional test blocks, the additional test block is correlated with a distance value associated with the matching training block associated with the corresponding training image within the depth-ordered training image sequence.
At 332, the one or more distance values associated with the one or more additional test blocks are stored, for example, on the non-transitory computer-readable medium.
In some examples, each training image is a linear combination of eigen-images. The eigen-images are orthogonal to each other, forming an eigenspace coordinate system that expresses the training images, and the weights of the linear combination are the eigenspace coordinates of the training image in eigenspace.
Fig. 4 illustrates
An additional set of
The
In some implementations of
In some examples, the
At 414, a first test block is extracted from the test image. The first test block includes a first set of coordinates.
At 416, the first test block is projected onto a block eigenspace constructed from training blocks having the same set of training coordinates as the first set of coordinates.
At 418, block eigenspace coordinates of the test block are derived from the projection.
At 420, the set of block eigenspace coordinates of the first test block is compared to the set of block eigenspace coordinates associated with each training block.
At 422, a set of block eigenspace coordinates, associated with one of the training blocks, that matches the set of block eigenspace coordinates of the first test block is identified.
At 424, the matched set of block eigenspace coordinates is correlated with distance values associated with training images within the depth-ordered sequence of training images.
In some embodiments, the
In certain implementations, these additional operations include determining a set of cluster centers by applying a clustering technique to a set of training blocks (428). Each cluster center in the set of cluster centers represents one or more training blocks within the set of training blocks. In some examples, the clustering technique comprises hierarchical k-means clustering. In some examples, the clustering technique involves principal component analysis.
In some examples, the additional operations further include collecting test images of one or more objects (430).
At 432, a first test block is extracted from the test image, the first test block having a first coordinate set substantially equal to the training block coordinate set.
At 434, the first test block is compared to the set of cluster centers.
At 436, a matching cluster center is identified within the set of cluster centers.
At 438, the first test block is compared to the training blocks represented by the matching cluster center.
At 440, the first test block is correlated with the distance value associated with the matching training block, and the matching training block is associated with a training image within the set of training images.
In some implementations, the
Various modifications may be made within the spirit of the disclosure. Likewise, in certain instances, features described above in connection with different embodiments can be combined in the same embodiment. Accordingly, other embodiments are within the scope of the following claims.