Spatial encoding system, decoding system, imaging system and method thereof

Document No.: 144372  Publication date: 2021-10-22

Reading note: This invention, Spatial Encoding System, Decoding System, Imaging System and Method Thereof, was created by Zeev Zalevsky, Itzhak Omer Wagner, and Asaf Shahmoon on 2020-03-05. Its main content is as follows: A system for imaging may include an illumination system including a light source for generating a light beam; and a spatially encoded pattern generator comprising one or more optical elements for encoding the imaging light beam so as to simultaneously illuminate an object with a plurality of different spatially encoded patterns, wherein each of the different encoded patterns is characterized by a different wavelength of the imaging pattern. The system may further include an imaging sensor for receiving the encoded imaging beam transmitted through or reflected from the object; and a processor for decoding image data from the imaging and reconstructing an image of the object.

1. An illumination system, comprising:

a light source for generating a light beam; and

a spatially encoded pattern generator comprising one or more optical elements for encoding the imaging light beam for simultaneously illuminating the object with a plurality of different spatially encoded patterns, wherein each of the different encoded patterns is characterized by a different wavelength of the imaging pattern.

2. The system of claim 1, further comprising one or more optical elements that split the beam into an imaging beam and a reference beam and direct the reference beam to an imaging sensor after the reference beam is combined with the imaging beam.

3. The system of claim 1 or 2, wherein the spatial coding pattern generator is configured to image the plurality of different spatial coding patterns onto the object across a first axis, the first axis being perpendicular to the direction of propagation of the imaging light beam, and to perform a fourier transform of the plurality of different spatial coding patterns on the object across a second axis, the second axis being perpendicular to both the first axis and the direction of propagation of the imaging light beam.

4. The system of any of claims 1 to 3, wherein the one or more optical elements for encoding the imaging beam are aligned along the optical path in the following order: a diffraction grating grid, a first lens, a coding pattern element, and a second lens.

5. The system of claim 4, wherein the first lens is a distance from the diffraction grating grid and the coding pattern element equal to an X-axis focal length of the first lens, and wherein the second lens is a distance from the coding pattern element equal to an X-axis focal length of the second lens.

6. The system of claim 4 or 5, wherein the X-axis focal length of each lens is twice the Y-axis focal length of the lens.

7. The system of any of claims 4 to 6, wherein the optical path defined by the one or more optical elements for encoding the imaging beam includes, in this order, the diffraction grating grid, the first lens, the encoding pattern element, the second lens, a second diffraction grating grid, and a third lens.

8. The system of claim 7, wherein the first lens is a distance from the diffraction grating grid and the coding pattern element equal to an X-axis focal length of the first lens, wherein the second lens is a distance from the coding pattern element equal to an X-axis focal length of the second lens, and wherein the third lens is a distance from the second diffraction grating grid equal to an X-axis focal length of the third lens.

9. The system of claim 8, wherein the X-axis focal length of each lens is twice the Y-axis focal length of the lens.

10. The system of any one of claims 1 to 9, wherein the light source is a laser generator.

11. The system of claim 10, wherein the laser generator is a pulsed laser source.

12. The system of any one of claims 1 to 11, incorporated in an endoscope.

13. The system of any one of claims 1 to 12, incorporated in an imaging system, the imaging system further comprising:

an imaging sensor for receiving the encoded imaging beam transmitted through or reflected from the object; and

a processor for decoding image data from the imaging and reconstructing an image of the object.

14. The system of claim 13, wherein to reconstruct an image of the object, the processor is configured to multiply an image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams by a corresponding decoding pattern to obtain products, and to sum all of the products to obtain a reconstructed image of the object.

15. A decoding system, comprising:

an imaging sensor to receive an encoded imaging beam that simultaneously illuminates an object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern, and the encoded imaging beam is transmitted through or reflected from the object; and

a processor for decoding image data from the imaging and reconstructing an image of the object.

16. The system of claim 15, wherein to reconstruct an image of the object, the processor is configured to multiply an image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams by a corresponding decoding pattern to obtain products, and to sum all of the products to obtain a reconstructed image of the object.

17. A method, comprising:

generating a light beam; and

encoding the imaging light beam using a spatial encoding pattern generator to simultaneously illuminate the object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern.

18. The method of claim 17, wherein encoding the imaging beam comprises applying temporal gating.

19. The method of claim 18, wherein the temporal gating is applied using any technique of the group of techniques consisting of short optical pulse gating, coherent gating, and interference patterns generated by diffraction grating grids.

20. The method of any of claims 17 to 19, wherein the spatially encoded pattern generator comprises one or more optical elements for encoding the imaging beam, the one or more optical elements aligned along the optical path in the following order: a diffraction grating grid, a first lens, a coding pattern element, and a second lens.

21. The method of claim 20, wherein encoding the imaging beam comprises applying temporal gating, wherein the temporal gating is achieved by splitting the beam into an imaging beam and a reference beam, and directing the reference beam to an imaging sensor after the reference beam is combined with the imaging beam.

22. The method of claim 20 or 21, wherein the first lens is at a distance from the diffraction grating grid and the coding pattern elements equal to an X-axis focal length of the first lens, and wherein the second lens is at a distance from the coding pattern elements equal to an X-axis focal length of the second lens.

23. The method of any of claims 20 to 22, wherein the X-axis focal length of each lens is twice the Y-axis focal length of the lens.

24. The method of any of claims 20 to 23, wherein the one or more optical elements for encoding the imaging beam define an optical path comprising, in this order, the diffraction grating grid, the first lens, the encoding pattern element, the second lens, a second diffraction grating grid, and a third lens.

25. The method of claim 24, wherein the first lens is a distance from the diffraction grating grid and the coding pattern element equal to an X-axis focal length of the first lens, wherein the second lens is a distance from the coding pattern element equal to an X-axis focal length of the second lens, and wherein the third lens is a distance from the second diffraction grating grid equal to an X-axis focal length of the third lens.

26. The method of claim 25, wherein the X-axis focal length of each lens is twice the Y-axis focal length of the lens.

27. The method of any one of claims 17 to 26, wherein the beam is generated by a laser source.

28. The method of claim 27, wherein the laser source is a pulsed laser source.

29. The method of any of claims 17 to 28, further comprising:

receiving the encoded imaging beam transmitted through or reflected from the object using an imaging sensor; and

decoding, using a processor, image data from the imaging and reconstructing an image of the object.

30. The method of claim 29, further comprising: to reconstruct an image of the object, the image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams is multiplied by the corresponding decoding pattern to obtain products, and all the products are summed to obtain a reconstructed image of the object.

31. A method, comprising:

receiving, using an imaging sensor, an encoded imaging beam that simultaneously illuminates an object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern and the encoded imaging beam is transmitted through or reflected from the object; and

the image data is decoded from the imaging and an image of the object is reconstructed using a processor.

32. The method of claim 31, wherein reconstructing the image of the object comprises: multiplying an image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams by the corresponding decoding pattern to obtain products, and summing all the products to obtain a reconstructed image of the object.

Technical Field

The present invention relates to imaging. More particularly, the present invention relates to spatial encoding systems, decoding systems, imaging systems, and methods thereof.

Background

In vivo biological tissue imaging typically requires careful selection between different biological imaging methods that suit the particular experimental requirements and conditions. Deep penetration non-invasive imaging techniques such as Magnetic Resonance Imaging (MRI), Computed Tomography (CT), high and low frequency Ultrasound (US) are expensive and limited in duration and spatial resolution. Other high resolution methods such as single/multiphoton fluorescence or confocal fluorescence microscopy can be used in vivo, but are generally only useful in shallow interrogation depths.

Miniature endoscopes have been developed that use minimally invasive techniques to insert optical fibers deep into a target area within a patient's body. Such devices enable long-term in vivo monitoring of biological samples. Many commercial miniature endoscope fibers comprise bundles of many cores, known as multi-core fibers (MCFs), in which each core acts as a single optical fiber.

Many MCFs include multimode fiber (MMF) that allows many spatial electromagnetic modes to pass through each core, thereby increasing the intensity transmission of the image through the endoscope. However, MMFs typically scramble information transmitted through them in space and time. This problem can be addressed by an MCF design that has a large enough space between adjacent cores to minimize core-to-core optical coupling (crosstalk). As a result, image resolution may be compromised and pixelation artifacts may appear in the generated image. Other solutions, such as optimization algorithms, digital phase conjugation or transmission matrices, have been demonstrated, but they are generally sensitive to fiber bending.

A Single Mode Fiber Bundle (SMFB) may be used instead of MMF, which is generally less sensitive to fiber bending and less prone to information corruption. SMFB imaging typically involves the use of a scanning head with lenses, spectral dispersers, speckle correlation, and may also involve other techniques that can produce resolutions up to the diffraction limit. While the SMFB core-to-core length may be reduced, the brightness of the image transmitted through the device may also be reduced. Thus, the signal-to-noise ratio may also decrease. In addition to the necessary geometric design of the optical fiber, the resolution may still be limited and the depth of observation through the scattering medium may be greatly reduced.

For both MMF and SMFB bundles, various fiber-based illumination methods are known; for example, optical-sectioning confocal microscopy, speckle-correlation techniques capable of optical sectioning without staining, and the like can be performed by illuminating the sample and collecting the reflected light through the same bundle.

Current techniques exhibit typical penetration depths of less than 150 μm and are difficult to use with real biological scattering media (e.g., blood) between the distal end of the fiber and the sample. Furthermore, typical image acquisition rates are quite poor, ranging from roughly 5 Hz for a 36 × 36 pixel image down to several minutes per frame.

Disclosure of Invention

Thus, according to some embodiments of the present invention, there is provided a lighting system comprising a light source for generating a light beam and a spatial coding pattern generator comprising one or more optical elements for coding an imaging light beam for simultaneously illuminating an object with a plurality of different spatial coding patterns, wherein each of the different coding patterns is characterized by a different wavelength of the imaging pattern.

In some embodiments of the invention, the system further comprises one or more optical elements for splitting the beam into an imaging beam and a reference beam and directing the reference beam to the imaging sensor after the reference beam is combined with the imaging beam.

In some embodiments of the invention, the spatial coding pattern generator is configured to image the plurality of different spatial coding patterns onto the object across a first axis, the first axis being perpendicular to the direction of propagation of the imaging light beam, and to perform a fourier transform of the plurality of different spatial coding patterns on the object across a second axis, the second axis being perpendicular to both the first axis and the direction of propagation of the imaging light beam.

In some embodiments of the invention, one or more optical elements for encoding the imaging beam are aligned along the optical path in the following order: a diffraction grating grid, a first lens, a coding pattern element, and a second lens.

In some embodiments of the invention, the first lens is at a distance from the diffraction grating grid and the coding pattern element equal to an X-axis focal length of the first lens, and wherein the second lens is at a distance from the coding pattern element equal to an X-axis focal length of the second lens.

In some embodiments of the invention, the X-axis focal length of each lens is twice the Y-axis focal length of that lens.

In some embodiments of the invention, the optical path defined by the one or more optical elements for encoding the imaging beam comprises, in this order, a diffraction grating grid, a first lens, an encoding pattern element, a second lens, a second diffraction grating grid, and a third lens.

In some embodiments of the invention, the first lens is at a distance from the diffraction grating grid and the coding pattern element equal to an X-axis focal length of the first lens, wherein the second lens is at a distance from the coding pattern element equal to an X-axis focal length of the second lens, and wherein the third lens is at a distance from the second diffraction grating grid equal to an X-axis focal length of the third lens.

In some embodiments of the invention, the X-axis focal length of each lens is twice the Y-axis focal length of that lens.

In some embodiments of the invention, the light source is a laser generator.

In some embodiments of the invention, the laser generator is a pulsed laser source.

In some embodiments of the invention, the system is incorporated into an endoscope.

In some embodiments of the invention, the system is incorporated in an imaging system, the imaging system further comprising an imaging sensor for receiving the encoded imaging light beam transmitted through or reflected from the object; and a processor for decoding the image data from the imaging and reconstructing an image of the object.

In some embodiments of the invention, to reconstruct an image of the object, the processor is configured to multiply the image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams by the corresponding decoding pattern to obtain products, and to sum all of the products to obtain a reconstructed image of the object.

In some embodiments of the present invention, a decoding system is provided that includes an imaging sensor for receiving an encoded imaging beam that simultaneously illuminates an object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern, and the encoded imaging beam is transmitted through or reflected from the object. The decoding system also includes a processor for decoding the image data from the imaging and reconstructing an image of the object.

In some embodiments of the invention, to reconstruct an image of the object, the processor is configured to multiply the image of each of the different encoding patterns obtained from the reflected or transmitted encoded imaging beams by the corresponding decoding pattern to obtain products, and to sum all of the products to obtain a reconstructed image of the object.

In some embodiments of the invention, there is provided a method comprising: generating a light beam; and encoding the imaging beam using a spatial encoding pattern generator to simultaneously illuminate the object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern.

In some embodiments of the invention, encoding the imaging beam comprises applying temporal gating.

In some embodiments of the invention, temporal gating is applied using any of a group of techniques consisting of short optical pulse gating, coherent gating, and interference patterns produced by diffraction grating grids.

In some embodiments of the invention, the spatially encoded pattern generator comprises one or more optical elements to encode the imaging beam, the optical elements being aligned along the optical path in the following order: a diffraction grating grid, a first lens, a coding pattern element, and a second lens.

In some embodiments of the invention, the encoding of the imaging beam comprises applying temporal gating, wherein temporal gating is achieved by splitting the beam into the imaging beam and the reference beam and directing the reference beam to the imaging sensor after the reference beam is combined with the imaging beam.

In some embodiments of the invention, there is provided a method comprising: receiving, using an imaging sensor, a coded imaging beam that simultaneously illuminates a subject with a plurality of different spatial coding patterns, wherein each of the different coding patterns is characterized by a different wavelength of the imaging pattern and the coded imaging beam is transmitted through or reflected from the subject; and decoding the image data from the imaging and reconstructing an image of the object using the processor.

Drawings

For a better understanding of the present invention and its practical application, the following drawings are provided and referenced hereinafter. It should be noted that the figures are given by way of example only and in no way limit the scope of the invention. Like parts are denoted by like reference numerals.

FIG. 1A is a graph showing photon count versus time for light to interact with a scattering medium.

FIG. 1B shows a pair of Barker-based arrays that may be used in a system for imaging through a scattering medium.

FIG. 2A illustrates a system for imaging through a scattering medium using a one-dimensional illumination pattern according to some embodiments of the invention.

FIG. 2B illustrates images of light intensity at particular wavelengths on different planes according to some embodiments of the invention.

FIG. 3 illustrates a system for imaging through a scattering medium using a two-dimensional illumination pattern according to some embodiments of the invention.

FIG. 4 illustrates discrete light illumination implemented by a system for imaging through a scattering medium, according to some embodiments of the invention.

FIG. 5 illustrates convolution components of a single wavelength in light illumination achieved by a system for imaging through a scattering medium according to some embodiments of the present invention.

FIG. 6 illustrates the final convolution of a single wavelength in illumination achieved by a system for imaging through a scattering medium according to some embodiments of the present invention.

FIG. 7 illustrates continuous-wavelength-encoded spectral regions and pattern pixels using a single-frequency grating, according to some embodiments of the present invention.

FIG. 8 illustrates deflection by the frequency of the second diffraction grating grid in the spatial-axis plane, according to some embodiments of the invention.

FIG. 9 illustrates a multicore fiber endoscope including a system for imaging an object through a scattering medium, according to some embodiments of the present invention.

FIG. 10 is a diagram of a method for imaging an object through a scattering medium according to some embodiments of the invention.

Detailed Description

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the methods and systems. However, it will be understood by those skilled in the art that the present method and system may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present methods and systems.

Although the examples disclosed and discussed herein are not limited in this regard, the terms "plurality" and "a plurality" as used herein may include, for example, "multiple" or "two or more". The terms "plurality" or "a plurality" may be used throughout the specification to describe two or more components, devices, elements, units, parameters, and the like. Unless explicitly stated, the method examples described herein are not limited to a particular order or sequence. Additionally, some of the described method examples, or elements thereof, may occur or be performed at the same point in time.

Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "adding," "associating," "selecting," "evaluating," "processing," "computing," "calculating," "determining," "specifying," "allocating," or the like, refer to the action and/or processes of a computer, computer processor, or computing system, or similar electronic computing device, that manipulate, execute, and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.

According to some embodiments of the present invention, a new optical arrangement is provided, which aims to achieve deeper imaging through scattering media by employing spatial illumination.

In general, four main parameters affect how light propagates in a scattering medium: the absorption coefficient μa, measured in [m−1], which determines the energy loss of the signal; the scattering coefficient μs, also measured in [m−1], which is a measure of the typical length light travels between scattering events; the scattering anisotropy g, which measures the average ⟨cos(θ)⟩, where θ is the scattering deflection angle (quantifying how "forward" a typical scattering event is); and the refractive index n of the medium.

From μs and g, the reduced scattering coefficient μs′ can be derived:

μs′ = μs(1 − g)    (1)

which represents the effective scattering length, taking typical scattering directions into account. The typical scattering time can be calculated from the reduced scattering coefficient and the refractive index.
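
Equation (1) can be applied directly; the tissue-like values in the sketch below are hypothetical, chosen only to illustrate the computation:

```python
def reduced_scattering_coefficient(mu_s, g):
    """Reduced scattering coefficient mu_s' = mu_s * (1 - g) of equation (1).

    mu_s : scattering coefficient [m^-1]
    g    : scattering anisotropy, the mean <cos(theta)> of the deflection angle
    """
    return mu_s * (1.0 - g)

# Hypothetical tissue-like values: mu_s = 10,000 m^-1, strongly forward scattering (g = 0.9)
mu_s_prime = reduced_scattering_coefficient(10_000.0, 0.9)
# The transport mean free path is the reciprocal of the reduced scattering coefficient
l_transport = 1.0 / mu_s_prime
```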

In a practical imaging situation, light pulses are projected through the scattering medium onto the sample. Due to scattering in the medium, the pulse spreads and can be described by ballistic, snake and diffuse signal components. The ballistic component takes the shortest path through the medium and retains the image information. In contrast, diffuse light undergoes multiple scattering, travels long distances within the scattering medium, and does not contribute to forming a direct image. Snake photons experience some scattering in the forward direction and thus retain some image information. The light then strikes the sample and is scattered back, or transmitted through it, toward the sensor via the scattering medium.

As previously mentioned, the signal stretches in time and can be described by ballistic, snake, and diffuse photons.

FIG. 1A is a graph of photon count versus time for light interacting with a scattering medium. Three portions (12, 14 and 16) of the photons arriving through the scattering medium from the interaction between the illumination light and the sample are shown, divided by their times of arrival at the sensor. The first portion 12 includes ballistic photons that reach the sample directly from the illumination source (B1) and ballistic photons that reach the sensor after interacting with the sample (B2). The next portion 14, arriving a few picoseconds later, comprises two sets of photons: photons that were ballistic on the way to the sample but were scattered by the medium on their way from the sample to the sensor (B1 and P2), and photons scattered by the medium on their way to the sample but ballistic from the sample to the sensor (P1 and B2). The third portion 16 comprises photons scattered by the medium both before reaching the sample and before reaching the sensor, so that they arrive last.

Many methods are known for separating photons that contribute image information from those that do not. An ideal imaging method would gate out the third portion of photons, use the first portion, and collect the maximum amount of information from the snake photons of the second portion.
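
The three-portion classification above can be sketched as a simple arrival-time rule (the threshold values and window width are illustrative assumptions, not from the disclosure):

```python
def classify_photon(arrival_ps, ballistic_ps, snake_window_ps):
    """Classify a photon by arrival time into the three portions of FIG. 1A.

    arrival_ps      : photon arrival time at the sensor [ps]
    ballistic_ps    : arrival time of the ballistic (shortest-path) photons [ps]
    snake_window_ps : width of the window accepting forward-scattered photons [ps]
    """
    if arrival_ps <= ballistic_ps:
        return "ballistic"   # first portion 12: B1 + B2
    if arrival_ps <= ballistic_ps + snake_window_ps:
        return "snake"       # second portion 14: B1 + P2 or P1 + B2
    return "diffuse"         # third portion 16: multiply scattered, to be gated out
```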

Time gating shorter than ts′ and within less than 100 ps of the earliest-arriving light has been used. This, however, requires expensive laser sources with pulses shorter than a few picoseconds, as well as dedicated time-gated sensors.

According to some embodiments of the present invention, instead of short pulses and time gating, a system for imaging through a scattering medium may employ narrow angle light collection, thereby omitting scattered photons while preserving ballistic photons.

Optical systems comprising long optical channels that absorb light propagating at angles higher than a predetermined angle (e.g., 0.29 °) are known, but may not be suitable for imaging through scattering media under real in vivo conditions. Furthermore, the signals acquired in such systems are typically very weak and susceptible to stray light from photons that experience multiple scattering traveling uniformly in all directions.

The use of a holography-based approach with a short-coherence-length illumination source may allow for longer pulse times. In this method, the coherence length can be made comparable to the scattering length 1/μs: only photons whose scattering results in an optical path shorter than the coherence length will contribute to the interference pattern, while light traveling longer distances will be averaged out, contributing only random noise. Increasing the width of the scattering medium (its length along the propagation direction of the imaging beam, i.e., the ballistic path) reduces the number of interfering photons while increasing the averaged noise. The signal-to-noise ratio may thus decrease, blurring and limiting the reconstruction of the sample's spatial frequencies. It is known that illumination has previously been encoded using modulated phase to improve the signal-to-noise ratio. However, that approach assumes that the illumination system directly illuminates the sample, rather than being scattered by a scattering medium as in a real in vivo scenario. Furthermore, it relies on time multiplexing, which increases the acquisition duration.
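
As a rough aid to the coherence-gating discussion above, the coherence length of a broadband source can be estimated with the standard relation L_c ≈ λ²/Δλ (a textbook formula, not taken from this disclosure); only photon paths whose excess length relative to the ballistic path stays below L_c contribute to the interference pattern:

```python
def coherence_length(center_wavelength_m, bandwidth_m):
    """Order-of-magnitude coherence length L_c ~ lambda^2 / delta_lambda."""
    return center_wavelength_m ** 2 / bandwidth_m

def interferes(path_excess_m, coherence_length_m):
    """A photon contributes to the interference pattern only if its path excess
    (relative to the ballistic path) is below the coherence length; longer
    paths average out and add only random noise."""
    return path_excess_m < coherence_length_m

# Hypothetical source: 800 nm center wavelength, 10 nm bandwidth
L_c = coherence_length(800e-9, 10e-9)  # about 64 micrometers
```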

According to some embodiments of the invention, scattering limitations and high acquisition times may be addressed by using spatially structured illumination. This may require the coded illumination pattern to have a sharp autocorrelation.

According to some embodiments of the invention, a system for imaging an object through a scattering medium may comprise an illumination system and an imaging sensor and a processing unit for processing image data sensed by the imaging sensor.

According to some embodiments of the invention, the illumination system may comprise a light source generating a light beam. In some embodiments, the light source may be, for example, a white light source, a Light Emitting Diode (LED), a continuous laser source, a pulsed laser source (e.g., femtosecond, picosecond, nanosecond, millisecond pulsed laser source, etc.) for generating the imaging beam.

The spatial encoding pattern generator may be operative to encode the imaging light beam to simultaneously illuminate the object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern.

Some of the different coding patterns may overlap in whole or in part, but they are either uncorrelated or have a correlation function (between them) with a sharp maximum at a particular point.

The imaging sensor may be operable to receive the imaging beam transmitted through or reflected from the object, and the processing unit may be operable to reconstruct an image of the object from the image data sensed by the sensor.

In the reconstruction of the object image, the processing unit may be designed to execute an image reconstruction algorithm that decodes the encoded spatial pattern illuminating the object and rejects photons scattered by the scattering medium, by ignoring any image data representing deviations from the spatial encoding pattern. For example, if a green photon reaches a region that should be illuminated by another color (or colors), it will be ignored in the reconstruction of the object image, on the assumption that it is not a ballistic photon (i.e., it did not travel directly from the light source to the location where it was detected, and most likely scattered on its way).
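
A minimal sketch of this decoding step, assuming per-wavelength images and decoding patterns given as plain NumPy arrays (the shapes and values are illustrative, not from the disclosure): each per-wavelength image is multiplied by its corresponding decoding pattern, and the products are summed, as recited in claims 14, 16, 30 and 32.

```python
import numpy as np

def reconstruct(images, decoding_patterns):
    """Multiply the image recorded for each wavelength-coded pattern by its
    corresponding decoding pattern and sum all the products.

    images            : (n_patterns, H, W) per-wavelength images of the object
    decoding_patterns : (n_patterns, H, W) decoding patterns, one per wavelength
    """
    images = np.asarray(images, dtype=float)
    decoding_patterns = np.asarray(decoding_patterns, dtype=float)
    return np.sum(images * decoding_patterns, axis=0)
```

Light detected in a region that a given wavelength's pattern leaves dark is multiplied by zero in that pattern's decoding mask and is thereby ignored, implementing the rejection of non-ballistic photons described above.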

In some embodiments of the invention, temporal gating is used in order to separate ballistic photons from scattered photons. Temporal gating can be achieved, for example, by applying very short laser pulses, or by applying coherence gating (e.g., via interference). For example, coherence gating can be achieved by coherently shaping the illumination to obtain the desired temporal gate, employing the first-arriving-light (FAL) method.

For example, to apply the FAL method, a reference beam may be separated from the beam generated by the light source and directed along another optical path into the sensor, allowing interferometric measurements.

The spatial coding pattern may be obtained, for example, by employing a Barker-based array. A set of laterally shifted Barker code patterns (shown in fig. 1B) may be projected on the sample. Such shifting causes a pattern scan of the sample.

One-dimensional (1D) scanning can enhance two-dimensional (2D) images in all directions, regardless of the original scanning direction. Another feature of the illumination produced by the system according to some embodiments of the present invention is that multiple patterns of different wavelengths can be projected simultaneously. The sample images under the shifted pattern illumination may then be separated and analyzed (e.g., using wavelength multiplexing) to reduce acquisition time.

FIG. 1B shows a pair of Barker-based arrays that may be used in a system for imaging through a scattering medium. In this example, a 13 × 13 Barker-based array is shown in (a), where each row is a 5-pixel cyclic shift of the basic Barker code vector. Array (b) is the autocorrelation of the Barker array in (a). Other arrangements (other numbers of pixels, other code vectors) may also be used in some embodiments of the invention.
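As an illustration, the array construction just described can be sketched numerically. The Barker-13 code values below are standard; the exact row layout (5-pixel cyclic shifts) follows the figure description, and the use of the periodic autocorrelation is an assumption for this sketch.

```python
import numpy as np

# Barker-13 code (+1/-1 values), the basis of the arrays described for fig. 1B.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

# Periodic (circular) autocorrelation: a sharp maximum at zero shift,
# which is the property that makes such patterns separable on decoding.
autocorr = np.array([np.dot(barker13, np.roll(barker13, t)) for t in range(13)])
print(autocorr)  # peak of 13 at zero shift, 1 at every other shift

# 13x13 Barker-based array: each row is the base vector cyclically
# shifted by 5 pixels relative to the previous row.
barker_array = np.stack([np.roll(barker13, 5 * k) for k in range(13)])
print(barker_array.shape)  # (13, 13)
```

The flat unit sidelobes of the periodic autocorrelation are what give the correlation function its single sharp maximum, as required of the coding patterns above.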

The coherence length can be calculated with reference to fig. 1A. A straightforward approach is to set the coherence length such that only photons from the first part will interfere. Increasing the coherence length allows more photons to be collected from the intermediate portion, thereby increasing both signal and noise. Spatial encoding then removes the noise from the B1 + P2 photons that did not contribute to the data, while preserving the snake photons that did contribute.
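For orientation, a commonly used approximation for the coherence length is L_c ≈ λ²/Δλ. The sketch below uses this approximation; the numerical values are illustrative assumptions only, not taken from this description.

```python
# Rough coherence-length estimate, L_c ≈ λ² / Δλ (the exact prefactor
# depends on the spectral line shape). Illustrative values only.
wavelength = 800e-9   # assumed center wavelength (m)
bandwidth = 40e-9     # assumed spectral bandwidth Δλ (m)

coherence_length = wavelength ** 2 / bandwidth
print(f"coherence length ~ {coherence_length * 1e6:.1f} um")  # ~ 16.0 um
```

A shorter coherence length narrows the temporal gate (fewer photons interfere), while a broader one admits more intermediate-path photons, matching the trade-off described above.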

FIG. 2A illustrates a system for imaging through a scattering medium using a one-dimensional illumination pattern according to some embodiments of the invention. The system may be designed to perform different spatial encoding of different wavelengths of the illumination beam to perform resolution enhancement and to see outside the scattering tissue.

The system 100 includes an illumination source 102, for example, a laser beam generator such as a continuous-wave laser or a pulsed laser (e.g., a femtosecond or picosecond pulsed laser in some embodiments, or a nanosecond or millisecond pulsed laser in other embodiments; faster pulses may better facilitate high-resolution imaging results). The light beam generated by the light source 102 may be split into two light beams by beam splitter 104. One beam serves as a reference beam and is directed by mirrors (106 and 118) through second beam splitter 126 into optical imaging sensor 130. The other beam (hereinafter referred to as the imaging beam) is directed through a spatial encoding pattern generator 105, such as a series of optical elements. According to some embodiments of the invention, the spatial coding pattern generator is configured to image the plurality of different spatial coding patterns onto the object to be imaged across a first axis, the first axis being perpendicular to the direction of propagation of the imaging light beam, and to perform a Fourier transform of the plurality of different spatial coding patterns onto the object across a second axis, the second axis being perpendicular to both the first axis and the direction of propagation of the imaging light beam.

First, the imaging beam passes through diffraction grating G1 108 (e.g., 300 lines per millimeter; other gratings may be in the range of 200 to 2/λ grating lines per millimeter, λ being the center illumination wavelength) and is diffracted into a plurality of parallel beams, which are then Fourier transformed in the X-axis direction when passing through cylindrical lens L1 110. L1 has two different focal length values for the two orthogonal axes (e.g., f in the Y axis and 2f in the X axis, e.g., 25.4 mm and 50.8 mm, respectively). Diffraction grating G1 108 is located at a distance 2f (the X-axis focal length of L1) from L1, such that the X-axis Fourier conjugate plane is located at the X-axis focus of L1 and the imaging plane of the beam is located at the Y-axis focus. This splits the imaging beam into multiple beams of different wavelengths, deflected in the X plane to different locations corresponding to their wavelengths, while the original height of the beams in the Y plane remains unchanged. The encoding pattern element 112 (e.g., two Barker-based arrays 114, such as the array depicted in fig. 1B) is disposed further along the propagation direction of the imaging beam, at a distance 2f (the X-axis focal length of L1) from L1 110, where an image of the encoding pattern is formed in the Y axis, so as to correspondingly encode each of the plurality of beams of different wavelengths. Next, the imaging beam passes through lens L2 116, whose X-axis focal length is again twice its Y-axis focal length (e.g., 50.8 mm and 25.4 mm, respectively); lens L2 116 is located at a distance 2f from the encoding pattern element 112 and serves to widen the imaging beam back to its original width in the X axis.

The light exiting L2 116 is directed onto a sample (e.g., tissue within a patient), which may be located at a distance 2f (the X-axis focal length of L2) from L2. The light is transmitted through the sample and collected by the optical imaging sensor 130. A beam splitter 126 may be placed in the path to combine the reference beam with the imaging beam before it impinges on the optical imaging sensor 130.

The respective X-axis focal lengths and Y-axis focal lengths of L1 and L2 may be the same or different.

FIG. 2B illustrates images of light intensity at particular wavelengths on different planes according to some embodiments of the invention. Image (a) shows the intensity of the imaging beam at the X-axis focal plane of L1, just before the encoding pattern. Image (b) shows the intensity of the imaging beam after traversing the encoding pattern: only one line passes through, explicitly encoded along the Y axis. Image (c) is the projected intensity of the imaging beam on the object.

Finally, in the example of fig. 2A, each wavelength (of the plurality of wavelengths emerging from L1) produces a spot that follows the encoding pattern on the Y axis and follows the original beam profile on the X axis. Each wavelength produces a different pattern on the object, depending on the encoding pattern.

Using the encoding pattern to introduce a set of laterally shifted patterns (e.g., in the setup shown in fig. 2A, a pattern encoding a single line, with the encoded line shifted for each wavelength), both image enhancement and coherent gating of the signal can be supported, as explained in the introduction.

FIG. 3 illustrates a system for imaging through a scattering medium using a two-dimensional illumination pattern according to some embodiments of the invention.

The design of system 200 is similar to that of system 100 of FIG. 2A, but with some additional optical elements in the spatially encoded pattern generator, arranged along its optical path in the following order: a second diffraction grating G2 120 and a third lens L3 122. The X-axis focal length of the third lens L3 122 is twice its Y-axis focal length (e.g., 50.8 mm and 25.4 mm, respectively).

Diffraction grating G2 120 (e.g., 300 lines per millimeter; other gratings may be in the range of 200 to 2/λ grating lines per millimeter, λ being the center illumination wavelength) is located at the X-axis focus of lens 116 and at the X-axis focus of lens L3 122.

The focal lengths (X, Y) of the lenses need not be the same (in either of the systems shown in figs. 2A and 3).

The spatially encoded pattern projection generated in this setup is two-dimensional due to the addition of optical elements.

Some embodiments of the invention may utilize discrete wavelength encoding patterns. Some embodiments of the present invention may utilize continuous wavelength (band) encoding patterns.

The second diffraction grating G2 120 may be designed to provide the required function.

For example, for discrete wavelengths, G2 is designed to have a frequency determined by v_0, the frequency of G1, and by f_L1X and f_L2X, the X-axis focal lengths of L1 and L2.

For a continuous wavelength band, a single-frequency grid G2 may have a frequency ΔΩ that depends on N_num, the number of different patterns to be projected on the target, on λ_0, the minimum projection wavelength, and on Δλ.

Using the coding pattern to introduce a set of laterally shifted patterns (e.g., 2D images of the pattern shown in fig. 1B, with the code cyclically shifted by one pixel in the horizontal direction for each wavelength), image reconstruction can be enhanced and a coherent gating signal obtained, as described above.

The mathematical description of the optical setup of the spatially encoded pattern generator is provided as follows:

For plane U(x_0, y_0), assume that the wavefront is constant and tilted to the grid by θ:

Plane U(x_1, y_1), behind grid G1 having frequency v_0:

Plane U(x_2, y_2) is the Fourier transform, scaled by f_1·λ. Assuming only the first diffraction order is taken, we get:

u_2(x_2, y_2) = δ(x_2 - [f_1·sinθ + f_1·v_0·λ])

for discrete wavelengths:

The lens has different focal lengths in its two axes, with f_x = 2·f_y, and the distance from U(x_1, y_1) to plane U(x_2, y_2) equals f_x (= 2·f_y), so that imaging is produced in the y plane and a Fourier transform in the x plane.

In the x plane, the lens has an aperture D of finite diameter, so that there is sufficient room in plane U(x_2, y_2) to place the pattern with minimal variation in illumination intensity:

u_2(x_2, y_2) = δ(x_2 - [f_1·sinθ + f_1·v_0·λ]) ∗ sinc(D·x_2)

The spatially encoded pattern element (encoding mask) may be placed before the focal plane of L1, so as to again obtain a sinc function of the same diameter.

Under these conditions, different color patches at discrete locations can be obtained before the encoding mask, centered at f_1·sinθ + f_1·v_0·λ. For each position, the coding pattern can be matched, as shown in fig. 4. The laser source 402 in the system 400 generates an imaging beam that passes through a grating 404 and a lens 406, and illuminates discrete, separate color spots (blue 410, green 412 and red 414) onto the spatially encoded pattern element 408.
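A short numerical sketch of the spot positions implied by u_2(x_2, y_2) = δ(x_2 - [f_1·sinθ + f_1·v_0·λ]): the focal length and grating frequency below reuse the example values from the setup description, while the three wavelengths and normal incidence are illustrative assumptions.

```python
import math

# Spot position at the encoding-mask plane: x = f1 * (sin(theta) + v0 * lam).
f1 = 50.8e-3    # X-axis focal length of L1 (m), from the example above
v0 = 300e3      # G1 frequency: 300 lines/mm = 3e5 lines/m
theta = 0.0     # assumed normal incidence

positions = {}
for name, lam in [("blue", 450e-9), ("green", 532e-9), ("red", 650e-9)]:
    positions[name] = f1 * (math.sin(theta) + v0 * lam)
    print(f"{name}: {positions[name] * 1e3:.2f} mm")  # 6.86, 8.11, 9.91 mm
```

Each wavelength thus lands on a distinct lateral position of the encoding mask, which is what allows a different sub-pattern to be assigned to each color.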

Plane x_3: multiplication by the coding pattern:

Plane x_4: Fourier transform, scaled by f_2·λ:

Introducing grid G2:

To join the spots together on the optical axis, the frequency of grid G2 should be matched accordingly:

Plane x_6: Fourier transform:

This means that the grid deflects every wavelength back to the optical axis position, regardless of the wavelength.

Another solution involves generating a continuous wavelength band with a single-frequency grating.

The lenses have different focal points such that the length fx-2 × fy from U (x1, y1) to plane U (x2, y2) so that imaging in the y plane and fourier transform in the x plane can be obtained.

In the x plane, the aperture D may be opened so that in plane U(x_2, y_2) there is a delta function for each wavelength:

plane X3Multiplication by the coding pattern:

u_3(x_3, y_3) = δ(x_3 - [f_1·sinθ + f_1·v_0·λ])·B(x_3 - f_1·v_0·λ_min)   (11)

Note that, in the example discussed, the encoding pattern is built from N_p discrete pixels of size ΔX_pt, which means that each pattern may need to be of length L_pt = N_p·ΔX_pt. If N_num different patterns are desired, the light spot may need to be of size L_num = N_num·L_pt = N_num·N_p·ΔX_pt.

This means that the laser spectral band should be:

f1v0Δλ=NnumNpΔXpt (12)
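Rearranging equation (12) gives the required laser spectral band, Δλ = N_num·N_p·ΔX_pt / (f_1·v_0). A small worked example follows; f_1 and v_0 reuse the example values from the setup description, while N_num, N_p and ΔX_pt are illustrative assumptions.

```python
# Required spectral band from equation (12): f1 * v0 * dlam = N_num * N_p * dX_pt.
f1 = 50.8e-3     # X-axis focal length of L1 (m), example value from above
v0 = 300e3       # G1 frequency (lines/m), i.e. 300 lines/mm
N_num = 13       # assumed number of different patterns
N_p = 13         # assumed pixels per pattern
dX_pt = 10e-6    # assumed pattern pixel size (m)

dlam = N_num * N_p * dX_pt / (f1 * v0)
print(f"required bandwidth ~ {dlam * 1e9:.0f} nm")  # ~ 111 nm
```

With these assumed values, a broadband source of roughly a hundred nanometers would be needed; smaller pattern pixels or fewer patterns reduce the required bandwidth proportionally.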

Plane x_4: Fourier transform, scaled by f_2·λ:

Introducing a grid G2 with only one frequency:

Assume that f_1 = f_2 and θ = 0:

where ΔΩ is, for now, left undetermined.

Plane x_5: multiplication by grid G2:

Plane x_6: Fourier transform:

FIG. 5 shows the convolution components of equation 10 for a single wavelength. The top is the left component of the equation and the bottom is the right component; the final convolution for each wavelength is depicted in fig. 6.

Fig. 6 shows the final convolution of equation 10 for a single wavelength.

Thus, for an unlimited number of orders, the entire space may be covered with the coding pattern, but a scaled Barker code may be required.

The minimum wavelength positions are:

x0=λ0f3v0 (11)

The next wavelength that overlaps the first is:

λ_0·f_3·v_0 = λ_1·f_3·(v_0 - ΔΩ)

In general, the nth overlap is:

λ_0·f_3·v_0 = λ_n·f_3·(v_0 - n·ΔΩ),  i.e.,  λ_n = λ_0·v_0/(v_0 - n·ΔΩ)

To use the entire bandwidth, a grid frequency is employed such that exactly N_num replicas are obtained:

and finally:

If f_1 ≠ f_2 (and θ = 0 everywhere), then v_0 is replaced with:

In general, the nth overlap is:

Note that, from equation (14), each pattern region λ_n - λ_(n-1) has a different spectral size, and therefore the pattern should be scaled in each pattern region.

Fig. 7 shows a diagram of the spectral axis, in which the different spectral regions are marked. The starting wavelength of each region is marked with a black dashed line. Each region is eventually shifted to the fundamental spectral region.

The pattern pixels in each region are scaled to fit N_p equally spaced pixels in the fundamental spectral region. At the bottom of the figure, each pattern pixel is shown by a blue line, and the different patterns are shown by filling the space within the specified pattern pixels. The coding pattern placed at the X-axis focus of lens L1 should consist of the entire coding pattern at these wavelength-corresponding positions, as shown in the overall pattern at the bottom.

Fig. 8 shows the spectral regions and pattern pixels of solution 2. The nth spectral region spans λ_n - λ_(n-1); each region has a different spectral size. The pixels in each region are marked by blue lines, and each pattern fills a different spectral pixel differently. Finally, the coding mask includes the same pattern as shown here, at the corresponding positions along the spatial axis of the X-axis focal plane of lens L1.

To find the scale at which N_p pixels are equally spaced in the first region, from equations (14) and (15), the first pattern overlap caused by the grid multiplication is:

and the first region, λ_1 - λ_0, is divided into N_p equally spaced pixels, each of length:

and the starting spectral wavelength for each pixel is:

thus, in each nth replica, the starting spectral wavelength of the mth pixel is:

Solution 3: a continuous wavelength band with a multi-frequency grating.

In the previous section, a grid G2 was shown having a frequency that folds the projected illumination back to the G1 deflection location of the minimum wavelength.

Instead, this may be done using a different G2 grating having multiple frequencies, each of which deflects a different wavelength toward the desired location. The advantage of this method is that it deflects the imaging beam closer to the optical axis than the single-frequency grid method.

To calculate the required frequencies and the wavelength at which each new pattern starts, the following iteration may be considered for λ_1, v_1:

In this case, the replicas on the optical axis are shown in fig. 8. The deflection of each new G2 grid frequency on the spatial-axis plane is shown in the figure. Three example grid frequencies are shown. The thicker lines (on the λ axis) mark the separation wavelengths between the different encoding pattern frequencies.

Applying the above iterative relationship forces each new frequency in the grid to deflect a wavelength to a known location, such that a portion of the laser source bandwidth remains between particular calculated wavelengths in a particular portion. Note that although the set of spatial coding patterns is complete in the first spatial part, between the G1 deflection positions v_0·λ_0 and v_0·λ_1, the spatial region may be enlarged by employing additional wavelengths, shifted or projected so as to arrive at a given spatial location.

In reconstructing an image of the object, the image of each of the different encoding patterns, retrieved from the encoded imaging beams reflected from or transmitted through the object, may be multiplied by the corresponding decoding pattern to obtain products, and all the products may be added to obtain the reconstructed image of the object.
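The multiply-and-sum decoding just described can be sketched for an idealized, noise-free 1D case. This is only an illustration of the principle under assumed ±1 Barker patterns used as their own decoding patterns, not the exact reconstruction algorithm of the system.

```python
import numpy as np

rng = np.random.default_rng(0)

# 13 laterally shifted Barker-13 patterns (values +/-1), one per wavelength.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1.0])
patterns = np.stack([np.roll(barker13, 5 * k) for k in range(13)])

obj = rng.random(13)          # unknown object line (transmission values)
captured = patterns * obj     # ideal measurement under each encoding pattern

# Decode: multiply each captured image by its decoding pattern and sum.
recon = (captured * patterns).sum(axis=0) / len(patterns)  # pattern^2 = 1
print(np.allclose(recon, obj))  # True in this ideal model
```

Because the ±1 patterns square to one, the sum of products recovers the object exactly in this noise-free model; in practice, photons deviating from the expected pattern (the scattered ones) fail this consistency check and are suppressed.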

According to some embodiments of the invention, decoding in the above manner is suitable for imaging through scattering media and for increasing the imaging resolution to super-resolution.

FIG. 9 illustrates a multi-core fiber endoscope 800 including a system for imaging an object through a scattering medium, according to some embodiments of the present invention. The endoscope 800 may include an elongated multi-core fiber body 802 having one or more illumination fibers 804 and one or more imaging fibers 812. A spatially encoded pattern generator 806 may be provided, optically linked to one or more illumination fibers 804, which are designed to guide a plurality of different spatially encoded patterns generated by the spatially encoded pattern generator 806 through the endoscope body 802 and out of its distal end, in order to illuminate an object 814 (e.g., tissue within a patient). One or more imaging fibers 812 of the endoscope receive the illumination light reflected from the object 814 and transmit it (e.g., via the beam splitter 808) into an imaging device 810, which includes an imaging sensor 816 and a processing unit 818.

FIG. 10 is a diagram of a method for imaging an object through a scattering medium according to some embodiments of the invention. The method 900 may include generating 902 a light beam. The method 900 may also include encoding 904 the imaging light beam using a spatial encoding pattern generator to simultaneously illuminate the object with a plurality of different spatial encoding patterns, wherein each of the different encoding patterns is characterized by a different wavelength of the imaging pattern.

The method 900 may also include receiving 906, using an imaging sensor, the encoded imaging light beam transmitted through or reflected from the object, and decoding 908, using a processor, the image data from the imaging and reconstructing an image of the object.

Some embodiments of the invention may be implemented as a system, method or computer program product. Similarly, some embodiments may be embodied as hardware, software, or a combination of both. Some embodiments may be embodied as a computer program product stored on one or more non-transitory computer-readable media (or multiple media) in the form of computer-readable program code embodied thereon. Such non-transitory computer-readable media may include instructions that, when executed, cause a processor to perform method steps according to an example. In some examples, the instructions stored on the computer-readable medium may be in the form of an installed application or in the form of an installation package.

Such instructions may be loaded and executed, for example, by one or more processors.

For example, the computer readable medium may be a non-transitory computer readable storage medium. The non-transitory computer readable storage medium may be, for example, an electronic, optical, magnetic, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.

The computer program code may be written in any suitable programming language. The program code may execute on a single computer system or on multiple computer systems.

Some embodiments are described above with reference to flowchart illustrations and/or block diagrams, which depict methods, systems, and computer program products according to various embodiments.

Features of various embodiments discussed herein may be used with other embodiments discussed herein. The foregoing description of the embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. It will be appreciated by those skilled in the art that many modifications, variations, substitutions, changes, and equivalents are possible in light of the above teaching. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
