Image registration method and system thereof

Document No.: 1957998    Publication date: 2021-12-10

Reading note: This technique, "Image registration method and system thereof" (图像配准方法及其系统), was designed and created by 吕志伟, 卢松暐, 林嘉伟, 陈怡伶, and 何端书 on 2020-04-30. Abstract: An image registration method and a system thereof are provided herein. The method includes providing, by a first imager, a wide-field image of a target region; providing, by a second imager, a narrow-field image of the target region; aligning the narrow-field image of the target region on the wide-field image; capturing an optical image by an optical imager, wherein the optical imager is configured to position the optical image in the narrow-field image; and displaying the position of the optical image on the narrow-field image and the wide-field image of the target region.

1. An image registration method, comprising:

providing, by a first imager, a wide-field image of a target region;

providing, by a second imager, a narrow field of view image of the target region;

aligning the narrow-field image of the target region over the wide-field image; and

displaying a position of the narrow-field image of the target region on the wide-field image.

2. The image registration method of claim 1, wherein the step of aligning the narrow-field-of-view image over the wide-field-of-view image comprises:

increasing a dominant image feature point weight of the wide-field image and the narrow-field image;

extracting feature points of the wide-field image and the narrow-field image, the feature points having at least one of scale invariance, rotation invariance, and affine invariance; and

matching the feature points of the wide-field image and the narrow-field image.

3. The image registration method of claim 2, wherein the step of increasing the dominant image feature point weight comprises down-scaling and/or image-blurring the wide-field image and the narrow-field image.

4. The image registration method of claim 3, wherein a reduction ratio of the down-scaling is in a range of 30 to 90%, 50 to 80%, or 60 to 70%.

5. The image registration method of claim 1, wherein a resolution of the wide-field image is substantially close to a resolution of the narrow-field image.

6. The image registration method according to claim 2, wherein the extracted feature points are at least one of substantially scale invariant, substantially rotation invariant, and substantially affine invariant.

7. The image registration method according to claim 5, wherein the resolution of the wide-field image differs from the resolution of the narrow-field image by approximately 0 to 25 μm, 0 to 20 μm, 0 to 15 μm, 0 to 10 μm, 0 to 5 μm, or 0 to 3 μm.

8. The image registration method of claim 1, wherein the first imager comprises at least one of a dermatoscope, an epiluminescence microscope, and an image mosaicing module.

9. The image registration method of claim 8, wherein a light source of the dermatoscope and/or the epiluminescence microscope comprises at least one LED and/or Wood's lamp.

10. The image registration method of claim 1, wherein a field of view of the first imager is in a range of 5 x 5 mm² to 20 x 20 mm².

11. The image registration method of claim 1, wherein a field of view of the second imager is in a range of 1 x 1 mm² to 5 x 5 mm².

12. The image registration method according to claim 1, further comprising the steps of:

capturing an optical image by an optical imager, wherein the optical imager is configured to position the optical image in the narrow-field-of-view image; and

displaying a position of the optical image on the narrow-field-of-view image and the wide-field-of-view image of the target region.

13. The image registration method of claim 1, wherein a field of view of the optical imager is in a range of 50 x 50 μm² to 1000 x 1000 μm².

14. The image registration method of claim 1, wherein the optical imager is an Optical Coherence Tomography (OCT) device, a Reflective Confocal Microscope (RCM) device, a two-photon emission microscope (TPL) device, a second harmonic generation microscope (SHG) device, a third harmonic generation microscope (THG) device, a Fluorescent Confocal Microscope (FCM) device, or a combination thereof.

15. An image registration system, comprising:

a first imager configured to capture a wide-field image of a target region; and

an optical module comprising a second imager and an optical imager, the second imager and the optical imager sharing the same objective lens, wherein the optical imager is configured to capture an optical image and the second imager is configured to capture a narrow-field image of the target region to align the narrow-field image of the target region over the wide-field image and to display a position of the optical image over the narrow-field image and the wide-field image of the target region.

16. The image registration system of claim 15, wherein the first imager comprises at least one of a dermatoscope, an epiluminescence microscope, and an image mosaicing module.

17. The image registration system of claim 16, wherein a light source of the dermatoscope and/or the epiluminescence microscope comprises at least one LED and/or Wood's lamp.

18. The image registration system of claim 15, wherein the field of view of the first imager is in a range of 5 x 5mm to 20 x 20 mm.

19. The image registration system of claim 15, wherein the field of view of the second imager is in a range of 1 x 1mm to 5 x 5 mm.

20. The image registration system of claim 15, wherein the field of view of the optical imager is in a range of 50 x 50 μm to 1000 x 1000 μm.

21. The image registration system of claim 15, wherein a resolution of the wide-field image is substantially equal to a resolution of the narrow-field image.

22. The image registration system of claim 21, wherein the resolution of the wide-field image differs from the resolution of the narrow-field image by approximately 0 to 25 μm, 0 to 20 μm, 0 to 15 μm, 0 to 10 μm, 0 to 5 μm, or 0 to 3 μm.

23. The image registration system of claim 15, wherein the optical imager is an Optical Coherence Tomography (OCT) device, a Reflective Confocal Microscope (RCM) device, a two-photon emission microscope (TPL) device, a second harmonic generation microscope (SHG) device, a third harmonic generation microscope (THG) device, a Fluorescent Confocal Microscope (FCM) device, or a combination thereof.

Background

According to World Health Organization statistics, the worldwide incidence of skin cancer has increased year by year over the last decade, a trend closely related to lifestyle, the aging of society, and global ozone layer depletion.

Clinically, the diagnosis of any particular skin disorder, including skin cancer, is made by gathering relevant information about the presenting skin lesion, including its location (such as arm, head, or leg), symptoms (itch, pain), duration (acute or chronic), arrangement (isolated, generalized, annular, linear), morphology (small spots, papules, vesicles), and color (red, blue, brown, black, white, yellow). In addition to conventional skin biopsies, an optical diagnostic system may be used to assess skin disorders.

Disclosure of Invention

The present invention provides an image registration method that accurately locates and tracks a target region in a medical diagnostic procedure. The present invention also provides an image registration system having two imagers that share the same optical elements to achieve accurate optical image registration.

The invention relates to an image registration method, which comprises the following steps: providing, by a first imager, a wide-field image of a target region; providing, by a second imager, a narrow field of view image of the target region; aligning a narrow field of view image of the target region on a wide field of view image; capturing an optical image by an optical imager, wherein the optical imager is configured to position the optical image in the narrow-field-of-view image; and displaying the position of the optical image on the narrow-field image and the wide-field image of the target region.

The invention also relates to an image registration system comprising: a first imager configured to capture a wide-field image of a target region; and an optical module comprising a second imager and an optical imager, the second imager and the optical imager sharing the same objective lens, wherein the optical imager is configured to capture an optical image and the second imager is configured to capture a narrow-field image of the target region to align the narrow-field image of the target region over a wide-field image and to display a position of the optical image over the narrow-field image and the wide-field image of the target region.

Incorporation by reference

All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.

Drawings

A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:

Fig. 1 illustrates an exemplary flow chart of the general image registration method of the present invention.

Fig. 2 illustrates an example of the image alignment process.

Fig. 3A and 3B show an exemplary flowchart of the image registration process.

Fig. 4A-4D illustrate an exemplary image registration method of the present invention in various embodiments.

Fig. 5 illustrates an example of an image registration system.

Fig. 6 illustrates yet another embodiment of an image registration system.

Detailed Description

Image registration is the process of transforming different sets of data into one coordinate system. The data may be multiple photographs, or data from different sensors, times, depths, or viewpoints. Registration is necessary in order to compare or integrate the data obtained from these different measurements. For example, one of the primary purposes of image registration is to accurately locate the target region during diagnosis and treatment so as to facilitate high-resolution, non-invasive optical scanning. It can then be used to repeatedly find a target region of interest in subsequent examinations for continuous tracking and follow-up medical care.

Generally, optical images such as Optical Coherence Tomography (OCT) images and Reflectance Confocal Microscopy (RCM) images offer high resolution but a small field of view (FOV). With such a small FOV it is not easy to accurately find a target site/area of interest, and the same site/area often cannot be found again later within a larger area, so the target site/area of interest cannot be tracked; this difficulty increases diagnostic time and treatment cost. For example, at a resolution of 1 μm the field of view is on the order of hundreds of microns, which makes it difficult to locate the scan area within the target region (e.g., at a lesion on the skin).

Given the image alignment problem described above, there is a need to develop an accurate image alignment/registration system, and more particularly a skin image alignment/registration system, that makes it easier to locate and track a target area of the skin for diagnosis and treatment. The present invention helps to accurately place the scanned area on the lesion and records the scanned spots to confirm that the entire lesion has been examined. The efficiency of the whole examination process can thus be greatly improved, and the doctor can scan the same lesion site again at the patient's follow-up visit after the previous examination.

In some embodiments, an image registration method, particularly suitable for skin diagnosis, is provided to accurately locate the position of a target region during skin navigation. The present invention also provides an image registration system having at least two imagers that share the same optical elements to accurately achieve optical image registration.

Fig. 1 shows an exemplary flow chart of the general image registration method of the present invention, which includes the steps of: providing a wide-field image of the target region by the first imager (step 11); providing a narrow-field image of the target region by the second imager (step 12); aligning the narrow-field image of the target region on the wide-field image (step 13); capturing an optical image by an optical imager, wherein the optical imager is configured to position the optical image in the narrow-field image (step 14); and displaying the position of the optical image on the wide-field image (step 15), thereby achieving registration of the optical image on the narrow-field image and the wide-field image of the target region.

In some embodiments, the optical image is an Optical Coherence Tomography (OCT) image, a Reflectance Confocal Microscope (RCM) image, a two-photon emission microscope (TPL) image, a second harmonic generation microscope (SHG) image, a third harmonic generation microscope (THG) image, a Fluorescence Confocal Microscope (FCM) image, or the like. In some embodiments, the optical imager is a corresponding device/system that can produce such OCT, RCM, TPL, SHG, THG, or FCM images. In certain embodiments, the optical image is an OCT image or an RCM image.

Two approaches are commonly used to implement the feature extraction and feature matching processes in image registration: region-based matching techniques and feature-based matching techniques. For skin image registration, feature-based matching techniques are preferred for the feature extraction and matching processes, since adjacent areas of similar skin tone are otherwise difficult to distinguish.

Additionally, skin deformation, image rotation, and scale differences between image frames occur during skin scanning. For these reasons, a feature-based matching technique is a suitable way of registering skin images. Among feature-based techniques there are again two families, blob-based techniques and corner-based techniques. For highly magnified skin images, which have few sharp-corner or sharp-edge attributes, blob-based techniques are the preferred way of performing skin image feature extraction and matching. In some embodiments, the blob-based technique is at least one algorithm selected from the group consisting of the SURF algorithm, the SIFT algorithm, and the KAZE algorithm, preferably, but not limited to, the SURF algorithm and the SIFT algorithm. In some embodiments, the SURF algorithm is preferred because of its insensitivity to skin deformation, skin image rotation, and inter-frame scale differences; it also performs better in terms of image registration processing speed, enabling real-time skin image navigation.
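For concreteness, the following is a minimal Python/OpenCV sketch of blob-based feature extraction and ratio-test matching. SIFT is used here because stock opencv-python ships it (the SURF implementation sits in the separate opencv-contrib "nonfree" module); the file names are placeholders and not anything specified by this disclosure.

```python
import cv2

# Blob-based feature extraction and matching (minimal sketch).
# SIFT is used because it ships with stock opencv-python; SURF would need the
# opencv-contrib "nonfree" build (cv2.xfeatures2d.SURF_create). File names are
# placeholders for the wide-field (dermatoscope) and narrow-field (guide) images.
wide = cv2.imread("dermatoscope_wide_field.png", cv2.IMREAD_GRAYSCALE)
narrow = cv2.imread("guide_narrow_field.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_wide, des_wide = sift.detectAndCompute(wide, None)
kp_narrow, des_narrow = sift.detectAndCompute(narrow, None)

# Lowe's ratio test keeps only distinctive matches, which matters when
# neighbouring skin regions have similar tone and texture.
matcher = cv2.BFMatcher(cv2.NORM_L2)
pairs = matcher.knnMatch(des_narrow, des_wide, k=2)
good = [m for m, n in pairs if m.distance < 0.75 * n.distance]
print(f"{len(good)} reliable matches between the narrow- and wide-field images")
```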

In order to accurately register the optical image on the wide-field image, in some embodiments, as shown in fig. 2, the process of aligning the narrow-field image on the wide-field image includes the steps of: increasing the weight of the dominant image feature points of the wide-field image and the narrow-field image (step 131); extracting feature points of the wide-field image and the narrow-field image (step 132); and matching the feature points of the wide-field image and the narrow-field image (step 133).

In some embodiments, the step of increasing the dominant image feature point weight comprises down-scaling and/or blurring the wide-field image and the narrow-field image. In terms of the reduction process, the ratio is preferably 30 to 90%, more preferably 50 to 80%, more preferably 60 to 70%, but is not limited thereto. This downscaling step may effectively increase the speed of the image registration process and make the resolution of the wide-field image substantially equal/close to the resolution of the narrow-field image. In some embodiments, it also has the function of enhancing the feature weight of the primary image points and reducing the feature weight of the secondary image points. In other embodiments, the image blurring process exhibits primarily the effect of increasing the primary image feature point weight and decreasing the secondary image feature point weight. Therefore, if both the narrow-field image and the wide-field image have higher resolution, the step of reducing and/or blurring the images will help to improve the accuracy and timeliness of image registration during image scanning (e.g., skin image diagnosis).
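As a concrete illustration of this preprocessing step, the sketch below down-scales and blurs an image before feature extraction. The 65% scale factor and 5x5 Gaussian kernel are illustrative assumptions within the preferred ranges above, not values fixed by the method.

```python
import cv2

def emphasize_dominant_features(image, scale=0.65, blur_ksize=5):
    """Down-scale and blur an image so that dominant (large, high-contrast)
    features carry more weight than fine secondary details at feature
    extraction time. `scale` is the reduction ratio (0.65 lies in the
    preferred 60-70% range); `blur_ksize` is an odd Gaussian kernel size."""
    reduced = cv2.resize(image, None, fx=scale, fy=scale,
                         interpolation=cv2.INTER_AREA)
    return cv2.GaussianBlur(reduced, (blur_ksize, blur_ksize), 0)

# Prepare both images before feature extraction (file names are placeholders).
wide = cv2.imread("dermatoscope_wide_field.png", cv2.IMREAD_GRAYSCALE)
narrow = cv2.imread("guide_narrow_field.png", cv2.IMREAD_GRAYSCALE)
wide_prepared = emphasize_dominant_features(wide)
narrow_prepared = emphasize_dominant_features(narrow)
```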

In some embodiments, the extracted feature points are at least one of substantially scale invariant, substantially rotation invariant, and substantially affine invariant. In a certain embodiment, the feature points satisfy scale invariance, rotation invariance, and affine invariance. In the step of extracting feature points, "substantially invariant" means that any one of the scale, rotation, and affine properties need not be completely invariant; as long as the feature remains recognizable, a slight variation in the attribute (at least one of scale, rotation, and affine) is allowed.

In some embodiments, the resolution of the wide-field image is substantially equal to or close to the resolution of the narrow-field image. Here, "substantially equal to or close to" means that the resolutions differ by about 0 to 25 μm, preferably 0 to 20 μm, preferably 0 to 15 μm, preferably 0 to 10 μm, preferably 0 to 5 μm, and most preferably 0 to 3 μm. The closer the resolutions of the narrow-field image and the wide-field image, the closer the detailed image features of the two images, which improves the success rate of image registration.
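A short worked example (with assumed sensor numbers, since the disclosure specifies only FOV ranges) shows how a down-scaling ratio can be chosen so that the two images end up with substantially the same sampling resolution:

```python
def pixel_pitch_um(fov_mm, pixels):
    """Physical sampling interval of an imager in micrometres per pixel."""
    return fov_mm * 1000.0 / pixels

# Assumed example values; the disclosure only gives FOV ranges, not pixel counts.
wide_pitch = pixel_pitch_um(fov_mm=15.0, pixels=3000)    # 5.0 μm/pixel
narrow_pitch = pixel_pitch_um(fov_mm=3.0, pixels=1500)   # 2.0 μm/pixel

# Down-scale the finer (narrow-field) image so both share the coarser pitch,
# i.e. a resolution difference of ~0 μm, well inside the preferred ranges above.
scale = narrow_pitch / wide_pitch                          # 0.4
print(f"down-scale the narrow-field image to {scale:.0%} of its size; "
      f"both images then sample at {wide_pitch:.1f} μm/pixel")
```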

Fig. 3A and 3B further provide a flow chart illustrating an exemplary image registration process applied to skin optical image registration (e.g., Optical Coherence Tomography (OCT) images, Reflectance Confocal Microscope (RCM) images, two-photon emission microscope (TPL) images, second harmonic generation microscope (SHG) images, third harmonic generation microscope (THG) images, or Fluorescence Confocal Microscope (FCM) images, etc.).

In some embodiments, an exemplary skin image registration method, as shown in fig. 3A, includes the steps of: step 20: acquiring a dermatoscope image (i.e., the wide-field image) with the dermatoscope (i.e., the first imager); step 21: reducing and blurring the dermatoscope image to increase the weight of the dominant image feature points; step 22: extracting image feature points of the dermatoscope image; step 23: acquiring a new guide image (i.e., a narrow-field image) with the image-guide imager (i.e., the second imager); step 24: reducing and blurring the guide image to increase the weight of the dominant image feature points; step 25: extracting feature points of the guide image; step 26: matching the feature points of the two images (the dermatoscope image and the new guide image); step 27: updating the matched position of the guide image on the dermatoscope image; step 28: determining whether to scan an optical image (i.e., an OCT, RCM, TPL, SHG, THG, or FCM image); if so, proceeding to step 281: scanning and obtaining the optical image; and step 282: displaying the position of the optical image on the dermatoscope image. If the images do not match correctly, new guide images are continuously acquired through steps 29 and 23. When the optical image registration is complete, the user stops acquiring new guide images at step S.
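The loop of fig. 3A can be summarized in code roughly as follows. This is a sketch under stated assumptions (SIFT stands in for SURF, the reduction ratio is fixed at 65%, and camera acquisition is left out), not the literal implementation of the disclosed system.

```python
import cv2
import numpy as np

SCALE = 0.65  # assumed reduction ratio, within the preferred 60-70% range

def preprocess(img, ksize=5):
    # Steps 21/24: reduce and blur to raise the weight of the dominant features.
    img = cv2.resize(img, None, fx=SCALE, fy=SCALE, interpolation=cv2.INTER_AREA)
    return cv2.GaussianBlur(img, (ksize, ksize), 0)

def locate_guide_on_dermatoscope(derm_img, guide_img, detector, matcher):
    """Steps 22/25/26/27: extract and match features, then estimate the
    homography mapping guide-image pixels onto dermatoscope-image pixels.
    Returns a 3x3 matrix in full-resolution coordinates, or None on failure."""
    kp_d, des_d = detector.detectAndCompute(preprocess(derm_img), None)
    kp_g, des_g = detector.detectAndCompute(preprocess(guide_img), None)
    if des_d is None or des_g is None:
        return None
    pairs = matcher.knnMatch(des_g, des_d, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < 4:  # a homography needs at least four correspondences
        return None
    # Keypoints live in the reduced images; divide by SCALE to return to
    # full-resolution pixel coordinates before fitting the homography.
    src = np.float32([kp_g[m.queryIdx].pt for m in good]) / SCALE
    dst = np.float32([kp_d[m.trainIdx].pt for m in good]) / SCALE
    H, _ = cv2.findHomography(src.reshape(-1, 1, 2), dst.reshape(-1, 1, 2),
                              cv2.RANSAC, 5.0)
    return H

detector = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)
```

In the loop of fig. 3A, the dermatoscope image would be acquired once, each newly captured guide frame would be passed to such a function, and on a failed match the last successfully updated position would simply be kept while new guide images continue to be acquired.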

In some embodiments, as shown in fig. 3A, the optical image need not be scanned. In that case, the image registration process does not involve an optical image and is complete once the narrow-field image has been aligned. In some embodiments, there is provided an image registration method comprising: providing, by a first imager, a wide-field image of a target region; providing, by a second imager, a narrow-field image of the target region; aligning the narrow-field image of the target region on the wide-field image; and displaying a position of the narrow-field image of the target region on the wide-field image. The process of aligning the narrow-field image over the wide-field image is the same whether or not an optical image is included.

In other embodiments, the image registration of the present invention further comprises a mosaicing step (guide-image mosaicing). This method compares the overlap regions between successive frames and stitches the images in real time, as shown in fig. 3B. Step 200: acquiring a new guide image; step 201: obtaining the current stitched-region image; step 202: processing the two images, including reduction, image blurring, and histogram equalization; step 203: extracting features of the two images; step 204: matching the features of the two images; step 205: blur screening of the guide image; step 206: transforming the guide image; step 207: stitching the two images; step 208: updating the stitched image and its position; step 209: determining whether an interrupt is required; if so, proceeding to step S (stop), and if not, returning to step 200 to acquire another new guide image and start the process again. In this method, any image may be used to obtain a stitched image with an identified location. The histogram equalization step enhances the image features that remain after the secondary image features are suppressed, thereby improving matching reliability. In some embodiments, there is provided an image mosaicing method comprising the steps of: acquiring a new guide image; obtaining the current stitched-region image; processing the guide image and the stitched-region image, the processing including reduction, image blurring, and histogram equalization; extracting image features; matching the features of the guide image and the stitched-region image; blur-screening the guide image; transforming the guide image; and stitching the guide image with the stitched-region image, and updating the resulting stitched image and its position.
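A rough sketch of one iteration of this stitching loop is given below. It follows steps 202-208 under assumptions (SIFT as the detector, a simple match-count threshold standing in for the blur screening of step 205, and a pre-allocated mosaic canvas), and is not the literal implementation.

```python
import cv2
import numpy as np

SCALE = 0.65  # assumed reduction ratio within the preferred 60-70% range

def prep(img, ksize=5):
    # Step 202: reduction, blurring, and histogram equalization, so that the
    # dominant features are strengthened and secondary details suppressed.
    img = cv2.resize(img, None, fx=SCALE, fy=SCALE, interpolation=cv2.INTER_AREA)
    img = cv2.GaussianBlur(img, (ksize, ksize), 0)
    return cv2.equalizeHist(img)

def stitch_frame(mosaic, frame, detector, matcher, min_matches=12):
    """One pass of steps 203-208: warp the new guide `frame` into the
    coordinate system of the current `mosaic` (both 8-bit grayscale) and
    paste it into empty regions. Returns the updated mosaic, or the old
    mosaic unchanged if matching fails."""
    kp_m, des_m = detector.detectAndCompute(prep(mosaic), None)
    kp_f, des_f = detector.detectAndCompute(prep(frame), None)
    if des_m is None or des_f is None:
        return mosaic
    pairs = matcher.knnMatch(des_f, des_m, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < min_matches:   # crude stand-in for blur screening (step 205)
        return mosaic
    # Keypoints live in the reduced images; divide by SCALE to recover
    # full-resolution coordinates before estimating the transform (step 206).
    src = np.float32([kp_f[m.queryIdx].pt for m in good]) / SCALE
    dst = np.float32([kp_m[m.trainIdx].pt for m in good]) / SCALE
    H, _ = cv2.findHomography(src.reshape(-1, 1, 2), dst.reshape(-1, 1, 2),
                              cv2.RANSAC, 5.0)
    if H is None:
        return mosaic
    # Step 207: warp the frame into mosaic coordinates; step 208: update the
    # mosaic by filling pixels that are still empty (a very simple composite).
    warped = cv2.warpPerspective(frame, H, (mosaic.shape[1], mosaic.shape[0]))
    updated = mosaic.copy()
    updated[updated == 0] = warped[updated == 0]
    return updated

detector = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)
```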

In some embodiments, the methods of the present invention are applied to image registration of OCT images. Since the field of view of the dermatoscope image (wide-field image) is much wider than that of the guide image (narrow-field image), this region can cover most commonly sized lesions. As shown in fig. 4A, in step 30 the lesion image (wide-field image) is captured with the dermatoscope. Next, in step 31, the probe of the OCT/guidance system (which may include the narrow-field imager and the optical imager) is attached to the lesion, and guide images are then captured continuously. In step 32, software utilizing the inventive method disclosed herein identifies the current location on the displayed dermatoscope image through the image registration process, and in step 33 the system therefore constructs the image in real time. Since the spatial relationship between the OCT imaging system and the guidance imaging system (which takes the narrow-field image) is fixed, the OCT imaging position in this example can also be identified.

In situations where the OCT image (or another imaging modality that provides sub-surface information) and the guide image cannot be acquired simultaneously (e.g., where the OCT and guide systems cannot be optically separated and may interfere with each other), the present invention provides yet another embodiment of the image registration method, as shown in fig. 4B. The difficulty in this situation is that the OCT imaging location is marked based on the last guide image acquired before the OCT scan begins, and this location may be inaccurate if the user moves the probe during the mode switch and/or the OCT scan. In step 301, an image of the lesion is taken by a wide-field imager (e.g., a dermatoscope). Step 302: the system is turned on in the image guidance mode. Step 303: the probe is moved and attached to the lesion. Step 304: the location of the narrow-field image (e.g., the guide image) is identified by the software. Step 305: while the system is in the image-guided mode, the probe is moved to construct a target image (e.g., a lesion image). Here, if the probe is moved to a location of interest, step 309 switches to the OCT scan mode for scanning (i.e., acquiring an optical image by the optical imager), and the system then marks the scan location on the constructed image. Step 306: the construction of the lesion image is completed. Step 307: if there are more regions of interest that have not yet been scanned, the probe is moved to the next location of interest. Step 310: if the probe is moved to a location of interest, the system switches to the OCT scan mode to scan (i.e., acquire an optical image by the optical imager) and then marks the scan position on the constructed image.
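Because the guide imager and the OCT imager share the same objective, the OCT footprint occupies a fixed, calibration-determined rectangle in guide-image pixel coordinates; marking the scan on the dermatoscope image then amounts to pushing that rectangle through the most recent guide-to-dermatoscope homography. A sketch under assumptions (the homography and rectangle values are placeholders, not values from this disclosure):

```python
import cv2
import numpy as np

def mark_oct_footprint(dermatoscope_img, H_guide_to_derm, oct_corners_guide_px):
    """Draw the OCT scan footprint on the (grayscale) dermatoscope image.

    H_guide_to_derm      : 3x3 homography from guide-image pixels to
                           dermatoscope-image pixels (full resolution)
    oct_corners_guide_px : four corner points of the OCT FOV in guide-image
                           pixels; fixed by calibration because the OCT imager
                           and the guide imager share the same objective."""
    corners = np.float32(oct_corners_guide_px).reshape(-1, 1, 2)
    mapped = cv2.perspectiveTransform(corners, np.float32(H_guide_to_derm))
    annotated = cv2.cvtColor(dermatoscope_img, cv2.COLOR_GRAY2BGR)
    cv2.polylines(annotated, [np.int32(mapped)], isClosed=True,
                  color=(0, 0, 255), thickness=2)
    return annotated

# Assumed example: a 500 x 500 pixel OCT footprint centred in a 1500 x 1500
# pixel guide image (actual values come from calibration, not this disclosure).
oct_corners = [(500, 500), (1000, 500), (1000, 1000), (500, 1000)]
```

Repeating this marking for each scan produces the record of scanned spots on the lesion overview mentioned above.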

In some embodiments, for example in OCT B-scan mode, the guide image (i.e., the narrow-field image) cannot be acquired synchronously with the OCT scan, but the OCT scan position can still be registered by mosaicing; for this case, the present invention provides yet another embodiment of the image registration method, as shown in fig. 4C. In step 51, an image of the lesion is taken by a wide-field imager (e.g., a dermatoscope). Step 52: the system is turned on in the image guidance mode. Step 53: the probe is moved and attached to the lesion. Step 54: the location of the narrow-field image (e.g., the guide image) is identified by software utilizing the inventive methods disclosed herein. Step 55: while the system is in the image-guided mode, the probe is moved to construct a target image (e.g., a lesion image). If the probe is moved to a location of interest, then in step 59 the system switches to the OCT scan mode to scan. Step 60 may then be taken to acquire guide images (i.e., narrow-field images) at specific time intervals during the scan and to identify the location of these images on the dermatoscope image. If no location can be identified, step 62 uses the last successfully located image for OCT image location marking; if a location is identified, step 61 uses that image for OCT image location marking. After step 63, the next OCT scan is started, or the system returns to the guidance mode at step 55 or step 57. Step 56: the construction of the lesion image is completed. If there are more regions of interest that have not yet been scanned, then in step 57 the probe is moved to the next location of interest. Step 58: the entire scan is completed.

In some embodiments, for example in OCT E-scan mode, the OCT image and the guide image cannot be acquired simultaneously, but the OCT scan position can still be registered by mosaicing the OCT images themselves; for this case, the present invention provides yet another embodiment of the image registration method, as shown in fig. 4D. In step 61, an image of the lesion is taken by a wide-field imager (e.g., a dermatoscope). Step 62: the system is turned on in the image guidance mode. Step 63: the probe is moved and attached to the lesion. Step 64: the location of the narrow-field image (e.g., the guide image) is identified by software utilizing the inventive methods disclosed herein. Step 65: while the system is in the image-guided mode, the probe is moved to construct a target image (e.g., a lesion image). If the probe is moved to a location of interest, then in step 69 the system switches to the OCT scan mode to scan, and the first acquired OCT image (i.e., optical image) is recorded. Step 70 may then be taken to attempt to identify the relative displacement of the next OCT image. If this fails, step 72 uses the last successfully located image for OCT image location marking; if it succeeds, step 71 uses the displacement together with the last recorded guide image to mark the OCT scan on the dermatoscope image. Thereafter, in step 73, the next OCT scan is started, or the system returns to the guidance mode at step 65 or step 70. Step 66: the construction of the lesion image is completed. If there are more regions of interest that have not yet been scanned, the probe is moved to the next location of interest in step 67. Step 68: the entire scan is completed.
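One plausible way to realize the frame-to-frame displacement estimation of step 70 is phase correlation between consecutive OCT en-face frames. The sketch below is an assumption about how that step could be implemented, not the patented implementation, and the acceptance threshold is illustrative.

```python
import cv2
import numpy as np

def frame_displacement(prev_frame, next_frame, min_response=0.15):
    """Estimate the (dx, dy) shift between two consecutive OCT en-face frames
    by phase correlation. Returns the shift in pixels and a success flag; a
    low correlation response signals an unreliable estimate, in which case the
    caller falls back to the last successfully marked position (step 72)."""
    f0 = np.float32(prev_frame)
    f1 = np.float32(next_frame)
    (dx, dy), response = cv2.phaseCorrelate(f0, f1)
    return (dx, dy), response >= min_response

# Accumulating these shifts from the first OCT frame (whose position was fixed
# by the last recorded guide image) lets each later scan be marked on the
# dermatoscope image even though no guide image is available during the scan.
```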

Fig. 5 provides an exemplary image registration system of the present invention. The first imager A (e.g., a dermatoscope) is configured to capture a wide-field image of the target area. The optical module B includes a second imager and an optical imager, and the computer C is configured to connect and control the first imager A and the optical module B. The second imager and the optical imager share the same objective module 46, so that the FOV of the optical image overlaps with the FOV of the narrow-field image provided by the second imager. The optical imager is configured to capture an optical image, and the second imager is used to capture a narrow-field image of the target area; the narrow-field image is aligned with the wide-field image of the target area, and the optical image is displayed on the narrow-field image and the wide-field image of the target area.

In detail, as shown in fig. 5, the first imager A capturing the wide-field image is, for example, at least one of a dermatoscope, an epiluminescence microscope, and an image mosaicing module. Those skilled in the art will readily recognize and adapt other suitable first imagers. In some embodiments, the first imager A includes a first imager optical lens 31 and a first camera 32 that can be controlled by the computer C. Additionally, in some embodiments, the light source of the dermatoscope and/or epiluminescence microscope comprises at least one LED and/or Wood's lamp, but is not limited thereto. Those skilled in the art will readily recognize other suitable light sources for practicing the present invention. The optical module B includes two imagers: a second imager that provides a narrow field of view in the image-guided mode, and an optical imager that provides an optical image. The optical image is preferably an Optical Coherence Tomography (OCT) image, a Reflectance Confocal Microscope (RCM) image, a two-photon emission microscope (TPL) image, a second harmonic generation microscope (SHG) image, a third harmonic generation microscope (THG) image, a Fluorescence Confocal Microscope (FCM) image, or a combination thereof. More preferably, the optical image is an OCT image or an RCM image. The second imager includes a light source 463, surrounded by the objective module 46, to provide light on the target area of the sample 5, and a beam splitter 44 that guides the light signal through a projection lens 471 to the second camera 481. The optical imager includes a light source 40 that provides light through an optical fiber 41 to an optical lens 42; the light passes through a polarization beam splitter 43, the beam splitter 44, and a quarter-wave plate 45, which converts the light into circularly polarized light; and an objective module 46 having an objective lens 461 and an interference device 462, through which the light reaches the sample 5. When light is backscattered from the sample 5, the polarization beam splitter 43 directs it through a projection lens 472 to a third camera 482. The computer C is configured to control the light source 40 and to process the images from the second and third cameras 481 and 482.

The alignment system of the present invention can continuously align an optical image on a dermatoscope image and present a plurality of scanning positions of the optical image on the dermatoscope image so as to clearly indicate/identify the scanning positions of the optical image.

Fig. 5 shows an exemplary optical system comprising a Mirau-type interferometer, while fig. 6 illustrates a Linnik-type interferometer. Other suitable optical imagers disclosed herein may be readily adapted by those skilled in the art; for example, a Michelson-type interferometer or a Mach-Zehnder interferometer may be selected as necessary. In addition, other optical systems, such as RCM, TPL, SHG, THG, or FCM, may also be substituted by those skilled in the art as necessary to meet the requirements.

The only difference between fig. 6 and fig. 5 is the objective lenses 46a and 46b, where objective lens 46a provides the sample arm from the sample 5 covered by glass plate 463, and objective lens 46b provides the reference arm from mirror 462.

In the present invention, the optical image is well registered with the dermatoscope image, since the optical imager and the image-guided imager share the same optical objective, and the FOV of the optical imager and the FOV of the image-guided imager substantially overlap or are the same.

In some embodiments, there is provided an image registration system comprising: a first imager configured to capture a wide-field image of a target region; and an optical module comprising a second imager and an optical imager, the second imager and the optical imager sharing the same objective lens, wherein the optical imager is configured to capture an optical image and the second imager is configured to capture a narrow-field image of the target region to align the narrow-field image of the target region over the wide-field image and to display a position of the optical image over the narrow-field image and the wide-field image of the target region. In certain embodiments, the first imager comprises at least one of a dermatoscope, an epiluminescence microscope, and an image mosaicing module. In certain embodiments, the light source of the dermatoscope and/or epiluminescence microscope comprises at least one LED and/or Wood's lamp. In certain embodiments, the field of view of the first imager is in the range of 5 x 5 mm to 20 x 20 mm. In certain embodiments, the field of view of the second imager is in the range of 1 x 1 mm to 5 x 5 mm. In certain embodiments, the field of view of the optical imager is in the range of 50 x 50 μm to 1000 x 1000 μm. In some embodiments, the resolution of the wide-field image is substantially equal to the resolution of the narrow-field image. In some embodiments, the resolution of the wide-field image differs from the resolution of the narrow-field image by about 0 to 25 μm, 0 to 20 μm, 0 to 15 μm, 0 to 10 μm, 0 to 5 μm, or 0 to 3 μm. In certain embodiments, the optical imager is an Optical Coherence Tomography (OCT) device, a Reflectance Confocal Microscope (RCM) device, a two-photon emission microscope (TPL) device, a second harmonic generation microscope (SHG) device, a third harmonic generation microscope (THG) device, a Fluorescence Confocal Microscope (FCM) device, or a combination thereof.

In some embodiments, the first imager has a field of view (FOV) in the range of 5 x 5 mm to 20 x 20 mm, preferably 6 x 6 mm to 17 x 17 mm, preferably 10 x 10 mm to 15 x 15 mm, but is not limited thereto. The FOV of the second imager is in the range of 1 x 1 mm to 5 x 5 mm, preferably 2 x 2 mm to 4.5 x 4.5 mm, preferably 3 x 3 mm to 4 x 4 mm, but is not limited thereto. In addition, the FOV of the optical imager is in the range of 50 x 50 μm² to 1000 x 1000 μm², preferably 100 x 100 μm² to 800 x 800 μm², preferably 300 x 300 μm² to 600 x 600 μm², and preferably 400 x 400 μm² to 500 x 500 μm², but is not limited thereto. Since the FOV of the narrow-field image overlaps the FOV of the optical imager, the position of the optical image in the narrow-field image is always easily tracked; otherwise, a marker is provided to accurately identify the position of the optical image on the narrow-field image.

While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, modifications, and alternatives will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
