Multispectral imaging system and method based on image exposure

Document No.: 1432876  Publication date: 2020-03-20

Reading note: This technology, "Multispectral imaging system and method based on image exposure" (一种基于图像曝光的多光谱成像系统和方法), was designed and created by 胡文忠, 张宇, 戴玉蓉, 陈继东, and 聂红林 on 2018-09-12. Its main content is as follows: The application discloses a multispectral imaging system and method based on image exposure. The multispectral imaging system comprises: a light source module that provides visible light and generates excitation light, an illuminated region being irradiated by a combination of the visible light and the excitation light and, excited by the excitation light, releasing fluorescence; and a camera module comprising one or more image sensors that receive the visible light and fluorescence in light reflected from the illuminated region, forming a multispectral image that contains visible spectral band image information and fluorescence spectral band image information. The exposure time value of the visible spectral band image varies periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames. The application yields a clear, bright visible-light background image together with high-contrast, low-noise fluorescence imaging.

1. A multispectral imaging system, comprising:

a light source module that provides visible light and generates excitation light, wherein an illuminated region is irradiated by a combination of the visible light and the excitation light, and the illuminated region, excited by the excitation light, releases fluorescence;

a camera module comprising one or more image sensors that receive the visible light and the fluorescence in light reflected from the illuminated region, to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information;

wherein the exposure time value of the visible spectral band image varies periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames.

2. The multispectral imaging system of claim 1, wherein:

wherein the exposure time values of the visible spectral band image comprise a first exposure time value and a second exposure time value, which appear in alternating periodic cycles across consecutive video frames.

3. The multispectral imaging system of claim 2, wherein:

wherein the image sensors include a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band; each period is a two-frame image; the exposure time value of the first image sensor is held constant across consecutive frames; the exposure time value of the second image sensor is set to the first exposure time value of the visible spectral band image in the first half of each period and to the second exposure time value in the second half, the first and second exposure time values being different.

4. The multispectral imaging system of claim 2, wherein:

wherein the first and second exposure time values of the visible spectral band image are set according to a preset reference exposure time value for the visible spectral band.

5. The multispectral imaging system of claim 2, wherein:

wherein the first and second exposure time values of the visible spectral band image are set based on the exposure time value of the fluorescence spectral band image.

6. The multispectral imaging system of claim 3, wherein:

wherein the image sensors employ a progressive-scan exposure mode.

7. The multispectral imaging system of claim 3, wherein the difference in light intensity between adjacent frames is moderated by one or more of:

adjusting the exposure time values of the visible spectral band image in the adjacent frames; applying post-processing image enhancement, or weighted-algorithm processing, to the visible spectral band images of the adjacent frames; adjusting the hardware gain values applied to the output images in the image sensors; or increasing the image frame rate.

8. The multispectral imaging system of claim 2, wherein:

wherein the image sensor employs an interlaced exposure mode; each image frame is divided into two field images, the odd field corresponding to the odd lines of the image sensor's color filter array and the even field to the even lines; each period comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible spectral band image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.

9. The multispectral imaging system of claim 2, wherein:

wherein the first exposure time value of the visible spectral band image is greater than the second exposure time value.

10. The multispectral imaging system of claim 8, wherein:

wherein post-processing image enhancement is applied to the image information of adjacent odd and even fields, which are then output separately for interlaced-scan display.

11. A method of multispectral imaging, comprising:

irradiating an illuminated region with a combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence;

receiving visible light and fluorescence in reflected light from the illuminated area to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information;

wherein the exposure time value of the visible spectral band image varies periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames.

12. The multispectral imaging method of claim 11, wherein:

wherein the exposure time values of the visible spectral band image comprise a first exposure time value and a second exposure time value, which appear in alternating periodic cycles across consecutive video frames.

13. The multispectral imaging method of claim 12, wherein:

wherein each period is two frames; the exposure time value of the fluorescence spectral band image is held constant across consecutive frames; the exposure time value of the visible spectral band image is set to the first exposure time value in the first half of each period and to the second exposure time value in the second half, the first and second exposure time values being different.

14. The multispectral imaging method of claim 12, wherein:

wherein the visible spectral band image is exposed in an interlaced manner; each frame is divided into two field images, the odd field corresponding to the odd lines of the image and the even field to the even lines; each period comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible spectral band image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.

15. A computer-readable storage medium storing computer-executable instructions for:

irradiating an illuminated region with a combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence;

receiving visible light and fluorescence in reflected light from the illuminated area to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information;

wherein the exposure time value of the visible spectral band image varies periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames.

Technical Field

The present application relates to the optical imaging arts, and more particularly to the field of endoscopic multispectral imaging.

Background

Optical imaging is widely used in clinical medicine: it is harmless to the human body, non-invasive, highly sensitive, and capable of in-vivo multi-target imaging. With the continuing development of fluorescence imaging agents, fluorescence imaging has become an important tool for intraoperative navigation. Technology based on the near-infrared fluorescent agent indocyanine green (ICG) is now maturing. After intravenous injection, ICG binds to plasma proteins and, when excited by near-infrared light around 805 nm, emits fluorescence around 835 nm, a wavelength longer than that of the excitation light. The imaging system processes the captured fluorescence signal algorithmically to display white-light, fluorescence, or superimposed images on a screen in real time. Compared with traditional organic dyes, real-time fluorescence imaging offers higher contrast and better identification of target tissue, and avoids dye dispersion late in surgery.

For simultaneous imaging of visible light and fluorescence, each output image frame is a multispectral fusion image that can be divided, by spectral wavelength band, into visible spectral band image information and fluorescence spectral band image information. When fluorescence imaging guides tumor resection, the surgeon needs to ablate the fluorescently labeled tissue precisely, which requires clear tissue regions and well-defined fluorescence boundaries. At the same time, the surgeon also wants the background tissue in the visible spectral band to be clear and distinct, to better identify the tissue. However, the labeled fluorescence region contains information in both the fluorescence and visible bands, and the visible light intensity is typically much greater than the fluorescence intensity. The fluorescence is therefore often overwhelmed by visible light, making it dark, unclear, and low in contrast in the multispectral image. Moreover, tissue at the border of a fluorescence region is usually less diseased and fluoresces more weakly, so at high visible light intensities the fluorescence border becomes blurred and indistinct.

Accordingly, there is a need in the art for improved multispectral imaging techniques to improve the image quality of simultaneous imaging of visible light and excited fluorescence.

Disclosure of Invention

To achieve the above object, the present application provides a multispectral imaging system comprising: a light source module that provides visible light and generates excitation light, wherein an illuminated region is irradiated by a combination of the visible light and the excitation light and, excited by the excitation light, releases fluorescence; and a camera module comprising one or more image sensors that receive the visible light and fluorescence in light reflected from the illuminated region, to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information. The exposure time value of the visible spectral band image is varied periodically across consecutive video frames, so that visible spectral band image information of different light intensities is output in alternating cycles in different frames of the video.

According to one aspect of the application, the exposure time values of the visible spectral band image include a first exposure time value and a second exposure time value, which appear in alternating periodic cycles in each period of the video frames.

According to one embodiment of the present application, the image sensors include a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band; each period is a two-frame image; the exposure time value of the first image sensor is held constant across consecutive frames; the exposure time value of the second image sensor is set to the first exposure time value of the visible spectral band image in the first half of each period and to the second exposure time value in the second half, the first and second exposure time values being different.

According to an aspect of the present application, the first and second exposure time values of the visible spectral band image are set according to a reference exposure time value for the visible spectral band preset by the system.

According to an aspect of the present application, the first and second exposure time values of the visible spectral band image may also be set according to the exposure time value of the fluorescence spectral band image.

According to one aspect of the application, the plurality of image sensors employ a progressive scan exposure scheme.

According to one aspect of the application, the light intensity difference between adjacent frames is moderated by one or more of: adjusting the exposure time values of the visible spectral band image in the adjacent frames; applying post-processing image enhancement, or weighted-algorithm processing, to the visible spectral band images of the adjacent frames; adjusting the hardware gain values applied to the output images in the image sensors; or increasing the image frame rate.

According to another embodiment of the application, the image sensor uses an interlaced exposure mode; each frame is divided into two field images, the odd field corresponding to the odd lines of the image sensor's color filter array and the even field to the even lines; each period comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible spectral band image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
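The four-field cycle described above can be sketched as follows. This is a minimal illustration only; the exposure values and names (`E1`, `E2`, `field_exposure`) are hypothetical, not taken from the original system.

```python
# Sketch of the four-field interlaced exposure cycle: within each
# four-field period the fields are exposed with the pattern E1, E2, E2, E1.
# E1 and E2 stand for the first and second visible-band exposure time
# values (milliseconds here; the concrete numbers are illustrative).
E1 = 16.0  # first exposure time value (e.g., long)
E2 = 4.0   # second exposure time value (e.g., short)

def field_exposure(field_index: int) -> float:
    """Return the visible-band exposure time for a given field index."""
    pattern = [E1, E2, E2, E1]
    return pattern[field_index % 4]

# First eight fields: two full periods of the cycle.
schedule = [field_exposure(i) for i in range(8)]
```

Because consecutive fields alternate between odd and even color-filter-array lines, this pattern gives every spatial line both a long and a short exposure within each period.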

According to one aspect of the application, the first exposure time value of the visible spectrum band image is greater than the second exposure time value of the visible spectrum band image.

According to one aspect of the application, the image information of adjacent odd and even fields is output separately after post-processing image enhancement.

The application also provides a multispectral imaging method, comprising: irradiating an illuminated region with a combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence; receiving the visible light and fluorescence in light reflected from the illuminated region to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information; and varying the exposure time value of the visible spectral band image periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames.

According to one embodiment of the present application, the exposure time values of the visible spectral band image include a first exposure time value and a second exposure time value, which appear in alternating periodic cycles across consecutive video frames.

According to one aspect of the application, each period is a two-frame image; the exposure time value of the fluorescence spectral band image remains constant across consecutive frames; the exposure time value of the visible spectral band image is set to the first exposure time value in the first half of each period and to the second exposure time value in the second half, the first and second exposure time values being different.

According to another embodiment of the application, the visible spectral band image is exposed in an interlaced manner; each frame is divided into two field images, the odd field corresponding to the odd lines of the image and the even field to the even lines; each period comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible spectral band image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.

The application also provides a computer-readable storage medium storing computer-executable instructions for: irradiating an illuminated region with a combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence; receiving the visible light and fluorescence in light reflected from the illuminated region to form a multispectral image comprising visible spectral band image information and fluorescence spectral band image information; and varying the exposure time value of the visible spectral band image periodically, so that visible spectral band image information of different light intensities is output in alternating cycles across consecutive video frames.

Compared with the prior art, the image-exposure-based multispectral imaging system and method of the present application produce a clear, bright visible-light background image while achieving high fluorescence contrast and low noise.

The method can be widely applied in the field of medical imaging, including medical endoscopes, fluorescence microscopes, and related areas.

Drawings

To illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.

FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application;

FIG. 2 is a schematic diagram of a multi-spectral imaging system employing a single image sensor according to one embodiment of the present application;

FIG. 3 is a schematic diagram of a light source module according to one embodiment of the present application employing a combination of multiple monochromatic light sources;

FIG. 4 is a schematic diagram of a camera module employing two image sensors according to one embodiment of the present application;

FIG. 5 is a schematic diagram of a camera module according to one embodiment of the present application employing four image sensors;

FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode for line-by-line output according to an embodiment of the present application;

FIG. 7 is a schematic view of a color filter array arrangement in a single image sensor mode according to one embodiment of the present application;

FIG. 8 is a schematic diagram of an imaging method in interlaced output mode according to an embodiment of the present application; and

FIG. 9 is a block diagram of an exemplary computer system according to an embodiment of the present application.

Detailed Description

Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to specific examples and implementations are for illustrative purposes, and are not intended to limit the scope of the application or claims.

The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any implementation described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations.

FIG. 1 illustrates an exemplary structural schematic of a multispectral imaging system according to one embodiment of the present application. In this example, the multispectral imaging system may be used for fluorescence endoscopy. As shown in fig. 1, the fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5, and a light source controller 6. The camera module 2 may include, among other things, an optical lens assembly and one or more image sensors (which will be described in more detail below). The system controller 5 may generate a corresponding control signal according to a feedback processing result of the digital image processing module 4. The light source controller 6 may perform lighting control on the light source module 1 according to a control signal of the system controller 5.

FIG. 2 illustrates an exemplary structural schematic of a multispectral imaging system employing a single image sensor according to one embodiment of the present application. As shown in fig. 2, the light source module 1 may include a visible light LED 11, a near-infrared (NIR) exciter 12, and a combining beam splitter 13. In this example, the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (e.g., a CCD or CMOS image sensor). Optionally, the optical lens assembly 9 may further include an optical filter 8. The visible light LED 11 can provide illumination in the visible band (400 nm to 760 nm). The NIR exciter 12 can generate NIR excitation light in the near-infrared band (790 nm to 820 nm), particularly around 800 nm. The visible light and NIR excitation light may be combined by the combining beam splitter 13 to illuminate the illuminated region (e.g., tissue). When tissue injected with the fluorescent agent is irradiated with the NIR excitation light from the light source module 1, fluorophores in the tissue are excited to release near-infrared fluorescence at wavelengths longer than the excitation light, for example 830 nm to 900 nm. The light reflected from the illuminated region (e.g., tissue) can thus include three spectral bands: visible light, excitation light (e.g., NIR excitation light), and fluorescence (e.g., near-infrared fluorescence). The optional optical filter 8 completely blocks the NIR excitation light from entering the image sensor, while the visible light and fluorescence pass through the optical lens assembly 9 into the single image sensor 10, which has NIR-sensitive pixels. Data from the image sensor 10 may be passed through the data acquisition and preprocessing module 3 to the digital image processing module 4 to form images in the visible and fluorescence bands.
The system controller 5 may output a signal to the light source controller 6 according to the feedback of the digital image processing module 4. The light source controller 6 may control the light source module 1 by controlling a driving current or a voltage.

FIG. 3 is an exemplary diagram of a light source module employing a combination of multiple monochromatic light sources according to one embodiment of the present application. In this example, the light source module 1 may include a combination of three monochromatic light sources as shown in fig. 3. The visible light source combines three monochromatic sources, a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm), and a blue LED 20 (400 nm to 450 nm), through the corresponding combining beam splitters 15, 16, and 17. With a combination of multiple monochromatic light sources, the camera module 2 may include one or more image sensors; for example, it may use a single image sensor as shown in fig. 2, or a dual image sensor combination as shown in fig. 4.

FIG. 4 is an exemplary diagram of a camera module employing two image sensors according to one embodiment of the present application. The camera module 2 may include a plurality of image sensors and corresponding beam-splitting prisms. In this example, the camera module 2 may employ a visible light image sensor 32 (e.g., a CCD/CMOS image sensor) and a fluorescence image sensor 33 (e.g., a CCD/CMOS image sensor). The visible light and fluorescence from the tissue are separated by the dichroic prism 31 and enter their corresponding image sensors.

FIG. 5 is an exemplary diagram of a camera module employing four image sensors according to one embodiment of the present application. The camera module 2 may include four image sensors and three corresponding beam-splitting prisms. When a four-CCD/CMOS image sensor combination is employed, as shown in fig. 5, prism beam splitting (e.g., a combination of beam-splitting prisms 34, 35, and 36) directs light into three RGB monochrome image sensors (e.g., a red image sensor 38, a green image sensor 39, and a blue image sensor 40) and one NIR fluorescence image sensor 37; each sensor receives reflected light of its corresponding wavelength band to form a multispectral image.

For simultaneous imaging of visible light and fluorescence, each output image frame is a multispectral fusion image of the two, and can be divided, by spectral wavelength band, into visible spectral band image information and fluorescence spectral band image information. In the post-fusion mode (e.g., figs. 4 and 5), where multiple image sensors capture image signals of different spectral bands, the light intensity in each image frame is determined by the intensity of the light reflected from the tissue and by the time integral of that light on the image sensor.
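The relation just stated can be expressed with a simple idealized model (an assumption for illustration only: constant reflected intensity, no noise or saturation, linear sensor response):

```python
# Idealized model of per-frame accumulated signal: for a roughly
# constant reflected intensity, the signal collected in one frame is
# the intensity integrated over the exposure time, i.e. their product.
def accumulated_signal(reflected_intensity: float, exposure_time: float) -> float:
    """Signal is proportional to intensity times exposure time
    (linear model, ignoring noise and sensor saturation)."""
    return reflected_intensity * exposure_time
```

Under this model, halving the visible-band exposure time halves the visible-band signal in the frame while leaving the fluorescence sensor's signal unchanged, which is the lever the following sections exploit.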

The contrast of the fluorescence region can be improved by reducing the visible light intensity, but this also reduces the brightness and clarity of the visible background tissue. In a multispectral fusion image of the visible and fluorescence bands, high visible light intensity highlights the spatial details of the visible image, while low intensity highlights the details of the fluorescence image. Improving fluorescence contrast by dimming the visible background therefore sacrifices visible image quality; moreover, for a given system signal-to-noise ratio, image noise is more prominent at low illumination, blurring image boundaries and accentuating noise.

With the light source's illumination intensity held constant, increasing the exposure time increases the brightness of a frame, and decreasing the exposure time correspondingly decreases it. Provided image clarity is preserved, a long exposure time yields a bright image and a short exposure time a dim one. Accordingly, within the exposure window of one frame, reducing the exposure time of the image sensor corresponding to the visible spectral band reduces the light intensity of the visible spectral band information in the multispectral image, and thereby improves the contrast and clarity of the fluorescence spectral band in the fused image.

FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode with progressive output according to an embodiment of the present application. As shown in fig. 6, in the multi-image sensor mode and progressive image output mode, the exposure adjustment mode may be adopted as follows. In all the continuous frame images of the video, the exposure time of the fluorescent image sensor is kept constant according to the frame rate of the video image. In the first frame of the system image output, the image sensor corresponding to the visible light spectrum section is subjected to long exposure time, and the image sensor corresponding to the visible light spectrum section in the second frame is subjected to short exposure time; in the third frame, the image sensor corresponding to the visible spectrum segment is exposed for a long time, while the image sensor corresponding to the visible spectrum segment in the fourth frame is exposed for a short time. Similarly, the subsequent image frames are also changed periodically. The different exposure time values, long and short, of the visible spectrum respectively, occur in each frame of the image alternately periodically cyclically. In other words, one period frame is two frames of images, the first half period frame is a first exposure time (e.g., a long exposure time) of the visible spectrum band image, and the second half period frame is a second exposure time (e.g., a short exposure time) of the visible spectrum band image. It should be understood that the exposure time of the visible spectrum segment in each periodic frame may be made longer and shorter (as in the illustrated embodiment), but the application is not limited thereto. 
Alternatively, in each period, the first half-period frame may be set to the second exposure time of the visible-band image (e.g., the short exposure time) and the second half-period frame to the first exposure time (e.g., the long exposure time). It should also be understood that "long" and "short" exposure times for the visible-band image are relative to a preset reference exposure time for the visible band, or relative to the fluorescence band. Optionally, the long exposure time may be matched to the frame rate. Although the difference in visible-band exposure time between adjacent frames causes a brightness difference between image frames, the bright and dark portions of consecutive frames are interwoven; owing to the persistence of human vision and the low sensitivity of the human eye to brightness changes above a certain light intensity, the rapid brightness variation of the visible band is not easily perceived. The exposure time difference can be tuned for the desired visual effect so that no obvious screen flicker is perceptible. According to color theory, the human eye is sensitive to brightness at very low luminance and insensitive at high luminance.
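The two-frame alternation described above can be sketched as a simple schedule generator. This is a conceptual sketch under stated assumptions; the function name, millisecond values, and the `long_first` switch (covering both orderings described above) are hypothetical.

```python
def exposure_schedule(n_frames, long_ms=20.0, short_ms=5.0, fluor_ms=33.0,
                      long_first=True):
    """Per-frame exposure times in the multi-sensor progressive mode:
    the fluorescence sensor keeps a constant exposure across all frames,
    while the visible sensor alternates long/short with a two-frame period."""
    schedule = []
    for i in range(n_frames):
        first_half = (i % 2 == 0)  # first half-period frame of each period
        visible = long_ms if first_half == long_first else short_ms
        schedule.append({"frame": i + 1,
                         "visible_ms": visible,
                         "fluorescence_ms": fluor_ms})
    return schedule
```

Setting `long_first=False` produces the alternative ordering (short exposure in the first half-period frame) without changing the period length.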

In addition, to avoid obvious screen flicker caused by an excessive light-intensity difference between adjacent frames, especially at low frame rates, the visible-band exposure times of adjacent frames can be adjusted and optimized. At the same time, the visible-band images of adjacent frames may undergo post-processing image enhancement, a weighted post-processing algorithm, or adjustment of the hardware gain applied to the outputs of the different image sensors, so that the brightness of the visible band in the dark-frame data is boosted on output. This accommodates the eye's sensitivity to light-intensity changes and achieves a better visual effect. Alternatively, in a specific implementation, the frame rate may be raised appropriately to eliminate the discomfort caused by the exposure-time-induced light-intensity variation.
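One of the compensation options above, boosting dark-frame brightness by a digital gain, can be sketched as follows. This is an assumed, minimal illustration; the function name, the 8-bit range, and the gain value are hypothetical stand-ins for the hardware or post-processing gain mentioned in the text.

```python
def compensate_dark_frame(pixels, gain, max_val=255):
    """Apply a digital gain to the short-exposure ("dark") visible frame
    and clip to the valid 8-bit range, reducing the perceived brightness
    gap between adjacent long- and short-exposure frames."""
    return [min(int(p * gain), max_val) for p in pixels]
```

In practice the gain would be chosen from the ratio of the long and short exposure times, so the compensated dark frame approaches the long frame's mean brightness.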

In the information contained in each image frame, the high-exposure visible-band frame mainly highlights the detail and sharpness of the background tissue, while the low-exposure visible-band frame highlights the detail of the fluorescence region and its contrast against the background tissue. Therefore, over a sequence of consecutive frames, persistence of vision allows the viewer to perceive both the structural detail of the background tissue under high exposure and the specific detail of the fluorescence region.

Fig. 7 is a schematic view of a color filter array arrangement in a single image sensor mode according to an embodiment of the present application.

As the number of CCD pixels at a given sensor size increases, the signal transmission path in the CCD may be unable to handle the large volume of data produced at once by progressive scanning, which slows image processing. In this case, particularly in high-pixel-count digital cameras, an interlaced scanning technique can be employed. Thus, for a single image sensor whose filter array integrates NIR pixels, the light-intensity information of the visible and fluorescence images can be adjusted separately through interlaced exposure and interlaced output. Some CCD or CMOS image sensors can only perform interlaced exposure and readout in interlaced output mode. In this mode, as in progressive-scan mode, the electronic shutter resets all photodiodes before exposure. At the end of the exposure, however, not all of the charge can be transferred out of the photodiodes simultaneously: the odd and even lines of the image must transfer charge at different times, so the odd and even lines have different exposure times; that is, the sensor exposes the odd lines of the image first and then the even lines.

When the camera system uses a single image sensor, the mosaic filter array is as shown in fig. 7. In the figure, NIR (near-infrared fluorescence) pixels share a row with blue B pixels, while red R and green G pixels share a row. The color filter array may be arranged with the odd rows as an array of red R and green G and the even rows as an array of blue B and NIR. Under the interlaced exposure principle, the exposure times of the odd and even lines can be set separately to adjust the brightness of each monochromatic component. The color visible-light image is synthesized from the RGB array in the color filter array; since the brightness of the visible light equals the sum of the brightnesses of the primaries being mixed, keeping the even-line (blue B and NIR) exposure time unchanged while independently shortening the odd-line exposure time reduces the red R and green G spectral components, and thus the total visible-band light intensity of the image. However, the blue component then carries too much weight in the mixed primaries, causing a color cast. To address this, gamma correction may optionally be performed in image post-processing, with the weight of the blue component re-determined according to the exposure-time difference between the odd and even lines and the illumination intensity, thereby restoring the true colors of the image in the visible range.
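The row arrangement of fig. 7 and the blue rebalancing it necessitates can be sketched as follows. Both function names are hypothetical, and the blue weight is a simplified stand-in for the re-determined component weight in the post-processing gamma correction; the actual correction would also account for illumination intensity.

```python
def cfa_pattern(rows, cols):
    """Mosaic filter array of Fig. 7: odd rows alternate R and G pixels,
    even rows alternate B and NIR pixels (1-based row indexing)."""
    pattern = []
    for r in range(1, rows + 1):
        pair = ("R", "G") if r % 2 == 1 else ("B", "NIR")
        pattern.append([pair[c % 2] for c in range(cols)])
    return pattern

def blue_weight(odd_exposure_ms, even_exposure_ms):
    """Simplified rebalancing weight for the blue component: scale blue
    down by the odd/even exposure ratio, so that shortening the odd
    (R and G) lines does not leave blue over-weighted in the color mix."""
    return odd_exposure_ms / even_exposure_ms
```

For example, halving the odd-line exposure relative to the even lines would call for weighting the blue component at roughly half its nominal contribution before the final color synthesis.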

Alternatively, red R or green G pixels (not shown) may share a row with the NIR pixels. In that case, the subsequent gamma correction adjusts the weight of the red R or green G component accordingly.

FIG. 8 is a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application. This example uses a single image sensor. In the interlaced-exposure, interlaced-output mode, the frame image exposure may be adjusted as follows. As shown in fig. 8, in interlaced exposure mode an image is divided into two field images: an odd field and an even field. The odd field corresponds to the odd lines of the image sensor's color filter array, and the even field to its even lines. The first field uses a long exposure, the second a short exposure, the third also a short exposure, and the fourth a long exposure; in other words, one exposure period spans four consecutive fields. The corresponding variation in frame illumination intensity is also shown in fig. 8: the visible-band light intensity is high in the first frame, low in the second, high in the third, and low in the fourth.
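The four-field exposure period described above can be sketched as a short generator. This is an illustrative sketch; the function name and millisecond values are hypothetical, and only the long/short ordering (long, short, short, long) follows the text.

```python
def field_exposures(n_fields, long_ms=20.0, short_ms=5.0):
    """Field-exposure pattern of Fig. 8: within each four-field period
    the exposures run long, short, short, long (fields taken in order)."""
    period = [long_ms, short_ms, short_ms, long_ms]
    return [period[i % 4] for i in range(n_fields)]
```

Pairing consecutive fields into frames then reproduces the stated intensity alternation: the first frame (long + short fields) sums brighter than the second (short + long taken from the period boundary), repeating every two frames.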

In image transmission, the converted digital image signals enter the digital image processing module 4, which applies post-processing image enhancement separately to the field images of the odd and even lines exposed for different times, synthesizes them into frames, and presents the result on a display (not shown). Alternatively, after post-processing enhancement, the fields may be displayed directly on the display in interlaced fashion.
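The field-synthesis step performed by the processing module can be sketched as a simple weave of the two fields into one full frame. The function name is hypothetical; this is a minimal illustration of interleaving, not the module's actual implementation.

```python
def weave_fields(odd_field, even_field):
    """Interleave an odd field (image lines 1, 3, 5, ...) with an even
    field (lines 2, 4, 6, ...) into one full frame, as a deinterlacing
    step before display might do."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame
```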

It will be appreciated that setting different exposure times for the odd and even lines of the image filter array changes the visible-band light intensity of the image frame, with the odd and even fields of the final image output for display at their different exposure times. Because the bright and dark portions of the visible band are interwoven, the characteristics of human vision make them difficult to perceive. With interlaced output, the image information is divided into two fields that are output in turn, which yields a more continuous visual effect; interlaced output can also support higher-resolution image output.

According to the present application, the persistence of human vision is combined with different video output modes: by setting different exposure times across consecutive frames, the visible-light intensity of different frames in the video image is adjusted so that high and low visible-band light intensities are output alternately and cyclically across the frames. Thus, within a sequence of consecutive frames, the system outputs both high-brightness visible-band images that highlight the visible detail of the background tissue and high-contrast fluorescence image information against a low-visible-intensity background. The sequence therefore conveys both the sharp detail of the visible-band image and the detail of the near-infrared fluorescence-band image at high contrast. This periodic-frame technique of outputting different light intensities and contrasts on alternate frames preserves the high contrast of the fluorescence image information while also outputting high-definition visible-light image information.

The scheme of the present application overcomes the drawback of existing systems in which, when the fluorescence and visible spectrum bands are imaged simultaneously, the fluorescence region has low contrast against a background image of high visible-light intensity. It likewise remedies the poor visible-image quality that existing products exhibit under a low-visible-intensity background when the two bands are imaged simultaneously. Moreover, using differential adjustment of the exposure time makes control of image brightness information more convenient, flexible, and accurate.

It should be understood that the above embodiments are only examples and not limitations, and that besides the above progressive and interlaced scanning schemes, those skilled in the art can also conceive other similar methods, which can embody both the sharp details of the visible spectrum band image and the image details of the fluorescent near infrared band image at high contrast. Such implementations should not be read as resulting in a departure from the scope of the present application.

Referring to FIG. 9, an exemplary computer system 900 is shown. Computer system 900 may include a logical processor 902, such as an execution core. Although one logical processor 902 is illustrated, in other embodiments, the computer system 900 may have multiple logical processors, e.g., multiple execution cores per processor substrate, and/or multiple processor substrates, where each processor substrate may have multiple execution cores. As shown, the various computer-readable storage media 910 may be interconnected by one or more system buses that couple the various system components to the logical processor 902. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. In an exemplary embodiment, the computer-readable storage medium 910 may include, for example, Random Access Memory (RAM)904, storage 906 (e.g., an electromechanical hard drive, a solid state hard drive, etc.), firmware 908 (e.g., flash RAM or ROM), and a removable storage device 918 (such as, for example, a CD-ROM, a floppy disk, a DVD, a flash drive, an external storage device, etc.). Those skilled in the art will appreciate that other types of computer-readable storage media can be used, such as magnetic cassettes, flash memory cards, and/or digital video disks. Computer-readable storage media 910 may provide nonvolatile and volatile storage of computer-executable instructions 922, data structures, program modules, and other data for computer system 900. A basic input/output system (BIOS)920, containing the basic routines that help to transfer information between elements within computer system 900, such as during start-up, may be stored in firmware 908. 
A number of programs may be stored on firmware 908, storage device 906, RAM 904, and/or removable storage device 918 and executed by logical processor 902; these programs may include an operating system and/or application programs. Commands and information may be received by computer system 900 through input device 916, which may include, but is not limited to, a keyboard and a pointing device. Other input devices may include a microphone, joystick, game pad, scanner, or the like. These and other input devices are often connected to the logical processor 902 through a serial port interface that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port, or Universal Serial Bus (USB). A display or other type of display device is also connected to the system bus via an interface, such as a video adapter, which may be part of, or connected to, graphics processing unit 912. In addition to the display, computers typically include other peripheral output devices, such as speakers and printers (not shown). The exemplary system of FIG. 9 can also include a host adapter, a Small Computer System Interface (SCSI) bus, and an external storage device connected to the SCSI bus. The computer system 900 may operate in a networked environment using logical connections to one or more remote computers. The remote computer may be another computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computer system 900. When used in a LAN or WAN networking environment, computer system 900 can be connected to the LAN or WAN through a network interface card (NIC) 914, which may be internal or external and is connected to the system bus.
In a networked environment, program modules depicted relative to the computer system 900, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.

In one or more exemplary embodiments, the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, the software codes may be stored in a memory, such as a memory of a mobile station, and executed by a processor, such as a desktop computer, a laptop computer, a server computer, a microprocessor of a mobile device, or the like. The memory may be implemented within the processor or external to the processor. As used herein, the term "memory" refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.

The foregoing description and drawings are provided by way of illustrative example only. Any reference to claim elements in the singular, for example using the articles "a," "an," or "the," is not to be construed as limiting the element to the singular. Any reference using a designation such as "first" or "second" does not generally limit the number or order of those elements; reference to first and second elements does not imply that only two elements may be used, nor that the first element must somehow precede the second. A set of elements may include one or more elements unless otherwise specified. Skilled artisans may implement the described structures in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.

The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.
