Spectral imaging apparatus and fluorescence observation apparatus


Abstract (Spectral imaging apparatus and fluorescence observation apparatus; designed and created by 辰田宽和 on 2019-07-22): A spectral imaging apparatus according to an embodiment of the present invention is provided with a spectral portion, an image sensor, and a control unit. The spectral portion disperses incident light according to wavelength. The image sensor is configured such that an exposure time or a gain can be set for each pixel, and detects the light of each wavelength dispersed in the spectral portion. The control unit is configured such that the exposure time or the gain of the image sensor can be set for each predetermined pixel region.

1. A spectral imaging apparatus, comprising:

a spectral portion that disperses incident light for each wavelength;

an image sensor configured to be able to set an exposure time or a gain in units of pixels, the image sensor detecting light of each wavelength dispersed in the spectral portion; and

a control unit configured to be able to set the exposure time or the gain of the image sensor in units of predetermined pixel regions.

2. Spectral imaging apparatus according to claim 1, wherein

The spectral portion is configured to disperse the incident light in one axial direction for each wavelength, and

the control unit is configured to set the exposure time or the gain of the image sensor in units of a line perpendicular to the one axial direction.

3. Spectral imaging apparatus according to claim 1, wherein

The image sensor includes a pixel section and a calculation section that calculates a pixel value from image data output from the pixel section, and

The control unit is configured to set the gain for calculating the pixel value in units of the predetermined pixel region.

4. Spectral imaging apparatus according to claim 1, wherein

The control unit includes an evaluation section that obtains an emission spectrum of the incident light based on an output of the image sensor, and a storage section that stores a plurality of reference component spectra and an autofluorescence spectrum, and

the evaluation section is configured to calculate a component ratio of the emission spectrum so that the emission spectrum is obtained as a linear sum of the plurality of reference component spectra and the autofluorescence spectrum.

5. Spectral imaging apparatus according to claim 4, wherein

The evaluation section is configured to calibrate at least one of the emission spectrum or the component spectrum based on the exposure time or the gain set for each predetermined pixel region.

6. Spectral imaging apparatus according to claim 5, wherein

The evaluation section is configured to determine from the captured spectrum whether there is a pixel whose pixel value reaches saturation, and to exclude the pixel reaching saturation from calculation of a component ratio of the captured spectrum.

7. A fluorescence observation device comprising:

a stage capable of supporting a fluorescently stained pathological specimen;

an excitation section for irradiating the pathological specimen on the stage with line illumination;

a spectral portion that disperses, for each wavelength, the fluorescence excited by the line illumination;

an image sensor configured to be able to set an exposure time or a gain in units of pixels, the image sensor detecting light of each wavelength dispersed in the spectral portion; and

a control unit configured to set the exposure time or the gain of the image sensor in units of predetermined pixel regions.

8. The fluorescence observation device of claim 7, further comprising:

a display section for displaying a fluorescence spectrum based on an output of the image sensor.

9. The fluorescence observation device of claim 8, wherein

The display section has an operation area for receiving an input of an exposure time or a gain in units of the predetermined pixel region.

10. The fluorescence observation device of claim 8, wherein

The display section has a display area for displaying a spectrum and a histogram based on the set exposure time or gain.

Technical Field

The present technology relates to a spectral imaging apparatus and a fluorescence observation apparatus used for diagnosis of pathological images, for example.

Background

Pathological image diagnosis using fluorescent staining has been proposed as a highly quantitative, multicolor method (for example, see Patent Document 1). The advantage of fluorescence methods over color staining is that they are easily multiplexed and can provide detailed diagnostic information. In fluorescence imaging other than pathological diagnosis as well, an increase in the number of colors makes it possible to examine various antigens expressed in a specimen at once.

Reference list

Patent document

Patent Document 1: Japanese Patent No. 4452850

Disclosure of Invention

Technical problem

A spectrum observation apparatus that assigns the horizontal axis of an area sensor to space and its vertical axis to wavelength can easily obtain the spectrum of one line on a sample. However, when a bright wavelength band, a very dark wavelength band, and the like are mixed in the spectrum, the dynamic range of the sensor itself is insufficient: dark portions are crushed or bright portions saturate, and sufficient data cannot be obtained. On the other hand, if a sensor having a large recording capacity is used to solve this problem, the storage capacity increases for an object such as a pathological image, in which the total number of pixels becomes huge, and new problems arise, such as reduced accessibility of the data and slow operation of the entire system.

In view of the above circumstances, an object of the present technology is to provide a spectral imaging apparatus and a fluorescence observation apparatus capable of recording in a high dynamic range while suppressing the recording capacity of a sensor.

Solution of the problem

A spectral imaging apparatus according to an embodiment of the present technology includes a spectral portion, an image sensor, and a control unit.

The spectral portion disperses incident light for each wavelength.

The image sensor is configured to be able to set an exposure time or a gain in units of pixels and detect light of each wavelength dispersed in a spectral portion.

The control unit is configured to be able to set an exposure time or a gain of the image sensor in units of predetermined pixel regions.

According to the spectral imaging apparatus described above, an optimum exposure condition can be obtained and the dynamic range of the spectrum to be recorded can be expanded.

The spectral portion may be configured to disperse incident light in one axial direction for each wavelength, and the control unit may be configured to set the exposure time or the gain of the image sensor in units of a line perpendicular to the one axial direction.

The image sensor may further include a pixel section and a calculation section that calculates a pixel value of the image data output from the pixel section. In this case, the control unit is configured to set a gain for calculating the pixel value in units of a predetermined pixel region.

The control unit may include an evaluation section that obtains an emission spectrum of the incident light based on an output of the image sensor, and a storage section that stores a plurality of reference component spectra and an autofluorescence spectrum. The evaluation section is configured to calculate a component ratio of the emission spectrum such that the emission spectrum is obtained as a linear sum of the plurality of reference component spectra and the autofluorescence spectrum.

The evaluation section is configured to calibrate at least one of the emission spectrum or the component spectrum based on the exposure time or the gain set for each predetermined pixel region.

The evaluation section may be configured to determine whether or not there is a pixel whose pixel value reaches saturation from the captured spectrum, and to exclude the pixel reaching saturation from the calculation of the component ratio of the captured spectrum.

A fluorescence observation device according to an embodiment of the present technology includes a stage, an excitation section, a spectrum section, an image sensor, and a control unit.

The stage is configured to support a fluorescently stained pathology specimen.

The excitation part irradiates the pathological specimen on the objective table with line illumination.

The spectral portion disperses, for each wavelength, the fluorescence excited by the line illumination.

The image sensor is configured to be able to set an exposure time or a gain in units of pixels and detect light of each wavelength dispersed in a spectral portion.

The control unit is configured to set an exposure time or a gain of the image sensor in units of a predetermined pixel region.

The fluorescence observation device may further include a display section for displaying a fluorescence spectrum based on an output of the image sensor.

The display section may have an operation region for receiving an input of an exposure time or a gain in units of a predetermined pixel region.

The display section may have a display area for displaying the spectrum and the histogram based on the set exposure time or gain.

Advantageous effects of the invention

As described above, according to the present technology, recording can be performed in a high dynamic range while suppressing the recording capacity of the sensor.

Note that the effects described herein are not necessarily limiting, and any effect described in the present disclosure may be provided.

Drawings

Fig. 1 is a schematic diagram showing a basic configuration of a spectral imaging apparatus according to an embodiment of the present technology.

Fig. 2 is a schematic view showing an optical system of a fluorescence observation device provided with the spectral imaging apparatus.

Fig. 3 is a schematic view of a pathological specimen to be observed.

Fig. 4 is a block diagram showing the configuration of the fluorescence observation device.

Fig. 5 is a block diagram showing the arrangement of a detection section and its periphery in the fluorescence observation device.

Fig. 6 is a schematic diagram for explaining the relationship between a pixel section and an emission spectrum.

Fig. 7 is an explanatory diagram showing the relationship between emission spectra and the dynamic range in detection regions.

Fig. 8 is a flowchart showing a processing procedure up to the component separation calculation of the emission spectrum performed in the control unit.

Fig. 9 is a flowchart showing an example of a saturation processing procedure in the embodiment.

Fig. 10 is a schematic diagram illustrating an example of the saturation processing.

Fig. 11 is a schematic view of a display section in the fluorescence observation device.

Fig. 12 is a diagram showing an example of a screen configuration of a setting region of the excitation section in the display section.

Fig. 13 is a diagram showing an example of a screen configuration of a detection setting region for the fluorescence spectrum from one line illumination in the display section.

Fig. 14 is a diagram showing an example of a screen configuration of a detection setting region for the fluorescence spectra from another line illumination in the display section.

Fig. 15 is a diagram for explaining a histogram window in the display section.

Fig. 16 is a block diagram of the fluorescence observation device for explaining the processing executed in the control unit.

Fig. 17 is a schematic block diagram showing a modification of the fluorescence observation device.

Fig. 18 is a schematic block diagram showing another modification of the fluorescence observation device.

Detailed Description

Embodiments according to the present technology will be described below with reference to the drawings.

[ outline of the apparatus ]

Fig. 1 is a schematic diagram showing a basic configuration of a spectral imaging apparatus 10 according to an embodiment of the present technology.

As shown in the same figure, the spectral imaging apparatus 10 is a line-scanning type imaging spectrometer, and includes a spectral portion 11 and a detection section 12. The spectral portion 11 has a slit 111 parallel to the X-axis direction and a wavelength dispersion element 112. The detection section 12 includes an image sensor (area sensor) 121, and the image sensor 121 includes a solid-state imaging element such as a CMOS (complementary metal oxide semiconductor) or CCD (charge coupled device) sensor.

The slit 111 extracts a spatial component in the X-axis direction of incident light (fluorescence) from a sample (not shown) on the xy plane. The wavelength dispersion element 112 disperses the incident light Lx passing through the slit 111 for each wavelength and forms an image on the image sensor 121. As the wavelength dispersion element 112, a prism or a diffraction grating is generally used to separate the wavelength bands of the incident light Lx in the Y-axis direction. The image sensor 121 thus obtains a spectral image (X, λ) of the incident light Lx wavelength-separated by the wavelength dispersion element 112. By combining this with a mechanism for scanning the sample in the Y-axis direction, a spectral image (X, Y, λ) can be obtained.
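As a hypothetical illustration of this (X, Y, λ) data structure (not part of the patent; all names below are assumptions), the (X, λ) frames captured at successive Y positions can simply be stacked into a spectral cube:

```python
# Minimal sketch (assumption): stacking the (X, lambda) frames captured at
# successive Y scan positions into a (Y, X, lambda) spectral cube.
import numpy as np

def assemble_spectral_cube(frames):
    """frames: sequence of 2-D arrays of shape (n_lambda, n_x), one per Y step."""
    cube = np.stack(frames, axis=0)   # (n_y, n_lambda, n_x)
    return np.moveaxis(cube, 1, 2)    # (n_y, n_x, n_lambda)

# Example with synthetic data: 4 Y steps, 128 wavelength channels, 256 X pixels.
frames = [np.random.rand(128, 256) for _ in range(4)]
print(assemble_spectral_cube(frames).shape)  # (4, 256, 128)
```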

The image sensor 121 is configured to be able to set an exposure time or a gain in units of pixels, as will be described later. By adjusting the exposure time or the gain according to the region that receives the light of each wavelength band, saturation of light in a bright wavelength band can be suppressed, and a spectral image having sufficient sensitivity to light in a dark wavelength band can be obtained.

Further, the image sensor 121 is configured to be able to read out only a part of the full-frame readout area. As a result, the frame rate can be increased by an amount corresponding to the reduction of the readout area. Furthermore, an arbitrary readout region may be divided into a plurality of regions, and a different gain and exposure time may be set for each region.

[ fluorescence observation apparatus ]

Fig. 2 is a schematic diagram showing an optical system of the fluorescence observation device 100 including the spectral imaging device 10 of the present embodiment.

As shown in the figure, the fluorescence observation apparatus 100 includes the spectral portion 11, the detection section 12, and a fluorescence excitation section 13. The fluorescence excitation section 13 includes an excitation light optical system 131, a filter block 132, and an objective lens 133.

The excitation light optical system 131 includes one or more light sources capable of emitting excitation light. As the light source, a light emitting diode (LED), a laser diode (LD), a mercury lamp, or the like is used. The excitation light is shaped into line illumination and irradiates the sample S on the stage 20, which is parallel to the xy plane.

The specimen S is typically a slide including the observation target Sa, such as the tissue section shown in fig. 3. However, it should be understood that the specimen S may be formed of something other than such a slide. The specimen S (observation target Sa) is stained with a plurality of fluorescent pigments excited by irradiation with excitation light.

The filter block 132 includes a dichroic mirror, a band pass filter, and the like. The dichroic mirror reflects the excitation light from the excitation light optical system 131 toward the objective lens 133, and transmits the fluorescence from the sample S, which has passed through the objective lens 133, toward the spectral portion 11. The band pass filter has a band-pass characteristic that cuts off the excitation light wavelength band from the light traveling from the sample S toward the spectral portion 11.

Fig. 4 is a block diagram showing the configuration of the fluorescence observation device 100. The fluorescence observation device 100 includes a device main body 1, a control unit 2, and a display section 3.

The apparatus main body 1 includes a stage 20, an excitation light source (excitation portion) 101, a spectral imaging portion 102, an observation optical system 103, a scanning mechanism 104, a focusing mechanism 105, a non-fluorescence observation portion 106, and the like.

The excitation light source 101 corresponds to the excitation light optical system 131, and the spectral imaging section 102 corresponds to the spectral portion 11 and the detection section 12. The observation optical system 103 corresponds to the filter block 132 and the objective lens 133.

The scanning mechanism 104 is typically an XY moving mechanism that translates the stage 20 in at least the two directions of the X axis and the Y axis. In this case, for example, as shown in fig. 3, the image capturing region Rs is divided into a plurality of regions in the X-axis direction, and the following operation is repeated: the sample S is scanned in the Y-axis direction, then moved in the X-axis direction, and scanned again in the Y-axis direction. As a result, a spectral image of a large area can be obtained; for example, in the case of a pathological slide, a WSI (whole slide imaging) image can be obtained.

The focusing mechanism 105 moves the stage 20 or the objective lens 133 to the best focus position in a direction perpendicular to the X-axis and the Y-axis. The non-fluorescence observation unit 106 is used for dark field observation, bright field observation, and the like of the sample S, but may be omitted as necessary.

The fluorescence observation apparatus 100 may be connected to the control section 80 for controlling a fluorescence excitation section (control of a laser diode or a shutter), an XY stage as a scanning mechanism, a spectral imaging section (camera), a focusing mechanism (a detection section and a Z stage), a non-fluorescence observation section (camera), and the like.

[ image sensor ]

Fig. 5 is a block diagram showing the arrangement of the detection unit 12 and its periphery.

As shown in the same figure, the detection section 12 includes an image sensor 121 and a signal processing circuit 122. The image sensor 121 includes a pixel section 30 and a calculation section 31.

The pixel section 30 outputs, by photoelectric conversion in each pixel, charge information corresponding to the exposure time; the pixel array is, for example, a Bayer array composed of RGB pixels. The pixel section 30 is set to different exposure times in units of pixel regions (for example, in units of rows (lines)) by control (shutter control) of the control unit 2. From a row subjected to long-time exposure, high-sensitivity pixel information 311 corresponding to the charge accumulated by the long-time exposure is output. From a row subjected to short-time exposure, low-sensitivity pixel information 312 corresponding to the charge accumulated by the short-time exposure is output.

The calculation section 31 calculates a pixel value from the image data output from the pixel section 30. In the present embodiment, the calculation section 31 receives the high-sensitivity pixel information 311 and the low-sensitivity pixel information 312 output from the pixel section 30, and has an image information synthesis section 313 for generating one piece of image information based on the input information. The output of the image information synthesis section 313 is input to the signal processing circuit 122. The signal processing circuit 122 performs signal processing such as white balance (WB) adjustment and γ correction to generate an output image. The output image is supplied to the control unit 2, and is stored in the storage section 21 described later or output to the display section 3.

The image sensor 121 obtains fluorescence spectrum data (x, λ) using the pixel array in the Y-axis direction (vertical direction) of the pixel section 30 as wavelength channels. The obtained spectral data (x, λ) are recorded in the control unit 2 (storage section 21), either bound to the excitation wavelength by which they were excited or not.

The exposure time of the pixel section 30 is set for each predetermined pixel region by the control unit 2. In the present embodiment, since the wavelength dispersion element 112 in the spectral portion 11 separates the incident light Lx by wavelength in the Y-axis direction (see fig. 1), light of different wavelengths (the emission spectrum) reaches different positions of the pixel section 30 of the image sensor 121 along the Y-axis direction. Therefore, in the present embodiment, as described above, the exposure time of the pixel section 30 is set, by the control (shutter control) of the control unit 2, in units of lines parallel to the X-axis direction, that is, perpendicular to the Y-axis direction.

The control unit 2 is also configured to be able to individually set, in units of pixel regions, the gains for sensitivity compensation by which the high-sensitivity pixel information 311 and the low-sensitivity pixel information 312 are multiplied in the image information synthesis section 313 of the calculation section 31. Therefore, the sensitivity of the low-sensitivity pixel information 312 can be improved while suppressing saturation of the high-sensitivity pixel information 311.
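The following is a minimal sketch of this kind of synthesis, assuming that the compensation simply normalizes each row group per unit exposure; it is not the actual implementation of the synthesis section 313, and all names are hypothetical.

```python
# Minimal sketch (assumption): merging long-exposure (high-sensitivity) rows and
# short-exposure (low-sensitivity) rows of one raw frame onto a common intensity
# scale by multiplying each row group by its sensitivity-compensation gain.
import numpy as np

def synthesize(raw, long_rows, t_long, t_short, g_long=1.0, g_short=1.0):
    """raw: (rows, cols) frame; long_rows: boolean mask marking long-exposure rows."""
    out = raw.astype(np.float64)
    out[long_rows] *= g_long / t_long     # normalize long-exposure rows per unit time
    out[~long_rows] *= g_short / t_short  # normalize short-exposure rows per unit time
    return out
```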

The set values of the exposure time and the gain are not particularly limited, and may be arbitrary values or values based on the emission spectrum intensities of the pigments measured in advance. For example, when the exposure time and the gain of the low-sensitivity pixel information region are taken as 1, the exposure time and the gain of the high-sensitivity pixel information region are set in a range of, for example, about 1.5 to 5.0.

Further, the setting is not limited to the case where both the exposure time and the gain are set; only the exposure time may be adjustable, or only the gain may be adjustable. Alternatively, one of the exposure time and the gain may be the main setting value and the other a supplementary setting value. For example, by using the exposure time as the main setting value, image data with a good S/N can be obtained.

Fig. 6 is a schematic diagram for explaining a relationship between the pixel section 30 and the emission spectrum.

As shown in the same figure, the control unit 2 determines the detection regions from the wavelength range of the emission spectrum, the transmission wavelength range of the filter block 132 (see fig. 2), and the entire readout area of the image sensor 121 (pixel section 30). In the case of fluorescence imaging, the filter block 132 generally has a band-pass characteristic for cutting off the excitation light. Therefore, if there are a plurality of excitation wavelengths, bands through which these wavelengths are not transmitted (non-transmission band DZ), as shown in the same figure, are generated. The control unit 2 excludes such regions, which contain no signal to be detected, from the detection regions.

As shown in fig. 6, when the regions located above and below the non-transmission band DZ are taken as ROI1 and ROI2, respectively, the emission spectra (hereinafter, also referred to as fluorescence spectra) of the pigments having the corresponding peaks are detected. Fig. 7 is an explanatory diagram showing the relationship between the emission spectra and the dynamic range in the detection regions; (a) of the same figure shows data obtained before setting the exposure time and the gain (the exposure time and the gain are the same in each detection region), and (b) of the same figure shows data obtained after setting the exposure time and the gain.

As shown in fig. 7 (a), the pigment of ROI1 has a strong spectral intensity and saturates beyond the dynamic range of detection, whereas the pigment of ROI2 has a weak intensity. In the present embodiment, as shown in (b) of fig. 7, the exposure time of the (X, λ) region corresponding to ROI1 is set to be relatively short (or the gain is set to be relatively small), and conversely, the exposure time of the (X, λ) region corresponding to ROI2 is set to be relatively long (or the gain is set to be relatively large). As a result, both dark and bright pigments can be captured with proper exposure. The coordinate information of the detection regions such as ROI1 and ROI2, the gains, and the exposure times are stored in the storage section 21 of the control unit 2.
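As a hypothetical illustration (not taken from the patent; field names and values are assumptions), the per-region information kept in the storage section 21 could be recorded as follows.

```python
# Hypothetical sketch: per-detection-region settings kept in the storage section 21
# (coordinates in the wavelength direction, exposure time, and gain).
from dataclasses import dataclass

@dataclass
class DetectionRegion:
    name: str          # e.g. "ROI1"
    row_start: int     # first sensor row (wavelength direction)
    row_stop: int      # last sensor row (exclusive)
    exposure_ms: float
    gain: float

regions = [
    DetectionRegion("ROI1", 0, 120, exposure_ms=2.0, gain=1.0),    # bright pigment: short exposure, low gain
    DetectionRegion("ROI2", 160, 320, exposure_ms=10.0, gain=2.0), # dark pigment: long exposure, high gain
]
```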

[ control unit ]

The fluorescence spectrum obtained by the detection section 12 (spectral imaging section 102) including the image sensor 121 is output to the control unit 2. The captured data of the plurality of fluorescence spectra can be quantitatively evaluated by component analysis (color separation) based on the spectrum of the individual pigment or the like. The control unit 2, as shown in fig. 4, includes a storage section 21 and an evaluation section 22.

The control unit 2 may be realized by hardware elements used in a computer, such as a CPU (central processing unit), a RAM (random access memory), and a ROM (read only memory), and necessary software. A PLD (programmable logic device) such as an FPGA (field programmable gate array), a DSP (digital signal processor), an ASIC (application specific integrated circuit), or the like may be used instead of or in addition to the CPU.

The storage section 21 stores in advance a plurality of reference component spectra of the pigments that individually stain the sample S, and the autofluorescence spectrum of the sample S (hereinafter, these are also collectively referred to as standard spectra). Based on the standard spectra stored in the storage section 21, the evaluation section 22 separates the emission spectrum of the sample S obtained by the image sensor 121 into the spectra derived from the pigments and the autofluorescence spectrum, and calculates the ratio of each component. In the present embodiment, the component ratios are calculated so that the captured emission spectrum of the sample S is expressed as a linear sum of the standard spectra.
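A minimal sketch of this component separation is given below, assuming an ordinary least-squares fit (the patent does not prescribe a specific solver, and all names are hypothetical).

```python
# Minimal sketch (assumption): fit the captured emission spectrum as a linear sum of
# the standard spectra (reference component spectra + autofluorescence spectrum) and
# return the fitted coefficients as the component ratios.
import numpy as np

def unmix(captured, standards):
    """captured: (n_channels,) spectrum; standards: (n_channels, n_components) matrix."""
    ratios, *_ = np.linalg.lstsq(standards, captured, rcond=None)
    return ratios
```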

On the other hand, the emission spectrum of the sample S obtained by the image sensor 121 is modulated with respect to the original spectrum because the exposure time and the gain are set individually for each detection region. Therefore, if the data obtained by the image sensor 121 are used as they are, the color separation calculation of the component spectra may not be performed accurately.

Therefore, the evaluation section 22 is configured to calibrate at least one of the emission spectrum or the reference component spectrum based on the exposure time and the gain set for each predetermined pixel region (detection region) of the image sensor 121.

Fig. 8 is a flowchart showing a processing procedure up to the component separation calculation of the emission spectrum performed in the control unit 2. Hereinafter, the emission spectrum of the sample S obtained by the image sensor 121 is also referred to as a captured spectrum.

As shown in the same figure, the control unit 2 sets the exposure time and the gain of the detection regions of the pixel section 30 of the image sensor 121 (step 101). These setting values are input by the user via the display section 3 described later. After recording the exposure time and gain settings in the storage section 21, the control unit 2 obtains the captured spectrum of the sample S via the image sensor 121 (steps 102 and 103).

The control unit 2 calibrates the captured spectrum based on the gain and the exposure time set for each detection region, either by demodulating the captured spectrum or by modulating the standard spectra stored in the storage section 21 (step 104). In other words, the captured spectrum and the standard spectra are converted to a common intensity axis based on the set exposure time and gain. The intensity axis is, for example, the number of charges per unit time [e-], the spectral radiance [W/(sr·m²·nm)], or the like. In the case of changing the standard spectra, the standard spectra are multiplied by the relative intensity ratios of the detection regions at the time of capture. Thereafter, if necessary, saturation processing (step 105), which will be described later, is performed, and then the component separation calculation of the captured spectrum is performed (step 106).
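A minimal sketch of step 104 follows, under the assumption that dividing by exposure time and gain puts the captured spectrum on a per-unit-exposure intensity axis; the equivalent alternative of scaling the standard spectra is also shown, and all names are hypothetical.

```python
# Minimal sketch (assumption) of the calibration in step 104: either the captured
# spectrum is demodulated by the exposure time and gain of its detection region, or
# the standard spectra are modulated by the relative intensity ratio, so that both
# end up on a common intensity axis.
import numpy as np

def demodulate_captured(captured, exposure, gain):
    """Divide each wavelength channel by its region's exposure time and gain."""
    return np.asarray(captured) / (np.asarray(exposure) * np.asarray(gain))

def modulate_standards(standards, relative_intensity):
    """Multiply the standard spectra by the relative intensity ratio of each channel."""
    return np.asarray(standards) * np.asarray(relative_intensity)[:, None]
```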

On the other hand, when spectra of a multiply fluorescent sample are captured by spectroscopy, it is important to set, in advance, parameters such as the exposure time and the gain of each pixel so that capturing can be performed without saturation. However, in WSI (whole slide imaging) and the like, it is difficult to obtain optimum exposure in all regions of the specimen, and the time loss is also large. When saturation occurs during capture, the peak of the spectrum is clipped at the AD (analog-to-digital conversion) maximum of the sensor, making it impossible to capture the correct spectrum. As a result, the deviation from the component spectra (standard spectra) prepared in advance for the color separation calculation becomes large, and a correct calculation cannot be performed.

Therefore, in the present embodiment, in addition to extending the dynamic range by ROI (region of interest) setting, the saturation processing described later is performed. This enables the color separation calculation to be performed correctly even when there is some saturation in the captured spectrum, thereby reducing the number of retries of capturing.

The saturation processing in the present embodiment specifies the pixels at which saturation occurs and excludes them from the calculation. An example of the processing procedure is shown in fig. 9.

Fig. 9 is a flowchart showing a saturation processing procedure.

As shown in the same figure, the control unit 2 performs a process of generating a saturation detection array from the acquired captured spectrum (step 201). As shown in fig. 10, the presence or absence of saturation of the captured spectrum is determined for each wavelength (channel), and a saturation detection array is generated in which an unsaturated channel is set to "1" and a saturated channel is set to "0".

The presence or absence of saturation is determined by referring to whether the pixel value of each detection region reaches the maximum luminance value. Since a pixel region that reaches the maximum luminance value is estimated to be saturated compared with the original correct spectrum, the channel of the reference spectra corresponding to that pixel region (channel) is removed from the component separation calculation.

In general, the number of wavelength channels (CH number) recorded by spectral capture is greater than the number of components of the final output. Therefore, if the number of effective channels in which saturation does not occur is larger than the number of components, the component separation calculation can be performed even if the data of the channels in which saturation occurs are removed from the calculation.

When the number of effective channels in the generated array (channels determined to be "1") is larger than the number of components of the final output, the saturation detection array is multiplied by the captured spectrum and by the reference spectra (steps 203 and 204). Otherwise, the calculation is impossible, and the processing is ended without performing the component separation calculation. As a result, since the channels in which saturation occurs are excluded from the least-squares calculation, the component ratio calculation can be performed using only the correctly measured wavelengths.
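A minimal sketch of this saturation processing is shown below; it is an assumption consistent with figs. 9 and 10 rather than the actual implementation, and all names are hypothetical.

```python
# Minimal sketch (assumption): build the saturation detection array (1 = valid,
# 0 = saturated) and perform the least-squares component ratio calculation using
# only the valid channels, provided enough of them remain.
import numpy as np

def unmix_with_saturation(captured, standards, ad_max):
    """captured: (n_channels,); standards: (n_channels, n_components); ad_max: AD maximum."""
    valid = captured < ad_max                 # saturation detection array
    if valid.sum() <= standards.shape[1]:     # not more valid channels than components
        return None                           # component separation is abandoned
    ratios, *_ = np.linalg.lstsq(standards[valid], captured[valid], rcond=None)
    return ratios
```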

According to the present embodiment described above, a spectral imaging optical system is provided for the image sensor 121, which can change the gain setting and the exposure time of an arbitrary detection region, so that the horizontal axis of the image sensor 121 is assigned to space and its vertical axis to wavelength. By reading out only the detection regions of the image sensor 121, further dividing the detection regions into two or more two-dimensional (wavelength × space) ROIs, and setting a different combination of gain and exposure time for each detection region, an optimum exposure condition can be obtained, and the dynamic range of the spectrum to be recorded can also be expanded.

For example, when a multiply fluorescently stained sample is captured, a blue fluorescent pigment can have a very high intensity compared with a red fluorescent pigment. Under such conditions, the exposure time of the blue wavelength band is shortened and its gain is set lower, while the exposure time of the red wavelength band is lengthened and its gain is set higher. As a result, recording can be performed with a shallow bit range, so that recording with a high dynamic range can be performed while suppressing the recording capacity of the sensor.

The detection regions of the image sensor 121 are set, within the sensitivity region of the sensor, according to the spectrum of the object to be measured. If there are non-transmission bands, such as those of notch filters, or areas where no light is present in the observation optical path, the recording frame rate can be increased by excluding them from the readout area.

Further, according to the present embodiment, when the color mixture ratio of each pigment is calculated individually from the obtained spectrum, color separation by spectrum fitting can be performed even if there is some saturation in the captured spectrum, by generating a saturation detection array (see fig. 10) that distinguishes saturated wavelengths from the other wavelengths.

[ display section ]

The problem with setting the capture parameters per ROI is that it is difficult for the user to understand the capture conditions. Because the data are three-dimensional in space and wavelength, it is difficult to see where saturation occurs and at which wavelength the signal is insufficient. The section that performs the ROI setting and display therefore needs to be able to comprehensively display and set the relationship between the setting parameters and the capture range, the relationship between the setting parameters and the sensor output, and the like.

Therefore, in the present embodiment, the display section 3 is configured as follows, and details of the display section 3 will be described below. Here, as an example, a configuration of the display section 3 assuming multiple fluorescence imaging will be described.

Fig. 11 is a schematic diagram illustrating the display section 3. The display section 3 is configured to be able to display the fluorescence spectrum of the sample S based on the output of the image sensor 121. The display section 3 may be a monitor integrally mounted to the control unit 2, or may be a display device connected to the control unit 2. The display section 3 includes a display element such as a liquid crystal device or an organic EL device and a touch sensor, and is configured as a UI (user interface) for inputting and displaying capture condition settings, captured images, and the like.

As shown in fig. 11, the display section 3 includes a main screen 301, a thumbnail image display screen 302, a slide information display screen 303, and a captured slide list display screen 304. The main screen 301 includes a display area 305 of control buttons (keys) for capturing, a setting area 306 of the excitation laser (line illumination), detection setting areas 307 and 308 of the spectra, a spectrum automatic setting control area 309, and the like. At least one of these areas 305 to 309 may be present, and other display areas may be included in one display area.

The fluorescence observation device 100 sequentially performs taking out a slide (sample S) from a slide rack (not shown), reading slide information, capturing a thumbnail of the slide, setting an exposure time, and the like. The slide information includes patient information, tissue site, disease, staining information, and the like, and is read from a barcode, a QR code (registered trademark), and the like attached to the slide. The thumbnail image of the specimen S and the slide information are displayed on the display screens 302 and 303, respectively. The captured slide information is displayed as a list on the screen 304.

In addition to the fluorescence image of the specimen S, the capture state of the slide currently captured is displayed on the main screen 301. The excitation laser light is displayed or set in the setting region 306, and the fluorescence spectrum derived from the excitation laser light is displayed or set in the detection setting regions 307 and 308.

Fig. 12 is a diagram showing an example of a screen configuration of the setting region 306 of the excitation laser. Here, ON/OFF (ON/OFF) of the output of each excitation line L1-L4 is selected or switched by a touch operation ON each check box 81. Further, the magnitude of the output of each light source is set by the operation section 82.

Fig. 13 shows an example of the screen configuration of the spectrum detection setting region 307 for excitation line 1, and fig. 14 shows an example of the screen configuration of the spectrum detection setting region 308 for excitation line 2. In each figure, the vertical axis represents luminance and the horizontal axis represents wavelength. The detection setting regions 307 and 308 are each configured as an operation region for accepting inputs of the exposure time and the gain in units of predetermined pixel regions of the image sensor 121.

In fig. 13 and 14, the indicators 83 indicate which excitation light sources (L1, L2, and L4) are lit, and the longer the indicator 83, the higher the power of the light source. The detection wavelength range of the fluorescence spectrum 85 is set by the operation bar 84. The display method of the fluorescence spectrum 85 is not particularly limited; for example, it is displayed as the all-pixel average spectrum (wavelength × intensity) for excitation lines 1 and 2.

As shown in fig. 13 and 14, the fluorescence spectrum 85 may be displayed as a heat map in which the frequency information of the values is represented by shading. In this case, a signal distribution that is unclear from the average value alone can also be visualized.

Note that the vertical axis of the graph for displaying the fluorescence spectrum 85 is not limited to a linear axis, and may be a logarithmic axis or a mixed axis (double-exponential axis).

The fluorescence spectrum 85 can be set according to the wavelength and power of the excitation light sources. The wavelength range of the fluorescence spectrum 85 can be arbitrarily changed by moving a cursor on the operation bar 84 with an input device such as a mouse. The fluorescence spectrum 85 is represented by the current average value, or by a waveform calculated from the last captured waveform with the setting change taken into account.

The control unit 2 sets the readout area of the image sensor 121 based on the wavelength bands (set values) input to the detection setting areas 307 and 308. Based on the wavelength bands set in the detection setting regions 307 and 308 and a conversion formula obtained in advance (a formula converting wavelengths to the corresponding pixels), the sensor coordinates are specified, and the exposure time and the gain are set. Display areas in which the exposure time and the gain can be set individually may be provided separately. After the exposure time and gain settings are input and the setting via the operation bar 84 is made, the detection setting regions 307 and 308 display the fluorescence spectrum 85.
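As a hypothetical example (the actual conversion formula is device-specific and is not given in this description), a linear dispersion would make the wavelength-to-pixel conversion look like the following sketch; the constants are assumptions.

```python
# Hypothetical sketch: converting a wavelength to a sensor row index, assuming an
# approximately linear dispersion along the wavelength (Y) axis of the sensor.
def wavelength_to_row(wavelength_nm, lambda0_nm=400.0, nm_per_row=0.5):
    """Return the nearest sensor row for a given wavelength (linear dispersion assumed)."""
    return round((wavelength_nm - lambda0_nm) / nm_per_row)

# e.g. a detection band of 500-550 nm maps to rows 200-300 under these assumptions.
band_rows = (wavelength_to_row(500.0), wavelength_to_row(550.0))
```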

Fig. 15 shows an example of a screen configuration of the spectrum automatic setting control region 309. In the spectrum automatic setting control region 309, an automatic setting key 86, a histogram window 87, and the like are arranged. The automatic setting key 86 automatically executes the pre-sampling imaging and the above-described spectral detection setting. The histogram window 87 calculates and displays histograms corresponding to the wavelength ranges of the spectra set in the detection setting areas 307 and 308. The vertical axis of the histogram is frequency and the horizontal axis is wavelength.

By referring to the histogram window 87, the occurrence of saturation and the presence or absence of signal insufficiency (insufficient intensity) when capturing under the spectrum detection conditions set in the detection setting regions 307 and 308 can be clearly confirmed. Further, the exposure time and the gain may be changed while checking the histogram.

Fig. 16 is a block diagram of the fluorescence observation device 100 for explaining the processing executed in the control unit 2.

The control unit 2 stores the parameters set in the respective setting regions 306 to 308 of the display section 3 in the storage section 21 (see fig. 4), and sets the reading region (wavelength band), exposure time, and gain for the image sensor 121 based on the parameters (S401).

The control unit 2 outputs the emission spectrum of the sample S obtained by the image sensor 121 to the display section 3 (S402), and the waveforms of the spectra are displayed in the detection setting regions 307 and 308 (see fig. 13 and 14).

In the automatic setting control mode, the control unit 2 performs optimization processing of the exposure time and the gain based on the captured data of the image sensor 121 (S403), and repeats the process of acquiring captured data with the changed parameters.

On the other hand, when the component separation calculation of the captured spectrum is performed, the above-described component separation calculation is performed based on the captured data of the image sensor 121, and the result is displayed on the display section 3 (e.g., the main screen 301) (S404).

As described above, according to the present embodiment, the spectrum and the histogram are captured and displayed in real time based on the set wavelength band, exposure time, and gain, and the spectrum and histogram for a new setting value are displayed based on the obtained spectrum. Therefore, the relationship between the setting parameters and the capture range, the relationship between the setting parameters and the sensor output, and the like can be comprehensively displayed and set.

< modification >

Next, a modification of the configuration of the fluorescence observation device 100 described above will be described.

Fig. 17 is a schematic block diagram of a fluorescence observation device 200 according to Modification 1, and fig. 18 is a schematic block diagram of a fluorescence observation device 300 according to Modification 2. The fluorescence observation devices 200 and 300 each include the device main body 1, the control unit 2, the display section 3, and a control program 81.

The control program 81 is a program for causing the fluorescence observation devices 200 and 300 to perform the same control functions as those performed by the control section 80 of the fluorescence observation device 100 described above. In the fluorescence observation device 200 shown in fig. 17, the control program 81 is provided in a state of being stored in a recording medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory, and is downloaded to and used by an electronic computer C or the like connected to the fluorescence observation device 200.

On the other hand, in the fluorescence observation device 300 shown in fig. 18, the control program 81 distributed from the outside via a network such as the Internet is downloaded to the electronic computer C and used. In this case, the fluorescence observation device 300 and a code for obtaining the control program 81 are provided as a package.

The electronic computer C that has downloaded the control program 81 obtains various data for controlling the excitation light source 101, the spectral imaging section 102, the scanning mechanism 104, the focusing mechanism 105, the non-fluorescence observation section 106, and the like; the control algorithm of the downloaded control program 81 is executed, and the control conditions of the fluorescence observation devices 200 and 300 are calculated. The electronic computer C issues commands to the fluorescence observation devices 200 and 300 based on the calculated conditions, so that the conditions of the fluorescence observation devices 200 and 300 are automatically controlled.

Although the embodiments of the present technology are described above, it is needless to say that the present technology is not limited to the above-described embodiments, and various modifications may be made.

The present technology may also have the following structure.

(1) A spectral imaging apparatus, comprising:

a spectral portion that disperses incident light for each wavelength;

an image sensor configured to be able to set an exposure time or a gain in units of pixels, the image sensor detecting light of each wavelength dispersed in a spectral portion; and

a control unit configured to be able to set an exposure time or a gain of the image sensor in units of a predetermined pixel region.

(2) The spectral imaging apparatus according to (1), wherein

The spectral portion is configured to disperse incident light in one axial direction for each wavelength, and

the control unit is configured to set an exposure time or a gain of the image sensor in units of a line perpendicular to the one axial direction.

(3) The spectral imaging apparatus according to (1) or (2), wherein

The image sensor includes a pixel section and a calculation section that calculates a pixel value based on image data output from the pixel section, and

The control unit is configured to set a gain for calculating a pixel value in units of a predetermined pixel region.

(4) The spectral imaging apparatus according to any one of (1) to (3), wherein

The control unit includes an evaluation section that obtains an emission spectrum of incident light based on an output of the image sensor, and a storage section that stores a plurality of reference component spectra and an autofluorescence spectrum, and

the evaluation section is configured to calculate a component ratio of the emission spectrum so that the emission spectrum is obtained as a linear sum of the plurality of reference component spectra and the autofluorescence spectrum.

(5) The spectral imaging apparatus according to (4), wherein

The evaluation section is configured to calibrate at least one of the emission spectrum or the component spectrum based on the exposure time or the gain set for each predetermined pixel region.

(6) The spectral imaging apparatus according to (5), wherein

The evaluation section is configured to determine from the captured spectrum whether or not there is a pixel whose pixel value reaches saturation, and to exclude the pixel reaching saturation from the calculation of the component ratio of the captured spectrum.

(7) A fluorescence observation device comprising:

a stage capable of supporting a fluorescently stained pathological specimen;

an excitation section for irradiating the pathological specimen on the stage with line illumination;

a spectral portion that disperses, for each wavelength, the fluorescence excited by the line illumination;

an image sensor configured to be able to set an exposure time or a gain in units of pixels, the image sensor detecting light of each wavelength dispersed in a spectral portion; and

a control unit configured to set an exposure time or a gain of the image sensor in units of a predetermined pixel region.

(8) The fluorescence observation device according to (7), further comprising:

a display section for displaying the fluorescence spectrum based on an output of the image sensor.

(9) The fluorescence observation device according to (8), wherein

The display section has an operation area for receiving an input of an exposure time or a gain in units of a predetermined pixel region.

(10) The fluorescence observation device according to (8) or (9), wherein

The display section has a display area for displaying the spectrum and the histogram after the exposure time or the gain is set.

List of reference marks

2 control unit

3 display section

10 spectral imaging apparatus

11 spectral portion

12 detection section

13 fluorescence excitation section

20 stage

21 storage section

22 evaluation section

30 pixel section

31 calculation section

100, 200, 300 fluorescence observation device

121 image sensor
