Method and apparatus for collecting and displaying immunofluorescence images of biological samples


Note: the present technique, "Method and apparatus for collecting and displaying immunofluorescence images of biological samples", was created on 2019-09-12 by T·J·祖姆普夫, M·帕佩, M·哈根-埃格特, C·皮普克, T·劳丹, M·法尔克特 and M·莫. Its main content comprises: The invention provides a method for collecting and displaying immunofluorescence images of biological samples. During a first operating state, the sample is continuously irradiated with excitation radiation. The relative position between the sample and an optical system, which directs the fluorescence radiation towards the image sensor, is changed according to a movement gesture of the user, and the corresponding digital image is displayed. Upon detection of the end of the movement gesture, a switch is made from the first operating state into a second operating state. In the second operating state, the sample is first irradiated with excitation radiation in order to excite the fluorescence radiation of the fluorochrome. The fluorescence radiation emitted by the sample is then collected and a corresponding, further digital image is generated. In the second operating state, the acquisition of the emitted fluorescence radiation and the irradiation of the sample are terminated after expiration of a second acquisition duration. Furthermore, the further digital image is displayed continuously after the end of the acquisition of the emitted fluorescence radiation and the end of the irradiation of the sample.

1. A method for collecting and displaying immunofluorescence images (BI1, BI2, BI3) of a biological sample (G), the method comprising in a first operating state (BZ1):

-continuously irradiating the sample (G) with excitation radiation (AS),

-changing, in dependence on a movement gesture (BG1) of the user, the relative position between the sample (G) and an Optical System (OS) which directs fluorescence radiation (FS) emitted by the sample (G) towards at least one image sensor (BS),

-acquiring fluorescence radiation (FS) with a first acquisition duration and determining a digital image (BI1, BI2) by means of a plurality of sensor pixels (P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14) of the at least one image sensor (BS), and

-displaying the digital image (BI1, BI2),

wherein in a first operating state (BZ1), the acquisition of the fluorescence radiation (FS) and the determination and display of the digital images (BI1, BI2) are repeated at time points which follow one another with a specific repetition frequency,

wherein, upon detection of the end of the movement gesture (BG1), a switch is also made from the first operating state (BZ1) to the second operating state (BZ2),

furthermore, the method comprises, in a second operating state (BZ2):

-irradiating the sample (G) with excitation radiation (AS),

-acquiring fluorescence radiation (FS) emitted by the specimen (G) with a second acquisition duration by means of the plurality of sensor pixels (P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14) and determining a further digital image (BI3),

-displaying the further digital image (BI3),

-ending the acquisition of the emitted fluorescence radiation (FS) and ending the illumination of the specimen (G) after expiration of the second acquisition duration,

-displaying said further digital image (BI3) continuously after ending the acquisition of the fluorescence radiation (FS) and ending the irradiation of the specimen (G).

2. The method of claim 1, wherein the second acquisition duration is greater than the first acquisition duration.

3. The method of claim 1, further comprising:

-ending the second operating state (BZ2) and moving to the first operating state (BZ1) upon detecting the start of a new movement gesture (BG2) of the user.

4. Method according to claim 1, wherein the digital images (BI1, BI2) in the first operating state (BZ1) and the further digital image (BI3) in the second operating state are determined such that the digital images (BI1, BI2) of the first operating state (BZ1) and the further digital image (BI3) of the second operating state (BZ2) have the same intensity at the same light intensity of the fluorescent radiation (FS).

5. Method according to claim 4, wherein the digital images (BI1, BI2) of the first operating state (BZ1) and the further digital image (BI3) of the second operating state (BZ2) are determined so as to have the same intensity at the same light intensity of the fluorescence radiation (FS), in that one or more of the following parameters are selected differently for the first operating state (BZ1) and for the second operating state (BZ2):

-an amplification factor for the sensor pixel values (SW),

-the number of sensor pixel values (SW) integrated into an image pixel value by means of pixel combinations,

-the number of sensor pixel values (SW) integrated into image pixel values by means of color filter array interpolation.

6. The method according to claim 1, wherein the digital image (BI1, BI2) in the first operating state (BZ1) is determined such that it has a first image resolution,

wherein the further digital image (BI3) in the second operating state (BZ2) is determined such that the further digital image has a second image resolution,

and selecting the first image resolution to be less than the second image resolution.

7. Method according to claim 1, wherein in the second operating state (BZ2) the emitted fluorescence radiation (FS) is acquired with a plurality of successive sub-acquisition durations within the second acquisition duration,

wherein for each sub-acquisition duration a corresponding, temporary digital image is determined,

and the further digital image (BI3) is determined based on the temporary digital images.

8. Method according to claim 7, wherein in the second operating state (BZ2) the acquisition of the emitted fluorescence radiation (FS) and the determination of the temporary digital images are interrupted upon detection of the start of a new movement gesture (BG2) of the user, and a transition is made to the first operating state (BZ1).

9. The method according to claim 1, wherein a color image sensor (BS1) is used as the image sensor (BS).

10. Method according to claim 1, wherein a grey-value image sensor (BS2) is used as image sensor (BS), which detects fluorescence radiation (FS2) in the green channel.

11. The method according to claim 10, wherein a further grey-value image sensor (BS3) is also used, which detects fluorescence radiation (FS3) in the red channel.

12. Method according to claim 1, wherein, when switching from the first operating state (BZ1) into the second operating state (BZ2), a focusing of the sample (G) by the Optical System (OS) is performed before the second operating state (BZ2) is assumed.

13. An apparatus (V) for collecting and displaying immunofluorescence images (BI1, BI2, BI3) of a biological sample (G), the apparatus having:

an excitation light source (AL) for irradiating the sample (G) with excitation radiation (AS),

-a holding device (H) for holding a sample (G),

at least one image sensor (BS) having a plurality of sensor pixels (P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14) for acquiring fluorescence radiation (FS) emitted by the specimen (G),

an Optical System (OS) for directing the fluorescent radiation (FS) from the sample (G) towards the image sensor (BS),

a positioning unit (P) configured for changing the relative position between the sample (G) and the Optical System (OS),

-at least one control unit (S) having a first interface (SC1) to a user input device (N), a second interface (SC2) to the image sensor (BS), a third interface (SC3) to the excitation light source (AL), a fourth interface (SC4) to the positioning unit (P) and a fifth interface (SC5) to the display unit (AE),

wherein the control unit (S) is designed, in a first operating state (BZ1), for

-manipulating the excitation light source (AL) such that the sample (G) is continuously illuminated with excitation radiation (AS),

-furthermore deriving a movement gesture (BG1) of the user from input signals (ES) of the user input device (N), and manipulating the positioning unit (P) such that the relative position between the specimen (G) and the Optical System (OS) is changed in accordance with the movement gesture (BG1),

-manipulating the image sensor (BS) such that the fluorescence radiation (FS) is acquired with a first acquisition duration by means of the plurality of sensor pixels (P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14), and a digital image (BI1, BI2) is also determined from the generated sensor pixel values (SW), and furthermore

-manipulating the display unit (AE) for displaying the digital image (BI1, BI2),

wherein the control unit (S) is further designed, in the first operating state (BZ1), for

-repeating the acquisition of fluorescence radiation (FS) and the determination and display of digital images (BI1, BI2) at a specific repetition frequency at successive time points,

and for switching into a second operating state (BZ2) upon detection of the end of the movement gesture (BG1),

wherein the control unit (S) is further designed, in a second operating state (BZ2), for

-manipulating the image sensor (BS) such that the fluorescence radiation (FS) is acquired with a second acquisition duration by means of the plurality of sensor pixels (P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14) and a further digital image (BI3) is determined from the generated sensor pixel values (SW),

-manipulating the display unit (AE) for displaying the further digital image (BI3),

-further manipulating the image sensor (BS) such that the acquisition of the fluorescence radiation (FS) is stopped after the expiration of the second acquisition duration, and further manipulating the excitation light source (AL) such that the irradiation of the specimen (G) is ended after the expiration of the second acquisition duration,

-and, after the end of the irradiation of the sample (G) and after the end of the acquisition of the emitted fluorescence radiation (FS), manipulating the display unit (AE) such that the further digital image (BI3) is displayed continuously.

14. The apparatus of claim 13, wherein the second acquisition duration is greater than the first acquisition duration.

15. The device according to claim 13, wherein the control unit (S) is further designed to end the second operating state (BZ2) and to switch into the first operating state (BZ1) upon detection of the start of a new movement gesture (BG2) of the user.

Technical Field

Methods and devices for acquiring so-called immunofluorescence images of biological samples by means of image sensors and displaying the immunofluorescence images on a display unit are known from the prior art.

Background

Such biological samples are for example tissue samples taken from humans or animals. In the process of so-called direct immunofluorescence, for example, the cellular components within the tissue can be made visible in the following manner: the tissue is incubated with a specific type of antibody, wherein the antibody reacts specifically with a specific antigen of the tissue. Here, a so-called fluorescent dye is bound to the antibody, so that the structures to be detected can then be visualized in the immunofluorescence image. For this purpose, the fluorescent dye is irradiated with excitation light having a specific excitation wavelength, so that fluorescent radiation having a wavelength different from the excitation wavelength is emitted from the tissue and can be guided by means of the objective lens toward the image sensor in order then to be captured by the image sensor.

In the course of so-called indirect immunofluorescence, tissue, for example monkey liver, is used as the biological sample and is incubated with the serum of the patient in a first incubation step in order to detect possible binding of antibodies in the patient serum to specific antigens of the tissue. A second incubation with a second type of antibody is then carried out; this second antibody binds to the first antibodies from the serum and is likewise labeled with a fluorescent dye. In indirect immunofluorescence too, the incubated tissue is then irradiated with excitation light of an excitation wavelength, so that the binding of the patient's antibodies to the specific antigens of the tissue is made visible by the fluorescence radiation of the fluorescent dye, which has a fluorescence wavelength different from the excitation wavelength.

The biological sample may also be given by a so-called antigen spot, to which antibodies of the patient are bound after incubation of the antigen spot with the patient's serum. After a further incubation step, a so-called secondary antibody, which is for its part labeled with a fluorescent dye, can in turn bind to the patient's antibodies.

The immunofluorescence image thus obtained may then serve as a basis for clinical evaluation and reporting.

Known are, in particular, apparatuses, such as microscopes, in which a user can manually perform an orientation of a sample with respect to the microscope or an objective of the microscope in order to make a specific region or a specific local portion of the sample visible in an obtained immunofluorescence image.

Disclosure of Invention

It is an object of the present invention to provide a method and a device in which the acquisition of immunofluorescence images of different regions or sections of a sample can be handled by the user as simply and controllably as possible, and can at the same time be carried out safely for the sample.

The object of the invention is achieved by a method according to claim 1 and an apparatus according to claim 13.

A method for acquiring (erfassen) and displaying immunofluorescence images of a biological sample is presented. In a first operating state, the method executes specific method steps. During the first operating state, the sample is continuously irradiated with excitation radiation, so that, in particular, a continuous emission of fluorescence radiation by the sample is produced. The relative position between the sample and an optical system, which directs the fluorescence radiation emitted by the sample towards at least one image sensor, is changed depending on a movement gesture of the user. In this way, in particular, the user can determine by the movement gesture which part of the sample is made visible in the immunofluorescence image by means of the optical system and the image sensor. During the first operating state, the fluorescence radiation is acquired with a first acquisition duration by means of a plurality of sensor pixels of the at least one image sensor. A digital image is then determined on the basis of the generated sensor pixel values of the sensor pixels and is displayed.

In the first operating state, the acquisition of the fluorescence radiation and the determination and display of the digital image are repeated at successive points in time with a specific repetition rate. In particular when the movement gesture is carried out continuously, different individual regions or sections of the sample are therefore captured one after the other at the corresponding individual points in time and displayed in the digital images. The user is thus able to determine, by performing a movement gesture, which part of the specimen is shown in the digital image. In particular, in the first operating state, the fluorescence radiation for the digital image of a given point in time is acquired for the first acquisition duration. The maximum repetition frequency for displaying successive digital images in the first operating state is therefore determined in particular by the first acquisition duration and is the inverse of the first acquisition duration.

Upon detection of the end of the movement gesture, a switch is made from the first operating state into a further, second operating state. In the second operating state, the sample is first irradiated with excitation radiation to excite the fluorescence radiation of the fluorochrome. The fluorescence radiation emitted by the sample is then acquired by means of the plurality of sensor pixels with a second acquisition duration. The second acquisition duration is preferably greater than the first acquisition duration. Based on the resulting sensor pixel values of the sensor pixels, a further digital image is then generated. In the second operating state, the acquisition of the emitted fluorescence radiation and the irradiation of the sample are terminated after expiration of the second acquisition duration. Furthermore, the further digital image is continuously displayed after the end of the acquisition of the fluorescence radiation and the end of the irradiation of the sample.
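The two operating states can be read as a simple control loop. The sketch below is only an illustration of the switching logic described above; the objects and methods used (microscope, ui, screen and their calls such as acquire_frame, gesture_active, move_stage) are hypothetical placeholders and not part of the disclosed apparatus.

```python
# Minimal sketch of the two operating states BZ1/BZ2 described above.
# All objects and methods are hypothetical placeholders.
import time

T_FIRST = 0.019   # example first acquisition duration in seconds
T_SECOND = 0.300  # example second acquisition duration in seconds

def run(microscope, ui, screen):
    while True:
        # First operating state BZ1: live navigation under continuous excitation.
        microscope.illuminate(True)
        while ui.gesture_active():
            microscope.move_stage(ui.gesture_delta())   # follow the movement gesture
            frame = microscope.acquire_frame(T_FIRST)   # short exposure
            screen.display(frame)                       # repeated at roughly 1/T_FIRST per second

        # Second operating state BZ2: one long exposure, then stop illumination.
        frame = microscope.acquire_frame(T_SECOND)
        screen.display(frame)
        microscope.illuminate(False)                    # protects the sample from burning
        while not ui.gesture_active():                  # the further image stays displayed
            time.sleep(0.01)
```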

The advantages which can be achieved by the invention are explained in detail below. For the practical examination of a patient, the user must observe not only one image section of the incubated sample in the immunofluorescence image, but different regions of the tissue at a correspondingly high magnification. For this purpose, the user must be able to change the relative position of the sample with respect to an optical system which directs the fluorescence radiation emitted by the tissue towards the image sensor. If the relative position between the sample and the optical system then remains unchanged, in particular at a high optical magnification, the user may wish to observe such a region with high image quality, so that during the second operating state the same image of this region of the sample is constantly shown in the form of the further digital image. The user may wish to observe the displayed further digital image for a longer time in order to view structures or staining in detail, and then decides for himself whether he wants to view another image section by entering a further movement gesture.

By continuously displaying the further digital image after the expiration of the second acquisition duration in the second operating state and, furthermore, ending the irradiation of the sample with the excitation radiation, the so-called burning of the biological sample due to the irradiation with excitation radiation is minimized. If the sample were continuously irradiated with excitation radiation during the second operating state, the sample would burn significantly earlier and would no longer be usable for microscopic examination; this is regarded as a technical problem.

For the user, the method proposed here creates the impression that not only are the continuously acquired images displayed as a real, live microscope image while he performs his movement gesture, but that continuously acquired microscope images are also displayed for him during the second operating state, since the further digital image is displayed continuously. The solution proposed here thus achieves that the user obtains the impression of a continuous or so-called "real-time" microscope on the display unit, although during the second operating state, after the expiration of the second acquisition duration and the determination of the further digital image, no further fluorescence radiation is acquired while the tissue is irradiated with excitation radiation. The user therefore does not notice at all that, during at least one temporal subsection of the second operating state, no image acquisition is performed anymore, and at the same time the biological sample is protected against burning.

Another technical problem is the following. On the one hand, while performing a movement gesture, for example by sweeping a finger over a touch-sensitive display screen (touch screen), the user expects the image displayed on the display unit to move as smoothly as possible, so that he has the feeling that the image or the positioning unit follows his movement gesture without delay, and does not gain the impression, due to a delayed display of the digital images obtained at successive points in time, that the image acquisition device selects an image section and displays the digital image only with a delay. At the same time, after the user has completed his movement gesture and thereby selected a specific image section of the specimen, the further digital image must be displayed in the second operating state with sufficiently high quality, since even the smallest image details in the immunofluorescence image may be of interest for the examination.

The emitted fluorescence radiation has only a certain optical power or a certain optical intensity per area, while the image sensor, at a given pixel resolution or given pixel area, can collect only a certain amount of light per sensor pixel within a given acquisition or exposure time. The so-called sensor noise therefore influences the image signal to a degree that depends on the selected acquisition duration and the given pixel resolution. If, for example, a relatively long acquisition duration of, say, 500 ms is selected, a sufficient amount of fluorescence radiation can be collected at each individual sensor pixel during this acquisition duration, so that the useful signal of the resulting image sufficiently exceeds the superimposed noise signals of the sensor and/or of the electronics. In this case, however, digital images can only be obtained at points in time spaced 500 ms apart, i.e. at an image rate of only 2 images per second, which is not sufficient for smoothly displaying or selecting image sections of the specimen in accordance with the movement gesture.
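This trade-off can be made concrete with a small calculation. The 500 ms value is the example from the paragraph above, the 19 ms value is taken from the worked examples later in the document, and the linear photon-count model is a simplifying assumption for illustration only.

```python
# Frame rate and relative collected light for two acquisition durations.
# Assumes the photon count per pixel grows linearly with exposure time.

for exposure_s in (0.019, 0.5):
    frame_rate = 1.0 / exposure_s             # maximum repetition frequency
    relative_signal = exposure_s / 0.019      # light collected vs. a 19 ms frame
    print(f"{exposure_s*1000:.0f} ms -> {frame_rate:.1f} images/s, "
          f"{relative_signal:.1f}x light per pixel")
# 19 ms  -> ~52.6 images/s,  1.0x light per pixel
# 500 ms ->   2.0 images/s, ~26.3x light per pixel
```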

By preferably selecting the acquisition duration in the first operating state to be shorter than the acquisition duration in the second operating state, the repetition frequency or display frequency of the digital images in the first operating state can be chosen high enough to give the user the impression of a smooth, fluid image response to his movement gesture. Furthermore, by selecting the acquisition duration in the second operating state to be longer than in the first operating state, an image signal of sufficient intensity can be obtained for the further digital image, so that the further digital image can be shown with sufficient image quality and even small details are clearly recognizable to the user. While the user changes, by means of a movement gesture, which section of the sample is displayed in the digital images of the first operating state, he orients himself only on relatively coarse structures in order to decide whether the selected section of the tissue is one that he wishes to observe, as a kind of still image, with higher image quality. In other words: during the execution of his movement gesture, the user is not interested in obtaining digital images showing all details at the successive points in time of the first operating state; it is sufficient for him that these digital images have a certain minimum quality. After finishing his movement gesture, however, he needs an image of higher quality in order to be able to recognize all details accurately. In the second operating state, the fluorescence radiation for the further digital image is therefore acquired with a second acquisition duration which is greater than the first acquisition duration of the first operating state.

While the user performs a movement gesture, digital images are displayed to him at successive points in time during the first operating state, albeit with a somewhat reduced image quality due to the choice of the first acquisition duration. Such quality differences, however, cannot really be perceived by the human eye and the human brain while an image is shown in motion. During the first operating state, the user therefore has the optical impression that the image quality changes only slightly between the first operating state and the second operating state, while at the same time, owing to the solution according to the invention, the digital images are displayed smoothly at successive points in time in accordance with the user's movement gesture, so that the user does not perceive a significant time delay between his movement gesture and the change in the relative position between the optical system and the sample.

Further advantageous embodiments of the invention are the subject matter of the dependent claims and are explained in more detail in the following description, in part with reference to the drawings.

Preferably, the second operating state is ended and the first operating state is entered when a new movement gesture of the user is detected to start.

Preferably, the digital image in the first operating state and the further digital image in the second operating state are determined such that the digital image of the first operating state and the further digital image of the second operating state have the same intensity at the same light intensity of the fluorescent radiation.

Preferably, the digital image in the first operating state and the further digital image in the second operating state are determined such that, at the same light intensity of the fluorescent radiation, the digital image in the first operating state and the further digital image in the second operating state have the same intensity, in that: one or more of the following parameters are selected differently for the first operating state and for the second operating state:

-an amplification factor for the sensor pixel values,

-a number of sensor pixel values integrated into an image pixel value by means of pixel combinations,

-the number of sensor pixel values integrated into an image pixel value by means of color filter array interpolation.

Preferably, the digital image in the first operating state is determined such that it has a first image resolution, wherein the further digital image in the second operating state is determined such that it has a second image resolution, and the first image resolution is selected to be smaller than the second image resolution.
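How the parameters listed above (amplification factor, pixel combination, color filter array interpolation) can compensate one another so that both operating states yield the same image intensity can be sketched as follows. The multiplicative model used here (intensity proportional to exposure × linear gain × number of summed sensor pixels) is an assumption for illustration, not a statement of the disclosure; the example numbers are those of the first worked example given later in the description.

```python
# Sketch: choose a linear gain so that both operating states give the same
# image-pixel intensity for the same fluorescence light intensity.
# Model assumption: intensity ~ exposure * gain * pixels_summed_per_image_pixel.

def required_gain(exposure_s, pixels_summed,
                  reference_exposure_s, reference_pixels_summed,
                  reference_gain=1.0):
    """Gain that matches the reference state's image intensity."""
    reference = reference_exposure_s * reference_pixels_summed * reference_gain
    return reference / (exposure_s * pixels_summed)

# The second operating state (300 ms, no binning, gain 1) is the reference;
# the first state uses 19 ms and sums 16 sensor pixels per image pixel.
g = required_gain(0.019, 16, reference_exposure_s=0.300, reference_pixels_summed=1)
print(g)  # ~0.99 -> practically no amplification needed, intensities already match
```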

Preferably, in the second operating state, the emitted fluorescence radiation is acquired with a plurality of successive sub-acquisition durations within the second acquisition duration, wherein a corresponding temporary digital image is determined for each sub-acquisition duration and the further digital image is determined on the basis of the temporary digital images.

In the second operating state, the acquisition of the emitted fluorescence radiation and the determination of the temporary digital images are preferably interrupted when the start of a new movement gesture of the user is detected, and the first operating state is entered.

Preferably, a color image sensor is used as the image sensor.

Preferably, a grayscale image sensor is used as the image sensor, which detects the fluorescent radiation in the green channel.

Preferably, a further gray-value image sensor is also used, which detects the fluorescent radiation in the red channel.

Preferably, when switching from the first operating state into the second operating state, a focusing of the sample by the optical system is performed before the second operating state is assumed.

Drawings

The invention is explained in more detail below on the basis of specific embodiments and with reference to the attached drawings, without limiting the general inventive concept. In the drawings:

fig. 1 shows a preferred embodiment of the apparatus according to the invention;

FIG. 2a shows a user input device;

FIG. 2b shows a display unit;

FIG. 3 illustrates a preferred embodiment for using two image sensors;

FIG. 4 illustrates a preferred embodiment of an image sensor having a plurality of sensor pixels;

FIG. 5 illustrates another preferred embodiment of an image sensor that is a color image sensor having a plurality of sensor pixels;

FIG. 6 shows preferred steps in the scope of carrying out a preferred embodiment of the method according to the invention;

FIG. 7 shows preferred steps for acquiring fluorescent radiation in a second operating state;

fig. 8 shows different image sections of a cell image during a first operating state;

fig. 9 shows a further detail of the cell image in the second operating state;

fig. 10 shows different image portions of an immunofluorescence image of tissue at different points in time during a first operating state;

fig. 11 shows a further digital image with a further section of the tissue during the second operating state.

Detailed Description

Fig. 8a shows a first image section BI1, in which so-called HEp cells (human epithelial tumor cells) are shown. This image is acquired during the first operating state by means of the device according to the invention. The user may, for example, wish to view the image section BI1 at a higher image quality. Previously, he may have observed another image section, BI2 in fig. 8b, and selected the image section shown in fig. 8a by means of a movement gesture entered via a user input device.

In accordance with the method according to the invention, the fluorescence radiation is then acquired in the second operating state in order to obtain an image, for example the image BI3 of fig. 9, which has a higher image quality than the image BI1 of fig. 8a and the image BI2 of fig. 8b.

In fig. 10a and 10b, a further example is shown, in which the user, during the first operating state, initiates a change in the relative position between the sample and the optical system from the image section BI12 towards the image section BI11 by specifying his movement gesture; after the end of his movement gesture, the image BI13 is then acquired and displayed in the second operating state, as shown in fig. 11.

The user input device may be, for example, a so-called touch screen, on which the user performs a movement gesture by moving his fingertip over the surface of the touch screen. The movement gesture is then preferably ended, or detected as ended, when the user lifts his finger off the touch screen surface. Alternatively, the movement gesture may be detected as ended when the position of the fingertip on the touch screen surface no longer changes, or no longer changes clearly, over time, i.e. when the change in position of the fingertip falls below a certain threshold value within a preset time window. The spatial threshold relates to the change in the detected position of the finger on the touch screen.
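One possible way to detect the end of a movement gesture from the touch positions, following the threshold idea just described, is sketched below. The window length, the pixel threshold and the representation of touch samples as (timestamp, x, y) tuples are assumptions made only for this illustration.

```python
# Sketch: detect the end of a movement gesture when the fingertip position
# changes by less than a spatial threshold within a preset time window.
# Touch samples are assumed to be (timestamp_s, x, y) tuples.

import math

def gesture_ended(samples, window_s=0.3, threshold_px=5.0):
    """Return True if the finger barely moved during the last `window_s` seconds."""
    if not samples:
        return True                      # no touch at all: treat as ended
    t_last = samples[-1][0]
    recent = [s for s in samples if t_last - s[0] <= window_s]
    if len(recent) < 2:
        return False                     # not enough data inside the window yet
    (_, x0, y0), (_, x1, y1) = recent[0], recent[-1]
    return math.hypot(x1 - x0, y1 - y0) < threshold_px
```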

Alternatively, the user input device may be an input means, such as a computer mouse, via which the user may input such a movement gesture.

The control unit can then derive or detect the movement gesture from the input signals collected via the user input means.

Fig. 1 shows a preferred embodiment of a device V according to the invention, which is designed to carry out the method according to the invention. The sample G, preferably a tissue sample, is provided, for example, on a slide OT. A holding device H is preferably used to hold the slide OT or the specimen G. The holding device H is preferably coupled to a positioning unit P, which is designed to change the relative position between the sample G and the optical system OS. As an alternative to the variant shown in fig. 1, the change in the relative position between the sample G or the holding device H and the optical system OS can also be brought about by having the positioning unit change the position of the optical system OS relative to the tissue G or the holder H.

The optical system OS directs the fluorescent radiation FS from the sample G towards the image sensor BS. The optical system OS is preferably formed by an objective OB and a further optical unit OE, which will be discussed in more detail below.

The image sensor BS has a plurality of sensor pixels for detecting the fluorescence radiation FS emitted by the sample G. Excitation light or excitation radiation AS is provided by means of an excitation light source AL, and the sample G is thereby illuminated. The excitation radiation AS first passes through an optical filter FI1, which selects the desired excitation radiation in a specific wavelength range. The radiation AS is then guided via the dichroic mirror SP1 towards the objective OB or the sample G.

The fluorescence radiation FS emitted by the sample G is conducted back through the objective OB to the dichroic mirror, which directs the fluorescence radiation, whose wavelength differs from that of the excitation radiation AS, towards the image sensor. The filter FI2 preferably selects the fluorescence radiation. The fluorescence radiation FS then preferably passes through a camera objective KO, which conducts the fluorescence radiation FS to the image sensor BS of the camera K.

The device V according to the invention has at least one control unit S. The control unit S in turn has a first interface SC1 to the user input means N. The control unit S receives the input signal ES of the user input means N via the interface SC1 and is able to derive or determine therefrom a movement posture of the user and to detect the end of the movement posture and the start of a new movement posture.

The control unit S can output image signals or a control signal ST5 via the interface SC5 to the display unit AE and can thus display a digital image. The user input means N and the display unit AE can also be provided as a single combined unit, for example a touch screen. Alternatively, the display unit may be a so-called computer monitor, and the user input means N may be a so-called computer mouse. If the user input means N and the display unit AE are combined, the interfaces SC1 and SC5 may also be combined into a single, preferably bidirectional, interface.

The control unit S receives the sensor pixel values SW of the image sensor BS via the interface SC2. Furthermore, the control unit S may transmit one or more control signals ST2 to the camera K via the interface SC2. Such a control signal ST2 may, for example, indicate the acquisition duration to be used for acquiring the fluorescence radiation by the image sensor BS or the camera K. Furthermore, such a control signal ST2 may also be a so-called trigger signal in order to request the acquisition of the fluorescence radiation by the image sensor BS at a specific point in time.

Preferably, the control unit S has an interface SC6 via which the control unit S manipulates the objective OB or the optical system OS by means of a control signal ST6 in order to change an optical setting, for example the focus of the optical system OS.

The control unit S can manipulate the excitation light source AL via the interface SC3 by means of the control signal ST3 in order to switch the excitation light on or off.

Furthermore, the control unit S can switch the laser light source LL on or off via the interface SC7 by means of the control signal ST7 and preferably also vary the brightness of the laser light source.

The positioning unit P is configured to change at least one lateral position of the specimen in the XY directions with respect to the optical system OS. The control unit S can actuate the positioning unit P via the interface SC4 by means of control signals ST4 such that a desired change in the relative position between the sample G or the holder H with respect to the optical system OS is brought about.

Preferably, the positioning unit P is also configured for changing the Z position to change the spacing between the specimen G and the optical system OS for focusing. For this purpose, in particular laser radiation LS can be used, which can then be coupled into the optical system OS via a laser light source LL, preferably with a dichroic mirror SP2, and can then be detected in an image sensor BS. In an alternative embodiment, the objective is designed to change the distance between the objective and the sample G in the Z direction in order to change the focus setting as a function of the control signal ST 6.

According to the embodiment shown here, the image sensor BS is a color image sensor BS1, so that the camera K is a color camera K1.

Fig. 2a shows a view of a user input device N in the form of a touch screen or touch-sensitive display unit N1. The user can, for example, perform a movement gesture BG1 on the touch-sensitive surface BO by placing his finger on the starting point STP1, guiding it along the curve of the movement gesture BG1 to the end point EP1 and then ending the movement gesture there. This end of the movement gesture can then be detected by the control unit S of fig. 1.

Fig. 2b shows a display unit AE, which is preferably identical to the user input device N1 of fig. 2 a. Alternatively, in case the user input device is a computer mouse, the display unit AE may be a so-called computer monitor or screen.

The movement gesture BG1 is shown again on the display unit, in dashed lines, to orient the reader. Owing to the change in the relative position between the sample and the optical system OS brought about by the positioning unit P of fig. 1, the specific image section BA of the immunofluorescence image or of the specimen is moved along the curve of the movement gesture BG1 to the end point EP1. If the user ends the movement at this end point EP1, a switch is then made to the second operating state, and the immunofluorescence image is there determined using the second acquisition duration.

Fig. 2a also shows another movement gesture BG2, which the user performs from a starting point STP2, which coincides with the previous end point EP1, up to another end point EP2. If the start of such a new movement gesture BG2 of the user is detected, the second operating state is then preferably ended and the first operating state is entered again.

Fig. 6 shows the steps preferably performed according to one embodiment of the method of the present invention. During the first operating state BZ1, a step S1 is carried out in which the excitation light source is switched on, so that the sample is continuously irradiated with excitation radiation during the first operating state. Preferably, in step S1, configuration parameters are also transmitted by the control unit to the camera K via the interface SC2.

In step S2, the relative position between the specimen and the optical system is then changed in accordance with the acquired movement gesture of the user. The relative position may be changed in such a way that at first only a part of the movement gesture is detected, and the relative position is then changed to a corresponding extent in accordance with that part of the movement gesture.
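Step S2 can be understood as an incremental mapping from the part of the movement gesture detected so far to a stage displacement. The sketch below is only an illustration; the `stage.move_relative` interface and the linear scale between screen pixels and micrometres in the sample plane are assumptions, not part of the disclosed apparatus.

```python
# Sketch of step S2: translate the portion of the movement gesture detected
# so far into a change of the relative position between sample and optics.
# `stage.move_relative(dx_um, dy_um)` is a hypothetical positioning-unit call.

UM_PER_PIXEL = 0.5   # assumed scale: screen pixels -> micrometres at the sample

def apply_gesture_increment(stage, prev_point, curr_point):
    """Move the stage by the fingertip displacement since the last touch sample."""
    dx_px = curr_point[0] - prev_point[0]
    dy_px = curr_point[1] - prev_point[1]
    # opposite sign so that the displayed image section follows the finger
    stage.move_relative(-dx_px * UM_PER_PIXEL, -dy_px * UM_PER_PIXEL)
```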

In step S3, the fluorescence radiation is then acquired with the first acquisition duration. This duration is selected such that a sufficiently smooth movement of the digital images displayed at successive points in time is obtained. Based on the sensor pixel values generated for the respective sensor pixels, the digital image is then determined by the control unit in step S3.

In step S4, the digital image is then displayed by means of output to a display unit.

In step S5, it can then be checked whether the user has ended his movement gesture. If this is not the case, the method remains in the first operating state BZ1. This is indicated by the branch ZW1 in fig. 6.

In the first operating state BZ1, therefore, fluorescent radiation is acquired and digital images are determined and displayed at a specific repetition rate at successive points in time.

However, if it is detected in step S5 that the movement gesture has been ended by the user, a switch is made from the first operating state BZ1 to the second operating state BZ2.

Between the first operating state BZ1 and the second operating state BZ2, the optical system is preferably focused onto the sample in an optional intermediate step SF, before the second operating state BZ2 is assumed, using the laser radiation LS of the laser light source LL of fig. 1. The laser radiation LS is preferably coupled in by means of the dichroic mirror SP2. Such an autofocus method using laser radiation LS is described in detail, for example, in the applicant's EP patent application No. 17001037.5. During the autofocus procedure, the excitation light source for emitting excitation radiation is preferably switched off, and the image determined last in the first operating state is preferably displayed continuously on the display unit during the autofocus procedure. The impression of a "real-time microscope" is thus not interrupted for the user during autofocusing.

In the second operating state BZ2, in step S6, the sample is irradiated with the excitation radiation, or the excitation light source is switched on for irradiating the sample with the excitation radiation.

In step S7, the fluorescent radiation is then acquired by means of the plurality of sensor pixels with a second acquisition duration, which is preferably greater than the first acquisition duration. Based on the generated sensor pixel values of the image sensor, a further digital image is then determined, which is then displayed in step S8. In step S9, the acquisition of the fluorescent radiation is ended and the irradiation of the sample is ended after the expiration of the second acquisition duration. In step S10, the further digital image is continuously displayed for the second operating state.

By continuously displaying the further digital image in the second operating state BZ2 after the expiration of the second acquisition duration and by ending the irradiation of the sample with the excitation radiation, the so-called burning of the sample due to the irradiation with excitation radiation is minimized. For the user, the method according to the invention also creates the impression that not only are continuously acquired images displayed as a real, live microscope image while he performs his movement gesture, but that continuously acquired microscope images are also displayed during the second operating state, since the further digital image is displayed continuously. The solution proposed here thus achieves that the user still gets the impression of a continuous microscope, although no further images are obtained by irradiating the tissue with the excitation radiation.

In step S11, it is then checked whether a new movement gesture of the user is detected. This may be, for example, the movement gesture BG2 of fig. 2a.

If this is not the case, the method returns to step S10 and the further digital image continues to be displayed in the second operating state BZ2. If, however, a new movement gesture is detected, the second operating state is ended and the first operating state BZ1 is entered. This is illustrated by the branch ZB2 in fig. 6.

This achieves the following: the user first moves a first image section into the displayed image by his first movement gesture BG1 of fig. 2a, in order then, after the end of this movement gesture, to obtain the image section displayed with higher image quality in the second operating state. However, it is also possible for the user to return to the first operating state by starting the second movement gesture BG2 and to select a new image section by means of this second movement gesture.

In a preferred embodiment of the proposed method, which is described here with reference to fig. 6, the relative position is changed in step S2, fluorescent radiation is acquired in step S3, and the digital image is displayed in step S4 in a form in which steps S2, S3 and S4 are all serially successive to one another. In an alternative embodiment, the change of the relative position can be performed continuously, while the fluorescent radiation is acquired and the digital image is displayed in parallel with the change of the relative position in time. For example, steps S3 and S4 would then temporally succeed one another, but this sequence of steps S3 and S4 is performed in parallel with step S2.

Fig. 7 shows a variant of step S7 for acquiring fluorescent radiation in the second operating state, in which temporal digital images are determined for the respective partial acquisition durations, and in which further digital images of the second operating state to be displayed are also determined on the basis of the temporal digital images.

During a second acquisition duration in the second operating state, steps S71 to AS4 are carried out for acquiring the emitted fluorescence radiation.

First, in step S71, a first temporary digital image is determined during a sub-acquisition duration of, for example, 19 ms and is displayed in a display step AS1. In a next method step S72, a further temporary digital image is then determined with a sub-acquisition duration of 19 ms, and the further digital image is determined as the average of the two temporary digital images from steps S71 and S72 and is displayed in step AS2. In an acquisition step S73, the emitted fluorescence radiation is then acquired with a sub-acquisition duration of 19 ms in order to determine a third temporary digital image, and the further digital image is then determined from the three temporary digital images by averaging the pixel values. The further digital image is then displayed in a third display step AS3. Corresponding processing takes place in the acquisition step S74 and the display step AS4, so that in this embodiment four temporary digital images are determined, and the further digital image is determined on the basis of these temporary digital images and then displayed. In this embodiment, the second acquisition duration is composed of four sub-acquisition durations.
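The averaging over the sub-acquisition durations can be written as a running mean that is displayed and can be interrupted after every sub-step. The sketch below only illustrates this scheme; the camera, screen and ui objects and their methods are hypothetical placeholders.

```python
# Sketch of the second operating state with four 19 ms sub-acquisitions
# (steps S71-S74 / AS1-AS4): a running average that is displayed after each
# sub-step and can be interrupted by a new movement gesture.

import numpy as np

def acquire_averaged_image(camera, screen, ui, sub_exposure_s=0.019, n_sub=4):
    mean = None
    for i in range(1, n_sub + 1):
        frame = camera.acquire_frame(sub_exposure_s).astype(np.float64)
        mean = frame if mean is None else mean + (frame - mean) / i  # running mean
        screen.display(mean)                 # AS1 ... AS4: update after each step
        if ui.gesture_active():              # branches ZW11 / ZW12 / ZW13
            return None                      # interrupted: back to the first state
    return mean                              # the further digital image
```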

This way of determining the further digital image on the basis of a plurality of temporary digital images is advantageous because, owing to its statistical properties, the sensor noise or the noise signal superimposed on the image signal is reduced or minimized by averaging the temporary digital images, while at the same time the actual intensity of the fluorescence radiation, i.e. the useful signal, is preserved.

The branches ZW11, ZW12, ZW13 leaving the respective steps AS1, AS2, AS3 show that the determination of the temporary digital images may be interrupted if the start of a new movement gesture of the user is detected in the second operating state. In that case, the acquisition of the emitted fluorescence radiation in the second operating state is also interrupted and the method proceeds to the first operating state BZ1. This embodiment is advantageous because the user can thus start a new movement gesture during the averaging of the temporary digital images, before the entire second acquisition duration, accumulated from the sub-acquisition durations, has expired. If the further digital image were instead acquired in the second operating state with a single, uninterrupted second acquisition duration, the user could only change the relative position or select an image section by starting a new movement gesture once the entire second acquisition duration had expired. According to the preferred solution of fig. 7, however, thanks to the averaging of the temporary digital images and the possibility of interrupting the second operating state, the user can already change the relative position or select an image section before the expiration of the second acquisition duration. The further digital image is continuously re-determined from the temporary digital images and its display on the display unit is updated after each temporary digital image; as a result, an image of better quality than the digital images of the first operating state is displayed to the user even before the expiration of the entire second acquisition duration. In particular, the user sees images whose image quality improves continuously over time, until finally the second acquisition duration expires.

According to fig. 1, a so-called color camera K1 having a color image sensor BS1 as sensor BS can be used as camera K. According to an alternative embodiment in fig. 3, a camera K2 with a grey-value image sensor BS2, which detects the fluorescence radiation FS, FS2 in the green channel, can be used as camera K. For this purpose, the so-called green channel of the fluorescence radiation FS2 can be selected by means of the optical filter OF12 and then supplied to the grey-value image sensor BS2, preferably again via a camera objective KO2.

The use of a color image sensor has the following advantage over the use of a single grey-value image sensor: not only is the fluorescence radiation of the previously introduced fluorochromes visible in the image, but the so-called autofluorescence of the tissue can also be acquired and displayed in another channel, preferably in the red channel. This may be, for example, a coloration with a brown component. By using a color image sensor, this additional image information can provide the user with additional optical information about the displayed tissue.

In a preferred embodiment according to fig. 3, the device can have, in addition to the gray-value image sensor BS2, a further gray-value image sensor BS3 which preferably detects fluorescent radiation in the red channel. For this purpose, the fluorescence radiation FS of the tissue can be divided by means of a dichroic mirror SP3 in such a way that the fluorescence radiation FS3 represents the red channel. In this case, the optical filter OF13 can select or pass the red color channel. Then, the information or color information of the two gray value image sensors BS2 and BS3 may be superimposed or integrated by the control unit S.

Fig. 4 shows a grey-value image sensor BS2 with pixels P11, P12, P13, P14. The grey-value image sensor BS2 can reduce the pixel resolution, i.e. the number of pixels per area, by so-called pixel combination ("binning") in the following manner: the pixel values of the pixels P11, P12, P13, P14 are added, whereby an area four times that of a single sensor pixel is then used for a single image pixel BP. This results in a 4-fold reduction in the resolution of the generated digital image relative to the resolution of the image sensor BS2.
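A 2 × 2 pixel combination of the kind described for P11 to P14 can be expressed compactly with NumPy. This is only an illustrative sketch of summing-based binning, not the actual firmware behaviour of any particular sensor.

```python
# Sketch: 2x2 pixel combination ("binning") by summing sensor pixel values.
# A sensor image of shape (H, W) becomes an image of shape (H//2, W//2),
# i.e. a 4-fold reduction in the number of pixels.

import numpy as np

def bin_2x2(sensor_image):
    h, w = sensor_image.shape
    h, w = h - h % 2, w - w % 2                    # drop odd edge rows/columns
    blocks = sensor_image[:h, :w].reshape(h // 2, 2, w // 2, 2)
    return blocks.sum(axis=(1, 3))                 # P11+P12+P13+P14 per image pixel

binned = bin_2x2(np.arange(2448 * 2048).reshape(2048, 2448))
print(binned.shape)  # (1024, 1224)
```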

Fig. 5 shows an example of a color image sensor BS1 with different pixels P1, ..., P9, where the letters G, B, R indicate whether a green (G), blue (B) or red (R) color filter is located in front of the corresponding pixel P1, ..., P9. For the color pixel P5, RGB color image information with three color channels can then be obtained by performing a so-called color filter array interpolation ("debayering"). Here, for the pixel P5, the red information is obtained from the pixel values of the pixels P4 and P6, the green information from the sensor pixel values of the pixels P1, P3, P5, P7 and P9, and the blue information from the sensor pixel values of the pixels P2 and P8. Three-channel color information (red, green and blue channels) can thus be obtained for each individual pixel P1, ..., P9 without reducing the resolution of the resulting image, in terms of the number of pixels or the pixel density per area, relative to the resolution of the image sensor. If, instead, the RGB information obtained from the sensor pixels P1, ..., P9 is combined into a single image pixel corresponding to the area of all the sensor pixels P1, ..., P9, this corresponds, in addition to the color filter array interpolation, to a resolution reduction, in this case a 9-fold reduction. If the sensor pixels P1, P2, P3, P4 of the sensor BS1 of fig. 5 are not only used to obtain RGB information by color filter array interpolation, but are also combined according to their area, such a color filter array interpolation causes a 4-fold resolution reduction, similar to the pixel combination of the sensor pixels P11, ..., P14 of fig. 4.
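The interpolation described for the pixel P5 can be spelled out directly. The 3 × 3 neighbourhood below uses the filter layout stated above (green at P1, P3, P5, P7, P9; red at P4, P6; blue at P2, P8); averaging the neighbouring values is an assumed, simple interpolation rule used only for illustration.

```python
# Sketch: RGB value of the centre pixel P5 from its 3x3 neighbourhood
#   P1 P2 P3
#   P4 P5 P6    (G at P1,P3,P5,P7,P9; R at P4,P6; B at P2,P8, as described above)
#   P7 P8 P9
import numpy as np

def debayer_p5(n):                 # n is a 3x3 array of sensor pixel values
    red   = (n[1, 0] + n[1, 2]) / 2.0                                  # P4, P6
    green = (n[0, 0] + n[0, 2] + n[1, 1] + n[2, 0] + n[2, 2]) / 5.0    # P1,P3,P5,P7,P9
    blue  = (n[0, 1] + n[2, 1]) / 2.0                                  # P2, P8
    return red, green, blue

print(debayer_p5(np.arange(9, dtype=float).reshape(3, 3)))
```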

In order to determine the image pixel values of the digital image from the sensor pixel values, the sensor pixel values can also be amplified by an amplifier with an amplification factor; this, however, increases not only the useful signal of the sensor pixels but also noise signals, for example the sensor noise.

In summary: depending on the choice of image sensor, which may be a grey-value sensor or a color sensor, and depending on the image repetition frequency sought, different parameters can be selected differently in order to determine a digital image from the sensor pixel values:

-an amplification factor for the sensor pixel values,

-a number of sensor pixel values integrated into an image pixel value by means of pixel combinations,

-the number of sensor pixel values integrated into an image pixel value by means of color filter array interpolation.

Preferably, the digital image in the first operating state and the further digital image in the second operating state are determined such that the digital image of the first operating state and the further digital image of the second operating state have the same intensity at the same light intensity of the fluorescent radiation.

Preferably, the digital image in the first operating state is determined such that it has a first image resolution, wherein the further digital image in the second operating state is determined such that it has a second image resolution, and the first image resolution is selected to be smaller than the second image resolution.

Examples are set forth below that allow one of ordinary skill in the art to select different parameters so as to implement one or more embodiments of the methods described herein.

In a first example, it is assumed that only a grey-value image sensor with a sensor resolution of 2448 × 2048 pixels is used. In the first operating state, 16 pixels can then be combined by pixel combination to form one image pixel. The first acquisition duration may then be 19 ms, so that an image repetition rate of about 53 images per second is achieved. Preferably, no amplification of the sensor pixel values is performed. The generated image then has a resolution of 612 × 512 pixels in the first operating state. In the second operating state, 300 ms can then be used as a single, continuous second acquisition duration, which corresponds to an image repetition rate of only about 3 images per second. Here too, preferably neither amplification of the sensor pixel values nor pixel combination is performed. The generated image then has a resolution of 2448 × 2048 pixels in the second operating state. The intensity values are then identical for the same intensity of the fluorescence radiation in the first and second operating states of this first example.
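The intensity match of this first example can be checked with the same simplified multiplicative model used earlier (an assumption: collected signal per image pixel proportional to exposure × number of summed sensor pixels × linear gain). Under that model the two states collect nearly the same signal per image pixel.

```python
# First example, consistency check under the assumed linear model.
signal_state1 = 0.019 * 16 * 1.0    # 19 ms, 16 pixels binned, no gain -> 0.304
signal_state2 = 0.300 * 1 * 1.0     # 300 ms, no binning, no gain      -> 0.300
print(signal_state1, signal_state2) # nearly identical image intensities
```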

In a second example, it is assumed that a color image sensor with a sensor resolution of 2448 × 2048 pixels is used. In the first operating state, 4 pixels can then be combined by color filter array interpolation to form one image pixel, with a 4-fold resolution reduction. The first acquisition duration may then be 19 ms, so that an image repetition rate of about 53 images per second is achieved. Preferably, the sensor pixel values are amplified by a linear factor of 300/19. The generated image then has a resolution of 1224 × 1024 pixels in the first operating state. In the second operating state, 300 ms can then be used as a single, continuous second acquisition duration, which corresponds to an image repetition rate of only about 3 images per second. Here, preferably, no amplification of the sensor pixel values is performed. Nine sensor pixel values are then combined by means of color filter array interpolation to determine the RGB information of each individual pixel, without causing a resolution reduction. The generated image then has a resolution of 2448 × 2048 pixels in the second operating state. The intensity values are then identical for the same intensity of the fluorescence radiation in the first and second operating states of this second example.

Alternatively, in the second operating state of the second example, temporary digital images, each with a sub-acquisition duration of 19 ms, may be acquired, and the further digital image obtained by averaging these temporary digital images. The sensor pixel values are then amplified by a linear factor of 300/19. Nine sensor pixel values are then combined by means of color filter array interpolation in order to determine the RGB information of an individual image pixel, without a resolution reduction. The image generated in the second operating state then has a resolution of 2448 × 2048 pixels. At the same intensity of the fluorescent radiation, the intensity values in the first and second operating states of the second example are then the same.
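
The averaging variant can be sketched in the same spirit. The snippet below is an illustration under the assumption that roughly sixteen sub-acquisitions of 19 ms cover the 300 ms acquisition duration and that the sensor responds linearly; the frame size and photon rate are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub, t_sub, gain = 16, 19e-3, 300 / 19            # sixteen 19 ms sub-acquisitions, gain 300/19

# Simulated temporary digital images: counts proportional to the sub-acquisition duration
# (arbitrary rate of 100 counts per second per pixel, small frame for illustration).
sub_frames = rng.poisson(100 * t_sub, size=(n_sub, 64, 64)).astype(float)

further_image = gain * sub_frames.mean(axis=0)      # amplify and average the temporary images
print(further_image.mean())                         # ~30, the level of a single 300 ms acquisition
```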

Fig. 8 and 9 show exemplary immunofluorescence images of HEp cells, acquired by means of a color image sensor. The fluorochrome used in this case is fluorescein isothiocyanate (FITC). The fluorescence radiation is collected by means of a CMOS sensor. The power density of the excitation radiation is 65 mW/mm². In the first operating state, the acquisition duration is approximately 5.07 ms, the gain is approximately 32.5 dB, and a color filter array interpolation over four pixels each is carried out with a resolution reduction, so that the image resolution corresponds to one quarter of the image resolution of the second operating state. In the second operating state, the acquisition duration is approximately 162.12 ms, the gain is approximately 2.4 dB, and the color filter array interpolation over nine pixels each is performed without a resolution reduction, so that the image resolution corresponds to four times the image resolution of the first operating state. In both operating states, the same image intensity is achieved at the same intensity of the fluorescence radiation.
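
The stated gain and acquisition-duration values can be cross-checked, under the assumption (not stated in the text) that the gain is specified on a 20·log10 amplitude scale and that the image intensity scales linearly with the acquisition duration.

```python
import math

t1, t2 = 5.07e-3, 162.12e-3        # acquisition durations, first vs. second operating state
g1, g2 = 32.5, 2.4                 # gains in dB, first vs. second operating state

print(20 * math.log10(t2 / t1))    # ~30.1 dB exposure ratio
print(g1 - g2)                     # 30.1 dB gain difference -> equal image intensity
```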

Fig. 10 and 11 show exemplary immunofluorescence images of a tissue section of rat liver, likewise acquired by means of a color image sensor. In the first operating state, the acquisition duration is approximately 8.93 ms, the gain is approximately 10.92 dB, and a color filter array interpolation over four pixels each is carried out with a resolution reduction, so that the image resolution corresponds to one quarter of the image resolution of the second operating state. In the second operating state, the acquisition duration is approximately 142.8 ms, the gain is approximately 35.00 dB, and the color filter array interpolation over nine pixels each is performed without a resolution reduction, so that the image resolution corresponds to four times the image resolution of the first operating state. In both operating states, the same image intensity is achieved at the same intensity of the fluorescent radiation.

The features disclosed in the above description, in the claims and in the drawings may be essential to the realization of the various embodiments both individually and in any combination, and may be combined with one another as desired unless otherwise stated in the description.

Although certain aspects have been described in connection with an apparatus, it is to be understood that these aspects also represent a description of the corresponding method, so that a block or component of the apparatus is also to be understood as a corresponding method step or as a feature of a method step. Analogously, aspects described in connection with or as a method step also represent a description of a corresponding block or detail or feature of the corresponding apparatus.

Depending on the particular implementation requirements, embodiments of the invention, in particular the control units mentioned herein, can be implemented in hardware or in software. The implementation can be carried out using a digital storage medium, for example a floppy disk, a DVD, a Blu-ray disc, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, a hard disk or another magnetic or optical memory, on which electronically readable control signals are stored that cooperate or can cooperate with programmable hardware components such that the respective method is performed.

A programmable hardware component, such as the control units mentioned here, can be formed by a processor, a central processing unit (CPU), a graphics processing unit (GPU), a computer system, an application-specific integrated circuit (ASIC), an integrated circuit (IC), a system on chip (SoC), a programmable logic device (PLD) or a field-programmable gate array (FPGA) with a microprocessor.

Thus, the digital storage medium may be machine-readable or computer-readable. Some embodiments therefore comprise a data carrier with electronically readable control signals, which can interact with a programmable computer system or programmable hardware components such that one of the methods described herein is performed. An embodiment is thus a data carrier (or digital storage medium or computer readable medium) on which a program is recorded to perform one of the methods described herein.

Generally, embodiments of the invention can be implemented as a program, firmware, computer program or a computer program product with a program code or as data, wherein the program code or data is operative to perform one of the methods when the program is run on a processor or programmable hardware component. The program code or data may also be stored on a machine-readable carrier or data carrier, for example. Program code or data can exist, inter alia, as source code, machine code, or byte code, as well as other intermediate code.

A program according to one embodiment may implement one of the methods during its execution, for example by reading a memory location or writing a datum or data into it, thereby causing, where applicable, switching processes or other processes in transistor structures, in amplifier structures or in other components operating according to other functional principles. Accordingly, by reading one or more memory locations, a program can collect, determine or measure data, values, sensor values or other information, and, by writing to one or more memory locations, it can cause, initiate or perform an action and actuate other devices, machines and components.

The above-described embodiments merely illustrate the principles of the present invention. It is to be understood that modifications and variations of the arrangements and details described herein will be apparent to other persons skilled in the art. The invention is therefore intended to be limited only by the scope of the appended claims and not by the specific details presented herein by way of the description and explanation of the embodiments.
