Microscope system, projection unit, and image projection method

Document No.: 664666  Publication date: 2021-04-27

Reading note: This technology, "Microscope system, projection unit, and image projection method", was designed and created by 米山贵, 壁谷章文, 谷洋辅, 中田竜男 and 唐泽雅善 on 2018-12-25. Its main content is as follows: A microscope system (1) is provided with: an eyepiece (104); an objective lens (102) that guides light from the sample to the eyepiece (104); an imaging lens (103) that is disposed between the eyepiece (104) and the objective lens (102) and forms an optical image of the sample on the basis of light from the sample; an image analysis unit (22); a projection image generation unit (23); and a projection device (133) that projects a projection image onto the image plane of the optical image. The image analysis unit (22) performs at least one analysis process selected from a plurality of analysis processes on digital image data of the sample, and outputs an analysis result corresponding to the at least one analysis process. The projection image generation unit (23) generates projection image data on the basis of the analysis result and the at least one analysis process. The projection image represented by the projection image data is an image in which the analysis result is represented in a display form corresponding to the at least one analysis process.

1. A microscope system is characterized by comprising:

an eyepiece;

an objective lens that guides light from a specimen to the eyepiece;

an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen;

an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process;

a projection image generation unit that generates projection image data based on the analysis result and the at least one analysis process, wherein a projection image represented by the projection image data is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process; and

a projection device that projects the projection image onto an image plane on which the optical image is formed.

2. The microscope system of claim 1,

the display form includes the color of the image,

the projection image generation unit determines a color of the projection image in accordance with the at least one analysis process.

3. The microscope system of claim 2,

the display form also includes the format of the graphics that make up the image,

the projection image generation unit determines a format of a graphic constituting the projection image in accordance with the at least one analysis process.

4. The microscope system according to claim 2 or 3,

the display form also includes the location of the image,

the projection image generation unit determines a positional relationship between the projection image and the optical image on the image plane in accordance with the at least one analysis process.

5. The microscope system of claim 4,

the projection image generation unit determines whether or not to project at least a part of the projection image outside the area of the optical image in accordance with the at least one analysis process.

6. The microscope system according to any one of claims 1 to 5,

the microscope system adjusts brightness of at least one of the optical image and the projection image in accordance with the at least one analysis process.

7. The microscope system according to any one of claims 1 to 6,

the image analysis unit performs the following operations:

classifying one or more structures represented in a digital image represented by the digital image data into one or more categories;

generating position information for specifying a position of a structure classified into at least one of the one or more categories; and

outputting the analysis result including the position information,

the projection image contains a graphic representing the positions of the structures classified into the at least one category.

8. The microscope system of claim 7,

the image analysis unit performs the following operations:

generating statistical information on the structures classified into the at least one category; and

outputting the analysis result including the position information and the statistical information,

the projection image includes a graphic indicating the positions of the structures classified into the at least one category, and the statistical information.

9. The microscope system according to claim 7 or 8,

the structure classified into the at least one category is an object to be used as a basis for a pathologist to make a judgment in pathological diagnosis.

10. The microscope system according to any one of claims 1 to 9,

further comprising a projection control unit,

the projection control unit performs the following operations:

determining whether to project the projection image onto the image plane according to the setting of the microscope system; and

controlling the projection device so that the projection device projects the projection image onto the image plane when the microscope system is set to a predetermined setting.

11. The microscope system according to any one of claims 1 to 10,

the image analysis section selects the at least one analysis process based on an input operation by a user.

12. The microscope system according to any one of claims 1 to 10,

further comprising an identification device that acquires identification information attached to the sample,

the image analysis section selects the at least one analysis process based on the identification information acquired by the identification device.

13. The microscope system according to any one of claims 1 to 12,

further comprising an input device that outputs an operation signal corresponding to an input operation by a user,

the image analysis unit changes a part of the thresholds used for the at least one analysis process in accordance with the operation signal output from the input device.

14. A microscope system is characterized by comprising:

an eyepiece;

an objective lens that guides light from a specimen to the eyepiece;

an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen;

a projection image generation unit that generates projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, wherein a projection image represented by the projection image data is an image corresponding to the selected diagnosis specification; and

a projection device that projects the projection image onto an image plane on which the optical image is formed.

15. The microscope system of claim 14,

the projection image contains guidance on a diagnostic procedure of the selected diagnosis specification.

16. The microscope system of claim 15,

the projection image generation unit determines a color of the guidance in accordance with the selected diagnosis specification.

17. The microscope system according to any one of claims 14 to 16,

the projection image includes a reference image indicating a criterion in the selected diagnosis specification.

18. The microscope system according to any one of claims 14 to 17,

the projection image generation unit determines a positional relationship between the projection image and the optical image on the image plane in accordance with the selected diagnosis specification.

19. The microscope system of claim 18,

the projection image generation unit determines whether or not to project at least a part of the projection image outside the area of the optical image in accordance with the selected diagnosis specification.

20. The microscope system according to any one of claims 14 to 19,

the projection image generation unit performs the following operations:

generating the projection image data such that the projection image includes a warning display when the setting of the microscope system does not satisfy a requirement of the selected diagnosis specification.

21. The microscope system according to any one of claims 14 to 20,

further comprising a projection control unit,

the projection control unit performs the following operations:

determining whether to project the projection image onto the image plane according to the setting of the microscope system; and

controlling the projection device so that the projection device projects the projection image onto the image plane when the microscope system is set to a predetermined setting.

22. The microscope system according to any one of claims 14 to 21,

the projection image generation unit selects the diagnosis specification based on an input operation by a user.

23. The microscope system according to any one of claims 14 to 21,

further comprising an identification device that acquires identification information attached to the sample,

the projection image generation unit selects the diagnosis specification based on the identification information acquired by the identification device.

24. A microscope system is characterized by comprising:

an eyepiece;

an objective lens that guides light from a specimen to the eyepiece;

an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen;

an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process;

a projection image generation unit that generates first projection image data based on the analysis result and the at least one analysis process, and generates second projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, the first projection image represented by the first projection image data being an image in which the analysis result is represented in a display format corresponding to the at least one analysis process, and the second projection image represented by the second projection image data being an image corresponding to the selected diagnosis specification; and

a projection device that projects the first projection image and the second projection image onto an image plane on which the optical image is formed.

25. A projection unit for a microscope provided with an objective lens, an imaging lens, and an eyepiece, the projection unit comprising:

an imaging device that acquires digital image data of a sample based on light from the sample;

an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process;

a projection image generation unit that generates projection image data based on the analysis result and the at least one analysis process, wherein a projection image represented by the projection image data is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process; and

a projection device that projects the projection image onto an image plane on which an optical image of the sample is formed by the imaging lens.

26. A projection unit for a microscope provided with an objective lens, an imaging lens, and an eyepiece, the projection unit comprising:

an imaging device that acquires digital image data of a sample based on light from the sample;

a projection image generation unit that generates projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, wherein a projection image represented by the projection image data is an image corresponding to the selected diagnosis specification; and

a projection device that projects the projection image onto an image plane on which an optical image of the sample is formed by the imaging lens.

27. An image projection method, performed by a microscope system, the image projection method being characterized in that,

the microscope system performs the following actions:

performing at least one analysis process selected from a plurality of analysis processes on the digital image data of the specimen, and outputting an analysis result corresponding to the at least one analysis process;

generating projection image data based on the analysis results and the at least one analysis process; and

projecting a projection image represented by the projection image data onto an image plane on which an optical image of the sample is formed, wherein the projection image is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process.

28. An image projection method, performed by a microscope system, the image projection method being characterized in that,

the microscope system performs the following actions:

generating projection image data based on a diagnostic criterion selected from a plurality of diagnostic criteria; and

projecting a projection image represented by the projection image data onto an image plane on which an optical image of the sample is formed, wherein the projection image is an image corresponding to the selected diagnostic criterion.

Technical Field

The disclosure of the present specification relates to a microscope system, a projection unit, and an image projection method.

Background

As one technique for reducing the burden on pathologists in pathological diagnosis, the WSI (Whole Slide Imaging) technique is attracting attention. The WSI technique is a technique for creating a WSI (Whole Slide Image), a digital image of the entire specimen on a slide. By displaying the WSI, which is a digital image, on a monitor and performing the diagnosis there, a pathologist can enjoy various benefits. Specifically, no operation of the microscope body is required during the diagnosis, the display magnification can be changed easily, and a plurality of pathologists can participate in the diagnosis at the same time. Such a WSI technique is described in patent document 1, for example.

Documents of the prior art

Patent document

Patent document 1: Japanese Kohyo Publication No. 2001-519944

Disclosure of Invention

Problems to be solved by the invention

On the other hand, a system to which the WSI technique is applied (hereinafter referred to as a WSI system) is required to have high performance. Specifically, because information on color and shading is extremely important in pathological diagnosis, a WSI system is required to have high color reproducibility and a wide dynamic range. Therefore, each device constituting a WSI system must be an expensive, high-performance device, and as a result, the users who can introduce a WSI system are limited.

In view of this, a new technique is sought that assists pathological diagnosis performed by a pathologist based on an optical image (analog image) obtained with an optical microscope, thereby reducing the burden on the pathologist.

An object of one aspect of the present invention is to provide a diagnosis support technique for supporting pathological diagnosis by a pathologist based on an optical image.

Means for solving the problems

A microscope system according to an embodiment of the present invention includes: an eyepiece; an objective lens that guides light from a specimen to the eyepiece; an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen; an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process; a projection image generation unit that generates projection image data based on the analysis result and the at least one analysis process, wherein a projection image represented by the projection image data is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process; and a projection device that projects the projection image onto an image plane on which the optical image is formed.

A microscope system according to another aspect of the present invention includes: an eyepiece; an objective lens that guides light from a specimen to the eyepiece; an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen; a projection image generation unit that generates projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, wherein a projection image represented by the projection image data is an image corresponding to the selected diagnosis specification; and a projection device that projects the projection image onto an image plane on which the optical image is formed.

A microscope system according to still another aspect of the present invention includes: an eyepiece; an objective lens that guides light from a specimen to the eyepiece; an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen; an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process; a projection image generation unit that generates first projection image data based on the analysis result and the at least one analysis process, and generates second projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, the first projection image represented by the first projection image data being an image in which the analysis result is represented in a display format corresponding to the at least one analysis process, and the second projection image represented by the second projection image data being an image corresponding to the selected diagnosis specification; and a projection device that projects the first projection image and the second projection image onto an image plane on which the optical image is formed.

A projection unit according to an aspect of the present invention is a projection unit for a microscope including an objective lens, an imaging lens, and an eyepiece, the projection unit including: an imaging device that acquires digital image data of a sample based on light from the sample; an image analysis unit that performs at least one analysis process selected from a plurality of analysis processes on the digital image data of the sample and outputs an analysis result corresponding to the at least one analysis process; a projection image generation unit that generates projection image data based on the analysis result and the at least one analysis process, wherein a projection image represented by the projection image data is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process; and a projection device that projects the projection image onto an image plane on which an optical image of the sample is formed by the imaging lens.

A projection unit according to another aspect of the present invention is a projection unit for a microscope including an objective lens, an imaging lens, and an eyepiece, the projection unit including: an imaging device that acquires digital image data of a sample based on light from the sample; a projection image generation unit that generates projection image data based on a diagnosis specification selected from a plurality of diagnosis specifications, wherein a projection image represented by the projection image data is an image corresponding to the selected diagnosis specification; and a projection device that projects the projection image onto an image plane on which an optical image of the sample is formed by the imaging lens.

An image projection method according to an aspect of the present invention is an image projection method performed by a microscope system that performs the following operations: performing at least one analysis process selected from a plurality of analysis processes on the digital image data of the specimen, and outputting an analysis result corresponding to the at least one analysis process; generating projection image data based on the analysis results and the at least one analysis process; and projecting a projection image represented by the projection image data onto an image plane on which an optical image of the sample is formed, wherein the projection image is an image in which the analysis result is represented in a display format corresponding to the at least one analysis process.

An image projection method according to another aspect of the present invention is an image projection method performed by a microscope system that performs the following operations: generating projection image data based on a diagnostic criterion selected from a plurality of diagnostic criteria; and projecting a projection image represented by the projection image data onto an image plane on which an optical image of the sample is formed, wherein the projection image is an image corresponding to the selected diagnostic criterion.

Advantageous Effects of Invention

According to the above-described aspect, it is possible to assist a pathologist in performing pathological diagnosis based on an optical image.

Drawings

Fig. 1 is a diagram showing a configuration of a microscope system 1.

Fig. 2 is a diagram showing the structure of the computer 20.

Fig. 3 is a flowchart of the image projection processing performed by the microscope system 1.

Fig. 4 is a diagram illustrating the selection screen 31.

Fig. 5 shows an example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 6 shows another example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 7 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 8 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 9 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 10 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 1.

Fig. 11 is a diagram showing the structure of a neural network.

Fig. 12 is a diagram showing the structure of the microscope system 2.

Fig. 13 is a diagram showing the configuration of a computer 60 included in the microscope system 3.

Fig. 14 is a flowchart of the image projection processing performed by the microscope system 3.

Fig. 15 shows an example of an image that can be observed through the eyepiece 104 of the microscope system 3.

Fig. 16 shows another example of an image that can be observed through the eyepiece 104 of the microscope system 3.

Fig. 17 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 3.

Fig. 18 shows still another example of an image that can be observed through the eyepiece 104 of the microscope system 3.

Fig. 19 is a diagram showing the configuration of a diagnosis assistance system including the microscope system 4 and the external browsing system 300.

Fig. 20 is a diagram showing the structure of the microscope 500.

Fig. 21 shows an example of a change in an image that can be observed through the eyepiece 104 of the microscope system including the microscope 500.

Fig. 22 is a diagram showing the structure of the microscope 600.

Detailed Description

[ first embodiment ]

Fig. 1 is a diagram showing a configuration of a microscope system 1 according to the present embodiment. Fig. 2 is a diagram showing the structure of the computer 20. The microscope system 1 is a microscope system used by a pathologist for pathological diagnosis, and includes at least an objective lens 102, an imaging lens 103, an eyepiece lens 104, an image analysis unit 22, a projection image generation unit 23, and a projection device 133.

The microscope system 1 projects a projection image onto an image plane on which an optical image of the sample is formed by the objective lens 102 and the imaging lens 103, using the projection device 133. More specifically, the image analysis unit 22 analyzes the digital image data of the sample, the projection image generation unit 23 generates projection image data based on the analysis result and the analysis process, and the projection device 133 projects a projection image, in which the analysis result is displayed in a display format corresponding to the analysis process, onto the image plane. As a result, the pathologist observes an image in which the projection image in the display format corresponding to the analysis process is superimposed on the optical image. Therefore, the microscope system 1 can provide, in a display form that is easy to visually recognize, various information for assisting pathological diagnosis to a pathologist who observes the sample by looking into the eyepiece 104.

Next, a specific example of the configuration of the microscope system 1 will be described in detail with reference to fig. 1 and 2. As shown in fig. 1, the microscope system 1 includes a microscope 100, a microscope controller 10, a computer 20, a display device 30, an input device 40, and a recognition device 50.

The microscope 100 is, for example, an upright microscope, and includes a microscope body 110, a lens barrel 120, and an intermediate lens barrel 130. In addition, the microscope 100 may be an inverted microscope.

The microscope body 110 includes: a stage 101 on which the specimen is placed; objective lenses (the objective lens 102 and the objective lens 102a) that guide light from the specimen to the eyepiece 104; an epi-illumination optical system; and a transmitted-illumination optical system. The stage 101 may be a manual stage or a motorized stage. It is desirable that a plurality of objective lenses having different magnifications be attached to the objective changer. For example, the objective lens 102 is a 4x objective lens, and the objective lens 102a is a 20x objective lens. The microscope body 110 may include at least one of the epi-illumination optical system and the transmitted-illumination optical system.

The microscope body 110 is further provided with a turret 111 for switching the microscopy technique. For example, a fluorescence cube used in the fluorescence observation method, a half mirror used in the bright field observation method, or the like is disposed on the turret 111. In addition, the microscope body 110 may include an optical element used in a specific microscopic technique that is removable with respect to the optical path. Specifically, the microscope body 110 may include, for example, a DIC prism, a polarizer, an analyzer, and the like used in the differential interference observation method.

The lens barrel 120 is a monocular lens barrel or a binocular lens barrel to which the eyepiece 104 is attached. An imaging lens 103 is provided inside the lens barrel 120. The imaging lens 103 is disposed on an optical path between the objective lens 102 and the eyepiece 104. The imaging lens 103 forms an optical image of the specimen based on light from the specimen at an image plane between the eyepiece lens 104 and the imaging lens 103. The imaging lens 103 also forms a projection image, which will be described later, on the image plane based on the light from the projection device 133. Thereby, at the image plane, the projected image is superimposed on the optical image.

The intermediate barrel 130 is disposed between the microscope body 110 and the barrel 120. The intermediate barrel 130 includes an image pickup element 131, a light deflection element 132, a projection device 133, and a light deflection element 134.

The imaging element 131 is an example of a photodetector that detects light from the sample. The image pickup element 131 is a two-dimensional image sensor, such as a CCD image sensor or a CMOS image sensor. The image pickup device 131 detects light from the sample, and generates digital image data of the sample based on the detection result.

The optical deflection element 132 is an example of a first optical deflection element that deflects light from the sample toward the image pickup element 131. The light deflecting element 132 is, for example, a beam splitter such as a half mirror. As the light deflecting element 132, a variable beam splitter capable of changing transmittance and reflectance may also be used. The light deflecting element 132 is disposed on the optical path between the eyepiece 104 and the objective lens 102. This enables the image pickup device 131 to obtain a digital image of the sample observed from the same direction as the visual observation direction.

The projection device 133 is a projection device that projects a projection image, which will be described later, onto an image plane in accordance with a command from the computer 20. The projection device 133 is, for example, a projector using a liquid crystal device, a projector using a digital mirror device, a projector using LCOS, or the like.

The light deflection element 134 is an example of a second light deflection element that deflects the light emitted from the projection device 133 toward the image plane. The light deflecting element 134 is, for example, a beam splitter such as a half mirror. As the light deflecting element 134, a variable beam splitter capable of changing transmittance and reflectance may also be used. As the light deflecting element 134, a dichroic mirror or the like may be used. The light deflection element 134 is disposed on the optical path between the image plane and the light deflection element 132. This can prevent light from the projection device 133 from entering the image pickup element 131.

The microscope controller 10 controls the microscope 100, particularly, the microscope main body 110. The microscope controller 10 is connected to the computer 20 and the microscope 100, and controls the microscope 100 in accordance with a command from the computer 20.

The display device 30 is, for example, a liquid crystal display, an organic EL (OLED) display, or a CRT (Cathode Ray Tube) display. The input device 40 outputs an operation signal corresponding to an input operation by the user to the computer 20. The input device 40 is, for example, a keyboard, but may also include a mouse, a joystick, a touch panel, or the like.

The identification device 50 is a device that acquires identification information attached to a sample. The identification information includes at least information for identifying the sample. The identification information may include information on an analysis method of the sample, a diagnosis standard, and the like. The identification device 50 is, for example, a barcode reader, an RFID reader, a QR (registered trademark) code reader, or the like.

The computer 20 controls the entire microscope system 1. The computer 20 is connected to the microscope 100, the microscope controller 10, the display device 30, the input device 40, and the recognition device 50. As shown in fig. 1, the computer 20 mainly includes a camera control unit 21, an image analysis unit 22, a projection image generation unit 23, an information acquisition unit 24, a projection control unit 25, an image recording unit 26, an image synthesis unit 27, and a display control unit 28 as components related to the control of the projection device 133.

The camera control unit 21 controls the image pickup device 131 to acquire digital image data of the sample. The digital image data acquired by the camera control unit 21 is output to the image analysis unit 22, the image recording unit 26, and the image synthesis unit 27.

The image analysis unit 22 performs at least one analysis process selected from a plurality of analysis processes on the digital image data acquired by the camera control unit 21, and outputs an analysis result corresponding to the at least one analysis process to the projection image generation unit 23. The plurality of selectable analysis processes may be processes intended for a plurality of different staining methods, such as HE staining and IHC staining. The plurality of selectable analysis processes may be processes intended for a plurality of different biomarkers, such as HER2, Ki-67, and ER/PgR. The plurality of selectable analysis processes may also be processes intended for a plurality of different combinations of staining method and biomarker.

The image analysis unit 22 may select the at least one analysis process based on an input operation by the user. More specifically, the image analysis unit 22 may select the at least one analysis process based on the operation information of the user acquired by the information acquisition unit 24. The image analysis unit 22 may also select the at least one analysis process based on the identification information acquired by the identification device 50. More specifically, the image analysis unit 22 may acquire, via the information acquisition unit 24, the identification information acquired by the identification device 50, and select the at least one analysis process based on the analysis method included in the identification information. The operation information and the identification information each contain information for selecting an analysis process; hereinafter, these pieces of information are collectively referred to as selection information.
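Purely as an illustration of the selection described above (not part of the disclosed system), the mapping from selection information to an analysis process could be sketched as follows in Python; the process names, the `SelectionInfo` structure, and the registry are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical registry of selectable analysis processes, keyed by
# (staining method, biomarker) as in the examples given in the text.
ANALYSIS_PROCESSES = {
    ("IHC", "HER2"): "her2_membrane_scoring",
    ("IHC", "ER/PgR"): "er_pgr_nuclear_scoring",
    ("IHC", "Ki-67"): "ki67_labeling_index",
    ("HE", None): "he_morphology_review",
}

@dataclass
class SelectionInfo:
    """Selection information taken from operation information or identification information."""
    staining: Optional[str] = None   # e.g. "IHC" or "HE"
    biomarker: Optional[str] = None  # e.g. "HER2", "ER/PgR", "Ki-67"

def select_analysis_process(info: SelectionInfo) -> str:
    """Return the name of the analysis process matching the selection information."""
    key = (info.staining, info.biomarker)
    if key not in ANALYSIS_PROCESSES:
        raise ValueError(f"no analysis process registered for {key}")
    return ANALYSIS_PROCESSES[key]

if __name__ == "__main__":
    # e.g. the user chose "IHC staining" and "ER/PgR" on the selection screen
    print(select_analysis_process(SelectionInfo("IHC", "ER/PgR")))
```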

The content of the analysis process performed by the image analysis unit 22 is not particularly limited. The image analysis unit 22 may, for example, classify one or more structures represented in a digital image represented by the digital image data into one or more categories, and generate an analysis result including position information for specifying a position of a structure classified into at least one of the one or more categories. More specifically, the image analysis unit 22 may classify cells present in the digital image according to their staining intensity, and generate an analysis result including category information obtained by classifying the cells and position information for specifying the outlines of the cells or the outlines of their nuclei. The image analysis unit 22 may also generate an analysis result that includes, in addition to the category information and the position information, statistical information on the structures classified into at least one category, such as the number of cells in each category and the ratio of the cells in each category to the whole. Further, it is desirable that the structures classified into at least one category are objects that serve as a basis for a pathologist to make a judgment in pathological diagnosis.
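The kind of analysis result described here (category information, position information, and statistics) could be represented, as a minimal sketch only, by data structures like the following; the class names and fields are illustrative assumptions.

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ClassifiedStructure:
    """One structure (e.g. a cell or its nucleus) found in the digital image."""
    category: str                   # e.g. "0", "1+", "2+", "3+"
    contour: List[Tuple[int, int]]  # pixel coordinates outlining the structure

@dataclass
class AnalysisResult:
    """Analysis result holding position information and per-category statistics."""
    structures: List[ClassifiedStructure] = field(default_factory=list)

    def statistics(self) -> Dict[str, Dict[str, float]]:
        """Number of structures per category and their ratio to the whole."""
        counts = Counter(s.category for s in self.structures)
        total = sum(counts.values()) or 1
        return {cat: {"count": n, "ratio": n / total} for cat, n in counts.items()}
```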

The projection image generation unit 23 generates projection image data based on the analysis result output from the image analysis unit 22 and at least one analysis process specified based on the selection information acquired from the information acquisition unit 24. The projection image represented by the projection image data is an image in which the analysis result is represented in a display form corresponding to at least one analysis process. The projection image generated by the projection image generating unit 23 is output to the projection control unit 25, the image recording unit 26, and the image synthesizing unit 27.

The display form includes at least the color of the image. Therefore, the projection image generating unit 23 determines the color of the projection image in accordance with at least one analysis process. The display format may include a format of graphics (for example, lines) constituting the image in addition to the color of the image. Therefore, the projection image generating unit 23 may determine the format of the graphics constituting the projection image in accordance with at least one analysis process. Further, the format of the graphics includes whether or not the graphics are filled, the kind of the graphics, and the like. For example, if the graphic is a line, the format of the line includes the type of line, the thickness of the line, and the like. The display format may include the position of the image in addition to the color of the image. Therefore, the projection image generating unit 23 may determine the positional relationship between the projection image and the optical image on the image plane in accordance with at least one analysis process, or may determine whether or not to project at least a part of the projection image outside the area of the optical image.

More specifically, the projection image generation unit 23 generates projection image data so that the color of the projection image is different from the color of the optical image. Since the color of the optical image differs depending on the staining method, the projection image generation unit 23 may change the color of the projection image according to the staining method targeted by the selected analysis processing. For example, in the HE staining method, the optical image is bluish purple, and therefore it is desirable to make the color of the projected image a color other than bluish purple.

Since the site stained within a cell differs depending on the biomarker, the projection image generation unit 23 may change the format of the graphics constituting the projection image according to the biomarker targeted by the selected analysis process. For example, when overexpression of the HER2 protein is analyzed, the cell membrane is stained, so the projection image can be formed using hollow figures that outline the cells. When the expression of ER/PgR is analyzed, the cell nuclei are stained, so the projection image can be formed using filled figures that cover the nuclei of the cells.

HE staining is commonly used to observe the morphology of cells. When the morphology of a cell is observed in detail, it is desirable that observation of the optical image not be hindered by the projection image. Therefore, the projection image generation unit 23 may change the position of the projection image according to the staining method targeted by the selected analysis process. For example, for the HE staining method, when the projection image includes supplementary character information, the position of the projection image may be changed so that the overlap between the character information and the optical image is reduced.
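The display-form choices described in the preceding paragraphs (color by staining method, hollow versus filled graphics by biomarker, and text placement) could be captured, purely for illustration, in a small rule table like the following; the specific colors and rules are assumptions consistent with the examples in the text, not values given by the source.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisplayForm:
    color: str          # color of the projected graphics
    filled: bool        # True: fill the nucleus; False: hollow outline of the cell
    offset_text: bool   # True: move supplementary text away from the optical image

def display_form_for(staining: str, biomarker: Optional[str]) -> DisplayForm:
    """Illustrative rules only; actual choices depend on the stain's own colors."""
    # HE-stained optical images are bluish purple, so avoid bluish-purple graphics.
    color = "green" if staining == "HE" else "yellow"
    # HER2 stains the cell membrane -> hollow outline; ER/PgR stains nuclei -> filled.
    filled = biomarker in ("ER/PgR", "Ki-67")
    # For HE morphology review, keep supplementary text clear of the optical image.
    offset_text = staining == "HE"
    return DisplayForm(color, filled, offset_text)
```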

The information acquisition unit 24 acquires information from a device external to the computer 20. Specifically, the information acquiring unit 24 acquires the operation information of the user based on the operation signal from the input device 40. Further, the information acquiring unit 24 acquires identification information from the identification device 50.

The projection control unit 25 controls the projection device 133 to project the projection image onto the image plane. The projection control unit 25 may control the projection device 133 according to the setting of the microscope system 1. Specifically, the projection control unit 25 may determine whether or not to project the projection image on the image plane based on the setting of the microscope system 1, or may control the projection device 133 so that the projection device 133 projects the projection image on the image plane when the microscope system 1 is set to a predetermined setting. That is, the microscope system 1 can change whether or not to project the projection image onto the image plane according to the setting.

The image recording unit 26 records the digital image data and the projection image data. Specifically, the image recording unit 26 records the projection image data in a region different from the digital image data, in association with the digital image data. This makes it possible to individually read out the digital image data and the projection image data associated with each other as needed. The image recording unit 26 may acquire the identification information attached to the sample via the identification device 50 and the information acquisition unit 24, and record the acquired identification information in association with the digital image data. The image recording unit 26 may also record the digital image data and the projection image data upon detecting an input of a recording instruction from the user.
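The recording format is not specified in the text; as a loose sketch, the association between the digital image data, the projection image data, and the identification information could be kept through separate files linked by a small index entry. The file names and JSON layout below are hypothetical.

```python
import json
from pathlib import Path
from typing import Optional

def record_images(record_dir: Path, record_id: str,
                  digital_image_bytes: bytes,
                  projection_image_bytes: bytes,
                  identification_info: Optional[dict] = None) -> None:
    """Store the two data sets in separate regions (files) and link them."""
    record_dir.mkdir(parents=True, exist_ok=True)
    digital_path = record_dir / f"{record_id}_digital.raw"
    projection_path = record_dir / f"{record_id}_projection.raw"
    digital_path.write_bytes(digital_image_bytes)
    projection_path.write_bytes(projection_image_bytes)
    index = {
        "digital": digital_path.name,
        "projection": projection_path.name,
        "identification": identification_info,
    }
    (record_dir / f"{record_id}_index.json").write_text(json.dumps(index))
```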

The image combining unit 27 generates image data of a combined image obtained by combining the digital image and the projection image based on the digital image data and the projection image data, and outputs the image data of the combined image to the display control unit 28.
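As one possible realization of the composition performed by the image combining unit 27, the following NumPy sketch blends a projection overlay onto a digital image of the same size; treating black projection pixels as transparent and the blend factor are assumptions, not details given in the text.

```python
import numpy as np

def composite(digital_image: np.ndarray, projection_image: np.ndarray,
              alpha: float = 0.6) -> np.ndarray:
    """Overlay the projection image on the digital image.

    Both inputs are HxWx3 uint8 arrays of identical size. Pixels that are
    black in the projection image are treated as transparent (assumption).
    """
    mask = projection_image.any(axis=2, keepdims=True)           # non-black pixels
    blended = alpha * projection_image + (1.0 - alpha) * digital_image
    return np.where(mask, blended, digital_image).astype(np.uint8)
```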

The display control unit 28 displays the composite image on the display device 30 based on the composite image data output from the image combining unit 27. The display control unit 28 may display the digital image on the display device 30 based on the digital image data.

The computer 20 may be a general-purpose device or a dedicated device. The structure of the computer 20 is not particularly limited, and may have a physical structure as shown in fig. 2, for example. Specifically, the computer 20 may include a processor 20a, a memory 20b, an auxiliary storage device 20c, an input/output interface 20d, a media drive device 20e, and a communication control device 20f, and may be connected via a bus 20 g.

The processor 20a is, for example, an arbitrary Processing circuit including a CPU (Central Processing Unit). The processor 20a may execute the programs stored in the memory 20b, the auxiliary storage device 20c, and the storage medium 20h to perform programmed processes, thereby realizing the components (the camera control unit 21, the image analysis unit 22, the projection image generation unit 23, and the like) related to the control of the projection device 133 described above. The processor 20a may be configured by a dedicated processor such as an ASIC or FPGA.

The memory 20b is a working memory of the processor 20a. The memory 20b is an arbitrary semiconductor memory such as a RAM (Random Access Memory). The auxiliary storage device 20c is a nonvolatile memory such as an EPROM (Erasable Programmable ROM) or a hard disk drive (HDD). The input/output interface 20d exchanges information with external devices (the microscope 100, the microscope controller 10, the display device 30, the input device 40, and the recognition device 50).

The media drive device 20e can output data stored in the memory 20b and the auxiliary storage device 20c to the storage medium 20h, and can read programs, data, and the like from the storage medium 20 h. The storage medium 20h is an arbitrary recording medium that can be transported. The storage medium 20h includes, for example, an SD card, a USB (Universal Serial Bus) flash memory, a CD (Compact Disc), a DVD (Digital Versatile Disc), and the like.

The communication control device 20f performs input/output of information to/from the network. As the communication control device 20f, for example, an NIC (Network Interface Card), a wireless LAN (Local Area Network) Card, or the like can be used. The bus 20g connects the processor 20a, the memory 20b, the auxiliary storage device 20c, and the like to be able to transfer data to and from each other.

The microscope system 1 configured as described above performs the image projection processing shown in fig. 3. Fig. 3 is a flowchart of the image projection processing performed by the microscope system 1. Fig. 4 is a diagram illustrating the selection screen 31. Next, an image projection method of the microscope system 1 is explained with reference to fig. 3 and 4.

First, the microscope system 1 projects an optical image of the specimen onto the image plane (step S1). Here, the imaging lens 103 condenses the light from the sample collected by the objective lens 102 onto the image plane to form an optical image of the sample.

Then, the microscope system 1 acquires digital image data of the specimen (step S2). Here, the light deflection element 132 deflects a part of the light from the sample collected by the objective lens 102 toward the image pickup element 131. The image pickup element 131 captures an image of the sample based on the light deflected by the light deflection element 132, thereby generating the digital image data.

After that, the microscope system 1 determines an analysis process selected from a plurality of analysis processes prepared in advance (step S3). Here, the user presses the button 36 after selecting, for example, each menu (menu 32, menu 33, menu 34, menu 35) on the selection screen 31 shown in fig. 4. The image analysis unit 22 determines the selected analysis process based on the input operation by the user.

When the analysis processing is determined, the microscope system 1 executes the determined analysis processing (step S4). Here, the image analysis unit 22 performs the analysis processing selected in step S3 on the digital image data acquired in step S2 to obtain an analysis result.

The microscope system 1 generates projection image data based on the analysis result acquired through step S4 and the analysis processing determined through step S3 (step S5). Here, the projection image generating unit 23 generates projection image data indicating a projection image in which the analysis result acquired in step S4 is displayed in the display format according to the analysis processing determined in step S3.

Finally, the microscope system 1 projects the projection image to the image plane (step S6). The projection control unit 25 controls the projection device 133 based on the projection image data, whereby the projection device 133 projects the projection image onto the image plane. Thereby, the projected image is superimposed on the optical image of the sample.
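Steps S2 to S6 of fig. 3 can be strung together roughly as in the following sketch (step S1, the formation of the optical image, is purely optical); `camera`, `projector`, and the callables are hypothetical stand-ins for the image pickup element 131, the projection device 133, and the units of the computer 20.

```python
def image_projection_cycle(camera, projector, select_process, analyze, make_projection):
    """One pass through steps S2-S6 of the image projection processing."""
    image = camera.capture()                        # S2: acquire digital image data
    process = select_process()                      # S3: determine the analysis process
    result = analyze(process, image)                # S4: run the selected analysis
    projection = make_projection(result, process)   # S5: generate projection image data
    projector.project(projection)                   # S6: project onto the image plane
```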

In the microscope system 1, the image analysis result obtained by the computer is displayed on the optical image. Thus, in pathological diagnosis based on an optical image of a sample, a pathologist can obtain various information for assisting diagnosis without moving the eye away from the eyepiece. Further, since the image analysis unit 22 performs the analysis process selected from the plurality of analysis processes, the microscope system 1 can cope with various types of diagnoses. The projection image projected by the projection device 133 has a display format corresponding to the analysis processing. Therefore, the microscope system 1 can provide various information for assisting pathological diagnosis to a pathologist in a display form easy to visually recognize. Therefore, according to the microscope system 1, it is possible to assist pathological diagnosis based on an optical image, and it is possible to reduce the work load of a pathologist.

In the microscope system 1, additional information (the projection image) is displayed on the optical image to assist pathological diagnosis. Therefore, unlike a WSI system that performs pathological diagnosis based on digital images, no expensive equipment is required. Therefore, according to the microscope system 1, the burden on the pathologist can be reduced while avoiding a significant increase in facility cost. In addition, whereas a WSI system requires a WSI (Whole Slide Image) to be created in advance before pathological diagnosis can be performed, the microscope system 1 requires no such advance preparation, and the diagnosis work can be started immediately.

Fig. 5 to 10 are diagrams illustrating images that can be observed through the eyepiece 104 of the microscope system 1. Next, observation performed using the microscope system 1 that executes the image projection processing shown in fig. 3 will be described specifically with reference to fig. 5 to 10.

First, a case where "cell diagnosis", "breast cancer", "IHC staining", and "ER/PgR" are selected on the selection screen 31 shown in fig. 4 will be described with reference to fig. 5 to 7.

When observation using the microscope system 1 is started and the pathologist looks into the eyepiece 104, the pathologist can observe the image V1 shown in fig. 5. The image V1 is the optical image M1 corresponding to the actual field of view formed at the image plane. Stained nuclei of cancer cells appear in the image V1. A dark region R2 exists around the region R1 in which the image V1 is formed. The region R2 is the part of the area on the image plane observable through the eyepiece 104 that light from the objective lens 102 does not reach. At this time, an image corresponding to the image V1 may be displayed on the display device 30 based on the digital image data generated by the image pickup element 131.

The computer 20 then analyzes the digital image data. Nuclei of cells are determined by analysis, and classified according to their staining intensity. For example, nuclei that are not stained are classified as class 0, which indicates negativity. In addition, weakly stained nuclei are classified into class 1+ indicating weak positive. In addition, moderately stained nuclei are classified into class 2+ which indicates moderate positivity. In addition, strongly stained nuclei are classified into class 3+ indicating strong positivity.
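The classification by staining intensity described above amounts to simple thresholding, sketched below; the normalized intensity scale and the threshold values are placeholders, not values given in the text, and as noted later they may be adjusted per pathologist, facility, or national criterion.

```python
def classify_nucleus(stain_intensity: float,
                     thresholds: tuple = (0.1, 0.4, 0.7)) -> str:
    """Classify a nucleus by its normalized (0..1) staining intensity."""
    t1, t2, t3 = thresholds
    if stain_intensity < t1:
        return "0"    # negative
    if stain_intensity < t2:
        return "1+"   # weakly positive
    if stain_intensity < t3:
        return "2+"   # moderately positive
    return "3+"       # strongly positive
```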

The computer 20 generates projection image data based on the analysis result, and the projection device 133 projects the projection image represented by the projection image data onto the image plane. The projection image represented by the projection image data includes a position image composed of graphics representing the positions of the nuclei of the classified cells, and the graphics have a different color for each category.

When the projection image is projected by the projection device 133, the pathologist can observe the image V2 shown in fig. 6. The image V2 is an image in which the projection image including the position image P1 is superimposed on the optical image M1. At this time, an image corresponding to the image V2 may be displayed on the display device 30. Compared with the image V1 (optical image M1) shown in fig. 5, the staining state of each cell can be easily discriminated in the image V2 shown in fig. 6. Therefore, it becomes easy to calculate a score using a prescribed scoring method such as the J-Score used in pathological diagnosis. Therefore, the microscope system 1 assists the positive/negative determination operation by the pathologist, and thus the burden on the pathologist can be reduced.

The projection image represented by the projection image data may include a statistical image T1 including statistical information of the classified cells in addition to the position image P1. In this case, the pathologist can observe the image V3 shown in fig. 7. The image V3 is an image in which a projection image including the position image P1 and the statistical image T1 is superimposed on the optical image M1. At this time, an image corresponding to the image V3 may be displayed on the display device 30. In the image V3 shown in fig. 7, it becomes easier to calculate the score by the statistical image T1. Therefore, the microscope system 1 can assist the pathologist in the positive/negative determination work more favorably, and thus the burden on the pathologist can be further reduced.

The threshold values for the score determinations 0, 1+, 2+, and 3+ indicating the degree of staining intensity may vary depending on the individual pathologist, the guidelines of the hospital facility, and the diagnostic criteria of each country. Accordingly, when the observer compares the optical image M1 with the position image P1 and the statistical image T1 and doubts the statistical image T1 based on the analysis result, the observer may change the threshold values used in the analysis process using the input device. By re-running the analysis with the changed threshold values and confirming in real time a new statistical image T1' that reflects the result, the user is better supported in the task of setting appropriate threshold values. Therefore, the burden on the pathologist can be further reduced.
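The real-time threshold adjustment described here could, for example, be wired to the operation signal through a small handler like the following sketch; the classifier, callback, and data passed in are hypothetical.

```python
from collections import Counter
from typing import Callable, Iterable, Tuple

def on_threshold_changed(classify: Callable[[float, Tuple[float, ...]], str],
                         new_thresholds: Tuple[float, ...],
                         intensities: Iterable[float],
                         redraw: Callable[[Counter], None]) -> None:
    """Re-run the classification with the changed thresholds and redraw T1'.

    `intensities` are the per-nucleus staining intensities already measured;
    `redraw` is a hypothetical callback that regenerates the projection image
    (position image and statistical image) from the new per-category counts."""
    counts = Counter(classify(value, new_thresholds) for value in intensities)
    redraw(counts)
```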

Next, a case where "cell diagnosis", "breast cancer", "IHC staining", and "HER 2" are selected on the selection screen 31 shown in fig. 4 will be described with reference to fig. 8 to 10.

When observation using the microscope system 1 is started and the pathologist looks into the eyepiece 104, the pathologist can observe the image V4 shown in fig. 8. The image V4 is the optical image M2 corresponding to the actual field of view formed at the image plane. Stained cell membranes of cancer cells appear in the image V4. At this time, an image corresponding to the image V4 may be displayed on the display device 30 based on the digital image data generated by the image pickup element 131.

The computer 20 then analyzes the digital image data. Cell membranes are identified by the analysis and classified according to their staining intensity and staining pattern. For example, cells whose membranes are not stained at all are classified as class 0, which indicates negativity. Cells whose membranes are partially or weakly stained are classified as class 1+ indicating weak positivity. Cells whose membranes are stained over a wider area, with the entire membrane stained, are classified as class 2+ indicating moderate positivity. Cells whose membranes are strongly stained, with the entire membrane stained, are classified as class 3+ indicating strong positivity.

The computer 20 generates projection image data based on the analysis result, and the projection device 133 projects the projection image represented by the projection image data onto the image plane. The projection image represented by the projection image data includes a position image P2 composed of graphics representing the positions of the classified cells, the graphics having a different format (line type) for each category.

When the projection image is projected by the projection device 133, the pathologist can observe an image V5 shown in fig. 9. The image V5 is an image in which the projected image including the position image P2 is superimposed on the optical image M2. At this time, an image corresponding to the image V5 may be displayed on the display device 30. In comparison with the image V4 (optical image M2) shown in fig. 8, the stained state of the cell membrane can be easily discriminated in the image V5 shown in fig. 9. Therefore, it becomes easy to calculate a score using a prescribed scoring method used in pathological diagnosis. Therefore, the microscope system 1 assists the positive/negative determination operation by the pathologist, and thus the burden on the pathologist can be reduced.

The projection image represented by the projection image data may include a statistical image T2 including statistical information of the classified cells in addition to the position image P2. In this case, the pathologist can observe the image V6 shown in fig. 10. The image V6 is an image in which a projection image including the position image P2 and the statistical image T2 is superimposed on the optical image M2. In this example, the statistical image T2 is projected to the outside of the region R1 through which the light beam from the objective lens 102 passes. At this time, an image corresponding to the image V6 may be displayed on the display device 30. In the image V6 shown in fig. 10, by the statistical image T2, it becomes easier to calculate a score. Therefore, the microscope system 1 can assist the pathologist in the positive/negative determination work more favorably, and thus the burden on the pathologist can be further reduced.

In the examples of figs. 5 to 10, projection images whose display forms differ in the format of the constituent graphics (presence or absence of fill, line type) are illustrated; however, the color of the projection image may instead be varied according to the analysis processing.

The image analysis unit 22 of the microscope system 1 may perform a plurality of analysis processes using a plurality of predetermined algorithms, or may perform a plurality of analysis processes using a plurality of trained neural networks.

The parameters of each trained neural network may be generated by training the network on a device separate from the microscope system 1, and the computer 20 may download the generated parameters and apply them in the image analysis unit 22. By downloading the parameters of a new neural network, the computer 20 can also add, as needed, analysis processes selectable by the image analysis unit 22.

Fig. 11 is a diagram showing the structure of the neural network NN. The neural network NN has an input layer, a plurality of intermediate layers, and an output layer. The output data D2, obtained by inputting the input data D1 to the input layer, is compared with the correct data D3, and the parameters of the neural network NN are updated by learning with the error back propagation method. The set of the input data D1 and the correct data D3 is the training data for supervised learning.
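
For illustration, a minimal supervised training step of this kind might look as follows in PyTorch; the layer sizes, optimizer settings, and data shapes are placeholders and do not describe the actual network NN.

```python
import torch
from torch import nn

# Placeholder network: input layer, one intermediate layer, output layer.
model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 4))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def training_step(input_data, correct_data):
    """One supervised update: compare the output with the correct data and backpropagate."""
    output_data = model(input_data)            # D2 = NN(D1)
    loss = loss_fn(output_data, correct_data)  # compare D2 with D3
    optimizer.zero_grad()
    loss.backward()                            # error back propagation
    optimizer.step()                           # update the parameters of NN
    return loss.item()
```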

[ second embodiment ]

Fig. 12 is a diagram showing the configuration of the microscope system 2 according to the present embodiment. The microscope system 2 differs from the microscope system 1 in that it includes a microscope 200 instead of the microscope 100. The microscope 200 includes a projection unit 140 between the microscope body 110 and the lens barrel 120.

The projection unit 140 is a projection unit for a microscope including the objective lens 102, the imaging lens 103, and the eyepiece 104, and includes an intermediate barrel 130. That is, the projection unit 140 includes: an image pickup device 131 as an example of an image pickup apparatus that acquires digital image data of a sample based on light from the sample; and a projection device 133 that projects a projection image onto an image plane where an optical image is formed.

The projection unit 140 further includes an image analysis unit 142 and a projection image generation unit 143. The projection unit 140 may include a camera control unit 141, an information acquisition unit 144, and a projection control unit 145.

The camera control unit 141, the image analysis unit 142, the projection image generation unit 143, and the projection control unit 145 are the same as the camera control unit 21, the image analysis unit 22, the projection image generation unit 23, and the projection control unit 25, respectively. Therefore, detailed description is omitted.

The information acquisition unit 144 acquires operation information of the user based on an operation signal from the input device 40 acquired via the computer 20. Further, the information acquisition unit 144 acquires the identification information from the identification device 50 via the computer 20.

In the present embodiment, the same effect as that of the microscope system 1 can be obtained simply by attaching the projection unit 140 to an existing microscope. Thus, with the projection unit 140 and the microscope system 2, an existing microscope system can easily be extended to assist a pathologist's pathological diagnosis based on an optical image.

[ third embodiment ]

Fig. 13 is a diagram showing the configuration of the computer 60 included in the microscope system 3 according to the present embodiment. The microscope system 3 is the same as the microscope system 1 except that it includes the computer 60 in place of the computer 20.

The computer 60 controls the microscope system 3 as a whole. Like the computer 20, the computer 60 is connected to the microscope 100, the microscope controller 10, the display device 30, the input device 40, and the identification device 50.

The computer 60 mainly includes a camera control unit 61, a projection image generation unit 63, an information acquisition unit 64, a projection control unit 65, an image recording unit 66, an image combining unit 67, and a display control unit 68, as components related to the control of the projection device 133.

The camera control unit 61, the information acquisition unit 64, the projection control unit 65, the image recording unit 66, the image combining unit 67, and the display control unit 68 correspond to the camera control unit 21, the information acquisition unit 24, the projection control unit 25, the image recording unit 26, the image combining unit 27, and the display control unit 28 included in the computer 20, respectively.

The computer 60 differs from the computer 20 mainly in that it does not include a component corresponding to the image analysis unit 22, and in that the projection image generation unit 63 performs processing different from that of the projection image generation unit 23.

The projection image generation unit 63 generates projection image data based on a diagnosis standard selected from a plurality of diagnosis standards. A diagnosis standard is a set of rules that includes the procedure from the start to the end of a diagnosis, the criteria for judgment, and the like. The projection image represented by the projection image data is an image corresponding to the selected diagnosis standard. The projection image data generated by the projection image generation unit 63 is output to the projection control unit 65, the image recording unit 66, and the image combining unit 67.

The method of selecting the diagnosis standard in the projection image generation unit 63 is the same as the method of selecting the analysis process in the image analysis unit 22 of the microscope system 1. That is, the projection image generation unit 63 may select the diagnosis standard based on an input operation by the user, or based on the identification information acquired by the identification device 50.
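
As a toy illustration of this selection, the sketch below prefers an explicit user selection and otherwise falls back to the identification information; the standard names, attributes, and fallback rule are assumptions, not part of this disclosure.

```python
# Standard names, attributes, and the fallback rule are illustrative assumptions.
DIAGNOSIS_STANDARDS = {
    "IHC-HER2": {"guide_color": "green"},
    "Bethesda": {"guide_color": "yellow"},
}

def select_diagnosis_standard(user_choice, identification_info):
    """Prefer an explicit user selection; otherwise infer from identification info."""
    if user_choice in DIAGNOSIS_STANDARDS:
        return user_choice
    if identification_info and "HER2" in identification_info:
        return "IHC-HER2"
    return "Bethesda"   # assumed default when nothing else applies
```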

The projection image may include guidance on the diagnostic procedure of the selected diagnosis standard, and the projection image generation unit 63 may determine the color of that guidance according to the selected diagnosis standard. This allows the pathologist to refer to the diagnostic procedure without taking the eye away from the eyepiece 104, so the diagnosis can proceed efficiently and without procedural delays.

The projection image may include a reference image showing a judgment criterion of the selected diagnosis standard. This allows the pathologist to check the optical image and the reference image at the same time, so a shorter diagnosis time, improved diagnostic accuracy, and the like can be expected.

The projection image generation unit 63 may determine the positional relationship between the projection image and the optical image on the image plane in accordance with the selected diagnosis standard, or may determine whether or not to project at least a part of the projection image outside the area of the optical image.

When the settings of the microscope system 3 do not satisfy the requirements of the selected diagnosis standard, the projection image generation unit 63 may generate projection image data such that the projection image includes a warning display. This allows the pathologist to make the various judgments involved in the diagnosis under correct conditions, so an improvement in diagnostic accuracy can be expected.

The projection control unit 65 may control the projection device 133 according to the setting of the microscope system 3. Specifically, the projection control unit 65 may determine whether or not to project the projection image on the image plane based on the setting of the microscope system 3, or may control the projection device 133 so that the projection device 133 projects the projection image on the image plane when the microscope system 3 is set to a predetermined setting. That is, the microscope system 3 can change whether or not to project the projection image onto the image plane according to the setting.

Some diagnosis standards include a step of measuring the area and position of structures such as cancer cells, or the distance between structures. In this case, the projection image generation unit 63 may generate the projection image data so that it includes the results measured with the length measurement function of the microscope, and the projection control unit 65 may project a projection image including those measurement results onto the image plane.

The microscope system 3 configured as described above performs the image projection processing shown in fig. 14. Fig. 14 is a flowchart of the image projection processing performed by the microscope system 3. Next, an image projection method of the microscope system 3 will be described with reference to fig. 14.

First, the microscope system 3 projects an optical image of the specimen onto an image plane (step S11). The process is the same as step S1 shown in fig. 3.

Next, the microscope system 3 determines a diagnosis standard selected from a plurality of diagnosis standards prepared in advance (step S12). Here, the user selects a diagnosis standard on the selection screen, for example, and the projection image generating unit 63 specifies the selected diagnosis standard based on the input operation by the user.

When the diagnosis standard is determined, the microscope system 3 generates projection image data based on the determined diagnosis standard (step S13). Here, the projection image generation unit 63 generates projection image data representing the projection image corresponding to the diagnosis standard determined in step S12.

Finally, the microscope system 3 projects the projection image to the image plane (step S14). The projection control unit 65 controls the projection device 133 based on the projection image data, whereby the projection device 133 projects the projection image onto the image plane. Thereby, the projected image is superimposed on the optical image of the sample.
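
For reference, the image projection method of fig. 14 can be summarized schematically as follows; the objects and method names are placeholders standing in for the microscope, the projection image generation unit 63, and the projection control unit 65, not an actual API.

```python
def run_image_projection(microscope, generator, projection_control, user_selection):
    """Schematic rendering of steps S11 to S14; all objects are placeholders."""
    microscope.project_optical_image()                        # S11: optical image on the image plane
    standard = generator.determine_standard(user_selection)   # S12: determine the diagnosis standard
    projection_data = generator.generate(standard)            # S13: generate projection image data
    projection_control.project(projection_data)               # S14: superimpose the projection image
```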

In the microscope system 3, a projection image corresponding to the diagnosis standard is displayed on the optical image. Thus, in pathological diagnosis based on an optical image of a sample, the pathologist can obtain various information that assists the diagnosis, such as the diagnostic procedure and the judgment criteria, without taking the eye away from the eyepiece. In addition, since the projection image generation unit 63 generates projection image data corresponding to a diagnosis standard selected from a plurality of diagnosis standards, the microscope system 3 can support a variety of diagnosis standards. Therefore, according to the microscope system 3, pathological diagnosis based on an optical image can be assisted and the workload of the pathologist can be reduced.

Like the microscope system 1, the microscope system 3 reduces the pathologist's burden while avoiding a large increase in equipment cost, and, unlike a WSI system, requires no advance preparation, so diagnostic work can be started immediately.

Figs. 15 to 18 are diagrams illustrating images that can be observed through the eyepiece 104 of the microscope system 3. Next, observation performed with the microscope system 3 executing the image projection processing shown in fig. 14 will be described specifically with reference to figs. 15 to 18. Here, a diagnostic standard for determining overexpression of the HER2 protein by IHC staining (hereinafter referred to as the IHC-HER2 diagnostic standard) is described as an example.

When observation with the microscope system 3 is started and the pathologist looks into the eyepiece 104, the pathologist can observe the image V7 shown in fig. 15. The image V7 is an image in which a projection image including the guide image G1 is superimposed on the optical image M2. The optical image M2 is an image obtained with, for example, the 4x objective lens 102 and shows stained cell membranes of cancer cells. The guide image G1 guides the diagnostic procedure of the IHC-HER2 diagnostic standard: the standard prescribes observing the HER2-positive staining pattern, the staining intensity, and the proportion of positive cells with a 4x objective lens, and the guide image G1 guides this step.

In the microscope system 3, the image V8 may be projected onto the image plane instead of the image V7. The image V8 is an image in which a projection image including the guide image G1 and the contrast image C1 is superimposed on the optical image M2. The contrast image C1 contains a plurality of reference images representing the judgment criteria of the IHC-HER2 diagnostic standard; more specifically, it contains four reference images exemplifying images that should be judged as score 0, score 1+, score 2+, and score 3+. The pathologist can refer to the contrast image C1 when determining the proportion of positive cells.

Thereafter, the image V9 is projected onto the image plane. The image V9 is an image in which a projection image including the guide image G2 is superimposed on the optical image M2. The guide image G2 also guides the diagnostic procedure of the IHC-HER2 diagnostic standard: the standard stipulates that observation with the 4x objective lens is followed by observation with a 10x objective lens, and the guide image G2 guides this step.

If switching from the 4x objective lens 102 to the 10x objective lens 102a is not detected after the image V9 is projected, the image V10 is projected onto the image plane. The image V10 is an image in which a projection image including the guide image G3 is superimposed on the optical image M2. The guide image G3 is a warning display informing the pathologist that the diagnosis is not being performed according to the diagnosis standard. Because the warning makes the procedural error apparent, the microscope system 3 prevents the diagnosis from proceeding with an incorrect procedure.
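
The guidance sequence described above can be summarized by a small decision rule; the step names and the escalation from G2 to the warning G3 below are an illustrative reading of the procedure, not the actual control logic of the microscope system 3.

```python
def guide_for_step(step, detected_objective, already_prompted):
    """Choose which guide image to project (illustrative reading of the procedure)."""
    if step == "observe_at_4x":
        return "G1"                                   # guide the 4x observation
    if step == "switch_to_10x":
        if detected_objective != "10x":
            # prompt once with G2; escalate to the warning G3 if the switch
            # is still not detected afterwards
            return "G3" if already_prompted else "G2"
        return ""                                     # switch detected: no guidance needed
    return ""
```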

[ fourth embodiment ]

Fig. 19 is a diagram showing the configuration of a diagnosis assistance system including the microscope system 4 according to the present embodiment and an external browsing system 300. The microscope system 4 differs from the microscope system 1 in that it includes a computer 70 instead of the computer 20.

The microscope system 4 is connected to one or more external browsing systems 300 via the Internet 400. Each external browsing system 300 includes a computer 310 having at least a communication control unit 311, an input device 320, and a display device 330.

The Internet 400 is an example of a communication network. The microscope system 4 and the external browsing system 300 may instead be connected via, for example, a VPN (Virtual Private Network), a dedicated line, or the like.

The computer 70 is different from the computer 20 in that it includes a communication control unit 29. The communication control unit 29 exchanges data with the external browsing system 300.

For example, the communication control unit 29 transmits image data to the external browsing system 300. The transmitted image data may be, for example, composite image data generated by the image combining unit 27; alternatively, the digital image data and the projection image data may be transmitted separately, or only the digital image data may be transmitted. In the external browsing system 300, the computer 310 that receives the image data displays an image on the display device 330 based on that data. The computer 310 may, for example, generate composite image data from the digital image data and the projection image data and display a composite image on the display device 330 based on it.
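
As a rough illustration of such a transfer, the following sketch sends image bytes with a small JSON header over a TCP connection; the wire format and the function name are assumptions for illustration, not the actual protocol of the communication control unit 29.

```python
import json
import socket

def send_image_data(host, port, image_bytes, kind="composite"):
    """Send image bytes preceded by a small JSON header (illustrative wire format)."""
    header = json.dumps({"kind": kind, "length": len(image_bytes)}).encode() + b"\n"
    with socket.create_connection((host, port)) as conn:
        conn.sendall(header)       # describe what follows
        conn.sendall(image_bytes)  # raw image payload
```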

The communication control unit 29 receives operation information input by a user of the external browsing system 300, for example. The image analysis unit 22 may select the analysis processing based on the operation information received by the communication control unit 29. The microscope system 4 may project a projection image based on an input operation of a user of the external browsing system 300 onto the image plane using the projection device 133.

The microscope system 4 can thus exchange data interactively with an external browsing system 300 connected over the network, so pathological diagnosis can be performed while communicating with users at remote locations.

The above-described embodiments show specific examples for facilitating understanding of the present invention, and the embodiments of the present invention are not limited to these examples. The microscope system, the projection unit, and the image projection method can be variously modified and changed without departing from the scope of claims.

Fig. 12 illustrates the projection unit 140 including the image analysis unit 142, which performs an analysis process selected from a plurality of analysis processes. However, the projection unit may instead include a projection image generation unit that generates projection image data based on a diagnosis standard selected from a plurality of diagnosis standards.

Fig. 19 illustrates a microscope system 4 in which a communication function with an external browsing system 300 is added to the microscope system 1. However, a new microscope system may be configured by adding a communication function with the external browsing system 300 to the microscope system 2.

Fig. 1 shows an example in which the projection image generation unit 23 generates projection image data based on the selected analysis process and its analysis result. However, the projection image generation unit 23 may generate first projection image data based on the selected analysis process and its analysis result, and second projection image data based on a selected diagnosis standard. The projection device 133 may then project both the first projection image represented by the first projection image data and the second projection image represented by the second projection image data onto the optical image.

In figs. 5 to 10, examples of the pathological diagnosis of breast cancer are shown, but the microscope system 1 can also be used in the pathological diagnosis of other cancers such as cervical cancer. For example, when applied to the pathological diagnosis of cervical cancer, classification may be performed based on the Bethesda system, and classification results such as NILM, LSIL, HSIL, SCC, and ASC may be displayed using the projection image. In genomic diagnosis, the number of tumor cells, the total number of cells, their ratio, and the like may be displayed using the projection image.

Although an example in which the display form of the projection image is changed according to the analysis process has been shown, the microscope system 1 may also change the display form in response to a change in the setting of the illumination optical system or the observation optical system.

Although the image analysis unit 22 has been described as being provided in the computer 20 of the microscope system 1, the image analysis unit 22 may be implemented jointly by the computer 20 inside the microscope system 1 and a remote module outside it. The remote module is, for example, a server on the cloud. The computer 20 may support new analysis processes by downloading the latest programs from the remote module and updating its analysis programs, and may likewise support new diagnosis standards by downloading the latest programs from the remote module.

The microscope included in the microscope system 1 may be, for example, the microscope 500 shown in fig. 20. In the above-described embodiments, a configuration in which the intermediate barrel 130 includes the image pickup element 131 is exemplified, but the image pickup element 151 that acquires the digital image data used for image analysis may instead be provided in a digital camera 150 attached to a trinocular barrel 120a, as shown in fig. 20. In this case, however, light emitted from the projection device 133 in the intermediate barrel 130a also enters the image pickup element 151. The digital camera 150 may therefore be controlled so that the light-emission period of the projection device 133 does not overlap the exposure period of the image pickup element 151, which prevents the projection image from appearing in the digital image.
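
One way to realize such timing control is sketched below as a simple time-division loop; the camera and projector objects and their methods are hypothetical placeholders, not an actual API of the microscope system, and the durations are arbitrary.

```python
import time

def interleave_capture_and_projection(camera, projector, exposure_s=0.05, emit_s=0.02, frames=10):
    """Time-division control so the projector never emits during an exposure.

    `camera` and `projector` are hypothetical objects assumed to provide
    expose(), on(), and off(); durations are placeholders.
    """
    for _ in range(frames):
        projector.off()             # keep the image plane dark during exposure
        camera.expose(exposure_s)   # exposure window of the image pickup element
        projector.on()              # emission window of the projection device
        time.sleep(emit_s)
    projector.off()
```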

The microscope system including the microscope 500 may also be used for observation of moving objects, for example sperm sorting in artificial insemination. In sperm sorting, sperm quality is judged from information that can be determined from a still image, such as sperm shape (hereinafter, shape information), and from information that can only be determined from a moving image or a plurality of still images, such as the straightness and speed of sperm movement (hereinafter, movement information). In the analysis processing that assists the sorting work, the movement information may therefore be output so that it is reflected in the projection image, and the sperm candidates to be selected by the user may be identified using both the shape information and the movement information.

More specifically, when the user starts observing the sperm to be sorted, the image analysis unit 22 analyzes a plurality of pieces of digital image data acquired at different times and calculates the movement trajectory of each sperm. The projection image generation unit 23 then generates projection image data based on the analysis result, and the projection device 133 projects a projection image including the trajectory display MT onto the image plane. The trajectory display MT shows the path along which each sperm has moved to its current position, and may show the movement from a point a predetermined time in the past (for example, 3 seconds) up to the present.
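
A minimal sketch of how such a trajectory history could be maintained is shown below; sperm detection itself is outside its scope, and the identifiers, coordinates, and the 3-second window are illustrative assumptions.

```python
from collections import defaultdict, deque

HISTORY_S = 3.0   # trace back a predetermined time (e.g. 3 seconds)

trajectories = defaultdict(deque)   # sperm id -> deque of (timestamp, (x, y))

def update_trajectories(timestamp, detections):
    """Append the latest detected positions and drop points older than HISTORY_S.

    detections is an assumed input: {sperm_id: (x, y)} for the current frame.
    Returns {sperm_id: [(x, y), ...]} suitable for drawing the trajectory display MT.
    """
    for sperm_id, position in detections.items():
        track = trajectories[sperm_id]
        track.append((timestamp, position))
        while track and timestamp - track[0][0] > HISTORY_S:
            track.popleft()
    return {sid: [p for _, p in track] for sid, track in trajectories.items()}
```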

Thus, the user first observes the image V11 including the optical image M3 shown in fig. 21 and, once the analysis processing by the image analysis unit 22 is completed, can observe the image V12 including the optical image M4 and the auxiliary image A1 shown in fig. 22. Because the auxiliary image A1 includes the trajectory display MT for each sperm, a user observing the image V12 can grasp the characteristics of each sperm's movement in addition to its shape. This makes it easier to judge sperm quality and reduces the burden of the sorting work.

The image analysis unit 22 further analyzes the digital image data of the superimposed image in which the auxiliary image A1 is superimposed on the optical image M4, and thereby identifies the sperm candidates to be selected by the user. The projection image generation unit 23 then generates projection image data based on the analysis result, which includes the position information of those sperm, and the projection device 133 projects a projection image including the region-of-interest display ROI onto the image plane. The region-of-interest display ROI prompts the user to pay attention to the sperm identified by the image analysis unit 22 and is, for example, a rectangular or circular graphic surrounding the sperm. The probability that a sperm is of good quality may also be indicated by the color of the graphic.
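
The following sketch illustrates one way the shape information and movement information could be combined into a candidate score for the region-of-interest display; the features, weights, and threshold are assumptions made for illustration only.

```python
def candidate_score(shape_score, straightness, speed_um_s):
    """Combine shape and movement information into one score (weights are assumptions)."""
    speed_score = min(speed_um_s / 50.0, 1.0)   # assume ~50 um/s saturates the speed term
    return 0.5 * shape_score + 0.3 * straightness + 0.2 * speed_score

def select_roi_candidates(sperm_features, threshold=0.7):
    """Return ids of sperm whose combined score exceeds an assumed threshold."""
    return [sid for sid, f in sperm_features.items()
            if candidate_score(f["shape"], f["straightness"], f["speed"]) >= threshold]
```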

The user can thereby observe the image V13 including the optical image M4 and the auxiliary image A2 shown in fig. 23. The auxiliary image A2 includes the region-of-interest display ROI in addition to the trajectory display MT. As with the image V12, the user can therefore grasp at a glance the characteristics of each sperm's movement as well as its shape, and by preferentially observing the sperm identified by the region-of-interest display ROI, the user can select good-quality sperm quickly. The burden of the sorting work is thus reduced further.

The microscope included in the microscope system 1 may also be, for example, the microscope 600 shown in fig. 24. The microscope 600 includes an intermediate barrel 130b instead of the intermediate barrel 130, and the intermediate barrel 130b includes a projection device 135 that uses a transmissive liquid crystal device. In the above-described embodiments, the light emitted from the projection device 133 is deflected by the light deflecting element 134 arranged on the optical path between the objective lens 102 and the eyepiece 104 so that the projection image is projected onto the image plane; as shown in fig. 24, however, the projection device 135 itself may instead be arranged on the optical path between the objective lens 102 and the eyepiece 104.

In the above-described embodiments, an image pickup element is used as the photodetector, but the photodetector is not limited to an image pickup element. For example, the technique described above may be applied to a scanning microscope, in which case the photodetector may be a photomultiplier tube (PMT) or the like.

The microscope system may adjust the brightness of at least one of the optical image and the projection image according to the selected analysis process, or according to the selected diagnosis standard. The brightness may be adjusted by controlling the light amount of the light source, or by controlling the amount of transmitted light with a variable ND filter or the like.

In the above-described embodiments, a keyboard, a mouse, a joystick, a touch panel, and the like are exemplified as the input device 40, but the input device 40 may also be a device that receives voice input, such as a microphone. In this case, the computer 20 may have a function of recognizing voice instructions input from the input device 40; for example, the information acquisition unit 24 in the computer 20 may convert voice data into operation information with voice recognition technology and output the operation information to the projection image generation unit 23.
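
As a toy illustration of converting a recognized voice instruction into operation information, the sketch below maps a transcribed phrase to an operation; the command vocabulary and operation fields are purely illustrative, and the speech-to-text step itself is omitted.

```python
# The command vocabulary and operation fields are purely illustrative; the
# speech-to-text step itself is omitted.
VOICE_COMMANDS = {
    "show guide": {"operation": "toggle_guide", "value": True},
    "hide guide": {"operation": "toggle_guide", "value": False},
    "next step": {"operation": "advance_procedure"},
}

def voice_to_operation(transcript):
    """Map a transcribed voice instruction to operation information, if recognized."""
    text = transcript.strip().lower()
    for phrase, operation in VOICE_COMMANDS.items():
        if phrase in text:
            return operation
    return None   # unrecognized instruction: no operation information is output
```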

Description of the reference numerals

1, 2, 3, 4: a microscope system; 10: a microscope controller; 20, 60, 70, 310: a computer; 20a: a processor; 20b: a memory; 20c: a secondary storage device; 20d: an input/output interface; 20e: a medium drive device; 20f: a communication control device; 20g: a bus; 20h: a storage medium; 21, 61, 141: a camera control unit; 22, 142: an image analysis unit; 23, 63, 143: a projection image generation unit; 24, 64, 144: an information acquisition unit; 25, 65, 145: a projection control unit; 26, 66: an image recording unit; 27, 67: an image combining unit; 28, 68: a display control unit; 29, 311: a communication control unit; 30, 330: a display device; 31: a selection screen; 32, 33, 34, 35: a menu; 36: a button; 40, 320: an input device; 50: an identification device; 100, 200, 500, 600: a microscope; 101: a mounting table; 102, 102a: an objective lens; 103: an imaging lens; 104: an eyepiece; 110: a microscope body; 111: a turret; 120, 120a: a lens barrel; 130, 130a, 130b: an intermediate barrel; 131, 151: an image pickup element; 132, 134: a light deflecting element; 133, 135: a projection device; 140: a projection unit; 150: a digital camera; 300: an external browsing system; 400: the Internet; A1, A2: an auxiliary image; C1: a contrast image; D1: input data; D2: output data; D3: correct data; G1, G2, G3: a guide image; M1, M2, M3, M4: an optical image; MT: a trajectory display; NN: a neural network; P1, P2: a position image; R1, R2: a region; ROI: a region-of-interest display; T1, T2: a statistical image; V1 to V13: an image.
