Microscope system, projection unit, and image projection method

Document No.: 664665  Publication date: 2021-04-27

Description: This technology, "Microscope system, projection unit, and image projection method", was devised by 壁谷章文, 城田哲也, and 中田竜男 on 2018-12-25. Its main content is as follows: A microscope system (1) includes: an eyepiece (104); an objective lens (102) that guides light from the sample to the eyepiece (104); and an imaging lens (103) that is disposed on the optical path between the eyepiece (104) and the objective lens (102) and forms an optical image of the sample based on light from the sample. The microscope system (1) further includes a projection device (133) that projects a first projection image and a second projection image, distinguishably from each other, onto the image plane on which the optical image is formed. The first projection image is an image based on the result of a computer's analysis of digital image data of the sample, and the second projection image is an image based on an input operation by a user.

1. A microscope system is characterized by comprising:

an eyepiece;

an objective lens that guides light from a specimen to the eyepiece;

an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen; and

a projection device that projects a first projection image and a second projection image so as to be distinguishable from each other onto an image plane on which the optical image is formed, wherein the first projection image is an image based on a result of analysis of digital image data of the specimen by a computer, and the second projection image is an image based on an input operation by a user.

2. The microscope system of claim 1,

wherein the second projection image is displayed in a display form different from that of the first projection image.

3. The microscope system of claim 2,

wherein the display form includes the color of the image or the format of the lines constituting the image.

4. The microscope system according to any one of claims 1 to 3,

wherein a projection period of the second projection image is different from a projection period of the first projection image.

5. The microscope system according to any one of claims 1 to 4,

further comprising a projection control unit that controls projection performed by the projection device,

wherein the projection control unit performs the following operations:

determining whether to project each of the first projection image and the second projection image onto the image plane according to a setting of the microscope system; and

controlling the projection device such that the projection device projects the first projection image and the second projection image onto the image plane when the microscope system is in a predetermined setting.

6. The microscope system according to any one of claims 1 to 5, further comprising:

a photodetector that detects light from the sample, wherein the digital image data is generated based on a detection result of the photodetector; and

a first light deflection element that is disposed on an optical path between the eyepiece and the objective lens and deflects light from the sample toward the photodetector.

7. The microscope system of claim 6,

the projector further includes a second light deflection element that is disposed on an optical path between the image plane and the first light deflection element and deflects light emitted from the projector toward the image plane.

8. The microscope system according to any one of claims 1 to 7,

further comprising an image analysis unit that analyzes the digital image data and outputs the analysis result.

9. The microscope system of claim 8,

wherein the image analysis unit performs the following operations:

classifying one or more structures represented in a digital image represented by the digital image data into one or more categories; and

outputting the analysis result including information for determining a position of a structure classified into at least one of the one or more categories,

wherein the first projection image includes a graphic representing the position of the structure classified into the at least one category.

10. The microscope system according to claim 8 or 9,

wherein the image analysis unit analyzes the digital image data using a trained neural network.

11. The microscope system according to any one of claims 1 to 10,

wherein the second projection image includes a graphic representing a region of interest specified by the user.

12. The microscope system according to claim 11, further comprising:

a mounting table on which the sample is mounted;

a movement amount calculation unit that calculates a movement amount of the mounting table; and

a projection image generation unit that generates second projection image data representing the second projection image based on the input operation and the movement amount, wherein the projection image generation unit changes a position of the graphic included in the second projection image on the image plane in accordance with the movement amount.

13. The microscope system of claim 11,

further comprising a projection image generation unit that generates second projection image data representing the second projection image based on the input operation and a magnification of the optical image, wherein the projection image generation unit changes a size of the graphic included in the second projection image on the image plane in accordance with the magnification of the optical image.

14. The microscope system according to any one of claims 1 to 13,

further comprising an image recording unit that records the digital image data, first projection image data representing the first projection image, and second projection image data representing the second projection image,

wherein the image recording unit records each of the first projection image data and the second projection image data in association with the digital image data, in an area different from that of the digital image data.

15. The microscope system of claim 14,

further comprising an identification device that acquires identification information attached to the sample,

wherein the image recording unit records the identification information acquired by the identification device in association with the digital image data.

16. The microscope system according to any one of claims 1 to 15, further comprising:

an image synthesizing unit that generates synthetic image data of a synthetic image obtained by synthesizing the digital image represented by the digital image data, the first projection image data representing the first projection image, and the second projection image data representing the second projection image, based on the digital image data, the first projection image data, and the second projection image data; and

a display control unit that displays the digital image or the composite image on a display device.

17. The microscope system of claim 16,

further comprising a communication control unit that transmits image data to an external browsing system connected to the microscope system via a network,

wherein the external browsing system includes the display device.

18. The microscope system according to any one of claims 1 to 16,

further comprising a communication control unit that receives operation information input by a user of an external browsing system connected to the microscope system via a network,

wherein the second projection image is an image based on an input operation by the user of the external browsing system.

19. A projection unit for a microscope provided with an objective lens, an imaging lens, and an eyepiece, the projection unit comprising:

an imaging device that acquires digital image data of a sample based on light from the sample; and

a projection device that projects a first projection image and a second projection image so as to be distinguishable from each other onto an image plane on which an optical image of the sample is formed by the imaging lens, wherein the first projection image is an image based on an analysis result of the digital image data, and the second projection image is an image based on an input operation by a user.

20. An image projection method performed by a microscope system, characterized in that

the microscope system performs the following operations:

acquiring digital image data of a sample;

acquiring information on an input operation by a user; and

projecting a first projection image and a second projection image so as to be distinguishable from each other onto an image plane on which an optical image of the sample is formed based on light from the sample, wherein the first projection image is an image based on a result of analysis of the digital image data by a computer, and the second projection image is an image based on the input operation.

Technical Field

The disclosure of the present specification relates to a microscope system, a projection unit, and an image projection method.

Background

As one technique for reducing the burden on pathologists in pathological diagnosis, the WSI (Whole Slide Imaging) technique is attracting attention. The WSI technique creates a WSI (Whole Slide Image), a digital image of the entire specimen on a slide. By displaying the WSI on a monitor and performing diagnosis on the digital image, a pathologist can enjoy various benefits: no cumbersome microscope operation is required during diagnosis, the display magnification can be changed easily, and a plurality of pathologists can participate in the diagnosis at the same time. Such a WSI technique is described in Patent Document 1, for example.

Documents of the prior art

Patent document

Patent Document 1: Japanese Kohyo Publication No. 2001-519944

Disclosure of Invention

Problems to be solved by the invention

On the other hand, a system to which the WSI technique is applied (hereinafter referred to as a WSI system) is required to have high performance. For example, because information on color and shading is extremely important in pathological diagnosis, a WSI system must offer high color reproducibility and a wide dynamic range. Consequently, each device constituting a WSI system must be an expensive, high-performance device, and as a result the users who can introduce a WSI system are limited.

Against this background, a new technique is sought that assists pathologists in pathological diagnosis based on the optical image (analog image) obtained with an optical microscope, thereby reducing their burden.

An object of one aspect of the present invention is to provide a diagnosis support technique for supporting pathological diagnosis by a pathologist based on an optical image.

Means for solving the problems

A microscope system according to an embodiment of the present invention includes: an eyepiece; an objective lens that guides light from a specimen to the eyepiece; an imaging lens disposed on an optical path between the eyepiece and the objective lens, and configured to form an optical image of the specimen based on light from the specimen; and a projection device that projects a first projection image and a second projection image so as to be distinguishable from each other on an image plane on which the optical image is formed, wherein the first projection image is an image based on a result of analysis of digital image data of the sample by a computer, and the second projection image is an image based on an input operation by a user.

A projection unit according to an aspect of the present invention is a projection unit for a microscope including an objective lens, an imaging lens, and an eyepiece, the projection unit including: an imaging device that acquires digital image data of a sample based on light from the sample; and a projection device that projects a first projection image and a second projection image so as to be distinguishable from each other on an image plane on which an optical image of the sample is formed by the imaging lens, wherein the first projection image is an image based on an analysis result of the digital image data, and the second projection image is an image based on an input operation by a user.

An image projection method according to an aspect of the present invention is an image projection method performed by a microscope system that performs the following operations: acquiring digital image data of a sample; acquiring information of input operation of a user; and projecting a first projection image and a second projection image in a manner distinguishable from each other onto an image plane on which an optical image of the specimen is formed based on light from the specimen, where the first projection image is an image based on a result of analysis of the digital image data by a computer, and the second projection image is an image based on the input operation.

Advantageous Effects of Invention

According to the above-described aspect, it is possible to assist a pathologist in performing pathological diagnosis based on an optical image.

Drawings

Fig. 1 is a diagram showing a configuration of a microscope system 1.

Fig. 2 is a diagram showing the structure of the computer 20.

Fig. 3 is a flowchart of the image projection processing performed by the microscope system 1.

Fig. 4 is a diagram illustrating the distribution of cells.

Fig. 5 is a diagram showing an example of observation performed using the microscope system 1.

Fig. 6 is a diagram showing another example of observation performed using the microscope system 1.

Fig. 7 is a diagram showing still another example of observation performed using the microscope system 1.

Fig. 8 is a diagram showing the structure of a neural network.

Fig. 9 is a diagram showing the structure of the microscope system 2.

Fig. 10 is a diagram showing the configuration of a diagnosis assistance system including the microscope system 3 and the external browsing system 300.

Fig. 11 is a diagram showing an example of observation performed using the microscope system 3.

Fig. 12 is a diagram showing the structure of the microscope 500.

Fig. 13 is a diagram showing the structure of the microscope 600.

Detailed Description

[First Embodiment]

Fig. 1 is a diagram showing a configuration of a microscope system 1 according to the present embodiment. Fig. 2 is a diagram showing the structure of the computer 20. The microscope system 1 is a microscope system used by a pathologist for pathological diagnosis, and includes at least an objective lens 102, an imaging lens 103, an eyepiece 104, and a projection device 133.

Using the projection device 133, the microscope system 1 projects two types of projection images onto the image plane on which the objective lens 102 and the imaging lens 103 form an optical image of the sample. Specifically, the two projection images are a first projection image based on the analysis result of a computer and a second projection image based on an input operation by a user of the microscope system 1, such as a pathologist. The pathologist thus sees these projection images superimposed on the optical image. In this way, the microscope system 1 can provide various kinds of diagnosis-assisting information to a pathologist observing the sample through the eyepiece 104.

Next, a specific example of the configuration of the microscope system 1 will be described in detail with reference to fig. 1 and 2. As shown in fig. 1, the microscope system 1 includes a microscope 100, a microscope controller 10, a computer 20, a display device 30, an input device 40, and an identification device 50.

The microscope 100 is, for example, an upright microscope, and includes a microscope body 110, a lens barrel 120, and an intermediate lens barrel 130. In addition, the microscope 100 may be an inverted microscope.

The microscope body 110 includes: a mounting table 101 on which the sample is placed; objective lenses (objective lens 102 and objective lens 102a) that guide light from the specimen to the eyepiece 104; an epi-illumination optical system; and a transmitted illumination optical system. The table 101 may be a manual stage or a motorized stage. Desirably, a plurality of objective lenses having different magnifications are attached to the revolving nosepiece; for example, the objective lens 102 is a 4x objective lens and the objective lens 102a is a 20x objective lens. The microscope body 110 need only include at least one of the epi-illumination optical system and the transmitted illumination optical system.

The microscope body 110 further includes a turret 111 for switching the microscopy method. For example, a fluorescence cube used for fluorescence observation, a half mirror used for bright-field observation, and the like are disposed on the turret 111. In addition, the microscope body 110 may include an optical element that is used in a specific microscopy method and can be inserted into and removed from the optical path. Specifically, the microscope body 110 may include, for example, a DIC prism, a polarizer, and an analyzer used in differential interference contrast observation.

The lens barrel 120 is a monocular or binocular lens barrel to which the eyepiece 104 is attached. The imaging lens 103 is provided inside the lens barrel 120 and is disposed on the optical path between the objective lens 102 and the eyepiece 104. The imaging lens 103 forms an optical image of the specimen, based on light from the specimen, at an image plane between the eyepiece 104 and the imaging lens 103. The imaging lens 103 also forms a projection image, described later, on this image plane based on light from the projection device 133. The projection image is thereby superimposed on the optical image at the image plane.

The intermediate barrel 130 is disposed between the microscope body 110 and the barrel 120. The intermediate barrel 130 includes an image pickup element 131, a light deflection element 132, a projection device 133, and a light deflection element 134.

The image pickup element 131 is an example of a photodetector that detects light from the sample. The image pickup element 131 is a two-dimensional image sensor, such as a CCD image sensor or a CMOS image sensor. It detects light from the sample and generates digital image data of the sample based on the detection result.

The light deflection element 132 is an example of a first light deflection element that deflects light from the sample toward the image pickup element 131. The light deflection element 132 is, for example, a beam splitter such as a half mirror; a variable beam splitter whose transmittance and reflectance can be changed may also be used. The light deflection element 132 is disposed on the optical path between the eyepiece 104 and the objective lens 102. This enables the image pickup element 131 to acquire a digital image of the sample viewed from the same direction as in visual observation.

The projection device 133 projects a first projection image and a second projection image, described later, onto the image plane so as to be distinguishable from each other, in accordance with commands from the computer 20. The projection device 133 is, for example, a projector using a liquid crystal device, a digital mirror device, or LCOS.

The light deflection element 134 is an example of a second light deflection element that deflects light emitted from the projection device 133 toward the image plane. The light deflection element 134 is, for example, a beam splitter such as a half mirror; a variable beam splitter whose transmittance and reflectance can be changed, or a dichroic mirror, may also be used. The light deflection element 134 is disposed on the optical path between the image plane and the light deflection element 132. This can prevent light from the projection device 133 from entering the image pickup element 131.

The microscope controller 10 controls the microscope 100, in particular the microscope body 110. The microscope controller 10 is connected to the computer 20 and the microscope 100, and controls the microscope 100 in accordance with commands from the computer 20.

The display device 30 is, for example, a liquid crystal display, an organic EL (OLED) display, or a CRT (cathode-ray tube) display. The input device 40 outputs an operation signal corresponding to the user's input operation to the computer 20. The input device 40 is, for example, a keyboard, but may also include a mouse, a joystick, a touch panel, or the like.

The identification device 50 is a device that acquires identification information attached to a sample. The identification information includes at least information for identifying the sample. The identification information may include information on the method of analyzing the sample, and the like. The identification device 50 is, for example, a barcode reader, an RFID reader, a QR code (registered trademark) reader, or the like.

The computer 20 controls the entire microscope system 1. The computer 20 is connected to the microscope 100, the microscope controller 10, the display device 30, the input device 40, and the identification device 50. As shown in fig. 1, the computer 20 mainly includes a camera control unit 21, an image analysis unit 22, a movement amount calculation unit 22a, a projection image generation unit 23, an information acquisition unit 24, a projection control unit 25, an image recording unit 26, an image synthesizing unit 27, and a display control unit 28 as components related to the control of the projection device 133.

The camera control unit 21 acquires digital image data of the sample by controlling the image pickup element 131. The digital image data acquired by the camera control unit 21 is output to the image analysis unit 22, the movement amount calculation unit 22a, the image recording unit 26, and the image synthesizing unit 27.

The image analysis unit 22 analyzes the digital image data acquired by the camera control unit 21 and outputs the analysis result to the projection image generation unit 23. The content of the analysis processing performed by the image analysis unit 22 is not particularly limited. For example, the image analysis unit 22 may classify one or more structures appearing in the digital image represented by the digital image data into one or more categories, and output an analysis result including information for specifying the position of a structure classified into at least one of those categories. More specifically, the image analysis unit 22 may classify cells present in the digital image and output an analysis result including information for determining the outline of a specific cell and information for determining the outline of its nucleus.
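
As a concrete illustration of the "classify structures and report their positions" step, the sketch below labels connected foreground regions in a pre-computed binary mask and returns their centroids. This is only a toy stand-in: the document leaves the analysis method open (for example, a trained neural network), and the mask input, function name, and 4-connectivity choice are our assumptions.

```python
import numpy as np
from collections import deque

def find_structures(mask):
    """Label 4-connected foreground regions in a boolean mask and return
    (label map, list of region centroids). A toy stand-in for the image
    analysis unit's structure detection; not the document's actual method."""
    labels = np.zeros(mask.shape, dtype=int)
    centroids = []
    next_label = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue  # pixel already belongs to a labeled region
        next_label += 1
        labels[y, x] = next_label
        queue = deque([(y, x)])
        pixels = []
        while queue:  # breadth-first flood fill of one region
            cy, cx = queue.popleft()
            pixels.append((cy, cx))
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        ys, xs = zip(*pixels)
        centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return labels, centroids
```

The centroids could then feed the first projection image, e.g. as markers drawn at each detected structure.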

The movement amount calculation unit 22a analyzes the digital image data acquired by the camera control unit 21 to calculate the movement amount of the mounting table 101. Specifically, the movement amount of the mounting table 101 is calculated by comparing digital image data acquired at different times. The movement amount calculated by the movement amount calculation unit 22a is output to the projection image generation unit 23.
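
One standard way to estimate such a movement amount from two frames is phase correlation. The sketch below is our illustration of that general technique, not necessarily the method used by the movement amount calculation unit 22a; it recovers an integer-pixel translation between two grayscale frames.

```python
import numpy as np

def estimate_stage_shift(img_prev, img_curr):
    """Estimate the (dy, dx) translation that maps img_prev onto img_curr
    via phase correlation (normalized cross-power spectrum). Illustrative
    sketch; assumes integer-pixel, wrap-around translation."""
    F1 = np.fft.fft2(img_prev)
    F2 = np.fft.fft2(img_curr)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real          # peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:                          # wrap large indices to negative shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

In practice the estimated pixel shift would still need to be converted to a physical stage movement using the current magnification and pixel pitch.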

The projection image generation unit 23 generates first projection image data and second projection image data. The first projection image represented by the first projection image data is an image based on the analysis result of the digital image data of the specimen by the computer 20. The second projection image represented by the second projection image data is an image based on an input operation by the user. The first projection image and the second projection image generated by the projection image generating unit 23 are output to the projection control unit 25, the image recording unit 26, and the image synthesizing unit 27.

Desirably, the projection image generation unit 23 generates the first projection image data and the second projection image data so that the display form of the first projection image is different from the display form of the second projection image. Further, the display form includes, for example, the color of the image or the format of the lines constituting the image. The format of the line includes the color, kind, thickness, etc. of the line.

The projection image generation unit 23 generates first projection image data based on the analysis result output from the image analysis unit 22. For example, if the analysis result includes information for specifying the position of the structure classified into at least one category, the first projection image data generated by the projection image generating unit 23 represents a first projection image including a figure for specifying the position of the structure classified into at least one category. In addition, if the analysis result includes information for determining the contour of the specific cell and information for determining the contour of the nucleus of the specific cell, the first projection image data represents a first projection image including a closed curve overlapping with the contour of the specific cell and a closed curve overlapping with the contour of the nucleus of the specific cell. Furthermore, the closed curve used to determine the contour of a particular cell may also be a different color than the closed curve used to determine the contour of the nucleus.

The projection image generation unit 23 generates second projection image data based on at least the operation information acquired by the information acquisition unit 24 described later. For example, if the operation information is information for specifying a region to be focused on by the user's operation, the second projection image data generated by the projection image generation unit 23 represents a second projection image including a figure indicating the region to be focused (hereinafter referred to as ROI) specified by the user. The figure representing the ROI is, for example, a closed curve overlapping the outline of the ROI. Further, if the operation information is information obtained by an operation of inputting a comment by the user, the second projection image data generated by the projection image generation unit 23 represents a second projection image including the comment input by the user.

The projection image generation unit 23 may generate the second projection image data based on the input operation and the movement amount of the table 101 acquired from the movement amount calculation unit 22a. This is particularly effective when the table 101 is moved after the operation information is acquired by the information acquisition unit 24. In that case, the projection image generation unit 23 can change the position, on the image plane, of the graphic indicating the ROI included in the second projection image in accordance with the movement amount of the table 101, without requiring the user to designate the ROI again.

The projection image generation unit 23 may also generate the second projection image data based on the input operation and the magnification of the optical image formed on the image plane. This is particularly effective when the magnification of the optical image is changed after the operation information is acquired by the information acquisition unit 24. In that case, the projection image generation unit 23 can change the size, on the image plane, of the graphic included in the second projection image in accordance with the magnification of the optical image, without requiring the user to designate the ROI again.
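
The two adjustments described above, translating the ROI graphic by the stage movement and rescaling it with the magnification, amount to a simple coordinate transform on the image plane. The helper below is a hypothetical sketch: the dataclass, the pixel units, and the assumption that magnification scales about the optical axis at the origin are all ours, not specifications from the document.

```python
from dataclasses import dataclass

@dataclass
class RoiGraphic:
    # ROI centre and size on the image plane, in projection-device pixels
    cx: float
    cy: float
    width: float
    height: float

def update_roi(roi, stage_dx_px=0.0, stage_dy_px=0.0, old_mag=1.0, new_mag=1.0):
    """Return a new ROI graphic repositioned by the stage movement (already
    converted to image-plane pixels) and rescaled by a magnification change.
    Hypothetical helper: scaling is about the optical axis, taken as the origin."""
    s = new_mag / old_mag
    return RoiGraphic(
        cx=(roi.cx - stage_dx_px) * s,   # specimen features move opposite to the stage
        cy=(roi.cy - stage_dy_px) * s,
        width=roi.width * s,
        height=roi.height * s,
    )
```

For example, switching from the 4x to the 20x objective would multiply all four quantities by 5 after compensating any stage motion.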

The information acquisition unit 24 acquires information from devices external to the computer 20. Specifically, the information acquisition unit 24 acquires the user's operation information based on operation signals from the input device 40, and acquires identification information from the identification device 50. The user's operation information is information generated when the user operates the input device 40 in order to display information on the image plane.

The projection control unit 25 controls the projection device 133 to control projection of the first projection image and the second projection image onto the image plane. The projection control unit 25 may control the projection device 133 to make the projection period of the first projection image different from the projection period of the second projection image. Specifically, the projection control unit 25 may adjust the projection period as follows: the first projection image is projected periodically and the second projection image is projected all the time. The projection control unit 25 may control the projection device 133 according to the setting of the microscope system 1. Specifically, the projection control unit 25 may determine whether to project each of the first projection image and the second projection image onto the image plane based on the setting of the microscope system 1, or may control the projection device 133 so that the projection device 133 projects the first projection image and the second projection image onto the image plane when the microscope system 1 is set to a predetermined setting. That is, the microscope system 1 can change whether or not to project each of the first projection image and the second projection image onto the image plane according to the setting.
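
As a minimal illustration of giving the two layers different projection periods, the function below blinks the first (analysis) layer while keeping the second (annotation) layer always on. The period and duty-cycle parameters are hypothetical values of ours, not figures from the document.

```python
def layer_visibility(t, blink_period=1.0, duty=0.5):
    """Return (show_first, show_second) at time t in seconds: the analysis
    overlay is shown periodically with the given period and duty cycle,
    while the user-annotation overlay stays on continuously."""
    show_first = (t % blink_period) < duty * blink_period
    show_second = True
    return show_first, show_second
```

A projection control unit could evaluate such a schedule each display frame and blank the corresponding layer in the projector image.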

The image recording unit 26 records the digital image data, the first projection image data, and the second projection image data. Specifically, the image recording unit 26 records each of the first projection image data and the second projection image data in association with the digital image data, in an area different from that of the digital image data. This makes it possible to individually read out, as needed, the digital image data and the first and second projection image data associated with it. The image recording unit 26 may also acquire the identification information attached to the sample via the identification device 50 and the information acquisition unit 24, and record the acquired identification information in association with the digital image data. Furthermore, the image recording unit 26 may record the digital image data, the first projection image data, and the second projection image data when it detects that the user has input a recording instruction.

The image synthesizing unit 27 generates image data of a synthesized image obtained by synthesizing the digital image with the first projection image and the second projection image based on the digital image data, the first projection image data, and the second projection image data, and outputs the image data of the synthesized image to the display control unit 28.
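
The compositing performed by the image synthesizing unit 27 can be sketched as ordinary alpha blending of overlay layers onto the digital image. The RGBA representation and the blend order (first projection image, then second) are our assumptions for illustration.

```python
import numpy as np

def composite(digital, first_overlay, second_overlay):
    """Blend the two projection overlays onto the digital image using each
    overlay's alpha channel. All inputs are H x W x 4 RGBA uint8 arrays;
    returns an H x W x 3 RGB uint8 composite. Minimal sketch only."""
    out = digital[..., :3].astype(np.float32)
    for layer in (first_overlay, second_overlay):
        alpha = layer[..., 3:4].astype(np.float32) / 255.0
        out = out * (1.0 - alpha) + layer[..., :3].astype(np.float32) * alpha
    return out.astype(np.uint8)
```

The resulting composite corresponds to what the display control unit 28 would send to the display device 30.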

The display control unit 28 displays the composite image on the display device 30 based on the composite image data output from the image synthesizing unit 27. The display control unit 28 may also display the digital image on the display device 30 based on the digital image data alone.

The computer 20 may be a general-purpose or dedicated device. Its configuration is not particularly limited; for example, it may have the physical configuration shown in fig. 2. Specifically, the computer 20 may include a processor 20a, a memory 20b, an auxiliary storage device 20c, an input/output interface 20d, a media drive device 20e, and a communication control device 20f, which are connected to one another via a bus 20g.

The processor 20a is, for example, any processing circuit including a CPU (Central Processing Unit). The processor 20a may execute programs stored in the memory 20b, the auxiliary storage device 20c, or the storage medium 20h to perform programmed processing, thereby realizing the components related to the control of the projection device 133 described above (the camera control unit 21, the image analysis unit 22, the projection image generation unit 23, and the like). The processor 20a may also be configured as a dedicated processor such as an ASIC or an FPGA.

The memory 20b is a working memory of the processor 20a. The memory 20b is any semiconductor memory such as a RAM (Random Access Memory). The auxiliary storage device 20c is a nonvolatile memory such as an EPROM (Erasable Programmable ROM) or a hard disk drive. The input/output interface 20d exchanges information with external devices (the microscope 100, the microscope controller 10, the display device 30, the input device 40, and the identification device 50).

The media drive device 20e can output data stored in the memory 20b or the auxiliary storage device 20c to the storage medium 20h, and can read programs, data, and the like from the storage medium 20h. The storage medium 20h is any portable recording medium, such as an SD card, a USB (Universal Serial Bus) flash memory, a CD (Compact Disc), or a DVD (Digital Versatile Disc).

The communication control device 20f inputs and outputs information to and from a network. As the communication control device 20f, for example, an NIC (Network Interface Card) or a wireless LAN (Local Area Network) card can be used. The bus 20g connects the processor 20a, the memory 20b, the auxiliary storage device 20c, and the other components so that data can be exchanged among them.

The microscope system 1 configured as described above performs the image projection processing shown in fig. 3. Fig. 3 is a flowchart of the image projection processing performed by the microscope system 1. Next, an image projection method of the microscope system 1 will be described with reference to fig. 3.

First, the microscope system 1 projects an optical image of the specimen onto the image plane (step S1). Here, the imaging lens 103 condenses the light from the sample acquired by the objective lens 102 onto the image plane, thereby forming an optical image of the sample.

Then, the microscope system 1 acquires digital image data of the specimen (step S2). Here, the light deflecting element 132 deflects part of the light from the sample acquired by the objective lens 102 toward the image pickup element 131. The image pickup element 131 picks up an image of the sample based on the light deflected by the light deflecting element 132, thereby generating digital image data.

After that, the microscope system 1 generates first projection image data based on the analysis result of the digital image data (step S3). Here, the image analysis unit 22 that acquires the digital image data via the camera control unit 21 performs analysis processing, and the projection image generation unit 23 generates first projection image data based on the analysis result.

When the first projection image data is generated, the microscope system 1 projects the first projection image onto the image plane (step S4). Here, the projection control unit 25 controls the projection device 133 based on the first projection image data, whereby the projection device 133 projects the first projection image onto the image plane. Thereby, the first projection image is superimposed on the optical image of the sample.

Then, the microscope system 1 generates second projection image data based on the input operation by the user (step S5). Here, the projection image generation unit 23 generates the second projection image data based on the operation information acquired via the input device 40 and the information acquisition unit 24.

Finally, the microscope system 1 projects the second projection image onto the image plane in a manner distinguishable from the first projection image (step S6). Here, the projection control unit 25 controls the projection device 133 based on the second projection image data, whereby the projection device 133 projects the second projection image onto the image plane so as to be distinguishable from the first projection image. More specifically, when the display form of the first projection image differs from that of the second projection image, the projection device 133 projects the two images so that they can be distinguished by the difference in display form. On the other hand, when the two display forms are the same, the two images are projected in a distinguishable manner by making the projection period of the first projection image different from that of the second projection image.
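The period-based distinction in step S6 can be sketched as a small scheduling function; the time-slicing scheme and names here are illustrative assumptions, not the patent's prescribed implementation:

```python
def images_to_project(t, first_img, second_img, same_display_form, period=1.0):
    """Select which projection image(s) to show at time t (in seconds).

    If the display forms differ, both images are projected at once and are
    told apart by their appearance. If the forms are the same, the images
    alternate, each being projected for `period` seconds at a time.
    """
    if not same_display_form:
        return [first_img, second_img]
    even_slot = int(t // period) % 2 == 0
    return [first_img] if even_slot else [second_img]
```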

In the microscope system 1, the image analysis result obtained by the computer and the information added by the user through input operations are displayed superimposed on the optical image. Thus, in pathological diagnosis based on an optical image of a sample, a pathologist can obtain various kinds of diagnostic assistance without taking the eye away from the eyepiece. Therefore, the microscope system 1 can reduce the workload of the pathologist. In particular, by displaying the computer's image analysis result, the pathologist can obtain various pieces of information on which a judgment in pathological diagnosis can be based. In addition, by displaying an image based on a user's input operation, the pathologist can, for example, receive advice from more experienced pathologists in real time. Thus, the microscope system 1 can assist pathological diagnosis based on an optical image.

Also, in the microscope system 1, pathological diagnosis is assisted by displaying additional information on the optical image. Therefore, unlike a WSI system that performs pathological diagnosis based on digital images, no expensive equipment is required, and the microscope system 1 can reduce the burden on the pathologist while avoiding a significant increase in facility cost. In addition, while a WSI (Whole Slide Image) must be created in advance for pathological diagnosis in a WSI system, no such preparation is needed in the microscope system 1, and the diagnosis work can be started immediately.

Fig. 4 is a diagram illustrating the distribution of cells on the slide glass SG. Fig. 5 to 7 are diagrams showing examples of observation performed using the microscope system 1. Next, a case of observation performed using the microscope system 1 that performs the image projection processing shown in fig. 3 will be specifically described with reference to fig. 4 to 7.

First, a case where observation is performed on the slide glass SG shown in fig. 4 with the field of view of the microscope system 1 fixed to the field of view F1 will be described with reference to fig. 5.

When observation using the microscope system 1 is started and the pathologist looks into the eyepiece 104, the pathologist can observe the image V1 in which the cell C1 appears. The image V1 is an optical image formed on the image plane and corresponding to the field of view F1. At this time, the display device 30 may be displaying the image M1 corresponding to the image V1, based on the digital image data generated by the image pickup element 131.

Hereinafter, an image observed by the pathologist using the eyepiece 104 is referred to as a visual image, and an image displayed on the display device 30 is referred to as a monitor image.

Thereafter, when the digital image data is analyzed by the computer 20 to determine the contour of the cell C1 and the contour of its nucleus, the projection image P1 as the first projection image is projected onto the image plane. The marker CC included in the projection image P1 shows the contour of the cell, and the marker NC shows the contour of the nucleus. Thereby, the pathologist observes the image V2, in which the projection image P1 is superimposed on the image V1. At this time, the display device 30 displays an image M2 obtained by combining the image M1 with the projection image P1.

Finally, when the user of the microscope system 1 designates the ROI using the input device 40, the projection image P2 as the second projection image is projected onto the image plane. Further, the user is, for example, an experienced assistant who is viewing the monitor image, and the mark UR included in the projection image P2 shows the ROI. Thereby, the pathologist can observe the image V3 in which the projection image P2 is superimposed on the image V2. At this time, the display device 30 displays an image M3 obtained by combining the image M2 with the projection image P2.

In the example shown in fig. 5, the contour of the cell C1 and the contour of its nucleus are emphasized by the first projection image. Therefore, the pathologist can make a diagnosis while reliably recognizing the presence of the cell C1 without overlooking it. In addition, the region of interest designated by the experienced assistant is shown by the second projection image. Therefore, the pathologist can make a diagnosis after observing the region that should be focused on with particular care.

Next, a case where the field of view of the microscope system 1 is moved from the field of view F1 to the field of view F2 on the slide glass SG shown in fig. 4 and observation is performed will be described with reference to fig. 6.

When the field of view of the microscope system 1 is moved from the field of view F1 to the field of view F2 from the state in which the pathologist is observing the image V3 shown in fig. 5, the pathologist can observe the image V4 presenting the cell C1 and the cell C2. The image V4 is an optical image formed on the image plane corresponding to the field of view F2. At this time, the display device 30 may be displaying the image M4 corresponding to the image V4 based on the digital image data generated by the image pickup element 131.

Thereafter, when the digital image data is analyzed by the computer 20 to determine the contours of the cells C1 and C2 and of their respective nuclei, the projection image P3 as the first projection image is projected onto the image plane. In addition, when the movement amount of the stage 101 is calculated based on the change in the position of the cell C1 in the digital image, the projection image P4, obtained by moving the marker UR of the projection image P2 by a distance corresponding to the movement amount, is projected onto the image plane. Thereby, the pathologist observes the image V5, in which the projection image P3 and the projection image P4 are superimposed on the image V4. At this time, the display device 30 displays an image M5 obtained by combining the image M4 with the projection images P3 and P4.

In the example shown in fig. 6, the marker UR included in the second projection image is moved in accordance with the amount of movement of the stage 101. Since the marker UR thus follows the ROI designated by the user, the user does not need to designate the ROI again after the stage 101 is moved. Therefore, the pathologist can correctly recognize the region to be focused on regardless of whether the stage 101 has been moved.
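One common way to compute such a movement amount from consecutive digital images is phase correlation; the sketch below is an illustrative assumption (the patent only states that the amount is calculated from the change in the cell's position), and it assumes a purely translational, circular shift between grayscale frames:

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the (dy, dx) translation between two grayscale frames
    by locating the peak of their phase correlation."""
    cross = np.conj(np.fft.fft2(prev)) * np.fft.fft2(curr)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # shifts beyond half the frame wrap around to negative values
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)
```

A marker such as UR would then be translated by the estimated (dy, dx) so that it stays on the designated ROI.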

Further, with reference to fig. 7, another case where the field of view of the microscope system 1 is moved from the field of view F1 to the field of view F2 on the slide glass SG shown in fig. 4 will be described. Fig. 7 differs from fig. 6 in that the pathologist inputs an observational finding in the course of the pathological diagnosis.

When the user of the microscope system 1 designates the ROI using the input device 40 and inputs a finding while the image V2 in which the cell C1 appears is formed on the image plane, the projection image P5 as the second projection image is projected onto the image plane. Here, the user is, for example, the pathologist looking into the eyepiece 104. The marker UR included in the projection image P5 shows the ROI, and the comment N included in the projection image P5 is the pathologist's finding regarding the ROI. Thereby, the pathologist can observe the image V6, in which the projection image P5 is superimposed on the image V2. At this time, the display device 30 displays an image M6 obtained by combining the image M2 with the projection image P5.

When the field of view of the microscope system 1 is moved from the field of view F1 to the field of view F2 while the pathologist is observing the image V6, the pathologist observes the image V7, in which the cell C1 and the cell C2 appear. The image V7 is an image in which the projection image P3 and the projection image P6 are superimposed on the image V4, the optical image corresponding to the field of view F2. The projection image P6 is an image in which the marker UR of the projection image P5 has been moved by a distance corresponding to the movement amount of the stage 101, while the comment N remains at the same position as in the projection image P5. At this time, the image M7 is displayed on the display device 30.

In the example shown in fig. 7, the markers included in the second projection image are managed in two groups: markers that move together with the stage 101 and markers that do not move. Therefore, the marker UR follows the ROI designated by the user, while the comment N representing the finding is kept at a predetermined position. This prevents a finding that should remain displayed from disappearing from view when the stage 101 is moved.
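This two-group management can be sketched with a flag on each marker; the `Marker` type and its field names are hypothetical illustrations, not part of the described system:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Marker:
    x: float
    y: float
    follows_stage: bool  # True: tracks the specimen (e.g. UR); False: fixed (e.g. comment N)

def apply_stage_shift(markers, dx, dy):
    """Shift only the markers that track the specimen; fixed markers stay put."""
    return [replace(m, x=m.x + dx, y=m.y + dy) if m.follows_stage else m
            for m in markers]
```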

In the examples of fig. 5 to 7, the first projection image and the second projection image are displayed with different line types (solid line and broken line), but the first projection image and the second projection image may be made distinguishable by, for example, making colors different from each other.

The image analysis unit 22 of the microscope system 1 may analyze the digital image data by image recognition processing based on a predetermined algorithm, or may analyze the digital image data using a trained neural network.

The parameters of the trained neural network may be generated by training the neural network in a device different from the microscope system 1, and the computer 20 may download the generated parameters to be applied to the image analysis unit 22.

Fig. 8 is a diagram showing the structure of the neural network NN. The neural network NN has an input layer, a plurality of intermediate layers, and an output layer. The output data D2, obtained from the output layer by inputting the input data D1 to the input layer, is compared with the correct data D3. Learning then proceeds by the error backpropagation method, which updates the parameters of the neural network NN. The pair of input data D1 and correct data D3 constitutes the training data for supervised learning.
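The update cycle described above — compare the output with the correct data and back-propagate the error — can be illustrated with a toy single-layer example in NumPy; the data, model size, and learning rate are stand-ins for illustration, not the patent's actual network:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((64, 1))        # input data D1
Y = 2.0 * X + 1.0              # correct data D3 (target relation y = 2x + 1)

W = np.zeros((1, 1))           # trainable parameters
b = np.zeros(1)
lr = 0.5
for _ in range(500):
    pred = X @ W + b           # output data D2
    err = pred - Y             # comparison with the correct data
    # back-propagate the mean-squared error to update the parameters
    W -= lr * X.T @ err / len(X)
    b -= lr * err.mean(axis=0)
```

After training, W and b approach the target relation, mirroring how the parameters of the neural network NN converge during supervised learning.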

[ second embodiment ]

Fig. 9 is a diagram showing the configuration of the microscope system 2 according to the present embodiment. The microscope system 2 differs from the microscope system 1 in that a microscope 200 is provided instead of the microscope 100. The microscope 200 includes a projection unit 140 between the microscope body 110 and the lens barrel 120.

The projection unit 140 is a projection unit for a microscope including the objective lens 102, the imaging lens 103, and the eyepiece 104, and includes the intermediate barrel 130. That is, the projection unit 140 includes the image pickup element 131, which is an example of an imaging device that acquires digital image data of a sample based on light from the sample, and the projection device 133, which projects the first projection image and the second projection image onto the image plane on which the optical image is formed.

The projection unit 140 further includes a camera control unit 141, an image analysis unit 142, a movement amount calculation unit 142a, a projection image generation unit 143, an information acquisition unit 144, and a projection control unit 145.

The camera control unit 141, the image analysis unit 142, the movement amount calculation unit 142a, the projection image generation unit 143, and the projection control unit 145 are the same as the camera control unit 21, the image analysis unit 22, the movement amount calculation unit 22a, the projection image generation unit 23, and the projection control unit 25, respectively. Therefore, detailed description is omitted.

The information acquisition unit 144 acquires operation information of the user based on an operation signal from the input device 40 acquired via the computer 20. Further, the information acquisition unit 144 acquires the identification information from the identification device 50 via the computer 20.

In the present embodiment, the same effect as that of the microscope system 1 can be obtained simply by attaching the projection unit 140 to an existing microscope. Thus, with the projection unit 140 and the microscope system 2, an existing microscope system can easily be extended to assist a pathologist's pathological diagnosis based on an optical image.

[ third embodiment ]

Fig. 10 is a diagram showing the configuration of a diagnosis support system including the microscope system 3 and the external browsing system 300 according to the present embodiment. The microscope system 3 differs from the microscope system 1 in that a computer 60 is provided instead of the computer 20.

The microscope system 3 is connected to one or more external browsing systems 300 via the Internet 400. Each external browsing system 300 includes a computer 310 having at least a communication control unit 311, an input device 320, and a display device 330.

The internet 400 is an example of a communication network. The microscope system 3 and the external browsing system 300 may be connected via a VPN (Virtual Private Network), a dedicated line, or the like, for example.

The computer 60 is different from the computer 20 in that it includes a communication control unit 29. The communication control unit 29 exchanges data with the external browsing system 300.

The communication control unit 29 transmits the image data to the external browsing system 300, for example. The image data transmitted by the communication control unit 29 may be, for example, synthesized image data generated by the image synthesizing unit 27. The digital image data, the first projection image data, and the second projection image data may be transmitted individually. In addition, only the digital image data may be transmitted. In the external browsing system 300, the computer 310 that receives the image data displays an image in the display device 330 based on the image data. The computer 310 may generate composite image data based on the digital image data, the first projection image data, and the second projection image data, for example, and may display a composite image on the display device 330 based on the composite image data.

The communication control unit 29 receives operation information input by a user of the external browsing system 300, for example. The projection image generation unit 23 may generate the second projection image data based on the operation information received by the communication control unit 29. The microscope system 3 may project the second projection image based on an input operation by the user of the external browsing system 300 onto the image plane using the projection device 133.

The microscope system 3 can thus exchange information with an external browsing system 300 connected via a network, so that advice can be received from pathologists at remote locations. Therefore, the microscope system 3 can further reduce the pathologist's burden of pathological diagnosis.

Fig. 11 is a diagram showing an example of observation performed using the microscope system 3. The case of observation using the microscope system 3 is specifically described with reference to fig. 11.

When observation using the microscope system 3 is started and the pathologist looks into the eyepiece 104, the pathologist can observe the image V8 in which the cell C1 and the cell C2 appear. The image V8 is an optical image formed on the image plane and corresponding to the field of view F2. At this time, an image M8 corresponding to the image V8 may be displayed on the display device 30 and the display device 330 based on the digital image data generated by the image pickup element 131.

Thereafter, when the contour of the cell and the contour of the nucleus of the cell are determined by analyzing the digital image data by the computer 60, the projection image P7 as the first projection image is projected onto the image plane. Further, the marker CC included in the projection image P7 shows the outline of the cell, and the marker NC shows the outline of the nucleus of the cell. Thereby, the pathologist observes the image V9 in which the projection image P7 is superimposed on the image V8. At this time, an image M9 obtained by combining the image M8 with the projection image P7 is being displayed on the display device 30 and the display device 330.

When the user of the microscope system 3 designates the ROI using the input device 40, the projection image P8 as the second projection image is projected onto the image plane. Here, the user is the pathologist looking into the eyepiece 104, and the marker UR included in the projection image P8 shows an ROI on which the pathologist himself is focusing. Thereby, the pathologist can observe the image V10, in which the projection image P8 is superimposed on the image V9. At this time, an image M10 obtained by combining the image M9 with the projection image P8 is displayed on the display device 30 and the display device 330.

Thereafter, the user of the external browsing system 300, who recognizes from the image M10 displayed on the display device 330 that the pathologist is paying attention to the cell C1, operates the input device 320 to direct the pathologist's attention to the cell C2. The operation information is thereby transmitted from the external browsing system 300 to the microscope system 3. On receiving the operation information, the microscope system 3 projects another second projection image, the projection image P9, onto the image plane based on the input operation by the user of the external browsing system 300. The projection image P9 includes a marker UR2 prompting attention to the cell C2. As a result, the pathologist can observe the image V11, in which the projection image P9 is superimposed on the image V10. At this time, an image M11 obtained by combining the image M10 with the projection image P9 is displayed on the display device 30 and the display device 330.

In the example shown in fig. 11, projection images based on input operations by both the user of the microscope system 3 and the user of the external browsing system 300 are projected onto the image plane. Therefore, pathological diagnosis can be performed while communicating with users at remote locations.

The above-described embodiments show specific examples for facilitating understanding of the present invention, and the embodiments of the present invention are not limited to these examples. The microscope system, the projection unit, and the image projection method can be variously modified and changed without departing from the scope of claims.

The microscope included in the microscope system 1 may be, for example, the microscope 500 shown in fig. 12. In the above-described embodiment, a configuration in which the intermediate barrel 130 is provided with the image pickup element 131 was exemplified, but the image pickup element 151 that acquires the digital image data used for image analysis may instead be provided in a digital camera 150 attached to a trinocular barrel 120a, as shown in fig. 12. In this case, however, light emitted from the projection device 133 included in the intermediate barrel 130a enters the image pickup element 151. Therefore, the digital camera 150 may be controlled so that the light emission period of the projection device 133 does not overlap the exposure period of the image pickup element 151. This prevents the projection image from appearing in the digital image.

The microscope included in the microscope system 1 may be, for example, the microscope 600 shown in fig. 13. The microscope 600 includes an intermediate barrel 130b instead of the intermediate barrel 130, and the intermediate barrel 130b includes a projection device 135 using a transmissive liquid crystal device. In the above-described embodiment, the configuration in which the light emitted from the projection device 133 is deflected by the light deflecting element 134 disposed on the optical path between the objective lens 102 and the eyepiece 104 to project the projection image on the image plane has been illustrated, but as shown in fig. 13, the projection device 135 may be disposed on the optical path between the objective lens 102 and the eyepiece 104.

In the above-described embodiments, the example in which the image pickup element is included as the photodetector is described, but the photodetector is not limited to the image pickup element. For example, the above-described technique may be applied to a scanning microscope, and in this case, the light detector may be a photomultiplier tube (PMT) or the like.

In the above-described embodiment, an example in which the movement amount of the stage 101 is calculated from images was described, but the movement amount of the stage 101 may be calculated by other methods. For example, when the stage 101 is a motorized stage, the movement amount may be calculated based on instruction information that instructs the motorized stage to move, or based on output information from the encoder of a motor attached to the motorized stage. When the stage 101 is a manual stage, the movement amount may be estimated based on output information from an acceleration sensor attached to the stage 101.
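For the manual-stage case, the displacement estimate from an acceleration sensor amounts to double integration of the sampled acceleration; the simple rectangular-rule integration below is an illustrative sketch (a real implementation would also have to handle sensor bias and drift):

```python
def integrate_displacement(accel_samples, dt):
    """Estimate displacement along one axis by double integration
    of equally spaced accelerometer samples (rectangular rule)."""
    velocity = 0.0
    displacement = 0.0
    for a in accel_samples:
        velocity += a * dt             # first integration: acceleration -> velocity
        displacement += velocity * dt  # second integration: velocity -> position
    return displacement
```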

In the above-described embodiment, a keyboard, a mouse, a joystick, a touch panel, and the like were exemplified as the input device 40, but the input device 40 may also be a device that receives voice input, such as a microphone. In this case, the computer 20 may have a function of recognizing voice instructions input from the input device 40; for example, the information acquisition unit 24 included in the computer 20 may convert voice data into operation information by voice recognition and output the operation information to the projection image generation unit 23.
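Once the voice data has been recognized as text, converting it into operation information can be as simple as a command lookup; the command vocabulary and the operation dictionary below are hypothetical examples, not part of the described system:

```python
def to_operation_info(recognized_text):
    """Map recognized voice text to operation information for the
    projection image generation unit (hypothetical command set)."""
    text = recognized_text.lower().strip()
    if text.startswith("mark roi"):
        return {"op": "designate_roi"}
    if text.startswith("note "):
        return {"op": "add_comment", "text": text[len("note "):]}
    return {"op": "unknown", "text": text}
```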

Description of the reference numerals

1, 2, 3: microscope system; 10: microscope controller; 20, 60, 310: computer; 20a: processor; 20b: memory; 20c: auxiliary storage device; 20d: input/output interface; 20e: media drive device; 20f: communication control device; 20g: bus; 20h: storage medium; 21, 141: camera control unit; 22, 142: image analysis unit; 22a, 142a: movement amount calculation unit; 23, 143: projection image generation unit; 24, 144: information acquisition unit; 25, 145: projection control unit; 26: image recording unit; 27: image synthesizing unit; 28: display control unit; 29, 311: communication control unit; 30, 330: display device; 40, 320: input device; 50: identification device; 100, 200, 500, 600: microscope; 101: stage; 102, 102a: objective lens; 103: imaging lens; 104: eyepiece; 110: microscope body; 111: turret; 120, 120a: lens barrel; 130, 130a, 130b: intermediate barrel; 131, 151: image pickup element; 132, 134: light deflecting element; 133: projection device; 140: projection unit; 150: digital camera; 300: external browsing system; 400: the Internet; CC, NC, UR, UR2: marker; C1, C2: cell; D1: input data; D2: output data; D3: correct data; F1, F2: field of view; N: comment; NN: neural network; P1 to P9: projection image; SG: slide glass; V1 to V11, M1 to M11: image.
