Diagnosis support program, diagnosis support system, and diagnosis support method

Document No.: 1942851  Publication date: 2021-12-07

Reading note: This invention, "Diagnosis support program, diagnosis support system, and diagnosis support method", was created by 杉江雄生 on 2020-06-08. Its main content is as follows: the diagnosis support program is a program for causing a computer to execute an image processing process for performing predetermined image processing on a pathology image captured by an imaging apparatus, and an output processing process for outputting identification information for identifying an image-processed portion of the pathological image and an unprocessed portion of the pathological image.

1. A diagnosis support program for causing a computer to execute:

an image processing process for performing predetermined image processing on a pathological image captured by an image capturing apparatus; and

an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of image processing on the pathological image.

2. The diagnostic support program of claim 1, wherein,

the output processing procedure comprises:

displaying a boundary line as the identification information at a boundary between the processed portion and the unprocessed portion displayed on a display unit.

3. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

highlighting a newly processed portion among the processed portions displayed on the display unit by at least one of the color, thickness, and blinking of the boundary line around it.

4. The diagnostic support program of claim 2, wherein,

the image processing process includes:

performing the image processing starting from a central portion of the region of the pathological image displayed on the display unit.

5. The diagnostic support program of claim 2, wherein,

the image processing process includes:

performing the image processing starting from a portion of the pathological image corresponding to the position of a cursor displayed on the display unit.

6. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

displaying the boundary line as a line of a first line type on the processed-portion side and a line of a second line type on the unprocessed-portion side.

7. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

when a part of the pathological image is displayed enlarged on the display unit, displaying the entire pathological image in a part of the display unit, and displaying a boundary line at a boundary between the processed portion and the unprocessed portion in the displayed entire pathological image.

8. The diagnostic support program of claim 7, wherein,

the output processing procedure comprises:

highlighting the part of the pathological image displayed enlarged on the display unit when that part is an unprocessed portion.

9. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

indicating, on the display unit, at least one of the processed portion and the unprocessed portion by a text display.

10. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

indicating in the display unit by means of an image display that the unprocessed portion is being displayed.

11. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

performing display on the display unit after performing a predetermined image quality degradation process on the unprocessed portion.

12. The diagnostic support program of claim 2, wherein,

the pathological image is composed of a plurality of tile images, and

the image processing process includes:

performing the predetermined image processing on the tile images in a new layer when the layer of the tile images displayed on the display unit is changed.

13. The diagnostic support program of claim 2, wherein,

the output processing procedure comprises:

notifying, by sound, that the unprocessed portion is being displayed on the display unit.

14. A diagnostic support system comprising:

an image capture device; and

an information processing device that performs predetermined image processing on a pathological image captured by the image capturing device and outputs identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

15. A diagnosis support method, wherein a computer executes:

an image processing process for performing predetermined image processing on a pathological image captured by an image capturing apparatus; and

an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

16. A diagnosis support system comprising an image capturing apparatus and software for processing a pathological image captured by the image capturing apparatus, wherein,

the software causes an information processing apparatus to execute: an image processing process for performing predetermined image processing on the pathological image; and an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

Technical Field

The present disclosure relates to a diagnosis support program, a diagnosis support system, and a diagnosis support method.

Background

In addition to clinical diagnosis, in which a clinician diagnoses a patient, the main conventional diagnostic methods in medical institutions include, for example, pathological diagnosis, in which a pathologist diagnoses from pathological images, i.e., captured images of an observation object (specimen) collected from the patient. Pathological diagnosis is very important because its results significantly affect the patient's treatment plan and the like.

When viewing a pathological image in a pathological diagnosis, a pathologist may perform predetermined image processing (e.g., color correction, edge enhancement, and contrast correction) on the pathological image according to the performance of a monitor, the preference of the pathologist, and the like.

Disclosure of Invention

Technical problem

However, pathological images generally have high resolution, so predetermined image processing on a pathological image is usually not completed immediately. As a result, the pathologist may view a pathological image in which processed and unprocessed portions of the predetermined image processing are mixed. There is a concern that a misdiagnosis may occur because the pathologist cannot reliably distinguish between the processed and unprocessed portions of the pathology image.

The present disclosure has been made in view of the above circumstances, and provides a diagnosis support program, a diagnosis support system, and a diagnosis support method, which can improve the accuracy of pathological diagnosis using pathological images.

Solution to the problem

In order to solve the above problem, the diagnosis support program causes a computer to execute: an image processing process for performing predetermined image processing on the pathological image captured by the image capturing apparatus; and an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

Drawings

Fig. 1 is an overall configuration diagram of a diagnosis support system according to a first embodiment;

Fig. 2 is a flowchart showing the processing of the viewer according to the first embodiment;

Fig. 3 is a diagram schematically showing a first example of a pathological image according to the first embodiment;

Fig. 4 is a diagram schematically showing a second example of a pathological image according to the first embodiment;

Fig. 5 is a diagram schematically showing a third example of a pathological image according to the first embodiment;

Fig. 6 is a diagram schematically showing a fourth example of a pathological image according to the first embodiment;

Fig. 7 is a diagram schematically showing a fifth example of a pathological image according to the first embodiment;

Fig. 8 is a diagram schematically showing a sixth example of a pathological image according to the first embodiment;

Fig. 9 is a diagram schematically showing an example of a UI for performing predetermined image processing on a pathology image according to the first embodiment;

Fig. 10 is a diagram schematically showing an example of a display for identifying an unprocessed portion in a pathological image according to the first embodiment;

Fig. 11 is an overall configuration diagram of a diagnosis support system according to a second embodiment;

Fig. 12 is a diagram for illustrating an image capturing process according to the second embodiment;

Fig. 13 is a diagram for illustrating a generation process of partial images (tile images) in the second embodiment;

Fig. 14 is a diagram for illustrating a pathological image according to the second embodiment;

Fig. 15 is a diagram for illustrating a pathological image according to the second embodiment;

Fig. 16 is a diagram schematically showing an example of displaying a pathological image according to the second embodiment;

Fig. 17 is an overall configuration diagram of a diagnosis support system according to a third embodiment;

Fig. 18 is a hardware configuration diagram showing an example of a computer that realizes the functions of the viewer.

Detailed Description

Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Note that in the following embodiments, overlapping description will be omitted as appropriate by giving the same reference characters to the same components.

The present disclosure will be described in the following sequence of items.

< first embodiment >

1. Configuration of the System according to the first embodiment

2. Processing procedure of viewer according to the first embodiment

3. First to sixth examples of pathological images according to the first embodiment and the like

< second embodiment >

4. Configuration of the System according to the second embodiment

5. Description of tiled image in second embodiment

6. Example of displaying a pathological image according to the second embodiment

< third embodiment >

7. Configuration of System according to third embodiment

< other examples >

< first embodiment >

[1. Configuration of the system according to the first embodiment]

First, a diagnosis support system 1 according to a first embodiment will be described with reference to fig. 1. Fig. 1 is an overall configuration diagram of a diagnosis support system 1 according to a first embodiment. As shown in fig. 1, the diagnosis support system 1 includes a scanner 2, a server 3 (information processing apparatus), and a viewer 4 (information processing apparatus). Note that the scanner 2, the server 3, and the viewer 4 each include a communication unit (not shown) implemented by an NIC (network interface card) or the like, are connected to a communication network (not shown) in a wired or wireless manner, and can transmit and receive information to and from each other via the communication network. Note that the arrows in the drawing indicate the main flow of information, and transmission and reception of information may also be performed at portions without arrows.

In such a pathological diagnosis (digital pathology imaging (DPI)) system, conventional techniques reduce the waiting time experienced by an observer (e.g., a pathologist; hereinafter the same applies) when predetermined image processing on a pathological image is not completed immediately by, for example, quickly presenting the unprocessed portion of the pathological image as a preview via a quick development process. However, there is a concern that a misdiagnosis may occur because a pathologist, who may diagnose a single pathology image in a few seconds, cannot reliably distinguish between the processed portion and the unprocessed portion in a pathology image in which such a quick preview display is mixed. Accordingly, a technique for providing a pathology image in which a pathologist can reliably distinguish between a processed portion and an unprocessed portion is described below.

The scanner 2 is, for example, an image capturing apparatus having an optical microscope function; it captures an image of an observation object (specimen) placed on a slide glass and acquires a pathological image, which is a digital image. Note that the observation target is, for example, tissue or cells collected from a patient, such as a piece of an organ, saliva, or blood. The scanner 2 includes an image capturing unit 21, an image processing unit 22, an encoding unit 23, and a transmitting unit 24.

The image capturing unit 21 captures an image of an observation target contained in a slide glass, and outputs an image capturing signal. The image processing unit 22 performs basic image processing (e.g., demosaicing) on the image capture signal output by the image capture unit 21.

The encoding unit 23 encodes the pathology image on which the image processing unit 22 performs image processing. The transmission unit 24 transmits the pathology image encoded by the encoding unit 23 to the server 3.
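The scanner-side flow described above (capture, basic image processing, encoding, transmission) can be sketched as a simple pipeline. All function names and data shapes below are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch of the scanner pipeline: image capturing unit 21 ->
# image processing unit 22 (demosaicing) -> encoding unit 23 ->
# transmitting unit 24. Names and data shapes are hypothetical.

def capture(slide_id):
    # Stand-in for the image capturing unit: returns a raw capture signal.
    return {"slide": slide_id, "raw": [[0, 1], [2, 3]]}

def demosaic(signal):
    # Stand-in for the basic image processing (demosaicing) step.
    signal["rgb"] = [[px * 3 for px in row] for row in signal["raw"]]
    return signal

def encode(image):
    # Stand-in for the encoding unit.
    return ("encoded", image)

def transmit(payload):
    # Stand-in for the transmitting unit; here it simply hands the
    # payload to the next stage (the server).
    return payload

pathology_image = transmit(encode(demosaic(capture("slide-001"))))
```

The point of the sketch is only the ordering of the four units; in a real scanner each stage would operate on actual sensor data and a real codec.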

The server 3 is a computer device that performs storage, processing, and the like of the pathology image captured by the scanner 2. Further, when accepting a request to view a pathological image from the viewer 4, the server 3 retrieves the pathological image and transmits the retrieved pathological image to the viewer 4. The server 3 includes a receiving unit 31, a storage unit 32, a decoding unit 33, an image processing unit 34, an encoding unit 35, and a transmitting unit 36.

The server 3 executes a predetermined program to realize each function. Note that the program may be stored in the server 3, or may be stored in a storage medium such as a Digital Versatile Disc (DVD), a cloud computer, or the like. Further, the program may be executed by a Central Processing Unit (CPU) or a microprocessor unit (MPU) by using a Random Access Memory (RAM) or the like as a work space in the server 3, or may be executed by an integrated circuit such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).

The reception unit 31 receives the pathology image transmitted from the scanner 2, and stores it in the storage unit 32. The storage unit 32 is implemented by, for example, a storage device, for example, a semiconductor storage device such as a RAM or a flash memory, a hard disk, or an optical disk. The storage unit 32 stores various programs, data, pathology images received from the scanner 2, and the like.

The decoding unit 33 reads the pathology image from the storage unit 32 and decodes it. The image processing unit 34 performs predetermined image processing (for example, color correction, edge enhancement, or contrast correction, which may also be simply referred to as "image processing" hereinafter) on the pathology image decoded by the decoding unit 33, in accordance with the performance of the display unit 45 of the viewer 4, the preference of a pathologist, or the like.

The encoding unit 35 encodes the pathology image on which the image processing unit 34 performs image processing. The transmitting unit 36 transmits the pathology image encoded by the encoding unit 35 to the viewer 4.

The viewer 4 is a computer device used mainly by pathologists; it sends viewing requests to the server 3 and displays the pathology images received in response. It is installed in, for example, a research institute or a hospital. The viewer 4 includes a receiving unit 41, a decoding unit 42, an image processing unit 43, a display control unit 44, a display unit 45, a storage unit 46, and an operation unit 47.

The viewer 4 executes a predetermined program to realize each function. Note that the program may be stored in the server 3 or the viewer 4, or may be stored in a storage medium such as a DVD, a cloud computer, or the like. Further, the program may be executed by a CPU or MPU by using a RAM or the like as a work space, or may be executed by an integrated circuit such as an ASIC or FPGA.

The receiving unit 41 receives the pathology image transmitted from the server 3. The decoding unit 42 decodes the pathology image received by the receiving unit 41.

The image processing unit 43 performs an image processing process for performing predetermined image processing (for example, color correction, edge enhancement, or contrast correction) on the pathology image decoded by the decoding unit 42, in accordance with the performance of the display unit 45, the preference of a pathologist, or the like. Note that even when the pathological image is displayed on the display unit 45, the processing of the image processing unit 43 is executed (continued).

For example, the image processing process performs image processing starting from the central portion of the region of the pathological image displayed on the display unit 45 (fig. 3: details will be described later).

For example, the image processing process may also perform image processing starting from the portion of the pathological image corresponding to the position of the cursor displayed on the display unit 45 (fig. 6: details will be described later).
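The two processing orders just described (center-first and cursor-first) can both be expressed as sorting tiles by distance from a focus point. This is a minimal sketch under that assumption; the disclosure does not specify the actual ordering algorithm:

```python
import math

def processing_order(tiles, focus):
    # Order tile coordinates by distance from a focus point: the
    # viewport center for center-first processing, or the cursor
    # position for cursor-first processing. Illustrative only.
    fx, fy = focus
    return sorted(tiles, key=lambda t: math.hypot(t[0] - fx, t[1] - fy))

tiles = [(x, y) for x in range(3) for y in range(3)]
center_first = processing_order(tiles, (1, 1))  # tile under the center first
cursor_first = processing_order(tiles, (2, 0))  # tile under the cursor first
```

Processing tiles in this order means the region the observer is most likely looking at leaves the unprocessed state first.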

The display control unit 44 performs an output processing procedure of outputting identification information for identifying a processed part and an unprocessed part of image processing on the pathological image. Note that even when the pathological image is displayed on the display unit 45, the processing of the display control unit 44 is executed (continued).

For example, the output processing procedure displays a boundary line at the boundary between the processed portion and the unprocessed portion displayed on the display unit 45 as the identification information (fig. 3: details will be described later).

The output processing may also highlight a new processed part of the processed parts displayed on the display unit 45 by at least one of color, thickness, or blinking of a border line around it (fig. 5: details will be described later).

The output processing may also display the boundary lines by lines of a first line type located closer to the processed portion and lines of a second line type located closer to the unprocessed portion (fig. 7: details will be described later).

The output processing may also, when a part of the pathological image is displayed enlarged on the display unit 45, display the entire pathological image in a part of the display unit 45 and display a boundary line at the boundary between the processed portion and the unprocessed portion in the displayed entire pathological image (fig. 8: details will be described later).

The output processing may also highlight a part of the pathological image enlarged on the display unit 45 when the part of the pathological image is an unprocessed part (fig. 8: details will be described later).

The output processing may also indicate at least one of the processed portion and the unprocessed portion through a text display on the display unit 45 (fig. 8: details will be described later).

The output processing may also indicate, by an image display on the display unit 45, that an unprocessed portion is being displayed (fig. 10: details will be described later).

The output processing procedure may also perform display on the display unit 45 after performing a predetermined image quality degradation process on the unprocessed portion. For example, by performing luminance reduction or gradation conversion on the unprocessed portion as the predetermined image quality degradation process, the observer is made to reliably notice the unprocessed portion.
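If the pathological image is treated as a grid of tiles, the identification output described above reduces to two small operations: finding the processed tiles that border unprocessed ones (where a boundary line would be drawn) and degrading unprocessed pixels. A hedged sketch, with all names and the halved-brightness rule invented for illustration:

```python
def boundary_tiles(processed, width, height):
    # A processed tile lies on the boundary if it has an in-grid
    # neighbour that is still unprocessed; a boundary line would be
    # drawn along these tiles.
    boundary = set()
    for (x, y) in processed:
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in processed:
                boundary.add((x, y))
    return boundary

def degrade(pixel, is_processed):
    # Hypothetical image quality degradation for unprocessed portions:
    # halve the brightness so the observer notices them.
    return pixel if is_processed else pixel // 2

processed = {(0, 0), (1, 0), (0, 1), (1, 1)}  # top-left 2x2 block is done
edge = boundary_tiles(processed, 3, 3)
```

On a 3x3 grid with the top-left 2x2 block processed, the boundary consists of the three processed tiles that touch the unprocessed column and row.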

The display unit 45 is a device for displaying information, and has a screen using, for example, liquid crystal, Electroluminescence (EL), a Cathode Ray Tube (CRT), or the like. Further, the display unit 45 may be compatible with 4K or 8K, and may be formed of a plurality of display devices. The display unit 45 displays information (e.g., an image) according to the control of the display control unit 44.

The storage unit 46 is implemented by, for example, a storage device, for example, a semiconductor storage device, for example, a RAM or a flash memory, a hard disk, or an optical disk. The storage unit 46 stores various programs, data, pathology images received from the server 3, and the like.

The operation unit 47 is a device operated by a user (e.g., pathologist, hereinafter the same applies) of the viewer 4, and is, for example, a mouse, a keyboard, a touch panel, or the like.

Note that although in the present embodiment both the image processing unit 34 of the server 3 and the image processing unit 43 of the viewer 4 can perform the predetermined image processing, there is no limitation in this respect. For example, if the viewer 4 has sufficient computing power, only the viewer 4 may perform the predetermined image processing. Alternatively, for example, if the viewer 4 does not have sufficient computing power, only the server 3 may perform the predetermined image processing. Further, although the present disclosure is described assuming that, of the server 3 and the viewer 4, the viewer 4 has the function of outputting identification information for identifying a processed portion and an unprocessed portion of the predetermined image processing on a pathological image, there is no limitation in this respect, and the server 3 may have this function instead.

[2. Processing procedure of the viewer according to the first embodiment]

Next, the process of the viewer 4 according to the first embodiment will be described with reference to fig. 2. Fig. 2 is a flowchart showing the processing of the viewer 4 according to the first embodiment. Note that, hereinafter, description of some processes such as the process of the decoding unit 42 may be omitted for simplification of the description.

First, in step S1, the viewer 4 determines whether an image display operation, i.e., an operation by a pathologist or the like via the operation unit 47 to display a pathological image, has occurred; if so, the process proceeds to step S2, and if not, the process returns to step S1.

In step S2, the reception unit 41 acquires a pathology image from the server 3.

Next, in step S3, the image processing unit 43 starts predetermined image processing (e.g., color correction, edge enhancement, or contrast correction) on the pathological image.

Next, in step S4, the display control unit 44 displays the pathological image on the display unit 45.

Next, in step S5, the display control unit 44 executes an output processing procedure of outputting identification information for identifying a processed part and an unprocessed part of the image processing on the pathological image.

Next, in step S6, the image processing unit 43 determines whether the predetermined image processing has ended; if so, the process proceeds to step S7, and if not, the process returns to step S4. That is, while the loop of step S4 → step S5 → No in step S6 → step S4 ... runs, the pathological image is displayed on the display unit 45 with the processed and unprocessed portions mixed, together with a boundary line or other identification information for distinguishing them. In this way, the observer can reliably distinguish between the processed portion and the unprocessed portion of the image processing on the pathology image (details will be described later).

In step S7, the viewer 4 determines whether an operation to change the displayed image portion has occurred, that is, whether the observer has operated the operation unit 47 to change the region of the pathological image displayed on the display unit 45; if so, the process returns to step S3, and if not, the process proceeds to step S8.

In step S8, the viewer 4 determines whether an image display end operation, i.e., an operation for ending the pathological image display by a pathologist or the like via the operation unit 47 has occurred, and if so, the process proceeds to step S9, and if not, the process returns to step S7.

In step S9, the viewer 4 ends the image display, that is, the display of the pathological image on the display unit 45.
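The loop of steps S4 to S6 can be sketched as follows: process one piece of the image at a time and, after each step, refresh the identification information. This is a loose sketch with hypothetical names, not the actual implementation:

```python
def run_viewer(tiles, process_tile):
    # Loose sketch of the loop S4 -> S5 -> (No in S6) -> S4: process
    # tiles one at a time and, after each step, output identification
    # information (here, simply the processed/pending counts).
    pending = list(tiles)        # S3: image processing has started
    processed = []
    history = []
    while pending:                                      # S6: until done
        processed.append(process_tile(pending.pop(0)))  # processing continues
        history.append((len(processed), len(pending)))  # S5: output info
    return history

history = run_viewer(["t1", "t2", "t3"], str.upper)
```

Each history entry corresponds to one pass through step S5, where the display would be updated to show which portions are now processed.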

[3. First to sixth examples of pathological images according to the first embodiment, etc.]

Fig. 3 is a diagram schematically illustrating a first example of a pathological image according to the first embodiment. Fig. 3 (a) to (c) show pathological images displayed on the display unit 45. Fig. 3 (a) is the pathological image before the predetermined image processing is performed. Fig. 3 (c) is the pathological image after the predetermined image processing is completed. The image processing process performed by the image processing unit 43 on the pathological image in (a) of fig. 3 carries out the predetermined image processing starting from the central portion of the region of the pathological image displayed on the display unit 45. Since an observer can generally be assumed to view a pathological image from its central portion, performing image processing from the central portion of the display area in this manner is convenient for the observer when the predetermined image processing is not completed immediately.

The output processing performed by the display control unit 44 also displays a boundary line, as identification information, at the boundary between the processed portion and the unprocessed portion displayed on the display unit 45. Fig. 3 (b) is the pathological image in the middle of the predetermined image processing. The inside of region R1 is the processed portion, and the outside of region R1 is the unprocessed portion.

For example, if the predetermined image processing is color correction processing, it is not easy to distinguish between processed and unprocessed portions of the pathological image unless such a boundary line is displayed. Specifically, for example, in a pathological image of an observation target stained with hematoxylin and eosin (HE), a lesion is determined by a purple-to-magenta color tone. In this case, when viewing the pathological image without a displayed boundary line, the pathologist may mistake a portion not yet color-corrected for a processed portion and make an erroneous diagnosis. The same applies to edge enhancement: without appropriate edge enhancement processing, a normal cell in a high-magnification image may be judged to be a lesional cell, so a mixture of processed and unprocessed portions in a pathological image may cause a misdiagnosis if no boundary line is displayed.

In contrast, according to the present disclosure, by displaying the boundary line at the boundary between the processed portion and the unprocessed portion in the pathological image, the observer can reliably distinguish the processed portion and the unprocessed portion in the pathological image, and can avoid an erroneous diagnosis.

Fig. 4 is a diagram schematically illustrating a second example of a pathological image according to the first embodiment. Fig. 4 (a) to (c) show pathological images displayed on the display unit 45. Fig. 4 (a) is a pathological image after predetermined image processing is performed. Here, for example, it is assumed that the observer has moved the pathological image displayed on the display unit 45 to the left by operating the operation unit 47 (yes in step S7 in fig. 2).

Therefore, as shown in fig. 4 (b), the region R2 on the left side of the display unit 45 is a processed portion, and the outer side of the region R2 on the right side is an unprocessed portion. The output processing performed by the display control unit 44 displays the boundary line between the inside and the outside of the region R2. In this way, according to the present disclosure, by displaying the boundary line at the boundary between the processed portion and the unprocessed portion in the pathological image, the observer can reliably distinguish the processed portion and the unprocessed portion in the pathological image, and can avoid an erroneous diagnosis. Thereafter, when the predetermined image processing for the unprocessed portion outside the region R2 in (b) of fig. 4 is completed, the pathological image after the completion of the predetermined image processing is on the entire screen, as shown in (c) of fig. 4.

Fig. 5 is a diagram schematically illustrating a third example of a pathological image according to the first embodiment. Fig. 5 (a) to (c) show pathological images displayed on the display unit 45. Fig. 5 (a) is the pathological image before the predetermined image processing is performed. The image processing process performed by the image processing unit 43 on the pathological image in (a) of fig. 5 carries out the image processing starting from the central portion of the region of the pathological image displayed on the display unit 45, and a boundary line is displayed at the boundary between the processed portion and the unprocessed portion. Fig. 5 (b) is the pathological image in the middle of the predetermined image processing. The inside of region R3 is the processed portion, and the outside of region R3 is the unprocessed portion.

Here, the image processing process further performs predetermined image processing on the region R4 shown in (c) of fig. 5, and in this process, the output processing process performs highlighting by displaying a thick boundary line surrounding the region R4 as a new processed portion. In this way, by highlighting the new processed portion, the observer reliably recognizes the new processed portion (region R4) in the pathology image, and can avoid an erroneous diagnosis. Note that such a manner of highlighting is not limited to thickening the boundary line with respect to the other boundary line, and it is also possible to make the color of the boundary line different from that of the other boundary line or make the boundary line blink.

Fig. 6 is a diagram schematically illustrating a fourth example of a pathological image according to the first embodiment. Fig. 6 (a) to (c) show pathological images displayed on the display unit 45. Fig. 6 (a) is a pathological image before predetermined image processing is performed. The image processing process of the pathological image in (a) of fig. 6 performs image processing from a portion (hereinafter, also referred to as "cursor portion") corresponding to the position of the cursor (reference character C in (b) of fig. 6) displayed on the display unit 45 in the pathological image. Since it can be considered that the observer is likely to see the cursor portion of the pathological image, it is convenient for the observer to perform image processing from the cursor portion.

The output processing also displays a boundary line at the boundary between the processed portion and the unprocessed portion. In fig. 6 (b), the inside of region R5 is the processed portion, and the outside of region R5 is the unprocessed portion. Note that in (c) of fig. 6, the boundary line of region R6, which includes the new processed portion, is highlighted in a manner similar to the boundary line of region R4 in (c) of fig. 5.

Fig. 7 is a diagram schematically illustrating a fifth example of a pathological image according to the first embodiment. As shown in fig. 7, the output processing procedure displays the boundary line by the line L71 of the first line type located closer to the processed portion (on the center side), the line L73 of the second line type located closer to the unprocessed portion (on the outer side), and the line L72 therebetween. Here, the lines L71, L72, and L73 have different degrees of color darkness that increase in this order. The observer can reliably distinguish the processed portion from the unprocessed portion in the pathological image by recognizing in advance that the processed portion is on the lighter color side and the unprocessed portion is on the darker color side, and can avoid an erroneous diagnosis. Note that the manner of changing the line type is not limited to changing the color, and may also be changed between a solid line and a broken line, or the like. Further, the number of lines forming the boundary line is not limited to three, and may be two or four or more.

Fig. 8 is a diagram schematically illustrating a sixth example of a pathological image according to the first embodiment. When the pathological image is enlarged and either the processed portion or the unprocessed portion occupies the entire screen displayed on the display unit 45, the observer may not be able to easily determine which of the two is on the screen. Therefore, as shown in fig. 8, while a part of the pathological image (region R8) is enlarged on the display unit 45, the output processing displays the entire pathological image in a part of the display unit 45 (region R9), and displays the boundary line at the boundary between the processed portion (inside the region R10 in the region R9) and the unprocessed portion (outside the region R10 in the region R9) in the displayed entire pathological image. Further, a region R11 in the region R9 corresponds to the enlarged portion (region R8).

In this way, the observer can easily recognize that the enlarged portion (region R8) corresponds to the region R11 in the entire portion (region R9) and is an unprocessed portion.

The output processing also highlights (e.g., with a bold frame) the enlarged portion (region R8), which is part of the pathology image. In this way, the observer can further easily recognize that the enlarged portion (region R8) is an unprocessed portion.

The output processing procedure also indicates the unprocessed portion by a text indication (the indication "Attention! Unprocessed" denoted by reference character T). In this way, the observer can even more easily recognize that the enlarged portion (region R8) is an unprocessed portion. Note that the text indication is not limited to being provided for the unprocessed portion; it may also be provided for the processed portion. Further, although the text indication is preferably provided by an on-screen display (OSD), there is no limitation in this respect, and it may be provided by other means.

Fig. 9 is a diagram schematically illustrating an example of a User Interface (UI) for performing predetermined image processing on a pathology image according to the first embodiment. On this screen, any one of color temperature correction (color temperature; an example of color correction), edge enhancement (sharpness), and contrast correction (contrast) can be selected via the menus for predetermined image processing (image processing menus; (a) and (b)) in the user interface display UI1. For example, when color temperature correction (color temperature) is selected ((c)), a selection among 5000K, 6000K, and 7000K is enabled. Then, if 5000K is selected ((d)), for example, an image obtained by performing color temperature correction of 5000K on a top-view image (an image showing a general view of the pathological image) is displayed as a pop-up example in the region R12 in (a) of fig. 9. The observer views the pop-up indication and, if satisfied, performs an operation of confirming the selection, whereby color temperature correction of 5000K is performed on the entire display screen.

In this way, the observer can easily select the details of the predetermined image processing on the pathology image by using the UI as described above. Note that, in the above-described example, the required processing time is reduced by performing color temperature correction for display on a top-view image, which has a smaller area, instead of on the entire display screen shown in (a) of fig. 9.

Fig. 10 is a diagram schematically illustrating an example of a display for identifying an unprocessed portion in a pathological image according to the first embodiment. As described above, the output processing procedure may also indicate, by a graphic indication on the display unit 45, that an unprocessed portion is displayed. In the pathology image of fig. 10, the inside of the region R14 is a processed portion, and the outside of the region R14 is an unprocessed portion. Further, the region R13 is a top-view image. Further, the icon a1 indicates, by displaying an hourglass, that an unprocessed portion is displayed. Further, the status bar SB1 indicates, by displaying the approximate remaining time, that the unprocessed portion is displayed.
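The embodiment does not specify how the approximate remaining time shown by the status bar SB1 is obtained; one simple possibility, sketched below under the assumption that progress is measured in processed pixels and extrapolated from throughput so far, is illustrative only.

```python
def remaining_time(done_pixels: int, total_pixels: int, elapsed_s: float) -> float:
    """Estimate the remaining image-processing time (seconds) that a
    status bar such as SB1 could display, by extrapolating the
    processing rate observed so far (illustrative sketch)."""
    if done_pixels == 0:
        # Nothing processed yet: no rate to extrapolate from.
        return float("inf")
    rate = done_pixels / elapsed_s  # pixels per second so far
    return (total_pixels - done_pixels) / rate
```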

In this way, by indicating that the unprocessed portion is displayed by the graphic indication of the icon a1 or the status bar SB1 in addition to the boundary line between the inside and outside of the region R14, the observer can more reliably recognize that the unprocessed portion exists in the displayed pathological image.

Note that although both the icon a1 and the status bar SB1 are displayed in fig. 10 for convenience of creating the drawing, there is no limitation in this respect, and only one of them may be displayed. Another graphic indication may also be used.

In this way, according to the diagnosis support system 1 in the first embodiment, by outputting the identification information for identifying the processed part and the unprocessed part of the image processing on the pathological image, the accuracy of the pathological diagnosis using the pathological image can be improved.

Specifically, by displaying a boundary line at a boundary between a processed portion and an unprocessed portion in a displayed pathology image, an observer can reliably distinguish the processed portion from the unprocessed portion in the pathology image, and can avoid an erroneous diagnosis.

Further, by highlighting the new processed portion by means of at least one of color, thickness, or flicker of the boundary line around it, the observer can reliably recognize the new processed portion in the pathology image, and can better avoid an erroneous diagnosis.

Further, it is convenient for the observer to perform image processing from the central portion of the displayed pathological image, because it can be considered that the observer often views the pathological image from the central portion.

Further, when the cursor is displayed together with the pathological image, it is convenient for the observer to perform image processing from a portion corresponding to the cursor position, because it can be considered that the observer is likely to look at the cursor portion.

Further, by displaying the boundary line in lines of a plurality of line types, the observer can more reliably distinguish between the processed portion and the unprocessed portion in the pathology image.

Further, by displaying a boundary line at a boundary between a processed portion and an unprocessed portion in the top view image when a portion of the pathological image is enlarged on the display unit 45, the observer can easily recognize which of the processed portion and the unprocessed portion is enlarged by viewing the top view image.

Further, by highlighting (displaying with a thick frame or the like) the enlarged portion when the enlarged portion is an unprocessed portion in this process, the observer can further easily recognize that the enlarged portion is an unprocessed portion.

Further, by indicating at least one of the processed portion and the unprocessed portion of the pathological image by means of the text indication, the observer can more reliably recognize the processed portion and the unprocessed portion.

Further, by indicating that an unprocessed portion is displayed by means of a graphic indication (for example, an icon a1 or a status bar SB1 in fig. 10), the observer can more reliably recognize that an unprocessed portion exists in the pathological image.

Further, by displaying the pathological image after performing predetermined image quality deterioration processing (luminance reduction, grayscale conversion, or the like) on the unprocessed portion, the unprocessed portion becomes difficult to view, which can further reduce the possibility of an erroneous diagnosis.

Note that the start position of predetermined image processing on the pathological image is not limited to the screen center or the cursor portion as described above, and may be a lesion if the lesion is identified by estimation of machine learning or the like, for example.

Further, the predetermined image processing is not limited to color correction, edge enhancement, or contrast correction, and may be other image processing such as tone curve correction.

< second embodiment >

[4. configuration of System according to second embodiment ]

Next, the diagnosis support system 1 according to the second embodiment will be described. Descriptions of items similar to those in the first embodiment will be omitted as appropriate. The second embodiment is different from the first embodiment in that a so-called tile image is used. That is, in the second embodiment, a pathology image is composed of a plurality of tile images. Further, when the layer of the tile image displayed on the display unit 45 is changed, the image processing process performs predetermined image processing on the tile image in the new layer. Details will be described below.

Fig. 11 is an overall configuration diagram of the diagnosis support system 1 according to the second embodiment. Compared with the case of the first embodiment, the tile image generating unit 37 is added to the server 3. Details of the tile image generating unit 37 will be described later.

[5. description of tile image in second embodiment ]

Fig. 12 is a diagram for illustrating an image capturing process according to the second embodiment. As described above, the scanner 2 captures an image of the observation object a10 contained in the slide glass G10, and acquires a pathological image, which is a digital image. In the second embodiment, for example, the scanner 2 generates the entire image, then identifies the region in which the observation object a10 exists in the entire image, and sequentially captures, by the high-resolution image capturing unit, images of the divided regions obtained by dividing the region in which the observation object a10 exists by a predetermined size. For example, as shown in fig. 12, the scanner 2 first captures an image of the region R11 and generates a high-resolution image I11 as an image showing a partial region of the observation object a10. Subsequently, the scanner 2 moves the stage to capture an image of the region R12 by the high-resolution image capturing unit and generates a high-resolution image I12 corresponding to the region R12. The scanner 2 generates high-resolution images I13, I14, ... corresponding to the regions R13, R14, ... in a similar manner. Although only the regions up to R18 are shown in fig. 12, the scanner 2 sequentially moves the stage to capture images of all the divided regions corresponding to the observation object a10 by the high-resolution image capturing unit and generates high-resolution images corresponding to the respective divided regions.

Incidentally, the slide glass G10 may shift on the stage when the stage moves. If the slide glass G10 shifts, a region of the observation object a10 for which image capturing is not performed may appear. As shown in fig. 12, the scanner 2 performs image capturing by the high-resolution image capturing unit so that adjacent divided regions partially overlap, whereby even when the slide glass G10 shifts slightly, the occurrence of a region for which image capturing is not performed can be prevented.
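One way to realize the overlapping division described above is to step the stage by less than one field of view along each axis; the sketch below computes candidate start coordinates for one axis, with the function name and the flush-to-edge final position being illustrative assumptions rather than part of the embodiment.

```python
def capture_positions(extent: int, field: int, overlap: int) -> list:
    """Start coordinates (one axis, in pixels) for the divided regions,
    so that adjacent fields of view overlap by `overlap` pixels and the
    whole `extent` is covered."""
    step = field - overlap
    positions = list(range(0, max(extent - field, 0) + 1, step))
    # Add a final position flush with the far edge if the regular step
    # would otherwise leave an uncovered strip at the end.
    if positions[-1] + field < extent:
        positions.append(extent - field)
    return positions
```

For example, a 1000-pixel extent imaged with a 300-pixel field and 50 pixels of overlap would be captured at starts 0, 250, 500, and a final flush position 700.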

Note that although the above shows an example in which the image capturing area is changed by moving the stage, the scanner 2 may change the image capturing area by moving the optical system (e.g., a high-resolution image capturing unit). Further, in fig. 12, an example is shown in which the scanner 2 captures an image of the observation object a10 from the central portion thereof. However, the scanner 2 may also capture images of the observation object a10 in an order different from the image capturing order shown in fig. 12. For example, the scanner 2 may capture an image of the observation object a10 from its peripheral portion.

Subsequently, each high-resolution image generated by the scanner 2 is transmitted to the server 3, and divided into a predetermined size by the tile image generating unit 37 in the server 3. In this way, partial images (tile images) are generated from the high-resolution images. This will be described by using fig. 13. Fig. 13 is a diagram for illustrating a generation process of partial images (tile images) in the second embodiment.

Fig. 13 shows a high-resolution image I11 corresponding to the region R11 shown in fig. 12. Note that the following description will be made by assuming that the server 3 generates a partial image from a high-resolution image. However, the partial image may be generated by a device other than the server 3 (for example, an information processing device provided in the scanner 2).

In the example shown in fig. 13, the server 3 divides a single high-resolution image I11 to generate 100 tile images T11, T12, .... For example, if the resolution of the high-resolution image I11 is 2560 × 2560 [pixels], the server 3 generates, from the high-resolution image I11, 100 tile images T11, T12, ... each having a resolution of 256 × 256 [pixels]. Similarly, the server 3 divides other high-resolution images into the same size to generate tile images.
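The division into tile images is straightforward array slicing; the following sketch (NumPy-based, with illustrative names) shows how a high-resolution image could be cut into the grid of partial images described above.

```python
import numpy as np

def split_into_tiles(image: np.ndarray, tile: int = 256) -> dict:
    """Divide a high-resolution image (H, W[, ...]) into tile-size partial
    images keyed by (row, column) grid position. A 2560 x 2560 image with
    tile = 256 yields the 100 tile images of the example above."""
    h, w = image.shape[:2]
    assert h % tile == 0 and w % tile == 0, "image must divide evenly into tiles"
    return {
        (r, c): image[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
        for r in range(h // tile)
        for c in range(w // tile)
    }
```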

Note that, in the example of fig. 13, the regions R111, R112, R113, and R114 are regions that overlap with other adjacent high-resolution images (not shown in fig. 13). The server 3 aligns the overlapping regions by a technique such as template matching to perform a stitching process on the high-resolution images adjacent to each other. In this case, the server 3 may generate tile images by dividing the high-resolution image after the stitching process. Alternatively, the server 3 may generate the tile images of the areas other than the regions R111, R112, R113, and R114 before the stitching process, and generate the tile images of the regions R111, R112, R113, and R114 after the stitching process.

In this way, the server 3 generates tile images as the minimum unit of the captured image of the observation object a10. The server 3 (or the viewer 4) then sequentially synthesizes the minimum-unit tile images to generate tile images in different layers. Specifically, the server 3 synthesizes a predetermined number of adjacent tile images to generate one tile image. This will be described by using fig. 14 and 15. Fig. 14 and 15 are diagrams for illustrating a pathological image according to the second embodiment.

In the upper part of fig. 14, the set of minimum-unit tile images generated from each high-resolution image by the server 3 is shown. In the upper example of fig. 14, the server 3 synthesizes four tile images T111, T112, T211, and T212 adjacent to each other to generate one tile image T110. For example, if each of the tile images T111, T112, T211, and T212 has a resolution of 256 × 256, the server 3 generates the tile image T110 having a resolution of 256 × 256. Similarly, the server 3 synthesizes four tile images T113, T114, T213, and T214 adjacent to each other to generate a tile image T120. In this way, the server 3 generates tile images each obtained by synthesizing a predetermined number of minimum-unit tile images.

The server 3 also generates a tile image by further synthesizing tile images adjacent to each other among the tile images obtained by synthesizing the minimum-unit tile images. In the example of fig. 14, the server 3 synthesizes four tile images T110, T120, T210, and T220 adjacent to each other to generate one tile image T100. For example, if the tile images T110, T120, T210, and T220 each have a resolution of 256 × 256, the server 3 generates the tile image T100 having a resolution of 256 × 256. For example, the server 3 generates a tile image having a resolution of 256 × 256 from the 512 × 512 image obtained by synthesizing four adjacent tile images, by performing four-pixel averaging, weighted filtering (processing that weights nearer pixels more heavily than farther ones), 1/2 thinning-out processing, or the like.
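Of the reduction methods named above, four-pixel averaging can be sketched compactly with NumPy; the function name is illustrative, and this shows only one of the listed alternatives.

```python
import numpy as np

def average_downsample(image: np.ndarray) -> np.ndarray:
    """Four-pixel averaging: shrink a (2H, 2W) image to (H, W) by averaging
    each 2 x 2 block -- one way to reduce the 512 x 512 mosaic of four
    adjacent tiles back to a single 256 x 256 tile."""
    h, w = image.shape[:2]
    # Group pixels into 2 x 2 blocks, then average within each block.
    blocks = image.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))
```

The 1/2 thinning-out alternative would instead simply take `image[::2, ::2]`, trading quality for speed.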

The server 3 repeats this synthesizing process to finally generate one tile image having the same resolution as that of the tile image of the minimum unit. For example, as in the above-described example, if the minimum unit of tile images has a resolution of 256 × 256, the server 3 repeats the above-described synthesizing process to finally generate one tile image T1 having a resolution of 256 × 256.

Fig. 15 schematically shows the tile images shown in fig. 14. In the example shown in fig. 15, the tile image group in the lowest layer consists of the minimum-unit tile images generated by the server 3. The tile image group in the second layer from the bottom is obtained by synthesizing the tile image groups in the lowest layer. The uppermost tile image T1 represents the finally generated single tile image. In this way, the server 3 generates, as a pathology image, a tile image group having layers forming a pyramid structure as shown in fig. 15.

Note that a region D shown in fig. 14 represents an example of a region displayed on a display screen such as the display unit 45. For example, assume that the display apparatus is capable of displaying an area of three vertical tile images by four horizontal tile images. In this case, as in the region D shown in fig. 14, the degree of detail of the observation object a10 displayed on the display device depends on the layer to which the displayed tile images belong. For example, when the tile images in the lowest layer are used, a small area of the observation object a10 is displayed in detail. As tile images in higher layers are used, a larger area of the observation object a10 is displayed more coarsely.

The server 3 stores the tile images in each layer as shown in fig. 15 in the storage unit 32. For example, the server 3 stores each tile image together with tile identification information (an example of partial image information) capable of uniquely identifying each tile image. In this case, when accepting a request from the viewer 4 to acquire a tile image, the request including tile identification information, the server 3 transmits the tile image corresponding to the tile identification information to the viewer 4. Further, for example, the server 3 may store each tile image together with layer identification information identifying each layer and tile identification information capable of uniquely identifying each tile image within the same layer. In this case, when accepting a request from the viewer 4 to acquire a tile image, the request including layer identification information and tile identification information, the server 3 transmits, to the viewer 4, the tile image corresponding to the tile identification information among the tile images belonging to the layer corresponding to the layer identification information.

Note that the server 3 may also store the tile images in each layer as shown in fig. 15 in the viewer 4, a cloud server (not shown), or the like. Further, the generation processing of the tile images shown in fig. 14 and 15 may be performed in a cloud server or the like.

Further, the server 3 does not have to store the tile images in all layers. For example, the server 3 may store only the tile images in the lowest layer, may store only the tile images in the lowest layer and the tile images in the uppermost layer, or may store the tile images in predetermined layers (for example, odd layers or even layers). In this case, when a tile image in a layer that is not stored is requested by another device, the server 3 dynamically synthesizes the stored tile images to generate the requested tile image. In this way, the server 3 can save storage capacity by reducing the number of tile images to be stored.
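The storage-saving variant above can be sketched as a server that keeps only the lowest-layer tiles and synthesizes higher layers on demand by repeated four-pixel averaging; the class, its layer numbering (layer 0 = lowest), and the grayscale tile payload are illustrative assumptions.

```python
import numpy as np

class TileServer:
    """Sketch of a server that stores only lowest-layer tiles and
    dynamically synthesizes a requested higher-layer tile."""

    def __init__(self, lowest: dict):
        self.lowest = lowest  # (row, col) -> square tile array
        self.tile = next(iter(lowest.values())).shape[0]

    def get(self, layer: int, row: int, col: int) -> np.ndarray:
        """Layer 0 is the lowest layer; a layer-k tile covers a
        2**k x 2**k block of lowest-layer tiles, reduced to tile size."""
        n = 2 ** layer
        t = self.tile
        # Assemble the mosaic of covered lowest-layer tiles.
        mosaic = np.zeros((n * t, n * t))
        for dr in range(n):
            for dc in range(n):
                mosaic[dr * t:(dr + 1) * t, dc * t:(dc + 1) * t] = \
                    self.lowest[(row * n + dr, col * n + dc)]
        # Repeated 2 x 2 averaging back down to one tile.
        while mosaic.shape[0] > t:
            h, w = mosaic.shape
            mosaic = mosaic.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return mosaic
```

A real server would of course cache or persist frequently requested layers rather than recompute them on every request.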

Although image capturing conditions are not mentioned in the above example, the server 3 may store tile images in each layer as shown in fig. 15 for each image capturing condition. One example of an image capturing condition is the focal distance with respect to the object (e.g., the observation object a10). For example, the scanner 2 may capture images of the same object while changing the focal distance. In this case, the server 3 may store the tile images in each layer as shown in fig. 15 for each focal distance. Note that the reason for changing the focal distance is that the observation object a10 may be translucent, and therefore there is a focal distance suitable for capturing an image of the surface of the observation object a10 and a focal distance suitable for capturing an image of the inside of the observation object a10. In other words, by changing the focal distance, the scanner 2 can generate a pathology image capturing the surface of the observation object a10 and a pathology image capturing the inside of the observation object a10.

Another example of an image capturing condition is the staining condition of the observation object a10. Specifically, in pathological diagnosis, a specific part (for example, a cell) of the observation object a10 may be stained with a luminescent substance. The luminescent substance is, for example, a substance that emits light in response to irradiation with light of a specific wavelength. The same observation object a10 may be stained with different luminescent substances. In this case, the server 3 may store tile images in each layer as shown in fig. 15 for each luminescent substance used for staining.

Further, the number and resolution of the tile images described above are merely examples, and may be changed as appropriate depending on the system. For example, the number of tile images synthesized by the server 3 is not limited to four. For example, the server 3 may repeat the process of synthesizing 3 × 3 = 9 tile images. Further, although an example in which the resolution of the tile images is 256 × 256 is shown above, the resolution of the tile images may be other than 256 × 256.

Using software capable of handling the tile image group having the layer structure described above, the viewer 4 extracts a desired tile image from the tile image group in accordance with the user's input operation, and outputs it to the display unit 45. Specifically, the display unit 45 displays an image of a specific portion selected by the user among images having a specific resolution selected by the user. This processing allows the user to experience the feeling of observing the observation object while changing the observation magnification. That is, the viewer 4 functions as a virtual microscope. The virtual observation magnification here actually corresponds to the resolution.

[6. example of displaying pathological image according to second embodiment ]

Fig. 16 is a diagram schematically illustrating an example of displaying a pathological image according to the second embodiment. The pathology image I1 shown in fig. 16 (a) is composed of a set of tile images in a middle layer of the pyramid structure. As shown in (b) of fig. 16, a region R15 of the pathology image I1 is displayed on the display unit 45. In the pathology image in fig. 16 (b), the inside of the region R16 is a processed portion, and the outside of the region R16 is an unprocessed portion. A boundary line is displayed at the boundary between them.

Further, the pathology image I2 is composed of a set of tile images in a higher layer than the pathology image I1. The region R17 of the pathology image I2 corresponds to the region R15 of the pathology image I1. Further, the pathology image I3 is composed of a set of tile images in a lower layer than the pathology image I1. The region R18 of the pathology image I3 corresponds to the region R15 of the pathology image I1.

In this case, for example, when the region R15 of the pathology image I1 is displayed, predetermined image processing may be performed in the background on the region R17 of the pathology image I2 and the region R18 of the pathology image I3 corresponding thereto, to reduce the user's waiting time during a zoom operation (movement in the Z direction). Predetermined image processing may also be performed in advance by predicting user operations in the X direction and the Y direction.
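Finding the corresponding regions in adjacent layers amounts to a coordinate scaling, since each layer in the pyramid halves the resolution of the one below it; the sketch below assumes such a factor-of-two pyramid, and the function name and region convention are illustrative.

```python
def corresponding_region(region: tuple, layer_delta: int) -> tuple:
    """Map a displayed region (x, y, w, h), in the coordinates of the
    current layer, to the same physical region `layer_delta` layers
    higher (coarser; coordinates halve per layer) or lower (negative
    delta; coordinates double). This is how regions such as R17 and R18
    corresponding to R15 could be located for background preprocessing."""
    scale = 0.5 ** layer_delta
    x, y, w, h = region
    return (x * scale, y * scale, w * scale, h * scale)
```

The background worker would then run the predetermined image processing on the tiles intersecting the returned region before the user zooms.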

In this way, according to the diagnosis support system 1 in the second embodiment, even for a pathological image composed of a plurality of tile images, by outputting identification information (for example, displaying a boundary line) for identifying a processed portion and an unprocessed portion of image processing, it is possible to improve the accuracy of pathological diagnosis using the pathological image.

Specifically, for example, when the layer of the displayed tile images is changed, by performing predetermined image processing on the tile images in the new layer and displaying the boundary line between the processed portion and the unprocessed portion, or the like, the observer can reliably distinguish the processed portion from the unprocessed portion in the pathology image.

< third embodiment >

[7. configuration of System according to third embodiment ]

Next, the diagnosis support system 1 according to the third embodiment will be described. Descriptions of items similar to those in the first embodiment will be omitted as appropriate. Fig. 17 is an overall configuration diagram of the diagnosis support system 1 according to the third embodiment. The third embodiment is different from the first embodiment in that a sound control unit 48 and a sound output unit 49 are added to the viewer 4.

The sound control unit 48 performs an output processing procedure of outputting a sound as identification information for identifying a processed portion and an unprocessed portion of image processing on the pathological image. The sound output unit 49 is a device for outputting sound, and is, for example, a speaker. For example, when the unprocessed portion is displayed on the display unit 45, the sound control unit 48 notifies this by sound.

In this way, according to the diagnosis support system 1 in the third embodiment, not only display but also sound can be used to identify a processed part and an unprocessed part of image processing on a pathological image. Therefore, the observer can more reliably distinguish between the processed portion and the unprocessed portion in the pathology image.

< other examples >

The process according to the above-described embodiment may be performed in various different forms other than the above-described embodiment.

[ display device ]

In the above-described embodiment, it is assumed that the display unit 45 is a desktop display device. However, there is no limitation in this respect, and the display unit 45 may also be a wearable device (e.g., a head-mounted display) worn by a pathologist or the like.

[ image capturing apparatus ]

Further, although in the above-described embodiments, description has been made by using a scanner as an example of an apparatus for capturing an image of an observation object, there is no limitation in this respect. For example, the apparatus for capturing an image of an observation object may be a medical image acquisition apparatus, such as an endoscope, a computed tomography (CT) apparatus, or a magnetic resonance imaging (MRI) apparatus, that captures an image of the inside of a patient's body. In this case, medical images, such as two-dimensional still images or moving images generated by an endoscope, or three-dimensional images generated by CT or MRI, are saved in the server 3.

[ Server ]

Other pathological images captured by another medical image acquisition apparatus such as an endoscope, CT, or MRI may also be stored in the server 3 in association with the pathological images generated by the scanner 2.

[ hardware configuration ]

For example, the information apparatus such as the server 3 or the viewer 4 according to the above-described embodiment is realized by a computer 1000 having a configuration as shown in fig. 18. An example of the viewer 4 according to the first embodiment will be described below. Fig. 18 is a hardware configuration diagram showing an example of a computer for realizing the function of the viewer 4.

The computer 1000 includes a CPU 1100, a RAM 1200, a Read Only Memory (ROM) 1300, a Hard Disk Drive (HDD) 1400, a communication interface 1500, and an input/output interface 1600. The components of the computer 1000 are connected by a bus 1050.

The CPU 1100 operates based on a program stored in the ROM 1300 or the HDD 1400, and controls components. For example, the CPU 1100 develops programs stored in the ROM 1300 or the HDD 1400 onto the RAM 1200, and executes processing corresponding to the various programs.

The ROM 1300 stores a boot program such as a Basic Input Output System (BIOS) executed by the CPU 1100 when the computer 1000 is started, a program depending on hardware of the computer 1000, and the like.

The HDD 1400 is a computer-readable recording medium that non-temporarily records a program executed by the CPU 1100, data used by the program, and the like. Specifically, the HDD 1400 is a recording medium that records a response generation program according to the present disclosure as an example of the program data 1450.

The communication interface 1500 is an interface for the computer 1000 to connect to an external network 1550 (e.g., the internet). For example, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.

The input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. The CPU 1100 also transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600. The input/output interface 1600 can also function as a medium interface for reading a program recorded in a predetermined computer-readable recording medium (medium) or the like. Examples of the medium include an optical recording medium such as a Digital Versatile Disk (DVD) and a phase-change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a magnetic tape medium, a magnetic recording medium, a semiconductor memory, and the like.

For example, if the computer 1000 is used as the viewer 4 according to the first embodiment, the CPU 1100 of the computer 1000 executes the diagnosis support program loaded on the RAM 1200 to realize the functions of the receiving unit 41, the decoding unit 42, the image processing unit 43, the display control unit 44, and the like.

[ others ]

All or some of the processes described as being performed automatically in the above embodiments may also be performed manually, or all or some of the processes described as being performed manually may also be performed automatically in a known manner. Further, unless otherwise specified, the information shown in the above text and drawings, including the processing procedures, specific names, various data and parameters, may be modified as necessary. For example, the various information shown in the figures is not limited to the information shown.

Further, the components of the devices shown are functional concepts and need not be physically configured as shown. That is, the specific form of distribution and integration of the devices is not limited to those shown, and all or some of them may be functionally or physically distributed or integrated in any unit according to various loads, use conditions, and the like.

Further, the above-described embodiments and modifications may be appropriately combined without inconsistency of the process.

Note that the effects described herein are merely examples and are not limiting, and other effects may be provided.

Note that the present technology can adopt the following configuration.

(1) A diagnosis support program for causing a computer to execute:

an image processing process for performing predetermined image processing on the pathological image captured by the image capturing apparatus; and

an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of image processing on the pathological image.
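As a purely illustrative sketch (not part of the claimed subject matter, all names hypothetical), the identification information of configuration (1) could be represented as a per-tile status map over the pathological image, distinguishing tiles for which the predetermined image processing has completed from those still unprocessed:

```python
# Hypothetical sketch: the pathological image is divided into tiles, and the
# identification information is a row-major map marking the processed
# portion ('P') and the unprocessed portion ('U').

def identification_info(cols, rows, processed):
    """Return a per-tile state map; `processed` is a set of (x, y) tile
    coordinates for which the predetermined image processing has completed."""
    return [["P" if (x, y) in processed else "U" for x in range(cols)]
            for y in range(rows)]
```

An output process could render such a map as an overlay on the display unit, or derive a boundary line from it as in configuration (2).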

(2) The diagnosis support program according to (1), wherein,

the output processing procedure comprises:

a boundary line as identification information is displayed at a boundary between a processed portion and an unprocessed portion displayed on a display unit.
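For illustration only (an assumption of a tile-based representation, not the specification's implementation), the boundary of configuration (2) could be derived by selecting processed tiles that touch an unprocessed tile:

```python
# Hypothetical sketch: a tile belongs to the boundary when it is processed
# and at least one of its in-bounds 4-neighbors is still unprocessed.

def boundary_tiles(cols, rows, processed):
    """Return processed tiles adjacent to an unprocessed in-bounds tile."""
    def has_unprocessed_neighbor(x, y):
        return any(0 <= nx < cols and 0 <= ny < rows
                   and (nx, ny) not in processed
                   for nx, ny in ((x + 1, y), (x - 1, y),
                                  (x, y + 1), (x, y - 1)))
    return {t for t in processed if has_unprocessed_neighbor(*t)}
```

The boundary line displayed on the display unit would then trace the edges of these tiles.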

(3) The diagnosis support program according to (2), wherein,

the output processing procedure comprises:

highlighting a newly processed part among the processed parts displayed on the display unit with at least one of the color, thickness, and flicker of the boundary line around it.

(4) The diagnosis support program according to (2) or (3), wherein,

the image processing process includes:

image processing is performed from a central portion of a region displayed on the display unit in the pathological image.
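The center-out ordering of configuration (4) can be sketched as follows (a hypothetical illustration: tiles of the displayed region are sorted by distance from the viewport center, so the part the pathologist is most likely viewing is processed first):

```python
# Hypothetical sketch: order tiles by squared distance from the center of
# the displayed region, so image processing proceeds from the center outward.

def center_out_order(cols, rows):
    """Return all (x, y) tiles of a cols x rows grid, center first."""
    cx, cy = (cols - 1) / 2, (rows - 1) / 2
    tiles = [(x, y) for y in range(rows) for x in range(cols)]
    return sorted(tiles, key=lambda t: (t[0] - cx) ** 2 + (t[1] - cy) ** 2)
```

For a 3x3 grid, the central tile (1, 1) comes first and the four corners come last.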

(5) The diagnosis support program according to (2) or (3), wherein,

the image processing process includes:

image processing is performed from a portion of the pathological image corresponding to the cursor position displayed on the display unit.

(6) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

the boundary line is displayed by a line of the first line type on the side of the processed portion and a line of the second line type on the side of the unprocessed portion.

(7) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

when the display unit enlarges and displays a part of the pathological image, the entire pathological image is displayed in a part of the display unit, and a boundary line is displayed at a boundary between a processed part and an unprocessed part in the displayed entire pathological image.

(8) The diagnosis support program according to (7), wherein,

the output processing procedure comprises:

when a portion of the pathological image displayed enlarged on the display unit is an unprocessed portion, the portion of the pathological image is highlighted.

(9) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

in the display unit, at least one of the processed portion and the unprocessed portion is indicated by a text display.

(10) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

indicating in the display unit by means of an image display that the unprocessed portion is being displayed.

(11) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

after predetermined image quality deterioration processing is performed on the unprocessed portion, display is performed on the display unit.

(12) The diagnosis support program according to (2) or (3), wherein,

the pathological image is composed of a plurality of tile images, and

the image processing process includes:

when the layer of the tile image displayed on the display unit is changed, predetermined image processing is performed on the tile image in the new layer.
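As a minimal sketch of configuration (12) (hypothetical names, assuming processed tiles are cached per layer), changing the displayed layer, i.e., the zoom level, triggers the predetermined image processing only for tiles not yet present in the new layer:

```python
# Hypothetical sketch: cache processed tiles keyed by (layer, x, y); a layer
# change re-runs the predetermined image processing only on cache misses.

processed_cache = {}  # (layer, x, y) -> processed tile data

def get_tile(layer, x, y, process, raw_tiles):
    """Return the processed tile, running `process` only on a cache miss."""
    key = (layer, x, y)
    if key not in processed_cache:
        processed_cache[key] = process(raw_tiles[key])
    return processed_cache[key]
```

Tiles already processed in a previously displayed layer thus remain available without reprocessing when the user returns to that layer.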

(13) The diagnosis support program according to (2) or (3), wherein,

the output processing procedure comprises:

when the unprocessed portion is displayed on the display unit, a notification to that effect is given by sound.

(14) A diagnostic support system comprising:

an image capture device; and

an information processing device that performs predetermined image processing on the pathological image captured by the image capturing device and outputs identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

(15) A diagnosis support method, wherein a computer executes:

an image processing process for performing predetermined image processing on the pathological image captured by the image capturing apparatus; and

an output processing process for outputting identification information for identifying a processed portion and an unprocessed portion of image processing on the pathological image.

(16) A diagnosis support system comprising an image capturing apparatus and software for processing a pathological image captured by the image capturing apparatus, wherein,

the software is software for causing an information processing apparatus to execute: image processing for performing predetermined image processing on the pathological image; and an output process for outputting identification information for identifying a processed portion and an unprocessed portion of the image processing on the pathological image.

List of reference numerals

1 diagnosis support system

2 scanner

3 server

4 viewer

21 image capturing unit

22 image processing unit

23 encoding unit

24 sending unit

31 receiving unit

32 storage unit

33 decoding unit

34 image processing unit

35 encoding unit

36 sending unit

37 tile image generating unit

41 receiving unit

42 decoding unit

43 image processing unit

44 display control unit

45 display unit

46 storage unit

47 operating unit

48 sound control unit

49 sound output unit.
