Medical image processing apparatus and method

Document No.: 53612 · Publication date: 2021-09-28

Note: This technology, "Medical image processing apparatus and method" (医用图像处理装置及方法), was devised by Takenouchi Seiya (竹之内星矢) on 2020-03-02. Its main content is as follows.

The invention provides a medical image processing apparatus and method that do not hinder an examination based on medical images and that allow the assist results based on notification information to be fully utilized. The medical image processing apparatus includes: an image acquisition unit (40) that acquires a medical image; a classification unit (42) that classifies the medical image, or a region of interest included in the medical image, based on the medical image acquired by the image acquisition unit (40); a notification information generation unit (43) that generates, from the classification result, first notification information for display and second notification information for storage that differs from the first notification information; and a recording unit (48) that stores the medical image and the second notification information. By making the first notification information for display different from the second notification information for storage, the first notification information can take a notification form recognizable only to the user (doctor) and not to the patient or others, while the second notification information can take a notification form that people other than the user can also understand.

1. A medical image processing apparatus is provided with:

an image acquisition unit that acquires a medical image;

a classification unit that classifies the medical image or a region of interest included in the medical image based on the medical image acquired by the image acquisition unit;

a notification information generation unit that generates first notification information for display and second notification information for storage different from the first notification information, based on the classification result after the classification; and

a recording unit that stores the medical image and the second notification information.

2. The medical image processing apparatus according to claim 1,

the recording unit stores a composite image obtained by combining the medical image and the second notification information.

3. The medical image processing apparatus according to claim 1,

the recording unit stores the medical image in association with the second notification information.

4. The medical image processing apparatus according to any one of claims 1 to 3,

the recording unit is constituted by a plurality of recording units,

the notification information generation unit generates a plurality of pieces of the second notification information with a total number of the plurality of recording units as an upper limit,

the plurality of recording units store the medical image and, of the plurality of pieces of second notification information, the second notification information corresponding to each recording unit.

5. The medical image processing apparatus according to any one of claims 1 to 4,

the medical image processing apparatus includes a display control unit that displays the medical image and the first notification information on a display unit.

6. The medical image processing apparatus according to claim 5,

the notification information generation unit generates a plurality of pieces of the first notification information with the total number of a plurality of display units as an upper limit,

the display control unit displays the medical image and the plurality of pieces of first notification information in association with the respective display units.

7. The medical image processing apparatus according to any one of claims 1 to 6,

the notification information generation unit generates the first notification information in a notification form that does not convey details of the classification result to the patient, and the second notification information indicating the details of the classification result.

8. The medical image processing apparatus according to any one of claims 1 to 7,

the classification result includes information on the certainty of the classified lesion,

the notification information generation unit generates the first notification information, from which the certainty information is omitted, and the second notification information, which includes the certainty information.

9. The medical image processing apparatus according to any one of claims 1 to 8,

the notification information generation unit determines a display area for the classification result from resolution information of a display unit, and generates the first notification information, in which details of the classification result are simplified to fit the size of the display area, and the second notification information, which indicates the details of the classification result.

10. The medical image processing apparatus according to any one of claims 1 to 9,

the image acquisition unit acquires the medical images in time series,

the recording unit stores the time-series medical images and the second notification information, or stores a medical image obtained by still image capturing among the time-series medical images and the second notification information.

11. A medical image processing method, comprising:

acquiring a medical image;

classifying the medical image or a region of interest included in the medical image based on the acquired medical image;

generating first notification information for display and second notification information for storage different from the first notification information, based on the classified result;

displaying the medical image and the first notification information on a display unit; and

saving the medical image and the second notification information.

Technical Field

The present invention relates to a medical image processing apparatus and method, and more particularly to a technique for automatically classifying a lesion or the like based on a medical image and notifying a doctor or a patient of the classification result.

Background

Patent document 1 describes an image processing apparatus in which a classification result given to a doctor is different from a classification result given to a patient.

The image processing apparatus described in patent document 1 classifies lesion candidate regions in each image of an image group captured in time series based on different criteria (degrees of reliability of the lesion candidates). It then generates a first video signal in which lesion candidate regions of low reliability or higher can be distinguished from other regions and outputs it to a first display device for the doctor, and generates a second video signal in which only lesion candidate regions of high reliability can be distinguished from other regions and outputs it to a second display device for the patient.

This prevents the patient from being shown information that need not be provided to the patient (information on lesion candidate regions of low reliability), and thus avoids causing the patient excessive anxiety.

Prior art documents

Patent document

Patent document 1: Japanese Patent Laid-Open Publication No. 2017-213058

Disclosure of Invention

Technical problem to be solved by the invention

An assistance system is desired that classifies a lesion as cancerous or non-cancerous from a medical image and notifies the user of the classification result.

When notifying the classification result, it is desirable to do so in a form (text content, format, position, and the like) that does not hinder the user's diagnosis. On the other hand, if the notification is in a form that clearly indicates cancer, it may cause excessive anxiety to people other than the user, particularly the patient. Conversely, if the notification is in a form recognizable only to the user performing the examination, it may be hard for others, such as experts at an academic conference, to understand.

The image processing apparatus described in patent document 1 classifies lesion candidate regions in an image based on different reliability criteria, generates a first video signal for the doctor and a second video signal for the patient in which the classification results can be visually recognized, and displays them on a first display device for the doctor and a second display device for the patient, respectively. Patent document 1 does not describe a recording unit that stores medical images in association with classification results.

The present invention has been made in view of the above circumstances, and an object thereof is to provide a medical image processing apparatus and method that generate notification information in a plurality of forms, for display and for storage, based on the classification result of a medical image, and that allow the assist results based on the notification information to be fully utilized without hindering an examination based on the medical image.

Means for solving the technical problem

In order to achieve the above object, a medical image processing apparatus according to an aspect of the present invention includes: an image acquisition unit that acquires a medical image; a classification unit that classifies the medical image or a region of interest included in the medical image based on the medical image acquired by the image acquisition unit; a notification information generation unit that generates first notification information for display and second notification information for storage different from the first notification information, based on the classification result after the classification; and a recording unit that stores the medical image and the second notification information.

According to one aspect of the present invention, a medical image or a region of interest included in the medical image is classified based on an acquired medical image, and first notification information for display and second notification information for storage different from the first notification information are generated from a classification result after the classification. The generated second notification information for storage is stored in the recording unit together with the medical image. Here, by making the first notification information for display different from the second notification information for storage, the first notification information for display can be made into a notification form (a form that cannot be recognized by a patient or the like) that is recognizable only by a user (doctor), while the second notification information for storage can be made into a notification form that can be understood by a person other than the user.
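The two-form idea above can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the class names follow the example used later in this description, while the color coding and dictionary format are invented for the sketch.

```python
# Hypothetical sketch: from one classification result, generate
# display-form information interpretable only by the examining doctor
# (an abstract color marker, no text or certainty value) and
# storage-form information that anyone reviewing the saved image
# can understand (full class name plus certainty).

def make_notifications(classification: str, certainty: float):
    # First notification information (for display): color code only.
    display_color = {"non-neoplastic": "blue",
                     "neoplastic": "yellow",
                     "others": "gray"}[classification]
    first_info = {"marker_color": display_color}

    # Second notification information (for storage): full details.
    second_info = {"class": classification, "certainty": certainty}
    return first_info, second_info

first, second = make_notifications("neoplastic", 0.87)
```

The point of the split is that `first_info` deliberately carries no text a patient could read, while `second_info` preserves everything needed for later review.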

In the medical image processing apparatus according to another aspect of the present invention, the recording unit preferably stores a synthesized image obtained by synthesizing the medical image and the second notification information.

In the medical image processing apparatus according to still another aspect of the present invention, the recording unit preferably stores the medical image in association with the second notification information.

In the medical image processing apparatus according to still another aspect of the present invention, it is preferable that the recording unit is constituted by a plurality of recording units, the notification information generating unit generates a plurality of pieces of second notification information with the total number of the plurality of recording units as an upper limit, and the plurality of recording units store the medical image and, of the plurality of pieces of second notification information, the second notification information corresponding to each recording unit.

In the medical image processing apparatus according to still another aspect of the present invention, it is preferable that the medical image processing apparatus further includes a display control unit that displays the medical image and the first notification information on the display unit.

In the medical image processing apparatus according to still another aspect of the present invention, the notification information generating unit preferably generates a plurality of pieces of first notification information with the total number of a plurality of display units as an upper limit, and the display control unit preferably displays the medical image and the plurality of pieces of first notification information in association with the respective display units. For example, the first notification information displayed on a display unit for the user can be made different from the first notification information displayed on a display unit for the patient.

In the medical image processing apparatus according to still another aspect of the present invention, the notification information generating unit preferably generates first notification information in a notification format that does not transmit the details of the classification result to the patient and second notification information indicating the details of the classification result.

In the medical image processing apparatus according to still another aspect of the present invention, it is preferable that the classification result includes information on certainty of the classified lesion, and the notification information generating unit generates first notification information in which the certainty information is omitted and second notification information including the certainty information.

In the medical image processing apparatus according to still another aspect of the present invention, the notification information generating unit preferably determines a display area for the classification result from the resolution information of the display unit, and generates first notification information in which details of the classification result are simplified in accordance with the size of the display area, and second notification information indicating the details of the classification result. Depending on the resolution of the display unit, a display area large enough to show the classification result may not be secured. In such a case, it is preferable to simplify the first notification information so that it fits within the size of the display area.
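A minimal sketch of this resolution-dependent simplification, under assumed thresholds: the resolution cutoff, the detailed text format, and the one-letter fallback code are all illustrative choices, not values from the patent.

```python
# Sketch: derive the available display area from the monitor resolution
# and simplify the first notification information when the area cannot
# hold the detailed text. The 1920x1080 threshold is illustrative.

def first_notification_text(classification: str, certainty: float,
                            width_px: int, height_px: int) -> str:
    detailed = f"{classification} ({certainty:.0%})"
    area = width_px * height_px          # display area from resolution info
    if area >= 1920 * 1080:              # enough room: show the details
        return detailed
    return classification[:1].upper()    # cramped: a one-letter code

assert first_notification_text("neoplastic", 0.87, 3840, 2160) == "neoplastic (87%)"
```

The second notification information, being for storage rather than on-screen display, would always keep the detailed form regardless of resolution.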

In the medical image processing apparatus according to still another aspect of the present invention, it is preferable that the image acquisition unit acquires time-series medical images, and the recording unit stores the time-series medical images and the second notification information, or stores a medical image obtained by still image capturing among the time-series medical images and the second notification information.

A medical image processing method according to still another aspect of the present invention includes: acquiring a medical image; classifying the medical image or a region of interest included in the medical image based on the acquired medical image; generating first notification information for display and second notification information for storage different from the first notification information, based on the classification result; displaying the medical image and the first notification information on a display unit; and saving the medical image and the second notification information.

Effects of the invention

According to the present invention, notification information is generated in a plurality of forms, for display and for storage, based on the classification result of a medical image, so that the assist results based on the notification information can be fully utilized without hindering an examination based on the medical image.

Drawings

Fig. 1 is a schematic diagram showing an overall configuration of an endoscope system 9 including a medical image processing apparatus according to the present invention.

Fig. 2 is a block diagram showing an embodiment of the medical image processing apparatus 14.

Fig. 3 is a schematic diagram showing a representative configuration example of CNN applied to the classification unit 42 of the present example.

Fig. 4 is a graph showing the relationship between the classification result and the first notification information for display and the second notification information for storage.

Fig. 5A is a diagram showing a first embodiment of a notification format of a first composite image for display.

Fig. 5B is a diagram showing the first embodiment of the notification format of the second composite image for storage.

Fig. 6A is a diagram showing a modification of the first embodiment of the notification format of the first composite image for display.

Fig. 6B is a diagram showing a modification of the first embodiment of the notification format of the second composite image for storage.

Fig. 7A is a diagram showing a second embodiment of a notification format of a first composite image for display.

Fig. 7B is a diagram showing a second embodiment of a notification format of a second composite image for storage.

Fig. 8A is a diagram showing a third embodiment of a notification format of a first composite image for display.

Fig. 8B is a diagram showing a third embodiment of a notification format of a second composite image for storage.

Fig. 9 is a graph showing the relationship between the classification result corresponding to the number of displays and recording units, the first notification information for display, and the second notification information for storage.

Fig. 10A is a diagram showing a composite image for storage 1 in which the second notification information for storage 1 is composited.

Fig. 10B is a diagram showing a composite image for storage 2 in which the second notification information for storage 2 is composited.

Fig. 11A is a diagram showing a composite image for display 1 in which the first notification information for display 1 is composited.

Fig. 11B is a diagram showing a composite image for display 2 in which the first notification information for display 2 is composited.

Fig. 12 is a flowchart showing an embodiment of a medical image processing method according to the present invention.

Detailed Description

Preferred embodiments of a medical image processing apparatus and method according to the present invention will be described below with reference to the accompanying drawings.

[ Overall Structure of endoscope System including medical image processing apparatus ]

Fig. 1 is a schematic diagram showing an overall configuration of an endoscope system 9 including a medical image processing apparatus according to the present invention.

As shown in fig. 1, the endoscope system 9 includes an endoscope scope 10 as an electronic endoscope, a light source device 11, an endoscope processor device 12, a display device 13, a medical image processing device 14, an operation unit 15, and a display 16.

The endoscope scope 10 is a device for capturing time-series medical images including a subject image, and is, for example, a flexible endoscope. The endoscope scope 10 includes: an insertion section 20 that is inserted into the subject and has a distal end and a proximal end; a hand operation unit 21 that is connected to the proximal end side of the insertion section 20 and is grasped by the operator to perform various operations; and a universal cord 22 connected to the hand operation unit 21.

The insertion section 20 is formed to be thin and long as a whole. The insertion section 20 is configured by connecting, in order from the proximal end side toward the distal end side, a flexible portion 25, a bending portion 26 that can be bent by operating the hand operation unit 21, and a distal end portion 27 in which an imaging optical system (objective lens), an imaging element 28, and the like (not shown) are disposed.

The imaging element 28 is a CMOS (complementary metal oxide semiconductor) type or CCD (charge coupled device) type imaging element. Image light of the observed site enters the imaging surface of the imaging element 28 through an observation window (not shown) that opens in the distal end surface of the distal end portion 27 and an objective lens (not shown) disposed behind the observation window. The imaging element 28 captures the image light of the observed site incident on its imaging surface (converts it into an electric signal) and outputs an imaging signal.

The hand operation unit 21 is provided with various operation members operated by the operator. Specifically, the hand operation unit 21 is provided with two types of bending operation knobs 29 for bending the bending portion 26, an air/water supply button 30 for air and water supply operations, and a suction button 31 for suction operations. The hand operation unit 21 further includes: a still image shooting instruction section 32 for instructing shooting of a still image 39 of the observed site; and a treatment instrument introduction port 33 for inserting a treatment instrument (not shown) into a treatment instrument insertion passage (not shown) that runs through the insertion section 20.

The universal cord 22 is a connection cord that connects the endoscope scope 10 to the light source device 11. The universal cord 22 contains a light guide 35, a signal cable 36, and a fluid pipe (not shown) that run through the insertion section 20. An end portion of the universal cord 22 is provided with a connector 37A connected to the light source device 11 and a connector 37B that branches from the connector 37A and is connected to the endoscope processor device 12.

By connecting the connector 37A to the light source device 11, the light guide 35 and the fluid pipe (not shown) are connected to the light source device 11. Thus, necessary illumination light, water, and gas are supplied from the light source device 11 to the endoscope scope 10 via the light guide 35 and the fluid pipe (not shown). As a result, illumination light is emitted from an illumination window (not shown) in the distal end surface of the distal end portion 27 toward the observed site. In response to a press of the air/water supply button 30, gas or water is ejected from an air/water supply nozzle (not shown) in the distal end surface of the distal end portion 27 toward an observation window (not shown) in the distal end surface.

By connecting the connector 37B to the endoscope processor device 12, the signal cable 36 is electrically connected to the endoscope processor device 12. Thereby, an image pickup signal of the observed site is output from the image pickup element 28 of the endoscope scope 10 to the endoscope processor device 12 via the signal cable 36, and a control signal is output from the endoscope processor device 12 to the endoscope scope 10.

The light source device 11 supplies illumination light to the light guide 35 of the endoscope scope 10 via the connector 37A. The illumination light is selected from special light of various wavelength bands corresponding to the purpose of observation, such as white light (light of a white wavelength band or light of a plurality of wavelength bands), light of one or a plurality of specific wavelength bands, or a combination thereof. The specific wavelength band is a band narrower than the white wavelength band.

The special light of each wavelength band includes special light for special-light images such as BLI (Blue Light Imaging or Blue LASER Imaging), LCI (Linked Color Imaging), or NBI (Narrow Band Imaging).

The illumination light for BLI is illumination light in which the proportion of violet light, which has a high absorptance in superficial blood vessels, is high and the proportion of green light, which has a high absorptance in middle-layer blood vessels, is suppressed, and it is suitable for generating an image (BLI) in which blood vessels and structures of the mucosal surface of the subject are emphasized.

The illumination light for LCI has a higher proportion of violet light than white light and is suitable for capturing subtler changes in hue than white light; it is suited to generating an image (LCI) that has undergone color emphasis processing using the red component signal, making reddish colors redder and whitish colors whiter, centered on hues near the color of the mucous membrane.

The illumination light for NBI is suitable for generating an image (NBI) in which fine changes of the irradiated surface are emphasized by narrowing the wavelength range of the emitted illumination light.

A first example of the specific wavelength band is, for example, a blue band or a green band of the visible range. The wavelength band of the first example includes a wavelength band of 390nm to 450nm or 530nm to 550nm, and the light of the first example has a peak wavelength in the wavelength band of 390nm to 450nm or 530nm to 550 nm.

A second example of the specific wavelength band is, for example, the red band of the visible range. The wavelength band of the second example includes a wavelength band of 585nm to 615nm or 610nm to 730nm, and the light of the second example has a peak wavelength in the wavelength band of 585nm to 615nm or 610nm to 730nm.

A third example of the specific wavelength band includes a wavelength band in which the absorption coefficient differs between oxidized hemoglobin and reduced hemoglobin, and the light of the third example has a peak wavelength in such a wavelength band. The wavelength band of the third example includes a wavelength band of 400±10nm, 440±10nm, 470±10nm, or 600nm to 750nm, and the light of the third example has a peak wavelength in the above-mentioned wavelength band of 400±10nm, 440±10nm, 470±10nm, or 600nm to 750nm.

A fourth example of the specific wavelength band is the wavelength band (390nm to 470nm) of excitation light that is used to observe fluorescence emitted by a fluorescent substance in a living body (fluorescence observation) and that excites the fluorescent substance.

A fifth example of the specific wavelength band is a wavelength band of infrared light. The wavelength band of the fifth example includes a wavelength band of 790nm to 820nm or 905nm to 970nm, and the light of the fifth example has a peak wavelength in the wavelength band of 790nm to 820nm or 905nm to 970 nm.
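The five examples above can be summarized as a small lookup of (low, high) ranges in nanometers, with a helper that checks whether a given peak wavelength falls inside one of a band's sub-ranges. The table merely restates the values from the text; the labels and function are illustrative.

```python
# Specific wavelength band examples from the description, as nm ranges.
# The ±10nm bands of the third example are expanded to explicit ranges.
SPECIFIC_BANDS_NM = {
    "blue/green (1st)":   [(390, 450), (530, 550)],
    "red (2nd)":          [(585, 615), (610, 730)],
    "hemoglobin (3rd)":   [(390, 410), (430, 450), (460, 480), (600, 750)],
    "fluorescence (4th)": [(390, 470)],
    "infrared (5th)":     [(790, 820), (905, 970)],
}

def peak_in_band(band: str, peak_nm: float) -> bool:
    """True if the peak wavelength lies in one of the band's sub-ranges."""
    return any(lo <= peak_nm <= hi for lo, hi in SPECIFIC_BANDS_NM[band])
```

For instance, a 440 nm peak satisfies both the first example (blue band) and the third example (a hemoglobin-sensitive band), matching the overlapping definitions in the text.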

The endoscope processor device 12 controls the operation of the endoscope scope 10 via the connector 37B and the signal cable 36. The endoscope processor device 12 also generates an image consisting of time-series frame images 38A including a subject image (also referred to as a "moving image 38") based on the imaging signal acquired from the imaging element 28 of the endoscope scope 10 via the connector 37B and the signal cable 36. When the still image shooting instruction section 32 of the hand operation unit 21 of the endoscope scope 10 is operated, the endoscope processor device 12, in parallel with generating the moving image 38, takes the frame image of the moving image 38 at the timing of the shooting instruction as a still image 39.
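The capture behavior described above amounts to tagging one frame of the ongoing stream without interrupting it. A toy sketch (frame contents and the capture trigger are invented for illustration):

```python
# Sketch: frames stream in as the moving image; operating the shooting
# instruction marks the frame at that timing as the still image, while
# the moving image continues to accumulate uninterrupted.

def run_stream(frames, capture_at: int):
    moving_image, still_image = [], None
    for i, frame in enumerate(frames):
        moving_image.append(frame)      # moving image 38 keeps flowing
        if i == capture_at:
            still_image = frame         # still image 39 at the instruction
    return moving_image, still_image

movie, still = run_stream(["f0", "f1", "f2", "f3"], capture_at=2)
```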

The moving image 38 and the still image 39 are medical images obtained by imaging the inside of the subject, that is, the inside of a living body. When the moving image 38 and the still image 39 are images obtained with light of the above-described specific wavelength band (special light), both are special light images. The endoscope processor device 12 outputs the generated moving image 38 and still image 39 to the display device 13 and the medical image processing apparatus 14.

The endoscope processor device 12 may generate (acquire) a special light image having information of the specific wavelength band based on a normal light image obtained with white light. In this case, the endoscope processor device 12 functions as a special light image acquisition unit. The endoscope processor device 12 obtains the signal of the specific wavelength band by arithmetic operations based on the RGB (Red, Green, Blue) or CMY (Cyan, Magenta, Yellow) color information included in the normal light image.

The endoscope processor device 12 may generate a feature amount image such as a known oxygen saturation image based on at least one of a normal light image obtained with white light and a special light image obtained with light of the specific wavelength band (special light). In this case, the endoscope processor device 12 functions as a feature amount image generation unit. The moving image 38 or still image 39, including the in-vivo image, the normal light image, the special light image, and the feature amount image, is a medical image obtained by imaging the human body, or by imaging measurement results, for the purpose of image-based diagnosis or examination.

The display device 13 is connected to the endoscope processor device 12, and functions as a display unit for displaying a moving image 38 and a still image 39 input from the endoscope processor device 12. The user (doctor) performs an operation of inserting and removing the insertion portion 20 while checking the moving image 38 displayed on the display device 13, and when a lesion or the like is found in the region to be observed, operates the still image shooting instruction portion 32 to perform still image shooting of the region to be observed, and performs diagnosis, biopsy, and the like.

[ medical image processing apparatus ]

The medical image processing apparatus 14 classifies a medical image being captured, or a region of interest included in the medical image, into one of two or more classes, mainly based on time-series medical images, and notifies the user of the classification result. As the operation unit 15, a keyboard, a mouse, or the like connected to a personal computer by wire or wirelessly is used, and as the display (display unit) 16, various monitors such as a liquid crystal monitor connectable to the personal computer are used.

Fig. 2 is a block diagram showing an embodiment of the medical image processing apparatus 14.

The medical image processing apparatus 14 shown in fig. 2 is mainly composed of an image acquisition unit 40, a CPU (Central Processing Unit) 41, a classification unit 42, a notification information generation unit 43, a composite image generation unit 44, a display control unit 46, a storage unit 47, and a recording unit 48.

The CPU 41 operates based on a program stored in the storage unit 47, collectively controls the image acquisition unit 40, the classification unit 42, the notification information generation unit 43, the composite image generation unit 44, the display control unit 46, and the recording unit 48, and functions as a part of each of these units.

The image acquisition unit 40 acquires, from the endoscope processor device 12 (fig. 1), an image consisting of time-series frame images 38A including a subject image (in this example, the moving image 38 captured by the endoscope scope 10), using an image input/output interface (not shown) connected by wire or wirelessly to the endoscope processor device 12. When the above-described still image 39 is captured while the endoscope scope 10 is capturing the moving image 38, the image acquisition unit 40 acquires the moving image 38 and the still image 39 from the endoscope processor device 12.

The classification unit 42 acquires the feature amount of the frame image 38A based on the time-series frame images 38A acquired by the image acquisition unit 40, and classifies each frame image 38A or the region of interest included in each frame image 38A into one of two or more classes based on the acquired feature amount.

In this example, as described later, the two or more categories are classified into one of three categories, i.e., "non-neoplastic", "neoplastic", and "others".

The notification information generating unit 43 is a part that generates first notification information for display and second notification information for storage different from the first notification information, from the frame image 38A or the classification result of the region of interest classified by the classifying unit 42. The details of the notification information generating unit 43 will be described later.

The synthetic image generating unit 44 generates a synthetic image (first synthetic image for display) in which the frame image 38A is synthesized with the first notification information for display generated based on the classification result of the frame image 38A, and a synthetic image (second synthetic image for storage) in which the frame image 38A is synthesized with the second notification information for storage generated based on the classification result of the frame image 38A.

The display control unit 46 generates image data for display based on the medical images (the moving image 38 and the still image 39) acquired by the image acquisition unit 40 and outputs the image data to the display 16, but outputs the image data of the first composite image for display preferentially to the display 16 when the first composite image for display is generated by the composite image generation unit 44.

In this way, the medical image is displayed on the display 16, and, for a medical image having a region of interest such as a lesion, the first notification information for display indicating the classification result of the medical image or the region of interest is displayed together with it (that is, the first composite image for display is displayed).

The storage unit 47 functions as a work area of the CPU41, or a storage unit that stores various programs such as an operating system and a medical image processing program, a table indicating the relationship between the classification result of the medical image and the first notification information and the second notification information corresponding to the classification result, and the like.

The recording unit 48 stores the captured moving image 38 and still image 39, but when the second composite image for storage is generated by the composite image generating unit 44, the second composite image for storage is preferentially stored.

< Classification section >

Next, an embodiment of the sorting unit 42 will be described.

The classification unit 42 in this example includes a convolutional neural network (CNN) that performs image recognition processing, and calculates a feature amount from an image (frame image 38A) based on color information in the image, gradients of pixel values, and the like. Based on the calculated feature amount, the classification unit 42 classifies the image (medical image) or the region of interest included in the image into any one of a plurality of categories, in the present example "non-tumor", "tumor", and "other".

Fig. 3 is a schematic diagram showing a representative configuration example of CNN applied to the classification unit 42 of the present example.

As shown in fig. 3, the classification unit (CNN) 42 includes an input layer 42A, an intermediate layer 42B having a plurality of sets of convolutional layers and pooling layers and a fully-connected layer, and an output layer 42C, and each layer has a structure in which a plurality of "nodes" are connected by "edges".

Each frame image 38A of the moving image 38 is sequentially input to the input layer 42A.

The intermediate layer 42B has a plurality of sets of convolutional layers and pooling layers and a fully-connected layer, and extracts feature amounts from the frame image 38A input from the input layer. The convolutional layer performs filtering processing (a convolution operation using a filter) on nearby nodes of the previous layer and acquires a "feature map". The pooling layer reduces the feature map output from the convolutional layer to obtain a new feature map. The "convolutional layer" plays the role of extracting features such as edges from the image, and the "pooling layer" plays the role of giving robustness so that the extracted features are not affected by parallel shifts and the like.

The intermediate layer 42B is not limited to the case where one convolutional layer and one pooling layer form a set; it also includes cases where convolutional layers are consecutive and cases where a normalization layer is included.

The fully-connected layer is a portion that is combined with all the nodes of the previous layer in a weighted manner and outputs a value (feature variable) converted by an activation function, and in this example, outputs a feature variable for each classification for the frame image 38A or each region of interest such as a lesion included in the frame image 38A.

The output layer 42C functioning as an inference unit converts the outputs (feature variables) from the fully-connected layer into probabilities using a softmax function, and calculates a score (probability) for each classification. In this example, in order to classify the frame image 38A or the region of interest into any of the three categories "non-neoplastic", "neoplastic", and "other", the output layer 42C outputs, as the classification result, the category having the highest score among the scores of the three categories (the total of the three scores is 100%) together with the score of that category.
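
As a rough illustration of the score calculation in the output layer 42C, the softmax conversion from feature variables to per-category scores can be sketched in Python as follows (a minimal sketch for illustration only; the function and label names are assumptions, not the apparatus's actual implementation):

```python
import math

def softmax(feature_variables):
    """Convert feature variables (logits) from the fully-connected layer into
    probabilities that sum to 1, as the output layer 42C does."""
    m = max(feature_variables)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in feature_variables]
    total = sum(exps)
    return [e / total for e in exps]

def classify(feature_variables, labels=("non-neoplastic", "neoplastic", "other")):
    """Return the category with the highest score and that score."""
    scores = softmax(feature_variables)
    best = max(range(len(scores)), key=scores.__getitem__)
    return labels[best], scores[best]
```

For example, feature variables of [0.1, 2.0, 0.3] would yield "neoplastic" with a score of about 75%, which would be reported as the classification result together with its score.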

The parameters of the filter used for each convolutional layer in the intermediate layer 42B, the weight coefficients of all the connected layers, and the like are optimized in advance by a plurality of learning data.

Fig. 4 is a graph showing the relationship between the classification result and the first notification information for display and the second notification information for storage.

In the example shown in fig. 4, when the classification result of the medical image or the region of interest included in the medical image is "non-tumorous", the first notification information for display is the character information of "NN", and the second notification information for storage is the character information of "nonneoplastic".

In addition, when the classification result is "tumorous", the first notification information for display is the character information of "N" and the second notification information for storage is the character information of "neoplastic"; when the classification result is "other", the first notification information for display is the character information of "O" and the second notification information for storage is the character information of "other".

In the example shown in fig. 4, the first notification information for display is the first character of the second notification information for storage; when the first characters of different classification results coincide, simplified character information that can at least be distinguished is used.

The first notification information may be set arbitrarily by the user, or may be automatically generated from the second notification information according to a predetermined rule. Further, a table showing the relationship between the classification result and the first notification information for display and the second notification information for storage as shown in fig. 4 may be prepared, and the user can arbitrarily set the first notification information and the second notification information on the table by using the operation unit 15.
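
The table of fig. 4 can be sketched as a simple lookup that the notification information generating unit 43 might consult (an illustrative sketch; the entries follow fig. 4, but the function and table names are assumptions):

```python
# Illustrative version of the fig. 4 table: classification result ->
# (first notification for display, second notification for storage).
NOTIFICATION_TABLE = {
    "non-neoplastic": ("NN", "nonneoplastic"),
    "neoplastic": ("N", "neoplastic"),
    "other": ("O", "other"),
}

def generate_notifications(classification_result):
    """Return (first_notification_for_display, second_notification_for_storage)."""
    return NOTIFICATION_TABLE[classification_result]
```

Because the table is data rather than code, its entries can be freely edited by the user via the operation unit 15, as described above.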

< first embodiment of the report form >

Fig. 5A and 5B are views showing a first embodiment of a notification format of a first composite image for display and a second composite image for storage, respectively.

As described above, the composite image generating unit 44 generates a first composite image for display (see fig. 5A) in which the frame image 38A and the first notification information for display generated based on the classification result of the frame image 38A are synthesized, and a second composite image for storage (see fig. 5B) in which the frame image 38A and the second notification information for storage generated based on the classification result of the frame image 38A are synthesized.

Fig. 5A and 5B show an example of a first synthetic image for display and a second synthetic image for storage in the case where the classification result of "tumor property" is obtained by the classification unit 42.

Since the first notification information for display when the classification result of "tumorous" is obtained is the character information of "N" as shown in fig. 4, the first synthesized image shown in fig. 5A is an image obtained by synthesizing the medical image and the character information of "N" indicating the classification result (result).

On the other hand, since the second notification information for storage when the classification result of "tumorous" is obtained is the character information of "neoplastic" as shown in fig. 4, the second composite image is an image obtained by combining the medical image and the character information of "neoplastic" indicating the classification result (result), as shown in fig. 5B. In the second composite image for storage of the present example, as shown in fig. 5B, information ("70%") on the certainty factor (score) of the lesion being classified as "tumorous" is also synthesized, but this certainty factor information is not synthesized in the first composite image for display.
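
The parallel generation of the two composite images can be sketched as follows (a schematic sketch: a dictionary stands in for the pixel-level composition actually performed by the composite image generating unit 44, and the names are assumptions):

```python
def generate_composite_images(frame, first_info, second_info, score=None):
    """Return (first composite image for display, second composite image for storage).

    The display composite carries only the abbreviated first notification;
    the storage composite carries the full second notification and, when
    available, the certainty score (as in fig. 5B).
    """
    display_composite = {"image": frame, "overlay": first_info}
    storage_overlay = second_info if score is None else f"{second_info} {score:.0%}"
    storage_composite = {"image": frame, "overlay": storage_overlay}
    return display_composite, storage_composite
```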

The first composite image for display is displayed on the display 16 during endoscopic diagnosis, and is made available to the user (doctor) and the patient for observation. The first composite image for display is displayed in real time but is not saved.

The user can perform an endoscopic examination based on the medical image included in the first composite image displayed on the display 16, and can flexibly use the first notification information (character information of "N" indicating the classification result) included in the first composite image for the endoscopic examination.

Further, the user can recognize the classification result of the medical image or the region of interest by the classification unit 42 from "N" of the first notification information, but a person (patient) other than the user cannot recognize what the "N" of the first notification information means. That is, "N" of the first notification information is information in a notification format in which the classification result is not directly transmitted to the patient, and is not recognized as information that causes anxiety in the patient.

In the example shown in fig. 4, when the classification result is "other", the first notification information for display is the character information of "O" and the second notification information for storage is the character information of "other", but the first notification information for display may also be the character information of "other". This is because, when the classification result is "other", even if the patient recognizes it, it does not cause excessive anxiety. That is, depending on the classification result, the first notification information for display and the second notification information for storage may be different or the same.

On the other hand, a form recognizable only to the user who performs the examination, like the "N" of the first notification information, may be difficult for experts or other doctors to understand at an academic conference or the like. Therefore, the second notification information for storage is given a notification form (in the present example, the character information of "neoplastic") that such experts and others can understand, and in the example shown in fig. 5B, information ("70%") on the certainty of the classification result is further added.

The second composite image for storage including the second notification information for storage is stored in the recording unit 48. The stored second composite image is a moving image in which the second notification information is synthesized for each frame; alternatively, only a composite image obtained by synthesizing the still image of the frame for which still-image shooting was instructed during the endoscopic examination with the second notification information for that frame may be stored in the recording unit 48.

The classification of the medical image or the region of interest by the classification unit 42 is, in the above embodiment, a classification into any one of the categories "non-tumor", "tumor", and "other", but the classification categories are not limited to this. For example, there are classifications by endoscopic findings such as the NICE (NBI International Colorectal Endoscopic) classification and the JNET (the Japan NBI Expert Team) classification. Further, there are classification by disease type (for example, hyperplastic polyp, adenoma, intramucosal cancer, highly invasive cancer, inflammatory polyp, and the like), classification by the shape, size, position, and the like of a lesion, classification by the certainty of the classification result of a lesion portion, classification by severity, and classifications combining these.

When the endoscopic image is an image captured under special light for NBI, the classification unit 42 can classify the endoscopic image by NICE classification or JNET classification. It is needless to say that the first notification information for display and the second notification information for storage corresponding to the classification result are different depending on the classification method.

The form factor of the notification information is not limited to the above-described characters and numerical values, and includes figures, colors, positions, and the like.

< modification of the first embodiment of the report form >

Fig. 6A and 6B are diagrams showing a modification of the first embodiment of the notification format of the first notification information and the second notification information, and show a first composite image for display and a second composite image for storage in a case where the classification unit 42 classifies the medical image or the region of interest into Type2 of the NICE classification.

Since the most likely pathology for "Type 2" of the NICE classification is "adenoma", the first notification information for display is set as the character information of "a", which is the first character of "adenoma". Therefore, as shown in fig. 6A, the first composite image for display is an image obtained by combining the medical image and the character information of "a".

On the other hand, the second notification information for storage when the classification result of "Type 2" of the NICE classification is obtained is the character information of "adenoma". Therefore, as shown in fig. 6B, the second composite image for storage is an image obtained by combining the medical image and the character information of "adenoma".

As described above, in the modification of the first embodiment of the report form, the first report information for display is information of a report form that is not directly transmitted to the patient as a result of classification, and the second report information for storage is information of a report form that can be understood by experts and the like, as in the first embodiment of the report form.

< second embodiment of the report form >

Fig. 7A and 7B are diagrams showing a second embodiment of the notification format of the first notification information and the second notification information, and show a first composite image for display and a second composite image for storage in a case where the classification unit 42 classifies the medical image or the region of interest into Type2 of the NICE classification.

When the classification result of "Type 2" of the NICE classification is obtained as a classification result with a high certainty of "adenoma" (high cancerousness), the first notification information for display is the character information of "adenoma" in a color (e.g., an achromatic color) that does not attract attention. Therefore, as shown in fig. 7A, the first composite image for display is an image obtained by combining the medical image and the first notification information for display (character information of "adenoma" in an unnoticeable color).

On the other hand, the second notification information for storage when the classification result of "Type 2" of the NICE classification is obtained is, for example, character information of "adenoma" in a color calling attention, such as red. Therefore, as shown in fig. 7B, the second composite image for storage is an image obtained by combining the medical image and the character information of "adenoma" in the color calling for attention.

As described above, in the second embodiment of the notification form, when a classification result with a high certainty of a lesion (high cancerousness) is obtained, the first notification information for display is character information in a color that does not attract attention, while the second notification information for storage is character information in a color that attracts attention.

Thus, even if the patient observes the first composite image displayed on the display 16, the cancerousness of the lesion or the like can be prevented from being conveyed to the patient, and causing the patient anxiety can be avoided.

The information for calling the user's attention is not limited to the color of the character information, and may also be given by numerical information indicating the certainty of a lesion, by the size and position of the character information, or the like.

< third embodiment of the report form >

Fig. 8A and 8B are diagrams showing a third embodiment of the notification format of the first notification information and the second notification information, and show a first composite image for display and a second composite image for storage in a case where the classification unit 42 classifies the medical image or the region of interest into "Type 2" of the NICE classification.

The notification information generating unit 43 acquires resolution information of a display of an output destination of the first composite image for display.

When the resolution of the display is low, if the image size of the medical image is reduced, the diagnosis is adversely affected, and therefore the medical image is preferably displayed in the image size at the time of imaging. Therefore, when the resolution of the display is low, the medical image is displayed on the screen of the display in full, and as a result, a display area in which the first notification information for display corresponding to the classification result is sufficiently displayed may not be secured.

Therefore, the notification information generating unit 43 acquires the display area of the classification result from the resolution information of the display acquired from the display, and generates the first notification information for display in which the details of the classification result are simplified in accordance with the size of the display area.

In the third embodiment of the notification form shown in fig. 8A, the first notification information for display in the case of classification as "Type 2" of the NICE classification simplifies "adenoma" to only its first character "a".

The synthetic image generating unit 44 generates a first synthetic image for display in which "a" is synthesized in the display region, with the lower right region of the screen of the display as the display region of the classification result.

On the other hand, in the third embodiment of the notification form shown in fig. 8B, since the second notification information for storage in the case of obtaining the classification result of "Type 2" of the NICE classification is not affected by the resolution of the display, the second notification information (character information of "adenoma" which is not simplified) indicating the details of the classification result is used.

The synthetic image generator 44 generates a second synthetic image for storage in which the character information of "adenoma" is synthesized substantially in the center of the right area of the medical image.

That is, the size of the first notification information for display may be limited by the resolution of the display, and in this case, the size of the first notification information can be reduced by simplifying the first notification information for display itself. On the other hand, since the second notification information for storage need not be matched to the size of the display, second notification information indicating the details of the classification result is used without reducing its size.

In the example shown in fig. 8A and 8B, the first notification information is simplified from "adenoma" to its first character "a", but the simplification of the first notification information is not limited to character information; when a plurality of pieces of information are included in the classification result, the amount of information for display may also be reduced by filtering out only the necessary information from the plurality of pieces of information.
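
The resolution-dependent simplification described above can be sketched as follows (the pixel widths here are assumed values for illustration; the real apparatus derives the display area from the acquired resolution information):

```python
def simplify_for_display(full_label, area_width_px, char_width_px=16):
    """Return the first notification information for display, abbreviating the
    label (e.g. "adenoma" -> "a") when it does not fit in the display area
    reserved for the classification result."""
    max_chars = area_width_px // char_width_px
    return full_label if len(full_label) <= max_chars else full_label[0]
```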

< notification format corresponding to the number of displays and recording units >

Fig. 9 is a graph showing the relationship between the classification result corresponding to the number of displays and recording units, and the first notification information for display and the second notification information for storage, and shows a case where two displays and two recording units are provided, respectively.

Depending on the facility, there are cases where separate displays are used for the patient and for the user, and cases where the examination is performed while both observe the same display. In addition, there are cases where, at the time of recording, the information to be stored is to be differentiated for setting purposes or for learning purposes.

The two pieces of first notification information for display and the two pieces of second notification information for storage shown in fig. 9 correspond to the cases where the classification unit 42 classifies the medical image or the region of interest into the classification results "1", "2", "3", "4", ....

The classification results "1", "2", and "3" respectively indicate the cases of classification into "Type 1", "Type 2", and "Type 3" of the NICE classification, and the classification result "4" indicates the case of classification into "adenoma" with a certainty of "70%".

The second notification information of storage 1 corresponding to the classification results "1", "2", and "3" is "Hyperplastic", "Adenoma", and "Malignancy", the most probable pathologies indicated by "Type 1", "Type 2", and "Type 3" of the NICE classification, and that corresponding to the classification result "4" is "Adenoma"; the first notification information of display 1 corresponding to these is the initial characters "H", "A", "M", and "A" of this character information.

The second notification information of storage 2 corresponding to the classification results "1", "2", "3", and "4" is "NICE classification Type 1", "NICE classification Type 2", "NICE classification Type 3", and "Adenoma 70%", and the first notification information of display 2 corresponding thereto is "Type A", "Type B", "Type C", and achromatic "Adenoma".

Here, the first notification information of display 2 corresponding to the second notification information of storage 2 for the classification results "1", "2", and "3" is "Type A", "Type B", and "Type C"; since the classification method behind "Type A", "Type B", and "Type C" is unknown, people (patients) other than the user cannot recognize the classification results. The first notification information of display 2 corresponding to the second notification information of storage 2 for the classification result "4" is achromatic "Adenoma" with the certainty value omitted, so that the patient is not made anxious.

In the example shown in fig. 9, two displays and two recording units are provided, but the number of at least one of the displays and the recording units may be two or more. When a plurality of displays are provided, the notification information generating unit 43 generates a plurality of pieces of first notification information with the total number of displays as an upper limit, and when a plurality of recording units are provided, it generates a plurality of pieces of second notification information with the total number of recording units as an upper limit.
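
For the configuration of fig. 9, per-output notification tables can be sketched as follows (the entries follow fig. 9 for classification results "1" to "3"; the function and table names are assumptions for illustration):

```python
# One notification table per display and per recording unit (fig. 9, results "1"-"3").
DISPLAY_TABLES = [
    {"1": "H", "2": "A", "3": "M"},                 # display 1: initials only
    {"1": "Type A", "2": "Type B", "3": "Type C"},  # display 2: classification method hidden
]
STORAGE_TABLES = [
    {"1": "Hyperplastic", "2": "Adenoma", "3": "Malignancy"},  # recording unit 1
    {"1": "NICE classification Type 1", "2": "NICE classification Type 2",
     "3": "NICE classification Type 3"},                       # recording unit 2
]

def generate_notification_sets(result, display_tables=DISPLAY_TABLES,
                               storage_tables=STORAGE_TABLES):
    """Generate one first notification per display and one second notification
    per recording unit (their counts act as the upper limits)."""
    return ([t[result] for t in display_tables],
            [t[result] for t in storage_tables])
```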

Fig. 10A and 10B show the composite images of the storage 1 and 2 in which the second notification information of the storage 1 and 2 is combined, respectively, the composite image of the storage 1 is stored in the recording unit 48A, and the composite image of the storage 2 is stored in the recording unit 48B. The recording units 48A and 48B may be physically different recording units, or may be different storage areas in the recording unit 48 (fig. 2).

Fig. 11A and 11B show composite images of display devices 1 and 2 in which the first notification information for display devices 1 and 2 is composited, respectively, the composite image for display device 1 being displayed on the display device 16A, and the composite image for display device 2 being displayed on the display device 16B.

[ medical image processing method ]

Fig. 12 is a flowchart showing an embodiment of the medical image processing method according to the present invention, and shows a processing procedure of each part of the medical image processing apparatus 14 shown in fig. 2.

In fig. 12, the image acquisition unit 40 acquires a medical image of one frame of time-series medical images to be processed from the endoscope processor device 12 (step S10).

The classification unit 42 obtains the feature amount of the medical image from the medical image acquired in step S10, and classifies the medical image or the region of interest included in the medical image into any one of two or more classes based on the feature amount (step S12). For example, in the example shown in fig. 4, the medical image or the region of interest is classified into any one of a plurality of categories such as "non-tumor", "tumor", and "other".

The notification information generator 43 generates first notification information for display and second notification information for storage based on the classification result of the medical image or the region of interest by the classifier 42 (step S14). As shown in fig. 4, the second notification information for storage is the character information of "nonneoplastic", "neoplastic", and "other" corresponding to the classification results "non-neoplastic", "neoplastic", and "other", whereas the first notification information for display is the abbreviated "NN", "N", and "O"; the classification result indicated by the first notification information for display cannot be recognized by a person (patient) other than the user.

The synthetic image generating unit 44 generates a first synthetic image for display, which is obtained by synthesizing the medical image and the first notification information, and a second synthetic image for storage, which is obtained by synthesizing the medical image and the second notification information (step S16).

The display controller 46 displays the first synthetic image for display on the display 16 (step S18), and the recording unit 48 stores the second synthetic image for storage (step S20).

Next, the CPU41 determines whether or not there is an instruction to end diagnosis (or to end imaging) based on the endoscopic image from the operation unit 15 (step S22). When there is No instruction to end the diagnosis (in the case of "No"), the CPU41 proceeds to step S10, executes processing for the medical image of the next frame through steps S10 to S22, and ends the processing when there is an instruction to end the diagnosis (in the case of "Yes").
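
One iteration of the loop of fig. 12 (steps S10 to S20) can be sketched with the individual stages injected as callables (a structural sketch only; the stage names are assumptions, not the apparatus's actual interfaces):

```python
def process_frame(acquire, classify, generate_notifications, compose, show, record):
    """Run steps S10-S20 of fig. 12 for a single frame."""
    frame = acquire()                                         # S10: acquire medical image
    result = classify(frame)                                  # S12: classify image / region of interest
    first_info, second_info = generate_notifications(result)  # S14: generate notification information
    display_image, storage_image = compose(frame, first_info, second_info)  # S16
    show(display_image)                                       # S18: display first composite image
    record(storage_image)                                     # S20: store second composite image
    return result
```

Repeating this per frame until the end-of-diagnosis instruction corresponds to the loop back from step S22 to step S10.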

[ others ]

The classification unit according to the present invention is not limited to one that classifies a medical image or a region of interest included in the medical image by a learner such as a CNN. The classification unit may detect the region of interest by analyzing, through image processing, feature quantities such as color, gradient of pixel values, shape, and size in the medical image, and may classify the medical image or the region of interest included in the medical image into any one of two or more categories based on the detected feature quantities of the region of interest; such image processing may also be used in combination with a learner.
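
As a toy illustration of such handcrafted feature quantities, a mean intensity and a mean absolute horizontal gradient can be computed as follows (a deliberately simplified sketch; a real detector would also use color channels, shape, size, and the like):

```python
def simple_features(image):
    """Compute (mean intensity, mean absolute horizontal gradient) from a
    grayscale image given as a list of rows of pixel values."""
    pixels = [p for row in image for p in row]
    mean_intensity = sum(pixels) / len(pixels)
    gradients = [abs(row[i + 1] - row[i])
                 for row in image for i in range(len(row) - 1)]
    mean_gradient = sum(gradients) / len(gradients) if gradients else 0.0
    return mean_intensity, mean_gradient
```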

The medical image processing apparatus 14 may output the composite image for display to the endoscope processor apparatus 12 without providing the display control unit 46, and a display control unit (not shown) included in the endoscope processor apparatus 12 may display the composite image for display on the display device 13.

In the above embodiment, the endoscope processor apparatus 12 and the medical image processing apparatus 14 are provided separately, but the endoscope processor apparatus 12 and the medical image processing apparatus 14 may be integrated. That is, the endoscope processor device 12 may be provided with a function as the medical image processing device 14.

In the above embodiment, the composite image generating unit 44 generates the second composite image in which the medical image and the second notification information for storage are combined, but the present invention is not limited thereto, and the medical image and the second notification information may be stored in association with each other. In this case, the second notification information is not stored as image information but may be, for example, text information added to the medical image as accessory information. By storing the second notification information as text information, it can be edited later. In addition, when the medical image is a moving image, each frame of the moving image can be recorded in association with the second notification information by adding the frame number of the moving image or the time stamp of the frame to the second notification information corresponding to each frame.
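
Storing the second notification information as editable text associated with each frame can be sketched as follows (the record layout is an assumption for illustration; a real implementation might instead use a time stamp per frame):

```python
def annotate_moving_image(notifications_per_frame):
    """Associate each frame's second notification information (as text) with
    its frame number, so that the moving image and the notifications can be
    stored in association and the text edited later."""
    return [{"frame": number, "notification": text}
            for number, text in enumerate(notifications_per_frame, start=1)]
```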

The hardware configuration for executing the various controls of the medical image processing apparatus 14 according to the above embodiment is realized by various processors as described below. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as various control units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.

One processing unit may be constituted by one of these various processors, or may be constituted by two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). Further, a plurality of control units may be configured by one processor. As a first example of configuring a plurality of control units with one processor, as typified by a computer such as a client or a server, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of control units. As a second example, as typified by a System on Chip (SoC), a processor is used that realizes the functions of the entire system including the plurality of control units with a single IC (Integrated Circuit) chip. In this manner, the various control units are configured using one or more of the above various processors as a hardware configuration.

In the above-described embodiment, the time-series images or still images captured by the endoscope 10 are provided as medical images to be processed, but the present invention is not limited thereto, and may be medical images captured by other medical equipment such as an ultrasonic diagnostic apparatus and an X-ray imaging apparatus.

It is needless to say that the present invention is not limited to the above embodiments, and various modifications can be made without departing from the spirit of the present invention.

Description of the symbols

9 endoscope system

10 endoscope viewer

11 light source device

12 endoscope processor device

13 display device

14 medical image processing apparatus

15 operating part

16. 16A, 16B display

20 insertion part

21 hand-side operation part

22 universal cord

25 soft part

26 bending part

27 tip end portion

28 image pickup element

29 bending operation knob

30 air and water supply button

31 attraction button

32 still image shooting instruction unit

33 introduction port for treatment instrument

35 light guide

36 signal cable

37A connector

37B connector

38 moving image

38A frame image

39 still image

40 image acquisition part

41 CPU

42 classification unit

42A input layer

42B intermediate layer

42C output layer

43 report information generating part

44 composite image generating part

46 display control part

47 storage part

48. 48A, 48B recording part

S10-S22
