Medical image diagnosis apparatus, medical image processing apparatus, and medical image processing method


Note: This technology, "Medical image diagnosis apparatus, medical image processing apparatus, and medical image processing method" (医用图像诊断装置、医用图像处理装置及医用图像处理方法), was devised by 五十岚悠, 渡边正毅, 本庄泰德, 今村智久, and 川岸哲也, filed 2021-05-28. Abstract: The embodiment relates to a medical image diagnostic apparatus, a medical image processing apparatus, and a medical image processing method. The purpose is to improve the accuracy of fluctuation evaluation. A medical image diagnostic apparatus according to an embodiment includes a collection unit, a calculation unit, and an output unit. The collection unit collects signals from a subject over time. The calculation unit calculates a 1st similarity indicating the inter-frame similarity of the signals for each of a plurality of frames and a plurality of positions, and calculates a 2nd similarity indicating the similarity, between the plurality of positions, of the change over time of the 1st similarity. The output unit outputs the 2nd similarity.

1. A medical image diagnostic apparatus, comprising:

a collection unit that collects signals from a subject over time;

a calculation unit that calculates, for each of a plurality of frames and a plurality of positions, a 1st similarity indicating a similarity of the signals between frames, and calculates a 2nd similarity indicating a similarity, between the plurality of positions, of the change over time of the 1st similarity; and

an output unit that outputs the 2nd similarity.

2. The medical image diagnostic apparatus according to claim 1,

the calculation unit generates, as the change over time of the 1st similarity, a correlation curve indicating a change of the 1st similarity in the frame direction, and calculates, as the 2nd similarity, a similarity between the correlation curve and a correlation curve at a different position.

3. The medical image diagnostic apparatus according to claim 1,

the calculation unit calculates, as the 2nd similarity, a similarity between the change over time of the 1st similarity calculated for a 1st position and the change over time of the 1st similarity calculated for a 2nd position, the 2nd position being included in a predetermined range in the vicinity of the 1st position.

4. The medical image diagnostic apparatus according to claim 3,

the calculation unit variably controls the predetermined range according to an analysis object.

5. The medical image diagnostic apparatus according to claim 1,

the calculation unit sets an analysis region for calculating the 1st similarity, and calculates the 1st similarity for a position included in the analysis region.

6. The medical image diagnostic apparatus according to claim 5,

the calculation unit variably controls the analysis area according to an analysis object.

7. The medical image diagnostic apparatus according to claim 1,

the calculation unit sets a comparison area corresponding to a plurality of pixels at a corresponding position in each of a plurality of frames, and compares the plurality of pixels in the comparison area between frames to calculate the 1st similarity.

8. The medical image diagnostic apparatus according to claim 7,

the calculation unit variably controls the comparison area according to an analysis object.

9. The medical image diagnostic apparatus according to claim 1,

the calculation unit calculates the 2nd similarity for each of a plurality of positions; and

the output unit generates and outputs an image showing the distribution of the 2nd similarity.

10. The medical image diagnostic apparatus according to claim 1,

the calculation unit calculates the 2nd similarity for each of a plurality of positions and a plurality of frames; and

the output unit generates and outputs an image showing a distribution of the 2nd similarity for each of the plurality of frames.

11. The medical image diagnostic apparatus according to claim 1,

the calculation unit calculates the 1st similarity for a new frame every time the signal is collected, and calculates the 2nd similarity based on the 1st similarity calculated for a predetermined number of frames from the new frame.

12. The medical image diagnostic apparatus according to claim 1,

the calculation unit calculates the 1st similarity for a new frame each time the signal is collected, and calculates the 2nd similarity based on the 1st similarity calculated for a predetermined number of frames each time the 1st similarity has been calculated for the predetermined number of frames.

13. The medical image diagnostic apparatus according to claim 11,

the calculation unit variably controls the predetermined number of frames according to an analysis object.

14. A medical image processing apparatus, comprising:

a calculation unit that calculates, for each of a plurality of frames and a plurality of positions, a 1st similarity indicating a similarity between frames of signals collected from a subject over time, and calculates a 2nd similarity indicating a similarity, between the plurality of positions, of the change over time of the 1st similarity; and

an output unit that outputs the 2nd similarity.

15. A medical image processing method, comprising the steps of:

calculating, for each of a plurality of frames and a plurality of positions, a 1st similarity indicating a similarity between frames of signals collected from a subject over time, and calculating a 2nd similarity indicating a similarity, between the plurality of positions, of the change over time of the 1st similarity; and

the 2 nd similarity described above is output.

Technical Field

Embodiments relate to a medical image diagnostic apparatus, a medical image processing apparatus, and a medical image processing method.

Background

In some cases, diagnosis using medical images involves evaluating fluctuation. For example, a hemangioma, a type of benign tumor, is known to appear as fluctuation in medical images. Therefore, by evaluating the fluctuation of a site suspected of being a tumor, it is possible to determine whether the site is a hemangioma.

Disclosure of Invention

An object of the invention is to improve the accuracy of fluctuation evaluation.

A medical image diagnostic apparatus according to the present invention includes a collection unit, a calculation unit, and an output unit. The collection unit collects signals from the subject over time. The calculation unit calculates a 1st similarity indicating the inter-frame similarity of the signals for each of a plurality of frames and a plurality of positions, and calculates a 2nd similarity indicating the similarity, between the plurality of positions, of the change over time of the 1st similarity. The output unit outputs the 2nd similarity.

Technical effects

According to the medical image diagnostic apparatus of the embodiment, the accuracy of fluctuation evaluation can be improved.

Drawings

Fig. 1 is a block diagram showing an example of the configuration of an ultrasonic diagnostic apparatus according to embodiment 1.

Fig. 2 is a diagram showing an example of an ultrasonic image according to embodiment 1.

Fig. 3 is a diagram showing an example of the 1st similarity according to embodiment 1.

Fig. 4A is a diagram for explaining the 2nd similarity calculation processing according to embodiment 1.

Fig. 4B is a diagram for explaining the 2nd similarity calculation processing according to embodiment 1.

Fig. 4C is a diagram for explaining the 2nd similarity calculation processing according to embodiment 1.

Fig. 5A is a diagram showing an example of a correlation curve according to embodiment 1.

Fig. 5B is a diagram showing an example of a correlation curve according to embodiment 1.

Fig. 6 is a diagram showing an example of the 2nd similarity calculation processing according to embodiment 1.

Fig. 7 is a diagram showing an example of a color image according to embodiment 1.

Fig. 8 is a diagram showing an example of the color image generation processing according to embodiment 1.

Fig. 9 is a diagram showing an example of the color image generation processing according to embodiment 1.

Fig. 10 is a flowchart for explaining a series of flows of processing performed by the ultrasonic diagnostic apparatus according to embodiment 1.

Fig. 11 is a diagram showing an example of the color image generation processing according to embodiment 2.

Fig. 12 is a block diagram showing an example of the configuration of the medical image processing system according to embodiment 2.

Detailed Description

Hereinafter, embodiments of the medical image diagnostic apparatus and the medical image processing apparatus will be described in detail with reference to the drawings.

In the present embodiment, an ultrasonic diagnostic apparatus 100 shown in fig. 1 will be described as an example of a medical image diagnostic apparatus. Fig. 1 is a block diagram showing an example of the configuration of an ultrasonic diagnostic apparatus 100 according to embodiment 1. For example, the ultrasonic diagnostic apparatus 100 has an apparatus main body 110, an ultrasonic probe 120, an input interface 130, and a display 140. The ultrasonic probe 120, the input interface 130, and the display 140 are communicably connected to the apparatus main body 110. The subject P is not included in the configuration of the ultrasonic diagnostic apparatus 100.

The ultrasonic probe 120 includes a plurality of transducers (for example, piezoelectric transducers) that generate ultrasonic waves based on a drive signal supplied from a transmission/reception circuit 111 included in the device main body 110, which will be described later. The plurality of transducers of the ultrasonic probe 120 receive reflected waves from the subject P and convert the reflected waves into electrical signals. The ultrasonic probe 120 includes a matching layer provided on the transducer, a backing material for preventing propagation of ultrasonic waves backward from the transducer, and the like.

When an ultrasonic wave is transmitted from the ultrasonic probe 120 to the subject P, the transmitted ultrasonic wave is successively reflected at surfaces of acoustic impedance discontinuity in the body tissue of the subject P and is received as a reflected wave signal (echo signal) by the plurality of transducers included in the ultrasonic probe 120. The amplitude of the received reflected wave signal depends on the difference in acoustic impedance at the discontinuity surface at which the ultrasonic wave is reflected. When an ultrasonic pulse is reflected by the surface of a moving blood flow, a heart wall, or the like, the reflected wave signal undergoes a frequency shift, due to the Doppler effect, that depends on the velocity component of the moving body in the ultrasonic wave transmission direction.

The type of the ultrasonic probe 120 is not particularly limited. For example, the ultrasonic probe 120 may be a one-dimensional ultrasonic probe in which a plurality of piezoelectric transducers are arranged in a row, a one-dimensional ultrasonic probe in which a plurality of piezoelectric transducers arranged in a row are mechanically oscillated, or a two-dimensional ultrasonic probe in which a plurality of piezoelectric transducers are two-dimensionally arranged in a grid.

The input interface 130 receives various input operations from a user, converts the received input operations into electrical signals, and outputs them to the apparatus main body 110. For example, the input interface 130 is implemented by a mouse, a keyboard, a trackball, a switch, a button, a joystick, a touch pad on which an input operation is performed by touching the operation surface, a touch screen in which a display screen and a touch pad are integrated, a non-contact input circuit using an optical sensor, an audio input circuit, or the like. The input interface 130 may be a tablet terminal or the like capable of wireless communication with the apparatus main body 110. The input interface 130 may also be a circuit that receives input operations from the user by motion capture; for example, by processing signals acquired via a tracker and captured images of the user, the input interface 130 can receive the user's body movement and line of sight as input operations. The input interface 130 is not limited to physical operation components such as a mouse and a keyboard. For example, a processing circuit that receives an electrical signal corresponding to an input operation from an external input device provided separately from the apparatus main body 110 and outputs the electrical signal to the apparatus main body 110 is also included among examples of the input interface 130.

The display 140 displays various information. For example, the display 140 displays an ultrasonic image collected from the subject P under the control of the processing circuit 114. In addition, for example, the display 140 displays various processing results of the processing circuit 114. The processing performed by the processing circuit 114 will be described later. For example, the display 140 displays a GUI (Graphical User Interface) for receiving various instructions and settings from the user via the input interface 130. For example, the display 140 is a liquid crystal display or a CRT (Cathode Ray Tube) display. The display 140 may be a desktop type, or may be configured by a tablet terminal or the like capable of wireless communication with the apparatus main body 110.

In addition, although the ultrasonic diagnostic apparatus 100 is described in fig. 1 as including the display 140, the ultrasonic diagnostic apparatus 100 may include a projector instead of, or in addition to, the display 140. The projector may project images onto a screen, a wall, a floor, the body surface of the subject P, or the like under the control of the processing circuit 114. For example, the projector may project images onto an arbitrary plane, object, space, or the like by projection mapping.

The apparatus main body 110 is an apparatus that collects signals from the subject P via the ultrasonic probe 120. The apparatus main body 110 can also generate an ultrasonic image based on the signals collected from the subject P. For example, the apparatus main body 110 has a transmission/reception circuit 111, a signal processing circuit 112, a memory 113, and a processing circuit 114. The transmission/reception circuit 111, the signal processing circuit 112, the memory 113, and the processing circuit 114 are communicably connected to each other.

The transmission/reception circuit 111 has a pulse generator, a transmission delay unit, a pulser, and the like, and supplies a drive signal to the ultrasonic probe 120. The pulse generator repeatedly generates rate pulses for forming transmission ultrasonic waves at a predetermined rate frequency. The transmission delay unit focuses the ultrasonic waves generated from the ultrasonic probe 120 into a beam and gives each rate pulse generated by the pulse generator the delay time for each piezoelectric transducer necessary for determining the transmission directivity. The pulser applies a drive signal (drive pulse) to the ultrasonic probe 120 at a timing based on the rate pulse. That is, by changing the delay time given to each rate pulse, the transmission delay unit arbitrarily adjusts the transmission direction of the ultrasonic waves transmitted from the piezoelectric transducer surface.

The transmission/reception circuit 111 has a function of instantaneously changing the transmission frequency, the transmission drive voltage, and the like in order to execute a predetermined scan sequence based on an instruction from the processing circuit 114 described later. In particular, the transmission drive voltage is changed by a linear-amplifier-type transmission circuit capable of instantaneously switching its value, or by a mechanism that electrically switches between a plurality of power supply units.

The transmission/reception circuit 111 also includes a preamplifier, an A/D (Analog/Digital) converter, a reception delay unit, an adder, and the like, and generates reflected wave data by performing various processing on the reflected wave signals received by the ultrasonic probe 120. The preamplifier amplifies the reflected wave signal for each channel. The A/D converter performs A/D conversion on the amplified reflected wave signals. The reception delay unit gives the delay time necessary for determining the reception directivity. The adder performs addition processing on the reflected wave signals processed by the reception delay unit to generate reflected wave data. Through the addition processing by the adder, the reflection component from the direction corresponding to the reception directivity of the reflected wave signals is emphasized, and a synthesized beam of ultrasonic transmission and reception is formed according to the reception directivity and the transmission directivity.

When scanning a two-dimensional region of the subject P, the transmission/reception circuit 111 transmits an ultrasonic beam in a two-dimensional direction from the ultrasonic probe 120 and generates two-dimensional reflected wave data from the reflected wave signals received by the ultrasonic probe 120. When scanning a three-dimensional region of the subject P, the transmission/reception circuit 111 transmits an ultrasonic beam in a three-dimensional direction from the ultrasonic probe 120 and generates three-dimensional reflected wave data from the reflected wave signals received by the ultrasonic probe 120.

The signal processing circuit 112 performs, for example, logarithmic amplification, envelope detection processing, and the like on the reflected wave data received from the transmission/reception circuit 111, and generates data (B-mode data) in which the signal intensity at each sample point is expressed as a luminance level. The B-mode data generated by the signal processing circuit 112 is output to the processing circuit 114.

The signal processing circuit 112 also generates, from the reflected wave data received from the transmission/reception circuit 111, data (Doppler data) obtained by extracting motion information of moving bodies based on the Doppler effect at each sample point in the scan area. Specifically, the signal processing circuit 112 performs frequency analysis of velocity information on the reflected wave data, extracts blood flow, tissue, and contrast agent echo components based on the Doppler effect, and generates data (Doppler data) in which moving-body information such as average velocity, variance, and power is extracted at multiple points. The moving body is, for example, blood flow, tissue such as the heart wall, or a contrast agent. The motion information (blood flow information) obtained by the signal processing circuit 112 is output to the processing circuit 114. The Doppler data can be displayed in color as, for example, an average velocity image, a variance image, a power image, or a combination thereof.

The memory 113 is implemented by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disc, or the like. For example, the memory 113 stores programs for realizing the functions of the circuits included in the ultrasonic diagnostic apparatus 100. The memory 113 also stores various data collected by the ultrasonic diagnostic apparatus 100. For example, the memory 113 stores the B-mode data and Doppler data generated by the signal processing circuit 112. Further, for example, the memory 113 stores the ultrasonic images generated by the processing circuit 114 based on the B-mode data and Doppler data. In addition, the memory 113 stores various data such as diagnosis information (for example, patient ID and doctor's findings), diagnosis protocols, and body marks. The memory 113 may also be implemented by a server group (cloud) connected to the ultrasonic diagnostic apparatus 100 via a network.

The processing circuit 114 controls the overall operation of the ultrasonic diagnostic apparatus 100 by executing a control function 114a, a collection function 114b, a calculation function 114c, and an output function 114d. Here, the collection function 114b is an example of the collection unit. The calculation function 114c is an example of the calculation unit. The output function 114d is an example of the output unit.

For example, the processing circuit 114 reads and executes a program corresponding to the control function 114a from the memory 113, thereby controlling various functions such as the collection function 114b, the calculation function 114c, and the output function 114d based on various input operations received from the user via the input interface 130.

Further, for example, the processing circuit 114 reads out and executes a program corresponding to the collection function 114b from the memory 113, thereby collecting signals from the subject P. For example, the collection function 114b collects B-mode data and Doppler data from the subject P by controlling the transmission/reception circuit 111 and the signal processing circuit 112.

The collection function 114b may also perform processing for generating an ultrasonic image based on the signals collected from the subject P. For example, the collection function 114b generates, based on the B-mode data, a B-mode image in which the intensity of the reflected wave is expressed by brightness. Further, the collection function 114b generates, for example, a Doppler image representing moving-body information based on the Doppler data. The Doppler image is velocity image data, variance image data, power image data, or image data combining these.

For example, the collection function 114b generates an ultrasonic image by scan-converting the B-mode data or the Doppler data. That is, the collection function 114b converts (scan-converts) the scan line signal sequence of the ultrasonic scan into a scan line signal sequence of a video format typified by television, thereby generating an ultrasonic image. For example, the collection function 114b generates the ultrasonic image by performing coordinate transformation according to the scanning pattern of the ultrasonic probe 120.
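As a rough illustration of this scan conversion step, the following is a minimal nearest-neighbor sketch in Python/NumPy for a sector/convex geometry; the function name, the assumption of uniformly spaced depth samples, and the output size are illustrative assumptions, not the apparatus's actual implementation.

```python
import numpy as np

def scan_convert(polar, depths, angles, out_hw=(512, 512)):
    # polar  : (n_samples, n_lines) beam-space data (one sample per depth/line)
    # depths : depths of the samples in mm, assumed uniform from 0 to depths[-1]
    # angles : steering angle of each scan line in radians, ascending
    H, W = out_hw
    max_d = depths[-1]
    ys = np.linspace(0.0, max_d, H)[:, None]      # axial position of each row
    xs = np.linspace(-max_d, max_d, W)[None, :]   # lateral position of each column
    d = np.hypot(xs, ys)                          # depth of each output pixel
    a = np.arctan2(xs, ys)                        # angle of each pixel from the probe axis
    # map (depth, angle) back to the nearest sample/line indices
    di = np.clip(np.round(d / max_d * (len(depths) - 1)).astype(int), 0, len(depths) - 1)
    ai = np.round((a - angles[0]) / (angles[-1] - angles[0]) * (len(angles) - 1)).astype(int)
    valid = (ai >= 0) & (ai < len(angles)) & (d <= max_d)
    out = np.zeros(out_hw, dtype=polar.dtype)
    out[valid] = polar[di[valid], ai[valid]]      # nearest-neighbor lookup
    return out
```

In practice, bilinear interpolation between neighboring scan lines and samples would typically replace the nearest-neighbor lookup to avoid blocky artifacts.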

The collection function 114b may also perform various image processing on the ultrasonic image. For example, using ultrasonic images of a plurality of frames, the collection function 114b performs image processing that regenerates an average-luminance image (smoothing processing), image processing using a differential filter within the image (edge enhancement processing), and the like. The collection function 114b also combines accompanying information (character information of various parameters, scales, body marks, and the like) with the ultrasonic image. For example, when three-dimensional image data (volume data) is generated as the ultrasonic image, the collection function 114b performs rendering processing on the volume data to generate a two-dimensional image for display.

For example, the processing circuit 114 reads out and executes a program corresponding to the calculation function 114c from the memory 113, thereby calculating a 1st similarity indicating the inter-frame similarity of the signals collected from the subject P for each of a plurality of frames and a plurality of positions, and further calculating a 2nd similarity indicating the similarity, between positions, of the change over time of the 1st similarity. Further, for example, the processing circuit 114 reads out and executes a program corresponding to the output function 114d from the memory 113, thereby outputting the 2nd similarity calculated by the calculation function 114c. For example, the output function 114d controls display on the display 140 or data transmission via a network. The processing of the calculation function 114c and the output function 114d will be described later.

In the following description, the "similarity" such as the 1 st similarity and the 2 nd similarity may be either an index indicating a degree of similarity or an index indicating a degree of dissimilarity. That is, the similarity may be defined such that the value is larger as the similarity is larger, or may be defined such that the value is smaller as the similarity is larger.

In the ultrasonic diagnostic apparatus 100 shown in fig. 1, each processing function is stored in the memory 113 in the form of a program executable by a computer. The transmission/reception circuit 111, the signal processing circuit 112, and the processing circuit 114 are processors that read out and execute programs from the memory 113 to realize functions corresponding to the respective programs. In other words, the transmission/reception circuit 111, the signal processing circuit 112, and the processing circuit 114 that read the program have functions corresponding to the read program.

In addition, although the control function 114a, the collection function 114b, the calculation function 114c, and the output function 114d are described as being realized by a single processing circuit 114 in fig. 1, the processing circuit 114 may be configured by combining a plurality of independent processors, and the functions may be realized by executing programs by the respective processors. In addition, the processing functions of the processing circuit 114 can also be implemented by appropriately distributing or combining the processing functions into a single or multiple processing circuits.

Further, the processing circuit 114 may also realize functions by a processor of an external device connected via a network. For example, the processing circuit 114 reads out and executes a program corresponding to each function from the memory 113, and realizes each function shown in fig. 1 by using a server group (cloud) connected to the ultrasound diagnostic apparatus 100 via a network as a computing resource.

The configuration example of the ultrasonic diagnostic apparatus 100 has been explained above. Under this configuration, the ultrasonic diagnostic apparatus 100 improves the accuracy of fluctuation evaluation through the processing performed by the processing circuit 114.

First, the collection function 114b collects signals from the subject P over time by controlling the transmission/reception circuit 111 and the signal processing circuit 112. For example, the collection function 114b collects signals over time while performing image generation processing, and sequentially generates the B-mode image I11, the B-mode image I12, the B-mode image I13, and the B-mode image I14 shown in fig. 2. That is, in the case shown in fig. 2, the collection function 114b collects signals of a plurality of frames over time and generates an ultrasonic image for each frame. Fig. 2 is a diagram showing an example of an ultrasonic image according to embodiment 1.

Next, the calculation function 114c sets an analysis region (analysis ROI). For example, the calculation function 114c receives an operation specifying the analysis ROI from the user via the input interface 130 and sets the analysis ROI. For example, the output function 114d displays the B-mode image I11 on the display 140, and the calculation function 114c receives the operation specifying the analysis ROI from the user referring to the B-mode image I11. In this case, the analysis ROI set on the B-mode image I11 is applied as-is to the corresponding positions of the B-mode image I12, the B-mode image I13, and the B-mode image I14. Alternatively, the calculation function 114c may automatically set the analysis ROI based on the diagnosis information or the like. Alternatively, the calculation function 114c may set the entire collected ultrasonic image as the analysis ROI.

The calculation function 114c may variably control the analysis ROI according to the analysis target. For example, the calculation function 114c adjusts the shape and size of the analysis ROI so as to include the analysis target and the peripheral region of the analysis target having no local signal change. That is, the calculation function 114c adjusts the shape and size of the analysis ROI so that the analysis ROI includes a region serving as a reference of analysis in addition to the analysis target.

Next, the calculation function 114c sets a comparison region in each analysis ROI of the ultrasound image. For example, the calculation function 114c sets the kernel R11 shown in fig. 2 for the B-mode image I11. The kernel R11 is a small region having a predetermined size and shape, and corresponds to a plurality of pixels in the B-mode image I11. Similarly, the calculation function 114c sets the kernel R12 for the B-mode image I12, the kernel R13 for the B-mode image I13, and the kernel R14 for the B-mode image I14. The kernel R11, the kernel R12, the kernel R13, and the kernel R14 are set at corresponding positions in a plurality of frames. Further, core R11, core R12, core R13, and core R14 are examples of comparison regions.

Next, the calculation function 114c calculates the degree of similarity by comparing the plurality of pixels in the comparison area between frames. That is, the calculation function 114c calculates the inter-frame similarity of the signals. For example, the calculation function 114c calculates the correlation coefficient $r_{xy}$ of the images in the comparison area between adjacent frames by the following expression (1):

$$r_{xy} = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}} \qquad (1)$$

Here, x denotes the frame of interest, y denotes the next frame (frame of interest + 1), $x_i$ and $y_i$ denote the i-th pixel values of the comparison area in those frames, $\bar{x}$ and $\bar{y}$ denote their mean pixel values, and n denotes the total number of pixels in the comparison area.

For example, in the case shown in fig. 2, the calculation function 114c calculates the correlation coefficient C1 between the kernel R11 and the kernel R12, the correlation coefficient C2 between the kernel R12 and the kernel R13, and the correlation coefficient C3 between the kernel R13 and the kernel R14, using the above expression (1). The calculation function 114c may also calculate the correlation coefficient not between adjacent frames but between frames separated by a predetermined interval.

In other words, the calculation function 114c calculates the similarity of the signals in the time direction by comparing the plurality of pixels in the comparison area between frames. Hereinafter, the similarity in the time direction is also referred to as the "1st similarity".
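As a concrete sketch of this 1st similarity calculation, the following Python/NumPy function evaluates expression (1) between adjacent frames for every position, assuming the frames are stacked as a (T, H, W) array; the kernel size and all names are illustrative assumptions, not the apparatus's actual implementation.

```python
import numpy as np

def first_similarity(frames, kernel=9):
    # frames : (T, H, W) array of B-mode frames; kernel : side of the comparison area
    # returns (T-1, H, W): r_xy between the kernel in frame t and frame t+1,
    # per expression (1); positions too close to the border are left as NaN
    T, H, W = frames.shape
    k = kernel // 2
    out = np.full((T - 1, H, W), np.nan)
    for t in range(T - 1):
        for r in range(k, H - k):
            for c in range(k, W - k):
                x = frames[t,     r - k:r + k + 1, c - k:c + k + 1].ravel()
                y = frames[t + 1, r - k:r + k + 1, c - k:c + k + 1].ravel()
                xm, ym = x - x.mean(), y - y.mean()
                denom = np.sqrt((xm ** 2).sum() * (ym ** 2).sum())
                if denom > 0:
                    out[t, r, c] = (xm * ym).sum() / denom
    return out
```

The per-frame value plotted in fig. 3 would then be the average of the coefficients of the two frame pairs adjacent to that frame, e.g. (C1 + C2) / 2 for the B-mode image I12.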

The calculation function 114c may variably control the comparison area according to the analysis target. For example, the calculation function 114c adjusts the size of the comparison area to match the scale of a minute signal change. For example, the calculation function 114c adjusts the size of the comparison area so that it corresponds to one period of the light-dark pattern of the signal produced by the fluctuation. For example, the calculation function 114c adjusts the size of the comparison area so that it is larger than the minute signal change by a predetermined magnification.

As described above, the calculation function 114c performs the 1st similarity calculation for each frame. The 1st similarity can thus be plotted against the frame number, as shown, for example, in fig. 3. For example, the calculation function 114c sets the average of the correlation coefficient C1 calculated between the kernel R11 and the kernel R12 and the correlation coefficient C2 calculated between the kernel R12 and the kernel R13 as the 1st similarity at the frame number of the B-mode image I12. Similarly, the calculation function 114c sets the average of the correlation coefficient C2 calculated between the kernel R12 and the kernel R13 and the correlation coefficient C3 calculated between the kernel R13 and the kernel R14 as the 1st similarity at the frame number of the B-mode image I13. Fig. 3 is a diagram showing an example of the 1st similarity according to embodiment 1.

Further, the calculation function 114c repeatedly executes the 1st similarity calculation processing while shifting the comparison area in the spatial direction. That is, the calculation function 114c calculates the 1st similarity for each position within the analysis ROI. For example, the calculation function 114c calculates the 1st similarity for each pixel within the analysis ROI. Alternatively, the calculation function 114c may calculate the 1st similarity for each pixel group in which a plurality of pixels are grouped. In other words, the calculation function 114c calculates the 1st similarity for each frame and each position. In this case, a graph such as that shown in fig. 3 can be generated for each position within the analysis ROI.

Here, the user can evaluate fluctuation based on the 1st similarity. That is, since the 1st similarity indicates the change in the signal between frames, its value varies at positions where fluctuation occurs. Therefore, the user can determine whether a site suspected of being a tumor is a hemangioma by evaluating fluctuation with reference to the 1st similarity at that site.

However, while the signals of a plurality of frames are being collected, positional deviation between frames may occur due to respiratory motion of the subject P, motion of the ultrasonic probe 120, or the like. When such a disturbance occurs, the signal changes between frames, so the value of the 1st similarity varies just as it does when fluctuation occurs. That is, when fluctuation is evaluated based only on the 1st similarity, it is difficult to distinguish fluctuation from disturbance, and the change that one originally wants to capture may be masked.

To cope with positional deviation between frames, it is conceivable to perform motion correction between frames before calculating the 1st similarity. For example, positional deviation between frames can be classified into three directions: the depth direction, the azimuth direction, and the elevation direction. Positional deviation in the depth direction and the azimuth direction can be corrected by applying a motion stabilizer or the like. However, it is difficult to correct positional deviation in the elevation direction. That is, when positional deviation in the elevation direction occurs, the cross section of the collected signal itself changes, so corresponding positions cannot be identified between frames. Therefore, even if motion correction is performed as preprocessing, at least the positional deviation in the elevation direction remains. Positional deviation in the elevation direction is also referred to as off-plane deviation.

Therefore, the calculation function 114c further calculates the 2nd similarity based on the 1st similarity, thereby improving the accuracy of fluctuation evaluation. This point will be described below with reference to fig. 4A, 4B, and 4C. Fig. 4A, 4B, and 4C are diagrams for explaining the 2nd similarity calculation processing according to embodiment 1.

For example, the calculation function 114c takes the position a1 shown in fig. 4A as the point of interest and acquires the change over time of the 1st similarity at that point. For example, the calculation function 114c plots the 1st similarity calculated for the point of interest against the frame number, as in the case shown in fig. 3. The calculation function 114c then fits a curve to the plotted points to generate the curve shown in fig. 4B. That is, the curve shown in fig. 4B represents the change of the 1st similarity at the point of interest in the frame direction. Hereinafter, a curve showing the change of the 1st similarity in the frame direction is also referred to as a correlation curve. The point of interest is also referred to as the 1st position.

Further, the calculation function 114c generates a correlation curve for each neighboring point included in a predetermined range in the vicinity of the point of interest. For example, the calculation function 114c sets a rectangular region of 7 pixels × 7 pixels in advance as the predetermined range. In this case, as shown in fig. 4A, the 48 pixels neighboring the position a1 are taken as neighboring points. As shown in fig. 4C, the calculation function 114c generates a correlation curve for each of the plurality of neighboring points. A neighboring point is also referred to as a 2nd position.

The calculation function 114c may variably control the predetermined range according to the analysis target. That is, the calculation function 114c may variably control the range of neighboring points according to the analysis target. Although fig. 4A to 4C assume a plurality of neighboring points, there may be only one neighboring point.

Next, the calculation function 114c calculates the similarity between the correlation curve generated for the point of interest and the correlation curves generated for the neighboring points. For example, the calculation function 114c calculates, for each of the plurality of neighboring points, the correlation coefficient between its correlation curve and that of the point of interest, and calculates the average of the calculated correlation coefficients.

In other words, the calculation function 114c calculates the similarity in the spatial direction of the change over time of the 1st similarity by comparing the correlation curve at the point of interest with the correlation curves at different positions. Hereinafter, the similarity in the spatial direction is also referred to as the "2nd similarity".

Here, when fluctuation occurs at the point of interest, comparing the correlation curves of the point of interest and the neighboring points shows that the positions and heights of their peaks do not coincide, as shown in fig. 5A. That is, when a tumor such as a hemangioma produces a signal change that is locally specific relative to its periphery, the 1st similarity differs depending on the position, so the correlation curve observed in the time direction also differs depending on the analysis location. Therefore, when fluctuation occurs, the correlation curve of the point of interest tends not to be similar to the correlation curves of the neighboring points. Fig. 5A is a diagram showing an example of a correlation curve according to embodiment 1.

On the other hand, when a disturbance occurs, the correlation curves at the point of interest and the neighboring points change in the frame direction in the same manner, as shown in fig. 5B. That is, since the positional deviation occurs at the same timing regardless of position, the correlation curves of the point of interest and the neighboring points change uniformly at the same timing. Thus, when a disturbance occurs, the correlation curve of the point of interest tends to be similar to those of the neighboring points. Accordingly, by calculating the similarity between correlation curves as the 2nd similarity, disturbance and fluctuation can be distinguished. Fig. 5B is a diagram showing an example of a correlation curve according to embodiment 1.

Here, the calculation processing of the 2nd similarity will be described in more detail with reference to fig. 6. Fig. 6 is a diagram showing an example of the 2nd similarity calculation processing according to embodiment 1. In fig. 6, the correlation curves are represented as sinusoidal curves for convenience of explanation.

For example, when the correlation curve at a neighboring point coincides with the correlation curve at the point of interest in peak position and the like, corresponding to a 0° shift, the calculation function 114c obtains a correlation coefficient CC of 1.0. For a 60° shift, the calculation function 114c obtains CC = 0.5, and for a 90° shift, CC = 0.0. Further, the calculation function 114c calculates a correlation coefficient CC for each of the plurality of neighboring points and performs averaging processing on the plurality of correlation coefficients CC.
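These specific values are consistent with a standard identity for the idealized sinusoids of fig. 6: the correlation coefficient between two zero-mean sine curves offset by a phase shift φ, taken over a full period, reduces to cos φ:

```latex
\mathrm{CC}(\varphi)
  = \frac{\int_{0}^{2\pi} \sin t \,\sin(t+\varphi)\,dt}
         {\sqrt{\int_{0}^{2\pi} \sin^{2} t\,dt}\,
          \sqrt{\int_{0}^{2\pi} \sin^{2}(t+\varphi)\,dt}}
  = \frac{\pi\cos\varphi}{\pi}
  = \cos\varphi
```

so that cos 0° = 1.0, cos 60° = 0.5, and cos 90° = 0.0, matching the values in the text.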

Here, the calculation function 114c may perform value inversion processing. That is, when the correlation coefficient CC is large, as in the case corresponding to a 0° shift, the curves can be considered to change at the same timing at every position due to a disturbance. Conversely, when the correlation coefficient CC is small, as in the cases corresponding to 60° and 90° shifts, a local change can be considered to have occurred due to fluctuation. Since fluctuation evaluation is more intuitive when the value becomes larger as fluctuation occurs, the calculation function 114c may invert the value of the correlation coefficient CC. For example, the calculation function 114c calculates the value obtained by subtracting the correlation coefficient CC from 1 as the 2nd similarity.

As described above, the calculation function 114c calculates the correlation coefficients CC between the point of interest and the neighboring points, performs the averaging processing and the value inversion processing, and thereby calculates the 2nd similarity. For example, with the neighboring points indexed by i, the calculation function 114c may calculate the 2nd similarity as 1 − mean(CC_i). The calculation function 114c also repeats the 2nd similarity calculation processing while moving the position of the point of interest within the analysis ROI, thereby calculating the 2nd similarity for each position within the analysis ROI.
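A minimal sketch of this 2nd similarity calculation, building on the hypothetical first_similarity sketch above; the 7 × 7 neighborhood, the border handling, and all names are illustrative assumptions.

```python
import numpy as np

def second_similarity(r1, neighborhood=7):
    # r1 : (T-1, H, W) from first_similarity(); r1[:, r, c] is the correlation
    # curve at position (r, c). Returns an (H, W) map of 1 - mean(CC_i), where
    # CC_i correlates the curve at the point of interest with that of the i-th
    # neighboring point (the 48 neighbors of a 7x7 window by default).
    _, H, W = r1.shape
    k = neighborhood // 2
    out = np.full((H, W), np.nan)
    for r in range(k, H - k):
        for c in range(k, W - k):
            ref = r1[:, r, c] - r1[:, r, c].mean()
            ccs = []
            for dr in range(-k, k + 1):
                for dc in range(-k, k + 1):
                    if dr == 0 and dc == 0:
                        continue                      # skip the point of interest itself
                    cur = r1[:, r + dr, c + dc]
                    cur = cur - cur.mean()
                    denom = np.sqrt((ref ** 2).sum() * (cur ** 2).sum())
                    if denom > 0:
                        ccs.append((ref * cur).sum() / denom)
            if ccs:
                out[r, c] = 1.0 - np.mean(ccs)        # value inversion: larger = fluctuation
    return out
```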

Next, the output function 114d outputs the 2nd similarity calculated by the calculation function 114c. For example, the output function 114d generates and outputs an image representing the distribution of the 2nd similarity. For example, the output function 114d generates the color image shown in fig. 7 by assigning to each pixel (position) a color corresponding to the magnitude of the 2nd similarity. The color here may be one of hue, brightness, and chroma, or a combination thereof. Such a color image is also referred to as a parametric image. The output function 114d causes the display 140 to display the generated color image. Fig. 7 is a diagram showing an example of a color image according to embodiment 1.
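The color assignment itself can be sketched as a simple colormap lookup; matplotlib's colormaps are used here for illustration, and the value range and colormap name are assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

def to_color_image(s2, vmin=0.0, vmax=1.0, cmap="jet"):
    # s2 : (H, W) map of the 2nd similarity; returns an (H, W, 3) uint8 RGB image
    norm = np.clip((s2 - vmin) / (vmax - vmin), 0.0, 1.0)
    rgba = plt.get_cmap(cmap)(norm)                # colormap lookup, (H, W, 4) floats
    rgba[np.isnan(s2)] = (0.0, 0.0, 0.0, 1.0)      # unanalyzed pixels shown black
    return (rgba[..., :3] * 255).astype(np.uint8)
```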

The color image may be displayed as a still image or as a moving image. When displaying a still image, the calculation function 114c calculates the 2nd similarity for at least one frame, and the output function 114d generates at least one color image and causes the display 140 to display it. When displaying a moving image, the calculation function 114c calculates the 2nd similarity for a plurality of frames, and the output function 114d generates a color image for each of the plurality of frames and sequentially displays them on the display 140.

Here, the case of displaying a still image will be described with reference to fig. 8. Fig. 8 is a diagram showing an example of the color image generation processing according to embodiment 1. For example, as shown in fig. 8, the calculation function 114c calculates the 1st similarity for each position in the analysis ROI for the B-mode images I111 to I11n. That is, the predetermined number n shown in fig. 8 indicates the analysis range in the frame direction. Further, the calculation function 114c generates, for each position within the analysis ROI, a correlation curve indicating the change of the 1st similarity in the frame direction. Next, the calculation function 114c calculates, for each position in the analysis ROI, the 2nd similarity indicating the similarity of the correlation curves between positions.

Next, the output function 114d generates a color image I211 by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity. The color image I211 shown in fig. 8 is a color image whose analysis range is the n frames from the B-mode image I111 to the B-mode image I11n. The output function 114d then displays the color image I211 as a still image on the display 140.

Next, the case of displaying a moving image will be described with reference to fig. 9. Fig. 9 is a diagram showing an example of the color image generation processing according to embodiment 1. For example, as shown in fig. 9, the calculation function 114c calculates the 1st similarity for each position in the analysis ROI for each of the B-mode images I121 to I12n. Further, the calculation function 114c generates, for each position within the analysis ROI, a correlation curve indicating the change of the 1st similarity in the frame direction. Next, the calculation function 114c calculates, for each position within the analysis ROI, the 2nd similarity indicating the similarity of the correlation curves between positions. Next, the output function 114d generates a color image I221 by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity. The color image I221 is a color image whose analysis range is the n frames from the B-mode image I121 to the B-mode image I12n.

Here, each time a signal is newly collected, the calculation function 114c calculates the 1st similarity for the new frame and further calculates the 2nd similarity based on the 1st similarity calculated for the predetermined number of frames ending at the new frame. For example, when a signal is newly collected and a B-mode image I12(n+1) is generated, the calculation function 114c calculates the 1st similarity for each position in the analysis ROI for the B-mode image I12(n+1). Further, the calculation function 114c generates, for each position within the analysis ROI, a correlation curve indicating the change of the 1st similarity in the frame direction, based on the 1st similarity calculated for the n frames from the B-mode image I122 to the B-mode image I12(n+1). Next, the calculation function 114c calculates, for each position within the analysis ROI, the 2nd similarity indicating the similarity of the correlation curves between positions. Next, the output function 114d generates a color image I222 by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity. The color image I222 is a color image whose analysis range is the n frames from the B-mode image I122 to the B-mode image I12(n+1).

Similarly, the calculation function 114c and the output function 114d can generate and display a color image each time a signal is newly collected. For example, the calculation function 114c and the output function 114d can generate a color image in real time in parallel with the signal collection from the subject P and display the color image on the display 140 as a moving image.
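A sketch of this real-time, per-frame update, reusing the hypothetical first_similarity, second_similarity, and to_color_image sketches above; frame_source and show are stand-ins for the acquisition path and the display 140.

```python
from collections import deque
import numpy as np

def stream_parametric_images(frame_source, n, show):
    # keep a sliding analysis range of the most recent n frames (fig. 9):
    # each newly collected frame yields an updated parametric image
    window = deque(maxlen=n)
    for frame in frame_source:
        window.append(frame)
        if len(window) == n:                      # analysis range is filled
            r1 = first_similarity(np.stack(window))
            show(to_color_image(second_similarity(r1)))
```

Generating an image only once per n collected frames, as in fig. 11 described later, would amount to clearing the window after each update instead of sliding it.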

The predetermined number "n" shown in fig. 8 and 9 may be variably controlled according to the analysis object. That is, the calculation function 114c may variably control the analysis range in the frame direction according to the analysis target. For example, when the analysis target is a part that moves periodically by heartbeat, respiration, or the like, the calculation function 114c adjusts the predetermined number "n" so as to include the movement of the analysis target for 1 cycle or more. For example, the calculation function 114c analyzes in advance the analysis time required for the analysis result to be different from the peripheral region of the analysis target, and adjusts the predetermined number "n" according to the required analysis time.

Next, an example of the procedure of the processing performed by the ultrasonic diagnostic apparatus 100 will be described with reference to fig. 10. Fig. 10 is a flowchart for explaining a series of flows of processing performed by the ultrasonic diagnostic apparatus 100 according to embodiment 1. Steps S101, S102, and S103 correspond to the collection function 114b. Steps S104, S105, S106, and S107 correspond to the calculation function 114c. Steps S108 and S109 correspond to the output function 114d. In fig. 10, as in the case shown in fig. 9, a case where a color image is generated and displayed in parallel with signal collection from the subject P is described as an example.

First, the processing circuit 114 determines whether or not to start signal collection from the subject P (step S101), and if not, it is in a standby state (no in step S101). On the other hand, when the signal collection is started (yes in step S101), the processing circuit 114 determines whether or not the signal collection is continued (step S102), and when the signal collection is continued, the transmission/reception circuit 111 and the signal processing circuit 112 are controlled to collect a signal from the subject P (step S103).

Next, the processing circuit 114 determines whether or not a predetermined number of frames have been collected (step S104). For example, in the case shown in fig. 8 and 9, a predetermined number n of frames are set as the analysis range, and color images cannot be generated until signals of n frames are collected. Therefore, when the predetermined number of frames have not been collected (no in step S104), the processing circuit 114 again proceeds to step S102 to continue the signal collection.

On the other hand, when the predetermined number of frames has been collected (yes in step S104), the processing circuit 114 calculates the correlation coefficient of the signals between frames for each position in the analysis ROI (step S105). That is, the processing circuit 114 calculates the 1st similarity. The processing circuit 114 also generates a correlation curve for each position in the analysis ROI (step S106) and calculates the correlation coefficients of the correlation curves between positions (step S107). That is, the processing circuit 114 calculates the 2nd similarity.

Next, the processing circuit 114 generates a color image by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity (step S108), and causes the display 140 to display the generated color image (step S109). After step S109, the processing circuit 114 again proceeds to step S102 and determines whether to continue signal collection. When signal collection is continued, the processing circuit 114 executes the processing of steps S103 to S109 again. That is, while signal collection continues, the processing circuit 114 updates the color image based on the newly collected signals and displays the color images as a moving image on the display 140. On the other hand, when signal collection is not continued (no in step S102), the processing circuit 114 ends the processing.

As described above, according to embodiment 1, the collection function 114b collects signals from the subject P over time. The calculation function 114c calculates the 1st similarity indicating the inter-frame similarity of the signals, and calculates the 2nd similarity indicating the similarity, between positions, of the change over time of the 1st similarity. The output function 114d outputs the 2nd similarity. Therefore, the ultrasonic diagnostic apparatus 100 according to embodiment 1 can improve the accuracy of fluctuation evaluation.

That is, the ultrasonic diagnostic apparatus 100 can perform quantitative evaluation of fluctuation by outputting a numerical value indicating the fluctuation. Moreover, when fluctuation is evaluated based only on the inter-frame similarity of the signals, the change that one originally wants to capture may be masked by the influence of disturbances such as cross-section deviation. In contrast, the ultrasonic diagnostic apparatus 100 calculates the 1st similarity indicating the inter-frame similarity of the signals and then calculates the 2nd similarity indicating the similarity, between positions, of the change over time of the 1st similarity. The ultrasonic diagnostic apparatus 100 can thereby distinguish fluctuation from disturbance and improve the accuracy of fluctuation evaluation.

Although embodiment 1 has been described above, the embodiments may be implemented in various other forms.

For example, in embodiment 1, fig. 9 showed an example in which color images are displayed as a moving image. That is, in fig. 9, a color image is generated and displayed every time a new frame is collected. However, the embodiment is not limited thereto.

For example, the calculation function 114c and the output function 114d may generate and display a color image every time a predetermined number of frames is collected. Specifically, as shown in fig. 11, the calculation function 114c calculates the 1st similarity for each position in the analysis ROI for the B-mode images I131 to I13n. Further, the calculation function 114c generates, for each position within the analysis ROI, a correlation curve indicating the change of the 1st similarity in the frame direction. Next, the calculation function 114c calculates, for each position in the analysis ROI, the 2nd similarity indicating the similarity of the correlation curves between positions. Next, the output function 114d generates a color image I231 by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity. The color image I231 is a color image whose analysis range is the n frames from the B-mode image I131 to the B-mode image I13n. The output function 114d causes the display 140 to display the color image I231. Fig. 11 is a diagram showing an example of the color image generation processing according to embodiment 2.

Similarly, the calculation function 114c calculates the 1st similarity for each position in the analysis ROI for the B-mode images I141 to I14n. Further, the calculation function 114c generates, for each position within the analysis ROI, a correlation curve indicating the change of the 1st similarity in the frame direction. Next, the calculation function 114c calculates, for each position within the analysis ROI, the 2nd similarity indicating the similarity of the correlation curves between positions. Next, the output function 114d generates a color image I241 by assigning to each pixel a color corresponding to the magnitude of the 2nd similarity. The color image I241 is a color image whose analysis range is the n frames from the B-mode image I141 to the B-mode image I14n. The output function 114d causes the display 140 to display the color image I241 in place of the color image I231.

In the case shown in fig. 11, color images are generated less frequently than in the case shown in fig. 9, so the frame rate of the moving image is lower. However, the 2nd similarity calculation is also performed less frequently, so the calculation load can be reduced. The output function 114d may switch between the display mode of fig. 9 and the display mode of fig. 11 in accordance with an instruction from the user.

In the above-described embodiment, a case where a color image is displayed in parallel with the collection of signals has been described. That is, in the above-described embodiment, the real-time processing is explained. However, the embodiment is not limited thereto.

For example, the collection function 114b collects signals from the subject P over time, generates ultrasonic images of a plurality of frames, and stores the ultrasonic images in the memory 113 or an external image storage device. Then, in response to a request from the user, for example, the calculation function 114c reads out the stored ultrasonic images and calculates the 1st similarity and the 2nd similarity. The output function 114d generates a color image showing the distribution of the 2nd similarity and displays it on the display 140.

Alternatively, the output function 114d generates a color image showing the distribution of the 2nd similarity and stores the color image in the memory 113, an external image storage device, or the like. Then, the output function 114d reads out the stored color image, for example in response to a request from the user, and displays it on the display 140.

In the above-described embodiment, a case where the color image generated by the output function 114d is displayed on the display 140 has been described. However, the embodiment is not limited thereto. For example, the output function 114d may transmit the generated color image to an external device. In this case, the user can refer to the color image on a display provided in the external device, for example.

In the above-described embodiment, a case where a color image representing the distribution of the 2nd similarity is generated and output has been described as an example of the process of outputting the 2nd similarity. However, the embodiment is not limited thereto. For example, the output function 114d may output the calculated 2nd similarity as a graph, a table, or text. For example, the output function 114d may generate a graph in which the position coordinates in the analysis ROI are associated with the 2nd similarity, and display the graph on the display 140.
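As a rough illustration of the graph output, the following sketch plots the 2nd similarity of every position inside the analysis ROI against a flattened position index; `sim_map` and `roi_mask` are hypothetical names for the 2nd-similarity map and a boolean ROI mask, not identifiers from the embodiment.

```python
import numpy as np
import matplotlib.pyplot as plt

def show_similarity_graph(sim_map, roi_mask):
    """Graph output instead of a colour image: one point per position
    inside the analysis ROI, plotted against a flattened index."""
    values = sim_map[roi_mask]          # 2nd similarity inside the ROI only
    plt.plot(np.arange(values.size), values, '.', markersize=2)
    plt.xlabel('position index within analysis ROI')
    plt.ylabel('2nd similarity')
    plt.show()
```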

In the above-described embodiment, the correlation coefficient of the formula (1) is described as the 1st similarity. However, the embodiment is not limited thereto. For example, the calculation function 114c may calculate SAD (Sum of Absolute Differences) or SSD (Sum of Squared Differences) as the 1st similarity.
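A minimal sketch of these alternatives follows, assuming the same square comparison area as in the correlation-based example above. Note that SAD and SSD are dissimilarity measures: smaller values indicate higher inter-frame similarity, so the sign convention must be handled before building correlation curves from them.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_sad(a, b, size=5):
    """Sum of absolute differences over a size x size comparison area;
    unlike the correlation coefficient, smaller means more similar."""
    d = np.abs(np.asarray(a, float) - np.asarray(b, float))
    return uniform_filter(d, size) * size * size   # local mean -> local sum

def local_ssd(a, b, size=5):
    """Sum of squared differences over the same comparison area."""
    d = (np.asarray(a, float) - np.asarray(b, float)) ** 2
    return uniform_filter(d, size) * size * size
```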

As another example of the 1st similarity calculation process, the calculation function 114c first performs difference processing between images at a predetermined frame interval for a plurality of frames of B-mode images, thereby generating a plurality of difference images. Here, a B-mode image may contain a background component in addition to the component derived from the pulsation of the hemangioma. The background component is, for example, vibration derived from various factors, such as vibration derived from liver tissue, vibration derived from the user's hand movement, vibration derived from device performance, and speckle fluctuation. The calculation function 114c can remove the background component by performing the difference processing between images.

Next, the calculation function 114c takes the absolute value of the pixel value of each pixel for each of the plurality of difference images. That is, the difference values of the pixels in a difference image can be negative, so the calculation function 114c converts them into positive values. Next, the calculation function 114c calculates, for each of the plurality of difference images, an integrated value obtained by integrating the absolute value of each pixel with the absolute values of its peripheral pixels. For example, the calculation function 114c integrates the absolute value of each pixel and the absolute values of the peripheral pixels using a kernel (a small region). Further, the calculation function 114c calculates, for each of the plurality of B-mode images, the average value of the pixel value of each pixel and the pixel values of its peripheral pixels. For example, the calculation function 114c calculates this average value using the kernel.

Next, the calculation function 114c calculates a division value obtained by dividing the integrated value by the average value. Here, the calculation function 114c can calculate a division value for each frame and for each position in the analysis ROI. Next, the calculation function 114c calculates an index value by integrating the division values at each position in the frame direction. For example, when the division values of N frames are accumulated, the calculation function 114c can calculate, for each frame and for each position, an index value based on the signal of the past N frames. The more the signal varies between frames due to judder, the larger the index value becomes. That is, the index value is an example of the 1st similarity indicating the similarity between frames of the signal.
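The difference-image pipeline above might look as follows in outline. The frame interval, the kernel size, the accumulation length, and the pairing of each difference image with the later of its two source frames are illustrative assumptions, not values from the embodiment.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def judder_index(frames, interval=1, size=5, n_accum=8):
    """Index value from inter-frame difference images.

    frames: (n_frames, H, W) stack of B-mode images.
    Returns one index map per frame, based on the past n_accum frames.
    """
    frames = frames.astype(np.float64)
    # 1. Difference images at a predetermined frame interval
    #    (suppresses the stationary background component).
    diffs = frames[interval:] - frames[:-interval]
    # 2. Absolute value, then local integration over a size x size kernel.
    integ = uniform_filter(np.abs(diffs), size=(1, size, size)) * size * size
    # 3. Local average brightness of the corresponding B-mode frames
    #    (here assumed to be the later frame of each pair).
    mean = uniform_filter(frames[interval:], size=(1, size, size))
    div = integ / np.maximum(mean, 1e-12)
    # 4. Accumulate the division values over the past n_accum frames.
    cs = np.cumsum(div, axis=0)
    return cs[n_accum:] - cs[:-n_accum]   # larger where the signal judders more
```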

In the above-described embodiment, the case where the similarity between the correlation curves at different positions is calculated as the 2nd similarity is described. However, the embodiment is not limited thereto. For example, the calculation function 114c may generate, for each position within the analysis ROI, a scatter chart or a line chart in which the 1st similarity is plotted against the frame number, and calculate, as the 2nd similarity, the similarity with the scatter chart or line chart of a different position. For example, the calculation function 114c may calculate, for each position in the analysis ROI, a statistical value representing the change of the 1st similarity over time, and calculate, as the 2nd similarity, the similarity with the statistical value of a different position.
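One possible reading of the statistic-based variant is sketched below: summarise each position's correlation curve by its standard deviation along the frame axis, then score how close that statistic is to the statistics of neighbouring positions. Both the choice of statistic and the ratio-of-scalars similarity used here are assumptions made purely for illustration.

```python
import numpy as np

def second_similarity_from_stat(curves, radius=1):
    """curves: (n_frames, H, W) 1st-similarity values per position.
    Returns a map in [0, 1]; near 1 where a position's temporal
    fluctuation statistic matches that of its neighbours."""
    stat = curves.std(axis=0)            # one statistic per position
    sims = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            nb = np.roll(np.roll(stat, dy, axis=0), dx, axis=1)
            # Similarity of two non-negative scalars: smaller / larger.
            sims.append(np.minimum(stat, nb) /
                        np.maximum(np.maximum(stat, nb), 1e-12))
    return np.mean(sims, axis=0)
```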

The collection function 114b, the calculation function 114c, and the output function 114d may also perform various processes not described in the above embodiments. For example, the calculation function 114c may apply various image processing to the B-mode images before calculating the 1st similarity. For example, the calculation function 114c applies a low-pass filter in the frame direction and a median filter in the spatial direction to the B-mode images of a plurality of frames. The calculation function 114c can thereby reduce various kinds of noise such as spike noise and speckle noise, and can calculate the 1st similarity, and the 2nd similarity based on it, with higher accuracy.
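A possible preprocessing step, again only a sketch: a Butterworth low-pass filter along the frame axis followed by a spatial median filter. The filter order, normalised cutoff, and kernel size are assumptions; `filtfilt` additionally requires a stack of at least about ten frames because of its default edge padding.

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import butter, filtfilt

def preprocess(frames, cutoff=0.3, median_size=3):
    """frames: (n_frames, H, W) B-mode stack.
    Low-pass in the frame direction (spike noise), median filter in the
    spatial direction (speckle noise), before 1st-similarity calculation."""
    frames = np.asarray(frames, dtype=np.float64)
    b, a = butter(2, cutoff)                     # normalised cutoff in (0, 1)
    lowpassed = filtfilt(b, a, frames, axis=0)   # frame (time) direction
    return median_filter(lowpassed, size=(1, median_size, median_size))
```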

In the above-described embodiment, the case of evaluating the judder of a hemangioma was described. However, the embodiment is not limited thereto. That is, the present invention is not limited to hemangiomas, and can be applied similarly to changes in any tissue that exhibits judder.

In the above-described embodiment, it is assumed that a B-mode image is generated and the 1st similarity and the 2nd similarity are calculated based on the B-mode image. However, the embodiment is not limited thereto. For example, the calculation function 114c may calculate the 1st similarity and the 2nd similarity based on other ultrasound images such as a color Doppler image, an elastography image, a shear wave elastography (SWE) image, or an attenuation imaging image.

In the above-described embodiment, the 1st similarity and the 2nd similarity are calculated based on the ultrasonic image. However, the embodiment is not limited thereto. For example, the calculation function 114c may calculate the 1st similarity and the 2nd similarity based on the B-mode data. That is, the calculation function 114c may perform the calculation processing of the 1st similarity and the 2nd similarity based on the image, or based on the data from the stage prior to the image generation processing.

In the above-described embodiment, the processing for the signals collected by the ultrasonic diagnostic apparatus 100 is described. However, the embodiment is not limited thereto. For example, the present invention is also applicable to signals collected by other types of medical image diagnostic apparatuses such as a photoacoustic wave diagnostic apparatus (photoacoustic imaging apparatus), an X-ray diagnostic apparatus, an X-ray CT (Computed Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, a SPECT (Single Photon Emission Computed Tomography) apparatus, or a PET (Positron Emission Tomography) apparatus.

In the above-described embodiment, the processing circuit 114 included in the ultrasonic diagnostic apparatus 100 is assumed to execute the calculation function 114c and the output function 114d. That is, in the above-described embodiment, the description has been given assuming that the calculation function 114c and the output function 114d are executed in the medical image diagnostic apparatus that collects the signals from the subject P. However, the embodiment is not limited to this; an apparatus different from the medical image diagnostic apparatus may execute the functions corresponding to the calculation function 114c and the output function 114d. This point will be described below with reference to fig. 12. Fig. 12 is a block diagram showing an example of the configuration of the medical image processing system 1 according to embodiment 2.

The medical image processing system 1 shown in fig. 12 includes a medical image diagnostic apparatus 10, an image storage apparatus 20, and a medical image processing apparatus 30. For example, the medical image diagnostic apparatus 10, the image storage apparatus 20, and the medical image processing apparatus 30 are connected to each other via a network NW. The medical image diagnostic apparatus 10, the image storage apparatus 20, and the medical image processing apparatus 30 may be installed in any place as long as they can be connected via the network NW. For example, they may be installed in different facilities. That is, the network NW may be a local network closed within a facility, or a network via the internet.

The medical image diagnostic apparatus 10 is an apparatus that collects signals from a subject P. The medical image diagnostic apparatus 10 is, for example, the ultrasonic diagnostic apparatus 100 of fig. 1. Alternatively, the medical image diagnostic apparatus 10 may be a photoacoustic wave diagnostic apparatus, an X-ray CT apparatus, an MRI apparatus, a SPECT apparatus, a PET apparatus, or the like.

The medical image diagnostic apparatus 10 transmits the signals collected from the subject P to the image storage apparatus 20 or the medical image processing apparatus 30 via the network NW. Here, the medical image diagnostic apparatus 10 may generate an image and transmit the image, or may transmit data from the stage prior to the image generation processing. For example, the medical image diagnostic apparatus 10 may transmit the B-mode image or the B-mode data.

The image storage apparatus 20 stores various data collected by the medical image diagnostic apparatus 10. The image storage apparatus 20 may store images such as B-mode images, or may store data from the stage prior to the image generation processing, such as B-mode data. For example, the image storage apparatus 20 is a PACS (Picture Archiving and Communication System) server.

The medical image processing apparatus 30 is an apparatus that executes functions corresponding to the calculation function 114c and the output function 114d. For example, as shown in fig. 12, the medical image processing apparatus 30 includes an input interface 31, a display 32, a memory 33, and a processing circuit 34. Here, the input interface 31, the display 32, and the memory 33 may be configured similarly to the input interface 130, the display 140, and the memory 113 in fig. 1.

The processing circuit 34 controls the overall operation of the medical image processing apparatus 30 by executing the control function 34a, the calculation function 34b, and the output function 34c. Here, the calculation function 34b is an example of a calculation unit. The output function 34c is an example of an output unit.

For example, the processing circuit 34 reads out and executes a program corresponding to the control function 34a from the memory 33, and controls various functions such as the calculation function 34b and the output function 34c based on various input operations received from the user via the input interface 31.

For example, the processing circuit 34 reads out a program corresponding to the calculation function 34b from the memory 33 and executes the program, thereby executing the same function as the calculation function 114c in fig. 1. Specifically, the calculation function 34b first acquires the signals collected over time from the subject P. For example, the calculation function 34b acquires, via the network NW, the signals collected by the medical image diagnostic apparatus 10 and stored in the image storage apparatus 20. Alternatively, the calculation function 34b may acquire the signals directly from the medical image diagnostic apparatus 10 without passing through the image storage apparatus 20. The calculation function 34b calculates the 1st similarity indicating the inter-frame similarity of the signals collected from the subject P over time, and calculates the 2nd similarity indicating the similarity, between positions, of the change of the 1st similarity over time.

For example, the processing circuit 34 reads out a program corresponding to the output function 34c from the memory 33 and executes the program, thereby executing the same function as the output function 114d in fig. 1. That is, the output function 34c outputs the 2nd similarity calculated by the calculation function 34b. For example, the output function 34c generates an image showing the distribution of the 2nd similarity and displays the image on the display 32. As another example, the output function 34c generates an image showing the distribution of the 2nd similarity and transmits the image to an external device.

In the medical image processing apparatus 30 shown in fig. 12, each processing function is stored in the memory 33 in the form of a computer-executable program. The processing circuit 34 is a processor that reads out programs from the memory 33 and executes them to realize the function corresponding to each program. In other words, the processing circuit 34 that has read a program has the function corresponding to the read program.

In fig. 12, the control function 34a, the calculation function 34b, and the output function 34c are described as being realized by a single processing circuit 34, but the processing circuit 34 may be configured by combining a plurality of independent processors, each of which realizes a function by executing a program. Further, the processing functions of the processing circuit 34 may be distributed to, or integrated into, a single or a plurality of processing circuits as appropriate.

The processing circuit 34 may also be realized by a processor of an external device connected via the network NW. For example, the processing circuit 34 reads out and executes a program corresponding to each function from the memory 33, and realizes each function shown in fig. 12 using a server group (cloud) connected to the medical image processing apparatus 30 via the network NW as a computing resource.

The term "processor" used in the above description refers to, for example, a CPU, a GPU (Graphics Processing Unit), an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (e.g., a Simple Programmable Logic Device (SPLD)), a Complex Programmable Logic Device (CPLD), a Field Programmable Gate Array (FPGA)), and other circuits. In the case where the processor is, for example, a CPU, the processor realizes a function by reading out and executing a program stored in a memory circuit. On the other hand, when the processor is, for example, an ASIC, the functions are directly incorporated into the circuit of the processor as a logic circuit, instead of storing a program in a memory circuit. Note that each processor of the embodiment is not limited to the case where each processor is configured as a single circuit, and a plurality of independent circuits may be combined to configure 1 processor to realize the functions thereof. Furthermore, a plurality of components in each drawing may be combined into 1 processor to realize the functions thereof.

In fig. 1, the description has been given assuming that a single memory 113 stores programs corresponding to respective processing functions of the processing circuit 114. In fig. 12, the description has been given assuming that the single memory 33 stores programs corresponding to the respective processing functions of the processing circuit 34. However, the embodiment is not limited thereto. For example, a plurality of memories 113 may be arranged in a distributed manner, and the processing circuit 114 may read the corresponding program from the individual memory 113. Similarly, a plurality of memories 33 may be arranged in a distributed manner, and the processing circuit 34 may read the corresponding program from the individual memory 33. Alternatively, instead of storing the program in the memory, the program may be directly incorporated into the circuit of the processor. In this case, the processor realizes the function by reading out and executing a program incorporated in the circuit.

The components of the devices according to the embodiments described above are functionally conceptual, and need not necessarily be physically configured as shown in the drawings. That is, the specific form of distribution/integration of the respective devices is not limited to the illustration, and all or a part thereof may be configured to be functionally or physically distributed/integrated in arbitrary units according to various loads, use situations, and the like. Further, all or any part of the processing functions performed by the respective devices may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as hardware based on wired logic.

The medical image processing method described in the above-described embodiment can be realized by executing a medical image processing program prepared in advance by a computer such as a personal computer or a workstation. The medical image processing program can be distributed via a network such as the internet. The medical image processing program may be recorded in a non-transitory computer-readable recording medium such as a hard disk, a Flexible Disk (FD), a CD-ROM, an MO, or a DVD, and may be read from the recording medium by a computer and executed.

According to at least 1 embodiment described above, the accuracy of judder evaluation can be improved.

Several embodiments have been described, but these embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments may be implemented in various other forms, and various omissions, substitutions, changes, and combinations of the embodiments may be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and the equivalent scope thereof.
