Method and apparatus for displaying ultrasound image and computer program product

Document No.: 1258249 · Publication date: 2020-08-25

Note: This technology, "Method and apparatus for displaying ultrasound image and computer program product," was devised by 朴成昱 on 2019-12-11. Abstract: A method and apparatus for displaying an ultrasound image and a computer program product are provided. The method of displaying an ultrasound image includes: determining a direction of a head of a fetus based on a first ultrasound image obtained by scanning a body of a pregnant woman in a first direction; determining a direction of a spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction; determining an orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus; and displaying an image representing the orientation of the fetus together with an ultrasound image obtained by scanning the body of the pregnant woman.

1. A method of displaying an ultrasound image, the method comprising:

determining a direction of a head of a fetus based on a first ultrasound image obtained by scanning a body of a pregnant woman in a first direction;

determining a direction of a spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction;

determining an orientation of the fetus within the body of the pregnant woman based on the directions of the head of the fetus and the spine of the fetus; and

displaying an image representing the orientation of the fetus together with an ultrasound image obtained by scanning the body of the pregnant woman.

2. The method of claim 1, wherein the step of determining the direction of the head of the fetus comprises:

displaying a user interface for guiding a scan of a body of a pregnant woman in the first direction;

displaying a user interface for guiding rescanning of the pregnant woman's body when the fetal head is not detected in the first ultrasound image;

determining the direction of the head of the fetus when the head of the fetus is detected in the first ultrasound image, and

wherein the step of determining the direction of the spine of the fetus comprises:

displaying a user interface for guiding a scan of a body of a pregnant woman along the second direction;

displaying a user interface for guiding rescanning of the pregnant woman's body when the fetal spine is not detected in the second ultrasound image;

determining the direction of the spine of the fetus when the spine of the fetus is detected in the second ultrasound image.

3. The method of claim 1, wherein the step of determining the direction of the head of the fetus comprises:

displaying a user interface for guiding a scan of a body of a pregnant woman in the first direction;

determining a direction of the head of the fetus when the head of the fetus is detected in the first ultrasound image; and

determining a direction of the head of the fetus based on a user input to the first ultrasound image when the head of the fetus is not detected in the first ultrasound image, and

wherein the step of determining the direction of the spine of the fetus comprises:

displaying a user interface for guiding a scan of a body of a pregnant woman along the second direction;

determining a direction of the spine of the fetus when the spine of the fetus is detected in the second ultrasound image; and

determining a direction of the spine of the fetus based on a user input to the second ultrasound image when the spine of the fetus is not detected in the second ultrasound image.

4. The method of claim 1, wherein the step of determining the direction of the head of the fetus comprises:

detecting a head of a fetus in the first ultrasound image;

determining whether the head of the fetus in the first ultrasound image is positioned to the right or to the left relative to the torso of the fetus; and

displaying the head of the fetus and the direction of the head of the fetus on the first ultrasound image.

5. The method of claim 1, wherein the step of determining the direction of the spine of the fetus comprises:

detecting a spine of the fetus and a torso of the fetus in the second ultrasound image;

determining an angle of rotation of an axis of the fetus relative to a reference point based on the spine of the fetus and the torso of the fetus; and

displaying the spine of the fetus and the angle of rotation on the second ultrasound image.

6. The method of claim 1, wherein the ultrasound image is an ultrasound image obtained by scanning a body of a pregnant woman in the second direction, and

wherein the image representing the orientation of the fetus comprises at least one of: an image in which at least one of the left and right sides of the fetus is indicated on the ultrasound image; an image showing a cross-section of the fetus in the ultrasound image together with its left and right sides; and a simulated image showing the fetus in three dimensions based on the orientation of the fetus.

7. The method of claim 1, further comprising:

displaying on a screen a marker indicating a position of a probe for scanning the body of a pregnant woman to obtain the ultrasound image, and

wherein the step of determining the orientation of the fetus comprises determining the orientation of the fetus based on the directions of the head and the spine of the fetus and the position of the probe.

8. The method of claim 1, wherein the image representing the orientation of the fetus comprises a simulated image showing the fetus in three dimensions based on the orientation of the fetus, and

wherein the simulated image is rotated based on user input.

9. The method of claim 1, wherein the image representing the orientation of the fetus comprises a simulated image showing the fetus in three dimensions based on the orientation of the fetus, and

wherein the step of displaying an image representing the orientation of the fetus comprises:

determining a posture of the fetus based on legs of the fetus detected in the first ultrasound image; and

generating and displaying the simulated image based on the orientation and the posture of the fetus.

10. An apparatus for displaying an ultrasound image, the apparatus comprising:

at least one processor configured to:

determine a direction of a head of a fetus based on a first ultrasound image obtained by scanning a body of a pregnant woman in a first direction,

determine a direction of a spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction, and

determine an orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus; and

a display that displays an image representing an orientation of the fetus and an ultrasound image obtained by scanning a body of the pregnant woman.

11. The device of claim 10, wherein the at least one processor is further configured to:

control the display to display a user interface for guiding a scan of the body of the pregnant woman in the first direction,

control the display to display a user interface for guiding a rescan of the body of the pregnant woman when the head of the fetus is not detected in the first ultrasound image,

determine a direction of the head of the fetus when the head of the fetus is detected in the first ultrasound image,

control the display to display a user interface for guiding a scan of the body of the pregnant woman in the second direction,

control the display to display a user interface for guiding a rescan of the body of the pregnant woman when the spine of the fetus is not detected in the second ultrasound image, and

determine a direction of the spine of the fetus when the spine of the fetus is detected in the second ultrasound image.

12. The apparatus of claim 10, further comprising: an input interface configured to receive a user input,

wherein the at least one processor is further configured to:

control the display to display a user interface for guiding a scan of the body of the pregnant woman in the first direction,

determine a direction of the head of the fetus when the head of the fetus is detected in the first ultrasound image,

determine a direction of the head of the fetus based on a user input to the first ultrasound image when the head of the fetus is not detected in the first ultrasound image,

control the display to display a user interface for guiding a scan of the body of the pregnant woman in the second direction,

determine a direction of the spine of the fetus when the spine of the fetus is detected in the second ultrasound image, and

determine a direction of the spine of the fetus based on a user input to the second ultrasound image when the spine of the fetus is not detected in the second ultrasound image.

13. The device of claim 10, wherein the at least one processor is further configured to:

detect the head of the fetus in the first ultrasound image, and

determine whether the head of the fetus in the first ultrasound image is located to the right or to the left relative to the torso of the fetus, and

wherein the display further displays the head of the fetus and the direction of the head of the fetus on the first ultrasound image.

14. The device of claim 10, wherein the at least one processor is further configured to:

detect the spine of the fetus and the torso of the fetus in the second ultrasound image, and

determine an angle of rotation of an axis of the fetus relative to a reference point based on the spine of the fetus and the torso of the fetus, and

wherein the display further displays the spine of the fetus and the angle of rotation on the second ultrasound image.

15. A computer-readable recording medium storing a program for executing the method of claim 1.

Technical Field

The present disclosure relates to a method and apparatus for displaying an ultrasound image and a computer program product, and more particularly, to a method of generating and displaying an image representing the orientation of a fetus within the body of a pregnant woman.

Background

Due to their non-invasive and non-destructive nature, ultrasound systems have been widely used in the medical field to obtain information about the interior of a subject. Ultrasound systems also play a key role in medical diagnostics because they can provide physicians with high resolution images of the internal region of a subject in real time without the need to perform surgical procedures to directly dissect the subject for observation.

The ultrasound system transmits an ultrasound signal to an object, receives an ultrasound signal (i.e., an ultrasound echo signal) reflected from the object, and generates a two-dimensional or three-dimensional (2D or 3D) ultrasound image based on the received ultrasound echo signal.
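As a minimal illustration of this echo-to-image step (not taken from the patent; the function names, the FFT-based envelope detection, and the 60 dB display dynamic range are assumptions), one scan line of received RF echo data can be mapped to B-mode pixel values by envelope detection followed by log compression:

```python
import numpy as np

def envelope(x: np.ndarray) -> np.ndarray:
    """Envelope of a real RF line via the analytic signal (FFT-based Hilbert)."""
    n = x.size
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(spec * h))

def rf_line_to_bmode(rf_line: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """Map one RF scan line to 8-bit B-mode intensities."""
    env = envelope(rf_line)
    env = env / (env.max() or 1.0)                 # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(env, 1e-12))   # log compression
    db = np.clip(db, -dynamic_range_db, 0.0)       # displayed dynamic range
    return ((db + dynamic_range_db) / dynamic_range_db * 255.0).astype(np.uint8)

# A synthetic decaying echo line stands in for real received data.
t = np.arange(2048)
rf = np.sin(2 * np.pi * 0.12 * t) * np.exp(-t / 512)
pixels = rf_line_to_bmode(rf)
```

In a full system, many such scan lines, beamformed across the transducer aperture, would be assembled into the 2D (or, with a swept aperture, 3D) image.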

The risk of cardiac malformation of the fetus may be determined based on the position of the fetal heart. For example, when the heart of a fetus is in the normal position, the risk of cardiac malformation of the fetus is about 1%. On the other hand, when the heart of the fetus is located on the right side of the chest (i.e., dextrocardia), the risk of cardiac malformation is reported to be about 95% or higher. Therefore, it is important to accurately determine the position of the heart of the fetus to predict the risk of cardiac malformation. However, because the fetus can move freely in the uterus, it may be difficult even for an experienced user to accurately identify the orientation of the fetus from an ultrasound cross-sectional image. Thus, errors may occur when predicting the risk of cardiac malformation based on the position of the heart.
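The decision described above can be sketched as a tiny helper (hypothetical names, not from the patent): once the fetal orientation fixes which side of the image corresponds to the fetus's own left, the heart's position in the image translates into a fetal-side position, and a heart on the fetal left is the expected finding:

```python
# Hypothetical sketch: map the heart's position in the image to the fetus's
# own left/right once the orientation is known, then check for the normal
# (left-sided) heart position.
def heart_position_is_normal(heart_on_image_left: bool,
                             fetal_left_is_image_left: bool) -> bool:
    """True when the heart lies on the fetus's left, i.e. the normal position."""
    heart_on_fetal_left = (heart_on_image_left == fetal_left_is_image_left)
    return heart_on_fetal_left

# A heart seen on the image's right side of a fetus whose left also maps to
# the image's right is on the fetal left, i.e. normally positioned.
normal = heart_position_is_normal(heart_on_image_left=False,
                                  fetal_left_is_image_left=False)
```

The point of the disclosure is precisely that `fetal_left_is_image_left` cannot be read off a single cross-section; it must come from the determined fetal orientation.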

Disclosure of Invention

Methods and apparatus are provided for displaying an image representing the orientation of a fetus within the body of a pregnant woman together with an ultrasound image, so that the positions of the internal organs of the fetus can be determined accurately.

Additional aspects will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the presented embodiments.

According to an aspect of the present disclosure, a method of displaying an ultrasound image includes: determining a direction of a head of a fetus based on a first ultrasound image obtained by scanning a body of a pregnant woman in a first direction; determining a direction of a spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction; determining an orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus; and displaying an image representing the orientation of the fetus together with an ultrasound image obtained by scanning the body of the pregnant woman.
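The claimed flow might be sketched as follows; the detector outputs, field names, and the assumption of a longitudinal lie are illustrative, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class FetalOrientation:
    lie: str                 # "longitudinal" or "transverse" fetal axis
    presentation: str        # "cephalic" (head toward cervix) or "breech"
    spine_angle_deg: float   # rotation of the fetal axis seen in the second scan

def determine_orientation(head_direction: str, spine_angle_deg: float) -> FetalOrientation:
    """Combine the head direction (from the first scan) and the spine
    direction (from the second scan) into one orientation; a longitudinal
    lie is assumed for simplicity."""
    presentation = "cephalic" if head_direction == "toward_cervix" else "breech"
    return FetalOrientation("longitudinal", presentation, spine_angle_deg)

orientation = determine_orientation("toward_cervix", spine_angle_deg=30.0)
```

The two scans play complementary roles: the first fixes the head end of the fetal axis, the second fixes how the fetus is rotated about that axis.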

According to another aspect of the present disclosure, an apparatus for displaying an ultrasound image includes: at least one processor configured to determine a direction of a head of a fetus based on a first ultrasound image obtained by scanning a body of a pregnant woman in a first direction, determine a direction of a spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction, and determine an orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus; and a display that displays an image representing the orientation of the fetus together with an ultrasound image obtained by scanning the body of the pregnant woman.

According to another aspect of the present disclosure, a computer program product includes at least one computer-readable recording medium storing a program for executing the method of displaying an ultrasound image.

Drawings

The above and other aspects, features and advantages of particular embodiments of the present disclosure will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which:

fig. 1 is a block diagram of a configuration of an ultrasonic diagnostic apparatus according to an embodiment;

fig. 2 shows an example of an ultrasonic diagnostic apparatus according to an embodiment;

fig. 3A and 3B are views for explaining a general method of diagnosing a fetus by using an ultrasonic diagnostic apparatus;

fig. 4A and 4B are block diagrams of the configuration of an ultrasound image display apparatus according to an embodiment;

fig. 5A to 5C are diagrams for explaining a marker for indicating the position of a probe according to an embodiment;

fig. 6 is a diagram for explaining a direction in which a user scans the body of a pregnant woman via a probe, according to an embodiment;

fig. 7 illustrates a User Interface (UI) displayed by an ultrasound image display apparatus according to an embodiment;

fig. 8A to 8D illustrate ultrasound images obtained by performing a scan along the orientation of a fetus within a pregnant woman's body, according to an embodiment;

fig. 9 shows an example of ultrasound images representing a head of a fetus and an orientation of the head of the fetus according to an embodiment;

fig. 10 illustrates a UI displayed by an ultrasound image display device according to an embodiment;

fig. 11 is a view for explaining an ultrasound image obtained by scanning according to the orientation of the fetus, according to an embodiment;

fig. 12 shows an example of an ultrasound image representing the spine and rotation angle of a fetus according to an embodiment;

fig. 13 and 14 respectively show a UI displayed by an ultrasound image display device according to an embodiment;

fig. 15 is a diagram for explaining the orientation of the fetus determined based on the orientation of the head and spine of the fetus, according to an embodiment;

fig. 16 shows an example of ultrasound images representing the left and right sides of a fetus according to an embodiment;

fig. 17A and 17B illustrate screens displayed by an ultrasound image display apparatus according to an embodiment;

fig. 18A is a view for explaining a method of rotating and displaying a simulation image performed by an ultrasound image display apparatus according to an embodiment;

fig. 18B is a diagram for explaining a method of rotating an ultrasound image based on a user input for a simulation image and displaying the resulting ultrasound image, performed by an ultrasound image display apparatus, according to an embodiment;

fig. 19A and 19B illustrate a screen that displays whether the position of an organ of a fetus is normal or abnormal based on information about the orientation of the fetus provided by an ultrasound image display device according to an embodiment;

fig. 20A, 20B, and 21 are diagrams for explaining a method of determining and displaying a fetal posture performed by an ultrasound image display apparatus according to an embodiment of the present disclosure;

fig. 22 is a flowchart of a method of displaying an ultrasound image according to an embodiment; and

fig. 23A and 23B are flowcharts of a method of displaying an ultrasound image according to an embodiment.

Detailed Description

Specific embodiments are described in more detail below with reference to the accompanying drawings.

In the following description, the same reference numerals are used for the same elements even in different drawings. The matters defined in the description such as a detailed construction and elements are provided to assist in a comprehensive understanding of the exemplary embodiments. Therefore, it is apparent that the exemplary embodiments may be carried out without these specifically defined matters. In other instances, well-known functions or constructions are not described in detail since they would obscure the example embodiments in unnecessary detail.

Terms such as "component" and "portion" as used herein refer to units that may be implemented in software or hardware. According to exemplary embodiments, a plurality of components or portions may be implemented by a single unit or element, or a single component or portion may include a plurality of elements. When a statement such as "at least one of" follows a list of elements, the statement modifies the entire list of elements rather than each individual element in the list.

In embodiments of the present disclosure, the image may include any medical image obtained by various medical imaging devices, such as a Magnetic Resonance Imaging (MRI) device, a Computed Tomography (CT) device, an ultrasound imaging device, or an X-ray device.

Further, in the present specification, an "object," as a thing to be imaged, may include a human, an animal, or a part of a human or an animal. For example, the object may include a part of a human body (that is, an organ or tissue) or a phantom. Although the following description is provided with respect to an example in which captured images of the body of a pregnant woman, and of a fetus as an object within the body of the pregnant woman, are displayed, the present disclosure may also be applied to displaying captured images of various objects.

Throughout the specification, an ultrasound image refers to an image of an object processed based on an ultrasound signal transmitted to and reflected from the object. The ultrasound image may be a two-dimensional or three-dimensional (2D or 3D) image.

Fig. 1 is a block diagram showing a configuration of an ultrasonic diagnostic apparatus 100 (i.e., a diagnostic apparatus) according to an embodiment of the present disclosure.

Referring to fig. 1, the ultrasonic diagnostic apparatus 100 may include a probe 20, an ultrasonic transceiver 110, a controller 120, an image processor 130, one or more displays 140, a memory 150 (e.g., an internal memory), a communicator 160 (i.e., a communication device or interface), and an input interface 170.

The ultrasonic diagnostic apparatus 100 may be of a cart type or of a portable type that is movable or hand-held. Examples of the portable ultrasonic diagnostic apparatus may include a smartphone, a laptop computer, a personal digital assistant (PDA), and a tablet personal computer (PC), each of which may include a probe and a software application, but embodiments are not limited thereto.

The probe 20 may include a plurality of transducers. The plurality of transducers may transmit ultrasound signals to the subject 10 in response to transmit signals received from the transmitter 113 through the probe 20. The plurality of transducers may receive ultrasonic signals reflected from the subject 10 to generate received signals. In addition, the probe 20 and the ultrasonic diagnostic apparatus 100 may be integrally formed (e.g., provided in a single housing), or the probe 20 and the ultrasonic diagnostic apparatus 100 may be separately formed (e.g., separately provided in separate housings), but may be connected wirelessly or via an electric wire. In addition, according to an embodiment, the ultrasonic diagnostic apparatus 100 may include one or more probes 20.

The controller 120 may control the transmitter 113 to cause the transmitter 113 to generate a transmit signal to be applied to each of the plurality of transducers included in the probe 20 based on the position and focus of the plurality of transducers.

The controller 120 may control the ultrasound receiver 115 to generate ultrasound data by converting the received signals received from the probe 20 from analog signals to digital signals and summing the received signals converted to digital form based on the positions and focal points of the plurality of transducers.
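The summing described here is classic receive-side delay-and-sum beamforming. A minimal sketch follows; the array geometry, sampling rate, and sound speed are illustrative assumptions, not values from the patent:

```python
import numpy as np

def delay_and_sum(channels: np.ndarray, element_x: np.ndarray,
                  focus: tuple, c: float = 1540.0, fs: float = 40e6) -> np.ndarray:
    """channels: (num_elements, num_samples) digitized per-transducer signals.
    Each channel is delayed according to its element's distance to the focal
    point, then all channels are summed into one beamformed line."""
    fx, fz = focus
    out = np.zeros(channels.shape[1])
    for ch, x in zip(channels, element_x):
        dist = np.hypot(fx - x, fz)               # element-to-focus distance
        delay = int(round((dist - fz) / c * fs))  # relative delay in samples
        out += np.roll(ch, -delay)                # align, then sum
    return out

elements = np.linspace(-0.01, 0.01, 32)           # 32-element, 2 cm aperture
data = np.random.default_rng(0).normal(size=(32, 1024))
beamformed = delay_and_sum(data, elements, focus=(0.0, 0.03))
```

Aligning the channels before summing makes echoes from the focal point add coherently while off-axis echoes largely cancel, which is what concentrates sensitivity at the focus.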

The image processor 130 may generate an ultrasound image by using the ultrasound data generated from the ultrasound receiver 115.

The display 140 may display the generated ultrasound image and various pieces of information processed by the ultrasound diagnostic apparatus 100. According to the present exemplary embodiment, the ultrasonic diagnostic apparatus 100 may include two or more displays 140. The display 140 may include a touch screen in combination with a touch panel.

The controller 120 may control the operation of the ultrasonic diagnostic apparatus 100 and the signal flow between the internal elements of the ultrasonic diagnostic apparatus 100. The controller 120 may include a memory for storing a program or data for performing the functions of the ultrasonic diagnostic apparatus 100 and a processor and/or microprocessor (not shown) for processing the program or data. For example, the controller 120 may control the operation of the ultrasonic diagnostic apparatus 100 by receiving a control signal from the input interface 170 or an external device.

The ultrasonic diagnostic apparatus 100 may include a communicator 160 and may be connected to external apparatuses, for example, servers, medical apparatuses, and portable devices (such as smart phones, tablet Personal Computers (PCs), wearable devices, etc.) via the communicator 160.

The communicator 160 may include at least one element capable of communicating with an external device. For example, the communicator 160 may include at least one of a short-range communication module, a wired communication module, and a wireless communication module.

The communicator 160 may receive a control signal and data from an external device and transmit the received control signal to the controller 120 so that the controller 120 may control the ultrasonic diagnostic apparatus 100 in response to the received control signal.

The controller 120 may transmit a control signal to the external device via the communicator 160 so that the external device may be controlled in response to the control signal of the controller 120.

For example, an external device connected to the ultrasonic diagnostic apparatus 100 may process data of the external device in response to a control signal of the controller 120 received via the communicator 160.

A program for controlling the ultrasonic diagnostic apparatus 100 may be installed in the external apparatus. The program may include a command language for performing a part of the operation of the controller 120 or the entire operation of the controller 120.

The program may be preinstalled in the external device or may be installed by a user of the external device by downloading the program from a server providing the application. The server that provides the application may include a recording medium that stores the program.

The memory 150 may store various data or programs for driving and controlling the ultrasonic diagnostic apparatus 100, input and/or output ultrasonic data, ultrasonic images, applications, and the like.

The input interface 170 may receive an input of a user to control the ultrasonic diagnostic apparatus 100 and may include a keyboard, buttons, keys, a mouse, a trackball, a wheel switch, a knob, a touch pad, a touch screen, a microphone, a motion input device, a biometric input device, and the like. For example, the input of the user may include an input for manipulating a button, a keyboard, a mouse, a trackball, a wheel switch, or a knob, an input for touching a touch pad or a touch screen, a voice input, a motion input, and a biological information input (e.g., iris recognition or fingerprint recognition), but the exemplary embodiments are not limited thereto.

An example of the ultrasonic diagnostic apparatus 100 according to the present exemplary embodiment is described below with reference to fig. 2.

Fig. 2 shows an example of an ultrasonic diagnostic apparatus 100a, an ultrasonic diagnostic apparatus 100b, and an ultrasonic diagnostic apparatus 100c according to the present exemplary embodiment.

Referring to fig. 2, the ultrasonic diagnostic apparatus 100a may include a main display 121 and a sub-display 122. At least one of the main display 121 and the sub-display 122 may include a touch screen. The main display 121 and the sub-display 122 may display the ultrasound image and/or various information processed by the ultrasound diagnostic apparatus 100 a. The main display 121 and the sub-display 122 may provide a Graphical User Interface (GUI) to receive data input of a user to control the ultrasonic diagnostic apparatus 100 a. For example, the main display 121 may display an ultrasound image and the sub-display 122 may display a control panel for controlling the display of the ultrasound image as a GUI. The sub-display 122 may receive input of data to control display of images through a control panel displayed as a GUI. The ultrasonic diagnostic apparatus 100a can control the display of the ultrasonic image on the main display 121 by using the input control data.

Referring to fig. 2, the ultrasonic diagnostic apparatus 100b may include a control panel 165. The control panel 165 may include buttons, a trackball, a wheel switch, or a knob, and may receive data for controlling the ultrasonic diagnostic apparatus 100b from a user. For example, the control panel 165 may include a Time Gain Compensation (TGC) button 171 and a freeze button 172. The TGC button 171 is used to set a TGC value for each depth of the ultrasound image. Further, when the input of the freeze button 172 is detected during scanning of the ultrasound image, the ultrasound diagnostic apparatus 100b may keep the frame image displayed at the point in time.
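Conceptually, the TGC settings define a gain-versus-depth curve that is interpolated along each scan line to offset the attenuation of deeper echoes; a sketch with illustrative slider values (not from the patent):

```python
import numpy as np

def apply_tgc(rf: np.ndarray, gains_db: np.ndarray) -> np.ndarray:
    """rf: (num_samples,) scan line, shallow to deep; gains_db: per-slider
    gains from shallow to deep, interpolated across the full depth."""
    depth_gain_db = np.interp(np.linspace(0.0, 1.0, rf.size),
                              np.linspace(0.0, 1.0, gains_db.size), gains_db)
    return rf * 10.0 ** (depth_gain_db / 20.0)     # dB -> linear amplitude gain

line = np.ones(1000)
compensated = apply_tgc(line, gains_db=np.array([0.0, 6.0, 12.0, 18.0]))
```

With these values the shallowest samples are left unchanged (0 dB) while the deepest are boosted by 18 dB, compensating for the roughly depth-proportional attenuation of the echo.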

Buttons, a trackball, a wheel switch, and a knob included in the control panel 165 may be provided as GUIs to the main display 121 and the sub-display 122.

Referring to fig. 2, the ultrasonic diagnostic apparatus 100c may be a portable ultrasonic diagnostic apparatus. Examples of the portable ultrasonic diagnostic apparatus may include, for example, a smart phone, a laptop computer, a Personal Digital Assistant (PDA), or a tablet PC including a probe and an application, but the exemplary embodiments are not limited thereto.

The ultrasonic diagnostic apparatus 100c may include a probe 20 and a body 40. The probe 20 may be connected to one side of the body 40 by a wire or wirelessly. The body 40 may include a touch screen 145. The touch screen 145 may display an ultrasound image, various pieces of information processed by the ultrasound diagnostic apparatus 100c, and a GUI.

Fig. 3A and 3B are diagrams for explaining a method of diagnosing a fetus by using an ultrasonic diagnostic apparatus according to an embodiment.

Fig. 3A shows orientations of the fetus within the body of a pregnant woman. Images 301 to 303 shown in fig. 3A represent a longitudinal cephalic presentation, a longitudinal breech presentation, and a transverse shoulder presentation, respectively.

Fig. 3B shows an example in which a user 305 diagnoses the fetus in the body 310 of a pregnant woman by using an ultrasonic diagnostic apparatus 320. When the user 305 diagnoses the fetus by using the ultrasonic diagnostic apparatus 320, information about the left and right sides of the fetus is important for determining whether the organs of the fetus are in the normal position (situs solitus) or a reversed position (situs inversus). However, because the fetus may lie within the body of the pregnant woman in various orientations as shown in fig. 3A, the left and right sides of the body of the pregnant woman may not correspond to the left and right sides of the fetus, which makes it difficult to identify the left and right sides of the fetus.

In conventional methods, to determine information about the left and right sides of the fetus, the user 305 examines information related to the position of the fetus (e.g., whether the fetus is in a cephalic or breech presentation) and then directly determines the left and right sides of the fetus.

However, it may be difficult for a user to intuitively determine the left and right sides of the fetus based solely on a cross-sectional image of the fetus. Thus, according to various embodiments of the present disclosure, methods and apparatuses are provided for displaying an image representing the orientation of the fetus within the body of the pregnant woman together with an ultrasound image, so that the user can diagnose the fetus accurately and intuitively.

Fig. 4A and 4B are block diagrams of the configuration of an ultrasound image display apparatus 400 according to an embodiment.

The ultrasound image display apparatus 400 of fig. 4A and 4B is an apparatus for displaying an ultrasound image. The ultrasound image display apparatus 400 may be included in the ultrasound diagnostic apparatus 100 of fig. 1 or may include the ultrasound diagnostic apparatus 100 of fig. 1. Alternatively, the ultrasound image display apparatus 400 may be a separate electronic device connected to the ultrasound diagnostic apparatus 100 and display an ultrasound image obtained from the ultrasound diagnostic apparatus 100 or an external device.

The description of the ultrasonic diagnostic apparatus 100 of fig. 1 may be applied to the components of the ultrasonic image display apparatus 400 of fig. 4A and 4B, and thus, the description will not be repeated below.

Referring to fig. 4A, the ultrasound image display apparatus 400 may include a processor 410 and a display 420, according to an embodiment. However, the ultrasound image display apparatus 400 may include more components than those shown in fig. 4A. For example, according to an embodiment, as shown in fig. 4B, the ultrasound image display apparatus 400 may further include at least one of an input interface 430, a memory 440, and a communicator 450.

Further, according to an embodiment, the ultrasound image display apparatus 400 may include the probe 20 or may be connected to the probe 20. The ultrasound image display apparatus 400 may display an ultrasound image of an object acquired via the probe 20.

According to an embodiment, the ultrasonic transceiver 411, the image processor 413, and the controller 415 of the processor 410 may perform at least some of the operations and functions of the ultrasonic transceiver 110, the image processor 130, and the controller 120, respectively, described with reference to fig. 1. According to an embodiment, the display 420, the input interface 430, the memory 440, and the communicator 450 may correspond to the display 140, the input interface 170, the memory 150, and the communicator 160, respectively. The description that has been provided above for the respective components will be omitted here.

Although fig. 4A and 4B illustrate that the ultrasound image display apparatus 400 includes a single processor for simplicity, embodiments are not limited thereto, and the ultrasound image display apparatus 400 may include a plurality of processors. When the ultrasound image display apparatus 400 includes a plurality of processors, the operations of the processor 410 described below may be performed by the plurality of processors in a distributed manner.

The components of the ultrasound image display apparatus 400 will now be described in more detail.

According to an embodiment of the present disclosure, the processor 410 of the ultrasound image display apparatus 400 may determine the orientation of the fetus within the body of the pregnant woman. The orientation of the fetus may be the orientation of the axis of the fetus relative to the body of the pregnant woman. For example, the orientation of the fetus may include information about whether the fetus lies in a longitudinal or transverse direction relative to the body of the pregnant woman, information about whether the head or the buttocks of the fetus face the cervix of the pregnant woman, and information about which direction within the body of the pregnant woman the anterior part of the fetus faces.

The processor 410 may determine one of the directions of the head and the spine of the fetus based on a first ultrasound image obtained by scanning the body of the pregnant woman in a first direction, and determine the other of the directions of the head and the spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction.

The first direction may be different from the second direction. The first direction may be perpendicular to the second direction. For example, the first direction may be a direction of scanning one of a longitudinal plane and a transverse plane of the body of the pregnant woman, and the second direction may be a direction of scanning the other of the longitudinal plane and the transverse plane. As another example, the first direction may be a direction of a sagittal plane, a median plane, or an anterior-posterior plane that scans the body of the pregnant woman, and the second direction may be a direction of an axial plane, a lateral plane, or a horizontal plane that scans the body of the pregnant woman.

For example, the processor 410 may first determine the direction of the head of the fetus based on a first ultrasound image obtained by scanning the body of the pregnant woman in a first direction, and then determine the direction of the spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction. However, embodiments are not limited thereto; the processor 410 may instead first determine the direction of the spine of the fetus based on the first ultrasound image and then determine the direction of the head of the fetus based on the second ultrasound image.

The processor 410 may determine the orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus.
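The two-scan flow described above can be sketched as follows. This is a minimal illustration only, not the actual implementation of the processor 410; the function name and return format are assumptions, and the mapping from head side to presentation (right indicating cephalic presentation, left indicating breech presentation) follows the convention described later in this disclosure.

```python
def determine_fetal_orientation(head_side, spine_rotation_deg):
    """Combine the two scan results into one fetal-orientation record.

    head_side: 'left' or 'right' -- side of the head in the first
        (longitudinal) ultrasound image.
    spine_rotation_deg: clockwise rotation of the fetal axis found in
        the second (transverse) ultrasound image.
    """
    # Convention assumed here: head toward the right of the longitudinal
    # image -> cephalic presentation; toward the left -> breech.
    presentation = "cephalic" if head_side == "right" else "breech"
    return {"presentation": presentation,
            "rotation_deg": spine_rotation_deg % 360}
```

An orientation record such as `{"presentation": "cephalic", "rotation_deg": 90}` could then drive the displayed image representing the orientation of the fetus.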

The display 420 may display an image representing the orientation of the fetus and an ultrasound image obtained by scanning the body of the pregnant woman. The display 420 may display an image representing the orientation of the fetus and a real-time ultrasound image obtained by scanning the body of the pregnant woman, thereby allowing the user to easily recognize information on the left and right sides of the cross-sectional image of the fetus.

The operations of the components of the ultrasound image display apparatus 400 for displaying an image representing the orientation of the fetus will now be described in more detail.

When scanning the object, the display 420 may display a marker indicating the orientation of the probe 20 used to scan the object to obtain an ultrasound image. Based on the marker displayed on the screen of the display 420, the user may scan the body of the pregnant woman while holding the probe 20 in one hand in the orientation corresponding to the displayed marker. The processor 410 may determine the orientation of the fetus based on the direction of the head of the fetus, the direction of the spine of the fetus, and the orientation of the probe 20.

Fig. 5A to 5C are diagrams for explaining markers for indicating the orientation of the probe 20, according to an embodiment.

Fig. 5A shows an example in which the user holds the probe 20 in a vertical orientation and scans the body of the pregnant woman in a longitudinal direction with the probe 20. Since the marker 501 is displayed at the top of the screen of the display 420, the user may hold the probe 20 with one hand so that the marker 503 attached to the probe 20 is located at the top of the probe 20, and may scan the body of the pregnant woman with reference to the marker 501.

Fig. 5B shows an example in which the user holds the probe 20 in a horizontal orientation and scans the body of the pregnant woman in a lateral direction. Since the marker 501 is displayed on the left side of the screen of the display 420, the user may hold the probe 20 with one hand so that the marker 503 attached to the probe 20 is located on the left side of the probe 20, and may scan the body of the pregnant woman with reference to the marker 501.

Fig. 5C shows the relationship between the marker 503 attached to the probe 20 and the marker 501 displayed on the screen.

The image 510 of fig. 5C corresponds to an example in which the user holds the probe 20 in a vertical orientation and scans the body of the pregnant woman in the longitudinal direction. Since the marker 501 is displayed at the top of the screen of the display 420, the user may hold the probe 20 with one hand so that the marker 503 attached to the probe 20 is located at the top of the probe 20, and may scan the body of the pregnant woman with reference to the marker 501. When the fetus lying in the longitudinal direction within the body of the pregnant woman is scanned as shown in the image 510, the ultrasound image display apparatus 400 may obtain the ultrasound image 511, the ultrasound image 512, the ultrasound image 513, or the ultrasound image 514.

Based on the ultrasound image 511 and the ultrasound image 512, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a longitudinal lie with cephalic presentation. As shown in the ultrasound image 511 and the ultrasound image 512, when the head of the fetus is positioned on the right side of the ultrasound image, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a longitudinal lie with cephalic presentation.

Based on the ultrasound image 513 and the ultrasound image 514, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a longitudinal lie with breech presentation. As shown in the ultrasound image 513 and the ultrasound image 514, when the head of the fetus is positioned on the left side of the ultrasound image, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a longitudinal lie with breech presentation.

The image 520 of fig. 5C is obtained by scanning the body of the pregnant woman in a lateral direction with the probe 20 in a horizontal orientation. Since the marker 501 is displayed on the left side of the screen of the display 420, the user may hold the probe 20 with one hand so that the marker 503 attached to the probe 20 is located on the left side of the probe 20, and may scan the body of the pregnant woman with reference to the marker 501. When the fetus lying in the lateral direction within the body of the pregnant woman is scanned as shown in the image 520, the ultrasound image display apparatus 400 may obtain the ultrasound image 521, the ultrasound image 522, the ultrasound image 523, or the ultrasound image 524.

Based on the ultrasound image 521, the ultrasound image 522, the ultrasound image 523, or the ultrasound image 524, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a transverse lie with shoulder presentation. As shown in the ultrasound images 521 to 524, when the head of the fetus in an ultrasound image obtained by scanning the body of the pregnant woman in the lateral direction is positioned on the right or left side of the ultrasound image, the user or the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a transverse lie with shoulder presentation.

The ultrasound image display apparatus 400 may display a marker indicating the orientation of the probe 20 to enable the user to continuously scan the body of the pregnant woman in a specific direction to obtain the ultrasound image.

According to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may obtain the first and second ultrasound images by scanning the body of the pregnant woman in the first and second directions, respectively.

As shown in image 601 of fig. 6, the processor 410 may obtain an ultrasound image by scanning a cross-section of the fetus along a longitudinal axis of the body of the pregnant woman. The processor 410 may obtain an ultrasound image by scanning the body of the pregnant woman in a longitudinal direction. Further, as shown in image 603 of fig. 6, the processor 410 may obtain an ultrasound image by scanning a cross-section of the fetus along a transverse axis of the pregnant woman's body. The processor 410 may obtain an ultrasound image by scanning the body of the pregnant woman in a lateral direction.

According to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may display a UI for guiding the user regarding the direction in which the object is to be scanned.

The processor 410 may control the display 420 to display a UI for guiding a user to scan the body of the pregnant woman in a first direction.

The ultrasound image display apparatus 400 may display a UI for guiding the user regarding the scanning direction. The scanning direction may be represented by at least one of a character, an image, and a symbol. For example, referring to fig. 7, the ultrasound image display apparatus 400 may display an image 700 including at least one of an arrow 701 and a guide phrase 703, so as to guide the user to scan the body of the pregnant woman in a longitudinal direction.
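As a sketch, the guidance UI of fig. 7 could be composed from the requested scanning direction as follows; the arrow glyphs, the wording of the guide phrase, and the function name are illustrative assumptions, not the actual UI of the apparatus.

```python
def scan_guidance(direction):
    """Compose illustrative guidance elements (an arrow and a guide
    phrase) for the requested scanning direction."""
    # Assumed glyphs: up-down arrow for a longitudinal scan,
    # left-right arrow for a lateral scan.
    arrows = {"longitudinal": "\u2195", "lateral": "\u2194"}
    return {
        "arrow": arrows[direction],
        "phrase": f"Scan the body of the pregnant woman in the {direction} direction.",
    }
```

The returned elements would then be rendered on the display 420, as with the arrow 701 and the guide phrase 703.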

Following the guidance of the ultrasound image display apparatus 400, the user may manipulate the probe 20 to scan the body of the pregnant woman in the longitudinal direction.

Fig. 8A to 8D show ultrasound images obtained by performing scanning according to the orientation of the fetus within the body of the pregnant woman.

When the fetus is in the position as shown in the image 811 of fig. 8A, the ultrasound image 813 can be obtained by scanning the body of the pregnant woman in the longitudinal direction via the probe 20. Ultrasound image 813 can show a cross-section of the fetus with the same orientation as shown in image 815. The head of the fetus in the ultrasound image 813 may be positioned to the right relative to the torso.

When the fetus is in the position as shown in the image 812 of fig. 8B, an ultrasound image 823 may be obtained by scanning the body of the pregnant woman in the longitudinal direction via the probe 20. Ultrasound image 823 may show a cross-section of the fetus with the same orientation as shown in image 825. The head of the fetus in ultrasound image 823 may be located to the right relative to the torso.

When the fetus is in the position as shown in the image 831 of fig. 8C, an ultrasound image 833 can be obtained by scanning the body of the pregnant woman in the longitudinal direction via the probe 20. Ultrasound image 833 may show a cross-section of the fetus with the same orientation as shown in image 835. The head of the fetus in the ultrasound image 833 may be positioned to the left relative to the torso.

When the fetus is in the position as shown in the image 841 of fig. 8D, an ultrasound image 843 may be obtained by scanning the body of the pregnant woman in the longitudinal direction via the probe 20. Ultrasound image 843 may show a cross-section of the fetus with the same orientation as shown in image 845. The head of the fetus in the ultrasound image 843 may be positioned to the left relative to the torso.

The processor 410 may detect the head of the fetus in the first ultrasound image and determine the direction of the head of the fetus based on the detected head. The processor 410 may determine whether the head of the fetus in the first ultrasound image is positioned to the right or to the left relative to the torso. The processor 410 may determine the presentation based on the relationship between the orientation of the probe 20 (or the position of the marker displayed on the display 420) and the direction of the head of the fetus.

For example, when the head of the fetus detected in the ultrasound image is oriented in the right direction, the processor 410 may determine that the orientation of the fetus corresponds to a cephalic presentation of the fetus in the body of the pregnant woman. Alternatively, when the head of the fetus detected in the ultrasound image is oriented in the left direction, the processor 410 may determine that the orientation of the fetus corresponds to a breech presentation of the fetus in the body of the pregnant woman.

The display 420 may display the head of the fetus and the direction of the head of the fetus on the first ultrasound image. The direction of the head of the fetus may be represented by at least one of an image, a figure, a letter, and a symbol.

Fig. 9 illustrates an example of ultrasound images representing the head of a fetus and the direction of the head of the fetus, according to an embodiment.

As shown in the first ultrasound image 910 of fig. 9, the display 420 may display the detected head 913 of the fetus and indicate that the head of the fetus is located on the right side of the first ultrasound image 910. The display 420 may indicate that the head of the fetus in the first ultrasound image 910 is positioned to the right relative to the torso. As shown in the image 920 of fig. 9, the display 420 may display an indicator 921 indicating that the head 913 is located on the right side of the image 920.

The processor 410 may control the display 420 to display a UI for guiding a user to scan the body of the pregnant woman in a second direction.

The ultrasound image display apparatus 400 may display a UI for guiding the user regarding the scanning direction. The scanning direction may be represented by at least one of a character, an image, and a symbol. For example, referring to fig. 10, the ultrasound image display apparatus 400 may display an image 1000 including at least one of an arrow 1001 and a guide phrase 1003, so as to guide the user to scan the body of the pregnant woman in a lateral direction.

Following the guidance of the ultrasound image display apparatus 400, the user may manipulate the probe 20 to scan the body of the pregnant woman in the lateral direction.

Fig. 11 is a diagram for explaining ultrasound images obtained by performing scanning according to the position of the fetus.

When the fetus lies in the longitudinal direction and the body of the pregnant woman is scanned in the lateral direction via the probe 20, as shown in the images 1102, 1104, 1112, 1114, 1122, 1124, 1132, and 1134, ultrasound images showing the cross-sections of the fetus shown in the images 1101, 1103, 1111, 1113, 1121, 1123, 1131, and 1133, respectively, may be obtained.

For example, when the fetus is in the position as shown in image 1102 of fig. 11, an ultrasound image may be obtained by scanning the body of the pregnant woman in a lateral direction via probe 20. When the spine of the fetus faces the back of the pregnant woman as shown in image 1102, the spine may be located at the bottom in the ultrasound image as shown in image 1101. The position of the spine detected in the second ultrasound image obtained when the spine of the fetus faces the back of the pregnant woman may be used as a reference point for determining the rotation angle of the fetus. Alternatively, a point corresponding to the 6 o' clock direction on the ultrasound image or a point corresponding to the bottom of the ultrasound image may be determined as a reference point for determining the rotation angle of the fetus.

As another example, the image 1104 of fig. 11 shows the axis of the fetus rotated by 90 degrees compared to the image 1102 of fig. 11. When the fetus is in the position shown in the image 1104 of fig. 11, an ultrasound image showing the cross-section of the fetus shown in the image 1103 may be obtained by scanning the body of the pregnant woman in the lateral direction via the probe 20. As shown in the image 1103 of fig. 11, an ultrasound image in which the spine of the fetus is rotated by 90 degrees clockwise relative to the reference point may be obtained.

The processor 410 may detect a spine of the fetus in the second ultrasound image and determine a direction of the spine of the fetus based on the detected spine of the fetus. The processor 410 may detect the torso and spine of the fetus in the second ultrasound image and determine the angle by which the axis of the fetus in the second ultrasound image is rotated relative to the reference point. The display 420 may display the spine and the rotation angle of the fetus on the second ultrasound image.

According to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may determine the angle by which the fetus is rotated within the body of the pregnant woman, based on the position of the spine of the fetus obtained by scanning the body of the pregnant woman in the lateral direction.

Fig. 12 shows an example of an ultrasound image representing the spine and the rotation angle of a fetus, according to an embodiment of the present disclosure. As shown in the ultrasound image 1210 of fig. 12, the display 420 may display the spine 1211 of the fetus and the rotation angle 1215 of the fetus, i.e., the angle between the line connecting the center point of the torso of the fetus to the reference point 1213 and the line connecting the center point of the torso of the fetus to the spine 1211 of the fetus.
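The rotation angle of fig. 12 can be sketched as a computation over the detected torso center and spine position. The sketch below assumes image coordinates in which y increases downward, takes the 6 o'clock point as the 0-degree reference as described above, and counts clockwise angles on the screen as positive; the function name is hypothetical.

```python
import math

def fetal_rotation_deg(torso_center, spine_point):
    """Clockwise angle of the spine about the torso center, measured
    from the 6 o'clock reference (image coordinates: y grows downward).

    torso_center, spine_point: (x, y) pixel coordinates.
    """
    dx = spine_point[0] - torso_center[0]
    dy = spine_point[1] - torso_center[1]
    # atan2(-dx, dy) is 0 when the spine is straight below the torso
    # center (6 o'clock) and increases clockwise as seen on the screen.
    return math.degrees(math.atan2(-dx, dy)) % 360
```

For instance, a spine detected straight below the torso center yields 0 degrees, matching the reference-point case in which the spine of the fetus faces the back of the pregnant woman.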

According to an embodiment of the present disclosure, when the head or the spine of the fetus is not detected in one of the first and second ultrasound images, the ultrasound image display apparatus 400 may display a UI for guiding the user to rescan the body of the pregnant woman.

When the head of the fetus is detected in the first ultrasound image, the processor 410 may determine the orientation of the head of the fetus by analyzing the first ultrasound image. However, when the head of the fetus is not detected in the first ultrasound image, the processor 410 may control the display 420 to display a UI for guiding the user to rescan the body of the pregnant woman. The processor 410 may control the display 420 to display a UI for guiding a user to scan the body of the pregnant woman in a first direction.

Further, when the spine of the fetus is detected in the second ultrasound image, the processor 410 may determine the direction of the spine of the fetus by analyzing the second ultrasound image. Otherwise, when the spine of the fetus is not detected in the second ultrasound image, the processor 410 may control the display 420 to display a UI for guiding the user to rescan the pregnant woman's body. The processor 410 may control the display 420 to display a UI for guiding a user to scan the body of the pregnant woman in a second direction.

Referring to fig. 13, the ultrasound image display apparatus 400 may display the obtained first and second ultrasound images together on a single screen. However, embodiments of the present disclosure are not limited thereto, and the ultrasound image display apparatus 400 may sequentially display the first ultrasound image and the second ultrasound image.

As shown in an image 1301 of fig. 13, according to an embodiment, when the head of a fetus is not detected in the first ultrasound image obtained by scanning the body of the pregnant woman in the longitudinal direction, the ultrasound image display apparatus 400 may display a UI for guiding the user to rescan the body of the pregnant woman. The user may change the position, angle, etc. of the probe 20 based on the UI for guiding rescanning so that the ultrasound image display apparatus 400 may obtain the first ultrasound image again.

Further, as shown in an image 1303 of fig. 13, according to an embodiment, when the spine of the fetus is not detected in the second ultrasound image obtained by scanning the body of the pregnant woman in the lateral direction, the ultrasound image display apparatus 400 may display a UI for guiding the user to rescan the body of the pregnant woman. The user may change the position, angle, etc. of the probe 20 based on the UI for guiding rescanning, so that the ultrasound image display apparatus 400 may obtain the second ultrasound image again.

According to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may receive a user input related to the orientation of the fetus when the head or the spine of the fetus is not detected in one of the first ultrasound image and the second ultrasound image.

As shown in fig. 4B, according to an embodiment, the ultrasound image display apparatus 400 may further include an input interface 430 for receiving a user input.

When the head of the fetus is detected in the first ultrasound image, the processor 410 may determine the orientation of the head of the fetus by analyzing the first ultrasound image. However, when the head of the fetus is not detected in the first ultrasound image, the processor 410 may determine the orientation of the head of the fetus based on the user input to the first ultrasound image.

Further, when the spine of the fetus is detected in the second ultrasound image, the processor 410 may determine the direction of the spine of the fetus by analyzing the second ultrasound image. Otherwise, when the spine of the fetus is not detected in the second ultrasound image, the processor 410 may determine the direction of the spine of the fetus based on the user input to the second ultrasound image.
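The detect, rescan, and ask-the-user flow described in the preceding paragraphs can be sketched as below. Here `detect`, `show_rescan_ui`, and `prompt_user` are hypothetical callbacks standing in for the image analysis of the processor 410, the rescan-guidance UI, and the input interface, respectively; the control flow is an illustrative assumption.

```python
def resolve_direction(detect, show_rescan_ui, prompt_user, max_rescans=1):
    """Return the head (or spine) direction: try automatic detection
    first, guide the user to rescan on failure, and finally fall back
    to a direct user input when detection keeps failing."""
    direction = detect()
    for _ in range(max_rescans):
        if direction is not None:
            return direction
        show_rescan_ui()   # e.g., the guidance UI of fig. 13
        direction = detect()
    # e.g., the user-input UI of fig. 14
    return direction if direction is not None else prompt_user()
```

The same helper could be used for both the head direction (first ultrasound image) and the spine direction (second ultrasound image).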

As shown in an image 1401 of fig. 14, according to an embodiment, when the head of the fetus is not detected in the first ultrasound image obtained by scanning the body of the pregnant woman in the longitudinal direction, the ultrasound image display apparatus 400 may display a UI for receiving a user input specifying the direction of the head of the fetus. Guided by this UI, the user may specify the direction of the head of the fetus in the image 1401.

Further, as shown in the image 1403 of fig. 14, according to an embodiment, when the spine of the fetus is not detected in the second ultrasound image obtained by scanning the body of the pregnant woman in the lateral direction, the ultrasound image display apparatus 400 may display a UI for receiving a user input specifying the direction of the spine of the fetus. Guided by this UI, the user may specify the direction of the spine of the fetus in the image 1403.

As shown in the table of fig. 15, according to an embodiment, the ultrasound image display apparatus 400 may determine the orientation of the fetus within the body of the pregnant woman based on the directions of the head and the spine of the fetus.

The first column 1501 of the table of fig. 15 shows the fetus in a cephalic presentation, with the head of the fetus facing the cervix. The second column 1503 shows the fetus in a breech presentation, with the buttocks of the fetus facing the cervix.

The processor 410 may determine whether the fetus is in the cephalic presentation or the breech presentation based on whether the head of the fetus is oriented in the left or right direction in the first ultrasound image obtained by scanning the fetus along the longitudinal plane. On the assumption that the first ultrasound image was obtained via the probe 20 oriented to correspond to the marker displayed on the display 420, the processor 410 may determine the direction of the head of the fetus. The processor 410 may identify in advance which of the left and right sides of the first ultrasound image is closer to the cervix. The processor 410 may then determine which of the head and the buttocks is closer to the cervix based on the direction of the head of the fetus in the first ultrasound image and the position of the displayed marker (or the orientation of the probe 20).

The first row 1505 of the table of fig. 15 represents the fetus positioned with its spine facing the back of the pregnant woman. The second through fourth rows 1506, 1507, and 1508 represent the fetus with its axis rotated clockwise by 90, 180, and 270 degrees, respectively.

The processor 410 may determine the rotation angle of the axis of the fetus from the second ultrasound image obtained by scanning the fetus along the transverse plane. On the assumption that the second ultrasound image was obtained via the probe 20 positioned to correspond to the marker displayed on the display 420, the processor 410 may determine the direction of the spine of the fetus. The processor 410 may determine the rotation angle of the axis of the fetus based on the direction of the spine of the fetus.

The processor 410 may also determine the orientation of the fetus within the body of the pregnant woman based on the orientation of the fetus' head and spine.

Fig. 16 shows an example in which ultrasound images representing the left and right sides of a fetus are displayed, according to an embodiment of the present disclosure. The ultrasound images 1621 to 1624 may be obtained by scanning the body of the pregnant woman in the second direction (e.g., a lateral direction). As shown in fig. 16, the information about the left and right sides of the fetus, which is indicated on the ultrasound images 1621 to 1624 respectively showing transverse cross-sections of the fetus, may vary depending on whether the fetus is in the cephalic presentation (images 1611 and 1612) or the breech presentation (images 1613 and 1614). In each of the ultrasound images 1621 to 1624, "Lt" and "Rt" indicate the left and right sides of the fetus, respectively.
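The flipping of the "Lt"/"Rt" labels in fig. 16 can be sketched as a lookup keyed on the presentation. Which fetal side maps to which side of the image in the sketch below is an illustrative assumption only; in practice the mapping would also depend on the probe and marker orientation described above.

```python
def transverse_lr_labels(presentation):
    """Labels ('Lt'/'Rt') to overlay on the left and right sides of a
    transverse ultrasound image. The concrete mapping chosen here is an
    assumption for illustration, not taken from the source."""
    if presentation == "cephalic":
        return {"image_left": "Rt", "image_right": "Lt"}
    if presentation == "breech":
        return {"image_left": "Lt", "image_right": "Rt"}
    raise ValueError(f"unsupported presentation: {presentation}")
```

The point of the sketch is only that the same transverse cross-section receives opposite labels under cephalic and breech presentations, which is why the head direction must be resolved first.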

As described above, according to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may scan the fetus along the longitudinal or transverse plane of the body of the pregnant woman and detect a region corresponding to a particular structure (e.g., the head, the spine, etc.), thereby providing information about the orientation of the fetus determined based on the detected region. Accordingly, the user can easily check information about the left and right sides of the fetus and intuitively determine whether the arrangement of the fetal organs is normal or abnormal based on that information.

The image representing the orientation of the fetus displayed on the display 420 may include at least one of the following: an image in which at least one of the left and right sides of the fetus is indicated on the ultrasound image, an image showing the cross-section, left side, and right side of the fetus in the ultrasound image, and a simulated image showing the fetus in three dimensions based on the orientation of the fetus. Fig. 17A and 17B illustrate a screen 1700 and a screen 1750 displayed by an ultrasound image display apparatus, according to an embodiment of the present disclosure.

As seen on the screen 1700 of fig. 17A, according to an embodiment of the present disclosure, the ultrasound image display apparatus 400 may display an image 1720 and an image 1730 representing the orientation of the fetus, together with an ultrasound image 1710 obtained by scanning the body of the pregnant woman. The image 1720 shows the cross-section, left side, and right side of the fetus in the ultrasound image 1710. The image 1730 is a simulated image showing the fetus in three dimensions based on the orientation of the fetus.

As seen on the screen 1750 of fig. 17B, the ultrasound image display apparatus 400 may display an image 1740 obtained by superimposing the image 1720 representing the orientation of the fetus on the ultrasound image 1710 obtained by scanning the body of the pregnant woman.

According to an embodiment of the present disclosure, when the orientation of the fetus is represented using the simulated image, the processor 410 of the ultrasound image display apparatus 400 may control the display 420 to rotate the simulated image based on the user input.

As seen on the screen 1810 of fig. 18A, the body of the pregnant woman may typically be scanned in a longitudinal direction in order to scan the fetus along a longitudinal plane. The arrow 1811 may indicate a typical longitudinal scan, and the simulated image 1813 may represent the orientation of the fetus within the body of the pregnant woman. However, as seen on the screen 1820 of fig. 18A, when the fetus lies obliquely within the body of the pregnant woman, the fetus may be scanned along a longitudinal plane oblique to the body of the pregnant woman, in consideration of the direction in which the fetus lies. As indicated by the arrow 1821, the processor 410 may set the scan direction based on a user input. The processor 410 may control the display 420 such that the direction of the simulated image 1823 changes as the scan direction changes. It can be seen that the simulated image 1823 is rotated compared to the simulated image 1813.

Therefore, according to the embodiment of the present disclosure, the ultrasound image display apparatus 400 may provide an accurate simulation image by considering the orientation of the fetus within the body of the pregnant woman.

Further, according to an embodiment of the present disclosure, when the orientation of the fetus is represented using the simulated image, the processor 410 of the ultrasound image display apparatus 400 may adjust the ultrasound image being displayed based on a user input with respect to the simulated image. For example, the processor 410 may zoom in, zoom out, pan, and/or rotate the ultrasound image based on the user input.

As seen on the screen 1830 of fig. 18B, the ultrasound image display device 400 may rotate the ultrasound image 1831 based on user input that rotates the simulated image 1833.

According to an embodiment of the present disclosure, the user of the ultrasound image display apparatus 400 can easily identify the left and right sides of the fetus based on the image representing the orientation of the fetus, and can determine whether the arrangement of the organs of the fetus is the visceral normal position (situs solitus) or the visceral inverted position (situs inversus). The ultrasound image display apparatus 400 may display, based on a user input, whether the arrangement of the organs of the fetus is the visceral normal position or the visceral inverted position. For example, as shown in fig. 19A or 19B, when the organs of the fetus are in their normal positions, the ultrasound image display apparatus 400 may display the arrangement of the organs of the fetus as normal.

Alternatively, the ultrasound image display apparatus 400 may directly identify the positions of the organs of the fetus based on the information on the left and right sides of the fetus. The ultrasound image display apparatus 400 may detect a region corresponding to at least one of the organs of the fetus in the ultrasound image and determine, based on the information on the left and right sides of the fetus, whether that organ is in its normal position. The ultrasound image display apparatus 400 may determine whether the arrangement of the organs of the fetus is the visceral normal position or the visceral inverted position based on the positions of the organs of the fetus. For example, as shown in fig. 19A or 19B, when the organs of the fetus are in their normal positions, the ultrasound image display apparatus 400 may display the arrangement of the organs of the fetus as normal.

For example, the ultrasound image display apparatus 400 may determine that a fetal organ such as the heart being located on the left side of the chest indicates the visceral normal position, and the heart being located on the right side indicates the visceral inverted position. The ultrasound image display apparatus 400 may thus determine whether the arrangement of the fetal organs is the visceral normal position or the visceral inverted position by determining whether a fetal organ is located on the left or right side in the scanned cross-section. Further, the ultrasound image display apparatus 400 may display information on the determination result on the screen so that the user can easily recognize it.
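The left-or-right check described above can be sketched as a small decision routine. This is an illustrative sketch only, not the apparatus's actual implementation; the function name, the set of reference organs, and the string labels are assumptions.

```python
def classify_situs(organ: str, side: str) -> str:
    """Classify the visceral arrangement from the side on which an organ is found.

    `side` is 'left' or 'right' as resolved in the scanned cross-section
    using the fetus's known left/right orientation.
    """
    # Organs that normally lie on the left side of the chest/abdomen.
    normally_left = {"heart", "stomach"}
    if organ not in normally_left:
        raise ValueError(f"no reference side known for {organ!r}")
    if side == "left":
        return "visceral normal position"   # situs solitus
    return "visceral inverted position"     # situs inversus
```

In practice the `side` argument would come from resolving the detected organ's screen position against the left/right information determined for the fetus.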

Further, according to an embodiment of the present disclosure, when the orientation of the fetus is represented using the simulated image, the processor 410 of the ultrasound image display apparatus 400 may determine the posture of the fetus based on the legs of the fetus detected in the ultrasound image and generate the simulated image based on the posture and the orientation of the fetus. The ultrasound image used for detecting the legs of the fetus may be a first ultrasound image obtained by scanning the fetus along the longitudinal plane.

Fig. 20A, 20B, and 21 are diagrams for explaining a method of determining and displaying a fetal posture performed by an ultrasound image display apparatus according to an embodiment of the present disclosure.

As shown in fig. 20A, the fetus may assume various postures within the body of the pregnant woman. Image 2001 shows a frank breech, in which the fetus extends both legs straight up toward its head, and image 2002 shows an incomplete breech, in which one leg of the fetus extends below its buttocks. Image 2003 shows a complete breech, in which both legs of the fetus are bent at the knees.

According to an embodiment, the ultrasound image display apparatus 400 may also detect the legs or feet of the fetus. As shown in fig. 20B, according to an embodiment, the ultrasound image display apparatus 400 may determine the type of the posture of the fetus based on the position of the leg of the fetus and display the determined type of the posture on the screen.

Referring to fig. 20B, the processor 410 of the ultrasound image display apparatus 400 may determine the posture of the fetus as a frank breech based on the legs 2011 of the fetus detected in the ultrasound image 2010. Further, the processor 410 may determine the posture of the fetus as an incomplete breech based on the fetal foot 2021 detected in the ultrasound image 2020.

As shown in fig. 21, the ultrasound image display apparatus 400 may determine the posture of the fetus as a breech presentation based on the head 2101 and the legs 2103 of the fetus detected in the ultrasound image 2100, and the display 420 may display information 2105 about the determined posture of the fetus on the ultrasound image 2100.
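The posture classification of figs. 20A to 21 can be summarized as a mapping from detected findings to a breech type. A minimal sketch, assuming the leg and foot findings have already been extracted from the ultrasound image; the function and flag names are hypothetical.

```python
def classify_breech_posture(legs_extended_toward_head: bool,
                            foot_below_buttocks: bool,
                            knees_flexed: bool) -> str:
    """Map detected leg/foot findings to a breech-posture label."""
    if legs_extended_toward_head:
        return "frank breech"        # both legs straight up toward the head
    if foot_below_buttocks:
        return "incomplete breech"   # a foot presents below the buttocks
    if knees_flexed:
        return "complete breech"     # both legs bent at the knees
    return "undetermined"
```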

A method of displaying an image representing the orientation of the fetus and an ultrasound image, which is performed by the ultrasound image display apparatus 400, will be described in more detail below. Each operation of the methods of fig. 22, 23A, and 23B may be performed by at least one component described with reference to fig. 4A and 4B. Therefore, the description that has been provided above with respect to fig. 4A and 4B will not be repeated below.

Fig. 22 is a flowchart of a method of displaying an ultrasound image according to an embodiment.

According to an embodiment, the ultrasound image display apparatus 400 may determine the direction of the head of the fetus based on the first ultrasound image obtained by scanning the body of the pregnant woman in the first direction (S2210).

The ultrasound image display apparatus 400 may display a UI for guiding a user to scan the body of a pregnant woman in a first direction, obtain a first ultrasound image, and detect the head of a fetus in the first ultrasound image. For example, the first direction may be a longitudinal direction.

The ultrasound image display apparatus 400 may determine the presentation of the fetus based on the relationship between the position of the probe (or the position of the marker being displayed) and the direction of the head of the fetus. In detail, when at least a portion of the head of the fetus and at least a portion of the buttocks of the fetus are both identified in the first ultrasound image, the ultrasound image display apparatus 400 may determine the direction of the head of the fetus. Alternatively, the ultrasound image display apparatus 400 may determine the direction of the head of the fetus when at least a portion of the head of the fetus or at least a portion of the buttocks of the fetus is identified in the first ultrasound image. The ultrasound image display apparatus 400 may determine whether the fetal presentation is a cephalic (head-first) presentation or a breech presentation based on the direction of the head of the fetus.

When the head of the fetus is not detected in the first ultrasound image, the ultrasound image display apparatus 400 may display a UI for guiding the user to rescan the body of the pregnant woman. The ultrasound image display apparatus 400 may detect the head of the fetus in the first ultrasound image obtained by rescanning the body of the pregnant woman and determine the orientation of the head of the fetus based on the first ultrasound image.

Alternatively, when the head of the fetus is not detected in the first ultrasound image, the ultrasound image display apparatus 400 may determine the direction of the head of the fetus based on a user input.

When the head of the fetus is detected in the first ultrasound image, the ultrasound image display apparatus 400 may determine whether the head of the fetus in the first ultrasound image is located to the right or to the left of the torso. The ultrasound image display apparatus 400 may display the head of the fetus and the direction of the head of the fetus on the first ultrasound image. For example, when the head of the fetus detected in the ultrasound image is oriented in the right direction, the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a cephalic presentation. When the head of the fetus detected in the ultrasound image is oriented in the left direction, the ultrasound image display apparatus 400 may determine that the fetus is positioned in the body of the pregnant woman in a breech presentation.
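Under the screen convention of this example (head toward the right of the longitudinal image indicates a cephalic presentation, head toward the left a breech presentation), the presentation decision reduces to a lookup. A minimal sketch; the function name and string values are assumptions, and the left/right convention would depend on the probe and marker orientation in a real system.

```python
def presentation_from_head_direction(head_direction: str) -> str:
    """Infer the fetal presentation from the head's direction
    in the longitudinal (first-direction) scan."""
    if head_direction == "right":
        return "cephalic presentation"
    if head_direction == "left":
        return "breech presentation"
    raise ValueError(f"unexpected head direction: {head_direction!r}")
```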

According to the embodiment, the ultrasound image display apparatus 400 may determine the direction of the spine of the fetus based on a second ultrasound image obtained by scanning the body of the pregnant woman in a second direction (S2220).

The ultrasound image display apparatus 400 may display a UI for guiding a user to scan the body of the pregnant woman in a second direction, obtain a second ultrasound image, and detect the spine of the fetus in the second ultrasound image. For example, the second direction may be perpendicular to the first direction. The second direction may be a lateral direction.

When the spine of the fetus is detected in the second ultrasound image, the ultrasound image display apparatus 400 may determine the direction of the spine of the fetus.

When the spine of the fetus is not detected in the second ultrasound image, the ultrasound image display apparatus 400 may display a UI for guiding the user to rescan the pregnant woman's body. The ultrasound image display apparatus 400 may detect the spine of the fetus in a second ultrasound image obtained by rescanning the body of the pregnant woman and determine the direction of the spine of the fetus based on the second ultrasound image.

Alternatively, when the spine of the fetus is not detected in the second ultrasound image, the ultrasound image display apparatus 400 may determine the direction of the spine of the fetus based on the user input.

When the direction of the spine of the fetus is determined, the ultrasound image display apparatus 400 may detect at least one of the torso and the spine of the fetus in the second ultrasound image. The ultrasound image display apparatus 400 may calculate an angle by which the fetus is rotated within the body of the pregnant woman based on the detected position of the spine of the fetus. The ultrasound image display apparatus 400 may determine the angle of rotation of the axis of the fetus relative to a reference point in the second ultrasound image. The ultrasound image display apparatus 400 may also display the spine and the rotation angle of the fetus on the second ultrasound image.
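The rotation-angle computation can be sketched as follows, assuming the spine and the torso centre have already been located in the transverse image as pixel coordinates. The coordinate convention used here (spine directly below the torso centre taken as 0 degrees, clockwise positive) is an assumption for illustration, not the convention of the disclosure.

```python
import math

def fetal_rotation_angle(spine_xy, torso_center_xy):
    """Estimate the fetus's rotation (degrees) about its long axis from the
    spine's position relative to the torso centre in a transverse image.

    0 degrees means the spine points straight down on the screen;
    positive angles are clockwise.
    """
    dx = spine_xy[0] - torso_center_xy[0]
    dy = spine_xy[1] - torso_center_xy[1]
    # atan2 over (dx, dy) so that the direction (0, +1) maps to 0 degrees.
    return math.degrees(math.atan2(dx, dy))
```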

Although it is described in operations S2210 and S2220 that the head and the spine of the fetus are detected in the longitudinal plane image and the transverse plane image of the body of the pregnant woman, respectively, embodiments of the present disclosure are not limited to the example shown in fig. 22. The spine of the fetus may be detected earlier than the head of the fetus, and an ultrasound image obtained by scanning the body of the pregnant woman in the lateral direction may be acquired before an ultrasound image obtained by scanning the body of the pregnant woman in the longitudinal direction is acquired.

For example, when the fetus is positioned with shoulder presentation within the body of the pregnant woman as shown in image 303 of fig. 3A, the ultrasound image display device 400 may detect the spine of the fetus in the longitudinal plane image of the body of the pregnant woman and then detect the head of the fetus in the transverse plane image of the body of the pregnant woman.

According to an embodiment, the ultrasound image display apparatus 400 may determine the orientation of the fetus within the body of the pregnant woman based on the orientation of the head and spine of the fetus (S2230).
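Operation S2230 combines the two earlier determinations into one orientation result. A minimal sketch of that combination, assuming the head direction and spine angle have already been obtained in operations S2210 and S2220; the record fields are hypothetical, and the right-means-cephalic screen convention follows the example given for the first ultrasound image.

```python
from dataclasses import dataclass

@dataclass
class FetalOrientation:
    """Combined result of operations S2210-S2230 (field names are assumptions)."""
    presentation: str       # 'cephalic' or 'breech', from the head direction
    spine_angle_deg: float  # rotation about the fetus's long axis, from the spine

def determine_orientation(head_direction: str, spine_angle_deg: float) -> FetalOrientation:
    """Combine the head direction (longitudinal scan) with the spine angle
    (transverse scan) into one orientation record for the simulated image."""
    if head_direction not in ("left", "right"):
        raise ValueError(f"unexpected head direction: {head_direction!r}")
    presentation = "cephalic" if head_direction == "right" else "breech"
    return FetalOrientation(presentation, spine_angle_deg)
```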

According to an embodiment, the ultrasound image display apparatus 400 may display an image representing the orientation of the fetus and an ultrasound image obtained by scanning the body of the pregnant woman (S2240).

The ultrasound image display apparatus 400 may display an image representing the orientation of the fetus and a real-time ultrasound image obtained by scanning the body of the pregnant woman in the second direction. The ultrasound image display apparatus 400 may display information on the left and right positions of the fetus on the cross-section scanned in real time.

The image representing the orientation of the fetus may include at least one of the following: an image in which at least one of the left and right sides of the fetus is indicated on the ultrasound image, an image showing the scanned cross-section of the fetus together with its left and right sides, and a simulated image representing the fetus in three dimensions based on the orientation of the fetus.

For example, the ultrasound image display apparatus 400 may generate and provide a simulated image based on the rotation angle of the fetus with respect to the longitudinal axis of the pregnant woman's body and the rotation angle of the fetus with respect to the lateral axis of the pregnant woman's body, thereby accurately representing the orientation of the fetus within the pregnant woman's body.

In scanned cross-sections other than those corresponding to the first and second ultrasound images, the ultrasound image display apparatus 400 may determine whether the arrangement of the organs of the fetus is the visceral normal position or the visceral inverted position by using the information on the left and right sides of the fetus.

According to an embodiment, the ultrasound image display apparatus 400 may display a marker indicating the position of the probe 20 on the screen when the object is scanned via the probe 20. The ultrasound image display apparatus 400 may analyze an ultrasound image on the assumption that the ultrasound image is obtained via the probe 20 placed in a direction corresponding to the marker being displayed. The ultrasound image display apparatus 400 may determine the orientation of the fetus based on the directions of the head and spine of the fetus and the direction of the probe 20.

According to an embodiment, the ultrasound image display apparatus 400 may rotate the simulated image based on a user input. The ultrasound image display apparatus 400 may display the ultrasound image rotated as the simulated image is rotated.

According to an embodiment, the ultrasound image display apparatus 400 may also detect the legs and feet of the fetus in the ultrasound image and determine the type of the posture of the fetus based on the detected legs and feet. The ultrasound image display apparatus 400 may generate a simulated image based on the orientation and posture of the fetus and display the generated simulated image. The ultrasound image used as a reference in determining the posture of the fetus may be obtained by scanning the fetus in the longitudinal direction. The ultrasound image display apparatus 400 may display information on the posture of the fetus on the screen.

Fig. 23A and 23B are flowcharts of a method of displaying an ultrasound image according to an embodiment of the present disclosure.

The ultrasound image display apparatus 400 may display a UI for guiding scanning of the body of the pregnant woman in a first direction (S2311). The user may place the probe 20 on the body of the pregnant woman based on the UI.

The ultrasound image display apparatus 400 may obtain an ultrasound image (S2312). An ultrasound image may be obtained by scanning the body of a pregnant woman in a first direction.

The ultrasound image display apparatus 400 may analyze the obtained ultrasound image (S2313). The ultrasound image display apparatus 400 may detect a particular structure (e.g., head, spine, leg, etc.) of the fetus in the obtained ultrasound image.

The ultrasound image display apparatus 400 may determine whether the head of the fetus is detected in the ultrasound image (S2314).

When the head of the fetus is detected in the ultrasound image, the ultrasound image display apparatus 400 may determine the orientation of the head of the fetus (S2315). Otherwise, when the head of the fetus is not detected in the ultrasound image, the ultrasound image display apparatus 400 may display a UI for guiding rescanning of the body of the pregnant woman (S2316). The ultrasound image display apparatus 400 may obtain an ultrasound image by rescanning the body of the pregnant woman (S2317) and return to operation S2313 to analyze the ultrasound image.

The ultrasound image display apparatus 400 may display a UI for guiding the scanning of the body of the pregnant woman in the second direction (S2321). The user may place the probe 20 on the body of the pregnant woman based on the UI.

The ultrasound image display apparatus 400 may obtain an ultrasound image (S2322). The ultrasound image display apparatus 400 may analyze the obtained ultrasound image (S2323). The ultrasound image display apparatus 400 may detect a particular structure (e.g., head, spine, leg, etc.) of the fetus in the obtained ultrasound image.

The ultrasound image display apparatus 400 may determine whether a spinal column of the fetus is detected in the ultrasound image (S2324).

When the spine of the fetus is detected in the ultrasound image, the ultrasound image display apparatus 400 may determine the direction of the spine of the fetus (S2325). Otherwise, when the spinal column of the fetus is not detected in the ultrasound image, the ultrasound image display apparatus 400 may display a UI for guiding rescanning of the body of the pregnant woman (S2326). The ultrasound image display apparatus 400 may obtain an ultrasound image by rescanning the body of the pregnant woman (S2327) and return to operation S2323 to analyze the ultrasound image.
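Both branches of figs. 23A and 23B follow the same detect-or-rescan pattern: analyze the image, and if the target structure is missing, guide the user to rescan and try again. A minimal sketch of that loop; all callables and the attempt limit are stand-ins for illustration, not part of the disclosure.

```python
def detect_with_rescan(acquire_image, detect_structure, guide_rescan, max_attempts=3):
    """Generic detect-or-rescan loop used for both the head (S2313-S2317)
    and the spine (S2323-S2327).

    `acquire_image` returns a scan, `detect_structure` returns the detected
    structure or None, and `guide_rescan` shows the rescan-guidance UI.
    """
    image = acquire_image()
    for _ in range(max_attempts):
        found = detect_structure(image)
        if found is not None:
            return image, found
        guide_rescan()            # prompt the user to rescan (S2316/S2326)
        image = acquire_image()   # rescan the pregnant woman's body (S2317/S2327)
    return image, None            # fall back, e.g. to manual user input
```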

The ultrasound image display apparatus 400 may determine the orientation of the fetus based on the orientation of the head and spine of the fetus (S2330). The ultrasound image display apparatus 400 may display an image representing the orientation of the fetus together with the ultrasound image (S2340).

According to the embodiments of the present disclosure, by displaying an image representing the orientation of a fetus within the body of a pregnant woman and an ultrasound image, the position of an organ inside the fetus can be easily and accurately determined.

Embodiments of the present disclosure may be implemented as a software program including instructions stored in a computer-readable recording medium.

The computer may refer to an apparatus that can retrieve instructions stored in a computer-readable storage medium and perform operations according to embodiments of the present disclosure in response to the retrieved instructions, and may include an ultrasonic diagnostic device according to embodiments of the present disclosure.

The computer-readable recording medium may be provided in the form of a non-transitory storage medium. In this case, the term "non-transitory" merely means that the storage medium does not include a signal and is tangible, and the term does not distinguish semi-permanently stored data from temporarily stored data in the storage medium.

Furthermore, an ultrasound image display method according to an embodiment of the present disclosure may be included in a computer program product when provided. The computer program product may be used to conduct a transaction between a seller and a buyer as an article of commerce.

The computer program product may include a software program and a computer-readable storage medium storing the software program. For example, the computer program product may be a product in the form of a software program (e.g., a downloadable application) distributed electronically by the manufacturer of the ultrasound diagnostic apparatus or through an electronic marketplace (e.g., Google Play Store™ or App Store™). For such electronic distribution, at least a part of the software program may be stored on a storage medium or may be temporarily generated. In this case, the storage medium may be a storage medium of a server of the manufacturer, a server of the electronic market, or a relay server that temporarily stores the software program.

In a system including a server and a terminal (e.g., an ultrasound diagnostic apparatus), the computer program product may include a storage medium of the server or a storage medium of the terminal. Alternatively, in the case where a third device (e.g., a smartphone) is connected to a server or a terminal through a communication network, the computer program product may include a storage medium of the third device. Alternatively, the computer program product may comprise a software program transmitted from the server to the terminal or the third device, or a software program transmitted from the third device to the terminal.

In this case, one of the server, the terminal, and the third device may run the computer program product to perform the method according to the embodiment of the present disclosure. Optionally, two or more of the server, the terminal and the third device may run the computer program product to perform the method according to embodiments of the present disclosure in a distributed manner.

For example, a server (e.g., a cloud server, an Artificial Intelligence (AI) server, etc.) may run a computer program product stored in the server to control a terminal in communication with the server to perform a method according to embodiments of the present disclosure.

As another example, a third apparatus may execute a computer program product to control a terminal in communication with the third apparatus to perform a method according to an embodiment of the disclosure. In detail, the third means may remotely control the ultrasonic diagnostic apparatus to transmit an ultrasonic signal to the object and generate an ultrasonic image of an internal part of the object based on information on a signal reflected from the object.

As another example, the third device may run the computer program product to directly perform the method according to embodiments of the present disclosure based on a value input from an auxiliary device (e.g., a probe for medical equipment). In detail, the auxiliary device may transmit an ultrasonic signal to the object and acquire an ultrasonic signal reflected from the object. The third device may receive information about the reflected ultrasound signal from the auxiliary device and generate an image of an internal part of the object based on the received information.

In a case where the third apparatus runs the computer program product, the third apparatus may download the computer program product from the server and run the downloaded computer program product. Optionally, a third apparatus may run the preloaded computer program product to perform a method according to an embodiment of the present disclosure.
