Ultrasound system and method for shear wave elastography of anisotropic tissue

Document No.: 602324    Publication date: 2021-05-04

Note: This technology, "Ultrasound system and method for shear wave elastography of anisotropic tissue", was devised by Hua Xie, G. A. Toporek, G. C-H. Wu, and V. T. Shamdasani on 2019-07-22. Abstract: The present disclosure includes ultrasound systems and methods for imaging anisotropic tissue at various angles relative to the tissue using shear wave elastography. An exemplary ultrasound imaging system includes: a probe coupled to a position tracking system for tracking a position of the probe relative to the subject; and a processor in communication with the probe. The processor may receive position tracking data from the position tracking system. The processor may: define at least one target plane in the anisotropic tissue, determine a difference between a current position of the probe and a position of the target plane, and provide a visual indicator of that difference, wherein the processor dynamically updates the visual indicator in response to changes in the position of the imaging plane relative to the target plane.

1. An ultrasound imaging system for shear wave imaging, comprising:

a probe configured to acquire ultrasound echo signals to produce a shear wave image of anisotropic tissue of a subject, wherein the probe is configured to be coupled to a position tracking system to track a position of the probe relative to the position tracking system;

a processor in communication with the probe and configured to receive position tracking data from the position tracking system, wherein the processor is further configured to:

define at least one target plane in the anisotropic tissue at an angle relative to a reference plane of the anisotropic tissue;

determine a difference between a first position of an imaging plane of the probe, at a location indicated by the position tracking data, and a second position of the at least one target plane; and

provide a visual indicator of the difference on a display of the ultrasound system, wherein the processor is configured to dynamically update the visual indicator in response to a change in the position of the imaging plane relative to the target plane.

2. The ultrasound imaging system of claim 1, wherein the probe includes at least one sensor configured to receive information from the position tracking system to dynamically determine the position of the probe.

3. The ultrasound imaging system of claim 2, wherein the position tracking system includes an electromagnetic field generator and the at least one sensor includes at least one electromagnetic sensor attached to the probe.

4. The ultrasound imaging system of claim 3, wherein the at least one electromagnetic sensor is embedded in the probe.

5. The ultrasound imaging system of claim 1, wherein the processor is further configured to: causing at least one shear wave image to be automatically generated when the difference between the first and second positions is below a threshold.

6. The ultrasound imaging system of claim 1, wherein the processor is configured to receive an indication of the reference plane in response to a user input.

7. The ultrasound imaging system of claim 6, further comprising a reference plane selector, and wherein the processor is further configured to set the reference plane to a current imaging plane of the probe in response to activation of the reference plane selector.

8. The ultrasound imaging system of claim 1, wherein the processor is configured to determine the reference plane based on a direction of fibers of the anisotropic tissue.

9. The ultrasound imaging system of claim 8, wherein the processor is configured to set the reference plane such that it is aligned with the direction of the fibers of the anisotropic tissue.

10. The ultrasound imaging system of claim 8, wherein the anisotropic tissue is selected from musculoskeletal tissue, myocardial tissue, vascular wall tissue, and thyroid tissue, and wherein the processor is configured to estimate the direction of fibers of the musculoskeletal tissue, myocardial tissue, vascular wall tissue, or thyroid tissue.

11. The ultrasound imaging system of claim 10, wherein the processor is further configured to estimate the direction of the fiber based at least in part on a 3D image of the anisotropic tissue.

12. The ultrasound imaging system of claim 1, wherein the processor is further configured to define the at least one target plane at a predetermined angle relative to the reference plane.

13. The ultrasound imaging system of claim 12, wherein the processor is configured to define a plurality of target planes at predetermined angular intervals relative to the reference plane.

14. The ultrasound imaging system of claim 1, wherein the processor is further configured to generate instructions for adjusting the position of the probe, wherein the instructions are configured to reduce a difference between the first position and the second position.

15. The ultrasound imaging system of claim 14, wherein the processor is further configured to control an actuator to automatically adjust the position of the probe based on the instructions.

16. The ultrasound imaging system of claim 14, wherein the visual indicator comprises a current plane indicator and a target plane indicator, wherein the current plane indicator is dynamically adjusted by the processor in response to changes in the position of the probe relative to the target plane.

17. The ultrasound imaging system of claim 16, wherein the current plane indicator is a visual representation of a current imaging plane of the probe.

18. The ultrasound imaging system of claim 16, wherein the target plane indicator is a visual representation of an imaging plane corresponding to the target plane.

19. The ultrasound imaging system of claim 13, wherein the visual indicator comprises a digital scale having a dynamic component configured to move along the scale in response to changes in the determined difference.

20. The ultrasound imaging system of claim 1, wherein the position tracking system is configured to determine a spatial position of the probe and a rotation of the probe relative to the reference plane.

21. A method of shear wave imaging, the method comprising:

determining a position of a probe relative to an imaging plane of anisotropic tissue based on position tracking data generated based at least in part on sensor data received from a position sensor coupled to the probe;

defining at least one target plane;

determining a difference between the position of the imaging plane and the position of the at least one target plane;

providing a visual indicator of the determined difference on a display and dynamically updating the visual indicator in response to a change in the position of the imaging plane; and

generating at least one shear wave image of the target plane using the probe.

22. The method of claim 21, further comprising generating instructions for adjusting a position of the imaging plane, wherein the instructions are configured to reduce the determined difference.

23. The method of claim 22, further comprising adjusting a position of the imaging plane based on the generated instructions.

24. The method of claim 23, wherein said adjusting the position of the imaging plane comprises: adjusting a position of the probe, adjusting at least one angular orientation of the probe, or a combination thereof.

25. The method of claim 22, further comprising sending the generated instructions to an actuator configured to control a position of the imaging plane.

26. The method of claim 21, further comprising defining a transformation between an orientation of the anisotropic tissue and a position of the imaging plane.

27. The method of claim 26, further comprising adjusting a position of the imaging plane to align with an orientation of the anisotropic tissue.

28. The method of claim 21, further comprising defining a reference plane in the anisotropic tissue.

29. The method of claim 28, wherein said defining at least one target plane further comprises defining an angle between said at least one target plane and said reference plane.

30. The method of claim 21, further comprising displaying the determined location of the imaging plane on at least one shear wave image generated at the determined location.

Technical Field

The present disclosure relates to ultrasound systems and methods for imaging anisotropic tissue at various angles relative to the tissue using shear wave elastography.

Background

Ultrasound imaging systems, such as cart-based ultrasound imaging systems, typically include a user interface that operates in conjunction with a probe and a display to acquire and display images from an object, such as a patient. Ultrasound imaging systems may use shear wave elastography to determine mechanical properties of tissue. Shear wave elastography may be used for screening and diagnostic purposes, for example to identify regions of abnormal stiffness in tissue, which may indicate the presence of a tumor, for example.

Different types of tissue have different properties. Certain types of tissue (e.g., liver tissue) may be generally isotropic, while certain other types of tissue (e.g., musculoskeletal, vascular, and myocardial tissue) may be anisotropic, where a property of the tissue (e.g., stiffness) may vary based on the direction in which the property is measured. In order to characterize anisotropic tissue, measurements using the probe may need to be taken in more than one orientation relative to the tissue. However, it may be difficult for an operator to accurately control or record the orientation of the probe. Examples described herein may provide solutions to one or more challenges in the field of anisotropic tissue imaging.

Disclosure of Invention

The present disclosure relates to ultrasound systems and methods for imaging anisotropic tissue at various angles relative to the tissue using shear wave elastography. In at least one aspect, the present disclosure is directed to an ultrasound imaging system for shear wave imaging. An ultrasound imaging system may include a probe and a processor. The probe can transmit and receive ultrasound echo signals to produce a shear wave image of anisotropic tissue of the subject. The probe may be coupled to a position tracking system for tracking the position of the probe relative to the subject. A processor may be in communication with the probe and configured to receive position tracking data from the position tracking system. The processor may define at least one target plane in the anisotropic tissue at an angle relative to a reference plane of the anisotropic tissue. The processor may determine a difference between a first position of an imaging plane of the probe, at a location indicated by the position tracking data, and a second position of the at least one target plane. The processor may also provide a visual indicator of the difference on a display of the ultrasound system and dynamically update the visual indicator in response to a change in the position of the imaging plane relative to the target plane.

In at least one aspect, the present disclosure relates to a method of shear wave imaging. The method may include determining a position of the probe relative to an imaging plane of the anisotropic tissue based on position tracking data generated based at least in part on sensor data received from a position sensor coupled to the probe. The method may include defining at least one target plane and determining a difference between the position of the imaging plane and the position of the at least one target plane. The method may further include providing a visual indicator of the determined difference on the display and dynamically updating the visual indicator in response to a change in the position of the imaging plane. The method may include generating at least one shear wave image of the target plane using the probe.

Aspects of the disclosure, such as particular elements of the user interface described herein and/or functions performed by a processor of an ultrasound system, may be embodied in a computer-readable medium comprising processor-executable instructions. For example, processor-executable instructions for providing one or more graphical user interfaces or elements thereof may be incorporated into a software package, e.g., for running on an analysis workstation. Aspects of the present disclosure may facilitate offline image analysis as described further below, but it should be understood that the principles described herein may be equally applied to online image analysis (e.g., analysis performed during or shortly after image acquisition). According to one embodiment, a non-transitory computer-readable medium comprising processor-executable instructions for displaying an ultrasound image may include instructions for: displaying a first image frame from a first plurality of stored image files, wherein the image files include first position information corresponding to a probe position during acquisition of the first image frame; receiving a request for an orthogonal view via a user interface; comparing the first location information to location information for a second plurality of stored image files to identify one or more of the second plurality of image frames associated with location information that is closest to the first location information; and displaying a representation of each of the one or more image frames as candidate orthogonal views.
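The candidate orthogonal-view lookup described above amounts to a nearest-neighbor search over the position metadata stored with each image file. A minimal sketch in Python is shown below; the function and variable names are illustrative assumptions, not identifiers from the disclosure:

```python
import numpy as np

def candidate_orthogonal_views(first_pos, stored_positions, k=3):
    """Rank a second series of stored image frames by how close their
    recorded probe positions are to the position recorded with the first
    frame, and return the indices of the k closest frames as candidate
    orthogonal views."""
    first_pos = np.asarray(first_pos, dtype=float)
    stored_positions = np.asarray(stored_positions, dtype=float)
    # Euclidean distance between the first frame's recorded position
    # and each stored position in the second series
    dists = np.linalg.norm(stored_positions - first_pos, axis=1)
    return np.argsort(dists)[:k].tolist()
```

In practice the comparison could also weight orientation (not just spatial position), but the ranking-by-proximity structure would be the same.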

Drawings

Fig. 1 is a block diagram of a system for guided shear wave elastography of anisotropic tissue constructed in accordance with some examples of the present disclosure.

Fig. 2 is a block diagram of an ultrasound system according to some examples of the present disclosure.

Fig. 3A, 3B, 3C, 3D, 3E, and 3F are illustrations of probe placement relative to anisotropic tissue, according to some examples of the present disclosure.

Fig. 4A and 4B are illustrations of coordinate systems associated with components of an example system (e.g., a probe and/or an electromagnetic field generator), according to some examples of the present disclosure.

Fig. 5A, 5B, and 5C are graphical user interface elements providing visual indicators according to some examples of the present disclosure.

Fig. 6A and 6B are example shear wave elastography images of muscle tissue including a location indicator, according to some examples of the present disclosure.

Fig. 7A and 7B are block diagrams of coordinate systems associated with a probe and anisotropic tissue, according to some examples of the present disclosure.

Fig. 8 is a flow chart depicting a method of guided shear wave elastography of anisotropic tissue, in accordance with some examples of the present disclosure.

Detailed Description

The following description of specific embodiments is merely exemplary in nature and is in no way intended to limit the invention or its application or uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system. Furthermore, for the sake of clarity, detailed descriptions of certain features will not be discussed so as not to obscure the description of the present system, as will be apparent to those skilled in the art. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present system is defined only by the claims.

The present technology is also described below with reference to block diagrams and/or flowchart illustrations of methods, apparatus (systems) and/or computer program products according to embodiments. It will be understood that blocks of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer-executable instructions. These computer-executable instructions may be provided to a processor, controller, or control unit of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.

Certain difficulties in reliably acquiring shear wave elastography images of anisotropic tissue at a given angle relative to the tissue may be addressed by examples herein, for example, by providing feedback to a user (e.g., a sonographer) regarding the location and/or proximity of an imaging plane of a probe to a desired or target imaging plane location, and in some cases, by providing instructions to adjust the position of the probe, and thereby the position of the corresponding imaging plane, in order to position the probe to acquire a target view of the anisotropic tissue. Additionally or alternatively, examples herein may enable data to be collected at one or more target planes, increase the reliability of probe placement during such data collection, and/or allow accurate recording of probe placement during measurements.

Various types of tissue may have anisotropic properties. For example, skeletal muscle exhibits anisotropic mechanical properties, i.e., tissue stiffness varies with load direction. In order to fully characterize the stiffness of a muscle, multiple stiffness measurement points at different angles between the transducer imaging plane and the muscle fiber orientation are typically required (e.g., in-plane fibers and cross-plane fibers as two extreme cases). Recent studies have shown that by taking multidirectional stiffness measurements over the entire imaging angle range, more diagnostic potential can be provided. However, it can be challenging for the sonographer to accurately steer the probe and determine the angle between the imaging plane and the muscle fiber direction. Ultrasound systems and methods for shear wave elastography according to the present disclosure may facilitate control of the shear wave imaging plane relative to fiber orientation, resulting in a more accurate assessment of muscle stiffness as a function of direction of loading, which may enable more complete, accurate, standardized quantitative stiffness measurements of organs and tissues exhibiting anisotropic behavior. Although certain examples are described herein with respect to musculoskeletal tissue, it should be understood that the principles of the present disclosure may be equally applied to any type of anisotropic tissue (e.g., vascular wall tissue, cardiac muscle, etc.).

Devices, systems, and methods according to embodiments of the present disclosure are directed to guiding the position (e.g., spatial location and angle) of the imaging plane of a shear wave elastography system relative to tissue. The imaging system may include a probe that transmits and receives ultrasound pulses to form an image of tissue in an imaging plane. A position measurement system (e.g., a position tracking system) coupled to the probe can determine the position of the imaging plane in real time relative to a position measurement system coordinate system (world coordinate system). Feedback regarding the position of the current imaging plane relative to the world coordinate system may be provided to a user of the system (e.g., via a graphical display) to guide the placement of the imaging plane and capture data at the determined location. The system may guide the user to position and reposition the imaging plane at various locations around the tissue (e.g., by repositioning or rotating the probe). By recording images of B-mode, shear wave elastography, and/or other modalities at multiple locations, and recording the capture location of each image, the system can construct a more complete picture of anisotropic tissue properties.
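For instance, the set of target planes that the system guides the user through might be generated at fixed angular offsets from a fiber-aligned reference plane, spanning the two extreme cases noted earlier (in-plane fibers and cross-plane fibers). The sketch below illustrates this; the 15-degree step and the 0-90 degree range are assumed values for illustration, not parameters from the disclosure:

```python
import numpy as np

def target_plane_angles(start_deg=0.0, stop_deg=90.0, step_deg=15.0):
    """Angles between each target plane and the fiber-aligned reference
    plane, from in-plane fibers (0 degrees) to cross-plane fibers
    (90 degrees), at predetermined angular intervals."""
    # The half-step padding makes the stop angle inclusive despite
    # floating-point rounding in arange.
    return np.arange(start_deg, stop_deg + 0.5 * step_deg, step_deg).tolist()
```

Each angle in the returned sweep would define one target plane at which the system prompts the user to acquire a shear wave image.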

Fig. 1 depicts an operating environment 100 associated with an ultrasound imaging system 112, in accordance with some embodiments of the present disclosure. The components of operating environment 100 may be used, at least in part, to implement embodiments of the present disclosure. For example, fig. 1 shows an ultrasound imaging system 112, which may include or be operatively coupled to a probe 106, an ultrasound base 114, and a processing unit 118 (which may be located within the base in some embodiments). A position tracking system 116 may include or be operatively associated with a position field generator 117 and a sensor 108 connected to the probe 106, and a user 102 may ultrasonically image an object 104 with the probe 106. The components shown in fig. 1 are merely illustrative, and other variations, including removal of components, combination of components, rearrangement of components, and replacement of components, are contemplated.

In the operating environment 100 of fig. 1, the user 102 may utilize the probe 106, which may be part of and/or communicatively connected to the ultrasound imaging system 112, to ultrasonically inspect anisotropic tissue 105 of the subject 104, and more particularly, to obtain ultrasonic shear wave elastography images of the anisotropic tissue 105 of the subject 104. The probe 106 may be arranged relative to the anisotropic tissue 105 such that it scans a measurement plane 110 comprising a cross section of the anisotropic tissue 105. The probe 106 may include a sensor 108, and the sensor 108 may be coupled to the position tracking system 116 to determine a position (e.g., spatial position and/or orientation) of the probe 106, and thereby the position of the measurement plane 110. In one example, the sensor 108 may interact with an electromagnetic field generated by the field generator 117 of the position tracking system 116 to determine the position of the sensor 108. In other examples, the position tracking system 116 may involve optical tracking (e.g., an optical sensor records the position of an optical marker on the probe), mechanical tracking (e.g., connecting the probe to a robotic arm or a rigid stereotactic frame), and/or ultrasound image-based tracking. Other position measurement or tracking systems may be used in other embodiments. The probe 106 may be coupled to an image data acquisition component, which may be located in the ultrasound base 114, to acquire image data in the measurement plane 110.

The ultrasound system 112 includes an ultrasound base 114 coupled to the probe 106 and a processing unit 118. The processing unit 118 may receive data regarding the position of the probe 106 from the position tracking system 116 and/or the sensor 108, and may receive measurement data from the ultrasound base 114. In some embodiments, the processing unit 118 may be a computer coupled to the ultrasound base 114. The ultrasound system 112 may include a display 120, a processor 126, a controller 128, and a memory 130 including instructions 132, which instructions 132 may include one or more steps 133a-133e. The processor 126 and/or the controller 128 may operate based on the instructions 132 to analyze information and/or control the ultrasound system 112. The display 120 may display at least one image 124 and/or one or more visual indicators 122 based on measurement data from the ultrasound imaging system 112 and/or position data from the sensor 108 and/or the position tracking system 116.

The probe 106 may be used to image a region of interest in biological tissue, such as a region including the anisotropic tissue 105. As described, anisotropic tissue is tissue whose properties, e.g., stiffness, may differ depending on the orientation in which the property is measured. In some examples, the anisotropic tissue 105 being imaged may be musculoskeletal tissue, cardiac muscle tissue, vascular wall tissue, thyroid tissue, or any combination thereof. Of course, the examples herein may be applicable to other types of tissue than the specific examples described herein. One or more properties of the anisotropic tissue 105, such as stiffness, may be measured and may be used to assess and/or diagnose conditions such as injury, muscular dystrophy, and/or myositis.

The probe 106 may be used to acquire shear wave elastography images of the anisotropic tissue 105. To this end, the probe 106 may include a transducer operable to emit "push pulses" toward the anisotropic tissue 105, generating shear waves that then propagate through the anisotropic tissue 105. Alternatively, shear waves in tissue may be generated without acoustic radiation forces, but via mechanical forces applied externally to the tissue, for example by a mechanical vibrator configured to compress the tissue. The probe 106 may be further operable to emit a tracking pulse which may be used to measure the velocity of the shear wave as it propagates. The measured velocity of the shear wave may be analyzed (e.g., by the processor 126) to determine the stiffness of the anisotropic tissue 105. The shear wave elastography data may be used to generate a shear wave elastography image.
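The velocity-to-stiffness conversion commonly used in shear wave elastography, and consistent with the analysis attributed to the processor 126 here, is the shear modulus relation μ = ρc², with Young's modulus E ≈ 3μ under the usual incompressibility assumption. Note that this isotropic relation holds only per measurement direction in anisotropic tissue, which is precisely why angle-resolved measurements are of interest. A minimal sketch (the density default of 1000 kg/m³ is a standard soft-tissue assumption, not a value from the disclosure):

```python
def shear_modulus_kpa(c_shear_m_s, density_kg_m3=1000.0):
    """Shear modulus mu = rho * c^2, returned in kPa, from the measured
    shear wave propagation speed in m/s."""
    return density_kg_m3 * c_shear_m_s ** 2 / 1000.0

def youngs_modulus_kpa(c_shear_m_s, density_kg_m3=1000.0):
    """E ~= 3 * mu for nearly incompressible, locally isotropic tissue."""
    return 3.0 * shear_modulus_kpa(c_shear_m_s, density_kg_m3)
```

For example, a measured shear wave speed of 2 m/s corresponds to a shear modulus of 4 kPa and an estimated Young's modulus of 12 kPa.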

The transducer of the probe 106 may be a linear (or one-dimensional) transducer operable to receive data, typically from a single imaging plane 110. In some embodiments, the linear probe may be phased or steerable in the azimuthal direction to increase the field of view. Nonetheless, the linear transducer may need to be physically repositioned (e.g., tilted or rocked heel-to-toe relative to the subject, or rotated to a different orientation relative to the subject) in order to acquire image data in imaging planes at different elevations through the biological tissue. When imaging a subject using a system according to examples herein, the system may receive position data specifying the position of the probe 106 (including the spatial (3D) position of the probe and the orientation of the probe) and thereby indicating an imaging plane relative to a reference frame (e.g., a coordinate system of the position tracking system). As the examination progresses, the ultrasound system 112 may direct the user 102 to change the position of the probe 106, and thus the orientation of the imaging plane 110, for example, to acquire images at different orientations relative to the tissue 105.

The probe 106 may be coupled to the position tracking system 116 to determine the position of the probe 106 relative to the position tracking system 116. The probe 106 may include at least one sensor 108 that may receive information from the position tracking system 116. The at least one sensor 108 may be attached to the probe 106 or may be integrated into the probe 106. The position tracking system may have a known spatial relationship with the object 104. In some examples, the position tracking system may be registered to the object 104, for example, by spatially registering the probe 106 to the object at the beginning of the ultrasound examination. As shown, the position tracking system 116 may include a position field generator 117 that generates a field that interacts with the at least one sensor 108. In one embodiment, the at least one sensor 108 may measure a property of the field. In another embodiment, the at least one sensor may produce a disturbance in the field that may be measured by the position field generator 117.

Measurements from the at least one sensor 108 and/or the location field generator 117 may be used to determine the location of each of the at least one sensor 108 relative to the location field generator 117. These measurements, in turn, may be used to determine the position of the probe 106 based on a known relationship between the position of each of the at least one sensor 108 and the probe 106. The measurements may be sent to the ultrasound system 112 to determine the position of the probe 106, or the calculations may be made by a processor located in the position tracking system 116 or the probe 106. Although fig. 1 depicts the position tracking system 116 as being coupled to the ultrasound system 112, in some embodiments, the position tracking system 116 is not directly connected to the ultrasound system 112 and the sensors 108 of the probe 106 provide position measurements to the ultrasound system 112. The sensors 108 may be directly coupled to the ultrasound system 112 or may be indirectly coupled via the probe 106 and/or the position tracking system 116.
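The "known relationship" between the sensor and the probe is typically a fixed calibration transform that is composed with the tracked sensor pose. A sketch using 4x4 homogeneous transforms follows; the frame names and function names are illustrative, not from the disclosure:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def probe_pose_in_tracker(T_tracker_sensor, T_sensor_probe):
    """The tracker reports the sensor pose in the field-generator
    (tracker) frame; composing it with the fixed sensor-to-probe
    calibration yields the probe pose, and hence the imaging-plane
    pose, in the tracker frame."""
    return T_tracker_sensor @ T_sensor_probe
```

With multiple sensors attached to the probe, each sensor would have its own calibration transform, and the resulting pose estimates could be averaged or cross-checked.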

As described in more detail herein, the position tracking system 116 may have a known relationship to the position of the object 104 and/or the anisotropic tissue 105, allowing the relationship between the position of the probe 106 and the position of the anisotropic tissue 105 to be determined. In one example, the location field generator 117 may be located below the object 104, for example embedded in a mattress supporting the object 104. In other examples, if it is assumed that tissue 105 is not moving relative to position measurement system 116, it may not be necessary to determine the relationship between probe 106 and anisotropic tissue 105.

The position tracking system 116 may have a coverage area surrounding the object 104 and/or the anisotropic tissue 105. The coverage area may define an area in which the position of the probe 106 may be determined. The size and shape of the coverage area may be determined by the properties of the location field generator 117. The coverage area may be defined by moving the probe 106 and marking different positions. As an example, the ultrasound system 112 may prompt the user 102 (e.g., via the display 120) to move the probe 106 to one or more set positions relative to the subject 104. These locations may be the edges of the expected coverage area. The ultrasound system 112 may record the size of the desired coverage area and may then provide feedback (e.g., via tones and/or display 120) to indicate that the probe 106 is at or near the edge of the coverage area.

In some embodiments, the location tracking system 116 may be an electromagnetic tracking system. The location field generator 117 may be an electromagnetic field generator that generates an electromagnetic field with respect to the object 104. The sensor 108 may be an electromagnetic sensor that detects a property (e.g., amplitude) of an electromagnetic field at the location of the sensor 108. The electromagnetic field generator may be a table-top electromagnetic generator and may be positioned on or near a structure (e.g., gurney, imaging table, bed) supporting the subject 104. Other forms of position tracking system 116 and sensors 108 may be used in other embodiments.

In the example of fig. 1, the ultrasound system 112 is communicatively coupled to the probe 106 and the position tracking system 116. The ultrasound system 112 may receive data from the probe 106 and the position tracking system 116 and/or may control them to alter the operation of these components. The ultrasound system 112 may be provided as a mobile unit, such as on a cart, or as a portable handheld unit, such as a tablet. The probe 106 and/or the position tracking system may be coupled to the ultrasound system 112 by a wired and/or wireless (e.g., Wi-Fi, Bluetooth) connection. The ultrasound system 112 may have an ultrasound base 114 coupled to the probe 106. The ultrasound base 114 may control the shear wave elastography pulses transmitted by the transducer and receive data from the probe 106. Although fig. 1 depicts the ultrasound base 114 and the processing unit 118 as separate components of the ultrasound system 112, it should be understood that the ultrasound base 114 may share components with the processing unit 118, such as the processor 126, the controller 128, and the memory 130. In some embodiments, the ultrasound base 114 and the processing unit 118 may be integrated into a single unit.

The ultrasound system 112 includes a processing unit 118, the processing unit 118 being coupled (via the ultrasound base 114) to the probe 106 and the position tracking system 116. The processing unit 118 may receive shear wave elastography data from the probe 106 (and/or the ultrasound base 114) and generate one or more shear wave elastography images 124. The processor 126 may utilize the instructions 132 (e.g., by performing step 133a) to analyze data from the probe 106, the sensor 108, and/or the position tracking system 116 to determine the position of the probe 106. The position of the probe 106 may be determined in real time. The position and/or image 124 may additionally be stored in memory 130. The processor 126 may determine one or more target locations or planes at which images are to be taken (e.g., by performing step 133b of the instructions 132) and determine a difference between the current position of the probe 106 and the target plane (e.g., by performing step 133c). The processor 126 may execute the instructions 132 to generate one or more visual indicators 122 depicting the current position of the imaging plane 110 relative to the target plane of the anisotropic tissue 105 (e.g., by performing step 133d). When the probe 106 and the target plane are aligned, the processor 126 may execute the instructions 132 to generate a shear wave image (e.g., step 133e). The processing unit 118 may include a display 120, the display 120 displaying an image 124 and/or a visual indicator 122. Although certain specific steps 133a-133e of the instructions 132 have been described, it should be understood that the instructions 132 may include more or fewer steps, and that these steps may be repeated, rearranged, and/or selectively altered.

The processor 126 may generate one or more visual indicators 122 that may guide the positioning of the probe 106 such that the position of the imaging plane 110 matches the position of the target plane. The visual indicator 122 may provide feedback to the user 102 to manually adjust the position of the imaging plane 110 by moving the probe 106. The processor 126 may directly adjust the position of the probe 106 and/or the imaging plane 110 by generating instructions to update the position of the probe 106 and/or the imaging plane 110, and the controller 128 may control an actuator based on those instructions to adjust the position of the probe 106 and/or the imaging plane 110 to match the position of the target plane. The ultrasound system 112 may generate various target planes to guide shear wave elastography of the anisotropic tissue 105 at various locations relative to the anisotropic tissue 105. The visual indicator 122 may improve the accuracy with which the probe 106 is positioned at each of the various locations.
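The dynamically updated indicator described above can be sketched in a few lines: each new tracked probe angle yields either an "aligned" message or a rotate-by instruction. This is a minimal illustrative sketch, not the patented implementation; the function name, message strings, and 2° threshold are assumptions.

```python
# Illustrative sketch of a dynamically updated visual indicator: given the
# current imaging-plane angle and the target-plane angle (both in degrees,
# relative to the reference plane), emit a guidance message. The names and
# the 2-degree alignment threshold are hypothetical.

def update_indicator(current_deg, target_deg, threshold_deg=2.0):
    diff = target_deg - current_deg
    if abs(diff) <= threshold_deg:
        return "aligned"
    direction = "counterclockwise" if diff > 0 else "clockwise"
    return f"rotate {abs(diff):.1f} deg {direction}"

# Probe swept from 90 degrees toward a 45-degree target plane.
messages = [update_indicator(a, 45.0) for a in (90.0, 50.0, 45.5)]
```

Each tracker update would recompute the message, so the on-screen indicator changes in real time as the probe moves.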

The ultrasound system 112 may capture shear wave elastography images (data) along the imaging plane 110, and may also store and/or retrieve previously recorded data for later viewing. The image 124 and/or location data may be stored in the memory 130 and recalled as needed. The ultrasound system 112 may classify different images based on the particular location at which the data was collected and may allow the user 102 to sort or select images based on these classifications. For example, the ultrasound system 112 may have a "find orthogonal" tool that selects an image that is orthogonal to the currently displayed image. Reports may be generated based on a selection of saved data in memory 130.
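As one way to picture the "find orthogonal" tool, each saved image can carry the angle of its imaging plane relative to the reference plane, and the tool can pick the saved image closest to 90° away from the current one. This is a hypothetical sketch; `find_orthogonal`, the dictionary layout, and the sample angles are illustrative, not from the source.

```python
# Hypothetical sketch of a "find orthogonal" tool: each saved image stores
# the angle (degrees) of its imaging plane relative to the reference plane,
# and the tool selects the saved image whose plane is closest to 90 degrees
# away from the currently displayed image.

def find_orthogonal(saved_images, current_angle_deg):
    def angular_distance(a, b):
        # Smallest separation between two plane angles (planes repeat every 180).
        d = abs(a - b) % 180.0
        return min(d, 180.0 - d)

    def deviation_from_orthogonal(image):
        # 0 means the saved plane is exactly orthogonal to the current one.
        return abs(angular_distance(image["angle_deg"], current_angle_deg) - 90.0)

    return min(saved_images, key=deviation_from_orthogonal)

images = [{"id": 1, "angle_deg": 0.0},
          {"id": 2, "angle_deg": 45.0},
          {"id": 3, "angle_deg": 88.0}]
best = find_orthogonal(images, current_angle_deg=0.0)  # image 3, at 88 degrees
```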

Fig. 2 illustrates a block diagram of an ultrasound imaging system 200 according to some embodiments of the present disclosure. The ultrasound imaging system 200 may be used to implement, at least in part, the ultrasound system 112 of figure 1. Fig. 2 shows an ultrasound imaging system 200 including an ultrasound probe 206, a transducer array 270, a microbeamformer 268, a transmit/receive (T/R) switch 250, a beamformer 252, a transmit controller 210, a signal processor 254, a B-mode processor 262, a scan converter 260, a multi-plane reformatter 266, a volume renderer 264, an image processor 258, a graphics processor 256, a user interface 274, an input device 272, and an output device 220. The components shown in fig. 2 are merely illustrative, and other variations, including removal of components, combination of components, rearrangement of components, and replacement of components, are contemplated.

The ultrasound imaging system 200 includes a probe 206, and in some embodiments, the probe 206 may be used to implement the probe 106 of fig. 1. The probe 206 is positioned around the subject and is used to capture data about the subject's tissue. In the ultrasound imaging system 200 of fig. 2, the ultrasound probe 206 includes a transducer array 270 for transmitting ultrasound and receiving echo information. Various transducer arrays are known in the art, for example, linear arrays, convex arrays, or phased arrays. The transducer array 270 may include, for example, a two-dimensional array of transducer elements capable of scanning in elevation and azimuth dimensions for 2D and/or 3D imaging. The transducer array 270 is coupled to a microbeamformer 268, typically located in the ultrasound probe 206, which microbeamformer 268 controls the transmission and reception of signals by the transducer elements in the array. In this example, the microbeamformer 268 is coupled, e.g., by a probe cable or wirelessly, to the transmit/receive (T/R) switch 250, which switches between transmit and receive. The T/R switch 250 may thus protect the beamformer 252 from high energy transmit signals. In some embodiments, the T/R switch 250 and other elements of the system may be included in the transducer probe, rather than in a separate ultrasound system base (e.g., ultrasound base 114).

The transmission of ultrasound beams from the transducer array 270 under control of the microbeamformer 268 is directed by the transmit controller 228, which is coupled to the T/R switch 250 and the beamformer 252. The transmit controller 228 receives input from user manipulation of the input device 272 of the user interface 274. The transmit controller 228 may be a component of an ultrasound system base (e.g., ultrasound base 114 of fig. 1) or may be a general controller of an ultrasound system (e.g., controller 128 of fig. 1). The user interface 274 may be implemented using one or more inputs (e.g., a control panel that may include soft and/or hard controls) and an output device (e.g., one or more displays), as described further below. One of the functions controlled by the transmit controller 228 is the direction in which the beam is steered. The beams may be steered straight ahead from the transducer array (perpendicular to the transducer array), or at different angles for a wider field of view. The partially beamformed signals produced by the microbeamformer 268 are coupled to the beamformer 252, where the partially beamformed signals from individual patches of transducer elements are combined into fully beamformed signals. The transmit controller 228 may record the position of the beam relative to the probe 206. As described herein, the positions of the beam and probe 206 may be used to determine the position of an imaging plane (e.g., imaging plane 110 of fig. 1).

The beamformed signals may be coupled to a signal processor 254. The signal processor 254 may process the received echo signals in various ways, such as bandpass filtering, decimation, I and Q component separation, and harmonic signal separation. The signal processor 254 may also perform signal enhancement such as speckle reduction, signal compounding, and noise cancellation. The processed signals may be coupled to a B-mode processor 262, and the B-mode processor 262 may employ amplitude detection to image structures in the body. The signals generated by the B-mode processor 262 may be coupled to a scan converter 260 and a multiplanar reformatter 266. The scan converter 260 arranges the echo signals in the desired image format according to the spatial relationship in which they were received. For example, the scan converter 260 may arrange the echo signals into a two-dimensional sector-shaped format, or a pyramidal three-dimensional (3D) image. The multiplanar reformatter 266 can convert echoes received from points in a common plane in a volumetric region of the body into an ultrasound image of that plane, as described in U.S. patent US 6443896 (Detmer). The volume renderer 264 converts the echo signals of a 3D data set into a projected 3D image as seen from a given reference point, for example as described in US 6530885 (Entrekin et al.). The 2D or 3D images may be coupled from the scan converter 260, the multiplanar reformatter 266, and the volume renderer 264 to the image processor 258 for further enhancement, buffering, and temporary storage for display on the output device 220. The output device 220 may include a display device implemented using various known display technologies, such as LCD, LED, OLED, or plasma display technologies. In some embodiments, the output device 220 may implement the display 120 of fig. 1.

The graphics processor 256 may generate a graphical overlay for display with the ultrasound image. These graphic overlays may include standard identifying information such as the patient's name, date and time of the image, imaging parameters, and the like. The graphics processor may receive input from an input device 272, such as a typed patient name. The input device 272 may include one or more mechanical controls, such as buttons, dials, trackballs, physical keyboards, etc., which may also be referred to herein as hard controls. Alternatively or additionally, the input device 272 may include one or more soft controls, such as buttons, menus, soft keyboards, and other user interface control elements implemented, for example, using touch-sensitive technology (e.g., resistive, capacitive, or optical touch screens). To this end, the ultrasound imaging system 200 may include a user interface processor (i.e., the processor 226) that may control the operation of a user interface, such as the functionality associated with the soft controls. One or more user controls may be co-located on a control panel. For example, one or more mechanical controls may be provided on the console, and/or one or more soft controls may be co-located on a touch screen that may be attached to or integrated with the console. For example, in some embodiments, the input device 272 may be part of the processing unit 118 and/or ultrasound base 114 of fig. 1.

The ultrasound images and associated graphic overlays may be stored in memory 230, for example, for offline analysis. Additionally, the memory 230 may store processor-executable instructions, including instructions for performing functions associated with the user interface 274. In some embodiments, the user interface 274 may include a graphical user interface configured to display, responsive to the processor of the system 200, graphical user interface elements for providing guidance to a sonographer performing shear wave elastography of anisotropic tissue in accordance with any of the examples herein. The memory 230 may be part of the ultrasound base unit or may be general-purpose memory that is part of a computer system coupled to the base unit (e.g., the memory 230 may be the memory 130 of the processing unit 118 of fig. 1). The user interface 274 may also be coupled to the multiplanar reformatter 266 for selecting and controlling the display of a plurality of multiplanar reformatted (MPR) images. In some examples, the functionality of two or more processing components (e.g., beamformer 252, signal processor 254, B-mode processor 262, scan converter 260, multiplanar reformatter 266, volume renderer 264, image processor 258, graphics processor 256, processor 226, etc.) may be combined into a single processing unit, such as the processor 126 of fig. 1.

The probe 206, sensor 208, microbeamformer 268, and transducer array 270 may be combined into a handheld unit 276. The handheld unit 276 may be shaped to be held in a user's hand. The handheld unit 276 may have a "head" or "face" that contains the transducer array 270 and is shaped to be positioned on a surface of the subject (e.g., against the skin). In some embodiments, the sensor 208 may implement at least one sensor 108 of fig. 1. Although only one sensor 208 is shown in fig. 2, it should be understood that the sensor 208 may represent multiple sensors located around the probe 206. The sensor 208 may be integral, e.g., contained within the housing of the probe 206, may be a separate component attached to the exterior of the housing of the probe 206, or may be a combination of integral and attached sensors. The sensor 208 may be located in a fixed position relative to the probe 206 such that by knowing the position of the sensor 208, the positions of the probe 206 and the imaging plane are also known.

Fig. 3A-3F illustrate probe placement relative to anisotropic tissue according to examples of the present disclosure. The position of the probe may be defined by the position of a plane intersecting the tissue. The relationship between these planes may be determined and may be used to guide placement of the probe during one or more imaging operations. Figures 3A-3F depict anisotropic tissue 305, fibers 334, probe 306, imaging plane 310, rotated imaging planes 310' and 310'', reference plane 311, and target planes 313 and 313a-313d. Fig. 3A and 3C depict the probe placed at two different positions relative to the anisotropic tissue. Figures 3B and 3D depict the imaging plane through the tissue captured by the probe at the positions of figures 3A and 3C, respectively. Fig. 3E is a top view of the tissue depicting the target plane 313. Fig. 3F is a top view similar to fig. 3E, but with multiple target planes 313a-313d present. The examples shown in fig. 3A-3F are merely illustrative, and other variations, including removing components, combining components, rearranging components, and replacing components, are contemplated.

In some embodiments, the probe 306 may be an implementation of the probe 106 of fig. 1. The probe 306 may be placed against the surface of the subject (e.g., on the skin) adjacent to a region of anisotropic tissue 305. The anisotropic tissue 305 may include fibers 334, the orientation of which may determine, at least in part, the anisotropic properties of the anisotropic tissue 305. That is, when measured with the imaging plane of the probe 306 aligned with the direction of the fibers of the tissue 305, a given characteristic (e.g., stiffness) of the tissue 305 may be different than when measured with the imaging plane of the probe 306 misaligned with the direction of the fibers. The probe 306 collects data from an imaging plane 310, which may include a portion of the anisotropic tissue 305. A reference plane 311 may be defined in the tissue and an angle theta may be measured between the current position of the imaging plane 310 and the position of the reference plane 311. One or more target planes 313, 313a-d may also be defined in the anisotropic tissue 305, which may have an angle α with respect to the reference plane and an angle β with respect to the current imaging plane 310.

Fig. 3A depicts a probe 306 for shear wave elastography imaging of a region of anisotropic tissue 305. The anisotropic tissue 305 may include fibers 334. The fibers 334 may be substantially aligned. The fibers 334 may have mechanical properties measured across the fibers that are different from the same mechanical properties measured along the fibers. In some cases, the orientation of the fibers 334 may define anisotropic properties of the anisotropic tissue 305. The fibers 334 may be, for example, muscle fibers. The probe 306 can collect data from an imaging plane 310 extending through the tissue.

FIG. 3B shows a cross-sectional view of the anisotropic tissue 305 corresponding to the imaging plane 310. In this example, the imaging plane is shown as being generally aligned along the long axis of the fibers 334. The fibers 334 extend across the imaging plane 310. The imaging plane 310 is depicted as being generally rectangular, but other shapes of the imaging plane 310, such as curvilinear, trapezoidal, fan-shaped, and/or radial, may be used in other examples.

Fig. 3C and 3D are similar to fig. 3A-B, respectively, but the probe 306 has changed position relative to its position in fig. 3A-B, and is now collecting data from the imaging plane 310'. In fig. 3C and 3D, the imaging plane 310' is shown as being generally across the long axis of the fibers 334 (e.g., rotated about 90° from figs. 3A-B). The image of fig. 3D shows the fibers 334 in cross-section, entering and exiting the imaging plane 310'.

Fig. 3E and 3F depict simplified top views of the probe 306 and anisotropic tissue 305. For reference, the anisotropic tissue 305 is shown in the top views even though it may be occluded (e.g., under the skin) and not directly visible. The fibers of the anisotropic tissue 305 have been omitted from these figures for clarity, but it should be understood that the anisotropic tissue 305 may still have fibers, which may, for example, extend from left to right in this view. Reference plane 311 may be defined relative to the anisotropic tissue 305 such that the position of the imaging plane 310 may be described relative to the position of the reference plane 311. In this example, reference plane 311 is defined as a plane aligned with the long axis of the fibers 334 (e.g., the location of the imaging plane 310 in figs. 3A-3B). Since the probe 306 is rotated relative to the reference plane, the position of the imaging plane 310' can be measured as the angle θ between the reference plane 311 and the imaging plane 310'. In the example of fig. 3E, the angle θ may be about 90°.

In some embodiments, the reference plane 311 may be defined relative to one or more features of the anisotropic tissue 305. The reference plane 311 may be aligned with known anatomical structures (e.g., along the long axis of the bone or along the intended direction of the muscle fibers). The reference plane 311 may be aligned with a feature of the anisotropic tissue 305. The reference plane 311 may be aligned with the long axis of the fibers of the muscle tissue, for example. The system (e.g., ultrasound system 112 of fig. 1) may be used to determine the location of reference plane 311. In an example, a 3D scan of the anisotropic tissue 305 can be performed, and anatomical features can be identified by the system (e.g., using the processor 126 of fig. 1) through techniques such as machine learning. The system may then define a reference plane 311 based on the identified anatomical features. In one embodiment, the system may estimate the orientation of the fibers in the anisotropic tissue 305 based on a 3D scan and align the reference plane 311 with the estimated long axis of the fibers. Other methods of defining the reference plane 311 may be used in other examples.

In some embodiments, the reference plane 311 may be selected by a user (e.g., user 102 of fig. 1). The probe 306 may include a reference frame selector (e.g., a button). When the reference frame selector is activated, the ultrasound system (e.g., ultrasound system 112 of fig. 1) may record the current position of the imaging plane 310 as the reference plane 311. For example, in the example of fig. 3A-F, the reference frame selector may be activated when the probe 306 is in the position of fig. 3A-3B. Other methods of user selection of the reference plane 311 may be used in other embodiments.

A system (e.g., ultrasound system 112 of fig. 1) may be used to maintain images taken along imaging planes 310', 310'' and position information for each image defined relative to the reference plane 311 (e.g., each image may be associated with an angle θ). The system may include a "recalibrate reference plane" tool. The system may optionally allow selection of a new reference plane for the saved images. The new reference plane may be selected from one of the saved images and may be selected by a user of the system. The system may update the location data saved with the images (e.g., by overwriting the old location data) to reflect the location of each image relative to the new reference plane. The "recalibrate reference plane" tool may be used to account for changes in the measurement position over time, for example, if the probe 306 is moved during a measurement sequence.
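The recalibration step amounts to rebasing every stored angle onto the newly chosen reference. A minimal sketch, assuming each saved image stores its θ in degrees (the function name, dictionary keys, and [0, 360) normalization are hypothetical):

```python
# Hypothetical sketch of a "recalibrate reference plane" tool: each saved
# image stores the angle theta of its plane relative to the old reference
# plane. Promoting one saved image to be the new reference plane overwrites
# every stored angle with its offset from that image's angle.

def recalibrate(saved_images, new_reference_id):
    new_ref_angle = next(img["theta_deg"] for img in saved_images
                         if img["id"] == new_reference_id)
    for img in saved_images:
        # Overwrite the old position data, normalized to [0, 360).
        img["theta_deg"] = (img["theta_deg"] - new_ref_angle) % 360.0
    return saved_images

images = [{"id": 1, "theta_deg": 0.0},
          {"id": 2, "theta_deg": 45.0},
          {"id": 3, "theta_deg": 90.0}]
recalibrate(images, new_reference_id=2)  # image 2 becomes the 0-degree plane
```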

Fig. 3E and 3F show examples of positioning of the probe 306 relative to the target plane 313 (in fig. 3E) or relative to any of a plurality of target planes (in fig. 3F). The target planes 313, 313a-d may represent locations (or imaging planes) where it may be desirable to acquire shear wave elastography images. In this way, the indicator of the target plane 313 may indicate a position at which the imaging plane 310' of the probe 306 should be aligned in order to acquire an image on a desired imaging plane. An ultrasound system (such as ultrasound system 112 of fig. 1) may generate operator guidance (e.g., graphical user interface elements providing indicators for one or more target planes) to assist a user in acquiring appropriate image data. The target planes 313, 313a-d may have predetermined positions relative to the reference plane 311. In some examples, the target plane 313 may be defined at a predetermined angle relative to the reference plane 311 (e.g., by the system and/or in response to user input). The plurality of target planes 313a-d may be defined at set angular intervals from the reference plane. In one example, the plurality of target planes are defined at angular intervals of 15°, such that a first target plane is defined at 15° relative to the reference plane and further target planes are defined at angular increments of 15° relative to the previous target plane. Other angular intervals (e.g., 5°, 10°, 20°, 25°, etc.) may be used in other examples, and a user of the system may select the angular interval and range. In some examples, the system may be preset to define the target planes at given angular intervals, and the preset may be adjustable by the user in some cases.
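The angular-interval scheme above can be sketched directly. This is an illustrative helper, not the patented method; the function name, the 90° default sweep range, and the float tolerance are assumptions.

```python
# Sketch of generating target-plane angles at a user-selected angular
# interval relative to the reference plane (0 degrees), as in the
# 15-degree example above. Names and the default 90-degree range are
# illustrative.

def generate_target_planes(interval_deg, range_deg=90.0):
    """Return target-plane angles at set increments from the reference
    plane, e.g. interval_deg=15 -> [15, 30, 45, 60, 75, 90]."""
    angles = []
    angle = interval_deg
    while angle <= range_deg + 1e-9:   # small tolerance for float steps
        angles.append(round(angle, 6))
        angle += interval_deg
    return angles

planes_15 = generate_target_planes(15.0)    # 15..90 in 15-degree steps
planes_22 = generate_target_planes(22.5)    # the four planes of Fig. 3F
```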

The position of each target plane 313, 313a-d can be described relative to the reference plane 311. In the example shown in fig. 3E, there may be an angle α between the position of the reference plane 311 and the position of the target plane 313, with all planes rotated about a common axis. There may be an angle β between the current imaging plane 310' and the target plane 313. Similar to figs. 3A-3D, the reference plane 311 is selected to be aligned with the long axis of the fibers 334 of the anisotropic tissue 305. The target plane 313 has been selected to be at a position of about 45° (counterclockwise) from the reference plane 311. Thus, in this example, the angle α may be about 45°. Other angles and positions of the target plane may be used in other examples.

The position of the imaging plane 310 may be adjusted to match the position of the target plane 313. This may take the form of decreasing the angle β until it is about 0° or within a threshold of 0°. When the imaging plane 310 is aligned with the position of the target plane 313, a shear wave elastography image may be generated. When the imaging plane 310 is close to the target plane 313 (e.g., the angle β is below a selected threshold), a shear wave elastography image may be recorded. The shear wave elastography image may be recorded automatically by a system (e.g., ultrasound system 112 of fig. 1) or upon selection by a user. The current angle θ may also be recorded with the shear wave elastography image. As illustrated in the example of fig. 3E, the imaging plane 310' is currently at an angle θ of about 90° relative to the reference plane 311. The current position of the imaging plane 310' may match the position of a previous target plane. The imaging plane 310' may need to be rotated approximately -45° (e.g., 45° clockwise) to align with the position of the target plane 313. Although only one target plane 313 and one rotation are shown in this example, the position of the imaging plane 310 may be rotated again to match a next target plane. Other rotations and target plane positions are used in other examples.
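The threshold-triggered recording just described can be sketched as follows. This is an illustrative sketch under stated assumptions: the function name, dictionary keys, and 1° threshold are hypothetical, and β is reduced here to the difference α − θ of in-plane angles.

```python
# Sketch of threshold-triggered recording: a shear wave elastography image
# is recorded once the angle beta between the imaging plane and the target
# plane falls within a selected threshold of 0 degrees, and the current
# angle theta is stored with the image. Names and threshold are illustrative.

def maybe_record(theta_deg, alpha_deg, threshold_deg=1.0, recorded=None):
    """theta_deg: current imaging-plane angle from the reference plane.
    alpha_deg: target-plane angle from the reference plane.
    beta = alpha - theta is the remaining correction."""
    recorded = [] if recorded is None else recorded
    beta = alpha_deg - theta_deg
    if abs(beta) <= threshold_deg:
        recorded.append({"theta_deg": theta_deg, "target_deg": alpha_deg})
    return recorded

log = []
for theta in (90.0, 60.0, 45.5):   # probe swept toward the 45-degree target
    log = maybe_record(theta, 45.0, recorded=log)
```

Only the final pose, within the threshold of the 45° target, triggers a recording; θ is stored alongside the image as in the text above.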

FIG. 3F is similar to FIG. 3E, but in the embodiment of FIG. 3F the system has generated multiple target planes 313a-313d, and the probe 306 is rotated so that the imaging plane 310'' has a different position than the imaging planes 310 and 310' of FIGS. 3A-3E. In FIG. 3F, similar to FIGS. 3A-3E, a reference plane 311 has been defined that is aligned along the axis of the fibers of the anisotropic tissue 305. The ultrasound imaging system may generate a plurality of target planes 313a-d and then generate instructions to position the imaging plane in alignment with each of the plurality of target planes 313a-d in a sequence. The system may order the target planes 313a-d in the sequence to minimize movement of the probe 306 between each alignment of the sequence. In the example shown in FIG. 3F, an ultrasound system (e.g., ultrasound system 112 of FIG. 1) has generated four target planes 313a-d relative to the reference plane 311. Each of the target planes 313a-313d is spaced apart at an angular interval of about 22.5°. Thus, as measured clockwise from the reference plane 311, target plane 313a is at approximately 22.5°, target plane 313b at approximately 45°, target plane 313c at approximately 67.5°, and target plane 313d at approximately 90°. Although the target planes 313a-d have been shown at regular rotational intervals, it should be understood that the target planes may be generated based on a variety of criteria. In some examples, the target planes may be irregularly spaced. In some examples, other position changes besides rotation may be indicated. Similarly, although only four target planes 313a-d are shown in this example, it should be understood that the system may produce more or fewer target planes.
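One simple way to order the sequence so probe movement is minimized is a greedy nearest-neighbor sweep from the current angle. The source does not specify the ordering algorithm, so this is one plausible sketch; the function name and greedy strategy are assumptions.

```python
# Sketch of sequencing target planes to reduce total probe rotation: a
# greedy nearest-neighbor ordering starting from the current imaging-plane
# angle. The ordering strategy is an assumption, not the patented method.

def order_targets(current_angle_deg, target_angles_deg):
    remaining = list(target_angles_deg)
    sequence = []
    angle = current_angle_deg
    while remaining:
        # Visit the closest remaining target plane next.
        nxt = min(remaining, key=lambda t: abs(t - angle))
        remaining.remove(nxt)
        sequence.append(nxt)
        angle = nxt
    return sequence

# Fig. 3F example: probe near 22.5 degrees, four planes swept in order.
seq = order_targets(22.5, [90.0, 45.0, 67.5, 22.5])
```

For equally spaced planes this reproduces the monotonic sweep of Fig. 3F, rotating 22.5° between consecutive alignments.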

As shown in fig. 3F, the probe has been rotated so that the current imaging plane 310'' is aligned with the first target plane 313a. Once the system (e.g., ultrasound system 112 of fig. 1) records an image at that location (either automatically or after prompting the user to save the image), the system may update the guidance to position the probe 306 in alignment with the next target plane 313b. Thus, the system may generate an instruction indicating that the probe 306 should be rotated 22.5° from the position of the current imaging plane 310'' to the position of the target plane 313b. The process may be repeated until an image has been taken at each target plane 313a-d and the system indicates that no target planes remain.

Fig. 4A-4B illustrate other aspects of the present disclosure, and more particularly, coordinate systems associated with the probe and/or other components of the system. Fig. 4A-B show probe 406, anisotropic tissue 405, fibers 434, position tracking system 416, position tracking coordinate system 440, reference coordinate system 442, target coordinate system 444, and current coordinate system 446. In certain embodiments, the probe 406 may be an implementation of the probe 106 of FIG. 1. The examples shown in fig. 4A-4B are merely illustrative, and other variations, including removing components, combining components, rearranging components, and replacing components, are contemplated.

The coordinate systems shown in figures 4A-4B may be defined and used by the ultrasound system (e.g., by the processor 126 of the ultrasound system 112) to determine the difference between the current position of the imaging plane and the position of the target plane, and to generate instructions to adjust the position of the imaging plane to the position of the target plane. The coordinate systems may be used to determine positions and orientations in order to alter these positions in up to three spatial and three rotational dimensions. Fig. 4A shows the probe 406 aligned along a reference plane, which may be aligned along the long axis of the fibers 434 through the anisotropic tissue 405. For clarity, the probe 406 is depicted as being positioned directly on the anisotropic tissue 405, but it should be understood that one or more other tissues of the subject (e.g., fat, skin) may be present between the probe 406 and the region of the anisotropic tissue 405. Similarly, the position tracking system 416 is shown positioned on the lower surface of the anisotropic tissue 405, but it should be understood that one or more materials may be present between the position tracking system 416 and the anisotropic tissue 405 (e.g., other tissue of the subject, clothing of the subject, air, bedding, etc.).

The reference plane, probe 406, position tracking system 416, and target plane may each be associated with a coordinate system. The coordinates may be 3D coordinates, such as xyz coordinates. Other coordinate systems (e.g., polar coordinates) may be used in other examples. Each coordinate system may be used to define the position of its respective plane in space. For example, as shown in fig. 4A, the reference plane is positioned along the x and y axes of the reference coordinate system 442. The positions of the reference plane, probe 406, position tracking system 416, and target plane may be tracked within their respective coordinate systems, and transformations based on the positions of the coordinate systems may be used to determine the positions of these elements relative to each other.

The position tracking coordinate system 440 (or world coordinate system) may be an xyz coordinate system with axes denoted as x_world, y_world, and z_world. The position tracking coordinate system 440 may be aligned with the position of the position tracking system 416, which may be aligned with a known position relative to the object (e.g., a predetermined placement in an examination room, or a placement determined intra-operatively using known registration methods, such as rigid point-based registration). The position tracking coordinate system 440 may thus be used to describe positions relative to the position tracking system 416, and thus relative to the object.

The reference coordinate system 442 may be an xyz coordinate system with axes denoted as x_ref, y_ref, and z_ref. As described herein, the reference coordinate system 442 may be aligned with an anatomical feature of the anisotropic tissue 405 and/or selected by a user. Using the known relationship between the reference coordinate system 442 and the world coordinate system 440, the location of the tissue may be known relative to the position tracking system. A transformation worldTref may be determined to transform coordinate information from the reference coordinate system 442 to the world coordinate system 440. Once the transformation is determined, the object may need to be held in a fixed position (e.g., by resting) relative to the position tracking system 416.

One or more target planes may be defined relative to the reference plane. Each target plane may have a target coordinate system 444. The target coordinate system 444 may be an xyz coordinate system with axes denoted as x_target, y_target, and z_target. A transformation refTtarget may be determined to transform coordinate information from the target coordinate system 444 to the reference coordinate system 442. By linking the target coordinates to the reference coordinate system 442 (which in turn is linked to the position tracking coordinate system 440), the position of each target plane relative to the position tracking system 416 can be determined.

The current position of the probe 406 may be expressed in a current coordinate system 446. The current coordinate system 446 may be linked to the position of the probe 406 and may move as the probe 406 moves. The current coordinate system 446 may be an xyz coordinate system with axes denoted as x_current, y_current, and z_current. As shown in fig. 4A, the current coordinate system 446 is aligned with the reference coordinate system 442. As indicated by the arrows in fig. 4A, the reference coordinate system 442 may be transformed to the world coordinate system 440 using the transformation worldTref. As shown in fig. 4B, the current coordinate system 446 has been rotated approximately 90° to align with the target coordinate system 444. The transformation between the current coordinate system 446 and the world coordinate system 440 may be determined by the transformation worldTcurrent. This transformation may be determined by the position tracking system 416.

A processor of the ultrasound system (e.g., processor 126 of fig. 1) may use the transformation between coordinate systems to generate directions for positioning probe 406. The processor may generate a transformation of the current imaging plane position relative to the target plane position according to equation 1:

targetT_current = (refT_target)^-1 (worldT_ref)^-1 worldT_current    (Equation 1)

The processor may calculate the position of the current imaging plane relative to the position of the target plane in real time. As described herein, the processor may generate feedback or instructions to adjust the position of the imaging plane to reduce the difference between the position of the current imaging plane and the position of the target plane. The processor may calculate the difference between the position of the imaging plane and the position of the target plane based on the transformation targetT_current. As described herein, the display (e.g., display 120 of fig. 1) may display feedback (e.g., visual indicator 122 of fig. 1) based on the calculated difference as it changes in real time.
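As an illustration only (not part of the claimed system), the chain of transformations in Equation 1 could be composed with 4x4 homogeneous matrices; the function and variable names below simply mirror the notation used above.

```python
import numpy as np

def target_T_current(ref_T_target, world_T_ref, world_T_current):
    """Equation 1: pose of the current imaging plane expressed in the
    target-plane coordinate system, from 4x4 homogeneous transforms."""
    return (np.linalg.inv(ref_T_target)
            @ np.linalg.inv(world_T_ref)
            @ world_T_current)
```

When the probe reaches the target plane (i.e., worldT_current equals worldT_ref composed with refT_target), the result reduces to the identity matrix, corresponding to zero remaining difference.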

The general transformation of Equation 1 may be used to generate specific instructions for adjusting the position of probe 406. In the example shown in figs. 4A-4B, the target plane coordinate system 444 is rotated 90° relative to the reference plane coordinate system 442 about their shared y-axis, y_ref/y_target. In other words, there is a 90° angle between x_ref and x_target and between z_ref and z_target. In this example, the only change in position that the imaging plane needs to undergo is a rotation. The angle β between the current probe position (given by the current probe coordinate system 446) and the target coordinate system 444 may be determined by:

p = [v_x, 0, v_z]^T,  p̂ = p/‖p‖,  with v = targetR_current x̂_current    (Equation 2)

β = arccos(x̂_target^T p̂)    (Equation 3)

where T is the transpose operator, targetR_current is the rotational component of the transformation matrix targetT_current, v is the column unit vector along x_current transformed into the target coordinate system, and p̂ is the projection of v onto the plane XZ_target (the plane defined by the axes x_target and z_target of the target coordinate system 444), normalized to unit length. The angle β may be calculated in real time and may be presented to the user in one or more visual indicators (e.g., visual indicator 122 of fig. 1). Although in this example the positions of the target plane and the current imaging plane differ only by the angle β, other angles and/or other position elements may be calculated based on the transformation. In some examples, three rotation angles and three position vectors may be calculated to provide 3D/6DOF position information.
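The angle β described above could be computed from the rotational component of the transformation as in the following sketch (the function name is illustrative): the x_current axis is expressed in the target frame, projected onto the target XZ plane, normalized, and compared against x_target.

```python
import numpy as np

def beta_angle_deg(target_R_current):
    """Angle beta between x_target and the projection, onto the target
    XZ plane, of the current probe x-axis expressed in the target frame."""
    v = target_R_current @ np.array([1.0, 0.0, 0.0])  # x_current in target frame
    p = np.array([v[0], 0.0, v[2]])                   # project onto XZ_target
    p_hat = p / np.linalg.norm(p)                     # normalize
    # angle between x_target = [1, 0, 0] and the normalized projection
    return np.degrees(np.arccos(np.clip(p_hat[0], -1.0, 1.0)))
```

For the pure 90° rotation about the shared y-axis in the example of figs. 4A-4B, this returns 90°; for aligned frames it returns 0°.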

Figs. 5A-5C depict graphical user interface (GUI) elements 500, 505, and 510 including visual indicators according to some examples of the present disclosure. The visual indicators may be generated in real time and may provide guidance to the operator on how to reposition or manipulate the probe in order to acquire one or more target image planes. In some examples, the visual indicator may include instructions for repositioning the probe such that the imaging plane of the probe is aligned with the target image plane, so that an image of the subject on the target image plane may be acquired. The visual indicator may also provide feedback regarding the current position of the imaging plane relative to the reference plane and the one or more target planes. Figs. 5A-5C illustrate visual indicators 522a-c, reference indicator 548a, target indicators 550a-c, current indicators 552a-c, and tissue indicator 554a. The visual indicators in the examples herein may be used to implement any of the visual indicators of the present disclosure (e.g., visual indicator 122 of fig. 1). The components shown in figs. 5A-5C and their arrangement are merely illustrative, and other variations, including removing components, combining components, rearranging components, and replacing components, are contemplated.

Figs. 5A-5C depict example visual indicators 522a-c that may be used to guide the alignment of the current imaging plane with a target plane. Figs. 5A and 5B depict two visual indicators 522a and 522b when the current imaging plane is in a first position and a second position, respectively. The visual indicators of fig. 5B are labeled 522a' and 522b' to indicate the changed position. Fig. 5C depicts a visual indicator 522c according to another example. Any of the visual indicators 522a-c may be provided on a display (e.g., as GUI elements 500, 505, 510 of a graphical user interface). Any of the visual indicators 522a-c may include a reference indicator 548 configured to depict a location of a reference plane, a target indicator 550 configured to depict a location of a target plane, a current indicator 552 configured to depict a current location of the imaging plane, a tissue indicator 554 configured to depict a property of the tissue (e.g., a fiber orientation of the tissue), or a combination thereof. Notably, in some examples and as further described below, the current indicator may be configured to be dynamically updated (e.g., while the operator is moving the probe) such that it provides, at any given time, a visual indication of the relative angle between the target image plane and the current image plane (i.e., the image plane of the probe at the current location), or some quantitative or qualitative measure of how close the current image plane is to the target image plane.

In the example shown, visual indicator 522a includes a reference indicator 548a, a target indicator 550a, and a current indicator 552a in the form of rectangular bars, which schematically represent the footprint of an imaging array or probe (e.g., probe 106 of fig. 1). In some examples, the rectangular bars for the different indicators may be rendered slightly differently. For example, the indicator associated with the reference image plane (in this case, a rectangular bar) may be displayed in one color while the indicators associated with the other image planes may be rendered in a different color, or in some cases, one or more may appear as an outline while others may be filled in. In the particular illustrated example, the rectangular bar of the target indicator 550a is displayed as an outline, while the rectangular bar of the current indicator 552a is displayed as a filled bar. When the operator positions the probe in alignment with the target plane (as shown in fig. 5B), the filled bar aligns with and fills the outline of the target indicator. In other examples, an indicator may differ from the other indicators in one or two dimensions, e.g., the target or current indicator may be longer. In other examples, the indicators may be distinguished from each other differently, such as by different colors and/or different borders (e.g., solid-line borders, dashed-line borders, dotted-line borders, etc.). In some examples, a legend may be provided to help distinguish the indicators.

In this example, the indicators overlie the tissue indicator 554, which may include one or more lines representing the direction of the fibers of the anisotropic tissue (e.g., fibers 334 of fig. 3). In this example, the visual indicator 522a depicts a simulated top view (e.g., looking down the axis of the probe toward the tissue) in which the target plane is rotated by about 90° relative to the reference plane. Visual indicator 522a thus shows the target indicator 550a rotated approximately 90° from the reference indicator 548a. The system may update the current indicator 552a in real time to depict the current location of the imaging plane.

The visual indicator 522b shows the current position of the probe represented by the current indicator 552b along a target indicator 550b (which is a numerical scale). The target indicator 550b may be a normalized value or score (e.g., between 0-100) that represents how close the current location of the imaging plane is to the current target plane. Thus, one end of the target indicator 550b may represent a complete alignment of the current plane with the target plane (e.g., a score of 100), while the other end of the scale (e.g., a score of 0) may represent a complete misalignment of the current plane (e.g., orthogonal to the target plane). The current indicator 552b may be provided in the form of an arrow or a slider that moves along the target indicator 550b to represent the current position of the image plane relative to the target plane.
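The normalized 0-100 score could, for example, be derived from the misalignment angle β. The linear mapping below is only an assumed illustration, since the disclosure does not fix a particular scoring function:

```python
def alignment_score(beta_degrees):
    """Map misalignment angle to a 0-100 score: 100 = fully aligned
    with the target plane, 0 = orthogonal (90 degrees or more)."""
    beta = min(abs(beta_degrees), 90.0)
    return 100.0 * (1.0 - beta / 90.0)
```

With this choice, the slider of current indicator 552b would sit at 100 for a perfectly aligned plane, at 0 for an orthogonal one, and a 90-100 band could serve as the "sufficiently aligned" zone mentioned below.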

Visual indicator 522c shows a current indicator 552c and a target indicator 550c, both of which are 3D renderings of the probe. The 3D rendering may be a realistic depiction of the probe or may be stylized. The current indicator 552c may be a solid representation of the current position of the probe. The target indicator 550c may be a translucent (or wireframe) model of the probe at the position at which the imaging plane produced by the probe is aligned with the target plane. The target indicator 550c may be presented in a different color than the current indicator 552c and may change color in response to the changing position of the probe. An arrow may be displayed indicating the recommended movement (e.g., tilting) of the probe to match the target indicator 550c.

The instructions and feedback of the visual indicators 522a-c may be used to adjust the position of the probe until the current indicators 552a-c match the target indicators 550 a-c. The system may provide feedback (e.g., a change in color, hue, or displayed message) to indicate that the current imaging plane is aligned with the target plane (or within an acceptable tolerance of alignment). The system may at this point prompt the user to record a shear wave elastography image and/or the system may automatically record a shear wave elastography image. The visual indicators 522a-c may then be updated to display a new target plane along with the target indicators 550a-c, or may indicate that the measurement sequence has been completed.

As an example, fig. 5A shows two visual indicators 522a and 522b, each presenting the current position of the same probe. In both cases, the current indicators 552a, 552b indicate that the probe is not currently aligned with the target plane. The visual indicator 522a shows a simplified 2D representation of the probe position (as if looking down from above) and shows that the current indicator 552a is not aligned with the target indicator 550a. The visual indicator 522b shows the current indicator 552b along with a target indicator 550b, a numerical scale whose maximum value (e.g., 100) represents alignment with the target plane. Fig. 5B shows the same two visual indicators 522a' and 522b' reflecting the new position of the probe, with the imaging plane aligned with the target plane. In the case of visual indicator 522a', the current indicator 552a' is now within the target indicator 550a, which is an outline indicating the position of the probe at which the imaging plane is aligned with the target plane. In the case of visual indicator 522b', the current indicator 552b' has advanced along the target indicator 550b (the numerical scale) until it reaches or approaches the maximum value. In further examples, non-visual cues may additionally or alternatively be provided for guidance. For example, in addition to the visual indicator, when the system determines that the probe is positioned such that the imaging plane of the probe is sufficiently aligned with the target plane (e.g., corresponding to a position of the current indicator 552b anywhere between 90-100), the system may be configured to provide a tactile or audible indicator (e.g., a beep or other sound).

Figs. 6A-6B are example displays including shear wave elastography images, according to some embodiments of the present disclosure. The images may be presented to a user on a display (e.g., on display 120 of fig. 1) and/or may be stored in a memory of the system (e.g., memory 130) for later analysis. Figs. 6A-6B depict images 624, 624' and position indicators 654, 654'. In particular embodiments, images 624, 624' may be an implementation of image 124 of fig. 1. The components shown in figs. 6A-6B are illustrative only, and other variations, including removing components, combining components, rearranging components, and replacing components, are contemplated.

Figs. 6A and 6B illustrate example images 624, 624' that may be presented on a display, such as display 120 of fig. 1. The images 624, 624' may be presented to the user in real time and/or may be saved (such as on memory 130 of fig. 1) for later viewing. Position information for the image plane position of each image 624, 624' may be calculated and saved with the image 624, 624'. Images 624, 624' may include position indicators 654, 654' that indicate the position of the imaging planes shown in images 624, 624' relative to a reference plane. Fig. 6A shows image 624 aligned with the reference plane, as shown by position indicator 654, while fig. 6B shows image 624' aligned orthogonally to the reference plane, as shown by position indicator 654'. Position indicators 654, 654' may be superimposed over the images 624, 624'. The position indicators 654, 654' may be text or may be some other representation of the position, such as a graphical display or a normalized score.

Figs. 7A-7B are block diagrams of coordinate systems defined by probe placement and the orientation of anisotropic tissue according to some embodiments of the present disclosure. When determining the position of the imaging plane, the orientation of the tissue may be taken into account in order to accommodate changes in the direction of the tissue (e.g., changes in the orientation of the fibers of the tissue). Figs. 7A-7B depict anisotropic tissue 705, probe 706, fibers 734, position tracking system 716, position (or world) coordinate system 740, reference coordinate system 742, target coordinate system 744, current coordinate system 746, and tissue coordinate system 743 aligned with the muscle fibers. In a particular embodiment, the probe 706 may be an implementation of the probe 106 of fig. 1. The components shown in figs. 7A-7B are illustrative only, and other variations, including removing components, combining components, rearranging components, and replacing components, are contemplated.

The coordinate systems of figs. 7A-7B may be substantially similar to the coordinate systems described in figs. 4A-4B, but figs. 7A-7B incorporate a tissue coordinate system 743 to describe the location of the fibers of the anisotropic tissue 705. The tissue coordinate system 743 may be an xyz coordinate system that includes x_tissue, y_tissue, and z_tissue. In the example shown in figs. 7A-7B, the tissue is muscle tissue, and thus the tissue coordinate system 743 is a muscle coordinate system that includes x_muscle, y_muscle, and z_muscle. Although muscle is specified here, it should be understood that the tissue coordinate system may be aligned to any type of anisotropic tissue. The anisotropic tissue 705 may have elements, such as fibers 734, that are not aligned along any of the world coordinate system 740, the reference coordinate system 742, and/or the target coordinate system 744. The orientation of the fibers 734 may vary along the length of the anisotropic tissue 705. Thus, the tissue coordinate system 743 aligned with the fibers 734 may in this case also vary along the length of the tissue 705. The tissue coordinate system 743 may be defined by a user. The tissue coordinate system 743 may alternatively be generated automatically by extracting the orientation of the fibers 734 from the image, for example by Hough transform, by segmentation and parameterization of the fibers 734, and/or from backscatter acoustic properties. The target plane may be selected to align with the fibers 734. When there is a separate tissue coordinate system 743, Equation 1, which defines the conversion between the target coordinates and the current coordinates, can be modified to Equation 4:

targetT_current = targetT_muscle muscleT_ref (worldT_ref)^-1 worldT_current    (Equation 4)

where worldT_current is the transformation between the current coordinate system 746 and the world coordinate system 740, and targetT_muscle is the transformation between the tissue coordinate system 743 and the target coordinate system 744. Similar to the instructions of Equations 2 and 3, further calculations may be performed to generate specific instructions, such as specific angle or position changes between elements of the target coordinate system 744 and the current coordinate system 746.
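Continuing the earlier illustration, Equation 4 could be composed the same way; the names again mirror the notation above and are illustrative only, not the disclosed implementation:

```python
import numpy as np

def target_T_current_via_tissue(target_T_muscle, muscle_T_ref,
                                world_T_ref, world_T_current):
    """Equation 4: the Equation 1 chain routed through a tissue (muscle)
    coordinate system aligned with the fiber orientation."""
    return (target_T_muscle @ muscle_T_ref
            @ np.linalg.inv(world_T_ref) @ world_T_current)
```

As with Equation 1, the product reduces to the identity matrix when the current imaging plane coincides with the fiber-aligned target plane.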

Figs. 7A and 7B show the probe 706 rotated to align with the target plane. In the example shown, the target plane is aligned with the fibers 734 of the anisotropic tissue 705. Fig. 7A shows the probe 706 aligned with the reference plane (e.g., the current coordinate system 746 and the reference coordinate system 742 are aligned). In this example, the probe 706 may need to undergo two rotations (about y_ref and about z_ref) to align the current imaging plane with the target plane. Fig. 7B shows the probe 706 aligned with the target coordinate system 744. The transformation calculated in Equation 4 can be used to generate instructions for the desired change in the position of the probe 706 in real time, and allows the target plane to be selected at different points along the tissue where the fibers 734 have different orientations.

FIG. 8 is a flow chart depicting multi-angle shear wave imaging, in accordance with a specific embodiment of the present disclosure. The example method 800 illustrates steps that may be performed, in any order, by the systems and/or apparatus described herein. The method may be performed by an ultrasound system, such as ultrasound system 112 of fig. 1. In some embodiments, the method 800 may be implemented in the instructions 132 of the system 112 and executed by the processor 126.

In the illustrated embodiment, the method 800 begins at block 810 with "determining a position of a probe relative to an imaging plane of anisotropic tissue based on position tracking data generated based at least in part on sensor data received from a position sensor coupled to the probe".

At block 820, the method involves "defining at least one target plane".

At block 830, the method includes "determining a difference between the position of the imaging plane and the position of the at least one target plane".

At block 840, the method includes "providing a visual indicator of the determined disparity on the display, and dynamically updating the visual indicator in response to a change in the position of the imaging plane".

At block 850, the method involves "producing at least one shear wave image of the target plane with the transducer".

The method 800 may also include generating instructions to adjust the position of the imaging plane. The instructions may be provided to the user, for example, via the visual indicator 122 of fig. 1. The user may adjust the position of the probe in response to the instructions, and may receive additional feedback as the adjustment continues. The instructions may also be used to automatically adjust the position of the imaging plane, for example by changing the position of the probe and/or by changing the position of the imaging plane relative to the probe. The position of the probe may be changed, for example, by an actuator controlled by a controller (such as controller 128 of fig. 1). The position of the imaging plane relative to the probe can be changed by, for example, moving the transducer in the plane or using beam steering.

The steps of method 800 may be repeated. For example, blocks 820-850 may be repeated for multiple target planes at different locations around the tissue. Steps of method 800 may run continuously (e.g., block 810) or may be performed in response to an input (e.g., block 850), which may be triggered, for example, by a user, or may be triggered when the system detects that a particular condition is met (e.g., a measured difference between the imaging plane and the target plane falls below a threshold).
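The repetition of blocks 820-850 over multiple target planes can be sketched as a simple guidance loop. The callback names (get_beta, update_indicator, acquire_image) and the 2° tolerance are assumptions for illustration, standing in for the tracking, display, and imaging components of the system:

```python
ALIGNMENT_TOLERANCE_DEG = 2.0  # assumed acceptable misalignment

def run_guidance(target_planes, get_beta, update_indicator, acquire_image):
    """Repeat blocks 820-850 of method 800 for each target plane:
    measure the misalignment, update the visual indicator, and acquire
    a shear wave image once the difference falls below the tolerance."""
    images = []
    for plane in target_planes:
        while True:
            beta = get_beta(plane)           # block 830: current difference
            update_indicator(plane, beta)    # block 840: dynamic feedback
            if abs(beta) < ALIGNMENT_TOLERANCE_DEG:
                images.append(acquire_image(plane))  # block 850
                break
    return images
```

The loop corresponds to the automatic-trigger variant described above: acquisition fires when the measured difference between the imaging plane and the target plane drops below the threshold.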

In various embodiments where components, systems and/or methods are implemented using programmable devices such as computer-based systems or programmable logic, it should be understood that the above-described systems and methods may be implemented using any of a variety of known or later developed programming languages, such as, for example, "C", "C++", "FORTRAN", "Pascal", "VHDL", and the like. Thus, various storage media can be prepared, such as magnetic computer disks, optical disks, electronic memory, and the like, which can contain information that can direct a device, such as a computer, to implement the above-described systems and/or methods. Once the information and programs contained on the storage medium are accessible to an appropriate device, the storage medium may provide the information and programs to the device, thereby enabling the device to perform the functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials (e.g., source files, object files, executable files, etc.) is provided to a computer, the computer can receive the information, appropriately configure itself and perform the functions of the various systems and methods described in the illustrations and flowcharts above to achieve the various functions. That is, the computer can receive portions of information from the disk pertaining to different elements of the above-described systems and/or methods, implement the individual systems and/or methods, and coordinate the functions of the individual systems and/or methods described above.

In view of this disclosure, it should be noted that the various methods and apparatus described herein may be implemented in hardware, software, and firmware. In addition, various methods and parameters are included as examples only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the present disclosure. The functionality of one or more processors described herein may be incorporated into a fewer number or single processing units (e.g., CPUs) and may be implemented using Application Specific Integrated Circuits (ASICs) or general purpose processing circuits programmed to perform the functions described herein in response to executable instructions.

Although the present system may have been described with particular reference to an ultrasound imaging system, it is also contemplated that the present system may be extended to other medical imaging systems that obtain one or more images in a systematic manner. Thus, the present system may be used to obtain and/or record image information related to, but not limited to, kidney, testis, breast, ovary, uterus, thyroid, liver, lung, musculoskeletal, spleen, heart, artery and vascular systems, as well as other imaging applications related to ultrasound guided interventions. Additionally, the present system may also include one or more programs that may be used with conventional imaging systems so that they may provide the features and advantages of the present system. Certain other advantages and features of the disclosure may become apparent to those skilled in the art upon examination of the disclosure or may be experienced by those employing the novel systems and methods of the disclosure. Another advantage of the present systems and methods may be that conventional medical imaging systems may be easily upgraded to incorporate the features and advantages of the present systems, devices and methods.

Of course, it should be understood that any of the examples, embodiments, or processes described herein may be combined with one or more other examples, embodiments, and/or processes, or separated and/or performed in separate devices or device parts, in accordance with the present systems, devices, and methods.

Finally, the above-discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. The specification and drawings are accordingly to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.
