Apparatus and method for image-based control of imaging system parameters

Document No.: 1175572    Publication date: 2020-09-22

Abstract: This technology, "Apparatus and method for image-based control of imaging system parameters," was created by C. F. Perry on 2019-03-15. An apparatus includes one or more processors configured to monitor motion of an imaging probe in an imaging system operating according to one or more parameters. The imaging probe is configured to output image data representative of an imaged volume. The one or more processors are configured to alter the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

1. An apparatus, comprising:

one or more processors configured to monitor motion of an imaging probe in an imaging system operating according to one or more parameters, the imaging probe configured to output image data representative of an imaged volume,

wherein the one or more processors are configured to alter the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

2. The apparatus of claim 1, wherein the one or more processors are configured to communicatively couple to one or more motion sensors operatively coupled to the imaging probe, wherein the one or more processors are configured to identify the motion of the imaging probe based on motion data output by the one or more motion sensors.

3. The apparatus of claim 1, wherein the one or more processors are configured to monitor the motion of the imaging probe by examining the image data and identifying the motion of the imaging probe based on the image data.

4. The apparatus of claim 1, wherein the one or more processors are configured to monitor the motion of the imaging probe by determining that the imaging probe is stationary, and the one or more processors are configured to alter the one or more parameters of the imaging system in response to determining that the imaging probe remains stationary.

5. The apparatus of claim 4, wherein the one or more processors are configured to alter values of the one or more parameters with respect to time while the imaging probe remains stationary.

6. The apparatus of claim 1, wherein the one or more processors are configured to alter the one or more parameters of the imaging system such that the imaging probe acquires the image data at different sensitivities in response to changes in the motion of the imaging probe.

7. The apparatus of claim 1, wherein the one or more processors are configured to alter the one or more parameters of the imaging system such that the imaging probe one or more of starts or stops acquiring the image data in response to a change in the motion of the imaging probe.

8. The apparatus of claim 1, wherein the one or more processors are configured to instruct the imaging probe to acquire a still image of the imaged volume based on the monitored motion of the imaging probe.

9. The apparatus of claim 1, wherein the one or more processors are configured to alter the one or more parameters of the imaging system by altering one or more of a gain, a temporal gain compensation, a line density, a receive frequency, a speckle reduction filter setting, a refresh rate, or a rendering setting of the imaging system.

10. The apparatus of claim 1, wherein the one or more processors are configured to alter the one or more parameters of the imaging system based on the motion of the imaging probe and without receiving or determining any other manual input provided by the imaging system or an operator of the imaging probe.

11. The apparatus of claim 1, wherein the one or more processors are configured to calculate an image quality metric based on the monitored motion of the imaging probe, the one or more processors configured to alter the one or more parameters of the imaging system in response to the image quality metric exceeding or falling below a specified threshold.

12. A method, comprising:

obtaining image data of an imaged volume using a movable imaging probe of an imaging system that operates according to one or more parameters;

monitoring motion of the imaging probe while the imaging probe is obtaining the image data of the imaged volume; and

altering, using one or more processors, the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

13. The method of claim 12, wherein the motion of the imaging probe is monitored based on data output by one or more sensors operatively coupled with the imaging probe.

14. The method of claim 12, wherein the motion of the imaging probe is monitored based on an analysis of the image data.

15. The method of claim 12, wherein monitoring the motion of the imaging probe comprises determining that the imaging probe remains stationary, wherein the one or more parameters of the imaging system are altered in response to determining that the imaging probe remains stationary.

16. The method of claim 14, wherein values of the one or more parameters are altered with respect to time while the imaging probe remains stationary.

17. A tangible and non-transitory computer-readable storage medium comprising instructions that direct one or more processors to:

monitor motion of an imaging probe of an imaging system operating in accordance with one or more parameters, the motion of the imaging probe being monitored while the imaging probe is obtaining image data of an imaged volume, the motion of the imaging probe being monitored based on one or more of data output by one or more sensors operatively coupled with the imaging probe or based on one or more changes in the image data; and

alter, using the one or more processors, the one or more parameters of the imaging system based on the one or more of the data output by the one or more sensors or based on the one or more changes in the image data.

18. The computer-readable storage medium of claim 17, wherein the instructions direct the one or more processors to:

monitor the motion of the imaging probe by determining that the imaging probe remains stationary; and

alter the one or more parameters of the imaging system in response to determining that the imaging probe remains stationary.

19. The computer-readable storage medium of claim 17, wherein the instructions direct the one or more processors to:

monitor the motion of the imaging probe by determining that the imaging probe is moving; and

alter the one or more parameters of the imaging system in response to determining that the imaging probe is moving.

20. The computer-readable storage medium of claim 17, wherein the instructions are configured to instruct the one or more processors to:

alter the one or more parameters of the imaging system by one or more of:

modifying a sensitivity at which the imaging probe acquires the image data,

starting acquisition of the image data,

stopping acquisition of the image data, or

altering one or more of a gain, a temporal gain compensation, a line density, a receive frequency, a speckle reduction filter setting, a refresh rate, or a rendering setting of the imaging system.

Technical Field

The subject matter disclosed herein relates generally to imaging systems.

Background

An imaging system generates image data representing an imaged volume based on various parameters of the imaging system. These parameters dictate how the volume is imaged and may have fixed values or may be manually altered by a user of the imaging system.

For example, some ultrasound imaging systems include software tools (e.g., applications) for automatically segmenting follicles in an ultrasound volume. These tools may have a user-selectable sensitivity slider that allows a user of the ultrasound system to manually alter the sensitivity at which images are acquired. A low sensitivity selected by the user may result in the ultrasound imaging system detecting fewer follicles in the ultrasound volume. A higher sensitivity selected by the user may result in more follicles being detected, but may also result in more false positive auto-detections of follicles by the imaging system.

The user may then need to find a sensitivity (or other imaging parameter value) that provides image data useful for revealing or detecting a volume (e.g., a follicle) within an image, but that is not so sensitive (or otherwise extreme) as to falsely detect or falsely identify a volume in the image(s). With manually selected imaging parameters, this may require the user to repeatedly select and/or alter the value(s) of the imaging parameters.

For certain types of imaging systems, this can be a difficult operation. With an ultrasound imaging system, for example, the user typically holds a handheld ultrasound imaging probe toward a region of interest while simultaneously viewing a display showing a representation of the image data and altering one or more imaging parameters. The user may not be able to maintain the desired region of interest within the field of view of the imaging probe while also altering the values of the imaging system parameters.

Disclosure of Invention

In one embodiment, an apparatus includes one or more processors configured to monitor motion of an imaging probe in an imaging system operating according to one or more parameters. The imaging probe is configured to output image data representative of an imaged volume. The one or more processors are configured to alter the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

In one embodiment, a method comprises: obtaining image data of an imaged volume using a movable imaging probe of an imaging system that operates according to one or more parameters; monitoring motion of the imaging probe while the imaging probe is obtaining the image data of the imaged volume; and altering, using one or more processors, the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

In one embodiment, a tangible and non-transitory computer-readable storage medium is provided that includes instructions that direct one or more processors to monitor motion of an imaging probe of an imaging system operating according to one or more parameters. The motion of the imaging probe is monitored while the imaging probe is obtaining image data of an imaged volume. The motion of the imaging probe is monitored based on one or more of data output by one or more sensors operatively coupled with the imaging probe or based on one or more changes in the image data. The instructions also instruct the one or more processors to alter the one or more parameters of the imaging system using one or more processors based on the one or more of the data output by the one or more sensors or based on the one or more changes in the image data.

Drawings

The inventive subject matter described herein will be better understood by reading the following description of non-limiting embodiments with reference to the drawings, in which:

FIG. 1 is a schematic diagram of an ultrasound imaging system according to an embodiment;

FIG. 2 schematically illustrates one embodiment of a control device of the imaging system shown in FIG. 1;

FIG. 3 illustrates a flow diagram of one embodiment of a method for automatically modifying parameters of an imaging system; and

FIG. 4 illustrates a flow diagram of another embodiment of a method for automatically modifying parameters of an imaging system.

Detailed Description

One or more embodiments of the inventive subject matter described herein provide apparatus and methods that monitor image quality and/or motion of an imaging probe of an imaging system and automatically alter one or more parameters of the imaging system based on the image quality and/or probe motion (or absence of probe motion). The parameters of the imaging system dictate how the image data is obtained, processed, and/or visually presented. For example, the parameters of the imaging system may be settings of the imaging system, such as a sensitivity with which a region of interest is imaged, a gain applied to information sensed by the probe, a time gain compensation applied to information sensed by the probe, a line density, a receive frequency, speckle reduction filter settings, rendering settings, brightness, focus, and the like.
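As a rough, non-authoritative illustration of the kinds of settings listed above, a software implementation might group them into a single parameter structure. The Python sketch below is an assumption for illustration only; the field names and default values do not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class ImagingParameters:
    """Hypothetical container for imaging-system settings; names and defaults are illustrative."""
    sensitivity: float = 0.5            # relative sensitivity for detecting structures (0..1)
    gain_db: float = 30.0               # receive gain applied to echo signals, in dB
    time_gain_compensation: float = 1.0
    line_density: int = 128             # scan lines per frame
    receive_frequency_mhz: float = 3.5
    speckle_filter_level: int = 2       # 0 = off, higher = stronger filtering
    refresh_rate_hz: float = 20.0
    rendering_mode: str = "b_mode"
```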

The monitored motion may be movement of the imaging probe or the absence of probe motion. For example, the apparatus and methods may examine an imaging probe to determine whether the probe is moving, how fast the probe is moving, the direction(s) in which the probe is moving, how long the probe has been moving, whether the probe is remaining stationary, how long the probe has remained stationary, and the like. Not all embodiments of the inventive subject matter described herein are limited to monitoring movement of the probe as motion of the probe. In at least one embodiment of the inventive subject matter described herein, monitoring a stationary probe is included in monitoring the motion of the probe. For example, monitoring the motion of the probe may involve determining whether the probe remains stationary and does not move relative to the body being imaged.

The probe of the imaging system may be a device that senses information about a region of interest or imaged volume. For example, the probe may be an ultrasound imaging probe that emits ultrasound pulses and detects echoes of the pulses. Alternatively, the probe may be a camera, a lens system of a camera, an infrared emitter and/or detector, a light emitter and/or detector (e.g., for a LiDAR system, a structured light array system, etc.), an x-ray detector, and so forth.

Parameters of the imaging system may be altered by different amounts based on the type (e.g., category) of probe motion detected, the speed at which the probe is moved, how long the probe remains stationary, and how long the probe is moving. A parameter may be one of various settings of the imaging system that indicates how a region of interest is imaged by the imaging probe and/or how image data is presented. For example, the parameters (and the values of the parameters that are altered or set based on monitored motion of the imaging probe) may include a sensitivity with which the region of interest is imaged, a gain applied to information sensed by the probe, a time gain compensation applied to information sensed by the probe, a line density, a receive frequency, speckle reduction filter settings, rendering settings, brightness, focus, and the like.

The apparatus and method may automatically alter one or more imaging parameters based on the monitored motion of the probe. Such automatic modification of parameters may occur without operator intervention. For example, the apparatus and methods may alter the value of an imaging system parameter without receiving any other operator input related to or indicative of the alteration of the parameter. This addresses the problem of imaging system parameters being changeable only when the operator controlling the probe also manually changes (via touch, voice, etc.) the parameters. Rather than dividing attention between moving or holding the probe, examining the image data, and altering the value(s) of one or more imaging system parameters, the operator may focus on moving the probe or holding it stationary while the image data is being generated by the probe.

In one embodiment, the apparatus and method calculate an image quality metric based on the monitored motion of the imaging probe. This image quality metric may be a quantitative value indicative of the quality of the image data based on the motion of the probe. For example, a probe that generates image data of a region of interest with many motion artifacts may have a lower image quality metric than when the probe generates image data of the same or other regions of interest with fewer motion artifacts. The apparatus and methods may calculate a quality metric based on image data of the probe, and may also alter values of one or more imaging system parameters based on the calculated quality metric.

As one example of the inventive subject matter used in conjunction with an ultrasound imaging system, a user may use an ultrasound probe to image a target organ within a patient being imaged. The user may then hold the probe stationary, and the apparatus or method may alter a parameter of the imaging system with respect to time for as long as the probe remains stationary. For example, the apparatus or method may use a counter to increase the sensitivity of the probe in proportion to the time the user holds the probe stationary: the longer the user holds the probe in place, the greater the value of the sensitivity parameter of the imaging system. User feedback may be generated and provided via a color bar or the like on the display of the imaging system to show the progress or current value of the sensitivity parameter, as in the sketch below. As another example, segmentation results associated with a particular sensitivity parameter may be displayed to the user.
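The following is a minimal sketch of that counter-based behavior, assuming a polling loop, a `probe_is_stationary()` callback, and a `show_progress()` display callback; all names, rates, and limits are hypothetical and are not specified by the patent.

```python
import time

def ramp_sensitivity_while_stationary(params, probe_is_stationary, show_progress,
                                      step_per_second=0.05, max_sensitivity=1.0,
                                      poll_interval_s=0.1):
    """Increase the sensitivity parameter in proportion to how long the probe stays still.

    `params` is an ImagingParameters-like object, `probe_is_stationary` is a callable
    returning True while no probe motion is detected, and `show_progress` renders
    feedback (e.g., a color bar) on the display. All names are illustrative.
    """
    stationary_seconds = 0.0
    while probe_is_stationary():
        time.sleep(poll_interval_s)
        stationary_seconds += poll_interval_s
        # The longer the probe is held in place, the larger the sensitivity value.
        params.sensitivity = min(max_sensitivity,
                                 params.sensitivity + step_per_second * poll_interval_s)
        show_progress(params.sensitivity, stationary_seconds)
    return params.sensitivity
```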

In another example, the apparatus and methods may monitor probe motion to determine if and/or when movement of the probe has stopped. In response to this stop in movement, the imaging system may start a counter and increase the value of an imaging system parameter. In response to movement starting again (from the probe being stationary), the value of the parameter may stop increasing, and the current value of the parameter and/or the image data may be displayed to the user.

As another example, a user may move the imaging probe and view image data representing a region of interest. Once the region of interest is found, the user can slow down the movement of the probe so that a decrease in the monitored motion is detected. In response to detecting the reduction in movement, the apparatus and methods may instruct the imaging system to begin acquiring image data, such as by beginning to record a two-dimensional movie or video. The user may later begin to move the probe again to search for another target. This results in an increase in the movement of the probe, and the change in the motion of the probe is detected. In response to detecting the start of probe movement, the apparatus may instruct the imaging system to stop recording the two-dimensional movie or video. A sketch of this start/stop logic follows.
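One plausible way to implement that behavior is a small hysteresis check on an estimated probe speed; the thresholds and function name below are assumptions, not values taken from the patent.

```python
def update_cine_recording(is_recording, probe_speed, start_below=0.005, stop_above=0.02):
    """Decide whether to start or stop recording a 2-D cine loop based on probe speed.

    Returns the new recording state. Separate start/stop thresholds (in meters/second,
    illustrative values) provide hysteresis so the recording does not toggle on jitter.
    """
    if not is_recording and probe_speed < start_below:
        return True    # probe has slowed over the region of interest: start recording
    if is_recording and probe_speed > stop_above:
        return False   # probe is being moved to search for a new target: stop recording
    return is_recording
```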

At least one technical effect of the inventive subject matter described herein is an imaging system that allows an operator to image a region of interest in an imaged volume while parameters of the imaging system that dictate how image data is obtained and/or generated are modified, without the operator providing other intervention or action on the imaging system. This may reduce operator interface requirements of the imaging system, enabling the imaging system to alter the parameters used to create the image data without additional input other than movement of the probe, while imaging of the volume continues or is ongoing.

Fig. 1 is a schematic diagram of an ultrasound imaging system 100 according to an embodiment. The ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drives elements 104 within a probe 106 to transmit pulsed ultrasound signals into a body (not shown). According to an embodiment, the probe 106 may be a two-dimensional matrix array probe. However, according to other embodiments, any other type of probe capable of acquiring four-dimensional ultrasound data may be used. The four-dimensional ultrasound data may include ultrasound data such as a plurality of three-dimensional volumes acquired over a period of time. The four-dimensional ultrasound data may include information showing how the three-dimensional volume changes over time.

The pulsed ultrasonic signals are backscattered from various structures in the body, such as blood cells or muscle tissue, to produce echoes that return to the elements 104. The elements 104 convert the echoes into electrical signals or ultrasound data, and the electrical signals are received by a receiver 108. The electrical signals representing the received echoes are passed through a receive beamformer 110, which outputs ultrasound data. The probe 106 may contain electronic circuitry for performing all or a portion of transmit beamforming and/or receive beamforming. For example, all or a portion of the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110 may be located within the probe 106. Scanning may include acquiring data through the process of transmitting and receiving ultrasound signals. The data generated by the probe 106 may include one or more data sets acquired with an ultrasound imaging system. The user interface 115 may be used to control the operation of the ultrasound imaging system 100, including controlling the entry of patient data, altering scanning or display parameters, and the like.

The ultrasound imaging system 100 also includes one or more processors 116 that control the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110. The processor 116 is in electronic communication with the probe 106 via one or more wired and/or wireless connections. The processor 116 may control the probe 106 to acquire data. The processor 116 controls which of the elements 104 are active and the shape of the beam emitted from the probe 106. The processor 116 is also in electronic communication with a display device 118, and the processor 116 may process the data into images for display on the display device 118. According to an embodiment, the processor 116 may include one or more Central Processing Units (CPUs). According to other embodiments, the processor 116 may include one or more other electronic components capable of performing processing functions, such as one or more digital signal processors, Field Programmable Gate Arrays (FPGAs), graphics boards, and/or integrated circuits. According to other embodiments, the processor 116 may include a number of electronic components capable of performing processing functions. For example, the processor 116 may include two or more electronic components selected from a list of electronic components including: one or more central processors, one or more digital signal processors, one or more field programmable gate arrays, and/or one or more graphics boards. According to another embodiment, the processor 116 may further include a complex demodulator (not shown) that demodulates the radio frequency data and generates the raw data. In another embodiment, demodulation may be performed earlier in the processing chain.

The processor 116 is adapted to perform one or more processing operations on the data according to a plurality of alternative ultrasound modalities. When echo signals are received, the data may be processed in real-time during a scanning session, such as by processing the data without any intentional delay or while additional data is acquired during the same imaging session for the same patient. For example, embodiments may acquire images at a real-time rate of seven to twenty volumes per second. However, the real-time volume rate may depend on the length of time required to acquire each data volume for display. Thus, when acquiring relatively large amounts of data, the real-time volume rate may be slow. Some embodiments may have a real-time volume rate much faster than twenty volumes per second, while other embodiments may have a real-time volume rate much slower than seven volumes per second.

The data may be temporarily stored in a buffer (not shown) during a scanning session and processed in less than real-time in a live or offline operation. Some embodiments of the inventive subject matter may include multiple processors (not shown) for handling processing tasks handled by the processor 116 according to the exemplary embodiments described above. For example, a first processor may be used to demodulate and extract the RF signal, while a second processor may be used to further process the data prior to displaying the image. It should be understood that other embodiments may use different processor arrangements.

The ultrasound imaging system 100 may acquire data continuously at a volume rate of, for example, ten hertz to thirty hertz. Images generated from the data may be refreshed at similar frame rates. Other embodiments may acquire and display data at different rates. For example, some embodiments may acquire data at a volume rate of less than ten hertz or greater than thirty hertz, depending on the size of the volume and the intended application.

A memory 120 is included for storing processed volumes of the acquired data. In one embodiment, the memory 120 has sufficient capacity to store at least several seconds' worth of ultrasound data volumes. The data volumes are stored in a manner that facilitates retrieval according to the order or time of their acquisition. The memory 120 may include any known data storage medium, such as one or more tangible and non-transitory computer-readable storage media (e.g., one or more computer hard drives, disk drives, universal serial bus drives, etc.).

Optionally, one or more embodiments of the inventive subject matter described herein may be implemented with contrast agents. When ultrasound contrast agents including microbubbles are used, contrast imaging generates enhanced images of anatomical structures and blood flow in the body. After acquiring data while using a contrast agent, image analysis includes separating harmonic components and linear components, enhancing the harmonic components, and generating an ultrasound image by using the enhanced harmonic components. Separation of the harmonic components from the received signal is performed using a suitable filter.

In various embodiments of the present invention, the processor 116 may process the data with other or different mode-dependent modules (e.g., B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, etc.) to form two-dimensional or three-dimensional image data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, combinations thereof, and the like. The image beams and/or volumes are stored in memory, and timing information indicating the time at which the data was acquired may be recorded. The modules may include, for example, a scan conversion module to perform a scan conversion operation to convert the image volume from beam space coordinates to display space coordinates. A video processor module may read the image volume from memory and display the image in real-time while a procedure is performed on the patient. The video processor module may store the images in an image memory, from which the images are read and displayed.

Fig. 2 schematically illustrates an embodiment of the control device 200 of the imaging system 100. Although not shown in fig. 1, the control device 200 may be included in the imaging system 100. For example, the control apparatus 200 may include the processor 116, and optionally may include one or more of the transmit beamformer 101 and/or the receive beamformer 110. Alternatively, the processor implementing the operations performed by the control device 200 may be an additional or other processor than the processor shown in fig. 1. For example, operations performed by the processor 116 in conjunction with the imaging system 100 described with respect to fig. 1 may not be performed by the same processor(s) performing the operations of the control device 200.

The control device 200 includes or is in communication with one or more motion sensors 202 operatively coupled to the probe 106. For example, the processor 116 may communicate with one or more accelerometers in and/or on the probe 106 to receive motion data from the accelerometers to monitor the motion of the probe 106. Additionally or alternatively, the movement sensor 202 may be external to or separate from the probe 106, such as a radar system, LiDAR system, camera system, or the like, that generates data indicative of movement (or lack thereof) of the probe 106. The processor 116 and the sensors 202 may be coupled by a wired connection and/or a wireless connection to communicate data indicative of movement of the probe 106 while an operator is using the probe 106 to obtain image data of a region of interest in one or more imaged volumes.

The processor 116 monitors the motion of the imaging probe 106 in the imaging system 100 to control time-varying parameters of the imaging system 100. The value of the parameter may be altered based on the monitored motion of the imaging probe 106. The processor 116 may alter the time-varying parameters of the imaging system 100 based on the motion of the imaging probe 106, thereby altering the image data of the imaged volume acquired by the imaging probe 106.

The processor 116 may examine the acceleration or other motion of the probe 106 (as indicated by data from the sensor 202) to monitor the motion of the imaging probe 106. Such motion data may indicate to the processor 116 whether the probe 106 is stationary or moving, how long the probe 106 has been stationary or moving, the speed at which the probe 106 is moving, the direction(s) in which the probe 106 is moving, the orientation of the probe 106 (while moving or stationary), and/or other information about the movement (or lack thereof) of the probe 106.
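Such a determination could be implemented roughly as follows, assuming an array of recent accelerometer samples with gravity removed; the thresholds and state labels are illustrative and are not specified by the patent.

```python
import numpy as np

def classify_probe_motion(accel_samples, stationary_thresh=0.05, fast_thresh=0.5):
    """Classify recent probe motion from accelerometer magnitudes (illustrative thresholds, m/s^2).

    `accel_samples` is an (N, 3) array of x/y/z accelerations with gravity removed.
    Returns one of "stationary", "slow", or "fast" plus the mean magnitude.
    """
    magnitudes = np.linalg.norm(np.asarray(accel_samples, dtype=float), axis=1)
    mean_mag = float(magnitudes.mean())
    if mean_mag < stationary_thresh:
        state = "stationary"
    elif mean_mag < fast_thresh:
        state = "slow"
    else:
        state = "fast"
    return state, mean_mag
```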

Based on the movement data provided by the sensor(s) 202, the processor 116 may alter time-varying parameters of the imaging system 100. The parameters of the imaging system 100 may be time-varying in that the values of the parameters may vary at different times or as a function of time based on the motion of the probe 106 being monitored. The values of the parameters of the imaging system 100 may continue to change at the same rate, at an increasing or accelerating rate with respect to time, and/or at a decreasing or decelerating rate with respect to time based on the motion of the probe 106. In one embodiment, the parameter of the imaging system 100 may have a first value while the operator is moving the imaging probe 106. In response to the operator holding the imaging probe 106 stationary for at least a specified period of time (e.g., five seconds or more), the processor 116 may begin to change the value of the parameter from the first value, such as by increasing the gain, increasing the number of scan lines, increasing the refresh rate, and so forth. The value of the parameter may return to the first value in response to the operator again moving the imaging probe 106. Alternatively, the parameter value may be changed to another default value.

For example, the gain at which the probe 106 (or processor 116) amplifies the received echoes of the ultrasound pulses may be set (by default, by an operator of the probe 106, and/or automatically by the processor 116) to a first value while the operator is using the probe 106 to obtain image data of a region of interest in the body. The body may be a human patient, the anatomy of a patient, or a non-human subject, such as a machine or component under examination. The operator may move the probe 106 and examine the image data presented on the display device 118 (shown in FIG. 1). Once the field of view of the probe 106 causes the image data to reveal or capture the volume or region of interest within the body, the operator may stop moving the probe 106. In response to detecting the stop in movement of the probe 106, the processor 116 may automatically (and without operator input or other intervention) increase the gain value of the imaging system 100. As the gain value continues to increase, the operator may continue to view the image data output by the imaging system 100 on the display device 118. The gain may be increased by a default amount and then stop increasing, may continue to increase until the operator again begins moving the probe 106, or may be increased by a fixed discrete amount (e.g., a step size) and then maintained at that value. The value of the gain may remain at that value for a default amount of time (e.g., five seconds, etc.), or until the operator provides some input (e.g., audibly, via touch, and/or by moving the probe 106).

In one embodiment, the processor 116 does not alter the values of the operating parameters of the imaging system 100 unless or until the probe 106 remains stationary for at least a specified period of time. For example, the processor 116 may require that the probe 106 remain stationary for a non-transitory period of time (e.g., at least two seconds, at least five seconds, etc.) to ensure that the operator is not merely resting or briefly viewing the region of interest, and may therefore not alter the parameters when the probe 106 remains stationary for only a brief period of time. A sketch of such a dwell-time check follows.
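A minimal sketch of that minimum-dwell requirement, assuming periodic updates with a timestamp; the class name and default threshold are illustrative.

```python
class StationaryDwellTracker:
    """Tracks how long the probe has been continuously stationary (a sketch; thresholds illustrative)."""

    def __init__(self, min_dwell_s=2.0):
        self.min_dwell_s = min_dwell_s
        self.stationary_since = None  # timestamp when the probe last became stationary

    def update(self, is_stationary, now):
        """Return True only when the probe has been still for at least min_dwell_s seconds."""
        if not is_stationary:
            self.stationary_since = None
            return False
        if self.stationary_since is None:
            self.stationary_since = now
        return (now - self.stationary_since) >= self.min_dwell_s
```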

The processor 116 may alter the value of the operating parameter in response to detecting movement of the probe 106. For example, in addition to or instead of changing the parameter values in response to the probe 106 remaining stationary, the processor 116 may examine data from the sensors 202 to determine whether the probe 106 is moving. In response to determining that the probe 106 is moving, the processor 116 may alter the parameter values of the imaging system.

The processor 116 may require that the probe 106 be moved in a specified manner prior to changing the value of the operating parameter to ensure that the detected movement indicates that the operator wants to change the operating parameter. For example, the processor 116 may examine data from the sensors 202 to determine how fast the probe 106 is moving, and in response to the probe 106 moving faster than a specified non-zero speed or rate, the processor 116 alters the value of the imaging parameter. If the probe 106 is moving, but moves slower than the specified rate, the processor 116 may not automatically change the value of the parameter.

Alternatively, the processor 116 may examine the motion of the probe 106, determine that the probe 106 is moving slowly, but still alter the values of the imaging parameters. For example, an operator may attempt to hold the probe 106 stationary but may inadvertently move the probe 106 slightly. The processor 116 may monitor the motion of the probe 106 based on data from the sensors 202 and determine that the probe 106 is moving, but not fast enough and/or far enough to indicate that the operator intends to prevent the change to the imaging parameters. Accordingly, the processor 116 may still alter the value of the parameter.

The processor 116 may associate different types or categories of motion of the probe 106 with different imaging parameters and/or different changes to the imaging parameters. For example, the probe 106 may be actuated by an operator in different sequences of movements, and a sequence may be associated with an imaging parameter of the imaging system 100 and/or a change to an imaging parameter. Different motion sequences of the probe 106 may be associated with different imaging parameters and/or different changes to imaging parameters. The processor 116 may examine data from the sensors 202 to determine a sequence of motions (e.g., moving and/or remaining stationary) of the probe 106. Different specified probe motion sequences may be stored in the memory 120 (shown in FIG. 1) and associated with different imaging parameters and/or different changes to imaging parameters. The processor 116 may access these specified sequences and compare the motion sequence represented by the sensor data (e.g., the detected motion sequence) to the specified motion sequences. If the detected motion sequence matches one of the specified motion sequences (e.g., exactly matches it, or matches it more closely than one or more or all of the other specified motion sequences), the processor 116 may determine from the memory 120 which imaging parameter is to be modified and/or how to modify the parameter. The processor 116 may then implement this change.

The sequence of probe movements may include various movements. For example, holding the probe stationary for a specified period of time (e.g., two seconds) may be a first motion in a first sequence, and then rotating the probe 106 in a clockwise direction may be a second motion in the first sequence. In response to determining that the probe 106 has been moved according to this first sequence, the processor 116 may determine which parameter to alter (e.g., two-dimensional line density) and/or by how much to alter the parameter value (e.g., increase the parameter by 10%). As another example, holding the probe stationary for a specified period of time may be a first motion in a different second sequence, and then rotating the probe 106 in a counter-clockwise direction may be a second motion in the second sequence. In response to determining that the probe 106 has been moved according to this second sequence, the processor 116 may determine which parameter to alter (e.g., the receive frequency) and/or by how much to alter the parameter value (e.g., by 8% decrease the parameter). These sequences are provided as examples only and do not limit all embodiments of the inventive subject matter. For example, some sequences may not involve holding the probe 106 stationary and/or rotating the probe 106.
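A small sketch of how such a lookup might be stored and applied, mirroring the two example sequences above; the motion labels, table structure, and adjustment amounts are illustrative assumptions rather than the patent's actual data format.

```python
# Hypothetical lookup table mapping probe-motion sequences to parameter changes,
# mirroring the examples above (labels and fractional amounts are illustrative).
MOTION_SEQUENCE_TABLE = {
    ("stationary_2s", "rotate_clockwise"):        ("line_density", +0.10),
    ("stationary_2s", "rotate_counterclockwise"): ("receive_frequency_mhz", -0.08),
}

def lookup_parameter_change(detected_sequence):
    """Return (parameter_name, fractional_change) if the detected sequence matches a stored one."""
    return MOTION_SEQUENCE_TABLE.get(tuple(detected_sequence))

def apply_parameter_change(params, change):
    """Scale the named parameter by the stored fractional amount (e.g., +0.10 means +10%)."""
    if change is None:
        return params
    name, fraction = change
    setattr(params, name, getattr(params, name) * (1.0 + fraction))
    return params
```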

Different designated motion sequences may represent different types or categories of motion. Alternatively, the sequence may be a single movement of the probe 106. For example, holding the probe 106 stationary may be a single movement in the first specified sequence that indicates how to alter the imaging parameters and/or how to alter the values of the parameters.

In one embodiment, the parameter that is altered based on the detected motion of the probe 106 is the start and/or stop of a video recording. The operator may move (or not move) the probe 106 in a manner that is associated (e.g., in the memory 120) with starting to record video, such as a cine-start operation. The processor 116 may begin saving motion video of the image data from the probe 106 (e.g., in the memory 120) and/or rendering this video on the display device 118. In response to detecting the same or other motion of the probe 106, the processor 116 may stop recording the video, or perform a cine-stop operation.

The parameter altered based on the detected motion of the probe 106 may be the acquisition of a still image or photograph. The operator may move (or not move) the probe 106 in a manner associated with capturing a still photograph. The processor 116 may then save a still image from the image data from the probe 106 and/or present this image on the display device 118.

The imaging system 100 described herein may allow an operator of the probe 106 to set and/or alter imaging parameters without having to provide any other input other than moving (or not moving) the probe 106. The imaging system 100 can alter imaging parameters based on the monitored motion of the probe 106 without the operator having to actuate any input devices (other than the probe 106), without the operator having to speak for voice activated controls or instruct another operator to alter parameters, etc.

In one embodiment, the processor 116 measures the image quality of image data acquired or generated using the probe 106 and determines whether to alter imaging parameters based on the image quality. The image quality may be a quantity indicative of how sharply the image data shows the region of interest in the body under examination. An image that more clearly indicates or depicts the region of interest may be associated with a higher image quality value, while an image that does not so clearly indicate or depict the region of interest may be associated with a lower image quality value. The processor 116 may determine the image quality by monitoring the motion of the probe 106. Different types and/or amounts of movement of the probe 106 may be associated with different image quality values. For example, a probe 106 that remains stationary for a longer period of time may be associated with a larger image quality value, a probe 106 that remains stationary for a shorter period of time with a smaller image quality value, a slowly moving probe 106 with an even smaller image quality value, and a faster moving probe 106 with a still smaller image quality value.

The processor 116 may monitor the motion of the probe 106, calculate an image quality value for the image data from the probe 106, and optionally alter the imaging parameters based on the image quality value. For example, the processor 116 may alter a parameter if the image quality falls below a specified threshold associated with poor image quality. The value of this threshold may represent or indicate an amount of motion artifact in the image data. A larger threshold may indicate more motion artifacts present in the image data, while a smaller threshold may indicate fewer motion artifacts in the image data.
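One way such a metric could be computed is sketched below: a score that decreases with estimated probe speed and increases with stationary dwell time, compared against a low-quality threshold. The formula, weights, and thresholds are assumptions for illustration and are not defined by the patent.

```python
def image_quality_from_motion(mean_speed, stationary_seconds,
                              speed_penalty=2.0, dwell_bonus=0.1):
    """Return a 0..1 quality score: lower when the probe moves fast (more motion artifact
    expected), higher the longer it has been held still. Weights are illustrative."""
    score = 1.0 / (1.0 + speed_penalty * mean_speed) + dwell_bonus * min(stationary_seconds, 5.0)
    return min(score, 1.0)

def maybe_alter_parameters(params, quality, low_threshold=0.4):
    """If the quality score falls below the threshold, back off sensitivity and gain."""
    if quality < low_threshold:
        params.sensitivity = max(0.0, params.sensitivity - 0.1)
        params.gain_db = max(0.0, params.gain_db - 3.0)
    return params
```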

The operator may move the probe 106 around to find a region of interest in the body. While the operator is moving the probe 106 around, the processor 116 may reduce the values of the sensitivity, gain, line density, etc. of the imaging system 100 so that the image data is not overly sensitive and does not contain so much information that the operator cannot interpret it. In response to movement of the probe slowing or stopping, the processor 116 may increase the sensitivity, gain, line density, etc., due to the improved image quality resulting from the slowing or stopping of the probe 106.

During an imaging session, the operator may continue to move or not move the probe 106, and the processor 116 may repeatedly monitor the motion of the probe 106 to determine whether and/or how to alter parameters of the imaging system 100. This may help the operator pay more attention to the placement of the probe 106 and the image data being generated than prior systems, which may at least partially distract the operator because the probe must be manipulated while the imaging parameters are manually modified by entering the modifications via a keyboard, touch screen, etc.

FIG. 3 illustrates a flow diagram of one embodiment of a method 300 for automatically modifying parameters of an imaging system. The method 300 may represent operations performed by the processor 116 (shown in fig. 1) during an imaging session in which an operator is manually actuating the imaging probe 106 (shown in fig. 1) to obtain image data representative of a region of interest within a body. In one embodiment, the memory 120 stores instructions that, when executed by the processor 116, direct the processor 116 to perform the operations of the method 300 (and/or other operations described herein).

Execution of the method 300 may allow the operator to continue imaging the region of interest while modifying parameters indicative of how image data is obtained and/or generated, without providing other intervention or action to the imaging system 100 (shown in fig. 1). This may reduce operator interface requirements of the imaging system 100, enabling the imaging system 100 to customize or alter the parameters used to create the image data without additional input other than movement of the probe 106.

At 302, image data of a volume is obtained using an imaging probe of an imaging system. For example, the probe 106 may be used to obtain image data representing a field of view or a region of interest of a volume. At 304, sensor data indicative of motion of an imaging probe is received while the volume is being imaged. The sensor data may be provided from a motion sensor (such as an accelerometer) coupled to the probe 106. The data may indicate motion of the probe 106, such as whether the probe 106 is moving, how the probe 106 is moving, the direction(s) in which the probe 106 is moving, the speed at which the probe 106 is moving, and so forth. This data may be provided to the processor 116, as described above.

At 306, motion of the probe is determined based on the sensor data. For example, the processor 116 may receive data from an accelerometer or other sensor indicating how and/or whether the probe 106 is moved. The processor 116 may examine this data to monitor how the probe 106 is moving and/or whether the probe 106 remains stationary. For example, accelerations in different directions may indicate the direction in which the probe 106 is moving and/or how fast it is moving.

At 308, it is determined whether the motion of the probe is associated with a change in imaging parameters. The processor 116 may examine the motion of the probe 106 and determine whether the motion is associated with an imaging parameter and/or an amount of change in the parameter. As described above, holding the probe 106 stationary, moving the probe 106 slower than a specified threshold, moving the probe 106 faster than the same or another threshold, and/or moving the probe 106 in a specified direction (or along a specified vector) may be associated with a parameter of the imaging system 100 and/or a change to an imaging parameter. If the motion of the probe 106 is associated with a parameter or a parameter change, the flow of the method 300 may proceed toward 312. If the motion of the probe 106 is not associated with a parameter or a parameter change, the flow of the method 300 may proceed toward 310. For example, a single movement or a combination of two or more movements may not be associated with any parameter or parameter change.

At 310, it is determined whether a sequence of motions of the probe is associated with a change in imaging parameters. While a single motion or combination of motions of the probe 106 may not be associated with imaging parameters or parameter changes made by the processor 116, a longer series of consecutive motions of the probe 106 may be associated with a parameter or parameter change. Different motion sequences may be associated with different parameters or parameter changes. If the processor 116 determines that the detected motion sequence of the probe 106 is associated with a parameter or a parameter change, the flow of the method 300 may proceed toward 312. Otherwise, the monitored motion of the probe 106 is not associated by the processor 116 with any parameter or parameter modification. As a result, the flow of the method 300 may return toward 302. This may allow the imaging session to continue using the motion of the probe 106 without changing the parameters. Optionally, the flow of the method 300 may terminate.

At 312, a change in imaging parameters is determined. The processor 116 may refer to the memory 120 (or another location) shown in fig. 1 to determine which parameter and/or parameter change is associated with the identified probe motion (e.g., according to 306 and 308) and/or probe motion sequence (e.g., according to 306 and 310). Some different probe motions and/or motion sequences may be associated with different parameters and/or parameter changes in the memory 120 (or elsewhere), and the processor 116 may access such information to determine which parameter to change and/or how to change the parameter.

At 314, imaging parameter modification is implemented. The processor 116 may automatically alter the imaging parameters without operator intervention other than movement or holding of the probe. As described above, this may reduce the amount and/or source of input information required by the imaging system 100 to alter the image data obtained by the probe 106.
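Combining the earlier sketches, one pass of this sensor-driven flow could look roughly like the following; the function names, callbacks, and the specific stationary-to-sensitivity mapping are illustrative assumptions, with the numbered comments pointing back to the steps of method 300.

```python
def run_sensor_based_control_pass(params, acquire_frame, read_accelerometer, dwell_tracker, clock):
    """One illustrative pass of the sensor-driven flow of FIG. 3, reusing the helper sketches above."""
    frame = acquire_frame(params)                 # 302: obtain image data of the volume
    accel = read_accelerometer()                  # 304: receive sensor data indicative of probe motion
    state, _ = classify_probe_motion(accel)       # 306: determine probe motion from the sensor data
    # 308/310: here, only a sufficiently long stationary dwell is treated as being
    # associated with a parameter change; other motions leave the parameters unchanged.
    if dwell_tracker.update(state == "stationary", clock()):
        params.sensitivity = min(1.0, params.sensitivity + 0.05)  # 312/314: determine and apply the change
    return frame, params
```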

FIG. 4 illustrates a flow diagram of another embodiment of a method 400 for automatically modifying parameters of an imaging system. The method 400 may represent operations performed by the processor 116 (shown in fig. 1) during an imaging session in which an operator is manually actuating the imaging probe 106 (shown in fig. 1) to obtain image data representative of a region of interest within a body. In one embodiment, the memory 120 stores instructions that, when executed by the processor 116, direct the processor 116 to perform the operations of the method 400 (and/or other operations described herein).

Similar to the method 300, performance of the method 400 may allow an operator to continue imaging the region of interest while modifying parameters indicative of how image data is obtained and/or generated, without providing other intervention or action to the imaging system 100 (shown in fig. 1). This may reduce operator interface requirements of the imaging system 100, enabling the imaging system 100 to customize or alter the parameters used to create the image data without additional input other than movement of the probe 106. One difference between the flow charts of the methods 300, 400 shown in FIGS. 3 and 4 is the reliance on the sensor 202 to detect movement of the probe 106. The method 300 includes determining motion of the probe 106 (e.g., at 306) based on output from the sensor 202, while the method 400 does not necessarily require or rely on the output of the sensor. As described below, the method 400 involves examining the image data to identify motion of the probe 106, which can then be used to alter one or more parameters of the imaging system 100.

At 402, image data of a volume is obtained using an imaging probe of an imaging system. For example, the probe 106 may be used to obtain image data representing a field of view or a region of interest of a volume. At 404, image data acquired using the imaging probe is examined. The image data may be examined to determine whether the image data indicates that the probe is moving. For example, the image data may be examined using one or more cross-correlation techniques, speckle tracking, optical flow analysis, and the like, to determine whether and how the image data changes with respect to time. In one embodiment, different frames of image data (indicative of image data acquired at different times) may be examined and compared to one another to identify differences in the image data.

At 406, motion of the probe is determined based on the image data. For example, the processor 116 may receive the image data, identify differences or changes in the image data with respect to time (e.g., at 404), and identify movement of the probe 106 based on the differences. For example, imaging the same volume while the probe 106 is moving may result in different frames of image data being different. If the probe 106 is moving, the volume or one or more portions of the volume may be located at different positions in the image frame. A change in position or other difference in the image data may reveal how the probe 106 is moving, such as in which direction the probe 106 is moving and/or how fast the probe 106 is moving. Such probe motion can be identified without using or relying on output from a motion sensor, such as an accelerometer. This analysis of the image data to identify probe motion may occur while additional image data is being acquired. For example, during a continuous imaging session in which the imaging probe 106 is acquiring image data of a volume, frames of image data may be examined to identify probe motion while additional frames are acquired.
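A crude illustration of detecting probe motion from the image data alone is given below, using a mean frame-to-frame intensity difference as a stand-in for the cross-correlation, speckle-tracking, or optical-flow analysis mentioned above; the function and threshold are assumptions, not the patent's specific algorithm.

```python
import numpy as np

def estimate_motion_from_frames(prev_frame, curr_frame, stationary_thresh=2.0):
    """Estimate probe motion by comparing consecutive image frames.

    Uses the mean absolute intensity difference between two frames as a crude proxy
    for probe movement; the threshold (in gray levels) is illustrative.
    Returns (is_moving, difference_score).
    """
    prev = np.asarray(prev_frame, dtype=float)
    curr = np.asarray(curr_frame, dtype=float)
    diff_score = float(np.mean(np.abs(curr - prev)))
    return diff_score > stationary_thresh, diff_score
```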

At 408, it is determined whether the motion of the probe is associated with a change in the imaging parameters. The processor 116 may examine the motion of the probe 106 and determine whether the motion is associated with an imaging parameter and/or an amount of change in the parameter. As described above, holding the probe 106 stationary, moving the probe 106 slower than a specified threshold, moving the probe 106 faster than the same or another threshold, and/or moving the probe 106 in a specified direction (or along a specified vector) may be associated with a parameter of the imaging system 100 and/or a change to an imaging parameter. If the motion of the probe 106 is associated with a parameter or a parameter change, the flow of the method 400 may proceed to 412. If the motion of the probe 106 is not associated with a parameter or a parameter change, the flow of the method 400 may proceed toward 410. For example, a single movement or a combination of two or more movements may not be associated with any parameter or parameter change.

At 410, it is determined whether a sequence of motions of the probe is associated with a change in the imaging parameters. While a single motion or combination of motions of the probe 106 may not be associated with imaging parameters or parameter changes made by the processor 116, a longer series of consecutive motions of the probe 106 may be associated with a parameter or parameter change. Different motion sequences may be associated with different parameters or parameter changes. If the processor 116 determines that the detected motion sequence of the probe 106 is associated with a parameter or a parameter change, the flow of the method 400 may proceed toward 412. Otherwise, the monitored motion of the probe 106 is not associated by the processor 116 with any parameter or parameter modification. As a result, the flow of the method 400 may return toward 402. This may allow the imaging session to continue using the motion of the probe 106 without changing the parameters. Optionally, the flow of the method 400 may terminate.

At 412, a change in imaging parameters is determined. The processor 116 may refer to the memory 120 (or another location) shown in fig. 1 to determine which parameter and/or parameter alteration is associated with the identified probe motion (e.g., according to 406 and 408) and/or probe motion sequence (e.g., according to 406 and 410). Different probe motions and/or motion sequences may be associated with different parameters and/or parameter changes in the memory 120 (or elsewhere), and the processor 116 may access such information to determine which parameter to change and/or how to change the parameter.

At 414, imaging parameter modification is implemented. The processor 116 may automatically alter the imaging parameters without operator intervention other than movement or holding of the probe. As described above, this may reduce the amount and/or source of input information required by the imaging system 100 to alter the image data obtained by the probe 106.

In yet another embodiment, the imaging system 100 and/or methods 300, 400 may use a combination of data from the sensors 202 and analysis of the imaging data to identify motion of the probe 106. For example, another method of the inventive subject matter described herein can include a combination of the operations described in connection with 304, 306 in method 300 and in connection with 404, 406 in method 400. Such a combined or hybrid approach may involve examining sensor data indicative of motion of the probe 106 and analysis of the image data to identify motion of the probe 106. In one embodiment, if at least one of the sensor data or the image data difference indicates probe motion, this probe motion may be identified and used to alter the imaging parameters, as described above. Otherwise, the imaging parameters are not automatically altered based on probe motion. In another embodiment, if both the sensor data and the image data indicate probe motion, the imaging parameters are altered accordingly, as described above. Otherwise, the imaging parameters are not automatically altered based on probe motion.
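The combination logic described in this paragraph could be as simple as the sketch below, where the OR and AND variants are selected by a mode flag; the function name and flag are illustrative.

```python
def probe_motion_detected(sensor_says_moving, image_says_moving, mode="any"):
    """Combine sensor-based and image-based motion decisions.

    mode="any": treat the probe as moving if either source indicates motion
    (the OR variant described above); mode="all": require both sources to agree
    (the AND variant). Parameter changes are applied only when this returns True.
    """
    if mode == "all":
        return sensor_says_moving and image_says_moving
    return sensor_says_moving or image_says_moving
```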

In one embodiment, an apparatus includes one or more processors configured to monitor motion of an imaging probe in an imaging system operating according to one or more parameters. The imaging probe is configured to output image data representative of an imaged volume. The one or more processors are configured to alter the one or more parameters of the imaging system based on the monitored motion of the imaging probe.

Optionally, the one or more processors are configured to communicatively couple with one or more motion sensors operatively coupled to the imaging probe. The one or more processors are configured to identify motion of the imaging probe based on the motion data output by the one or more motion sensors.

Optionally, the one or more processors are configured to monitor the motion of the imaging probe by examining the image data and identifying the motion of the imaging probe based on the image data.

Optionally, the one or more processors are configured to monitor the motion of the imaging probe by determining that the imaging probe is stationary. The one or more processors may be configured to alter the one or more parameters of the imaging system in response to determining that the imaging probe remains stationary.

Optionally, the one or more processors are configured to alter the values of the one or more parameters with respect to time while the imaging probe remains stationary.
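
A minimal sketch of a time-dependent alteration while the probe is held stationary, using a hypothetical gain that ramps with dwell time up to a ceiling; the ramp rate and limit are illustrative.
```python
def gain_while_stationary(base_gain_db: float,
                          seconds_stationary: float,
                          ramp_db_per_s: float = 0.5,
                          max_gain_db: float = 60.0) -> float:
    """Increase gain with the time the probe has remained stationary, up to a maximum."""
    return min(base_gain_db + ramp_db_per_s * seconds_stationary, max_gain_db)
```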

Optionally, the one or more processors are configured to alter the one or more parameters of the imaging system such that the imaging probe acquires image data at different sensitivities in response to changes in motion of the imaging probe.

Optionally, the one or more processors are configured to alter the one or more parameters of the imaging system such that the imaging probe one or more of starts or stops acquiring image data in response to a change in motion of the imaging probe.
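
A minimal sketch of starting or stopping acquisition on a change in motion state; `probe` is a hypothetical interface exposing `start()` and `stop()` methods.
```python
def update_acquisition(probe, was_moving: bool, is_moving: bool) -> None:
    """Start acquiring when motion begins; stop acquiring when motion ends."""
    if is_moving and not was_moving:
        probe.start()
    elif was_moving and not is_moving:
        probe.stop()
```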

Optionally, the one or more processors are configured to instruct the imaging probe to acquire a still image of the imaged volume based on the monitored motion of the imaging probe.

Optionally, the one or more processors are configured to alter the one or more parameters of the imaging system by altering one or more of a gain, a temporal gain compensation, a line density, a receive frequency, a speckle reduction filter setting, a refresh rate, or a rendering setting of the imaging system.
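
A minimal sketch that groups the adjustable parameters listed above into a single settings object; the field names and default values are illustrative placeholders.
```python
from dataclasses import dataclass

@dataclass
class ImagingParameters:
    gain_db: float = 40.0
    temporal_gain_compensation: float = 0.0
    line_density: int = 128
    receive_frequency_mhz: float = 5.0
    speckle_reduction_level: int = 1
    refresh_rate_hz: float = 30.0
    rendering_setting: str = "b_mode"
```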

Optionally, the one or more processors are configured to alter the one or more parameters of the imaging system based on the motion of the imaging probe and without receiving or determining any other manual input provided by the imaging system or an operator of the imaging probe.

Optionally, the one or more processors are configured to calculate an image quality metric based on the monitored motion of the imaging probe. The one or more processors may be configured to alter the one or more parameters of the imaging system in response to the image quality metric exceeding or falling below a specified threshold.
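
A minimal sketch of the quality-metric embodiment, assuming a hypothetical score derived from the monitored motion (more motion yielding a lower expected quality); the mapping and threshold are illustrative.
```python
def quality_metric_from_motion(motion_magnitude: float) -> float:
    """Map a motion magnitude to a 0..1 quality score; larger motion gives a lower score."""
    return 1.0 / (1.0 + motion_magnitude)

def should_alter_parameters(motion_magnitude: float, threshold: float = 0.8) -> bool:
    """Trigger a parameter change when the quality score falls below the threshold."""
    return quality_metric_from_motion(motion_magnitude) < threshold
```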

In one embodiment, a method comprises: obtaining image data of an imaged volume using a movable imaging probe of an imaging system that operates according to one or more parameters; monitoring motion of the imaging probe while the imaging probe is obtaining the image data of the imaged volume; and altering, using one or more processors, the one or more parameters of the imaging system based on the monitored motion of the imaging probe.
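
A minimal end-to-end sketch of the claimed method, acquiring image data, monitoring probe motion, and altering parameters; `probe`, `monitor`, and `controller` are hypothetical stand-ins for the system components described above.
```python
def imaging_loop(probe, monitor, controller, max_frames: int = 1000) -> None:
    prev_frame = None
    for _ in range(max_frames):
        frame = probe.acquire_frame()                  # obtain image data of the imaged volume
        motion = monitor.estimate(frame, prev_frame)   # sensor- and/or image-based motion estimate
        controller.maybe_alter_parameters(motion)      # alter gain, TGC, refresh rate, etc.
        prev_frame = frame
```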

Optionally, the motion of the imaging probe is monitored based on data output by one or more sensors operatively coupled with the imaging probe.

Optionally, the motion of the imaging probe is monitored based on an analysis of the image data.

Optionally, monitoring the motion of the imaging probe comprises determining that the imaging probe remains stationary. In response to determining that the imaging probe remains stationary, the one or more parameters of the imaging system may be altered.

Optionally, the values of the one or more parameters are altered with respect to time while the imaging probe remains stationary.

In one embodiment, a tangible and non-transitory computer-readable storage medium is provided that includes instructions that direct one or more processors to monitor motion of an imaging probe of an imaging system operating according to one or more parameters. The motion of the imaging probe is monitored while the imaging probe is obtaining image data of an imaged volume. The motion of the imaging probe is monitored based on one or more of data output by one or more sensors operatively coupled with the imaging probe or one or more changes in the image data. The instructions also direct the one or more processors to alter the one or more parameters of the imaging system based on the one or more of the data output by the one or more sensors or the one or more changes in the image data.

Optionally, the instructions instruct the one or more processors to monitor motion of the imaging probe by determining that the imaging probe remains stationary, and alter the one or more parameters of the imaging system in response to determining that the imaging probe remains stationary.

Optionally, the instructions instruct the one or more processors to monitor motion of the imaging probe by determining that the imaging probe is moving, and alter the one or more parameters of the imaging system in response to determining that the imaging probe is moving.

Optionally, the instructions are configured to instruct the one or more processors to alter the one or more parameters of the imaging system by one or more of: modifying a sensitivity with which the imaging probe acquires the image data; starting to acquire the image data; stopping acquisition of the image data; or changing one or more of a gain, a temporal gain compensation, a line density, a receive frequency, a speckle reduction filter setting, a refresh rate, or a rendering setting of the imaging system.

As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding plural elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising," "including," or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.

It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the embodiments (and/or aspects of the embodiments) described above may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the invention, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reading the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein." Furthermore, in the following claims, the terms "first," "second," and "third," etc. are used merely as labels and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112(f), unless and until such claim limitations expressly use the phrase "means for" followed by a statement of function devoid of further structure.

This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
