Method and system for visualizing overlapping images

Document No.: 862204    Publication date: 2021-03-16

Reading note: This technique, "Method and system for visualizing overlapping images," was designed and created by Frederic Veronesi and Oliver Gerard on 2019-08-01. Abstract: A medical imaging workstation and method for visualizing overlapping images include accessing a first image dataset and a second image dataset. The workstation and method include displaying a first image on a display device, wherein the first image includes at least a portion of the first image dataset and includes a structure. The workstation and method include displaying a second image on the display device concurrently with the first image, wherein the second image includes at least a portion of the second image dataset and includes the structure, and wherein at least a portion of the second image overlaps the first image. The workstation and method include automatically cyclically changing the opacity of at least a portion of the second image that overlaps the first image.

1. A method for visualizing overlapping images, the method comprising:

accessing a first image dataset and a second image dataset, wherein the first image dataset and the second image dataset are acquired with one or more medical imaging systems;

displaying a first image on a display device, wherein the first image comprises at least a portion of the first image dataset and comprises a structure;

displaying a second image on the display device concurrently with the first image, wherein the second image comprises at least a portion of the second image dataset and comprises the structure,

and wherein at least a portion of the second image overlaps the first image; and

automatically cyclically changing an opacity of at least the portion of the second image that overlaps the first image.

2. The method of claim 1, wherein automatically cyclically changing the opacity of at least the portion of the second image comprises automatically cyclically changing the opacity of the entire second image.

3. The method of claim 1, wherein automatically cyclically changing the opacity of at least the portion of the second image comprises automatically cyclically changing the opacity of the portion of the second image that overlaps the first image and not automatically cyclically changing the opacity of the portion of the second image that does not overlap the first image.

4. The method of claim 1, wherein the first image data set comprises first ultrasound image data acquired from a first ultrasound imaging mode and the second image data set comprises second ultrasound image data acquired from a second ultrasound imaging mode, and wherein the second ultrasound imaging mode is different from the first ultrasound imaging mode.

5. The method of claim 4, wherein the first ultrasound imaging mode is a B-mode and the second ultrasound imaging mode is a color Doppler mode, and wherein the first ultrasound image data and the second ultrasound image data are acquired in an interleaved manner.

6. The method of claim 1, wherein the one or more medical imaging systems used to acquire the first image dataset are selected from the group consisting of: computed tomography imaging systems, ultrasound imaging systems, positron emission tomography imaging systems, nuclear medicine imaging systems, and x-ray imaging systems.

7. The method of claim 6, wherein the one or more medical imaging systems used to acquire the second image dataset are selected from the group consisting of: computed tomography imaging systems, ultrasound imaging systems, positron emission tomography imaging systems, nuclear medicine imaging systems, and x-ray imaging systems.

8. The method of claim 7, wherein the one or more medical imaging systems used to acquire the second image dataset are different from the one or more medical imaging systems used to acquire the first image dataset.

9. The method of claim 1, wherein at least one of the first image dataset and the second image dataset is acquired in real-time.

10. The method of claim 1, wherein the portion of the second image has a non-uniform opacity.

11. The method of claim 1, wherein the portion of the second image has a uniform opacity.

12. The method of claim 1, wherein the opacity of at least the portion of the second image varies automatically cyclically according to a periodically repeating function selected from the group consisting of: sine functions, step functions, and sawtooth functions.

13. The method of claim 12, wherein the function has a period between 1 and 20 seconds.

14. The method of claim 13, wherein the period is user adjustable.

15. The method of claim 1, further comprising adjusting a position of the second image relative to the first image while automatically cyclically changing the opacity of at least the portion of the second image that overlaps the first image.

16. The method of claim 1, further comprising displaying the first image using a first color map and the second image using a second color map different from the first color map to help distinguish the first image from the second image.

17. The method of claim 1, further comprising automatically cyclically changing a color map of the second image between a first color and a second color while automatically cyclically changing the opacity of the second image.

18. The method of claim 1, wherein the first image dataset is a first 3D dataset and the second image dataset is a second 3D dataset, and wherein the first image is a first slice rendered from the first 3D dataset and the second image is a second slice rendered from the second 3D dataset.

19. The method of claim 18, further comprising:

displaying a third image from the first image dataset and a fourth image from the second image dataset on the display device simultaneously with the first image and the second image, wherein the third image is a second slice rendered from the first 3D dataset and the fourth image is a second slice rendered from the second 3D dataset, and wherein at least a portion of the fourth image overlaps the third image; and

automatically cyclically changing an opacity of at least the portion of the fourth image that overlaps the third image, wherein the automatically cyclically changing of the opacity of at least the portion of the fourth image is synchronized with the automatically cyclically changing of the opacity of at least the portion of the second image.

20. A medical imaging workstation comprising:

a user input device;

a display device; and

a processor in electronic communication with the user input device and the display device, wherein the processor is configured to:

access a first image dataset and a second image dataset, wherein the first image dataset and the second image dataset are acquired with one or more medical imaging systems;

display a first image on the display device, wherein the first image comprises at least a portion of the first image dataset and comprises a structure;

display a second image on the display device concurrently with the first image, wherein the second image comprises at least a portion of the second image dataset and comprises the structure,

and wherein at least a portion of the second image overlaps the first image; and

automatically cyclically change an opacity of at least the portion of the second image that overlaps the first image.

21. The medical imaging workstation of claim 20 wherein the medical imaging workstation is a component of a medical imaging system.

22. The medical imaging workstation of claim 20 wherein the medical imaging workstation is a component of an ultrasound imaging system.

23. The medical imaging workstation of claim 20 wherein the processor is configured to cyclically change the opacity of at least the portion of the second image according to a periodic function.

24. The medical imaging workstation of claim 20 wherein the processor is configured to display the first image using a first color map and the second image using a second color map, wherein the second color map is different from the first color map.

25. The medical imaging workstation of claim 20 wherein the processor is configured to periodically change a color map of the second image between a first color and a second color as the opacity is periodically changed, wherein the first color is different from the second color.

Technical Field

The present disclosure generally relates to a method and a medical imaging workstation for visualizing overlapping images.

Background

The present invention relates generally to imaging of objects, and more particularly to visualizing overlapping images.

In medical imaging, it is often desirable to display two or more overlapping images. For example, when attempting to register a first image with a second image, the two images may be displayed in an overlapping manner. Likewise, a first image containing data of a first type may be displayed to overlap a second image containing data of a second type. The two overlapping images may contain information acquired with different imaging modalities or the two overlapping images may contain information acquired with different acquisition modes.

One problem with conventionally displayed overlapping images is that the overlying image at least partially obscures the underlying image, making it more difficult to see the information contained in the underlying image. Conversely, if the overlying image is made more transparent, the data contained in the overlying image becomes more difficult to interpret. It is difficult or impossible to display all of the information in both the overlying and underlying images using conventional techniques.

When registering a first image with a second image, the images are typically displayed in an overlapping manner. When registering two images with each other, it is desirable to align common anatomical structures or landmarks between the two images. The process of registering the two images typically requires manual input from an operator in order to register the images as closely as possible to each other. However, when the overlying image obscures part of the underlying image, it is difficult to accurately register the two images with each other. It is difficult for the user to discern the anatomical structures in the overlying and underlying images in order to accurately register them with each other.

Also, when the underlying image and the overlying image represent different types of data, it is difficult for a user to interpret all of the data in both the underlying image and the overlying image.

For at least the reasons discussed above, there is a need for improved methods and workstations for visualizing overlapping images.

Disclosure of Invention

The above-mentioned deficiencies, disadvantages, and problems are addressed herein, as will be understood by reading and understanding the following specification.

In one embodiment, a method for visualizing overlapping images includes accessing a first image dataset and a second image dataset, wherein the first image dataset and the second image dataset are acquired with one or more medical imaging systems. The method includes displaying a first image on a display device, wherein the first image includes at least a portion of the first image dataset and includes a structure. The method includes displaying a second image on the display device concurrently with the first image, wherein the second image includes at least a portion of the second image dataset and includes the structure, and wherein at least a portion of the second image overlaps the first image. The method includes automatically cyclically changing the opacity of at least the portion of the second image that overlaps the first image.

In one embodiment, a medical imaging workstation includes a user input device, a display device, and a processor in electronic communication with the user input device and the display device. The processor is configured to access a first image dataset and a second image dataset, wherein the first image dataset and the second image dataset are acquired with one or more medical imaging systems. The processor is configured to display a first image on the display device, wherein the first image comprises at least a portion of the first image dataset and comprises a structure. The processor is configured to display a second image on the display device concurrently with the first image, wherein the second image comprises at least a portion of the second image dataset and includes the structure, and wherein at least a portion of the second image overlaps the first image. The processor is configured to automatically cyclically change the opacity of at least a portion of the second image that overlaps the first image.

Various other features, objects, and advantages of the invention will be apparent to those skilled in the art from the accompanying drawings and detailed description thereof.

Drawings

FIG. 1 is a schematic diagram of a workstation according to one embodiment;

FIG. 2 is a schematic diagram of a medical imaging system according to one embodiment;

FIG. 3 is a flow diagram of a method according to one embodiment;

FIG. 4 is a representation of a first image, a second image, and a composite image;

FIG. 5 is a representation of a second image overlapping a portion of a first image according to one embodiment;

FIG. 6 is a representation of three composite images generated from a 3D data set according to one embodiment;

FIG. 7 is a graph of a sawtooth function according to one embodiment;

FIG. 8 is a graph of a sine function according to one embodiment;

FIG. 9 is a graph of a step function according to one embodiment;

FIG. 10 is a representation of a series of screen shots, according to one embodiment.

Detailed Description

In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description, therefore, is not to be taken in a limiting sense.

Fig. 1 is a schematic illustration of a medical imaging workstation 10 according to one embodiment. The medical imaging workstation 10 includes a display device 12, a user input device 14, and a processor 16. The display device 12 and the input device 14 are each in electronic communication with the processor 16. Display device 12 may be an LED display, an OLED display, a Liquid Crystal Display (LCD), a projection display device, a cathode ray tube monitor, or any other type of display configured to display one or more images. User input device 14 may include any type of user input control, including one or more of the following: a mouse, a trackball, a keyboard, a touchpad, a touch screen-based user interface, one or more hard buttons, a slider, a rotary control, or any other type of physical control. The processor 16 may include one or more of the following elements: a microprocessor, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a graphics card, or any other type of electronic device configured to implement logical processing instructions. According to various embodiments, medical imaging workstation 10 may be a stand-alone workstation configured to receive image data from one or more of a memory, a separate medical imaging system, and/or a database, such as a PACS/RIS system.

Fig. 2 is a schematic diagram of a medical imaging system 20. The medical imaging workstation 10 is a component of the medical imaging system 20 according to one embodiment. The medical imaging system 20 further comprises an image acquisition unit 22. The medical imaging system 20 may be any type of medical imaging system, such as an x-ray imaging system, a Computed Tomography (CT) imaging system, a Positron Emission Tomography (PET) imaging system, an ultrasound imaging system, or a Single Photon Emission Computed Tomography (SPECT) imaging system. Likewise, the image acquisition unit 22 may be an x-ray acquisition unit, a Computed Tomography (CT) acquisition unit, a Positron Emission Tomography (PET) acquisition unit, an ultrasound acquisition unit, a Single Photon Emission Computed Tomography (SPECT) acquisition unit, or any other type of medical image acquisition unit. The image acquisition unit 22 comprises acquisition hardware 24 for acquiring one or more image datasets and a hardware structure for supporting the acquisition hardware 24. The image acquisition unit may further comprise one or more processors for controlling the acquisition of image data. According to various embodiments, the processor 16 in the medical imaging workstation may also be used to control the acquisition hardware 24 in the image acquisition unit 22.

According to embodiments in which image acquisition unit 22 is an x-ray acquisition unit, acquisition hardware 24 may include an x-ray tube and an x-ray detector.

In accordance with embodiments in which the image acquisition unit 22 is a CT acquisition unit, the acquisition hardware 24 may include one or more x-ray tubes and CT detectors disposed on a gantry configured to rotate about the patient support. The CT detector is configured to detect x-rays emitted by the one or more x-ray tubes.

In accordance with an embodiment in which the image acquisition unit 22 is a PET acquisition unit, the acquisition hardware 24 may include PET detectors disposed on a gantry about the patient. The PET detectors are sensitive to gamma rays emitted in response to positron annihilation events occurring within the patient.

According to embodiments in which the image acquisition unit 22 is a SPECT acquisition unit, the acquisition hardware 24 may include one or more gamma detectors configured to detect gamma rays emitted from the radiotracer.

In accordance with an embodiment in which the image acquisition unit 22 is an ultrasound acquisition unit, the acquisition hardware 24 may include a probe having a plurality of transducer elements, a beamformer, a transmitter, and a receiver.

Fig. 3 is a flow chart of a method 300 according to an example embodiment. The various blocks of the flow chart represent steps that may be performed in accordance with the method 300. Additional embodiments may perform steps shown in different sequences and/or additional embodiments may include additional steps not shown in fig. 3. The technical effect of method 300 is to periodically change the opacity of at least a portion of an image that overlaps another image in order to more clearly display information in both the overlying image and the underlying image. The method 300 will be described in accordance with an exemplary embodiment using the workstation 10 shown in fig. 1 and 2.

At step 302, the processor 16 accesses a first image dataset. The first image dataset may have been acquired with a separate medical imaging system, in which case the processor 16 may access the first image dataset from a memory, from the separate medical imaging system, or from a PACS/RIS system. Alternatively, according to embodiments in which the workstation 10 is part of a medical imaging system, such as the embodiment shown in fig. 2, the first image dataset may be acquired with the image acquisition unit 22. According to various embodiments, the processor 16 may control the acquisition of the first image dataset.

At step 304, the processor 16 accesses a second image dataset. The second image dataset may have been acquired with a separate medical imaging system, in which case the processor 16 may access the second image dataset from a memory, from the separate medical imaging system, or from a PACS/RIS system. Alternatively, according to embodiments in which the workstation 10 is part of a medical imaging system, such as the embodiment shown in fig. 2, the second image dataset may be acquired with the image acquisition unit 22. According to various embodiments, the processor 16 may control the acquisition of the second image dataset. The first image dataset and the second image dataset may include at least one common structure. For example, one or more anatomical structures included in the first image dataset may also be included in the second image dataset.

According to one embodiment, the first image dataset and the second image dataset may be acquired with different medical imaging systems. For example, the first image dataset may be acquired with a medical imaging system selected from an x-ray imaging system, a CT imaging system, a PET imaging system, an ultrasound imaging system, or a SPECT imaging system, and the second image dataset may be acquired with a medical imaging system selected from the same group, wherein the type of medical imaging system used to acquire the second image dataset is different from the type of medical imaging system used to acquire the first image dataset.

According to some non-limiting examples, x-ray images acquired by an x-ray imaging system, CT images acquired by a CT imaging system, and MR images acquired by an MR imaging system provide images of structures within the body. PET images acquired by a PET imaging system and SPECT images acquired by a SPECT imaging system are functional images that provide physiological information about the patient's body. Ultrasound images acquired by an ultrasound imaging system may provide information about structural features, physiological information (such as blood flow, strain, or tissue stiffness), and other types of information. Both x-ray imaging systems and ultrasound imaging systems can provide real-time images of a patient during surgery or examination. The various different types of imaging systems are sometimes referred to as modalities.

According to other embodiments, the first image dataset and the second image dataset may be acquired with the same imaging modality. For example, the first image dataset may be an ultrasound imaging dataset acquired in a first imaging mode and the second image dataset may be an ultrasound imaging dataset acquired in a second imaging mode. Examples of ultrasound imaging modes include: B-mode, M-mode, color Doppler, strain, and elastography. According to an exemplary embodiment, the first image dataset may be B-mode ultrasound imaging data and the second image dataset may be color Doppler ultrasound imaging data.

At step 306, processor 16 displays a first image based on at least a portion of the first image dataset on display device 12. Also, at step 308, processor 16 displays a second image based on at least a portion of the second image dataset on display device 12. On the display device, the second image and the first image are at least partially overlapping. Steps 306 and 308 may be performed simultaneously.

Fig. 4 is a schematic diagram showing how processor 16 simultaneously displays both the first image and the second image on display device 12. Fig. 4 shows a first image 402 generated based on the first image dataset and a second image 404 generated based on the second image dataset. FIG. 4 also shows a composite image 406 resulting from the simultaneous display of both the first image 402 and the second image 404 on the display device 12. According to the embodiment shown in fig. 4, the second image 404 completely overlaps the first image 402 in the composite image 406. According to other embodiments, the second image 404 may only partially overlap the first image 402 in the composite image 406. For these embodiments, only a portion of the second image 404 overlaps the first image 402. According to some embodiments in which the position of the second image 404 may be adjusted relative to the first image 402, the size and shape of the portion of the second image 404 that overlaps the first image 402 may change as the position of the second image 404 is adjusted.
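The compositing described above can be sketched as a standard "over" blend, in which the overlying image is weighted by its opacity and the underlying image by the remainder. The following is a minimal sketch under stated assumptions: the disclosure does not specify a blend rule, and the function name and the flat-list image representation are illustrative only.

```python
def blend_over(under, over, opacity):
    """Composite an overlying image onto an underlying image using
    the given opacity (alpha) for the overlying image.
    `under` and `over` are same-length flat lists of gray values
    in [0, 1]; `opacity` is a scalar in [0, 1].
    (Hypothetical sketch; not the patent's specified method.)"""
    return [opacity * o + (1.0 - opacity) * u
            for u, o in zip(under, over)]
```

With `opacity` equal to 1.0 the result is the overlying image alone; with 0.0 the underlying image shows through untouched, which is the behavior the cyclic opacity change moves between.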

In the embodiment shown in fig. 4, the region 408 represents a portion of the second image 404 that overlaps the first image 402. As described above, according to an exemplary embodiment, the region 408 represents the entirety of the second image 404. For purposes of this embodiment, the second image 404 will be referred to as the overlying image, and the first image 402 will be referred to as the underlying image.

Fig. 5 shows a representation of an embodiment in which only a portion of a second image 404 overlaps a first image 402 according to an embodiment. In fig. 5, a portion 408 of the second image 404 that overlaps the first image 402 is shown with cross-hatching.

Fig. 6 shows a representation of a screen shot in which the first image dataset is a 3D dataset and the second image dataset is also a 3D dataset, according to an embodiment. Various embodiments may simultaneously display multiple images representing various slices or planes. Fig. 6 includes overlapping images from three separate planes or slices within the first image dataset and the second image dataset. The images in fig. 6 represent different non-parallel planes, but according to other embodiments, the images may represent two or more planes from within the 3D datasets that are parallel to each other. For example, fig. 6 includes a first image 402 and a second image 404 and a first overlap region 408. According to one embodiment, the first image 402 may be a CT image of a short-axis view of the mitral valve and the second image 404 may be an ultrasound image of a short-axis view of the mitral valve. Fig. 6 includes third and fourth images 422 and 424 and a second overlap region 428. According to one embodiment, the third image 422 may be a CT image of a mitral valve commissure view and the fourth image 424 may be an ultrasound image of a mitral valve commissure view. Fig. 6 includes fifth and sixth images 432, 434 and a third overlap region 438. According to one embodiment, the fifth image 432 may be a CT image of an anteroposterior long-axis view and the sixth image 434 may be an ultrasound image of an anteroposterior long-axis view. The first image 402, the third image 422, and the fifth image 432 may all be generated by reconstructing a planar view or slice from a 3D CT dataset. The second image 404, the fourth image 424, and the sixth image 434 may all be generated by reconstructing a planar view from a 3D ultrasound dataset. It should be understood that, according to other embodiments, images may be generated by reconstructing planar views or slices from different types of 3D datasets. Additionally, the views shown in FIG. 6 are according to an exemplary embodiment.
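When several slice views are displayed at once, as in FIG. 6, the opacity cycling of each view's overlay can be kept synchronized by evaluating one shared opacity value per frame and applying it to every view (this is the synchronization recited in claim 19). A minimal sketch, where the helper name, the flat-list image representation, and the blend rule are illustrative assumptions:

```python
def render_synchronized(views, opacity_fn, t):
    """Blend each (under, over) image pair in `views` using a single
    shared opacity value, so all overlays fade in and out in step.
    `views` is a list of (under, over) flat gray-value lists and
    `opacity_fn` maps elapsed time to an opacity in [0, 1].
    (Hypothetical helper; not from the disclosure.)"""
    alpha = opacity_fn(t)  # one value per frame, shared by all views
    return [
        [alpha * o + (1.0 - alpha) * u for u, o in zip(under, over)]
        for under, over in views
    ]
```

Because all three composites are computed from the same `alpha`, the CT information in every plane becomes visible at the same moments of the cycle.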

At step 310, processor 16 determines whether the opacity of the second image 404 should be adjusted. According to one embodiment, if the workstation 10 is in a mode that allows automatic changes in opacity, the method 300 proceeds to step 312. According to other embodiments, the user is able to selectively switch between a mode that automatically changes the opacity of the second image 404 and a mode that does not. If it is not desired to adjust the opacity of the second image 404 (i.e., the workstation 10 is in a mode that does not automatically change the opacity of the second image 404), the method 300 ends at step 316.

If the workstation 10 is in a mode that automatically changes the opacity of the second image, the method 300 proceeds to step 312. At step 312, processor 16 automatically adjusts the opacity of the second image 404, and the second image 404 is displayed on the display device 12 at an opacity that is different from its opacity at step 308. The second image 404 may have a uniform or non-uniform opacity. For embodiments in which the second image has a non-uniform opacity, the opacity of the image may still be adjusted by a fixed amount or percentage. At step 314, processor 16 determines whether it is desired to continue adjusting the opacity of the second image. If not, the method ends at step 316. However, if the workstation 10 is still in a mode that automatically changes the opacity of the second image 404, the method returns to step 312, where the opacity of the second image 404 is adjusted again. The method 300 may iteratively loop through steps 312 and 314 as long as the workstation 10 remains in a mode in which it is desired to automatically change the opacity of the second image 404. According to some embodiments, the method may iteratively loop through steps 312 and 314 while the operator is adjusting the position of the second image 404 relative to the first image 402. Some exemplary ways in which the opacity of the second image 404 may be automatically changed are described below.
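Steps 312 and 314 amount to a render loop that re-evaluates the overlay opacity on each pass for as long as the automatic mode remains active. A minimal sketch, assuming three hypothetical callbacks supplied by the workstation software (`render`, `opacity_fn`, and `auto_mode_active`), none of which are named in the disclosure:

```python
import time

def cycle_opacity(render, opacity_fn, auto_mode_active,
                  frame_interval=1.0 / 30.0):
    """Loop of steps 312-314: while the workstation stays in the
    automatic-opacity mode, recompute the overlay opacity from
    elapsed time and redraw the composite.
    All three callbacks are illustrative assumptions."""
    t0 = time.monotonic()
    while auto_mode_active():                      # step 314: keep cycling?
        opacity = opacity_fn(time.monotonic() - t0)
        render(opacity)                            # step 312: redraw overlay
        time.sleep(frame_interval)
```

Exiting the loop when `auto_mode_active()` returns False corresponds to the method proceeding to end at step 316.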

Fig. 7, 8, and 9 are graphs illustrating the manner in which processor 16 may cyclically change the opacity of the second image, according to various embodiments.

Referring to FIG. 7, a curve 700 represents opacity values along a y-axis 702 and time along an x-axis 704. T0, T1, T2, T3, T4, T5, T6, T7, and T8 represent times along the x-axis 704 and, according to one embodiment, may be evenly spaced. Curve 700 represents a periodically repeating sawtooth function. The interval from time T0 to time T4 represents one period τ, or cycle. According to various embodiments, it may be desirable to use a function with a period between 1 and 20 seconds, but other embodiments may use a function with a period shorter than 1 second or longer than 20 seconds. According to other embodiments, the period may be user adjustable. Curve 700 shows two complete cycles of the sawtooth function.

According to the embodiment shown in fig. 7, processor 16 automatically cyclically changes the opacity of the second image 404 according to the sawtooth function represented by curve 700. For example, at time T0 the opacity of the second image 404 is O3; at time T1 the opacity is O2; at time T2 the opacity is O1; at time T3 the opacity is O2; and at time T4 the opacity is O3. As shown by curve 700, from time T0 to time T2 processor 16 reduces the opacity of the second image 404 in a linear manner, and from time T2 to time T4 processor 16 increases the opacity in a linear manner.
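One way to realize the sawtooth behavior of curve 700 is a triangle-wave mapping from elapsed time to opacity. This sketch assumes the minimum and maximum opacities O1 and O3 map to `o_min` and `o_max`; the function name, default period, and default bounds are illustrative, not values from the disclosure:

```python
def sawtooth_opacity(t, period=4.0, o_min=0.2, o_max=1.0):
    """Opacity falls linearly from o_max to o_min over the first
    half of each period, then rises linearly back, as in FIG. 7.
    `t` is elapsed time in seconds."""
    phase = (t % period) / period        # position within cycle, 0..1
    tri = abs(2.0 * phase - 1.0)         # 1 -> 0 -> 1 over one cycle
    return o_min + (o_max - o_min) * tri
```

At t = 0 the overlay is at `o_max` (O3), it reaches `o_min` (O1) at the half period, and returns to `o_max` at the full period, matching the T0-to-T4 cycle in the text.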

Referring to FIG. 8, a curve 800 represents opacity values along the y-axis 702 and time along the x-axis 704. T0, T1, T2, T3, T4, T5, T6, T7, and T8 represent times along the x-axis 704 and, according to one embodiment, may be evenly spaced. Curve 800 represents a sinusoidal function. The interval from time T0 to time T4 represents one period, or cycle. According to various embodiments, it may be desirable to use a function with a period between 1 and 20 seconds, but other embodiments may use a function with a period shorter than 1 second or longer than 20 seconds. According to other embodiments, the period may be user adjustable. Curve 800 shows two complete cycles of the sinusoidal function.

According to the embodiment shown in FIG. 8, the processor 16 automatically cycles the opacity of the second image 404 according to the sinusoidal function represented by curve 800. For example, at time T0 the opacity of the second image 404 is O3; at time T1 the opacity of the second image 404 is O2; at time T2 the opacity of the second image 404 is O1; at time T3 the opacity of the second image 404 is O2; and at time T4 the opacity of the second image 404 is O3. In the manner shown by curve 800, from time T0 to time T2 the processor 16 decreases the opacity of the second image, and from time T2 to time T4 the processor 16 increases the opacity.
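A sinusoidal cycle of this kind may be sketched as follows, again using assumed opacity bounds and an assumed 4-second period; the cosine form simply places the maximum opacity O3 at t = 0 and the minimum O1 at the half-period, as in curve 800:

```python
import math

def sinusoidal_opacity(t, period=4.0, o_min=0.2, o_max=1.0):
    """Cosine cycle: opacity starts at o_max at t = 0, reaches o_min at the
    half-period, and returns smoothly to o_max at the end of the period."""
    mid = (o_max + o_min) / 2.0   # center of the opacity swing
    amp = (o_max - o_min) / 2.0   # half the swing
    return mid + amp * math.cos(2.0 * math.pi * t / period)
```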

Referring to FIG. 9, a curve 900 represents opacity values along the y-axis 702 and time along the x-axis 704. T0, T1, T2, T3, T4, T5, T6, T7, and T8 represent times along the x-axis 704. According to one embodiment, the times T0 through T8 may represent evenly spaced time intervals. Curve 900 represents a step function. The span from time T0 to time T4 represents one period, or cycle. According to various embodiments, it may be desirable to use a function with a period between 1 and 20 seconds, but other embodiments may use a function with a period shorter than 1 second or longer than 20 seconds. According to other embodiments, the period may be user-adjustable. Curve 900 shows two complete cycles of the step function.

According to the embodiment shown in FIG. 9, the processor 16 automatically cycles the opacity of the second image 404 according to the step function represented by curve 900. For example, at time T0 the opacity of the second image is O3; at time T1 the opacity of the second image 404 is O2; at time T2 the opacity of the second image 404 is O1; at time T3 the opacity of the second image 404 is O2; and at time T4 the opacity of the second image 404 is O3. In the stepwise manner shown by curve 900, from time T0 to time T2 the processor 16 decreases the opacity of the second image 404, and from time T2 to time T4 the processor 16 increases the opacity.
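The stepwise cycle may be sketched as a lookup into a repeating sequence of levels; the particular levels and the equal quarter-period holds below are assumptions chosen to reproduce the O3, O2, O1, O2 sequence, not values fixed by this disclosure:

```python
def step_opacity(t, period=4.0, levels=(1.0, 0.6, 0.2, 0.6)):
    """Step cycle: hold each opacity level for an equal fraction of the
    period, then jump abruptly to the next level."""
    phase = (t % period) / period  # fraction of the current cycle, in [0, 1)
    return levels[int(phase * len(levels)) % len(levels)]
```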

Curves 700, 800, and 900 are merely three exemplary embodiments of periodically repeating functions that the processor 16 may use to automatically change the opacity of the second image 404. In other embodiments, the processor 16 may automatically and cyclically adjust the opacity of the second image 404 according to other functions. The processor 16 may automatically and cyclically change the opacity of the second image 404 between a maximum opacity (such as opacity O3) and a minimum opacity (such as opacity O1), as shown in FIGS. 7, 8, and 9. Additionally, according to other embodiments, the period of the periodically varying function used to control the opacity of the second image 404 may be adjusted. The period of the periodically varying function may be adjusted manually, or the period may be adjusted automatically by the processor. For example, the period of the function may become longer over time, or the period of the function may become shorter over time. For example, when performing registration between two or more images, it may be advantageous to vary the period of the function, since the clinician's requirements may be different when coarsely adjusting the position of the second image 404 relative to the first image 402 than when finely adjusting the position of the second image 404 relative to the first image 402.
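One hypothetical policy for such automatic period adjustment is sketched below; the stage names and the scale factors are purely illustrative assumptions, as this disclosure does not specify how the period should be varied:

```python
def adjusted_period(base_period, registration_stage):
    """Hypothetical policy: use a longer opacity cycle while coarsely
    positioning the overlay, and a shorter cycle during fine adjustment."""
    factors = {"coarse": 2.0, "fine": 0.5}  # assumed scale factors
    return base_period * factors.get(registration_stage, 1.0)
```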

Curves 700, 800, and 900 all share a common period and have the same opacity values at the times demarcated on the x-axes of the graphs. However, in each of the illustrated embodiments, the manner in which the processor 16 controls the transitions between the times defined on the graphs is different.

FIG. 10 illustrates four exemplary screenshots corresponding to the times T0, T1, T2, and T3 shown in FIGS. 7, 8, and 9, according to one embodiment. It should be understood that, according to other embodiments, the processor 16 may display intermediate images. In other words, the processor 16 may display one or more images between time T0 and time T1. The processor 16 may display one or more images between time T1 and time T2. The processor 16 may display one or more images between time T2 and time T3. And the processor 16 may display one or more images between time T3 and time T4.

FIG. 10 includes an image 150 corresponding to time T0, an image 152 corresponding to time T1, an image 154 corresponding to time T2, and an image 156 corresponding to time T3. Each of the images 150, 152, 154, and 156 represents a composite image that includes the first image 402 and the second image 404. According to the embodiment shown in FIG. 10, the second image 404 completely overlaps the first image 402.

In the image 150, the second image 404 is displayed at opacity O3; in the image 152, the second image 404 is displayed at opacity O2; in the image 154, the second image 404 is displayed at opacity O1; and in the image 156, the second image 404 is again displayed at opacity O2. As described above, the opacity of the second image 404 shown in FIG. 10 may be automatically changed by the processor 16 according to one of the functions graphically represented in FIG. 7, 8, or 9. FIG. 10 shows the change in opacity of the second image 404 over one complete cycle. It should be appreciated that the processor 16 may continue to automatically adjust the opacity of the second image according to the same function for a period of time longer than the one cycle shown in FIG. 10.

According to many embodiments, the first image 402 and the second image 404 may represent image data acquired with different imaging modalities, or they may represent image data acquired at different times using the same imaging modality. According to most embodiments, the first image 402 and the second image 404 contain different types of data. To assist the clinician in distinguishing the first image 402 from the second image 404 in the composite image 406, the processor 16 may display the first image dataset as the first image using a first color map and the second image dataset as the second image using a second color map that is different from the first color map.

The processor 16 may also control the opacity (and thus the transparency) of the second image 404 to enable the clinician to view both the first image 402 and the second image 404 in the region of the second image 404 that overlaps the first image 402.

For example, the image 150 shows the composite image at time T0, when the second image is at the maximum opacity level, represented by opacity O3. The maximum opacity level (i.e., the minimum transparency level) makes it very easy for the clinician to discern information in the second image 404, at the expense of obscuring some information in the underlying first image 402. The image 152 shows the second image 404 at opacity level O2, which represents an opacity level between the maximum opacity level O3 and the minimum opacity level O1. In the image 152, the intermediate opacity level O2 enables the clinician to see some information in the underlying first image 402 and some information in the overlying second image 404. According to an exemplary embodiment, the image 154 shows the second image 404 at opacity level O1, which represents the minimum opacity level (i.e., the maximum transparency level). The low opacity level enables the clinician to clearly see the structures and/or details represented in the underlying first image 402 in the region overlapping the second image 404.
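The visual effect described above corresponds to standard "over" alpha compositing, in which the overlay's opacity weights its contribution against the underlying image. A minimal sketch for grayscale images represented as nested lists (the representation and function names are assumptions for illustration):

```python
def blend_pixel(under, over, opacity):
    """'Over' compositing of one pixel value: opacity 1.0 shows only the
    overlay; opacity 0.0 shows only the underlying image."""
    return opacity * over + (1.0 - opacity) * under

def composite(first_image, second_image, opacity):
    # Blend two equally sized grayscale images, pixel by pixel.
    return [[blend_pixel(u, o, opacity)
             for u, o in zip(row_u, row_o)]
            for row_u, row_o in zip(first_image, second_image)]
```

Rendering the composite with the opacity supplied by one of the cyclic functions above, once per display frame, would produce the alternating visibility shown in images 150 through 156.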

By automatically cycling the opacity of the second image 404, the technique enables the clinician to alternate between clearly viewing all of the details/structures in the overlying second image 404 and clearly viewing all of the details/structures in the underlying first image 402. According to other embodiments, the processor 16 may periodically change the color map used for the first image 402 or the second image 404 while the processor 16 periodically changes the opacity of the second image 404.

This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
