Imaging system and method

Document No.: 1175545　Publication date: 2020-09-22

Reading note: This technology, "Imaging system and method", was designed and created by 吕靖原, 刘琦, 张仲奇, and 徐健 on 2020-06-22. Abstract: The invention provides an imaging system and method. The imaging method may include acquiring imaging data associated with a region of interest (ROI) of a subject. The imaging data may correspond to a plurality of time series images of the ROI. The imaging method may further include determining a data set including a spatial basis and one or more temporal bases based on the imaging data. The spatial basis may include spatial information of the imaging data. The one or more temporal bases may include timing information of the imaging data. The imaging method may further include storing the spatial basis and the one or more temporal bases in a storage medium.

1. An imaging system, comprising:

at least one storage device storing a set of instructions; and

at least one processor in communication with the at least one storage device, wherein the set of instructions, when executed, is operable to direct the system to perform steps comprising:

acquiring imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest;

determining a data set from the imaging data, wherein the data set comprises a spatial basis comprising spatial information of the imaging data and one or more temporal bases comprising timing information of the imaging data; and

storing the spatial basis and the one or more temporal bases in a storage medium.

2. The imaging system of claim 1, wherein the spatial basis and the one or more temporal bases relate to a low rank model, wherein the low rank model represents a correlation between the plurality of time series images.

3. The imaging system of claim 2,

the spatial basis comprises a spatial basis matrix, and the one or more temporal bases comprise a single temporal basis matrix;

a combination of the spatial basis matrix and the temporal basis matrix represents a low rank matrix, wherein the low rank matrix corresponds to a set of the plurality of time series images; and

the number of elements in the spatial basis matrix and the temporal basis matrix combined is less than the number of elements in the low rank matrix.

4. The imaging system of claim 2,

the dataset further comprises a core tensor;

the spatial basis comprises a spatial basis matrix, and the one or more temporal bases comprise two or more temporal basis matrices;

a combination of the spatial basis matrix, the two or more temporal basis matrices, and the core tensor represents a low-rank multidimensional tensor, wherein the low-rank multidimensional tensor corresponds to a set of the plurality of time series images; and

the number of elements in the core tensor, the spatial basis matrix, and the two or more temporal basis matrices combined is less than the number of elements in the low-rank multidimensional tensor.

5. The imaging system of claim 4, wherein the low rank multi-dimensional tensor comprises:

a spatial dimension corresponding to the spatial basis matrix; and

two or more temporal dimensions, wherein each temporal dimension corresponds to a respective temporal basis matrix of the two or more temporal basis matrices.

6. The imaging system of claim 4 or 5, wherein the at least one processor, when executing the set of instructions, is further configured to instruct the system to perform steps comprising:

storing the core tensor in the storage medium.

7. The imaging system of claim 1, wherein the at least one processor, when executing the set of instructions, is further configured to instruct the system to perform steps comprising:

reconstructing at least a portion of the images of the plurality of time series images based on the data set; and

storing the resulting reconstructed image in the storage medium.

8. The imaging system of claim 7, wherein the reconstructed image comprises at least one set of images of the plurality of time series images, wherein the plurality of time series images represent a change over time in a value of one parameter of a plurality of parameters, the plurality of parameters being used to acquire the imaging data.

9. The imaging system of claim 8, wherein the plurality of parameters includes at least one of an imaging sequence parameter, a cardiac motion parameter, or a respiratory motion parameter.

10. The imaging system of claim 1, wherein the storage medium comprises the at least one storage device or a picture archiving and communication system; and/or the plurality of time series images comprise magnetic resonance images, computed tomography images, ultrasound images, or multi-modality images.

11. An imaging system, comprising:

at least one storage device storing a set of instructions; and

at least one processor in communication with the at least one storage device, wherein the set of instructions, when executed, is operable to direct the system to perform steps comprising:

acquiring a data set from a storage medium, wherein the data set comprises a spatial basis and one or more temporal bases, the spatial basis and the one or more temporal bases correspond to a plurality of time series images of a region of interest of a subject, the spatial basis comprises spatial information of the plurality of time series images, and the one or more temporal bases comprise timing information of the plurality of time series images;

receiving instructions for reconstructing one or more target images of the plurality of time series images;

reconstructing the one or more target images based on the data set and the instructions, wherein each target image of the one or more target images is reconstructed by:

determining a temporal basis subset of each of the one or more temporal bases based on the instructions; and

reconstructing the target image based on the data set and the one or more temporal basis subsets; and

displaying the one or more target images.

12. The imaging system of claim 11, wherein the determining a temporal basis subset of each of the one or more temporal bases based on the instructions comprises:

obtaining, based on the instructions, a value of at least one parameter of a plurality of parameters corresponding to the target image, wherein the plurality of parameters are used to acquire imaging data of the region of interest;

determining, based on the value of the at least one parameter, temporal information of the target image corresponding to the one or more temporal bases; and

determining, based on the temporal information, a temporal basis subset of each of the one or more temporal bases corresponding to the target image.

13. The imaging system of claim 11,

the spatial basis comprises a spatial basis matrix, and the one or more temporal bases comprise a single temporal basis matrix;

a combination of the spatial basis matrix and the temporal basis matrix represents a low rank matrix, wherein the low rank matrix corresponds to a set of the plurality of time series images;

the number of elements in the spatial basis matrix and the temporal basis matrix combined is less than the number of elements in the low rank matrix;

reconstructing the target image based on the data set and the one or more temporal basis subsets comprises:

reconstructing the target image by determining a product of the spatial basis matrix and a temporal basis subset of the single temporal basis matrix.

14. The imaging system of claim 11,

the dataset further comprises a core tensor;

the spatial basis comprises a spatial basis matrix and the one or more temporal bases comprise two or more temporal basis matrices;

a combination of the spatial basis matrix, the two or more temporal basis matrices, and the core tensor represents a low-rank multidimensional tensor, wherein the low-rank multidimensional tensor corresponds to a set of the plurality of time series images; and

the number of elements in the core tensor, the spatial basis matrix, and the two or more temporal basis matrices combined is less than the number of elements in the low-rank multidimensional tensor;

reconstructing the target image based on the data set and the one or more temporal basis subsets comprises:

reconstructing the target image by determining a product of the spatial basis matrix, two or more temporal basis subsets of the two or more temporal basis matrices, and the core tensor.

15. An imaging method implemented on a device having one or more processors and one or more memory devices, the method comprising:

acquiring imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest;

determining a data set from the imaging data, wherein the data set comprises a spatial basis comprising spatial information of the imaging data and one or more temporal bases comprising timing information of the imaging data; and

storing the spatial basis and the one or more temporal bases in a storage medium.

Technical Field

The present invention relates to the field of imaging technology, and more particularly to a system and method for storing and displaying images.

Background

Imaging, such as dynamic medical imaging, may involve a large number of images. For example, magnetic resonance imaging (MRI) supports real-time dynamic imaging, multi-contrast imaging, and parametric imaging, in which a plurality of MRI images are acquired continuously over a period of time to reflect the motion of a subject over time and/or the change of contrast over time. However, MRI real-time dynamic imaging, multi-contrast imaging, and parametric imaging may involve a large number of images. For example, a cardiac 3D free-breathing T1-quantitative dynamic contrast enhancement (DCE) application corresponding to 20 cardiac cycles, 88 saturation times, 12 slices, and 75 heartbeats may involve 1,584,000 images. As another example, a four-dimensional (4D) flow application spanning a three-dimensional (3D) spatial dimension, a cardiac cycle dimension, and a flow-encoding direction dimension may involve 10,000 images. Storing, transmitting, and/or displaying such a large number of images in the Digital Imaging and Communications in Medicine (DICOM) format consumes a large amount of resources and increases the load on medical systems, such as scanners and picture archiving and communication systems (PACS). Accordingly, it is desirable to provide a system and/or method for storing and/or displaying medical images that addresses the above-mentioned problems.
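The image counts above follow directly from multiplying the sizes of the acquisition's time-like dimensions. A minimal sketch, using the values given in the text (the variable names are illustrative, not from the invention):

```python
# Illustration of the image count cited above: the number of frames in a
# dynamic acquisition is the product of the sizes of its time-like dimensions.
cardiac_cycles = 20     # cardiac phases per heartbeat
saturation_times = 88   # T1 saturation-recovery time points
slices = 12
heartbeats = 75

n_images = cardiac_cycles * saturation_times * slices * heartbeats
print(n_images)  # 1584000
```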

Disclosure of Invention

Additional features of the invention will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by practice or verification of the invention by way of example. The features of the present invention can be implemented by the practice or use of the methods, instrumentalities and combinations set forth in the detailed examples discussed below.

According to a first aspect of the present invention, an imaging system may include one or more storage devices storing a set of instructions; and one or more processors in communication with the one or more storage devices. The set of instructions, when executed by the one or more processors, may instruct the one or more processors to perform one or more of the following operations. The one or more processors can acquire imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest. The one or more processors may determine a dataset from the imaging data, where the dataset may include a spatial basis, which may include spatial information for the imaging data, and one or more temporal bases, which may include timing information for the imaging data. The one or more processors may store the spatial basis and the one or more temporal bases in a storage medium.

In some embodiments, the spatial basis and the one or more temporal bases may relate to a low rank model, wherein the low rank model represents a correlation between the plurality of time series images.

In some embodiments, the spatial basis may comprise a spatial basis matrix and the one or more temporal bases may comprise a single temporal basis matrix. The combination of the spatial basis matrix and the temporal basis matrix may represent a low rank matrix, wherein the low rank matrix corresponds to a set of the plurality of time series images. The number of elements in the spatial basis matrix and the temporal basis matrix combined may be less than the number of elements in the low rank matrix.
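As a sketch of this matrix case (not the claimed method itself; all dimensions are assumed, illustrative values), a truncated SVD of a voxels-by-frames matrix yields a spatial basis matrix and a temporal basis matrix whose combined element count is far below that of the full series:

```python
import numpy as np

# A time series of images flattened into a voxels-by-frames matrix X is
# approximated by a rank-r product of a spatial basis U and a temporal
# basis V, X ≈ U @ V.
rng = np.random.default_rng(0)
n_voxels, n_frames, r = 4096, 500, 8

# Synthesize a matrix that is exactly rank r, standing in for the strong
# correlation between the time series images.
X = rng.standard_normal((n_voxels, r)) @ rng.standard_normal((r, n_frames))

# Truncated SVD gives the spatial basis (left factors) and temporal basis.
Uf, s, Vt = np.linalg.svd(X, full_matrices=False)
U = Uf[:, :r] * s[:r]   # spatial basis matrix, shape (n_voxels, r)
V = Vt[:r, :]           # temporal basis matrix, shape (r, n_frames)

# The factors reproduce X while storing far fewer elements.
assert np.allclose(U @ V, X)
print(U.size + V.size, "<", X.size)  # 36768 < 2048000
```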

In some embodiments, the data set may also include a core tensor. The spatial basis may comprise a spatial basis matrix, and the one or more temporal bases may comprise two or more temporal basis matrices. The combination of the spatial basis matrix, the two or more temporal basis matrices, and the core tensor may represent a low-rank multidimensional tensor, wherein the low-rank multidimensional tensor corresponds to a set of the plurality of time series images. The number of elements in the core tensor, the spatial basis matrix, and the two or more temporal basis matrices combined may be less than the number of elements in the low-rank multidimensional tensor.

In some embodiments, the low-rank multidimensional tensor can include a spatial dimension corresponding to the spatial basis matrix and two or more temporal dimensions, wherein each temporal dimension corresponds to a respective one of the two or more temporal basis matrices.
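The tensor case can be sketched with an assumed Tucker-form construction (illustrative dimensions and ranks; not the patented implementation): a small core tensor plus one basis matrix per dimension reproduces a low-rank multidimensional tensor while storing far fewer elements.

```python
import numpy as np

rng = np.random.default_rng(1)
dims = (4096, 30, 40)   # (voxels, cardiac phases, saturation times), illustrative
ranks = (8, 4, 5)       # multilinear ranks, illustrative

# Build a tensor with exact multilinear rank `ranks`: a core tensor and one
# basis matrix per dimension (one spatial, two temporal).
core = rng.standard_normal(ranks)
bases = [rng.standard_normal((d, r)) for d, r in zip(dims, ranks)]

def tucker_reconstruct(core, bases):
    """Multiply the core tensor by each basis matrix along its own mode."""
    t = core
    for mode, B in enumerate(bases):
        t = np.moveaxis(np.tensordot(B, t, axes=(1, mode)), 0, mode)
    return t

X = tucker_reconstruct(core, bases)

# Storing the factors takes far fewer elements than storing the tensor.
n_factored = core.size + sum(B.size for B in bases)
print(n_factored, "<", X.size)  # 33248 < 4915200
assert n_factored < X.size
```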

In some embodiments, the one or more processors may store the core tensor in the storage medium.

In some embodiments, the one or more processors may reconstruct at least a portion of the images of the plurality of time series images based on the data set. The one or more processors may store the resulting reconstructed image in the storage medium.

In some embodiments, the one or more processors may send the reconstructed image to a user device to present the reconstructed image.

In some embodiments, the reconstructed image may include at least one set of images of the plurality of time series images, wherein the plurality of time series images represent a change in a value of one parameter of a plurality of parameters over time, the plurality of parameters being used to acquire the imaging data.

In some embodiments, the plurality of parameters may include at least one of an imaging sequence parameter, a cardiac motion parameter, or a respiratory motion parameter.

In some embodiments, the storage medium may include the at least one storage device or a picture archiving and communication system.

In some embodiments, the plurality of time series images includes Magnetic Resonance (MR) images, Computed Tomography (CT) images, ultrasound images, or multi-modality images.

According to a second aspect of the invention, the imaging method may comprise one or more of the following steps. One or more processors may acquire imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest. The one or more processors may determine a dataset from the imaging data, where the dataset may include a spatial basis, which may include spatial information for the imaging data, and one or more temporal bases, which may include timing information for the imaging data. The one or more processors may store the spatial basis and the one or more temporal bases in a storage medium.

According to a third aspect of the present invention, an imaging system may include a first input/output (I/O) module for acquiring imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest. The imaging system may further include a matrix determination module to determine a dataset from the imaging data, wherein the dataset includes a spatial basis including spatial information of the imaging data and one or more temporal bases including timing information of the imaging data. The first input/output module is further configured to store the spatial basis and the one or more temporal bases in a storage medium.

According to a fourth aspect of the invention, a non-transitory computer-readable storage medium may include at least one set of instructions. The at least one set of instructions may be executable by one or more processors of a computer server. The one or more processors can acquire imaging data associated with a region of interest of a subject, wherein the imaging data corresponds to a plurality of time series images of the region of interest. The one or more processors may determine a dataset from the imaging data, where the dataset may include a spatial basis, which may include spatial information for the imaging data, and one or more temporal bases, which may include timing information for the imaging data. The one or more processors may store the spatial basis and the one or more temporal bases in the storage medium.

According to a fifth aspect of the present invention, an imaging system may include one or more storage devices storing a set of instructions; and one or more processors in communication with the one or more storage devices. The set of instructions, when executed by the one or more processors, may instruct the one or more processors to perform one or more of the following operations. The one or more processors may retrieve a data set from a storage medium, wherein the data set includes a spatial basis and one or more temporal bases, the spatial basis and the one or more temporal bases corresponding to a plurality of time series images of a region of interest of a subject, the spatial basis including spatial information of the plurality of time series images, the one or more temporal bases including timing information of the plurality of time series images. The one or more processors may receive instructions, wherein the instructions are to reconstruct one or more target images of the plurality of time series images. The one or more processors may reconstruct the one or more target images based on the data set and the instructions. For each of the one or more target images, the one or more processors may determine, based on the instructions, a temporal basis subset of each of the one or more temporal bases. The one or more processors may reconstruct the target image based on the data set and the one or more temporal basis subsets. The one or more processors may display the one or more target images.

In some embodiments, to determine the temporal basis subset of each of the one or more temporal bases based on the instructions, the one or more processors may obtain, based on the instructions, a value of at least one parameter of a plurality of parameters corresponding to the target image, where the plurality of parameters are used to acquire imaging data of the region of interest. The one or more processors may determine, based on the value of the at least one parameter, temporal information of the target image corresponding to the one or more temporal bases. The one or more processors may determine, based on the temporal information, a temporal basis subset of each of the one or more temporal bases corresponding to the target image.

In some embodiments, the plurality of parameters includes at least one of an imaging sequence parameter, a cardiac motion parameter, or a respiratory motion parameter.

In some embodiments, the spatial basis and the one or more temporal bases may relate to a low rank model, wherein the low rank model represents a correlation between the plurality of time series images.

In some embodiments, the spatial basis may comprise a spatial basis matrix and the one or more temporal bases may comprise a single temporal basis matrix. The combination of the spatial basis matrix and the temporal basis matrix may represent a low rank matrix, wherein the low rank matrix corresponds to a set of the plurality of time series images. The number of elements in the spatial basis matrix and the temporal basis matrix combined may be less than the number of elements in the low rank matrix.

In some embodiments, to reconstruct the target image based on the data set and the one or more temporal basis subsets, the one or more processors may reconstruct the target image by determining a product of the spatial basis matrix and a temporal basis subset of the single temporal basis matrix.
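A minimal sketch of this single-frame reconstruction in the matrix case (assumed, illustrative shapes): displaying frame t needs only column t of the temporal basis matrix, so the full image series never has to be materialized.

```python
import numpy as np

rng = np.random.default_rng(2)
n_voxels, n_frames, r = 4096, 500, 8    # illustrative sizes
U = rng.standard_normal((n_voxels, r))  # spatial basis matrix
V = rng.standard_normal((r, n_frames))  # single temporal basis matrix

t = 123                                 # frame index requested by the viewer
frame = U @ V[:, t]                     # one image, shape (n_voxels,)

# Identical to column t of the full product, without ever forming it.
assert np.allclose(frame, (U @ V)[:, t])
```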

In some embodiments, the data set further includes a core tensor. The spatial basis may comprise a spatial basis matrix, and the one or more temporal bases may comprise two or more temporal basis matrices. The combination of the spatial basis matrix, the two or more temporal basis matrices, and the core tensor may represent a low-rank multidimensional tensor, wherein the low-rank multidimensional tensor corresponds to a set of the plurality of time series images. The number of elements in the core tensor, the spatial basis matrix, and the two or more temporal basis matrices combined may be less than the number of elements in the low-rank multidimensional tensor.

In some embodiments, the low-rank multidimensional tensor can include a spatial dimension corresponding to the spatial basis matrix and two or more temporal dimensions, wherein each temporal dimension corresponds to a respective one of the two or more temporal basis matrices.

In some embodiments, to reconstruct the target image based on the data set and the one or more temporal basis subsets, the one or more processors may reconstruct the target image by determining a product of the spatial basis matrix, two or more temporal basis subsets of the two or more temporal basis matrices, and the core tensor.
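A minimal sketch of the tensor case (assumed shapes and ranks): one row is selected from each temporal basis matrix, contracted with the core tensor, and the resulting coefficient vector is expanded through the spatial basis matrix to produce a single frame.

```python
import numpy as np

rng = np.random.default_rng(3)
n_voxels = 512
ranks = (8, 4, 5)                               # (spatial, phase, sat-time) ranks
Us = rng.standard_normal((n_voxels, ranks[0]))  # spatial basis matrix
T1 = rng.standard_normal((30, ranks[1]))        # cardiac-phase temporal basis
T2 = rng.standard_normal((40, ranks[2]))        # saturation-time temporal basis
core = rng.standard_normal(ranks)               # core tensor

phase, sat = 7, 21                              # time indices requested for display

# Contract the core tensor with the selected temporal rows, then expand
# spatially; only one frame is ever materialized.
coeff = np.einsum('abc,b,c->a', core, T1[phase], T2[sat])
image = Us @ coeff                              # one frame, shape (n_voxels,)

# Cross-check against the full tensor (never needed in practice).
full = np.einsum('ia,jb,kc,abc->ijk', Us, T1, T2, core, optimize=True)
assert np.allclose(image, full[:, phase, sat])
```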

In some embodiments, the one or more processors may store at least one of the one or more target images in the storage medium.

In some embodiments, the storage medium may comprise the storage device or a picture archiving and communication system.

In some embodiments, the plurality of time series images may include Magnetic Resonance (MR) images, Computed Tomography (CT) images, ultrasound images, or multi-modality images.

According to a sixth aspect of the present invention, the imaging method may include one or more of the following steps. The one or more processors may retrieve a data set from a storage medium, wherein the data set includes a spatial basis and one or more temporal bases, the spatial basis and the one or more temporal bases corresponding to a plurality of time series images of a region of interest of a subject, the spatial basis may include spatial information of the plurality of time series images, and the one or more temporal bases may include timing information of the plurality of time series images. The one or more processors may receive instructions, wherein the instructions are to reconstruct one or more target images of the plurality of time series images. The one or more processors may reconstruct the one or more target images based on the data set and the instructions. For each of the one or more target images, the one or more processors may determine, based on the instructions, a temporal basis subset of each of the one or more temporal bases. The one or more processors may reconstruct the target image based on the data set and the one or more temporal basis subsets. The one or more processors may display the one or more target images.

According to a seventh aspect of the present invention, an imaging system may comprise a second input/output (I/O) module for acquiring a data set from a storage medium, wherein the data set comprises a spatial basis and one or more temporal bases, the spatial basis and the one or more temporal bases corresponding to a plurality of time series images of a region of interest of a subject, the spatial basis may comprise spatial information of the plurality of time series images, and the one or more temporal bases may comprise timing information of the plurality of time series images. The second input/output (I/O) module is further to receive instructions, wherein the instructions are to reconstruct one or more target images of the plurality of time series images. The imaging system further includes a second reconstruction module to reconstruct the one or more target images based on the data set and the instructions. For each of the one or more target images, a temporal basis subset of each of the one or more temporal bases may be determined based on the instructions, and the target image may be reconstructed based on the data set and the one or more temporal basis subsets. The second input/output (I/O) module is further to display the one or more target images.

According to an eighth aspect of the invention, a non-transitory computer readable medium may comprise at least one set of instructions. The at least one set of instructions may be executable by one or more processors of a computer server. The one or more processors may retrieve a data set from a storage medium, wherein the data set may include a spatial basis and one or more temporal bases, the spatial basis and the one or more temporal bases corresponding to a plurality of time series images of a region of interest of a subject, the spatial basis may include spatial information of the plurality of time series images, and the one or more temporal bases may include timing information of the plurality of time series images. The one or more processors may receive instructions, wherein the instructions are to reconstruct one or more target images of the plurality of time series images. The one or more processors may reconstruct the one or more target images based on the data set and the instructions. For each of the one or more target images, the one or more processors may determine, based on the instructions, a temporal basis subset of each of the one or more temporal bases. The one or more processors may reconstruct the target image based on the data set and the one or more temporal basis subsets. The one or more processors may display the one or more target images.

Drawings

The invention is further described by means of a number of exemplary embodiments, which are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent like structures throughout the several views of the drawings, and in which:

FIG. 1 is a schematic diagram of an exemplary medical system of some embodiments of the present invention;

FIG. 2 is a schematic view of an exemplary MRI scanner in accordance with some embodiments of the present invention;

FIG. 3 is a schematic diagram of exemplary hardware and/or software components of a computing device of some embodiments of the invention;

FIG. 4 is a schematic diagram of exemplary hardware and/or software components of a mobile device in accordance with some embodiments of the present invention;

FIG. 5 is a block diagram of an exemplary processing device in accordance with some embodiments of the invention;

FIG. 6 is a flow chart of an exemplary process for MRI reconstruction of some embodiments of the present invention;

FIG. 7A is a schematic diagram of the Tucker decomposition of a low-rank three-dimensional tensor of some embodiments of the present invention;

FIG. 7B is a schematic illustration of a portion of a plurality of time series images in accordance with some embodiments of the invention;

FIG. 8 is a block diagram of an exemplary user device in accordance with some embodiments of the invention;

fig. 9 is a flow chart of an exemplary process for MRI reconstruction in accordance with some embodiments of the present invention.

Detailed Description

In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In general, well-known methods, procedures, systems, components, and/or circuits have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present invention. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not limited to the illustrated embodiments, but covers the broadest scope consistent with the claims.

The terminology used in the description of the invention herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the invention. As used herein, the terms "a," "an," and "the" can refer to the singular and the plural unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

It should be understood that the terms "system," "device," "unit," "module," and/or "block" used herein are one way to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, these terms may be replaced by other expressions that achieve the same purpose.

Generally, as used herein, the terms "module," "unit," or "block" refer to a logical component contained in hardware or firmware, or to a collection of software instructions. The modules, units or blocks described herein may be implemented in software and/or hardware and may be stored in any type of non-transitory computer-readable medium or other storage device. In some embodiments, software modules/units/blocks may be compiled and linked into an executable program. It should be appreciated that software modules may be invoked from other modules/units/blocks or themselves, and/or may be invoked in response to detected events or interrupts. The software modules/units/blocks for execution on a computing device (e.g., processor 310 as shown in fig. 3) may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, diskette, or any other tangible medium, or as a digital download (and initially stored in a compressed or installable format requiring installation, decompression, or decryption prior to execution). These software codes may be stored, in part or in whole, on a storage device of the computing device for execution by the computing device. The software instructions may be embedded in firmware, such as an EPROM (erasable programmable read-only memory). It is further understood that the hardware modules/units/blocks may comprise connected logic components, such as gates and flip-flops, and/or may comprise programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functions described in this invention may be implemented as software modules/units/blocks, but may also be represented in hardware or firmware. Generally, the modules/units/blocks described herein refer to logical modules/units/blocks, which may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks, although they may be physically organized or stored differently.

It will be understood that when an element, device, module or block is referred to as being "on," "connected to" or "coupled to" another element, device, module or block, it can be directly on, connected, coupled or communicated to the other element, device, module or block, or intervening elements, devices, modules or blocks may be present, unless the context clearly dictates otherwise. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed features.

These and other features of the present invention, as well as the operation and function of the related structural elements and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description of the invention taken in conjunction with the accompanying drawings. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. It will be understood that the figures are not drawn to scale.

The present invention provides medical systems and components for medical imaging. In some embodiments, the medical system may comprise an imaging system. The imaging system may comprise a single-modality imaging system or a multi-modality imaging system. For example, the single-modality system may include a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, an ultrasound system, and the like. The MRI system may include, for example, a superconducting magnetic resonance imaging system, a non-superconducting magnetic resonance imaging system, or the like. As another example, the multi-modality imaging system may include a computed tomography-magnetic resonance imaging (MRI-CT) system, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) system, and the like. As a further example, the multi-modality imaging system may include a magnetic resonance-positron emission tomography (MR-PET) system, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) system, a computed tomography-positron emission tomography (CT-PET) system, or the like. In some embodiments, the medical system may include a therapy system. The therapy system may include a treatment planning system (TPS), an image-guided radiation therapy (IGRT) system, and so on. The image-guided radiation therapy (IGRT) system may include a treatment device and an imaging device. The treatment device may comprise a linear accelerator, a cyclotron, a synchrotron, or the like for performing radiation therapy on the subject target. The treatment device may include an accelerator for a variety of particles, including photons, electrons, protons, or heavy ions. 
The imaging device may include an MRI scanner, a CT scanner (e.g., a cone beam computed tomography (CBCT) scanner), a digital radiography (DR) device, an electronic portal imaging device (EPID), and so on.

One aspect of the invention relates to a system and method for storing and displaying medical images. Taking the storage and/or display of MRI images as an example, magnetic resonance (MR) data corresponding to a plurality of time-series images of a region of interest (ROI) of a subject target may be acquired. The information represented in the plurality of time-series images may have a temporal correlation in addition to a spatial correlation. According to some embodiments, such spatial and/or temporal correlations may be described by a low-rank model. In an image reconstruction process based on the MR data, a spatial basis matrix comprising spatial information of the MR data and one or more temporal basis matrices comprising timing information of the MR data may be determined based on the MR data and the low-rank model. Due to the low-rank nature of the timing information in the form of the one or more temporal basis matrices and/or the low-rank nature of the spatial information in the form of the spatial basis matrix, the MR data represented in this form may have a smaller data volume or file size than the plurality of time-series images. Thus, the spatial basis matrix and the one or more temporal basis matrices may be stored, for example, in a picture archiving and communication system (PACS), rather than storing all of the plurality of time-series images, which may reduce the storage space occupied.
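The storage scheme described above can be sketched with a truncated singular value decomposition of the Casorati matrix (one column per time frame). The image size, frame count, rank, and variable names below are illustrative assumptions, not parameters of the invention:

```python
import numpy as np

# Illustrative sizes (assumptions): 64x64 images, 100 time frames, rank 8
ny, nx, nt, rank = 64, 64, 100, 8

rng = np.random.default_rng(0)
# Synthesize a low-rank time-series: spatial modes times temporal curves
U_true = rng.standard_normal((ny * nx, rank))
V_true = rng.standard_normal((rank, nt))
casorati = U_true @ V_true          # Casorati matrix: one column per time frame

# A truncated SVD separates a spatial basis from a temporal basis
U, s, Vt = np.linalg.svd(casorati, full_matrices=False)
spatial_basis = U[:, :rank] * s[:rank]   # spatial information of the data
temporal_basis = Vt[:rank, :]            # timing information of the data

# Storage comparison: the two bases vs. all time-series images
full_size = casorati.size
basis_size = spatial_basis.size + temporal_basis.size
print(full_size, basis_size)  # the bases are far smaller when rank << nt
```

With these assumed sizes the two bases occupy roughly one twelfth of the full series, which illustrates why archiving the bases instead of the images reduces the storage space occupied.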

If a user wishes to view the plurality of time-series images on a user device, the spatial basis matrix and the one or more temporal basis matrices may be transmitted from the PACS to the user device instead of the plurality of time-series images themselves, which may reduce the amount of data transmitted and, in turn, reduce the pressure on transmission bandwidth, the transmission time, and/or the chance of transmission errors. With the spatial basis matrix and the one or more temporal basis matrices, a user device with general processing capabilities may achieve fast reconstruction of any one of the plurality of time-series images.
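On the user-device side, recovering any single frame from the two bases costs only one small matrix-vector product. The following self-contained sketch uses illustrative shapes and randomly generated bases standing in for bases retrieved from the PACS:

```python
import numpy as np

# Assumed illustrative shapes: spatial basis (ny*nx, r), temporal basis (r, nt)
ny, nx, nt, r = 64, 64, 100, 8
rng = np.random.default_rng(1)
spatial_basis = rng.standard_normal((ny * nx, r))
temporal_basis = rng.standard_normal((r, nt))

def reconstruct_frame(spatial_basis, temporal_basis, t, shape):
    """Recover the image at time index t from the two bases.

    The cost is a single (ny*nx, r) @ (r,) product, which is cheap
    enough for a user device with general processing capabilities.
    """
    return (spatial_basis @ temporal_basis[:, t]).reshape(shape)

frame = reconstruct_frame(spatial_basis, temporal_basis, t=42, shape=(ny, nx))
print(frame.shape)  # (64, 64)
```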

It should be understood that the disclosed methods and systems are described with reference to MRI images for purposes of illustration and are not intended to limit the scope of the present invention. The disclosed methods and systems may be applied to other single-modality or multi-modality imaging, including, for example, CT imaging, ultrasound imaging, MRI-CT imaging, and the like. In some embodiments, the method and/or system for medical image storage and display in the present invention may be applied to the storage and/or display of low-rank medical images.

Fig. 1 is a schematic diagram of an exemplary medical system 100 according to some embodiments of the invention. The medical system 100 may include a scanner 110, a network 120, a user device 130, a processing device 140, and a storage device 150. The components of the medical system 100 may be connected in one or more ways. For example only, the scanner 110 may be connected to the processing device 140 via the network 120. As another example, the scanner 110 may be directly connected to the processing device 140 (as indicated by the dashed double-headed arrow connecting the scanner 110 and the processing device 140). In another embodiment, the storage device 150 may be directly connected to the processing device 140 or connected to the processing device 140 through the network 120. In another embodiment, the user device 130 (e.g., 131, 132, 133, etc.) may be connected directly to the processing device 140 (as indicated by the dashed double-headed arrow connecting the user device 130 and the processing device 140) or to the processing device 140 via the network 120.

The scanner 110 may scan a subject target located within its examination region and generate imaging data relating to the subject target. In the present invention, "subject" and "object" may be used interchangeably. By way of example only, the subject target may include a scan target, a man-made object, and the like. In another embodiment, the subject target may include a specific portion, organ, and/or tissue of the scan target. For example, the subject target may include the head, brain, neck, body, shoulders, arms, chest, heart, stomach, blood vessels, soft tissues, knees, feet, or other parts, or any combination thereof.

In some embodiments, the scanner 110 may include an MRI scanner, a CT scanner, a PET scanner, an ultrasound scanner, or a multi-modality device, among others. For example, the multi-modality device may include an MRI-CT device, a PET-MRI device, or a PET-CT device, among others. In the present invention, the X-axis, Y-axis, and Z-axis shown in fig. 1 may form an orthogonal coordinate system. The X-axis and Z-axis shown in fig. 1 may be horizontal, and the Y-axis may be vertical. As shown in fig. 1, the positive direction along the X-axis may point from the right side to the left side of the scanner 110 (as viewed from the direction facing the scanner 110); the positive direction along the Y-axis (as shown in fig. 1) may point from the lower portion to the upper portion of the scanner 110; the positive direction along the Z-axis (as shown in fig. 1) may refer to the direction in which the subject target is moved out of the scanning bore of the scanner 110. Additional details of the scanner 110 are provided in the present disclosure, for example, with reference to fig. 2 and its description.

The network 120 may include any suitable network capable of facilitating the exchange of information and/or data for the medical system 100. In some embodiments, one or more components of the medical system 100 (e.g., the scanner 110, the user device 130, the processing device 140, or the storage device 150) may communicate information and/or data with one or more other components of the medical system 100 via the network 120. For example, the processing device 140 may acquire imaging data (e.g., magnetic resonance (MR) data) from the scanner 110 via the network 120. In another embodiment, the user device 130 may retrieve a spatial basis matrix and one or more temporal basis matrices corresponding to the imaging data from the storage device 150 and/or the processing device 140. In some embodiments, the network 120 may be any type of wired or wireless network, or combination thereof. The network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a long term evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. By way of example only, the network 120 may include a cable television network, a wired network, a fiber optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, etc., or any combination thereof. In some embodiments, the network 120 may include one or more network access points. 
For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or internet switching points, through which one or more components of the medical system 100 may connect to the network 120 to exchange data and/or information.

The user device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, a desktop computer (not shown), a workstation (not shown), etc., or any combination thereof. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, and the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device of a smart electrical device, a smart monitoring device, a smart television, a smart camera, an intercom, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart footwear, smart glasses, a smart helmet, a smart watch, a smart garment, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, and the like, or any combination thereof. For example, the virtual reality device and/or augmented reality device may include Google™ Glass, an Oculus Rift head-mounted display, a HoloLens, a Gear VR head-mounted display, and the like. In some embodiments, the user device 130 may remotely operate the scanner 110 and/or the processing device 140. In some embodiments, the user device 130 may operate the scanner 110 and/or the processing device 140 via a wireless connection. 
In some embodiments, the user device 130 may receive information and/or instructions input by a user and send the received information and/or instructions to the scanner 110 or the processing device 140 via the network 120. For example, a user of the medical system 100 (e.g., a doctor, technician, or engineer, etc.) may set up a scanning protocol via the user device 130. The user device 130 may transmit the scan protocol to the processing device 140 to instruct the processing device 140 to control the scanner 110 (e.g., an MRI scanner) to operate according to the scan protocol. In some embodiments, the user device 130 may receive data and/or information from the processing device 140 and/or the storage device 150. For example, the user device 130 may obtain a spatial basis matrix and one or more temporal basis matrices from the processing device 140 and/or the storage device 150. In another embodiment, the user device 130 may retrieve one or more images from the processing device 140 and/or the storage device 150.

The processing device 140 may process data and/or information obtained from the scanner 110, the user device 130, and/or the storage device 150. For example, the processing device 140 may acquire imaging data (e.g., MR data) from the scanner 110 and determine a spatial basis matrix and one or more temporal basis matrices based on the imaging data. In another embodiment, the processing device 140 may receive one or more instructions from the user device 130 and control the scanner 110 to operate according to the one or more instructions. In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data stored in the scanner 110, the user device 130, and/or the storage device 150, or information and/or data acquired by the scanner 110, the user device 130, and/or the storage device 150 via the network 120. As another example, the processing device 140 may be directly connected to the scanner 110 (as indicated by the dashed double-headed arrow connecting the processing device 140 and the scanner 110 in fig. 1), the user device 130 (as indicated by the dashed double-headed arrow connecting the processing device 140 and the user device 130 in fig. 1), and/or the storage device 150 to access stored or retrieved information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof. In some embodiments, the processing device 140 may be implemented on a computing device 300 having one or more of the components shown in FIG. 3 of the present invention.

The storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may include a database 151, a picture archiving and communication system (PACS) 152, a file system 153, or the like, or any combination thereof. In some embodiments, the storage device 150 may store data acquired from the scanner 110, the user device 130, and/or the processing device 140. For example, the storage device 150 may store imaging data (e.g., MR data) acquired by the scanner 110. In another embodiment, the storage device 150 may store medical images (e.g., MRI images) generated by the processing device 140 and/or the user device 130. In another embodiment, the storage device 150 may store a spatial basis matrix and one or more temporal basis matrices. In another embodiment, the storage device 150 may store an electronic medical record of the scan target. In another embodiment, the storage device 150 may store preset scan parameters (e.g., preset scan protocols) of the medical system 100. In some embodiments, the storage device 150 may store data and/or instructions that may be executed by the processing device 140 or used for performing the exemplary methods described in this disclosure. For example, the storage device 150 may store instructions executable by the processing device 140 for determining a spatial basis matrix and one or more temporal basis matrices based on the imaging data. In another embodiment, the storage device 150 may store instructions executable by the processing device 140 and/or the user device 130 for generating one or more images based on the spatial basis matrix and the one or more temporal basis matrices. In some embodiments, the storage device 150 may include a mass storage device, a removable storage device, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. For example, mass storage devices may include magnetic disks, optical disks, solid state drives, and the like. 
For example, the removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a compact disk, a magnetic tape, and so forth. The volatile read-write memory may include, for example, random access memory (RAM). For example, RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. For example, ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM), digital versatile disk ROM, and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.

In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the medical system 100 (e.g., the scanner 110, the processing device 140, the user device 130, etc.). One or more components of the medical system 100 may access data or instructions stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more components of the medical system 100 (e.g., the scanner 110, the processing device 140, the user device 130, etc.). In some embodiments, the storage device 150 may be part of the processing device 140.

In some embodiments, the medical system 100 may also include one or more power supplies (not shown in fig. 1) connected to one or more components of the medical system 100 (e.g., the scanner 110, the processing device 140, the user device 130, the storage device 150, etc.).

For the sake of brevity, the description of the method and/or system for storage and display of medical images is exemplified by MRI. For example, the following description of the scanner 110 may refer to an MRI scanner as an example. In another embodiment, the following description of the method and/or system for storage and display of medical images may refer to MR images as an example. It should be noted that the methods and/or systems for storage and display of MR images described below are merely some examples or implementations and are not intended to limit the scope of the present invention. It will be apparent to those skilled in the art that the method and/or system for storing and displaying MR images of the present invention can be applied to other similar single-modality or multi-modality imaging, including, for example, CT imaging, ultrasound imaging, MRI-CT imaging, etc. In some embodiments, the method and/or system for storage and display of medical images in the present invention may be applied to the storage and/or display of low-rank medical images.

Fig. 2 is a schematic diagram of an exemplary MRI scanner according to some embodiments of the present invention. The main magnet 201 may generate a first magnetic field (also referred to as a main magnetic field) applied to a subject target (also referred to as a scan target) exposed to the field. The main magnet 201 may comprise a resistive electromagnet or a superconducting electromagnet, both of which require a power supply (not shown) for operation. Optionally, the main magnet 201 may comprise a permanent magnet. The main magnet 201 may include a bore for receiving the scan target. The main magnet 201 may also control the homogeneity of the generated main magnetic field. Compensation coils may further be provided in the main magnet 201; they may be placed in the gaps of the main magnet 201 to compensate for inhomogeneities of its magnetic field, and may be powered by a compensation power supply.

Gradient coils 202 may be located within the main magnet 201. The gradient coils 202 may generate a second magnetic field (also referred to as a gradient field, including gradient fields Gx, Gy, and Gz). The second magnetic field may be superimposed on the main magnetic field generated by the main magnet 201, distorting the main magnetic field such that the magnetic orientations of the protons of the subject target vary with their positions in the gradient field, thereby encoding spatial information into the MR signals generated by the imaging region of the subject target. The gradient coils 202 may include X-axis coils (e.g., for generating a gradient field Gx corresponding to the X-direction), Y-axis coils (e.g., for generating a gradient field Gy corresponding to the Y-direction), and/or Z-axis coils (e.g., for generating a gradient field Gz corresponding to the Z-direction) (not shown in fig. 2). In some embodiments, the Z-axis coils may be based on a circular (Maxwell) coil design, while the X-axis coils and the Y-axis coils may be based on a saddle (Golay) coil design. The three sets of coils may generate three different magnetic fields for position encoding. The gradient coils 202 may allow spatial encoding of MR signals for image reconstruction. The gradient coils 202 may be connected to one or more of an X-axis gradient amplifier 204, a Y-axis gradient amplifier 205, or a Z-axis gradient amplifier 206. One or more of the three amplifiers may be connected to a waveform generator 216. The waveform generator 216 may generate gradient waveforms that are applied to the X-axis gradient amplifier 204, the Y-axis gradient amplifier 205, or the Z-axis gradient amplifier 206. The amplifier may amplify the waveform, and the amplified waveform may be applied to one of the coils in the gradient coils 202 to generate magnetic fields along the X-axis, Y-axis, and Z-axis, respectively. The gradient coils 202 may be designed for a closed-bore MRI scanner or an open-bore MRI scanner. 
In some cases, all three sets of the gradient coils 202 may be energized, thereby generating three gradient fields. In some embodiments of the invention, the X-axis coil and the Y-axis coil may be energized to generate gradient fields in the X-direction and the Y-direction. As used herein, the X-axis, Y-axis, Z-axis, X-direction, Y-direction, and Z-direction depicted in fig. 2 are the same or similar to those depicted in fig. 1.

In some embodiments, a radio frequency (RF) coil 203 may be located inside the main magnet 201 and used for RF transmission, RF reception, or both. The radio frequency coil 203 may be connected to radio frequency electronics 209, which may include one or more integrated circuits (ICs) functioning as a waveform transmitter and/or a waveform receiver. The radio frequency electronics 209 may be connected to a radio frequency power amplifier (RFPA) 207 and an analog-to-digital converter (ADC) 208.

When the radio frequency coil 203 is used for radio frequency transmission, it may generate radio frequency signals that provide a third magnetic field for generating MR signals related to the imaging region of the subject target. The third magnetic field may be perpendicular to the main magnetic field. The waveform generator 216 may generate radio frequency pulses. The radio frequency pulses may be amplified by the RFPA 207 and processed by the radio frequency electronics 209, and the radio frequency coil 203 may then emit radio frequency signals based on the strong current produced from the amplified radio frequency pulses.

When the radio frequency coil 203 is used for radio frequency reception, it may be responsible for detecting MR signals (e.g., echoes). After excitation, the MR signals generated by the subject target can be sensed by the radio frequency coil 203. A receive amplifier may then receive the sensed MR signals from the radio frequency coil 203, amplify them, and provide the amplified MR signals to the analog-to-digital converter 208. The analog-to-digital converter 208 may convert the MR signals from analog signals to digital signals. Finally, the digital MR signals may be sent to the processing device 140 for further processing.

In some embodiments, the gradient coils 202 and the radio frequency coils 203 may be positioned circumferentially with respect to the subject. Those skilled in the art will appreciate that the main magnet 201, the gradient coils 202, and the radio frequency coil 203 may be positioned in a variety of arrangements around the subject.

In some embodiments, the RFPA 207 may amplify a radio frequency pulse (e.g., the power or voltage of the radio frequency pulse) to generate an amplified radio frequency pulse for driving the radio frequency coil 203. The RFPA 207 may comprise a transistor-based RFPA, a vacuum tube-based RFPA, or the like, or any combination thereof. The transistor-based RFPA may include one or more transistors. The vacuum tube-based RFPA may include a triode, a tetrode, a klystron, etc., or any combination thereof. In some embodiments, the RFPA 207 may comprise a linear RFPA or a non-linear RFPA. In some embodiments, the RFPA 207 may include one or more RFPAs.

In some embodiments, the scanner 110 may also include a subject target positioning system (not shown). The subject target positioning system may include a subject target support and a transport apparatus. The subject may be placed on the subject support and positioned by the transport device within the bore of the main magnet 201.

An MRI system (e.g., the medical system 100 disclosed in the present invention) may generally be used to acquire internal images of a particular region of interest (ROI) of a scan target. An MRI system includes a main magnet (e.g., the main magnet 201) for providing a strong and uniform main magnetic field to align the individual magnetic moments of hydrogen atoms within the body of the scan target. In this process, the hydrogen atoms precess about the main field direction at their characteristic Larmor frequency. If an additional magnetic field tuned to the Larmor frequency is applied to the tissue, the hydrogen atoms absorb additional energy and the net aligned moment of the hydrogen atoms rotates. The additional magnetic field may be provided by a radio frequency excitation signal (e.g., a radio frequency signal generated by the radio frequency coil 203). When the additional magnetic field is removed, the magnetic moments of the hydrogen atoms rotate back into alignment with the main magnetic field, thereby emitting an MR signal. The MR signals are received and processed to form an MR image. T1 relaxation is the process by which the net magnetization increases/recovers toward its initial maximum parallel to the main magnetic field; T1 is the time constant of longitudinal magnetization regrowth (e.g., along the main magnetic field direction). T2 relaxation is the process by which the transverse component of the magnetization decays or dephases; T2 is the time constant of transverse magnetization decay/dephasing.
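The Larmor relationship and the two relaxation time constants described above can be sketched numerically. The gyromagnetic ratio used is the standard textbook value for hydrogen; the field strength and time constants in the example call are illustrative assumptions, not parameters of the system:

```python
import math

GAMMA_BAR_H = 42.577478e6  # hydrogen gyromagnetic ratio / (2*pi), in Hz per tesla

def larmor_frequency_hz(b0_tesla):
    """Precession frequency of hydrogen nuclei in a main field B0 (tesla)."""
    return GAMMA_BAR_H * b0_tesla

def mz_recovery(t, t1, m0=1.0):
    """Longitudinal magnetization regrowth after excitation (T1 relaxation)."""
    return m0 * (1.0 - math.exp(-t / t1))

def mxy_decay(t, t2, m0=1.0):
    """Transverse magnetization decay/dephasing (T2 relaxation)."""
    return m0 * math.exp(-t / t2)

print(larmor_frequency_hz(1.5) / 1e6)  # ~63.9 MHz at an assumed 1.5 T field
```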

If the main magnetic field were uniformly distributed throughout the body of the scan target, the radio frequency excitation signal would non-selectively excite all hydrogen atoms in the sample. Thus, to image a particular portion of the scan target's body, magnetic field gradients Gx, Gy, and Gz in the x, y, and z directions (e.g., generated by the gradient coils 202), having particular timing, frequency, and phase, may be superimposed on the uniform magnetic field such that the radio frequency excitation signal excites hydrogen atoms in a desired slice of the scan target's body, and particular phase and frequency information is encoded in the MR signals according to the locations of the hydrogen atoms in the "image slice".
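The effect of superimposing a gradient on the uniform main field can be sketched as follows: with a gradient Gx applied, the Larmor frequency becomes a linear function of position, which is what lets a frequency-selective pulse address only one location or slice. The field strength and gradient amplitude below are illustrative assumptions:

```python
GAMMA_BAR_H = 42.577478e6  # hydrogen gyromagnetic ratio / (2*pi), Hz per tesla

def local_frequency_hz(b0, gx, x):
    """Larmor frequency at position x (m) when a gradient gx (T/m) is
    superimposed on the main field b0 (T): f(x) = gamma_bar * (B0 + Gx*x)."""
    return GAMMA_BAR_H * (b0 + gx * x)

# Illustrative values: 1.5 T main field, 10 mT/m gradient
b0, gx = 1.5, 10e-3
f_center = local_frequency_hz(b0, gx, 0.0)
f_offset = local_frequency_hz(b0, gx, 0.05)   # 5 cm off isocenter
print((f_offset - f_center) / 1e3)  # ~21.3 kHz shift over 5 cm
```

This position-to-frequency mapping is the basis of the frequency encoding mentioned above; phase encoding applies the same idea through the gradient's time integral.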

Typically, the portion of the scan target to be imaged is scanned over a series of measurement cycles in which the radio frequency excitation signals and the magnetic field gradients Gx, Gy, and Gz vary according to the MRI imaging protocol being used. The protocol may be designed for one or more tissues to be imaged, diseases, and/or clinical protocols. The protocol may include a specific number of pulse sequences in different planes and/or with different parameters. The pulse sequences may include a spin echo sequence, a gradient echo sequence, a diffusion sequence, an inversion recovery sequence, a saturation recovery sequence, and the like, or any combination thereof. For example, the spin echo sequence may include a fast spin echo (FSE) pulse sequence, a turbo spin echo (TSE) pulse sequence, a rapid acquisition with relaxation enhancement (RARE) pulse sequence, a half-Fourier acquisition single-shot turbo spin echo (HASTE) pulse sequence, a turbo gradient spin echo (TGSE) pulse sequence, and the like, or any combination thereof. In another embodiment, the gradient echo sequence may include a balanced steady-state free precession (bSSFP) pulse sequence, a spoiled gradient echo (GRE) pulse sequence, an echo planar imaging (EPI) pulse sequence, a steady-state free precession (SSFP) pulse sequence, or the like, or any combination thereof. 
The protocol may also include information about image contrast and/or ratio, the region of interest (ROI), slice thickness, the type of imaging (e.g., T1-weighted imaging, T2-weighted imaging, proton density-weighted imaging, etc.), T1, T2, the echo type (spin echo, fast spin echo (FSE), fast recovery FSE, single-shot FSE, gradient echo, fast imaging with steady-state precession, etc.), the flip angle value, the acquisition time (TA), the echo time (TE), the repetition time (TR), the echo train length (ETL), the number of phases, the number of excitations (NEX), the inversion time, the bandwidth (e.g., radio frequency receive bandwidth, radio frequency transmit bandwidth, etc.), or any combination thereof.
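For the gradient echo family mentioned above, the interplay of the protocol parameters TR, TE, and flip angle can be sketched with the well-known steady-state spoiled gradient echo signal expression. The tissue constants and timing values in the example call are illustrative textbook-style assumptions, not values from any particular protocol:

```python
import math

def spgr_signal(tr, te, flip_deg, t1, t2_star, m0=1.0):
    """Steady-state spoiled gradient echo signal:
    S = M0 * sin(a) * (1 - E1) / (1 - cos(a) * E1) * exp(-TE / T2*),
    with E1 = exp(-TR / T1). All times in seconds.
    """
    a = math.radians(flip_deg)
    e1 = math.exp(-tr / t1)
    return m0 * math.sin(a) * (1 - e1) / (1 - math.cos(a) * e1) * math.exp(-te / t2_star)

def ernst_angle_deg(tr, t1):
    """Flip angle that maximizes the spoiled GRE signal for a given TR/T1."""
    return math.degrees(math.acos(math.exp(-tr / t1)))

# Illustrative values: TR 10 ms, T1 900 ms
print(round(ernst_angle_deg(0.010, 0.900), 1))  # ~8.5 degrees
```

This illustrates why a protocol specifies the flip angle together with TR: for short-TR gradient echo imaging, the optimal flip angle is far below 90 degrees.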

For each MRI scan, the resulting MR signals (also referred to as MR data) may be digitized and processed to reconstruct images in accordance with the MRI imaging protocol used.

Fig. 3 is a schematic diagram of exemplary hardware and/or software components of a computing device for implementing processing device 140 according to some embodiments of the present invention. As shown in FIG. 3, computing device 300 may include a processor 310, memory 320, input/output (I/O)330, and communication ports 340.

The processor 310 may execute computer instructions (e.g., program code) and perform the functions of the processing device 140 in accordance with the techniques described herein. The computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions that perform the particular functions described herein. For example, the processor 310 may determine a spatial basis matrix and one or more temporal basis matrices corresponding to the MR data. In some embodiments, the processor 310 may include a microcontroller, a microprocessor, a Reduced Instruction Set Computer (RISC), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction set Processor (ASIP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a microcontroller unit, a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), an Advanced RISC Machine (ARM), a Programmable Logic Device (PLD), any circuit or processor capable of performing one or more functions, or the like, or any combination thereof.

For illustration only, only one processor is depicted in the computing device 300. It should be noted, however, that the computing device 300 of the present invention may also include multiple processors. Thus, operations and/or method steps described in the present invention as being performed by one processor may also be performed jointly or separately by multiple processors. For example, if in the present invention the processor of the computing device 300 performs operations A and B, it should be understood that operations A and B may also be performed by two or more different processors of the computing device 300, either jointly or separately (e.g., a first processor performing operation A and a second processor performing operation B, or the first processor and the second processor jointly performing operations A and B).

For example only, the processor 310 may receive instructions to follow an MRI scan protocol for imaging/scanning the subject. For example, the processor 310 may instruct the subject positioning system of the scanner 110 to move the subject to an appropriate position within the bore of the main magnet 201. In another embodiment, the processor 310 may also provide specific control signals to control the main magnet 201 to generate a main magnetic field having a specific strength.

The processor 310 may receive control signals to set the shape, amplitude, and/or timing of the gradient waveforms and/or the shape, amplitude, and/or timing of the RF waveforms, and may send the set parameters to the waveform generator 216 to instruct the waveform generator 216 to generate specific gradient waveform sequences and pulse sequences, which are applied to the gradient coil 202 and the RF coil 203 via the amplifiers 204 and 207, respectively.

The processor 310 may also sample data (e.g., echoes) from the RF coil 203 based on one or more sampling parameters including, for example, timing information (e.g., the length of data acquisition), the type of K-space data acquisition (e.g., undersampling, oversampling, etc.), the sampling trajectory (e.g., a Cartesian trajectory, or a non-Cartesian trajectory such as a spiral trajectory, a radial trajectory, etc.), or the like, or any combination thereof. In some embodiments, the timing information may be input by a user (e.g., an operator) or determined autonomously by the medical system 100 based on one or more other parameters of the imaging procedure (e.g., clinical needs). The timing information may correspond to the types of gradient waveforms and RF waveforms sent to the gradient coil 202 and the RF coil 203, respectively, so that the MR signals may be correctly sampled. The processor 310 may also generate an MR image by reconstructing the sampled data.
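For illustration only, the Cartesian and radial sampling trajectories mentioned above may be sketched as normalized k-space sample coordinates (a toy Python/NumPy example; the function names and array sizes are illustrative assumptions, not part of any scanner interface):

```python
import numpy as np

def cartesian_trajectory(n_kx, n_ky, undersample=1):
    """Normalized Cartesian k-space sample coordinates; keeping every
    `undersample`-th phase-encode line models undersampling."""
    kx = np.linspace(-0.5, 0.5, n_kx, endpoint=False)
    ky = np.linspace(-0.5, 0.5, n_ky, endpoint=False)[::undersample]
    return np.stack(np.meshgrid(kx, ky, indexing="ij"), axis=-1)

def radial_trajectory(n_spokes, n_readout):
    """Radial spokes through the k-space center, evenly spaced in angle."""
    angles = np.arange(n_spokes) * np.pi / n_spokes
    r = np.linspace(-0.5, 0.5, n_readout)
    kx = np.cos(angles)[:, None] * r[None, :]
    ky = np.sin(angles)[:, None] * r[None, :]
    return np.stack([kx, ky], axis=-1)

full = cartesian_trajectory(8, 8)                   # shape (8, 8, 2)
under = cartesian_trajectory(8, 8, undersample=2)   # shape (8, 4, 2)
spokes = radial_trajectory(16, 8)                   # shape (16, 8, 2)
```

Undersampling a Cartesian trajectory simply drops phase-encode lines, whereas the radial trajectory revisits the k-space center on every spoke, which is why it tolerates undersampling of dynamic data comparatively well.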

The memory 320 may store data/information obtained from the scanner 110, the user device 130, the storage device 150, or any other component of the medical system 100. In some embodiments, the memory 320 may include a mass storage device, a removable storage device, volatile read-write memory, read-only memory (ROM), etc., or any combination thereof. For example, the mass storage device may include a magnetic disk, an optical disk, a solid state drive, and the like. The removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a compact disk, a magnetic tape, etc. The volatile read-write memory may include Random Access Memory (RAM). The RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor RAM (T-RAM), Zero-capacitor RAM (Z-RAM), and the like. The ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disk ROM (CD-ROM), Digital Versatile Disk ROM, and the like. In some embodiments, the memory 320 may store one or more programs and/or instructions to perform the exemplary methods described in this disclosure. For example, the memory 320 may store a program for the processing device 140 for determining a spatial basis matrix and one or more temporal basis matrices based on MR data. In some embodiments, the memory 320 may store the reconstructed MRI images and/or the spatial basis matrix and the one or more temporal basis matrices.

The input/output (I/O) 330 may input and/or output signals, data, or information. In some embodiments, the I/O 330 may allow a user to interact with the processing device 140. In some embodiments, the I/O 330 may include an input device and an output device. For example, the input device may include a keyboard, mouse, touch screen, microphone, trackball, etc., or any combination thereof. For example, the output devices may include a display device, speakers, printer, projector, etc., or any combination thereof. For example, the display device may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) based display, a flat panel display, a curved screen, a television device, a Cathode Ray Tube (CRT), etc., or any combination thereof.

The communication port 340 may be connected to a network (e.g., the network 120) to facilitate data communication. The communication port 340 may establish a connection between the processing device 140 and the scanner 110, the user device 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination of both to enable data transmission and reception. The wired connection may include an electrical cable, an optical cable, a telephone line, etc., or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, 5G, etc.), the like, or any combination thereof. In some embodiments, the communication port 340 may comprise a standardized communication port, such as RS232, RS485, and the like. In some embodiments, the communication port 340 may be a specially designed communication port; for example, the communication port 340 may be designed according to the Digital Imaging and COmmunications in Medicine (DICOM) protocol.

Fig. 4 is a schematic diagram of the hardware and/or software components of an exemplary mobile device 400 for implementing the user device 130 in accordance with some embodiments of the present invention. As shown in FIG. 4, the mobile device 400 may include a communication platform 410, a display 420, a Graphics Processing Unit (GPU) 430, a Central Processing Unit (CPU) 440, an I/O 450, a memory 460, and a storage 490. In some embodiments, the mobile device 400 may also include any other suitable components, such as a system bus or a controller (not shown). In some embodiments, a mobile operating system 470 (e.g., iOS, Android, Windows Phone, etc.) and one or more application programs 480 may be loaded from the storage 490 into the memory 460 for execution by the CPU 440. The application programs 480 may include a browser or any other suitable mobile application for receiving and presenting information related to image processing, or other information, from the processing device 140. User interaction with information flows may be enabled through the I/O 450 and provided to the processing device 140 and/or other components of the medical system 100 via the network 120. By way of example only, a user of the medical system 100 (e.g., a doctor, technician, engineer, operator, etc.) may enter, via the I/O 450, data relating to a subject that is being, or is to be, imaged/scanned. The data relating to the subject may include identification information (e.g., name, age, gender, medical history, contact information, physical examination results, etc.) and/or test information including the attributes necessary to perform an MRI scan.
The user may also input parameters required to operate the scanner 110, such as image contrast and/or ratio, a region of interest (ROI), slice thickness, the type of imaging (e.g., T1 weighted imaging, T2 weighted imaging, proton density weighted imaging, etc.), T1, T2, the echo type (spin echo, fast spin echo (FSE), fast recovery FSE, single shot FSE, gradient echo, fast imaging with steady state precession, etc.), a flip angle value, acquisition time (TA), echo time (TE), repetition time (TR), inversion time (TI), saturation time (TS), echo train length (ETL), the number of phases, the number of excitations (NEX), bandwidth (e.g., radio frequency receiver bandwidth, radio frequency transmitter bandwidth, etc.), the scan type, the sampling type, the point in time at which MR data is acquired (e.g., cardiac phase, respiratory phase, etc.), etc., or any combination thereof. The I/O 450 may also display the MR images generated based on the sampled data.

In some embodiments, the I/O 450 may include input devices and output devices. For example, the input device may include a keyboard, mouse, touch screen, microphone, trackball, etc., or any combination thereof. For example, the output devices may include a display device, speakers, printer, projector, etc., or any combination thereof. For example, the display device may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) based display, a flat panel display, a curved screen, a television device, a Cathode Ray Tube (CRT), etc., or any combination thereof.

To implement the various modules, units, and their functions described in this disclosure, a computer hardware platform may be used as the hardware platform for one or more of the elements described herein. The hardware elements, operating systems, and programming languages of such computers are conventional in nature and well known to those skilled in the art, and are adequate for the functions described herein. A computer with user interface elements may be used to implement a Personal Computer (PC) or other type of workstation or terminal device, and, when suitably programmed, the computer may also act as a server. The structure, programming, and general operation of such computer devices are well known to those skilled in the art and, accordingly, the drawings should be self-explanatory.

Fig. 5 is a block diagram of an exemplary processing device in accordance with some embodiments of the invention. The processing device 140 may include a first input/output (I/O) module 510, a matrix determination module 520, and a first reconstruction module 530.

The first input/output module 510 may acquire Magnetic Resonance (MR) data associated with a region of interest (ROI) of a subject. The MR data may be acquired by scanning the ROI with an MRI scanner (e.g., the scanner 110 of the medical system 100 in fig. 1) based on a plurality of parameters. The MR data may correspond to a plurality of time series images of the ROI. In some embodiments, the ROI may be one or more regions or volumes of the subject. In some embodiments, the plurality of time series images may represent one or more kinetic parameters of the ROI, such as a T1 recovery parameter, a T2 decay parameter, a cardiac motion parameter, a respiratory motion parameter, a contrast agent kinetic parameter, or the like, or any combination thereof.

In some embodiments, different values of the plurality of parameters may be set by the user device 130. In some embodiments, the user device 130 may send instructions to the processing device 140 that include the different values of the plurality of parameters to instruct the processing device 140 to control the scanner 110 to scan the ROI based on those values. In some embodiments, the user device 130 may send instructions regarding one or more related imaging protocols (e.g., a protocol for brain imaging, a protocol for cardiac imaging, a protocol for lung imaging, etc.). The processing device 140 may retrieve the plurality of parameters and/or the different values of the plurality of parameters from a storage device (e.g., the storage device 150 and/or the memory 320 of the processing device 140) according to the instructions. In some embodiments, the processing device 140 may obtain an imaging plan (e.g., from a command provided by a user or a command retrieved from a storage device), based upon which the processing device 140 may obtain (e.g., from a storage device) the plurality of parameters corresponding to the imaging plan and/or the different values of the plurality of parameters. The different values of the plurality of parameters may enable imaging of one or more kinetic parameters of the ROI.

In some embodiments, the plurality of parameters may include the cardiac phase, the respiratory phase, imaging sequence parameters (e.g., inversion time (TI), saturation time (TS), etc.), and the like, or any combination thereof. In some embodiments, the cardiac phase may be indicative of a state of the heart of the subject at a certain point in time within a cardiac cycle (e.g., one heartbeat). For example, the cardiac phase may include an end diastole phase, an end systole phase, and the like. If the ROI comprises at least a part of the heart, different cardiac phases (e.g., different points in time corresponding to different cardiac phases) may be set in order to enable imaging of the dynamics of the cardiac motion of the ROI.

In some embodiments, the respiratory phase may be indicative of a state of a lung of the subject at a point in time in a respiratory cycle (e.g., one breath). For example, the respiratory phase may include an end-expiratory phase, an end-inspiratory phase, and the like. If the ROI comprises at least a portion of the lung, different respiratory phases (e.g., different points in time corresponding to different respiratory phases) may be set so as to enable imaging of the dynamics of the respiratory motion of the ROI.

In an inversion recovery sequence, a 180° RF pulse may be applied prior to the excitation RF pulse to invert the magnetization to the negative longitudinal direction. The interval between the 180° RF pulse and the excitation RF pulse may be referred to as the inversion time. In a saturation recovery sequence, a 90° RF pulse may be applied prior to the excitation RF pulse to rotate the magnetization into the transverse plane. The interval between the 90° RF pulse and the excitation RF pulse may be referred to as the saturation time. In some embodiments, different inversion times or saturation times may be set so as to image the dynamics of the T1 recovery of the ROI.
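For illustration only, the longitudinal magnetization as a function of the inversion time or the saturation time may be sketched with the standard relaxation expressions (a Python/NumPy example assuming ideal pulses, full recovery between repetitions, and a single illustrative T1 value):

```python
import numpy as np

def inversion_recovery(ti, t1, m0=1.0):
    # Longitudinal magnetization a time `ti` after an ideal 180° pulse:
    # Mz(TI) = M0 * (1 - 2 * exp(-TI / T1))
    return m0 * (1.0 - 2.0 * np.exp(-ti / t1))

def saturation_recovery(ts, t1, m0=1.0):
    # Longitudinal magnetization a time `ts` after an ideal 90° pulse:
    # Mz(TS) = M0 * (1 - exp(-TS / T1))
    return m0 * (1.0 - np.exp(-ts / t1))

t1 = 1.0                                    # seconds, illustrative
ti = np.array([0.0, t1 * np.log(2.0), 5.0])
mz = inversion_recovery(ti, t1)
# mz[0] = -1 (fully inverted); mz[1] ≈ 0 (null point at TI = T1·ln 2);
# mz[2] ≈ 1 (almost fully recovered)
```

Sampling many such TI (or TS) values is what produces the T1-recovery dimension of the time series images discussed above.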

In some embodiments, the first input/output module 510 may acquire the MR data from the scanner 110 or the storage device 150.

The matrix determination module 520 may determine a data set based on the MR data, wherein the data set includes a spatial basis and one or more temporal bases. The spatial basis may comprise spatial information of the MR data. The one or more temporal bases may include timing information of the MR data. In some embodiments, the spatial basis or a temporal basis may include a function, a model, a vector, a matrix, a tensor, or the like, or any combination thereof. In some embodiments, the matrix determination module 520 may determine the spatial basis and the one or more temporal bases based on the MR data and a low rank model. In some embodiments, the low rank model may represent the plurality of time series images. In some embodiments, the low rank model may represent that the correlation between the plurality of time series images is low rank. As used herein, low rank means that the rank of the model (e.g., a two-dimensional (2D) matrix or a multidimensional tensor) is less than the number (or count) of elements along any single dimension of the model (e.g., the number of columns and rows of a matrix). Further, low rank may mean that the rank of the model is much smaller than the number (or count) of elements along any single dimension of the model. For example, the rank of the model may be less than 50%, 40%, 30%, 20%, 10%, etc. of the smallest number (or count) of elements along any dimension of the model. In some embodiments, the low rank model may take the form of a spatial basis and one or more temporal bases corresponding to the acquired MR data. Due to the low rank property, the spatial basis and the one or more temporal bases may have a smaller file size or data size than the plurality of time series images. In the present invention, the spatial basis and the one or more temporal bases may be stored instead of all of the plurality of time series images, so that the occupied storage space may be reduced.

In some embodiments, the low rank model may comprise a low rank matrix. The spatial basis may comprise a spatial basis matrix. The one or more temporal bases may comprise a temporal basis matrix.
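For illustration only, the matrix case may be sketched with a truncated singular value decomposition (SVD) of a Casorati matrix, in which each column is one vectorized frame of the time series (a toy NumPy example; the matrix sizes and the rank L are illustrative assumptions):

```python
import numpy as np

# Toy Casorati matrix: each column is one vectorized frame of the time
# series (n_pixels x n_frames); built to have exact rank L for the demo.
rng = np.random.default_rng(0)
n_pixels, n_frames, L = 400, 100, 3
F = rng.standard_normal((n_pixels, L)) @ rng.standard_normal((L, n_frames))

# Truncated SVD: spatial basis U_s (n_pixels x L) and temporal basis
# V_t (L x n_frames); the singular values are absorbed into V_t.
U, s, Vt = np.linalg.svd(F, full_matrices=False)
U_s = U[:, :L]
V_t = s[:L, None] * Vt[:L, :]

F_hat = U_s @ V_t  # frames recovered from the two stored bases
rel_err = np.linalg.norm(F - F_hat) / np.linalg.norm(F)

# Storage: L*(n_pixels + n_frames) numbers instead of n_pixels*n_frames.
stored = L * (n_pixels + n_frames)
```

Storing U_s and V_t instead of F is exactly the storage-reduction idea described above; for real (not exactly low-rank) data the truncation introduces a small, controllable approximation error.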

In some embodiments, the low rank model may comprise a low rank tensor. The data set may also include a core tensor. The spatial basis may comprise a spatial basis matrix. The one or more temporal bases may include two or more temporal basis matrices. The low rank tensor may be a multidimensional tensor having a spatial dimension, which includes the pixel (or voxel) locations in the plurality of time series images, and two or more temporal dimensions, where each temporal dimension corresponds to the ROI imaged according to a set of parameter values of the plurality of parameters.
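For illustration only, one way to obtain a core tensor, a spatial basis matrix, and per-dimension temporal basis matrices is a truncated higher-order SVD (a toy NumPy sketch; the invention does not mandate this particular algorithm, and the tensor sizes and ranks are illustrative assumptions):

```python
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_dot(t, m, mode):
    """Multiply tensor `t` by matrix `m` along `mode`."""
    return np.moveaxis(np.tensordot(m, np.moveaxis(t, mode, 0), axes=1),
                       0, mode)

def hosvd(t, ranks):
    """Truncated higher-order SVD: per-mode bases plus a core tensor."""
    factors = [np.linalg.svd(unfold(t, mode), full_matrices=False)[0][:, :r]
               for mode, r in enumerate(ranks)]
    core = t
    for mode, u in enumerate(factors):
        core = mode_dot(core, u.T, mode)
    return core, factors

def tucker_reconstruct(core, factors):
    t = core
    for mode, u in enumerate(factors):
        t = mode_dot(t, u, mode)
    return t

# Toy low-multilinear-rank tensor: space x cardiac phase x inversion time.
rng = np.random.default_rng(1)
ranks = (4, 2, 2)
truth = tucker_reconstruct(
    rng.standard_normal(ranks),
    [rng.standard_normal((n, r)) for n, r in zip((50, 10, 8), ranks)])
core, factors = hosvd(truth, ranks)
rel_err = (np.linalg.norm(truth - tucker_reconstruct(core, factors))
           / np.linalg.norm(truth))
```

Here `factors[0]` plays the role of the spatial basis matrix, the remaining factors play the role of the temporal basis matrices, and `core` is the core tensor coupling them.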

The first reconstruction module 530 may reconstruct at least a portion of the images of the plurality of time series images based on the data set. In some embodiments, the first reconstruction module 530 may reconstruct at least a portion of the images of the plurality of time series images based on the spatial basis and the one or more temporal bases. In some embodiments, the first reconstruction module 530 may reconstruct at least a portion of the images of the plurality of time series images based on the spatial basis matrix and the one or more temporal basis matrices. In some embodiments, the reconstructed images may include at least one set of images of the plurality of time series images representing a change of parameter values of the plurality of parameters over time (e.g., the dynamics of the ROI). For example, as shown in column A1 of fig. 7B, the reconstructed images may include a first set of images 702 representing the respiratory motion of the ROI, a second set of images 703 representing the cardiac motion of the ROI, and a third set of images 704 representing the T1 recovery of the ROI. In another embodiment, the first reconstruction module 530 may generate a fourth set of images representing the T1 recovery of the ROI and a fifth set of images representing the cardiac motion of the ROI. The first reconstruction module 530 may determine the T1 value corresponding to each pixel or voxel of the ROI based on the fourth set of images. The first reconstruction module 530 may convert the fifth set of images to false color images (e.g., the image 705 shown in column A2 in fig. 7B) based on the T1 values corresponding to the pixels or voxels of the ROI. In some embodiments, the first reconstruction module 530 may assign a color value to each T1 value. For example only, the T1 values of the pixels or voxels of the fifth set of images may be in the range of 0-3 seconds. The first reconstruction module 530 may determine a color bar 701 (e.g., as shown in column A2 in fig. 7B), where the color bar 701 includes the color values assigned to the T1 values. The first reconstruction module 530 may generate the false color images by converting the grayscale value of each pixel or voxel in the fifth set of images into the corresponding color value. The false color images may represent both the T1 values and the cardiac motion of the ROI. In another embodiment, the reconstructed images may include one or more other images (e.g., one or more other images shown in column A3 of fig. 7B) of the plurality of time series images that are of interest to the user, such as one or more end diastole or end systole images.
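For illustration only, the conversion of per-pixel T1 values to color values via a color bar may be sketched as a lookup table (a toy NumPy example; the blue-to-red ramp and the 0-3 second range are illustrative assumptions, not the color bar 701 itself):

```python
import numpy as np

def t1_to_color(t1_map, t1_max=3.0, n_colors=256):
    """Map per-pixel T1 values (seconds) to RGB via a color bar lookup.

    The color bar here is a simple blue-to-red ramp over 0..t1_max
    seconds; a real implementation would use a clinically chosen map.
    """
    ramp = np.linspace(0.0, 1.0, n_colors)
    colorbar = np.stack([ramp, np.zeros(n_colors), 1.0 - ramp], axis=-1)
    idx = np.clip(t1_map / t1_max * (n_colors - 1), 0, n_colors - 1)
    return colorbar[idx.astype(int)]

t1_map = np.array([[0.0, 1.5],
                   [3.0, 2.0]])     # seconds, illustrative
rgb = t1_to_color(t1_map)           # shape (2, 2, 3)
```

Applying the same lookup to every frame of a cine series yields false color images that encode T1 while still showing cardiac motion.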

In some embodiments, a reconstruction instruction may be set in a scan protocol by, for example, the user device 130, where the reconstruction instruction indicates which images of the plurality of time series images need to be reconstructed. For example, the reconstruction instructions may include a value of at least one of the plurality of parameters corresponding to each portion of the plurality of time series images. For example, the ROI of the subject may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. If it is desired to reconstruct a set of images from the plurality of time series images that represents the T1 recovery of the ROI, the reconstruction instructions may specify one of the 20 cardiac phases, one of the 5 respiratory phases, and all 344 inversion times.
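For illustration only, such a reconstruction instruction may be represented as index selections over the stored temporal bases, fixing some dynamic dimensions and sweeping others (a toy NumPy example; the variable names and the rank are illustrative assumptions):

```python
import numpy as np

# Hypothetical acquisition from the example above:
# 20 cardiac phases x 5 respiratory phases x 344 inversion times.
n_cardiac, n_resp, n_ti = 20, 5, 344
rank = 3  # illustrative rank of each temporal basis

# Placeholder temporal basis matrices, one per dynamic dimension.
rng = np.random.default_rng(0)
phi_cardiac = rng.standard_normal((n_cardiac, rank))
phi_resp = rng.standard_normal((n_resp, rank))
phi_ti = rng.standard_normal((n_ti, rank))

# Reconstruction instruction for a T1-recovery series: fix one cardiac
# phase and one respiratory phase, keep all inversion times.
instruction = {"cardiac": 0, "resp": 2, "ti": slice(None)}
sel_cardiac = phi_cardiac[instruction["cardiac"]]  # one row of the basis
sel_resp = phi_resp[instruction["resp"]]           # one row of the basis
sel_ti = phi_ti[instruction["ti"]]                 # all 344 rows
```

Only the selected rows of the temporal bases need to be combined with the spatial basis, so reconstructing one dynamic series avoids regenerating all 20 x 5 x 344 frames.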

In some embodiments, the reconstruction instructions may be acquired according to default settings of the medical system 100. In some embodiments, the reconstruction instructions may be set manually by a user, or automatically by the user device 130 based on, for example, an imaging protocol or plan (e.g., instructions) for the ROI and/or a clinical need of the user and/or the subject. For example, for T1 mapping, the reconstruction instructions may instruct reconstruction of a set of images representing the T1 recovery of the ROI.

In some embodiments, the first reconstruction module 530 may reconstruct at least a portion of the images of the plurality of time series images based on steps 930 and 940 of the flow 900 in fig. 9.

In some embodiments, the first input/output module 510 may send the reconstructed images to the user device 130 and instruct the user device 130 to display the reconstructed images in an interface of the user device 130. For example, as shown in fig. 7B, the reconstructed images may include a first set of images 702 representing the respiratory motion of the ROI, a second set of images 703 representing the cardiac motion of the ROI, a third set of images 704 representing the T1 recovery of the ROI, a set of false color images 705 representing the T1 values and the cardiac motion of the ROI, and one or more other images of the plurality of time series images that are of interest to the user. The user device 130 may display the first set of images 702, the second set of images 703, and the third set of images 704 in column A1. The user device 130 may display the set of false color images 705 in column A2. The user device 130 may display the one or more other images of interest to the user in column A3. In some embodiments, the user device 130 may simultaneously display at least one of column A1, column A2, and column A3 in the interface of the user device 130.

The first input/output module 510 may store the reconstructed images and at least a portion of the data set in a storage device. In some embodiments, the first input/output module 510 may store the spatial basis and the one or more temporal bases in the storage device. In some embodiments, the first input/output module 510 may store the spatial basis matrix and the temporal basis matrix in the storage device. In some embodiments, the first input/output module 510 may store the spatial basis matrix, the two or more temporal basis matrices, and the core tensor in the storage device. In some embodiments, the first input/output module 510 may store a combination (e.g., product) of the core tensor, the spatial basis matrix, and at least one of the two or more temporal basis matrices in the storage device. Further, the first input/output module 510 may store a combination (e.g., product) of the core tensor and the spatial basis matrix in the storage device. In some embodiments, the storage device may include the storage device 150 and/or the memory 320 of the processing device 140. In some embodiments, a user may employ the user device 130 to access the storage device to acquire the reconstructed images and the stored data set.
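For illustration only, the storage saving from keeping the bases instead of the full image series can be estimated from element counts (a back-of-the-envelope Python sketch; the 192 x 192 in-plane matrix size and the rank of 32 are illustrative assumptions, while the dynamic dimensions follow the 20 x 5 x 344 example above):

```python
# Element counts for a hypothetical acquisition: 192x192 in-plane matrix,
# 20 cardiac phases x 5 respiratory phases x 344 inversion times, and a
# rank-32 decomposition (all illustrative numbers).
n_pixels = 192 * 192
n_frames = 20 * 5 * 344
L = 32

full_elems = n_pixels * n_frames            # storing every frame
basis_elems = L * (n_pixels + n_frames)     # spatial + temporal bases
ratio = basis_elems / full_elems            # well under 1% here
```

The ratio scales as L * (n_pixels + n_frames) / (n_pixels * n_frames), so the saving grows as either the image size or the number of frames increases relative to the rank.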

The modules in the processing device 140 may be connected to or communicate with each other through a wired connection or a wireless connection. The wired connection may include an electrical cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may include a Local Area Network (LAN), a Wide Area Network (WAN), Bluetooth, ZigBee, Near Field Communication (NFC), etc., or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. For example, the first input/output module 510 may be divided into two units. One of the two units may be used to acquire MR data from the scanner 110, and the other may be used to send the spatial basis matrix and the one or more temporal basis matrices to the storage device 150 and/or to send the reconstructed images to the user device 130 for display.

It should be noted that the above description is only intended to illustrate the present invention and is not intended to limit the scope of the present invention. Many variations and modifications are possible to those skilled in the art in light of the teachings of this invention. However, such changes and modifications do not depart from the scope of the present invention. For example, the processing device 140 may also include a storage module (not shown in fig. 5). The storage module may be used to store data generated by any process performed by any component of the processing device 140. In another embodiment, each component of the processing device 140 may include a storage device. Alternatively, the components of the processing device 140 may share a common memory device. In another embodiment, the first reconstruction module 530 may be omitted.

FIG. 6 is a flow chart of an exemplary process for MRI reconstruction of some embodiments of the present invention. In some embodiments, the process 600 may be implemented in the medical system 100 shown in FIG. 1. For example, the flow 600 may be stored in a storage medium (e.g., the storage device 150 or the memory 320 of the processing device 140) in the form of instructions and may be invoked and/or executed by the processing device 140 (e.g., the processor 310 of the processing device 140, or one or more modules in the processing device 140 shown in fig. 5). The steps of the illustrated flow 600 presented below are intended to be illustrative. In some embodiments, the flow 600 may be accomplished with one or more additional steps not described and/or without employing one or more of the steps discussed. Additionally, the order of the steps of flow 600 shown in FIG. 6 and described below is not limiting.

In step 610, the processing device 140 (e.g., the first input/output module 510) may acquire Magnetic Resonance (MR) data associated with a region of interest (ROI) of a subject. The MR data may be acquired by scanning the ROI with an MRI scanner (e.g., the scanner 110 of the medical system 100 in fig. 1) based on a plurality of parameters. The MR data may correspond to a plurality of time series images of the ROI. In some embodiments, the ROI may be one or more regions or volumes of the subject. In some embodiments, the plurality of time series images may represent one or more kinetic parameters of the ROI, such as a T1 recovery parameter, a T2 decay parameter, a cardiac motion parameter, a respiratory motion parameter, a contrast agent kinetic parameter, or the like, or any combination thereof.

In some embodiments, different values of the plurality of parameters may be set by the user device 130. In some embodiments, the user device 130 may send instructions to the processing device 140 that include the different values of the plurality of parameters to instruct the processing device 140 to control the scanner 110 to scan the ROI based on those values. In some embodiments, the user device 130 may send instructions regarding one or more related imaging protocols (e.g., a protocol for brain imaging, a protocol for cardiac imaging, a protocol for lung imaging, etc.). The processing device 140 may retrieve the plurality of parameters and/or the different values of the plurality of parameters from a storage device (e.g., the storage device 150 and/or the memory 320 of the processing device 140) according to the instructions. In some embodiments, the processing device 140 may obtain an imaging plan (e.g., from a command provided by a user or a command retrieved from a storage device), based upon which the processing device 140 may obtain (e.g., from a storage device) the plurality of parameters corresponding to the imaging plan and/or the different values of the plurality of parameters. The different values of the plurality of parameters may enable imaging of one or more kinetic parameters of the ROI.

In some embodiments, the plurality of parameters may include the cardiac phase, the respiratory phase, imaging sequence parameters (e.g., inversion time (TI), saturation time (TS), etc.), and the like, or any combination thereof. In some embodiments, the cardiac phase may be indicative of a state of the heart of the subject at a certain point in time within a cardiac cycle (e.g., one heartbeat). For example, the cardiac phase may include an end diastole phase, an end systole phase, and the like. If the ROI comprises at least a part of the heart, different cardiac phases (e.g., different points in time corresponding to different cardiac phases) may be set in order to enable imaging of the dynamics of the cardiac motion of the ROI.

In some embodiments, the respiratory phase may be indicative of a state of a lung of the subject at a point in time in a respiratory cycle (e.g., one breath). For example, the respiratory phase may include an end-expiratory phase, an end-inspiratory phase, and the like. If the ROI comprises at least a portion of the lung, different respiratory phases (e.g., different points in time corresponding to different respiratory phases) may be set so as to enable imaging of the dynamics of the respiratory motion of the ROI.

In an inversion recovery sequence, a 180° RF pulse may be applied prior to the excitation RF pulse to invert the magnetization to the negative longitudinal direction. The interval between the 180° RF pulse and the excitation RF pulse may be referred to as the inversion time. In a saturation recovery sequence, a 90° RF pulse may be applied prior to the excitation RF pulse to rotate the magnetization into the transverse plane. The interval between the 90° RF pulse and the excitation RF pulse may be referred to as the saturation time. In some embodiments, different inversion times or saturation times may be set so as to image the dynamics of the T1 recovery of the ROI.

In some embodiments, the processing device 140 may acquire the MR data from the scanner 110 or the storage device 150.

In step 620, the processing device 140 (e.g., the matrix determination module 520) may determine a data set based on the MR data, wherein the data set includes a spatial basis and one or more temporal bases. The spatial basis may comprise spatial information of the MR data. The one or more time bases may include timing information of the MR data. In some embodiments, the spatial basis or the temporal basis may include a function, a model, a vector, a matrix, a tensor, or the like, or any combination thereof.

In some embodiments, the information represented by the plurality of time series images has a temporal correlation in addition to a spatial correlation. Taking cardiac dynamic magnetic resonance images as an example, the values of neighboring pixels (or voxels) may vary similarly over time due to the approximately periodic motion of the heart. This similarity translates into a low rank characteristic of a model that represents the plurality of time series images for image reconstruction.

In some embodiments, the processing device 140 may determine the spatial basis and the one or more temporal bases based on the MR data and a low rank model. In some embodiments, the low rank model may represent the plurality of time series images. In some embodiments, the low rank model may represent that the correlation between the plurality of time series images is low rank. As used herein, low rank means that the rank of the model (e.g., a two-dimensional (2D) matrix or a multidimensional tensor) is less than the number (or count) of elements along any one dimension of the model (e.g., the number of rows and columns of the matrix). Further, low rank indicates that the rank of the model is much smaller than the number (or count) of elements along any one dimension of the model. For example, the rank of the model may be less than 50%, 40%, 30%, 20%, 10%, etc. of the smallest number (or count) of elements along any dimension of the model. In some embodiments, the low rank model may take the form of a spatial basis and one or more temporal bases corresponding to the acquired MR data. Due to the low rank feature, the spatial basis and the one or more temporal bases may have a smaller file size or data size than the plurality of time series images. In the present invention, the spatial basis and the one or more temporal bases may be stored instead of all of the plurality of time series images, so that the occupied storage space may be reduced.

In some embodiments, the low rank model may comprise a low rank matrix. The spatial basis may comprise a spatial basis matrix. The one or more temporal bases may include a temporal basis matrix. For example, f(γ, t) may represent a spatio-temporal signal of the MR data, γ may represent a position in the ROI (e.g., two-dimensional (2D) coordinates (x, y) or three-dimensional (3D) coordinates (x, y, z) of a pixel or voxel in the plurality of time series images), and t may represent a time point. The low rank matrix F may be represented by the following formula (1):

F = \begin{bmatrix} f(\gamma_1, t_1) & f(\gamma_1, t_2) & \cdots & f(\gamma_1, t_n) \\ f(\gamma_2, t_1) & f(\gamma_2, t_2) & \cdots & f(\gamma_2, t_n) \\ \vdots & \vdots & \ddots & \vdots \\ f(\gamma_m, t_1) & f(\gamma_m, t_2) & \cdots & f(\gamma_m, t_n) \end{bmatrix} \quad (1)

wherein m represents the number (or count) of pixels (or voxels) of each image in the plurality of time series images; n represents the number (or count) of the plurality of time series images; each column of the low rank matrix F represents one of the plurality of time series images; and each row of the low rank matrix F represents the variation over time of the signal intensity (or gray value) of the pixel (or voxel) corresponding to one position in the region of interest.

In each of the plurality of time series images, the differences between the values of the pixels (or voxels) may be relatively small. The values of neighboring pixels (or voxels) in the plurality of time series images may change similarly over time. Therefore, there may be a relatively strong correlation between the row vectors of the matrix F and between the column vectors of the matrix F, which gives the matrix F a low rank characteristic, e.g., rank(F) = r ≪ min(m, n).

In some embodiments, f(γ, t) may represent MR images with different contrasts, such as images acquired with different echo times (TEs), different repetition times (TRs), or different flip angles. There may be a strong correlation between the row vectors of the matrix F and between the column vectors of the matrix F, which gives the matrix F a low rank characteristic, e.g., rank(F) = r ≪ min(m, n).

In some embodiments, the low rank model F may be decomposed into a spatial basis matrix Us and a temporal basis matrix Vt based on a singular value decomposition (SVD):

F = U_s V_t^{T} \quad (2)

in some embodiments, the spatial basis matrix Us may include signals of MR data corresponding to pixels (or voxels) in the plurality of time-series images and index positions of the pixels (or voxels) in the plurality of time-series images. The time base matrix Vt may include a plurality of time points, each time point representing an imaging time of one time-series image of the plurality of time-series images, and the plurality of time points may correspond to a time taken for an entire scanning process of the ROI.

In some embodiments, the low rank matrix F may include m × n elements, the spatial basis matrix Us may include m × r elements, and the temporal basis matrix Vt may include r × n elements, where r is the rank of the low rank matrix F. In the present invention, the spatial basis matrix Us and the temporal basis matrix Vt (e.g., r(m + n) elements in total) may be stored instead of all of the plurality of time series images (e.g., m × n elements). Due to the low rank nature of the matrix F, r may be smaller or even much smaller than m and n, thus reducing the amount of data and/or file size for storage and/or transmission.

In some embodiments, the processing device 140 may determine a spatial basis matrix U′s comprising m × r′ elements and a temporal basis matrix V′t comprising r′ × n elements, where r′ is less than the rank of the low rank matrix F. The combination of the spatial basis matrix U′s and the temporal basis matrix V′t may approximate the low rank matrix F. Presenting the acquired MR data in the form of the spatial basis matrix U′s and the temporal basis matrix V′t (e.g., storing r′(m + n) elements) may further reduce the amount of data and/or file size for storage and/or transmission and reduce noise of the plurality of time series images.

In some embodiments, the processing device 140 may determine the spatial basis matrix and the temporal basis matrix based on a singular value decomposition (SVD) of an acquired or reconstructed image sequence.
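A minimal numpy sketch of this SVD-based factorization and its storage savings. The sizes and the synthetic data are illustrative assumptions, not from the document:

```python
import numpy as np

# Hypothetical sizes: m pixels per image, n time-series images, true rank r.
m, n, r = 400, 120, 6
rng = np.random.default_rng(0)

# Synthesize a rank-r matrix F as in formula (1): each column is one
# vectorized image, each row is one pixel's signal over time.
F = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# SVD: F = U diag(s) Vh.  Keep the top-r components.
U, s, Vh = np.linalg.svd(F, full_matrices=False)
Us = U[:, :r] * s[:r]        # spatial basis matrix, m x r
Vt = Vh[:r].T                # temporal basis matrix, n x r

# Storing Us and Vt takes r(m + n) numbers instead of m*n.
assert Us.size + Vt.size == r * (m + n)
assert r * (m + n) < m * n

# F is recovered (to numerical precision) as Us @ Vt.T, matching formula (2).
err = np.linalg.norm(F - Us @ Vt.T) / np.linalg.norm(F)
assert err < 1e-10
```

Truncating to r′ < r columns of Us and Vt gives the approximate factorization described above, trading a small approximation error for further storage reduction.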

In some embodiments, the low rank model may comprise a low rank tensor. The data set may also include a core tensor. The spatial basis may comprise a spatial basis matrix. The one or more temporal bases may include two or more temporal basis matrices. The low rank tensor may be a multidimensional tensor with a spatial dimension, which indexes the pixel (or voxel) locations in the plurality of time series images, and two or more time dimensions, where each time dimension corresponds to imaging the ROI according to a set of values of one of the plurality of parameters.

In each of the plurality of time-series images, the difference between the values of the pixels (or voxels) may be relatively small. The values of neighboring pixels (or voxels) in the plurality of time series images may be similar in change over time. Thus, the tensor may have the low rank characteristic.

In some embodiments, the low rank tensor can be decomposed into a core tensor, a spatial basis matrix, and more than two temporal basis matrices based on a Tucker decomposition.

In some embodiments, the spatial basis matrix may correspond to a spatial dimension of the low rank tensor. The spatial basis matrix may include signals of MR data corresponding to pixels (or voxels) in the plurality of time-series images and index positions of the pixels (or voxels) in the plurality of time-series images. Each of the two or more time base matrices may correspond to one of two or more time dimensions of the low rank tensor.

For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. The low rank tensor may be a four dimensional tensor comprising a spatial dimension and three temporal dimensions corresponding to cardiac motion, respiratory motion, and T1 recovery, respectively. A core tensor, a spatial basis matrix corresponding to the spatial dimension, and three temporal basis matrices corresponding to three temporal dimensions may be determined. A first time basis matrix of the three time basis matrices may correspond to a time dimension of cardiac motion of the low rank tensor and index 20 cardiac phases (e.g., 20 time points corresponding to 20 cardiac phases). A second time base matrix of the three time base matrices may correspond to a time dimension of respiratory motion of the low rank tensor and index 5 respiratory phases (e.g., 5 time points corresponding to 5 respiratory phases). A third time base matrix of the three time base matrices may correspond to the time dimension of the T1 recovery of the low rank tensor and index 344 inversion times.

By way of example only, fig. 7A is a schematic diagram of the Tucker decomposition of a low rank three-dimensional tensor according to some embodiments of the present invention. The low rank tensor A may include one spatial dimension and two temporal dimensions. As shown in fig. 7A, according to a Tucker decomposition, the low rank tensor A may be decomposed into a core tensor G, a spatial basis matrix Ux corresponding to the spatial dimension, and two temporal basis matrices Ut1 and Ut2 corresponding to the two temporal dimensions.

The low rank tensor A may include J × K × L elements; the spatial basis matrix Ux may include J × r1 elements; the temporal basis matrix Ut1 may include K × r2 elements; the temporal basis matrix Ut2 may include L × r3 elements; and the core tensor G may include r1 × r2 × r3 elements, where r1 represents the rank of the spatial basis matrix Ux, r2 represents the rank of the temporal basis matrix Ut1, and r3 represents the rank of the temporal basis matrix Ut2. In the present invention, the core tensor G, the spatial basis matrix Ux, and the two temporal basis matrices Ut1 and Ut2 (e.g., r1r2r3 + Jr1 + Kr2 + Lr3 elements in total) may be stored instead of all of the plurality of time series images (e.g., J × K × L elements). Due to the low rank nature of the tensor A, r1, r2, and r3 may be smaller or even much smaller than J, K, and L, which may reduce the amount of data and/or file size for storage and/or transmission.

In some embodiments, the processing device 140 may determine a spatial basis matrix U′x comprising J × r′1 elements, a temporal basis matrix U′t1 comprising K × r′2 elements, a temporal basis matrix U′t2 comprising L × r′3 elements, and a core tensor G′ comprising r′1 × r′2 × r′3 elements, where r′1 is less than the rank r1 of the spatial basis matrix Ux, r′2 is less than the rank r2 of the temporal basis matrix Ut1, and r′3 is less than the rank r3 of the temporal basis matrix Ut2. The combination of the spatial basis matrix U′x, the temporal basis matrices U′t1 and U′t2, and the core tensor G′ may approximate the low rank tensor A. Presenting the acquired MR data in this form (e.g., storing r′1r′2r′3 + Jr′1 + Kr′2 + Lr′3 elements) may further reduce the amount of data and/or file size for storage and/or transmission and reduce noise of the plurality of time series images.
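A minimal numpy sketch of such a Tucker factorization (computed here via the higher-order SVD, one common way to obtain it) and its storage count. The sizes, ranks, synthetic data, and helper functions `unfold` and `mode_dot` are illustrative assumptions, not from the document:

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(out, 0, mode)

# Hypothetical sizes: J voxels, K cardiac phases, L inversion times,
# multilinear ranks (r1, r2, r3).
J, K, L = 60, 20, 30
r1, r2, r3 = 4, 3, 3
rng = np.random.default_rng(0)

# Build a low-rank 3-way tensor A = G x1 Ux x2 Ut1 x3 Ut2 (Tucker form).
G_true = rng.standard_normal((r1, r2, r3))
A = mode_dot(mode_dot(mode_dot(G_true, rng.standard_normal((J, r1)), 0),
                      rng.standard_normal((K, r2)), 1),
             rng.standard_normal((L, r3)), 2)

# HOSVD: per-mode SVD of the unfoldings gives the factor matrices.
Ux  = np.linalg.svd(unfold(A, 0), full_matrices=False)[0][:, :r1]  # J x r1
Ut1 = np.linalg.svd(unfold(A, 1), full_matrices=False)[0][:, :r2]  # K x r2
Ut2 = np.linalg.svd(unfold(A, 2), full_matrices=False)[0][:, :r3]  # L x r3
G = mode_dot(mode_dot(mode_dot(A, Ux.T, 0), Ut1.T, 1), Ut2.T, 2)   # core tensor

# Stored elements: r1*r2*r3 + J*r1 + K*r2 + L*r3, versus J*K*L for the images.
stored = G.size + Ux.size + Ut1.size + Ut2.size
assert stored < J * K * L

# A is recovered as G x1 Ux x2 Ut1 x3 Ut2.
A_hat = mode_dot(mode_dot(mode_dot(G, Ux, 0), Ut1, 1), Ut2, 2)
assert np.linalg.norm(A - A_hat) / np.linalg.norm(A) < 1e-10
```

Here 426 numbers replace 36,000, which mirrors the r1r2r3 + Jr1 + Kr2 + Lr3 versus J × K × L comparison in the text.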

In some embodiments, the processing device 140 may determine the core tensor, the spatial basis matrix, and the two or more temporal basis matrices based on a Tucker decomposition of an acquired or reconstructed image.

In step 630, the processing device 140 (e.g., the first reconstruction module 530) may reconstruct at least a portion of the images of the plurality of time series images based on the data set. In some embodiments, the processing device 140 may reconstruct at least a portion of the images of the plurality of time series images based on the spatial basis and the one or more temporal bases. In some embodiments, the processing device 140 may reconstruct at least a portion of the images of the plurality of time series images based on the spatial basis matrix and the one or more temporal basis matrices. In some embodiments, the reconstructed images may include at least one set of images of the plurality of time series images representing a change in the values of the plurality of parameters over time (e.g., the dynamics of the ROI). For example, as shown in column a1 of fig. 7B, the reconstructed images may include a first set of images 702 representing respiratory motion of the ROI, a second set of images 703 representing cardiac motion of the ROI, and a third set of images 704 representing T1 recovery of the ROI.

In another embodiment, the processing device 140 may generate a fourth set of images representing T1 recovery of the ROI and a fifth set of images representing cardiac motion of the ROI. The processing device 140 may determine a T1 value corresponding to each pixel or voxel of the ROI based on the fourth set of images. The processing device 140 may convert the fifth set of images to false color images (e.g., the image 705 shown in column a2 in fig. 7B) based on the T1 values corresponding to each pixel or voxel of the ROI. In some embodiments, the processing device 140 may assign a color value to each T1 value. For example only, the T1 values for the pixels or voxels of the fifth set of images may be in the range of 0-3 seconds. The processing device 140 may determine a color bar 701 (e.g., as shown in column a2 in fig. 7B), where the color bar 701 includes color values for the T1 values. The processing device 140 may generate the false color images by converting the grayscale value of each pixel or voxel in the fifth set of images to the corresponding color value. The false color images may represent the T1 values and the cardiac motion of the ROI.

In another embodiment, the reconstructed images may include one or more other images of the plurality of time series images that are of interest to the user (e.g., one or more other images shown in column a3 of fig. 7B), such as one or more end diastole or end systole images.

In some embodiments, a reconstruction instruction may be set in a scan protocol by, for example, the user device 130, where the reconstruction instruction indicates which of the plurality of time series images need to be reconstructed. For example, the reconstruction instruction may include a value of at least one of the plurality of parameters corresponding to each portion of the plurality of time series images. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. If it is desired to reconstruct a set of images of the plurality of time series images representing the T1 recovery of the ROI, the reconstruction instruction may include one of the 20 cardiac phases, one of the 5 respiratory phases, and all 344 inversion times.

In some embodiments, the reconstruction instructions may be acquired according to default settings of the medical system 100. In some embodiments, the reconstruction instructions may be set manually by a user, or automatically by the user device 130 based on, for example, an imaging protocol or plan (e.g., instructions) of the ROI, and/or a clinical need of a user and/or the subject. For example, for T1 imaging, the reconstruction instructions may instruct reconstruction of a set of images representing the T1 recovery of the ROI.
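As a concrete illustration of what such a reconstruction instruction might carry, a minimal sketch follows. The key names and the dictionary representation are hypothetical, not from the document:

```python
# Hypothetical reconstruction instruction for a scan with 20 cardiac phases,
# 5 respiratory phases, and 344 inversion times: fix one cardiac phase and one
# respiratory phase and sweep every inversion time to image T1 recovery.
instruction = {
    "cardiac_phase": 8,                  # one of the 20 cardiac phases
    "respiratory_phase": 2,              # one of the 5 respiratory phases
    "inversion_time": list(range(344)),  # all 344 inversion times
}

# One target image per selected inversion time.
n_target_images = len(instruction["inversion_time"])
assert n_target_images == 344
```

Selecting a single value along each of the other time dimensions and a range of values along one dimension yields exactly one set of images along that dimension, as in the T1-recovery example above.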

In some embodiments, the processing device 140 may reconstruct the partial images of the plurality of time series images based on steps 930 and 940 of the flow 900 in fig. 9.

In some embodiments, the processing device 140 may send the reconstructed images to the user device 130 and instruct the user device 130 to display the reconstructed images in an interface of the user device 130. For example, as shown in fig. 7B, the reconstructed images may include a first set of images 702 representing respiratory motion of the ROI, a second set of images 703 representing cardiac motion of the ROI, a third set of images 704 representing T1 recovery of the ROI, a set of false color images 705 representing T1 values and cardiac motion of the ROI, and one or more other images of the plurality of time series images that are of interest to the user. The user device 130 may display the first set of images 702, the second set of images 703, and the third set of images 704 in column a1. The user device 130 may display the set of false color images 705 in column a2. The user device 130 may display one or more other images of interest to the user in column a3. In some embodiments, the user device 130 may simultaneously display at least one of column a1, column a2, and column a3 in the interface of the user device 130.

In step 640, the processing device 140 (e.g., the first input/output module 510) may store the reconstructed images and at least a portion of the data set in a storage device. In some embodiments, the processing device 140 may store the spatial basis and the one or more temporal bases in the storage device. In some embodiments, the processing device 140 may store the spatial basis matrix and the temporal basis matrix in the storage device. In some embodiments, the processing device 140 may store the spatial basis matrix, the two or more temporal basis matrices, and the core tensor in the storage device. In some embodiments, the processing device 140 may store a combination (e.g., product) of the core tensor and the spatial basis matrix and at least one of the two or more temporal basis matrices in the storage device. Further, the processing device 140 may store a combination (e.g., product) of the core tensor and the spatial basis matrix in the storage device. In some embodiments, the storage device may include the storage device 150 and/or the memory 320 of the processing device 140. In some embodiments, a user may employ the user device 130 to access the storage device to acquire the reconstructed images and the stored data set.

It should be noted that the above description is only intended to illustrate the present invention and is not intended to limit the scope of the present invention. Many variations and modifications are possible to those skilled in the art in light of the teachings of this invention. However, such changes and modifications do not depart from the scope of the present invention. For example, step 630 may be performed before, after, or simultaneously with the step of storing the spatial basis matrix and the one or more temporal basis matrices in the storage device. In another embodiment, step 630 may be omitted.

Fig. 8 is a block diagram of an exemplary user device according to some embodiments of the present invention. The user device 130 may include a second input/output (I/O) module 810 and a second reconstruction module 820.

The second input/output module 810 can retrieve a data set including a spatial basis and one or more temporal bases from a storage device. The spatial basis and the one or more temporal bases may correspond to a plurality of time series images of a region of interest (ROI) of the subject object. The spatial basis may include spatial information of the plurality of time-series images, and the one or more temporal bases may include timing information of the plurality of time-series images. In some embodiments, the spatial basis or temporal basis may include functions, models, vectors, matrices, tensors, and the like, or any combination thereof.

In some embodiments, the second input/output module 810 may obtain a spatial basis matrix and a temporal basis matrix, the combination of which represents a low rank matrix corresponding to the plurality of time series images. In some embodiments, the data set may also include a core tensor. The second input/output module 810 can obtain a core tensor, a spatial basis matrix, and two or more temporal basis matrices, which in combination represent a low rank tensor corresponding to the plurality of time series images. In some embodiments, the second input/output module 810 can obtain a combination (e.g., product) of the core tensor and at least one of the spatial basis matrix and the two or more temporal basis matrices. In some embodiments, the second input/output module 810 may obtain the two or more temporal basis matrices and a combination (e.g., product) of the core tensor and the spatial basis matrix.

The second input/output module 810 can receive instructions for reconstructing one or more target images of the plurality of time series images.

In some embodiments, the instructions may include a value for at least one of the plurality of parameters corresponding to each of the one or more target images. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. A user may select one of the 20 cardiac phases, one of the 5 respiratory phases, and one of the 344 inversion times in an interface of the user device 130 to generate instructions for reconstructing a target image of the plurality of time-series images.

The second reconstruction module 820 may reconstruct the one or more target images based on the data set and the instructions. In some embodiments, the second reconstruction module 820 may determine a temporal basis subset of each of the one or more temporal bases for each of the one or more target images based on the instructions. The second reconstruction module 820 may reconstruct the target image based on the data set and the one or more temporal basis subsets.

In some embodiments, the second reconstruction module 820 may determine temporal information of the one or more temporal basis matrices corresponding to the target image based on the value of at least one of the plurality of parameters in the instructions. The second reconstruction module 820 may determine a temporal basis submatrix of each of the one or more temporal basis matrices based on the temporal information.

In some embodiments, the second reconstruction module 820 may determine a spatial basis matrix Us and a temporal basis matrix Vt based on the MR data and a low rank matrix corresponding to the plurality of time series images. The temporal basis matrix Vt may include a plurality of time points, each time point representing the imaging time of one of the plurality of time series images. The plurality of time points may represent the time taken for the entire scanning process of the ROI. The second reconstruction module 820 may determine a time point n corresponding to the target image based on the instructions. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. The user may generate instructions for reconstructing the target image by specifying which cardiac phase of the 20 cardiac phases, which respiratory phase of the 5 respiratory phases, and/or which inversion time of the 344 inversion times to use for image reconstruction. The second reconstruction module 820 may determine the time point n corresponding to the target image based on the cardiac phase, the respiratory phase, and the inversion time selected in the instructions. In some embodiments, the second reconstruction module 820 may determine a temporal basis submatrix Vt(n) of the temporal basis matrix Vt based on the time point n.

In some embodiments, the processing device 140 may determine a core tensor G, a spatial basis matrix Ux, and N temporal basis matrices Ut1, Ut2, …, UtN based on the MR data and a low rank (N + 1)-dimensional tensor corresponding to the plurality of time series images. Each of the N temporal basis matrices may correspond to one of the N time dimensions of the low rank (N + 1)-dimensional tensor and index different values of one of the plurality of parameters. For each of the N temporal basis matrices, the second reconstruction module 820 may determine a time point along the respective time dimension based on the value of the respective parameter in the instructions. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. The user may generate instructions for reconstructing the target image by specifying which cardiac phase of the 20 cardiac phases, which respiratory phase of the 5 respiratory phases, and/or which inversion time of the 344 inversion times to use for image reconstruction. The second reconstruction module 820 may determine a time point n1 along the time dimension of cardiac motion based on the cardiac phase selected in the instructions, a time point n2 along the time dimension of respiratory motion based on the respiratory phase selected in the instructions, and a time point n3 along the time dimension of T1 recovery based on the inversion time selected in the instructions. In some embodiments, the second reconstruction module 820 may determine a temporal basis submatrix Ut1(n1), Ut2(n2), …, UtN(nN) of each of the N temporal basis matrices Ut1, Ut2, …, UtN based on the time points n1, n2, …, nN.

In some embodiments, the second reconstruction module 820 may reconstruct the target image based on the spatial basis matrix and the one or more temporal basis submatrices.

In some embodiments, the spatial basis matrix may include signals at different time points for different pixel (or voxel) locations of the plurality of time series images. The operation of generating the target image based on the temporal basis submatrix of each of the one or more temporal basis matrices and the spatial basis matrix may be regarded as extracting the target image from the spatial basis matrix using the temporal basis submatrix of each of the one or more temporal basis matrices.

In some embodiments, the second reconstruction module 820 may generate the target image by determining a product of the spatial basis matrix Us and a temporal basis submatrix Vt (n) of the temporal basis matrix Vt.
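A minimal numpy sketch of this matrix-case extraction: one frame is the product of the spatial basis matrix and a single row of the temporal basis matrix, so the full image series never has to be materialized. The sizes and stand-in bases are illustrative assumptions, not from the document:

```python
import numpy as np

# Hypothetical sizes: m pixels per image, n time-series images, rank r.
m, n, r = 300, 100, 5
rng = np.random.default_rng(1)

# Stand-ins for a stored spatial basis matrix Us (m x r) and temporal basis
# matrix Vt (n x r); real ones would come from the SVD of the image series.
Us = rng.standard_normal((m, r))
Vt = rng.standard_normal((n, r))

# The full series would be F = Us @ Vt.T (one column per image).  To obtain
# only the image at time point t_idx, use the single row Vt[t_idx] -- i.e.,
# the temporal basis submatrix Vt(n) -- instead of forming all of F.
t_idx = 42
target_image = Us @ Vt[t_idx]            # m-vector: one reconstructed frame

# Same frame as column t_idx of the full product.
full = Us @ Vt.T
assert np.allclose(target_image, full[:, t_idx])
```

Reconstructing a single frame this way costs O(mr) instead of O(mnr) for the whole series.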

In some embodiments, the second reconstruction module 820 may generate the target image by determining the product of the core tensor G, the spatial basis matrix Ux, and the temporal basis submatrices Ut1(n1), Ut2(n2), …, UtN(nN) of the N temporal basis matrices Ut1, Ut2, …, UtN.
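A minimal numpy sketch of the tensor-case extraction: contracting the core tensor with one row of each temporal basis matrix (the temporal basis submatrices) and mapping through the spatial basis yields a single frame without reconstructing the whole tensor. The sizes, stand-in factors, and the helper `mode_dot` are illustrative assumptions, not from the document:

```python
import numpy as np

def mode_dot(T, M, mode):
    """Multiply a tensor T by matrix M along the given mode."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(out, 0, mode)

# Hypothetical stored factors of a Tucker model: core G (r1 x r2 x r3),
# spatial basis Ux (J x r1), temporal bases Ut1 (K x r2), Ut2 (L x r3).
J, K, L = 50, 20, 30
r1, r2, r3 = 4, 3, 3
rng = np.random.default_rng(2)
G   = rng.standard_normal((r1, r2, r3))
Ux  = rng.standard_normal((J, r1))
Ut1 = rng.standard_normal((K, r2))
Ut2 = rng.standard_normal((L, r3))

# One target image at time points n1 (first time dimension) and n2 (second):
# contract G with the single temporal rows Ut1[n1] and Ut2[n2] -- the
# temporal basis submatrices -- then map to pixel space through Ux.
n1, n2 = 7, 12
coeffs = np.einsum('abc,b,c->a', G, Ut1[n1], Ut2[n2])  # r1-vector
target_image = Ux @ coeffs                              # J-vector of pixels

# Same frame, taken from the fully reconstructed tensor, for comparison.
A = mode_dot(mode_dot(mode_dot(G, Ux, 0), Ut1, 1), Ut2, 2)
assert np.allclose(target_image, A[:, n1, n2])
```

The same pattern extends to N time dimensions by contracting one row per temporal basis matrix before the final multiplication by Ux.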

The second input/output module 810 may display the one or more target images.

The second input/output module 810 may store at least one of the one or more target images in the storage device.

The modules in the user device 130 may be connected or communicate with each other through a wired connection or a wireless connection. The wired connection may include an electrical cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may include a Local Area Network (LAN), a Wide Area Network (WAN), bluetooth, ZigBee, Near Field Communication (NFC), etc., or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. For example, the second input/output module 810 may be divided into two units. One of the two units may be used to retrieve the spatial basis matrix and the one or more temporal basis matrices from the storage medium, and the other of the two units may be used to transmit one or more reconstructed images to the storage medium.

It should be noted that the above description is only intended to illustrate the present invention and is not intended to limit the scope of the present invention. Many variations and modifications are possible to those skilled in the art in light of the teachings of this invention. However, such changes and modifications do not depart from the scope of the present invention. For example, the user device 130 may further include a storage module (not shown in fig. 8). The storage module may be used to store data generated by any process performed by any component of the user device 130. In another embodiment, each component of the user device 130 may include a storage device. Alternatively, the components of the user device 130 may share a common storage device.

FIG. 9 is a flow chart of an exemplary process for MRI reconstruction according to some embodiments of the present invention. In some embodiments, the process 900 may be implemented in the medical system 100 shown in FIG. 1. For example, the process 900 may be stored in a storage medium (e.g., the storage device 150, the memory 490 of the user device 130, or the memory 460 of the user device 130) in the form of instructions, and may be executed by the user device 130 (e.g., the CPU 440 of the user device 130, the GPU 430 of the user device 130, or one or more modules in the user device 130 shown in fig. 8). The steps of the illustrated process 900 presented below are intended to be illustrative. In some embodiments, the process 900 may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed. Additionally, the order of the steps of the process 900 as shown in FIG. 9 and described below is not limiting.

In some embodiments, the user device 130 may perform the process 900 when a user wishes to view one or more of the plurality of time-series images (also referred to as one or more target images) on the user device 130.

In step 910, the user device 130 (e.g., the second input/output module 810) may retrieve a data set from a storage device, wherein the data set includes a spatial basis and one or more temporal bases. The spatial basis and the one or more temporal bases correspond to a plurality of time series images of a region of interest (ROI) of the object under test. The spatial basis may include spatial information of the plurality of time series images. The one or more time bases may include timing information of the plurality of time series images. In some embodiments, the spatial basis or temporal basis may include a function, a model, a vector, a matrix, a tensor, or the like, or any combination thereof.

In some embodiments, the user equipment 130 may obtain a spatial basis matrix and a temporal basis matrix, the combination of which represents a low rank matrix corresponding to the plurality of time series images. In some embodiments, the data set may also include a core tensor. The user device 130 may obtain a core tensor, a spatial basis matrix, and two or more temporal basis matrices, which in combination represent a low rank tensor corresponding to the plurality of time series images. In some embodiments, the user device 130 may obtain a combination (e.g., product) of the core tensor and the spatial basis matrix and at least one of the two or more temporal basis matrices. In some embodiments, the user device 130 may obtain the two or more temporal basis matrices, as well as a combination (e.g., product) of the core tensor and the spatial basis matrix.

In step 920, the user device 130 (e.g., the second input/output module 810) may receive instructions for reconstructing one or more target images of the plurality of time series images.

In some embodiments, the instructions may include a value for at least one of a plurality of parameters corresponding to each of the one or more target images. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases and 344 inversion times. A user may select one of the 20 cardiac phases, one of the 5 respiratory phases, and one of the 344 inversion times in an interface of the user device 130 to generate instructions for reconstructing a target image of the plurality of time-series images.

In some embodiments, the user device 130 (e.g., the second reconstruction module 820) may reconstruct the one or more target images based on the data sets and the instructions. For example, the user device 130 may reconstruct the one or more target images by performing steps 930 and 940 in the process 900.

In step 930, the user device 130 (e.g., the second reconstruction module 820) may determine a temporal basis subset for each of the one or more temporal bases for each of the one or more target images based on the instructions.

In some embodiments, the user device 130 may determine, based on a value of at least one of the plurality of parameters in the instructions, the temporal information in the one or more temporal basis matrices that corresponds to the target image. The user device 130 may determine a temporal basis submatrix for each of the one or more temporal basis matrices based on the temporal information.

In some embodiments, the processing device 140 may determine a spatial basis matrix Us and a temporal basis matrix Vt based on the MR data and a low rank matrix corresponding to the plurality of time series images. The temporal basis matrix Vt may include a plurality of time points, each time point representing an imaging time of one of the plurality of time series images. The plurality of time points may span the entire scanning process of the ROI. The user device 130 may determine a time point n corresponding to the target image based on the instructions. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. The user may generate instructions for reconstructing the target image by specifying which cardiac phase of the 20 cardiac phases, which respiratory phase of the 5 respiratory phases, and/or which inversion time of the 344 inversion times to use for image reconstruction. The user device 130 may determine the time point n corresponding to the target image based on the cardiac phase, the respiratory phase, and the inversion time selected in the instructions. In some embodiments, the user device 130 may determine a temporal basis submatrix Vt(n) of the temporal basis matrix Vt based on the time point n.
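One plausible way to map a user's parameter selection to a time point n is a simple mixed-radix index over the parameter values. The sketch below assumes a row-major ordering of (cardiac phase, respiratory phase, inversion time); the actual ordering, the rank r, and the function name `time_point` are illustrative assumptions, not specified by the patent.

```python
import numpy as np

# Hypothetical index mapping: with 20 cardiac phases, 5 respiratory phases,
# and 344 inversion times, each time point n corresponds to one combination
# of (cardiac, respiratory, inversion) indices.
n_card, n_resp, n_ti = 20, 5, 344

def time_point(card, resp, ti):
    """Map a (cardiac, respiratory, inversion-time) selection to a column n of Vt."""
    return (card * n_resp + resp) * n_ti + ti

r = 8
Vt = np.random.randn(r, n_card * n_resp * n_ti)  # full temporal basis

n = time_point(card=3, resp=1, ti=100)           # the user's selection
Vt_n = Vt[:, n]                                  # temporal basis submatrix Vt(n)
print(n, Vt_n.shape)
```

Under this ordering the column index runs from 0 up to 20 x 5 x 344 - 1, covering every imaged combination of the three parameters.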

In some embodiments, the processing device 140 may determine a core tensor G, a spatial basis matrix Ux, and N temporal basis matrices Ut1, Ut2, …, UtN based on the MR data and a low rank (N+1)-dimensional tensor corresponding to the plurality of time series images. Each of the N temporal basis matrices may correspond to one of the N time dimensions of the low rank (N+1)-dimensional tensor and index different values of one of the plurality of parameters. For each of the N temporal basis matrices, the user device 130 may determine a time point along the respective time dimension based on the value of the respective parameter in the instructions. For example, the ROI of the subject object may be scanned based on a plurality of parameters including 20 cardiac phases, 5 respiratory phases, and 344 inversion times. The user may generate instructions for reconstructing the target image by specifying which cardiac phase of the 20 cardiac phases, which respiratory phase of the 5 respiratory phases, and/or which inversion time of the 344 inversion times to use for image reconstruction. The user device 130 may determine a time point n1 along the time dimension of the cardiac motion based on the cardiac phase selected in the instructions. The user device 130 may determine a time point n2 along the time dimension of the respiratory motion based on the respiratory phase selected in the instructions. The user device 130 may determine a time point n3 along the time dimension of the T1 recovery based on the inversion time selected in the instructions. In some embodiments, the user device 130 may determine, based on the time points n1, n2, …, nN, a temporal basis submatrix Ut1(n1), Ut2(n2), …, UtN(nN) of each of the N temporal basis matrices Ut1, Ut2, …, UtN.
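In the tensor case, selecting one value per time dimension reduces each temporal basis matrix to a single row. The following sketch illustrates that selection step only; the dimension names, the per-mode rank r, and the dictionary layout are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: with N = 3 time dimensions (cardiac, respiratory,
# T1 recovery), each temporal basis matrix Ut_i has one row per phase or
# time along its dimension. Sizes and rank are illustrative.
dims = {"cardiac": 20, "respiratory": 5, "inversion": 344}
r = 4  # illustrative rank along each temporal mode

Ut = {name: np.random.randn(size, r) for name, size in dims.items()}

# The instruction specifies one index per dimension.
selection = {"cardiac": 7, "respiratory": 2, "inversion": 150}
Ut_n = {name: Ut[name][n] for name, n in selection.items()}  # rows Ut_i(n_i)
print({name: row.shape for name, row in Ut_n.items()})
```

Each extracted row plays the role of the temporal basis submatrix Ut_i(n_i) used in the reconstruction below.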

In step 940, the user device 130 (e.g., the second reconstruction module 820) may reconstruct the target image based on the data set and the one or more time-based subsets. In some embodiments, the user device 130 may reconstruct the target image based on the spatial basis and the one or more subsets of temporal bases.

In some embodiments, the spatial basis matrix may include signals at different time points for different pixel (or voxel) locations of the plurality of time series images. The operation of generating the target image based on the spatial basis matrix and the temporal basis submatrix of each of the one or more temporal basis matrices may be regarded as extracting the target image from the spatial basis matrix using the temporal basis submatrix of each of the one or more temporal basis matrices.

In some embodiments, the user device 130 may generate the target image by determining a product of the spatial basis matrix Us and the temporal basis submatrix Vt(n) of the temporal basis matrix Vt:

I = Us × Vt(n)    (3)

In the above formula, I represents the target image.
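Formula (3) can be sketched directly in code. This is a minimal illustration under assumed sizes: the spatial basis and temporal basis are random stand-ins, and the reshape back to a 2-D image assumes the spatial basis rows are vectorized in row-major order.

```python
import numpy as np

# Hypothetical sketch of formula (3): the target image is the product of the
# spatial basis Us and the selected temporal basis column Vt(n), reshaped
# back to image dimensions. All sizes are illustrative.
h, w, r, n_frames = 64, 64, 8, 200
Us = np.random.randn(h * w, r)
Vt = np.random.randn(r, n_frames)

n = 42                                # time point chosen by the instruction
I = (Us @ Vt[:, n]).reshape(h, w)     # I = Us x Vt(n)
print(I.shape)
```

The cost of this product is one small matrix-vector multiply per frame, which is why a device with ordinary processing capabilities can reconstruct any frame on demand.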

In some embodiments, the user device 130 may generate the target image based on the core tensor G, the spatial basis matrix Ux, and the temporal basis submatrices Ut1(n1), Ut2(n2), …, UtN(nN) of the N temporal basis matrices Ut1, Ut2, …, UtN:

I = G ×1 Ux ×2 Ut1(n1) ×3 Ut2(n2) ×4 … ×N+1 UtN(nN)    (4)

In the above formula, the operator ×i (i = 1, 2, 3, …, N+1) denotes the mode-i product.
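Formula (4) can be sketched with a generic mode product. The helper below is a standard unfold-multiply-fold implementation of the mode-i product, not code from the patent; the core tensor ranks, image size, and N = 3 are illustrative assumptions, and the one-row matrices stand in for the selected submatrices Ut_i(n_i).

```python
import numpy as np

def mode_product(T, M, mode):
    """Mode-i product: contract mode `mode` of tensor T with the columns of M."""
    T = np.moveaxis(T, mode, 0)
    shape = T.shape
    out = M @ T.reshape(shape[0], -1)
    return np.moveaxis(out.reshape((M.shape[0],) + shape[1:]), 0, mode)

# Hypothetical sketch of formula (4) with N = 3 time dimensions.
r_s, r1, r2, r3 = 6, 4, 3, 5            # illustrative ranks of the core tensor
G = np.random.randn(r_s, r1, r2, r3)    # core tensor
Ux = np.random.randn(64 * 64, r_s)      # spatial basis matrix
rows = [np.random.randn(1, r) for r in (r1, r2, r3)]  # Ut_i(n_i), one row each

I = mode_product(G, Ux, 0)              # x1 Ux
for i, row in enumerate(rows, start=1):
    I = mode_product(I, row, i)         # x_{i+1} Ut_i(n_i)
I = I.reshape(64, 64)                   # singleton time modes collapse to an image
print(I.shape)
```

Contracting each temporal mode with a single row leaves singleton dimensions, so the result reshapes directly into one 2-D target image.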

In step 950, the user device 130 (e.g., the second input/output module 810) may display the one or more target images.

In step 960, the user device 130 (e.g., the second input/output module 810) may store at least one of the one or more target images in the storage device.

If a user wishes to view the plurality of time series images on the user device 130, it is the spatial basis matrix and the one or more temporal basis matrices, rather than the plurality of time series images themselves, that are sent from the storage device (e.g., the PACS) to the user device 130, which may reduce the transmission load. With the spatial basis matrix and the one or more temporal basis matrices, a device with ordinary processing capabilities (e.g., the user device 130) may achieve fast reconstruction and display of any one of the plurality of time series images.
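The reduction in transmission load can be estimated with simple counting. The sizes below are illustrative assumptions (a 256 × 256 image, the parameter counts from the example above, and a rank of 16), not figures from the patent.

```python
# Hypothetical back-of-the-envelope comparison: values transmitted for the
# bases versus values transmitted for every reconstructed frame.
n_pixels = 256 * 256
n_frames = 20 * 5 * 344   # cardiac x respiratory x inversion-time combinations
r = 16                    # illustrative model rank

full_series = n_pixels * n_frames        # send every frame
bases = n_pixels * r + r * n_frames      # send spatial basis + temporal basis
print(f"compression ratio ~ {full_series / bases:.0f}x")
```

Even at a generous rank, sending the two bases is orders of magnitude smaller than sending every frame, which is the point of storing the decomposition in the PACS.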

In some embodiments, the user device 130 may retrieve from the storage device some images of the plurality of time series images that have already been reconstructed (e.g., in step 630 of process 600 of FIG. 6), along with the spatial basis matrix and the one or more temporal basis matrices. When an instruction to view one of the already reconstructed images is received, the user device 130 may directly display that reconstructed image.

It should be noted that the above description is only intended to illustrate the present invention and is not intended to limit the scope of the present invention. Many variations and modifications are possible to those skilled in the art in light of the teachings of this invention. However, such changes and modifications do not depart from the scope of the present invention. For example, step 960 may be omitted.

In some embodiments, a user may input an instruction through an operation interface of the user device 130, the instruction being used to reconstruct one or more target images of the plurality of time series images. In some embodiments, a user may select and/or input a value of at least one of the plurality of parameters via the operation interface to generate instructions for reconstructing the one or more target images. For example, the different values of the plurality of parameters used to scan the ROI of the subject object to generate the MR data may be provided in the interface as selectable options. As another example, if a user enters an invalid value of a parameter (e.g., a value not among the parameter values of the plurality of parameters used to scan the ROI of the subject object to generate the MR data), an error message may be prompted in the interface.

For example only, the user may select a value of the inversion time, a value of the cardiac phase, and a value of the respiratory phase. Based on the selected values, the user device 130 may perform steps 930 through 950 to quickly reconstruct an image and display the reconstructed image in the operation interface.

In some embodiments, a user may select, through the operation interface, a plurality of values of a first parameter of the plurality of parameters and a single value for each of the remaining parameters (from among the parameter values used to scan the ROI of the subject object to generate the MR data), in order to view the dynamics of the ROI along the dimension corresponding to the first parameter. The user device 130 may then perform steps 930 through 950 to rapidly reconstruct and display the respective images in time sequence, such that the dynamics of the ROI along the dimension corresponding to the first parameter may be presented in the continuously displayed images.
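Sweeping one parameter while holding the others fixed amounts to reconstructing one frame per value of the varying parameter. The sketch below assumes the same row-major index ordering used earlier; the sizes, rank, and the fixed respiratory-phase and inversion-time values are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: view the cardiac dynamics by reconstructing all 20
# cardiac phases at one fixed respiratory phase and inversion time.
n_card, n_resp, n_ti, r, h, w = 20, 5, 344, 8, 64, 64
Us = np.random.randn(h * w, r)
Vt = np.random.randn(r, n_card * n_resp * n_ti)

resp, ti = 2, 120                     # fixed values of the remaining parameters
frames = [
    (Us @ Vt[:, (card * n_resp + resp) * n_ti + ti]).reshape(h, w)
    for card in range(n_card)         # sweep the first parameter
]
print(len(frames), frames[0].shape)
```

Displaying these frames in sequence yields the continuously displayed images described above, with each frame costing only one small matrix-vector product.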

In some embodiments, the user device 130 may simultaneously display two or more interfaces similar to the operation interface described above.

Having thus described the basic concepts, it will be apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and not by way of limitation. Various alterations, improvements, and modifications may occur to and be practiced by those skilled in the art, though not expressly stated herein. Such alterations, improvements, and modifications are intended to be suggested by this application and are within the spirit and scope of the exemplary embodiments of this application.

Furthermore, certain terminology is used to describe embodiments of the application. For example, the terms "one embodiment" and/or "some embodiments" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment disclosed herein. Therefore, it is emphasized and should be appreciated that two or more references to "one embodiment" or "some embodiments" in various portions of this application are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the application.

Moreover, those skilled in the art will appreciate that aspects of the present invention may be illustrated and described in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Thus, aspects of the present invention may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of software and hardware, all of which may generally be referred to herein as a "unit," "module," or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.

The computer readable medium may include a propagated data signal with the computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider), or in a cloud computing environment, or as a service such as software as a service (SaaS).

Moreover, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefor, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. While the foregoing disclosure discusses, by way of various examples, what are presently considered to be various useful embodiments of the present application, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of the various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.

Similarly, it should be appreciated that in the foregoing description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, inventive embodiments may lie in less than all features of a single foregoing disclosed embodiment.
