Abstract: A method and a system for imaging a target volume of a subject are presented. A plurality of gated functional images and a plurality of gated structural images corresponding to the target volume are received. An evolving mean functional image and an evolving mean structural image are computed by iteratively averaging a plurality of pixels corresponding to the gated functional and gated structural images, respectively. Further, transforms that align one or more of the gated functional images to the evolving mean functional image and one or more of the gated structural images to the evolving mean structural image are iteratively computed based on at least one matching metric until convergence of the evolving mean functional and evolving mean structural images. Subsequently, one or more motion-corrected functional images are generated based on the converged mean functional and mean structural images.
CLAIMS
1. A method for imaging a target volume of a subject, the method comprising:
receiving a plurality of gated functional images and a plurality of gated structural images corresponding to the target volume;
computing an evolving mean functional image and an evolving mean structural image by iteratively averaging a plurality of pixels corresponding to the plurality of gated functional images and the plurality of gated structural images, respectively;
iteratively computing transforms that align one or more of the plurality of gated functional images to the evolving mean functional image and one or more of the plurality of gated structural images to the evolving mean structural image based on at least one matching metric until convergence of the evolving mean functional image and the evolving mean structural image; and
generating one or more motion-corrected functional images corresponding to the target volume based on the converged mean functional image and the mean structural image.
2. The method of claim 1, further comprising selecting the at least one matching metric such that the at least one matching metric is configured to reduce a determined distance between the transforms computed in each iteration for one or more of the plurality of gated functional images and one or more of the plurality of gated structural images.
3. The method of claim 1, further comprising selecting the at least one matching metric such that the at least one matching metric is configured to reduce a determined distance between the evolving mean structural image and a selected diagnostic structural image.
4. The method of claim 1, further comprising selecting the at least one matching metric such that the at least one matching metric is configured to reduce a determined distance between the transforms computed in each iteration for the one or more of the gated functional images and a selected diagnostic structural image.
5. The method of claim 1, further comprising assigning determined weights to the at least one matching metric.
6. The method of claim 5, wherein assigning the determined weights comprises:
assigning a first range of weights to the at least one matching metric for registering one or more spatial locations in the target volume that exhibit less than a determined phase mismatch between the iteratively computed transforms corresponding to the plurality of gated structural images and the plurality of gated functional images; and
assigning a second range of weights to the at least one matching metric for registering one or more spatial locations in the target volume that exhibit more than the determined phase mismatch between the iteratively computed transforms corresponding to the plurality of gated structural images and the plurality of gated functional images;
wherein the first range of weights is higher than the second range of weights.
7. The method of claim 5, wherein assigning the determined weights comprises:
assigning a first range of weights to the at least one matching metric for registering one or more rigid spatial locations in the target volume; and
assigning a second range of weights to the at least one matching metric for registering one or more spatial locations in the target volume that exhibit non-rigid motion;
wherein the first range of weights is higher than the second range of weights.
8. The method of claim 1, further comprising:
determining a plurality of motion vector fields, wherein the plurality of motion vector fields are configured to map the plurality of gated structural images and the plurality of gated functional images to the converged mean structural image and the mean functional image, respectively; and
correcting for motion artifacts in the plurality of gated functional images using the plurality of motion vector fields.
9. The method of claim 8, wherein determining the plurality of motion vector fields comprises determining one or more of a local motion model and a global motion model indicative of motion of the subject during acquisition of the plurality of gated functional images, the plurality of gated structural images corresponding to the target volume, or a combination thereof.
10. The method of claim 1, wherein iteratively computing the transforms comprises optimizing a determined objective function, wherein the determined objective function is representative of a joint group-wise non-rigid registration of the plurality of gated functional images to the evolving mean functional image and one or more of the plurality of gated structural images to the evolving mean structural image based on the at least one matching metric.
11. The method of claim 1, further comprising determining convergence of one or more of the evolving mean functional image and the evolving mean structural image based upon determined distances between the iteratively computed transforms corresponding to the plurality of gated structural images and the plurality of gated functional images in a particular iteration.
12. The method of claim 1, further comprising:
determining one or more clinical parameters based on the one or more motion-corrected functional images; and
generating an alert if the one or more clinical parameters fall outside designated thresholds.
13. The method of claim 1, further comprising guiding a subsequent imaging scan of the target volume based on the one or more motion-corrected functional images.
14. The method of claim 1, wherein the gated functional images correspond to gated positron emission tomography images and the gated structural images correspond to gated cine computed tomography images.
15. A multi-modality imaging system, comprising:
a functional imaging subsystem configured to generate a plurality of gated functional images corresponding to a target volume in a subject;
a structural imaging subsystem configured to generate a plurality of gated structural images corresponding to the target volume in the subject;
a processing subsystem in operative association with one or more of the functional imaging subsystem and the structural imaging subsystem and configured to:
compute an evolving mean functional image and an evolving mean structural image by iteratively averaging a plurality of pixels corresponding to the plurality of gated functional images and the plurality of gated structural images, respectively;
iteratively compute transforms that align one or more of the plurality of gated functional images to the evolving mean functional image and one or more of the plurality of gated structural images to the evolving mean structural image based on at least one matching metric until convergence of the evolving mean functional image and the evolving mean structural image; and
generate one or more motion-corrected functional images corresponding to the target volume based on the converged mean functional image and the mean structural image.
16. The imaging system of claim 15, wherein the functional imaging subsystem is a positron emission tomography-computed tomography system or a single photon emission computed tomography system, and wherein the structural imaging subsystem is a computed tomography system, a magnetic resonance imaging system, an X-ray imaging system, or combinations thereof.
17. The imaging system of claim 15, further comprising an alerting subsystem communicatively coupled to the processing subsystem, wherein the alerting subsystem is configured to generate an alert if one or more clinical parameters determined based on the one or more motion-corrected functional images fall outside designated thresholds.
BACKGROUND
Embodiments of the present specification relate generally to diagnostic imaging, and more particularly to a system and a method for robust motion correction of functional images.
Non-invasive imaging techniques are widely used in security screening, quality control, and medical diagnostic systems. Particularly, in medical imaging, non-invasive diagnostic imaging techniques such as multi-energy imaging allow for unobtrusive, convenient, and fast imaging of underlying tissues and organs. Additionally, certain non-invasive imaging techniques may also allow for visualization of functional behavior such as chemical or metabolic activities of organs and tissues within a patient. By way of example, a positron emission tomography (PET) system may be used to generate PET images that represent a distribution of positron-emitting nuclides within the patient’s body. The distribution of the positron-emitting nuclides, in turn, may be correlated with various structural and/or functional parameters that are indicative of a pathological condition of the patient. Use of PET images thus aids in diagnosis, identification of a suitable radiation therapy, and/or radiation therapy planning for the patient.
Generally, PET images are acquired over a time interval of several minutes. During this time interval, the image data acquisition may be affected by patient motion such as motion due to respiration, cardiac motion, and/or other gross patient movement. Such patient motion may cause motion artifacts and/or other discrepancies in acquired PET image data, which in turn, may lead to reconstruction of erroneous PET images. The erroneous PET images are unsuitable for use in identifying diagnostic parameters such as planning tumor volume and/or for prescribing suitable treatment for the patient.
Certain conventional PET imaging systems attempt to alleviate these shortcomings by employing gating methods that entail acquiring PET image data during different breathing phases or gates. Gated acquisition of image data allows for mitigation of the motion artifacts in the acquired PET image data that are caused by patient motion. Although use of the gating methods may reduce motion artifacts in the gated PET image data corresponding to individual gates, each gate in isolation may suffer from a low signal-to-noise ratio due to the reduced photon counts recorded within the corresponding acquisition time interval. Furthermore, the PET images reconstructed from PET image data corresponding to different gates may not be in alignment. Such misalignment of the gated PET image data may impede accurate localization and quantification of features of interest, such as tumors and corresponding volumes, in the resulting PET images.
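The per-gate noise penalty mentioned above follows from Poisson counting statistics: for a Poisson process, SNR ≈ √N, so splitting an acquisition's counts evenly across G gates cuts each gate's SNR by √G. A minimal numeric sketch (the count values below are hypothetical, not from any actual acquisition):

```python
import math

def poisson_snr(counts):
    # For Poisson-distributed photon counts, SNR = mean / std-dev = sqrt(N)
    return math.sqrt(counts)

total_counts = 6_000_000  # hypothetical counts over the whole acquisition
num_gates = 6             # respiratory gating often uses 4-6 gates

snr_ungated = poisson_snr(total_counts)
snr_per_gate = poisson_snr(total_counts / num_gates)

# Each gate's SNR is lower than the ungated SNR by a factor of sqrt(num_gates)
print(round(snr_ungated / snr_per_gate, 3))  # 2.449, i.e. sqrt(6)
```

This is why, as noted above, gated PET data must subsequently be registered and combined rather than used gate-by-gate.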
Accordingly, certain conventional PET imaging systems employ elastic or non-rigid registration (NRR) methods for mitigating the motion artifacts in PET images. Specifically, the conventional PET imaging systems may reconstruct PET images from the acquired PET image data, non-rigidly register the gated PET images, and average the registered images to mitigate errors caused by the motion artifacts. Generally, the NRR methods may be posed as an estimation of optimal transforms that map the gated PET images to a ‘reference image.’ Further, conventional NRR methods, for example, may employ patch-based metrics, regularization, physics-based tissue elasticity modeling, viscosity modeling, and/or diffeomorphic registration approaches for motion correction of the gated PET images. However, such conventional NRR methods are often unstable due to presence of large motion, intrinsic noise, and/or inability to preserve small structures.
Certain PET imaging systems attempt to address the shortcomings of conventional NRR methods via use of reference-based registration methods. Specifically, in the reference-based registration methods, one of the PET gate images is chosen as the reference-gated PET image for pair-wise NRR of a plurality of gated PET images. Use of a reference-gated PET image, however, may cause the resulting PET images to be biased towards the selected reference-gated PET image. Additionally, quality of the resulting PET images may be limited by a suitability of the selected reference-gated PET image. For example, if the reference-gated PET image includes anatomy exhibiting extreme motion, other images may have to undergo large deformation to be registered to the reference-gated PET image, thus resulting in low registration quality and/or need for complicated computations.
Alternatively, certain conventional PET systems may use a group-wise NRR method for motion correction of PET images. Unlike pair-wise NRR, group-wise NRR is a ‘reference free’ method that is not biased by the choice of the reference image. Specifically, the group-wise NRR method jointly and iteratively estimates the ‘reference image’ and the corresponding transforms that map the PET images to the iteratively estimated ‘reference image.’ However, currently available group-wise NRR methods are limited to the generation of anatomical atlases, which involves large populations in which no correlation of motion is expected. Accordingly, the currently available group-wise NRR methods merely address challenges relating to multiple modes of variation in the population.
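The alternating structure of group-wise registration described above can be sketched in a few lines. This toy version restricts transforms to integer translations and uses a sum-of-squared-differences metric; real group-wise NRR would estimate dense deformation fields, so treat this purely as an illustration of the "update mean, re-register gates" loop, not as the specification's actual algorithm:

```python
import numpy as np

def shift(img, dy, dx):
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def best_shift(moving, target, max_shift=3):
    # SSD-minimizing integer translation: a toy stand-in for the dense
    # non-rigid transforms a real implementation would estimate.
    shifts = [(dy, dx) for dy in range(-max_shift, max_shift + 1)
              for dx in range(-max_shift, max_shift + 1)]
    return min(shifts, key=lambda s: float(np.sum((shift(moving, *s) - target) ** 2)))

def groupwise_register(gates, max_iters=10):
    transforms = [(0, 0)] * len(gates)
    for _ in range(max_iters):
        # Evolving mean: average the gates under their current transforms
        mean = np.mean([shift(g, *t) for g, t in zip(gates, transforms)], axis=0)
        # Re-register every gate to the evolving mean (no fixed reference gate)
        new_transforms = [best_shift(g, mean) for g in gates]
        if new_transforms == transforms:  # transforms stopped changing: converged
            break
        transforms = new_transforms
    return mean, transforms

# Synthetic "gates": a peaked blob displaced by per-gate motion
base = np.zeros((16, 16))
base[6:9, 6:9] = 1.0
base[7, 7] = 5.0  # distinct peak makes the toy registration well-determined
motions = [(0, 0), (2, 0), (0, 2), (2, 2)]
gates = [shift(base, dy, dx) for dy, dx in motions]

mean, transforms = groupwise_register(gates)
print(transforms)  # [(1, 1), (-1, 1), (1, -1), (-1, -1)]
```

Note that the converged transforms align all gates to a mean near the group centroid rather than to any single gate, which is exactly the freedom from reference bias described above.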
However, gated PET data often includes as few as four to six gates. The currently available group-wise NRR methods prove unstable when processing such gated PET data because multiple solution pairs may be determined even locally for the jointly estimated reference image and the corresponding transforms. Accordingly, PET images that have been motion-corrected using conventional reference-based registration or group-wise NRR methods may be unsuitable for investigating and/or accurately characterizing minute features within a target volume in the patient.
BRIEF DESCRIPTION
In accordance with certain aspects of the present specification, a method for imaging a target volume of a subject is disclosed. The method entails receiving a plurality of gated functional images and a plurality of gated structural images corresponding to the target volume. The method further includes computing an evolving mean functional image and an evolving mean structural image by iteratively averaging a plurality of pixels corresponding to the plurality of gated functional images and the plurality of gated structural images, respectively. Moreover, the method includes iteratively computing transforms that align one or more of the plurality of gated functional images to the evolving mean functional image and one or more of the plurality of gated structural images to the evolving mean structural image based on at least one matching metric until convergence of the evolving mean functional image and the evolving mean structural image. Additionally, the method includes generating one or more motion-corrected functional images corresponding to the target volume based on the converged mean functional image and the mean structural image.
In accordance with certain aspects of the present specification, a multi-modality imaging system is presented. The system includes a functional imaging subsystem configured to generate a plurality of gated functional images corresponding to a target volume in a subject. Further, the system includes a structural imaging subsystem configured to generate a plurality of gated structural images corresponding to the target volume in the subject. Additionally, the system includes a processing subsystem in operative association with one or more of the functional imaging subsystem and the structural imaging subsystem. The processing subsystem is configured to compute an evolving mean functional image and an evolving mean structural image by iteratively averaging a plurality of pixels corresponding to the plurality of gated functional images and the plurality of gated structural images, respectively. Moreover, the processing subsystem is configured to iteratively compute transforms that align one or more of the plurality of gated functional images to the evolving mean functional image and one or more of the plurality of gated structural images to the evolving mean structural image based on at least one matching metric until convergence of the evolving mean functional image and the evolving mean structural image. Additionally, the processing subsystem is configured to generate one or more motion-corrected functional images corresponding to the target volume based on the converged mean functional image and the mean structural image.
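The joint group-wise registration summarized above can be written, in one illustrative form (the symbols and the coupling weight λ are editorial assumptions, not notation from the specification), as

```latex
\min_{\{T_g\},\,\mu_F,\,\mu_S}\;\sum_{g=1}^{G}\Big[\,D_F\!\big(F_g \circ T_g,\;\mu_F\big)
  \;+\; \lambda\, D_S\!\big(S_g \circ T_g,\;\mu_S\big)\Big],
\qquad
\mu_F = \frac{1}{G}\sum_{g=1}^{G} F_g \circ T_g,
\quad
\mu_S = \frac{1}{G}\sum_{g=1}^{G} S_g \circ T_g,
```

where F_g and S_g denote the gated functional and structural images, T_g is the per-gate transform shared across the two modalities, D_F and D_S are the matching metrics, and the evolving means μ_F and μ_S are re-averaged after every update of the transforms until they converge.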
DRAWINGS
These and other features and aspects of embodiments of the present specification will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 is a schematic representation of an exemplary imaging system for use in motion correction of acquired image data, in accordance with aspects of the present specification;
FIG. 2 is a schematic representation of another exemplary imaging system for use in motion correction of acquired image data, in accordance with aspects of the present specification;
FIG. 3 is a flow chart illustrating an exemplary method for motion correction of acquired image data, in accordance with aspects of the present specification;
FIG. 4 is a graphical representation of group-wise joint non-rigid registration of computed tomography (CT) and PET images using the method of FIG. 3, in accordance with aspects of the present specification; and
FIG. 5 is a diagrammatical representation of exemplary PET and CT images generated using the method of FIG. 3, in accordance with aspects of the present specification.
DETAILED DESCRIPTION
The following description presents exemplary systems and methods for robust motion correction of functional image data corresponding to a subject. Particularly, embodiments illustrated hereinafter present exemplary imaging systems and methods that provide an enhanced group-wise non-rigid registration (GW-NRR) framework for jointly registering gated PET images with gated images generated using another imaging modality. In accordance with certain aspects of the present specification, the gated images may be obtained from a radiographic imaging modality that allows for accurate delineation of anatomical features in a target volume of the subject. The joint GW-NRR of PET images with other radiographic images, in turn, reduces motion artifacts, thus aiding in generation of high quality PET images that do not suffer from reference bias.
Although exemplary embodiments of the present systems and methods are described in the context of a hybrid PET-CT imaging system, it will be appreciated that use of the present systems and methods in various other imaging applications and systems is also contemplated. Some of these systems, for example, may include a PET-magnetic resonance imaging (MRI) system, a PET-X-ray system, a single photon emission computed tomography (SPECT)-CT system, a SPECT-MRI system, and/or a SPECT-X-ray system. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in the following sections with reference to FIGS. 1-2.
FIG. 1 illustrates an exemplary imaging system 100 for correcting motion artifacts in image data acquired from a target volume in a subject such as a patient or a non-biological object. In one embodiment, the imaging system 100, for example, may include a hybrid PET-CT imaging system, a PET-MRI system, a SPECT-CT system, and/or a SPECT-MRI system that may be configured to acquire image data for use in generating desired images of the patient. However, for clarity, the present embodiment of the system 100 will be described with reference to a hybrid PET-structural imaging system that includes a PET imaging system 101 and a generic radiographic imaging system 102.
Generally, during PET imaging, a positron-emitting radionuclide may be introduced into the patient’s body via a biologically active molecule. The radionuclide may undergo positron emission decay and emit a positron that travels a short distance in tissue. The emitted positron may subsequently interact with an electron. Typically, a positron-electron interaction results in annihilation, thus converting the entire mass of the positron-electron pair into two 511 kilo-electron volt (keV) photons emitted in opposite directions along a line of response (LOR). In certain embodiments, the system 100 may be configured to detect and correlate the emitted photons to functional information corresponding to the patient.
In one embodiment, the system 100 may include the PET imaging system 101 that may be configured to detect a coincidence event if both the emitted photons arrive and are detected during the same temporal window or gate. Additionally, the PET imaging system 101 may be configured to use the detected coincidence information for generating two-dimensional (2D) or three-dimensional (3D) gated PET images corresponding to the patient.
Moreover, in certain embodiments, the PET imaging system 101 may include a gantry 103, which may be configured to support a detector ring assembly 104 that is positioned about a central axis or a patient bore 106 in the PET imaging system 101. Further, the PET imaging system 101 may include a table 108 positioned in front of the gantry 103 and aligned in line with the patient bore 106. Moreover, in one embodiment, the gated PET imaging system 101 may include a table controller (not shown) that may be configured to semi-automatically and/or automatically move the table 108 into the patient bore 106 in response to one or more suitable commands. These commands, for example, may be received from a user via an operator workstation 110 that may be communicatively coupled to the PET imaging system 101 via one or more communication links 112.
Furthermore, in certain embodiments, the PET imaging system 101 may also include a gantry controller 114 that may be configured to operate the gantry 103 in response to the commands received via the operator workstation 110 and/or based on stored instructions. Specifically, the gantry controller 114 may be configured to suitably position the gantry 103 relative to the patient bore 106 to operate the PET imaging system 101 in different modes, such as 2D or 3D data acquisition modes, for efficient PET image reconstruction.
Additionally, in one embodiment, the PET imaging system 101 may include a data acquisition subsystem (DAS) 116 that may be configured for acquiring and processing image data determined for the detected radiation events. Particularly, in certain embodiments, the DAS 116 may further include a detection unit 118 and a processing subsystem 120. The detection unit 118 may be configured to detect image data corresponding to individual radiation events, whereas the processing subsystem 120 may be configured to identify coincidence events from the radiation events based on corresponding timestamps. Accordingly, the processing subsystem 120, for example, may include one or more application-specific processors, graphical processing subsystems, digital signal processors, microcomputers, microcontrollers, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Programmable Logic Arrays (PLAs), and/or other suitable processing devices.
In certain embodiments, the processing subsystem 120 may be configured to store image data associated with the identified coincidence events, for example, in a chronological list in a data repository 122. The data repository 122, for example, may include a hard disk drive, a floppy disk drive, a compact disk-read/write (CD-R/W) drive, a Digital Versatile Disc (DVD) drive, a flash drive, and/or a solid-state storage device. Further, in one embodiment, the processing subsystem 120 may be configured to use the chronological list of coincidence data retrieved from the data repository 122 to reconstruct suitable PET images for display and/or diagnosis.
However, during conventional PET imaging, voluntary and/or involuntary patient motion such as movement of limbs, respiration, and/or cardiac motion may result in significant motion artifacts in the acquired PET image data. Accordingly, conventional PET imaging systems employ reference-based registration methods that entail registration of acquired PET images to a manually selected reference PET image followed by averaging of the registered PET images to generate a final motion-corrected PET image. Conventional registration methods, however, provide inadequate motion correction for PET images in the presence of large patient motion such as motion of the liver and/or gross movements of the patient.
The shortcomings of these conventional registration methods may be circumvented via use of the exemplary system 100. Particularly, the system 100 may be configured to apply an enhanced image registration method to the acquired PET image data for correcting and/or compensating for motion artifacts caused by such large patient motion. By way of example, in one embodiment, the processing subsystem 120 may be configured to allow for efficient registration of the acquired PET image data using a priori information. In one embodiment, the a priori information may correspond to image data information determined from an additional radiographic imaging system 102 that may be communicatively coupled to the PET imaging system 101 via the one or more communication links 113. The additional radiographic imaging system 102, for example, may include an imaging system such as a CT imaging system or an MRI system that may allow for an enhanced delineation of one or more anatomical features corresponding to the patient. Certain exemplary methods for motion correction and/or enhanced registration of the acquired PET image data based on the a priori information will be described in greater detail with reference to FIGS. 2-4.
FIG. 2 illustrates another embodiment of an exemplary imaging system 200 configured to provide enhanced motion correction and registration of imaging data. For clarity, the system 200 will be described with reference to a hybrid PET-CT imaging system. Accordingly, in one embodiment, the system 200 includes a PET imaging subsystem 201 and a CT imaging subsystem 202 for use in enhanced imaging of a target volume of a subject such as a patient. Although FIG. 2 depicts the system 200 as a hybrid PET-CT imaging system, in certain embodiments, the system 200 may include independent CT and PET imaging systems that are communicatively coupled to each other. Alternatively, the system 200 may correspond to other suitable multi-modality systems such as a PET-MRI system, a SPECT-CT system, or a SPECT-MRI system.
Further, in one embodiment, the system 200 may include a detector ring assembly 203 disposed about a patient bore (not shown in FIG. 2). Specifically, the system 200 may include multiple detector rings that may be spaced along a central axis of the PET imaging subsystem 201 to form the detector ring assembly 203. The detector rings, in turn, may include a plurality of detector modules 204, for example, including an array of individual bismuth germanate (BGO) detector crystals. Generally, these detector modules 204 may be used to detect gamma radiation emitted from the patient and may produce photons in response to the detected gamma radiation.
Accordingly, in one embodiment, the plurality of detector modules 204 may be positioned proximate to a plurality of photomultiplier tubes (not shown) in the PET imaging subsystem 201. In certain embodiments, the photomultiplier tubes (PMTs) may be configured to produce analog signals when a scintillation event occurs at one of the detector modules 204. Specifically, the PMTs may be configured to produce analog signals when a gamma ray emitted from the patient is received by one of the detector modules 204. Further, the PET imaging subsystem 201 may also include a set of acquisition circuits 206 that may be configured to receive the analog signals and generate corresponding digital signals. In one embodiment, the digital signals may be indicative of a location and energy associated with a detected radiation event. The digital signals, thus, may be correlated with functional information, which may be used in PET image reconstruction.
Moreover, in certain embodiments, the PET imaging subsystem 201 may further include a DAS 208 that may be configured to periodically sample the digital signals produced by the acquisition circuits 206. The DAS 208, in turn, may include a processing subsystem 210, which may be configured to control communication between different components of the PET imaging subsystem 201 via coupling means 212. In one embodiment, the coupling means 212, for example, may include electrical circuitry, electronic circuitry, a backplane bus, a wired communication network, and/or a wireless communication network. Additionally, the DAS 208 may also include one or more event locator units 214 that may be configured to assemble information corresponding to each valid radiation event into an event data packet. The event data packet, for example, may include a set of digital numbers that may accurately indicate a time of occurrence of the radiation event and a position of the detector crystals that detected the radiation event.
In certain embodiments, the event locator units 214 may be further configured to communicate the assembled event data packets to a coincidence detector 216 for determining coincidence events. Particularly, the coincidence detector 216 may be configured to identify coincidence event pairs if time and location markers in two event data packets are within designated thresholds. By way of example, in one embodiment, the coincidence detector 216 may be configured to identify a coincidence event pair if time markers corresponding to two event data packets are within 12 nanoseconds of each other and if the corresponding locations lie on a straight line passing through a field of view (FOV) across a patient bore.
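The 12-nanosecond coincidence test described above can be sketched as a single pass over time-sorted event packets. The tuple format below is a hypothetical simplification of the event data packets, and the check that the two crystal locations define a line of response through the field of view is omitted:

```python
def find_coincidences(events, window_ns=12.0):
    # events: time-sorted list of (timestamp_ns, crystal_id) tuples,
    # a simplified stand-in for real event data packets.
    pairs, i = [], 0
    while i < len(events) - 1:
        t0, t1 = events[i][0], events[i + 1][0]
        if t1 - t0 <= window_ns:
            pairs.append((events[i], events[i + 1]))
            i += 2  # both events consumed by this coincidence pair
        else:
            i += 1  # singles event: no partner within the time window
    return pairs

events = [(0.0, 17), (5.0, 342), (100.0, 8), (400.0, 51), (409.0, 290)]
print(len(find_coincidences(events)))  # 2 coincidence pairs
```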
Additionally, in certain embodiments, the PET imaging subsystem 201 may be configured to store the determined coincidence event pairs in a storage subsystem 218. Although FIG. 2 depicts the storage subsystem 218 as part of the imaging subsystem 201, in another embodiment, the storage subsystem 218 may be an independent device that may be operatively coupled to the PET imaging subsystem 201. The storage subsystem 218, for example, may include a sorter 220 that may be configured to sort the coincidence events. In one embodiment, for example, the sorter 220 may be configured to sort the coincidence events in a 3D projection plane format using a look-up table. Particularly, the sorter 220 may be configured to determine an order of the detected coincidence event data using one or more parameters such as a radius or projection angles for efficient storage.
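Sorting by radius and projection angle amounts to converting each LOR, defined by its two detector endpoints, into sinogram coordinates (r, θ). A simplified two-dimensional sketch, with hypothetical crystal coordinates in millimeters about the scanner center:

```python
import math

def lor_sinogram_coords(p1, p2):
    # Convert a line of response through detector points p1, p2 (x, y in mm)
    # into sinogram coordinates: theta, the angle of the line's normal, and
    # r, the signed perpendicular distance of the line from the origin.
    x1, y1 = p1
    x2, y2 = p2
    theta = math.atan2(x1 - x2, y2 - y1)  # direction of the LOR's normal
    r = x1 * math.cos(theta) + y1 * math.sin(theta)
    return r, theta

# A vertical LOR passing 50 mm to the right of the scanner center
r, theta = lor_sinogram_coords((50.0, -300.0), (50.0, 300.0))
print(round(r, 6), round(math.degrees(theta), 6))  # 50.0 0.0
```

Binning every coincidence by (r, θ) in this way is one conventional route to the projection-plane storage format mentioned above.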
Further, in certain embodiments, the processing subsystem 210 may be configured to process the coincidence event data to determine corresponding time-of-flight (TOF) information. The TOF information may allow the PET imaging subsystem 201 to estimate a point of origin of the electron-positron annihilation with greater accuracy, thus improving event localization. The event localization information, in turn, may be used to enhance localization of one or more features of interest in reconstructed PET images.
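The improved localization follows from the photon arrival-time difference: the annihilation point lies offset from the midpoint of the LOR by c·Δt/2, toward the detector that fired later. A small numeric sketch:

```python
C_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def tof_offset_mm(dt_ns):
    # Offset of the annihilation point from the midpoint of the line of
    # response, given the arrival-time difference dt: x = c * dt / 2
    return C_MM_PER_NS * dt_ns / 2.0

# A 400 ps timing difference localizes the event to roughly 6 cm from
# the LOR midpoint, versus no localization along the LOR without TOF.
print(round(tof_offset_mm(0.4), 1))  # 60.0
```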
Particularly, in one embodiment, the PET imaging subsystem 201 may include an image reconstruction unit 224 that may be configured to use the improved event localization data to generate high-resolution gated PET images corresponding to the target volume in the patient. In certain embodiments, the image reconstruction unit 224 may be an independent device that is communicatively coupled to the PET imaging subsystem 201. However, in certain other embodiments, the image reconstruction unit 224 may be an integral part of the processing subsystem 210. Alternatively, the image reconstruction unit 224 may be absent and the processing subsystem 210 may be configured to perform one or more functions of the image reconstruction unit 224 such as reconstruction of gated PET images.
Moreover, according to certain aspects of the present specification, the image reconstruction unit 224 may be configured to mitigate the motion artifacts in the reconstructed PET images. Particularly, the image reconstruction unit 224 may be configured to correct for the motion artifacts in the PET images that result from respiration, cardiac motion, and/or other gross patient motion via use of a priori information determined from the CT imaging subsystem 202. Accordingly, in one embodiment, the PET imaging subsystem 201 may be configured to acquire gated PET imaging data corresponding to the target volume in phase with acquisition of the gated cine CT image data via a gating device 228 such as an electrocardiogram machine.
Particularly, in certain embodiments, the gating device 228 may be configured to provide a signal to the system 200 to schedule gated PET image data acquisition in phase with the gated cine CT image data acquisition at a plurality of gating time intervals. The gating time intervals may be pre-programmed or may be manually selected by a user, for example, using an input device 230 available on an associated operator workstation 232. Subsequently, the gated PET image data and the gated cine CT image data may be used to respectively generate gated PET images and gated cine CT images.
Generally, owing to a difference in underlying physics corresponding to the CT and PET imaging, the gated CT, cine CT, and PET images may provide enhanced visualization of different features in the target volume. For example, lesions may be more pronounced in the gated PET images, whereas the vertebra may be more pronounced in the gated cine CT images. Accordingly, instead of relying on a single imaging modality, the image reconstruction unit 224 may be configured to perform a joint group-wise NRR of the gated PET images and the gated cine and/or diagnostic CT images to identify motion corresponding to the different features in the target volume. As previously noted, the joint NRR may preclude use of a specific gated cine CT image or a gated PET image as a reference image. Instead, the joint NRR may employ iteratively evolving mean PET and evolving mean CT images for use in jointly registering the gated PET images and the gated cine CT images based on a selected matching metric. According to aspects of the present specification, the image reconstruction unit 224 may be configured to use the matching metric to determine suitable transforms that align the gated cine CT images and the gated PET images to the iteratively evolving mean PET and evolving mean CT images.
Accordingly, in a presently contemplated embodiment, the image reconstruction unit 224 may be configured to initialize the evolving mean cine CT and PET images, for example, by averaging pixel intensities in a plurality of the gated cine CT images and gated PET images, respectively. Further, the image reconstruction unit 224 may be configured to transform each of the gated cine CT and gated PET images to the corresponding initialized mean images based on the matching metric. The image reconstruction unit 224 may also be configured to update the mean cine CT and PET images by averaging the pixel intensities in corresponding transformed gated cine CT and PET images, respectively. In accordance with aspects of the present specification, the image reconstruction unit 224 may be configured to iteratively update the mean cine CT and PET images and iteratively transform gated cine CT and gated PET images to the updated mean cine CT and PET images based on the matching metric until convergence of the evolving mean cine CT and/or PET images.
Further, in one embodiment, the matching metric may be selected to reduce a distance (for example, a weighted Euclidean distance) between transforms computed for each of the gated cine CT images and corresponding gated PET images. Typically, cine CT image data corresponds to image data acquired using a low radiation scan. Accordingly, while the cine CT image data may be used for PET attenuation correction, the cine CT image data may not be suitable for diagnostic purposes. Hence, in certain embodiments, it may be desirable to fuse the gated PET data with diagnostic CT data for identifying comprehensive structural information that may be used for diagnosing the patient, where the diagnostic CT data may be acquired using a helical CT scan. In such embodiments, the matching metric may be selected such that a determined distance between the evolving mean cine CT image and a selected diagnostic CT image is iteratively reduced. Specifically, the matching metric may be used to continually reduce determined distances between different transforms that are computed for each of the gated cine CT and/or PET images and corresponding evolving mean cine CT and/or PET images until convergence of the evolving mean cine CT and/or PET images. In one embodiment, the evolving mean cine CT and/or PET images are determined to have converged when the determined distances are within clinically or user prescribed thresholds.
According to aspects of the present specification, convergence of the evolving mean cine CT and/or PET images is indicative of the joint NRR of the corresponding gated cine CT, PET, and/or diagnostic CT images. The joint NRR, as described herein, mitigates the misalignment of the gated PET images due to presence of the motion artifacts and aids in generating motion-corrected PET images of a desired quality and/or clinical specification. Particularly, registering the gated PET images via use of the a priori information determined from cine CT and/or diagnostic CT images provides extensive structural information, thereby allowing for a more efficient feature-based registration and/or motion correction in resulting PET images. Additionally, the joint NRR may also allow for registration of the gated PET images to gated diagnostic CT images, which may be acquired in a different phase, without using any complicated and/or computationally expensive multi-modality transformation metric.
Further, in one embodiment, the image reconstruction unit 224 may be configured to transmit the motion-corrected PET images to an output device 234. The output device 234, for example, may be operatively coupled to the operator workstation 232. Furthermore, the output device 234 may include devices such as a display device, an audio device, and/or a video device that allow a medical practitioner access to the motion-corrected PET images. Additionally, in certain embodiments, the image reconstruction unit 224 may be configured to transmit the motion-corrected PET images to the processing subsystem 210 for further analysis. In one embodiment, the processing subsystem 210 may be configured to determine one or more clinical parameters of interest such as lesion volumes from the motion-corrected PET images. Additionally, the processing subsystem 210 may be configured to transmit the determined clinical parameters and/or the motion-corrected PET images to the output device 234 in near real-time for aiding in clinical diagnosis and/or prescribing a suitable treatment for the patient.
Communication of the motion-corrected PET images and the clinical parameters in near real-time may allow the medical practitioner to diagnose and/or prescribe suitable treatment for the patient in a timely manner. Additionally, in certain embodiments, the processing subsystem 210 may be configured to generate an audio and/or visual alert via an alerting subsystem 236 if one or more of the clinical parameters fall outside corresponding clinical or user-prescribed thresholds. For example, the processing subsystem 210 may be configured to activate blinking lights, issue audio instructions, and/or transmit an email or short messaging service alert to a caregiver for ensuring provision of medical attention to the patient, for example, within 60 seconds. In certain other embodiments, the motion-corrected PET images may be used as feedback for guiding a subsequent imaging scan of the patient.
It may be noted that the specific arrangements depicted in FIGs. 1-2 are exemplary. In certain embodiments, the systems 100 and 200 may be configured or customized for additional functionality, different imaging applications, and/or different scanning protocols. By way of example, in one embodiment, the systems 100 and/or 200 may include a picture archiving and communications system (PACS). The PACS (not shown) may be further coupled to a radiology department information system, a hospital information system and/or to an internal or an external communications network to provide remote access to the motion-corrected PET images and/or the clinical parameters. Specifically, in certain embodiments, the communications network may allow operators at different locations to supply commands and parameters and/or gain access to the high-resolution PET images and/or the determined clinical parameters.
Embodiments of the present PET imaging subsystem 201, thus, employ the a priori information to greatly reduce motion artifacts in PET images. Particularly, use of cine CT and/or helical CT-based a priori information may provide more extensive structural information that may be used to more efficiently detect and correct for large motion in the target volume without use of complicated and/or computationally expensive multi-modal registration and/or motion correction methods. Certain exemplary methods for enhanced motion correction in PET images using the a priori information will be described in greater detail with reference to FIGs. 3-4.
FIG. 3 illustrates a flow chart 300 depicting an exemplary method for enhanced motion correction. In the present specification, embodiments of the exemplary method may be described in a general context of computer executable instructions on a computing system or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract image data types.
Additionally, embodiments of the exemplary method may also be practised in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
Further, in FIG. 3, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that may be performed, for example, during the steps of generating an evolving mean functional image, iteratively computing transforms corresponding to the functional images, and/or generating one or more motion-corrected functional images in the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
The order in which the exemplary method is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein. For discussion purposes, the exemplary method will be described with reference to the elements of FIGs. 1-2.
Embodiments of the present method allow for enhanced motion correction in gated functional images via a reference-free, group-wise, and joint NRR of gated functional images and gated structural images. For clarity, the present method is described with reference to enhanced motion correction in PET images using a priori information determined from a reference-free, group-wise, and joint NRR of gated cine CT images and gated PET images. However, certain embodiments of the present method may also be used for enhanced motion correction in PET images using a priori information determined from joint NRR of gated PET images with images acquired from other imaging systems such as an MRI or an ultrasound imaging system. Particularly, use of the reference-free joint NRR precludes use of a selected reference CT or PET image, thereby preventing reference image-based bias in subsequently registered images.
In one embodiment, the method begins at step 302, where gated cine CT and PET image data corresponding to a target volume in a subject may be received. The target volume, for example, may correspond to the heart, lungs, or a liver region in the subject such as a patient. In certain embodiments, the gated cine CT and PET image data may be acquired by a hybrid PET-CT imaging system such as the system 200 of FIG. 2 or independent CT and PET systems. In a presently contemplated embodiment, the acquired CT image data may correspond to gated cine CT image data acquired using a low dose cine CT scan and/or diagnostic CT image data acquired by a helical CT scan. Moreover, in certain embodiments, the gated cine CT and PET image data may be acquired in phase such that gated cine CT and PET image data acquired during similar periods of patient motion or similar time intervals are binned together. In one embodiment, the similar periods of patient motion may correspond to six different breathing phases between end inspiration and end expiration. Accordingly, in one example, the gated cine CT and PET image data may be binned into six different gates representative of the six different breathing phases.
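The phase-binning of step 302 can be sketched as follows. This illustrative example (function and parameter names are assumptions, not part of the method) assigns hypothetical list-mode event times to the six breathing-phase gates described above, given each event's respiratory phase in [0, 1) between end inspiration and end expiration:

```python
import numpy as np

def bin_by_phase(event_times, resp_phase, n_gates=6):
    """Bin list-mode event times into respiratory gates. resp_phase
    gives each event's phase in [0, 1); the six-gate split mirrors
    the breathing-phase example in the text."""
    edges = np.linspace(0.0, 1.0, n_gates + 1)
    # digitize returns 1-based bin indices; shift and clamp to 0..n_gates-1
    gate_idx = np.clip(np.digitize(resp_phase, edges) - 1, 0, n_gates - 1)
    return [event_times[gate_idx == g] for g in range(n_gates)]
```

Data acquired in phase for both modalities would be binned with the same gate edges so that corresponding CT and PET gates describe the same motion state.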
Further, at step 304, a plurality of gated cine CT and PET images corresponding to the target volume may be reconstructed using the gated cine CT and PET image data, respectively. Particularly, in one embodiment, the gated cine CT and PET images may be reconstructed, for example, using iterative image reconstruction or filtered backprojection. In addition to the gated cine CT and PET images, one or more diagnostic CT images may also be reconstructed from image data acquired using a helical CT scan.
However, as previously noted, the gated PET, cine CT, and diagnostic CT images may include motion artifacts due to patient motion during image data acquisition. Particularly, gross motion of the patient, movement of thoracic region due to heartbeat, and/or movement of lungs due to breathing during the data acquisition may cause motion artifacts in the gated CT and PET images. In accordance with aspects of the present specification, the plurality of gated cine CT and PET images may be processed using an enhanced registration method that employs a priori information to allow for correction of the motion artifacts caused by patient motion. In certain embodiments, the a priori information may correspond to information determined from the gated cine CT image data and/or the diagnostic CT image data acquired by an associated CT imaging system such as the CT imaging subsystem 202 of FIG. 2.
Generally, CT image data exhibits better signal-to-noise-ratio (SNR) and standardized tissue intensity values, thereby providing anatomic details corresponding to the target volume with greater clarity. Therefore, in one embodiment, the gated cine CT image data that is acquired in phase with the gated PET image data may be employed for improved feature-based joint NRR of gated cine CT and PET images even in presence of large patient motion. Particularly, the joint NRR may entail iteratively computing evolving mean cine CT and PET images until convergence for jointly registering the corresponding gated cine CT and gated PET images based on a selected matching metric.
Accordingly, at step 306, an evolving mean cine CT image and an evolving mean PET image may be computed from the plurality of gated cine CT images and gated PET images, respectively. By way of example, in one embodiment, the mean cine CT image and mean PET image may be computed by averaging pixel intensities in the plurality of gated cine CT and gated PET images, respectively. As used herein, the phrase “averaging the pixel intensities” may correspond to computing a mean, a mode, a median, and/or a weighted mean of the pixel intensities corresponding to the plurality of gated cine CT and/or gated PET images. The weighted mean, for example, may be determined by assigning lower weights to outlying pixel intensities.
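The pixel-wise averaging of step 306 can be sketched as follows. The down-weighting rule for outliers in the weighted variant is an illustrative choice, not one prescribed by the method:

```python
import numpy as np

def evolving_mean(images, mode="mean"):
    """Average a stack of gated images pixel-wise. The weighted mean
    down-weights outlying intensities by their distance from the
    per-pixel median (an illustrative weighting scheme)."""
    stack = np.stack(images).astype(float)
    if mode == "median":
        return np.median(stack, axis=0)
    if mode == "weighted":
        med = np.median(stack, axis=0)
        w = 1.0 / (1.0 + np.abs(stack - med))  # lower weight to outliers
        return (w * stack).sum(axis=0) / w.sum(axis=0)
    return stack.mean(axis=0)
```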
Subsequently, at step 308, one or more of the gated cine CT images and the gated PET images may be iteratively transformed to the mean CT image and the mean PET image, respectively, based on at least one matching metric. In one embodiment, the gated cine CT images and the gated PET images may be transformed to the evolving mean cine CT image and the evolving mean PET image, respectively, via use of one or more transformation representations such as B-splines or dense deformations. Additionally, in accordance with aspects of the present specification, at least one suitable matching metric may be selected to constrain computation of transforms that are iteratively computed for one or more of the gated cine CT and gated PET images. By way of example, the selected matching metric may include a mean square error metric, a mutual information metric, a feature-based metric, and/or a correlation metric.
Furthermore, in one embodiment, the matching metric may be selected to reduce a weighted distance (for example, Euclidean distance) between the transforms that are computed in each iteration for one or more of the gated cine CT images and corresponding gated PET images. In certain embodiments, it may be desirable to employ diagnostic CT data for more comprehensive clinical analyses. In such embodiments, a matching metric may be selected to reduce a distance between the evolving CT mean image and the diagnostic CT image.
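The weighted distance between the CT and PET transforms discussed above can be written compactly. In this sketch, each deformation field is assumed to be an array of per-pixel displacement vectors; the representation is an assumption for illustration:

```python
import numpy as np

def transform_distance(y_ct, y_pet, w=None):
    """Weighted Euclidean distance between a CT and a PET deformation
    field of identical shape (..., 2) holding per-pixel displacement
    vectors. `w` is an optional per-pixel weight map."""
    # squared vector difference at each pixel, optionally weighted
    diff = np.sum((y_ct - y_pet) ** 2, axis=-1)
    if w is not None:
        diff = w * diff
    return float(diff.sum())
```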
In scenarios where cine CT image data may be unavailable and it may be desirable to register the gated PET images directly to the diagnostic CT image, a multi-modal registration metric such as a mutual information (MI) metric may be selected as the matching metric. The MI metric may provide a measure of a similarity between a determined set of features in the diagnostic CT image and a gated PET image. Accordingly, in one embodiment, the MI metric may be maximized for aiding in joint NRR of the diagnostic CT image and gated PET images. Similarly, the MI metric may also be used in joint NRR of MRI images and gated PET images.
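A standard histogram-based estimate of the MI metric mentioned above can be sketched as follows; the bin count is an illustrative parameter:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Histogram-based mutual information between two images; larger
    values indicate greater statistical dependence between the two
    intensity distributions (a standard multi-modal similarity measure)."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                    # joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of a
    py = pxy.sum(axis=0, keepdims=True)          # marginal of b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

Maximizing this quantity over candidate transforms is what aligns a gated PET image to the diagnostic CT image when no phase-matched cine CT data is available.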
Additionally, in certain embodiments, the selected matching metric such as the Euclidean distance metric may be assigned suitable weights to constrain the joint NRR at selected regions in the gated PET images and the gated cine CT images. Particularly, in one embodiment, suitable weights may be automatically and/or manually assigned to the matching metric based on where the transforms are expected to exhibit a greater match. By way of example, a first range of weights may be assigned to the matching metric for transforming rigid spatial locations in the target volume that are conventionally known to exhibit minimal phase mismatch between the gated cine CT and gated PET images. In a similar fashion, a second range of weights that are smaller than the first range of weights may be assigned to the matching metric, for example, while transforming spatial locations that are known to exhibit non-rigid motion such as large respiratory and/or cardiac movements. Thus, use of suitable matching metrics may allow for efficient control over desired transformation of different regions in the gated cine CT and gated PET images to the mean CT and mean PET images, respectively. Certain examples of the matching metrics that may be used in the transformations computed for the gated cine CT and gated PET images will be described with reference to FIG. 4.
Further, at step 310, an iterative check may be performed to determine if the mean CT and the mean PET images have converged. In one embodiment, the mean CT image may be considered to have converged when a difference between the mean CT image and one or more of the plurality of gated cine CT images is equal to or less than a prescribed threshold. Similarly, the mean PET image may be considered to have converged when a difference between the mean PET image and one or more of the plurality of gated PET images is equal to or less than a prescribed threshold.
Moreover, in a presently contemplated embodiment, the difference between the mean CT and mean PET images and the corresponding gated cine CT and gated PET images, respectively, may be determined based on the matching metric used in step 308. The matching metric, for example, may provide an indication of the difference in intensities between the mean CT image and one or more of the gated cine CT images, and between the mean PET image and one or more of the gated PET images, respectively. Accordingly, in one embodiment, the matching metric may be compared with the prescribed threshold. In one embodiment, the prescribed threshold may correspond to a user-selected or clinically prescribed value. In one example, the value of the prescribed threshold may be representative of a determined noise variance value. Thus, if the matching metric corresponds to a value that is equal to or less than the prescribed noise variance value, the mean CT image and/or the gated CT image may be considered as having converged. In certain other embodiments, the mean CT image and/or the gated CT image may be considered as having converged if a desired number of iterations have been completed.
However, at step 310, if it is determined that the mean CT image and/or the mean PET image have not converged, the mean CT image and the mean PET image may be updated, as depicted by step 312. Particularly, in one embodiment, the mean CT image may be updated by averaging pixel intensities of the one or more transformed cine CT images generated at step 308. Similarly, the mean PET image may be updated by averaging pixel intensities of the one or more transformed PET images generated at step 308. Subsequently, the control may pass to step 308. Particularly, according to certain aspects of the present specification, the steps 308, 310, and 312 may be iteratively repeated until convergence of the iteratively updated mean CT image, the mean PET image, the gated cine CT images, and/or the gated PET images.
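Steps 306 through 312 can be condensed into a toy sketch of the reference-free group-wise loop. Here the gated images are reduced to 1-D signals and each 'transform' is an integer circular shift chosen to minimize the mean-square error against the evolving mean, a deliberately crude stand-in for the non-rigid transforms of the method:

```python
import numpy as np

def groupwise_register(gates, n_iter=10, tol=1e-6):
    """Reference-free group-wise registration sketch on 1-D gated
    signals. Returns the converged mean and the per-gate shifts."""
    gates = [np.asarray(g, dtype=float) for g in gates]
    mean = np.mean(gates, axis=0)                # step 306: initialize mean
    shifts = [0] * len(gates)
    for _ in range(n_iter):
        aligned = []
        for i, g in enumerate(gates):            # step 308: align each gate
            errs = [np.mean((np.roll(g, s) - mean) ** 2)
                    for s in range(len(g))]
            shifts[i] = int(np.argmin(errs))
            aligned.append(np.roll(g, shifts[i]))
        new_mean = np.mean(aligned, axis=0)      # step 312: update mean
        converged = np.mean((new_mean - mean) ** 2) < tol  # step 310
        mean = new_mean
        if converged:
            break
    return mean, shifts
```

No gate serves as the reference: every gate is pulled toward the evolving mean, which is itself recomputed from the aligned gates on each pass.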
Generally, convergence of the mean CT image and/or the mean PET image may entail spatial realignments and/or motion of pixels involved in iterative transformations of the gated cine CT and gated PET images to the mean CT and mean PET images, respectively. Accordingly, at step 314, a plurality of motion vector fields may be determined based on the converged mean CT image and the converged mean PET image. Specifically, in certain embodiments, the spatial realignments and/or motion of pixels during the iterative transformations that are computed at step 308 may be used to compute certain motion vector fields. In one embodiment, the motion vector fields may be indicative of motion of a patient during a gated CT and/or a gated PET image data acquisition. Alternatively, the motion vector fields may define a type and/or magnitude of a transformation of pixels in the gated cine CT and/or gated PET images that is required for aligning the gated cine CT and/or gated PET images to the evolving mean CT and mean PET images, respectively.
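Applying a motion vector field of the kind determined at step 314 can be illustrated with a simple 2-D warp. Nearest-neighbour sampling with border clamping is an assumption made here for brevity; a real pipeline would interpolate:

```python
import numpy as np

def warp_image(img, mvf):
    """Apply a motion vector field to a 2-D image. mvf has shape
    (H, W, 2) holding per-pixel (dy, dx) displacements; sampling is
    nearest-neighbour with clamping at the borders."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # each output pixel pulls its value from the displaced source location
    src_y = np.clip(np.rint(yy + mvf[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xx + mvf[..., 1]).astype(int), 0, w - 1)
    return img[src_y, src_x]
```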
Further, in one embodiment, the motion vector fields may be used to generate global or local motion models that are representative of corrections required in the gated cine CT and/or gated PET images. Particularly, the local and/or global motion models may be generated, for example, by processing cine CT image data, PET image data, and/or the motion vector fields using principal or independent component analysis. The local and/or global motion models, thus determined, may then be used as a priori information to correct for patient motion artifacts introduced during image data acquisition. According to aspects of the present specification, correction of the motion artifacts may be representative of joint registration of the gated cine CT images and the gated PET images to the converged mean CT image and the converged mean PET image, respectively.
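A linear motion model of the kind described above can be sketched with principal component analysis via an SVD of the mean-centred motion vector fields; an independent component variant would slot in the same way. The flattened-field representation is an illustrative choice:

```python
import numpy as np

def motion_model(mvfs, n_components=2):
    """Build a linear motion model from a set of motion vector fields:
    returns the mean field plus the leading principal components and
    their singular values (PCA via SVD of the centred data matrix)."""
    data = np.stack([m.ravel() for m in mvfs])   # rows: gates, cols: voxels
    mean = data.mean(axis=0)
    _, s, vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean, vt[:n_components], s[:n_components]
```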
Subsequently, at step 316, one or more motion-corrected PET images corresponding to the target volume may be obtained based on the converged mean CT and/or mean PET images. In certain embodiments, the motion-corrected PET images may be further processed to generate a high signal-to-noise ratio (SNR) image by averaging or by computing a median PET image. Additionally, any outlying data measurements may be identified and/or excluded from the averaged and/or median image, for example, using a suitable cost function. In one embodiment, the resulting motion-corrected PET images may include fused information that may be acquired using PET, cine CT, and/or diagnostic CT scans.
Accordingly, at step 318, the motion-corrected PET images may be displayed on a display device. Particularly, the motion-corrected PET images may be displayed on the display device to aid in accurate detection, quantification, and/or localization of lesion regions that may not be easily detected in conventional PET images due to presence of motion artifacts. Use of the motion-corrected PET images, thus, aids in accurately characterizing structural and/or functional features of the target volume, which in turn, may be used to provide an informed clinical diagnosis of a health condition of the patient.
Further, FIG. 4 illustrates a diagrammatic representation 400 of an exemplary group-wise joint NRR of cine CT, diagnostic CT, and/or PET images performed using the method of FIG. 3. Particularly, in FIG. 4, reference numeral 402 is representative of a group-wise NRR of gated cine CT images I_CT^k, where k = 1, 2, …, N and N is representative of the number of gates. Further, reference numeral 404 is representative of a group-wise NRR of gated PET images I_PET^k, where k = 1, 2, …, N.
As previously noted with reference to step 306 of FIG. 3, an evolving group-wise mean CT image µ_CT and an evolving mean PET image µ_PET may be generated by averaging the gated cine CT images I_CT^k and the gated PET images I_PET^k, respectively. Specifically, the mean CT image µ_CT and the mean PET image µ_PET may be generated, for example, by computing an average or a median of the pixel intensities corresponding to the gated cine CT images I_CT^k and the gated PET images I_PET^k, respectively.
Furthermore, as described with reference to steps 308, 310, and 312 of FIG. 3, one or more of the gated cine CT images I_CT^k and gated PET images I_PET^k may be transformed to the mean CT image µ_CT and mean PET image µ_PET respectively, until convergence of the mean CT image µ_CT and/or mean PET image µ_PET. In certain embodiments, where diagnostic CT image data may provide useful insights into a physiological and/or functional condition of a patient, the mean CT image µ_CT may be aligned with a selected diagnostic CT image I_hCT. Such an alignment may allow for an indirect transformation of the gated PET images I_PET^k and the mean PET image µ_PET to the diagnostic CT image I_hCT without use of any multi-modality metric. Thus, embodiments of the present method, such as the method described with reference to FIG. 3, may be used for efficient registration of phase-matched gated cine CT and gated PET images, gated cine CT, gated PET, and diagnostic CT images, and/or a diagnostic CT image and gated PET images.
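The indirect transformation described above amounts to composing deformations: a gate-to-mean transform followed by the mean-CT-to-diagnostic-CT alignment. The toy sketch below, restricted to 1-D displacement fields with nearest-neighbour resampling (all names are illustrative), shows how the two can be chained so that a gated image maps to the diagnostic frame without any multi-modal metric:

```python
import numpy as np

def compose_displacements(y_a, y_b):
    """Compose two 1-D displacement fields: x -> x + y_a(x) followed
    by x -> x + y_b(x). The intermediate position is sampled with
    nearest-neighbour rounding, so this is only a coarse sketch of
    true deformation composition."""
    n = len(y_a)
    x = np.arange(n)
    # evaluate the second field at the position reached by the first
    mid = np.clip(np.rint(x + y_a).astype(int), 0, n - 1)
    return y_a + y_b[mid]
```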
Particularly, in a presently contemplated embodiment, desired transformation of the gated cine CT images I_CT^k and/or gated PET images I_PET^k may be achieved via use of a suitable objective function that employs one or more matching metrics. By way of example, given phase-matched gated cine CT images I_CT^k and gated PET images I_PET^k, corresponding transforms may be determined using an objective function indicated by equation (1).
E[y_PET, y_CT, µ_PET, µ_CT] = ∫_Ω Σ_k (I_PET^k(x + y_PET^k(x)) − µ_PET(x))^2 dx
+ ∫_Ω Σ_k (I_CT^k(x + y_CT^k(x)) − µ_CT(x))^2 dx
+ λ_1 ∫_Ω w(x) Σ_k |y_CT^k − y_PET^k|^2 dx
+ S_PET[y_PET] + S_CT[y_CT] (1)
where y_PET and y_CT correspond to the deformation fields associated with the mean PET image µ_PET and the mean CT image µ_CT, Ω corresponds to the image domain, w corresponds to a selected weight function, S_PET corresponds to a transform penalty term that seeks a smooth deformation field y_PET, S_CT corresponds to an anatomy and/or tissue-based motion prior determined from gated cine CT image data for use on the deformation field y_CT, and where equation (1) is subject to the zero-deformation constraints Σ_k y_PET^k = 0 and/or Σ_k y_CT^k = 0.
Moreover, in equation (1), the first two terms, ∫_Ω Σ_k (I_PET^k(x + y_PET^k(x)) − µ_PET(x))^2 dx and ∫_Ω Σ_k (I_CT^k(x + y_CT^k(x)) − µ_CT(x))^2 dx, are representative of group-wise NRR energies corresponding to the gated PET images I_PET^k and/or gated cine CT images I_CT^k. In one embodiment, the first two terms of equation (1) may be combined with the zero-deformation constraints Σ_k y_PET^k = 0 and Σ_k y_CT^k = 0 to impart stability to the transforms computed for the gated PET images I_PET^k and/or gated cine CT images I_CT^k.
Further, the third term, λ_1 ∫_Ω w(x) Σ_k |y_CT^k − y_PET^k|^2 dx, is representative of a transform matching term that is responsible for bringing the PET and CT transforms close to each other at desired spatial locations, as indicated by the weight function w. In one embodiment, the transform matching term may correspond to a weighted matching metric that brings iteratively computed PET and CT transforms close to each other, for example, at rigid and/or high weight spatial locations in the target volume.
In one embodiment, the weight function w may be initiated using a value that may be selected based on determined information corresponding to known regions in the gated cine CT images and the gated PET images. As previously noted, in certain embodiments, the weight function w may be manually and/or automatically set for regions in the target volume where the PET and CT transforms are expected to match. By way of example, higher weights may be assigned to the matching metric for transforming rigid spatial locations in the target volume that are conventionally known to exhibit minimal phase mismatch between the gated cine CT images I_CT^k and the gated PET images I_PET^k. Similarly, smaller weights may be assigned to the matching metric, for example, while transforming spatial locations that are known to exhibit large respiratory movements, or respiratory movements that are greater than a user and/or clinically defined threshold. Moreover, as iterations progress, the weight function w may be updated to eliminate candidate outlier regions that have a high transform match cost. The weighted matching metric, thus, may be used for simultaneous NRR and automatic convergence of the gated cine CT images I_CT^k and gated PET images I_PET^k to the mean CT image µ_CT and mean PET image µ_PET, thereby aiding in efficient mitigation of large patient motion during image reconstruction.
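Under strong simplifying assumptions (1-D signals, nearest-neighbour resampling, and the smoothness priors S_PET and S_CT omitted), the data and transform-matching terms of the group-wise objective discussed above can be evaluated discretely as follows; every name and parameter here is illustrative:

```python
import numpy as np

def groupwise_energy(i_pet, i_ct, y_pet, y_ct, mu_pet, mu_ct, w, lam1=1.0):
    """Discrete 1-D evaluation of the group-wise objective: two squared
    error terms against the evolving means plus the weighted distance
    between the CT and PET deformations. Smoothness priors are omitted."""
    n = len(mu_pet)
    x = np.arange(n)
    e = 0.0
    for k in range(len(i_pet)):
        xp = np.clip(np.rint(x + y_pet[k]).astype(int), 0, n - 1)
        xc = np.clip(np.rint(x + y_ct[k]).astype(int), 0, n - 1)
        e += np.sum((i_pet[k][xp] - mu_pet) ** 2)          # PET data term
        e += np.sum((i_ct[k][xc] - mu_ct) ** 2)            # CT data term
        e += lam1 * np.sum(w * (y_ct[k] - y_pet[k]) ** 2)  # matching term
    return float(e)
```

A registration loop would minimize this energy over the per-gate deformations while re-estimating the means, which is the behaviour the weight map w steers region by region.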
However, in scenarios where it may be desirable to register the gated PET images to a selected diagnostic CT image for determining comprehensive structural information, a CT matching metric may be used in addition to the weighted matching metric in equation (1). Accordingly, equation (1) may be redefined, for example, as equation (2).
E[y_PET, y_CT, y_hCT, µ_PET, µ_CT] = ∫_Ω Σ_k (I_PET^k(x + y_PET^k(x)) − µ_PET(x))² dx
+ ∫_Ω Σ_k (I_CT^k(x + y_CT^k(x)) − µ_CT(x))² dx
+ λ_1 ∫_Ω w(x) Σ_k |y_CT^k − y_PET^k|² dx
+ λ_2 ∫_Ω (µ_CT(x + y_hCT(x)) − I_hCT(x))² dx
+ S_PET[y_PET] + S_CT[y_CT] + S_hCT[y_hCT] (2)
where y_hCT corresponds to a deformation field used to align the mean CT image µ_CT to the diagnostic CT image I_hCT, λ_1 and λ_2 are parameters that balance the corresponding penalty terms, and S_hCT corresponds to another anatomy and/or tissue based motion prior determined from the diagnostic CT image data.
In particular, equation (2) includes a CT matching term λ_2 ∫_Ω (µ_CT(x + y_hCT(x)) − I_hCT(x))² dx that may be used for non-rigidly registering the diagnostic CT image I_hCT to the mean CT image µ_CT. In certain embodiments, the CT matching term may be used to constrain possible transforms that are computed based on the mean CT image µ_CT to allow for accurate registration of the diagnostic CT image I_hCT to the mean CT image µ_CT. Particularly, the deformation field y_hCT in the CT matching term may be used to de-couple recovery of information corresponding to patient motion between selected respiratory and/or cardiac gates and a selected diagnostic CT image data acquisition. Thus, even if the diagnostic CT image I_hCT is acquired at an extreme breathing phase compared to the selected gates, the deformation field y_hCT may aid in recovering the information corresponding to the patient motion without affecting PET gate registration. To that end, in one embodiment, the deformation fields y_PET and y_CT may be chosen to correspond to high resolution and dense deformations, whereas the deformation field y_hCT may be chosen as a coarse deformation and may be represented using a sparse grid of B-spline basis functions.
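The distinction between dense fields (y_PET, y_CT) and a coarse, sparsely parameterized field (y_hCT) can be illustrated by evaluating a 1-D deformation from a small grid of cubic B-spline control points. The cubic kernel and the control-point spacing below are standard choices made for the sketch, not values mandated by the specification.

```python
import numpy as np

def cubic_bspline(t):
    # Standard cubic B-spline kernel with support |t| < 2.
    t = np.abs(np.asarray(t, dtype=float))
    out = np.zeros_like(t)
    m1 = t < 1
    m2 = (t >= 1) & (t < 2)
    out[m1] = (4 - 6 * t[m1] ** 2 + 3 * t[m1] ** 3) / 6
    out[m2] = (2 - t[m2]) ** 3 / 6
    return out

def dense_deformation(coeffs, n_pixels, spacing):
    # Evaluate y(x) = sum_j c_j B((x - x_j) / spacing) on the pixel grid,
    # turning a sparse control-point grid into a dense displacement field.
    x = np.arange(n_pixels, dtype=float)
    centers = np.arange(len(coeffs)) * spacing
    return sum(c * cubic_bspline((x - cx) / spacing)
               for c, cx in zip(coeffs, centers))
```

A handful of control points thus parameterizes a smooth deformation over the whole pixel grid, which is what keeps y_hCT coarse and cheap to optimize relative to the dense per-gate fields.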
Additionally, equation (2) may also include certain other transform priors that may be used with the deformation fields y_PET, y_CT, and y_hCT for improving registration of gated PET images to gated cine CT images I_CT^k and/or the diagnostic CT image I_hCT. In certain embodiments, for example, anatomy and/or tissue based motion priors S_CT and S_hCT may be used when computing the CT and PET transforms. Particularly, the motion priors S_CT and S_hCT may allow for delineation and/or labelling of multiple organs in the target volume based on corresponding pixel intensities. Accordingly, in certain embodiments, equation (2) may be optimized to determine appropriate transforms for registering each of the gated cine CT images I_CT^k and the gated PET images I_PET^k to the diagnostic CT image I_hCT.
Particularly, in one embodiment, the objective function defined in equation (2) may be optimized by computing first order optimality conditions, for example, using Euler-Lagrange equations. Further, descent equations corresponding to the Euler-Lagrange equations may be iteratively solved using finite differences. Optimization of the objective function allows for an iterative update of the evolving mean PET image µ_PET and mean CT image µ_CT, while also updating the deformation fields y_PET, y_CT, and y_hCT that map individual gates to the mean PET and CT images µ_PET and µ_CT, respectively.
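A minimal sketch of the alternating update implied by these optimality conditions: with the transforms held fixed, the evolving mean that minimizes the squared data term is the pixel-wise average of the warped gates; the deformations are then refined by a gradient-descent step using finite-difference image gradients. The step size and the restriction to the data term are illustrative assumptions.

```python
import numpy as np

def update_mean(gates, deformations):
    # Closed-form mean update: pixel-wise average of warped gates.
    x = np.arange(gates[0].size, dtype=float)
    warped = [np.interp(x + y, x, g) for g, y in zip(gates, deformations)]
    return np.mean(warped, axis=0)

def descent_step(gate, y, mu, step=0.1):
    # One descent step on the squared data term for a single gate:
    # dE/dy = 2 (I(x + y) - mu(x)) * I'(x + y).
    x = np.arange(gate.size, dtype=float)
    warped = np.interp(x + y, x, gate)
    grad_img = np.gradient(gate)              # finite-difference gradient
    grad_warped = np.interp(x + y, x, grad_img)
    return y - step * 2.0 * (warped - mu) * grad_warped
```

At a fixed point, each warped gate equals the mean and the descent step leaves the deformation unchanged, which is the convergence behavior the iteration relies on.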
In certain embodiments, where the gated cine CT images I_CT^k are unavailable, the objective function defined in equation (2) may be modified for directly registering the gated PET images I_PET^k to the selected diagnostic CT image I_hCT. Specifically, in one embodiment, the objective function defined in equation (2) may be modified by simply dropping the transform prior S_hCT, and modifying the CT matching term to H(I_hCT(· + y_hCT), µ_PET), where H corresponds to a multi-modal transformation metric such as a mutual information metric. Similarly, when using MRI data for PET attenuation correction instead of the gated cine CT images I_CT^k, a suitable mutual information metric may be used as the matching metric.
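A mutual information metric of the kind H could stand in for can be computed from a joint intensity histogram, as sketched below; the bin count is an illustrative choice and the function name is an assumption, not the metric defined in the specification.

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Histogram-based mutual information between two intensity images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of image b
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

Because mutual information rewards statistical dependence rather than equal intensities, it remains meaningful when the two inputs come from different modalities, such as a PET mean image and a diagnostic CT image.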
Thus, according to aspects of the present specification, the joint NRR of the gated PET images I_PET^k, gated cine CT images I_CT^k, and/or diagnostic CT images I_hCT, constrained by use of one or more suitable matching metrics, allows for efficient correction of motion artifacts in resulting PET images. Particularly, in certain embodiments, use of the present method may aid in fusing the gated PET image data with diagnostic CT image data for reconstructing high SNR PET images. In one embodiment, the high SNR PET images provide significant clinical indicators that are typically not available via conventional PET images. Certain examples of the resulting PET images that are reconstructed using an embodiment of the present method are illustrated in FIG. 5.
FIG. 5 depicts a diagrammatical representation 500 of exemplary CT and PET images generated using the method of FIG. 3. Particularly, in FIG. 5, reference numeral 502 corresponds to a gated PET image generated using originally acquired PET image data. Further, reference numeral 504 corresponds to a selected diagnostic CT image acquired using a helical CT scan. According to certain aspects of the present specification, the gated PET image 502 may be processed via a reference-free joint NRR method, such as the method described with reference to FIGs. 3-4, to generate a corresponding motion-corrected PET image.
Particularly, in one implementation, the diagnostic CT image 504 may be registered to a mean CT image obtained by averaging a plurality of gated cine CT images (not shown). Registration of images acquired using the same imaging modality may allow for rapid computation of one or more motion vector fields. These motion vector fields may be used to constrain a simultaneous NRR of the gated cine CT images to the mean cine CT image and of the gated PET image 502 to a mean PET image 506 via use of one or more suitable matching metrics until convergence of the mean cine CT and/or PET images.
In certain embodiments, such a joint NRR allows fusion of the mean PET image 506 with the selected diagnostic CT image 504 to generate a final image 508. The final image 508, thus generated, may efficiently combine rich anatomical information derived from the diagnostic CT image 504 and rich functional information determined from the gated PET image 502, while including significantly reduced motion artifacts. Joint NRR of PET images constrained using a priori information determined from an associated imaging modality such as a CT system, an MRI system and/or an X-ray system, thus, may allow for generation of final PET images with superior motion correction and resolution.
Embodiments of the present systems and methods, thus, allow for enhanced PET registration using CT-based a priori information. Particularly, the embodiments described herein allow for enhanced visualization of a target volume using both gated CT and PET images to compensate for motion artifacts in resulting PET images. For example, the present systems and methods may allow for enhanced visualization of increased metabolism in lesions that may be identified and/or localized using a helical CT scan. Specifically, use of combined CT and PET image data mitigates distortion of the acquired PET image data due to patient motion, thereby aiding in accurately distinguishing between an increase in metabolism due to active tumor growth and an increase due to post-radiotherapy necrosis in the patient.
Additionally, unlike conventional multi-modal registrations that may be impeded by large patient motion, a smaller number of gates, and/or phase mismatch, embodiments of the present specification preclude use of a complicated and/or computationally expensive multi-modal transformation metric, thereby allowing for simpler processing. Moreover, joint registration of CT and PET images may allow for fusion of multi-modality imaging information that provides richer structural and functional information of even small definitive structures such as tumors in a single image, while also allowing for a reduction in imaging time and/or patient discomfort.
Although the present embodiments have been described with reference to use of CT-based a priori information, in certain implementations, MRI-based a priori information may be used for constraining PET registration to generate a motion-corrected and enhanced final PET image. The final PET image that includes the MRI-derived information may be used to identify a region of hypometabolism caused by epilepsy using PET image data, while also localizing damage to anterior and medial areas of a right temporal gyrus determined using gated MRI data.
Similarly, in certain embodiments, the methods described with reference to FIGs. 3 and 4 may be used to generate enhanced functional images to allow for accurate cancer staging, biopsy planning, pre-surgical assessment, planning, and/or post-surgical assessment of radiotherapy treatment. Use of multi-modality anatomical imaging information for constraining registration of functional images, thus, may allow for enhanced diagnosis and/or treatment prescription, thereby improving patient outcomes.
It may be noted that the foregoing examples, demonstrations, and process steps that may be performed by certain components of the present systems, for example by the DAS 116, the processing subsystem 120, the DAS 208, the processing subsystem 210, and/or image reconstruction unit 224 of FIGs. 1-2 may be implemented by suitable code on a processor-based system. To that end, the processor-based system, for example, may include a general-purpose or a special-purpose computer. It may also be noted that different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently.
Additionally, the functions may be implemented in a variety of programming languages, including but not limited to Ruby, Hypertext Preprocessor (PHP), Perl, Delphi, Python, C, C++, or Java. Such code may be stored or adapted for storage on one or more tangible, machine-readable media, such as on data repository chips, local or remote hard disks, optical disks (that is, CDs or DVDs), solid-state drives, or other media, which may be accessed by the processor-based system to execute the stored code.
Although specific features of embodiments of the present specification may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments, for example, to construct additional assemblies and methods for use in enhanced diagnostic imaging.
While only certain features of the present disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.