Abstract: A method for imaging a target volume in a subject is presented. The method includes computing an evolving mean image by iteratively averaging a plurality of pixels corresponding to a plurality of gated images corresponding to the target volume. Further, the method includes iteratively computing one or more transforms for aligning one or more of the plurality of gated images to the evolving mean image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean image and the iteratively computed transforms. In addition, the method includes generating one or more motion-corrected images corresponding to the target volume based on the converged transforms. The method also includes displaying the one or more motion-corrected images, the converged mean image, or a combination thereof, on a display device.
BACKGROUND
Embodiments of the present specification relate generally to diagnostic imaging, and more particularly to a system and a method for robust motion correction of image data.
Non-invasive imaging techniques are widely used in security screening, quality control, and medical diagnostic systems. Particularly, in medical imaging, non-invasive medical diagnostic imaging techniques such as multi-energy imaging allow for unobtrusive, convenient, and fast imaging of underlying tissues and organs. Additionally, certain non-invasive imaging techniques may also allow for visualization of functional behavior such as chemical or metabolic activities of organs and tissues within a patient. By way of example, a positron emission tomography (PET) system may be used to generate PET images that represent a distribution of positron-emitting nuclides within the patient’s body. The distribution of the positron-emitting nuclides, in turn, may be correlated with various structural and/or functional parameters that are indicative of a pathological condition of the patient.
[0003] Generally, PET images are acquired over a time interval of several minutes. During this time interval, the image data acquisition may be affected by patient motion such as motion due to respiration, cardiac motion, and/or other gross patient movement. Such patient motion may cause motion artifacts and/or other discrepancies in the acquired PET image data, which, in turn, may lead to reconstruction of erroneous PET images that may be unsuitable for clinical use.
[0004] Hence, some conventional PET imaging systems employ gating techniques that entail acquisition of PET image data during different breathing phases or gates. Gated acquisition of PET image data allows for mitigation of the motion artifacts in the acquired PET image data caused by patient motion. Although use of the gating methods may reduce motion artifacts in the gated PET image data corresponding to individual gates, each gate in isolation may suffer from a low signal-to-noise ratio due to reduced photon counts recorded within a corresponding acquisition time interval. Furthermore, the PET images reconstructed from PET image data corresponding to different gates may not be in alignment. Such misalignment of the gated PET image data may impede accurate localization and quantification of features of interest, such as tumors and corresponding volumes, that may be determined using the resulting PET images.
[0005] Certain PET imaging systems attempt to correct the misalignment of image data, and in turn, corresponding gated PET images acquired at different gates through use of reference-based registration methods. In the reference-based registration methods, a gated PET image may be chosen as a reference image. Further, one or more of the gated PET images may be aligned to this reference PET image. However, use of such a reference PET image may cause the resulting PET images to be biased towards the reference PET image. Additionally, quality of the resulting PET images may be limited by the suitability of the selected reference PET image. For example, when the reference PET image includes anatomy exhibiting extreme motion, other gated PET images may have to undergo large deformation to be registered to the reference PET image. Use of a reference PET image that includes anatomy exhibiting extreme motion, thus, may result in low registration quality and/or a need for complicated computations.
[0006] Accordingly, certain conventional PET imaging systems alternatively employ elastic or non-rigid registration (NRR) methods for mitigating the motion artifacts in PET images. Specifically, the conventional PET imaging systems may non-rigidly register the gated PET images that are reconstructed from the acquired PET image data and average the registered images to mitigate errors caused by the motion artifacts. Generally, the conventional NRR methods may be represented as functions for estimating optimal transforms that map the gated PET images to an evolving ‘reference image’. Particularly, conventional NRR methods, for example, may employ patch-based metrics, regularization, physics-based tissue elasticity modeling, viscosity modeling, and/or diffeomorphic registration approaches for mapping the gated PET images to the evolving reference image. However, such conventional NRR methods are often unstable due to presence of large motion, intrinsic noise, and/or inability to preserve small structures in the resulting PET images.
[0007] Accordingly, certain other NRR methods employ imaging data acquired by another imaging system such as a magnetic resonance imaging (MRI) system for mitigating motion artifacts in PET images. Typically, such multi-modality registration methods rely on an accurate phase match between PET and MRI data. However, such an accurate phase match may not be achieved for standard PET and/or MRI data acquisitions where breathing pattern variability is unavoidable. Additionally, PET and MRI data corresponding to different structures may also experience mismatch owing to a difference in the underlying physics principles used by the PET
and MRI systems for acquiring corresponding imaging data. Mitigating such mismatch between the PET and MRI data may entail complicated and/or computationally expensive implementations, thereby resulting in longer imaging scans. Accordingly, PET images that have been motion-corrected using conventional reference-based registration methods, conventional NRR methods, and/or multi-modal registration methods may be unsuitable for investigating and/or accurately characterizing minute features within a target volume in the patient.
BRIEF DESCRIPTION
[0008] In accordance with aspects of the present specification, a method for imaging a target volume in a subject is presented. The method includes computing an evolving mean image by iteratively averaging a plurality of pixels corresponding to a plurality of gated images corresponding to the target volume. Further, the method includes iteratively computing one or more transforms for aligning one or more of the plurality of gated images to the evolving mean image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean image and the iteratively computed transforms. In addition, the method includes generating one or more motion-corrected images corresponding to the target volume based on the converged transforms. The method also includes displaying the one or more motion-corrected images, the converged mean image, or a combination thereof, on a display device.
[0009] In accordance with aspects of the present specification, a method for imaging a target volume in a subject is presented. The method includes computing an evolving mean positron emission tomography image by iteratively averaging a plurality of pixels corresponding to a plurality of gated positron emission tomography images corresponding to the target volume. Moreover, the method includes iteratively computing transforms for aligning one or more of the plurality of gated positron emission tomography images to the evolving mean positron emission tomography image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean positron emission tomography image and the iteratively computed transforms. Furthermore, the method includes generating one or more motion-corrected positron emission tomography images corresponding to the target volume based on the converged transforms. The method also
includes displaying the one or more motion-corrected positron emission tomography images, the converged mean positron emission tomography image, or a combination thereof.
[0010] In accordance with aspects of the present specification, an imaging system is presented. The system includes a data acquisition subsystem configured to acquire image data corresponding to a target volume in a subject. In addition, the system includes a processing unit operatively coupled to the data acquisition subsystem and configured to generate a plurality of gated images corresponding to the target volume, compute an evolving mean image by iteratively averaging a plurality of pixels corresponding to the plurality of gated images, iteratively compute transforms for aligning one or more of the plurality of gated images to the evolving mean image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean image and the iteratively computed transforms, and generate one or more motion-corrected images corresponding to the target volume based on the converged transforms.
DRAWINGS
[0011] These and other features and aspects of embodiments of the present specification will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0012] FIG. 1 is a schematic representation of an exemplary imaging system for use in motion correction of acquired image data, in accordance with aspects of the present specification;
[0013] FIG. 2 is a schematic representation of another exemplary imaging system for use in motion correction of acquired image data, in accordance with aspects of the present specification;
[0014] FIG. 3 is a flow chart illustrating an exemplary method for motion correction of acquired image data, in accordance with aspects of the present specification;
[0015] FIG. 4A is a diagrammatical representation of a set of gated PET images generated using a conventional PET image reconstruction method;
[0016] FIG. 4B is a diagrammatical representation of a mean PET image generated based on the PET images depicted in FIG. 4A;
[0017] FIG. 4C is a diagrammatical representation of a set of gated PET images generated using a conventional NRR method;
[0018] FIG. 4D is a diagrammatical representation of a mean PET image generated based on the PET images depicted in FIG. 4C;
[0019] FIG. 4E is a diagrammatical representation of an exemplary set of gated PET images generated using the method of FIG. 3, in accordance with aspects of the present specification; and
[0020] FIG. 4F is a diagrammatical representation of an exemplary mean PET image generated based on the PET images depicted in FIG. 4E, in accordance with aspects of the present specification.
DETAILED DESCRIPTION
[0021] The following description presents exemplary systems and methods for robust motion correction of image data corresponding to a subject. Particularly, embodiments illustrated hereinafter disclose exemplary imaging systems and methods that provide an enhanced group-wise non-rigid registration (NRR) framework for efficiently registering gated images using non-local spatio-temporal priors. As used herein, the term “group-wise NRR” may be used to refer to a registration method where an evolving ‘ideal’ reference image and transforms for mapping the gated images to the evolving ideal reference image are jointly estimated. According to aspects of the present specification, such joint NRR aids in efficient and enhanced mitigation of motion artifacts in the resulting images.
[0022] Generally, acquired image data that includes respiratory and/or cardiac motion exhibits strong correlation of motion across multiple frames and/or across distant or non-local regions. By way of example, when imaging a subject, an impact of respiratory motion may be
detected as far down as the pelvic region of the subject. Similarly, an effect of a large motion exhibited by a small structure, such as a lesion, may be detected and/or measured more easily at a non-local neighbor such as in the lung and/or at cardiac interphase regions.
[0023] According to aspects of the present specification, such correlation of motion between the small structure and the non-local regions may be efficiently modeled using non-local spatio-temporal priors or non-local displacement and/or velocity coherence penalties. As used herein, the term non-local regions may be used to refer to one or more regions in a target volume of a subject that exhibit similarity (coherence) in displacement and/or velocity between pixels within a selected region in the target volume and one or more pixels in other regions of the target volume that are spatially separated from the selected region. Further, in the present specification, the terms spatio-temporal priors and displacement and/or velocity coherence penalties may be used interchangeably to refer to determined displacement and/or motion information corresponding to pixels in the non-local regions corresponding to the selected region. In accordance with aspects of the present specification, the determined displacement and/or motion information corresponding to the pixels in the non-local regions may be used to constrain the registration of pixels in the selected region of the target volume.
[0024] For clarity, exemplary embodiments of the present systems and methods are described in the context of a PET imaging system. However, it will be appreciated that use of the present systems and methods in various other imaging applications and systems is also contemplated. Some of these systems, for example, may include a computed tomography (CT) system, a single photon emission computed tomography (SPECT) system, an X-ray imaging system, a magnetic resonance imaging (MRI) system, an optical imaging system, and/or an ultrasound imaging system. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in the following sections with reference to FIGs. 1-2.
[0025] FIG. 1 illustrates an exemplary imaging system 100 configured to generate motion-corrected images corresponding to a target volume in a subject 101 such as a patient or a non-biological object. In one embodiment, the imaging system 100, for example, may include a PET system, a SPECT system, a CT imaging system, an MRI system, a hybrid imaging system,
and/or a multi-modality imaging system. Particularly, the system 100 may be configured to acquire image data for use in generating desired images of the patient 101.
[0026] Accordingly, in one embodiment, the patient 101 may be suitably positioned, for example, on a table 102 to allow the system 100 to image the target volume of the patient 101. Moreover, in certain embodiments, the system 100 may include a control subsystem 104 that may be configured to semi-automatically and/or automatically move the table 102 and/or the target volume into a field of view of the system 100. Additionally, the control subsystem 104 may also be configured to operate the system 100 in different modes, such as two-dimensional (2D), three-dimensional (3D), and/or four-dimensional (4D) data acquisition modes, in response to one or more suitable commands. These commands, for example, may be pre-programmed and/or may be received from a user via an operator workstation 106 that may be communicatively coupled to the system 100 via one or more communications links 108.
[0027] Furthermore, in one embodiment, the system 100 may include a data acquisition subsystem (DAS) 110 that may be configured to acquire image data corresponding to the target volume of the patient 101 under control of the control subsystem 104. Particularly, in certain embodiments, the DAS 110 may be configured to acquire gated image data corresponding to the target volume for mitigating motion artifacts caused by patient motion in resulting images.
[0028] Accordingly, in one embodiment, the system 100 may include a gating device 112 that may be communicatively coupled to the DAS 110. For example, in one embodiment, the gating device 112 may include an electrocardiogram (ECG) machine. Further, in certain embodiments, the gating device 112 may be configured to provide a suitable signal to the DAS 110 to schedule image data acquisition at a plurality of gating time intervals. In one embodiment, the gating device 112 may be communicatively coupled to the DAS 110 via the communications links 108 to communicate the signals corresponding to desired gating time intervals. In some embodiments, these gating time intervals may be pre-programmed and/or manually selected by a user. Particularly, in one embodiment, the user may manually select the gating time intervals using one or more input devices 114 that may be communicatively coupled to the operator workstation 106. The input devices 114, for example, may include devices such as a keyboard, a
control panel, a mouse, a microphone, and/or a touchscreen display that may be configured to receive the user input.
[0029] Moreover, in certain embodiments, the system 100 may include a processing subsystem 116 that may be configured to process acquired image data based on the received user input. Accordingly, the processing subsystem 116, for example, may include one or more application-specific processors, graphical processing units, digital signal processors, microcomputers, microcontrollers, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Programmable Logic Arrays (PLAs), and/or other suitable processing devices. Alternatively, the processing subsystem 116 may be configured to store the acquired image data and/or the user input in a data repository 118 for later use. In one embodiment, the data repository 118, for example, may include a hard disk drive, a floppy disk drive, a compact disk-read/write (CD-R/W) drive, a Digital Versatile Disc (DVD) drive, a flash drive, and/or a solid-state storage device.
[0030] In certain embodiments, the processing subsystem 116 may be configured to retrieve the image data and/or the user input from the data repository 118 for reconstructing gated images of the target volume. Additionally, in one embodiment, the processing subsystem 116 may be configured to pre-process the acquired image data for reconstruction and/or motion-correction based on specific physical principles employed by the system 100 for imaging the target volume. For example, if the system 100 corresponds to a PET system, the processing subsystem 116 may be configured to perform a histogram-based normalization prior to the motion-correction. However, if the system 100 corresponds to a CT imaging system, the processing subsystem 116 may be configured to perform a Hounsfield units-based dynamic range normalization for reconstructing desired images of the target volume.
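As an illustrative aside, the following Python sketch shows how such modality-dependent pre-processing might be dispatched; the specific normalization recipes (a percentile-based scaling for PET and a fixed Hounsfield window for CT) are assumptions chosen for brevity, not the particular algorithms of the present specification.

```python
# Hypothetical pre-processing dispatch keyed on imaging modality; the
# normalization recipes below are illustrative assumptions only.
import numpy as np

def preprocess(volume: np.ndarray, modality: str) -> np.ndarray:
    if modality == "PET":
        # Histogram-based normalization: scale a robust upper percentile to 1.0.
        scale = float(np.percentile(volume, 99.5)) or 1.0
        return np.clip(volume / scale, 0.0, 1.0)
    if modality == "CT":
        # Hounsfield-units-based dynamic range normalization into [0, 1].
        return np.clip((volume + 1000.0) / 2000.0, 0.0, 1.0)
    return volume
```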
[0031] Further, when imaging the target volume, voluntary and involuntary patient motion such as movement of limbs, respiration, and/or cardiac motion may result in significant motion artifacts in the gated image data. The gated images reconstructed using the erroneous image data, thus, may be unsuitable for diagnosing certain medical conditions and/or for accurately characterizing small structures of interest such as lesions in the target volume.
[0032] Accordingly, the system 100 may be configured to perform non-rigid registration (NRR) of the acquired and/or pre-processed image data using non-local displacement and/or velocity coherence penalties to generate motion-corrected images. Particularly, in accordance with aspects of the present specification, the processing subsystem 116 may be configured to employ displacement and/or velocity coherence information corresponding to pixels in one or more non-local regions for predicting displacement and/or velocity of a selected pixel in a selected region of the target volume. Specifically, the non-local displacement and/or velocity coherence information may be used to constrain registration of the selected pixel to provide efficient motion-correction. Use of the non-local displacement and/or velocity coherence information, thus, may provide greater robustness to noise, efficient correction of image artifacts caused by large motion, and/or may aid in preserving small structures in the resulting motion-corrected images.
[0033] Also, in certain embodiments, the processing subsystem 116 may be configured to store the motion-corrected images along with the acquired and/or pre-processed image data in the data repository 118. Alternatively, the processing subsystem 116 may be configured to display the motion-corrected images and/or any clinical information derived therefrom on one or more output devices 120. The output device 120, for example, may include a display device that may be communicatively coupled to the operator workstation 106. In one embodiment, the motion-corrected images and/or any clinical information may be displayed on the output device 120 in real-time, or in near real-time (for example, within 2-3 seconds) to aid a medical practitioner in providing a more informed diagnosis of the patient 101. Certain exemplary methods for motion correction and/or enhanced registration of the acquired image data via use of the non-local displacement and/or velocity coherence penalties will be described in greater detail with reference to FIGs. 2-4.
[0034] FIG. 2 illustrates an exemplary imaging system 200 configured to provide enhanced motion-correction and registration of imaging data corresponding to a subject such as a patient. For discussion purposes, the imaging system 200 is described herein with reference to a PET imaging system 201.
[0035] Generally, during PET imaging, a positron-emitting radionuclide may be introduced into the patient’s body via a biologically active molecule. The radionuclide may undergo positron emission decay and emit a positron that travels a short distance in tissue. The emitted positron may subsequently interact with an electron. Typically, a positron-electron interaction results in annihilation, thus converting the entire mass of the positron-electron pair into two 511 kilo-electron volt (keV) photons emitted in opposite directions along a line of response (LOR).
[0036] In certain embodiments, the PET imaging system 201 may be configured to detect and correlate the emitted photons to functional information corresponding to the patient. Particularly, in one embodiment, the PET imaging system 201 may be configured to detect a coincidence event if both the emitted photons arrive and are detected during the same temporal window or gate. Additionally, the PET imaging system 201 may be configured to use the detected coincidence information for generating 2D or 3D gated PET images corresponding to the patient.
[0037] Accordingly, in one embodiment, the system 200 may include a detector ring assembly 202 disposed about a patient bore (not shown). Specifically, the system 200 may include multiple detector rings that may be spaced along a central axis of the PET imaging system 201 to form the detector ring assembly 202. The detector rings, in turn, may include a plurality of detector modules 204. In one example, each detector module 204 may include a 6x6 array of individual bismuth germanate (BGO) detector crystals. Generally, the detector modules 204 may be used to detect gamma radiation emitted from the patient and may produce photons in response to the detected gamma radiation.
[0038] Furthermore, in one embodiment, the array of detector modules 204 may be positioned proximate to a plurality of photomultiplier tubes (not shown) in the PET imaging system 201. In certain embodiments, the photomultiplier tubes (PMTs) may be configured to produce analog signals when a scintillation event occurs at one of the detector modules 204. Specifically, the PMTs may be configured to produce analog signals when a gamma ray emitted from the patient is received by one of the detector modules 204. The PET imaging system 201 may also include a set of acquisition circuits 206 that may be configured to receive the analog signals and
generate corresponding digital signals. In one embodiment, the digital signals may be indicative of a location and energy associated with a detected radiation event. The digital signals, thus, may be correlated with functional information, which may be used in PET image reconstruction.
[0039] Moreover, in certain embodiments, the PET imaging system 201 may also include a DAS 208 that may be configured to periodically sample the digital signals produced by the acquisition circuits 206. The DAS 208, in turn, may include a processing unit 210, which may be configured to control communications between different components of the PET imaging system 201. Particularly, in one embodiment, the processing unit 210 may be configured to communicate with different components of the PET imaging system 201 via a coupling unit 212. In one embodiment, the coupling unit 212, for example, may include electrical circuitry, electronic circuitry, a backplane bus, a wired communications network, and/or a wireless communications network. Additionally, the DAS 208 may also include one or more event locator circuits 214 that may be configured to assemble information corresponding to each valid radiation event into an event data packet. The event data packet, for example, may include a set of digital numbers that may accurately indicate a time of the radiation event and a position of the detector crystals that detected the radiation event.
[0040] In addition, in certain embodiments, the event locator circuits 214 may be configured to communicate the assembled event data packets to a coincidence detector 216 for determining coincidence events. Particularly, the coincidence detector 216 may be configured to identify coincidence event pairs if time and location markers in two event data packets are within pre-programmed and/or selected thresholds. By way of example, in one embodiment, the coincidence detector 216 may be configured to identify a coincidence event pair if time markers in two event data packets are within 12 nanoseconds of each other and if the corresponding locations lie on a straight line passing through a field of view (FOV) across a patient bore.
[0041] Further, in certain embodiments, the PET imaging system 201 may be configured to store the determined coincidence event pairs in a storage subsystem 218 that may be operatively coupled to the PET imaging system 201. In the example of FIG. 2, the storage subsystem 218 is depicted as an integral part of the PET imaging system 201. However, in certain embodiments,
the storage subsystem 218 may be external to the PET imaging system 201. In this example, the storage subsystem 218 may be operatively coupled to the PET imaging system 201.
[0042] The storage subsystem 218, for example, may include a sorter 220 that may be configured to sort the coincidence events. In one embodiment, for example, the sorter 220 may be configured to sort the coincidence events in a 3D projection plane format using a look-up table. Particularly, the sorter 220 may be configured to determine an order of the detected coincidence event data using one or more parameters such as radius or projection angles for efficient storage.
[0043] Additionally, in certain embodiments, the processing unit 210 may be configured to process the coincidence event data to determine corresponding time-of-flight (TOF) information. The TOF information may allow the PET imaging system 201 to estimate a point of origin of the electron-positron annihilation with greater accuracy, thus improving event localization. The event localization information, in turn, may be used to accurately locate one or more features of interest in reconstructed PET images.
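As a brief numeric aside, the improved localization afforded by TOF information follows from a simple relation: the photon arrival-time difference places the annihilation point along the LOR. The sketch below assumes times in picoseconds and is purely illustrative.

```python
# TOF localization sketch: the arrival-time difference between the two 511 keV
# photons offsets the annihilation point from the midpoint of the LOR.
C_MM_PER_PS = 0.2998  # speed of light in millimetres per picosecond

def tof_offset_mm(t1_ps: float, t2_ps: float) -> float:
    """Offset of the annihilation point from the LOR midpoint, towards the
    detector that recorded the earlier photon."""
    return 0.5 * C_MM_PER_PS * (t2_ps - t1_ps)

# Example: a 400 ps timing difference localizes the event about 60 mm
# from the midpoint of the LOR.
print(tof_offset_mm(0.0, 400.0))  # ~59.96
```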
[0044] Moreover, in one embodiment, the PET imaging system 201 may include an image reconstruction unit 224 that may be configured to use the improved event localization data to generate high-resolution PET images corresponding to the target volume in the patient. In the example of FIG. 2, the image reconstruction unit 224 is depicted as an integral part of the processing unit 210. However, in certain embodiments, the image reconstruction unit 224 may be an independent device that is communicatively coupled to the PET imaging system 201. In yet another embodiment, the image reconstruction unit 224 may be absent and the processing unit 210 may be configured to perform one or more functions of the image reconstruction unit 224 such as reconstruction of the PET images.
[0045] Particularly, according to certain aspects of the present specification, the image reconstruction unit 224 may be configured to mitigate the motion artifacts in gated PET images. Accordingly, in one embodiment, the PET imaging system 201 may be configured to acquire gated PET imaging data corresponding to the target volume via use of a gating device 228 such as an electrocardiogram machine. In certain embodiments, the gating device 228 may be configured to provide a signal to the system 200 to schedule gated PET image data acquisition at
a plurality of desired gating time intervals. The gating time intervals may be pre-programmed or may be manually selected by a user, for example, using an input device 230 available on an associated operator workstation 232. Subsequently, the image reconstruction unit 224 may be configured to generate gated PET images based on the acquired gated PET image data.
[0046] Additionally, in one embodiment, the image reconstruction unit 224 may be configured to correct or compensate for the motion artifacts in gated PET images that result, for example, from respiration, cardiac motion, and/or other gross patient motion. Particularly, the image reconstruction unit 224 may be configured to correct or compensate for the motion artifacts in the gated PET images via use of non-local displacement and/or velocity coherence penalties. As previously noted, the displacement and/or velocity coherence penalties may be employed to constrain the displacement and/or velocity of one or more selected pixels in the gated PET images during registration. Particularly, the displacement and/or velocity of the one or more selected pixels in the gated PET images may be constrained based on the relative displacement and/or motion of one or more pixels corresponding to at least one non-local region where the displacement and/or motion may be more easily measured.
[0047] Accordingly, in one embodiment, the image reconstruction unit 224 may be configured to perform a group-wise joint NRR of the gated PET images to identify motion corresponding to different features in the target volume based on motion of the non-local region. It may be noted that the group-wise NRR may preclude use of a specific gated cine CT image or a gated PET image as reference images. Additionally, in accordance with aspects of the present specification, the joint NRR may employ an iteratively evolving mean PET image for use in registering the gated PET images based on the non-local displacement and/or velocity coherence penalties.
[0048] In accordance with aspects of the present specification, the image reconstruction unit 224 may be configured to initialize the evolving mean PET image. For example, the image reconstruction unit 224 may be configured to initialize the evolving mean PET image by averaging pixel intensities in a plurality of the gated PET images. Further, the image reconstruction unit 224 may be configured to transform each of the gated PET images to the
initialized mean PET image, where the computed transforms may be constrained using the displacement and/or velocity coherence penalties to mitigate motion artifacts.
[0049] Subsequently, the image reconstruction unit 224 may be configured to update the mean PET image by averaging the pixel intensities in corresponding transformed gated PET images. In accordance with aspects of the present specification, the image reconstruction unit 224 may be configured to iteratively update the mean PET image. Additionally, the image reconstruction unit 224 may be configured to iteratively compute transforms for aligning gated PET images to the updated mean PET image based on the displacement and/or velocity coherence penalties until a corresponding convergence.
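Purely as an organizational sketch, the alternating update described above might be arranged as follows; the `register_to_mean` routine stands in for the penalty-constrained NRR of equation (1) discussed later, and the nearest-neighbour `warp` resampler is a simplification, so neither should be read as the implementation of the present specification.

```python
# Minimal sketch of the group-wise loop: transforms and the evolving mean
# image are updated in alternation until the mean image stops changing.
import numpy as np

def warp(image, displacement):
    """Resample a 2D image through a displacement field (nearest-neighbour)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    y_src = np.clip(np.round(ys + displacement[0]).astype(int), 0, h - 1)
    x_src = np.clip(np.round(xs + displacement[1]).astype(int), 0, w - 1)
    return image[y_src, x_src]

def groupwise_nrr(gated_images, register_to_mean, tol=1e-4, max_iters=50):
    mean_image = np.mean(gated_images, axis=0)   # initialize the evolving mean
    transforms = [np.zeros((2, *mean_image.shape)) for _ in gated_images]
    for _ in range(max_iters):
        # Re-estimate each gate's transform against the current mean image.
        transforms = [register_to_mean(g, mean_image, t)
                      for g, t in zip(gated_images, transforms)]
        warped = [warp(g, t) for g, t in zip(gated_images, transforms)]
        new_mean = np.mean(warped, axis=0)       # update the evolving mean
        if np.max(np.abs(new_mean - mean_image)) < tol:
            break                                # mean image has converged
        mean_image = new_mean
    return mean_image, transforms
```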
[0050] Particularly, in one embodiment, the image reconstruction unit 224 may be configured to use a displacement coherence penalty to constrain a displacement of a selected pixel in each iteratively computed transform to be within a designated value or range of displacement values. Similarly, the image reconstruction unit 224 may be configured to use a velocity coherence penalty to constrain a velocity of a selected pixel in each iteratively computed transform to be within a designated value or range of velocity values. By way of example, the designated value or range of values for each of the penalties may be selected based on motion coherence between the selected pixel and one or more other pixels corresponding to non-local regions.
[0051] In one embodiment, known anatomical information may be used to determine motion coherence between pixels corresponding to different regions in the target volume. Specifically, the anatomical information may be used to identify the non-local regions where an effect of displacement and/or motion of the selected pixel may be measured more easily. By way of example, when imaging the cardiothoracic region of a patient, anatomical location and/or tissue elasticity may be used to estimate an extent and type of influence of cardiac motion on motion of neighboring organs. The anatomical location and tissue elasticity information, in turn, may be determined based on phase matched cine CT data, which typically accompanies the PET image data. The non-local displacement and/or velocity coherence, thus determined from the anatomical information, penalizes deformation fields that are determined during each iteration of the group-wise NRR of the gated PET images until convergence of the evolving mean PET image and/or the iteratively computed transforms.
[0052] Additionally, in one embodiment, the evolving mean PET image and/or the transforms are determined to have converged when a distance between one or more of the gated PET images and the mean PET image is within a clinically or user prescribed threshold. According to aspects of the present specification, the converged mean PET image may correspond to a high signal-to-noise-ratio (SNR) PET image that may be used in clinical diagnosis, especially when evaluating regions that include small anatomical structures. Moreover, convergence of the evolving mean PET image and/or the iteratively computed transforms mitigates misalignment of the gated PET images due to presence of the motion artifacts and aids in generating motion-corrected PET images of a desired quality and/or clinical specification. Particularly, registering the gated PET images, when constrained by the displacement and/or velocity coherence penalties, aids in compensating for large motion. Accordingly, the present non-local NRR method allows for a more efficient feature-based registration and/or preservation of small structures such as lesions in resulting PET images.
[0053] Further, in one embodiment, the image reconstruction unit 224 may be configured to transmit the converged mean PET images and/or the motion-corrected PET images to an output device 234. The output device 234, for example, may include a display device, an audio device, and/or a video device that allow a medical practitioner access to the converged mean PET image and/or the motion-corrected PET images. Alternatively, in certain embodiments, the image reconstruction unit 224 may be configured to transmit the converged mean PET image and/or the motion-corrected PET images to the processing unit 210 for further analysis.
[0054] Particularly, in one embodiment, the processing unit 210 may be configured to perform further analysis to determine one or more clinical parameters of interest such as lesion volumes and/or standard uptake values (SUV) from the converged mean PET image and/or the motion-corrected PET images. Additionally, the processing unit 210 may be configured to transmit the determined clinical parameters, the converged mean PET image, and/or the motion-corrected PET images to the output device 234 in real-time and/or near real-time.
[0055] Communication of the converged mean PET image, the motion-corrected PET images, and/or the clinical parameters in at least near real-time may allow the medical practitioner to diagnose and/or prescribe suitable treatment for the patient in a timely manner. Additionally, in
certain embodiments, the processing unit 210 may be configured to generate an audio and/or visual alert via an alerting subsystem 236 if one or more of the clinical parameters are determined to fall outside corresponding clinical or user-prescribed thresholds. For example, the processing unit 210 may be configured to activate blinking lights, issue audio instructions, and/or transmit an email or short messaging service alert to a caregiver for ensuring timely provision (for example, within 60 seconds) of medical attention to the patient. In certain other embodiments, the motion-corrected PET images may be used, for example, as feedback for guiding radiation therapy and/or a subsequent imaging scan of the patient.
[0056] It may be noted that the specific arrangements depicted in FIGs. 1-2 are exemplary. In certain embodiments, the systems 100 and 200 may be configured or customized for additional functionality, different imaging applications, and/or different scanning protocols. By way of example, in one embodiment, the systems 100 and/or 200 may include a picture archiving and communications system (PACS). The PACS (not shown) may be further coupled to a radiology department information system, a hospital information system and/or to an internal or an external communications network such as the coupling unit 212 to provide remote access to the converged mean PET image, the motion-corrected PET images and/or the clinical parameters. Specifically, in certain embodiments, the communications network may allow operators at different locations to supply commands and parameters and/or gain access to the motion-corrected and high SNR PET images and/or the determined clinical parameters. Certain exemplary methods for enhanced motion correction in PET images using the displacement and/or velocity coherence penalties will be described in greater detail with reference to FIG. 3.
[0057] FIG. 3 illustrates a flow chart 300 depicting an exemplary method for enhanced motion correction in images corresponding to a subject. In the present specification, embodiments of the exemplary method may be described in a general context of computer executable instructions on a computing system or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0058] Additionally, embodiments of the exemplary method may also be practiced in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0059] Further, in FIG. 3, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that may be performed, for example, during the steps of generating an evolving mean PET image, iteratively computing transforms corresponding to the PET images, and/or generating one or more motion-corrected PET images in the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0060] The order in which the exemplary method is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein. For discussion purposes, the exemplary method will be described with reference to the elements of FIGs. 1-2.
[0061] Embodiments of the present method allow for enhanced motion correction in images corresponding to a target volume in a subject. For clarity, the present method is described with reference to enhanced motion correction in PET images via a group-wise NRR of gated PET images based on non-local displacement and/or velocity coherence penalties. However, certain embodiments of the present method may also be used for enhanced motion correction in images reconstructed using other imaging systems such as an MRI system, a CT imaging system, and/or an ultrasound imaging system. Particularly, use of the group-wise NRR constrained by non-local displacement and/or velocity coherence penalties minimizes noise and large motion artifacts, while preserving small structures in the registered PET images.
[0062] In one embodiment, the method begins at step 302, where gated PET image data corresponding to a target volume in a subject is received. The target volume, for example, may correspond to the heart, lungs, or a liver region in the subject such as a patient. In certain embodiments, the gated PET image data may be acquired by a PET imaging system such as the system 200 of FIG. 2. Specifically, in certain embodiments, the gated PET image data may be acquired, for example, during one or more selected phases of breathing between end inspiration and end expiration. By way of example, in one embodiment, the gated PET image data may be binned into six different phases or gates.
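For illustration, binning acquired events into gates by respiratory phase could look like the following sketch; the six-gate split follows the example above, while the `phase_signal` callable (returning a phase in [0, 1)) is an assumed stand-in for the gating device output.

```python
# Hypothetical phase-based gating of list-mode events into N gates.
import numpy as np

def bin_events_by_phase(event_times_s, phase_signal, n_gates=6):
    """Assign each event to a gate from its respiratory phase in [0, 1)."""
    phases = np.array([phase_signal(t) for t in event_times_s])
    gate_ids = np.minimum((phases * n_gates).astype(int), n_gates - 1)
    return [np.where(gate_ids == g)[0] for g in range(n_gates)]
```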
[0063] Further, at step 304, a plurality of gated PET images corresponding to the target volume may be reconstructed using the gated PET image data. Particularly, in one embodiment, the gated PET images may be reconstructed, for example, using iterative image reconstruction or filtered backprojection techniques. As previously noted, the gated PET images may include motion artifacts due to patient motion during image data acquisition. Particularly, gross motion of the patient, movement of thoracic region due to heartbeat, and/or movement of lungs due to breathing during the data acquisition may cause motion artifacts in the gated PET images.
[0064] In accordance with aspects of the present specification, the plurality of gated PET images may be processed using an enhanced group-wise NRR method to allow for correction of the motion artifacts caused by patient motion. Accordingly, at step 306, an evolving mean PET image may be computed from the plurality of gated PET images. In one embodiment, the evolving mean PET image may be computed by averaging pixel intensities in the plurality of gated PET images. As used herein, the phrase “averaging the pixel intensities” may correspond to computing a corresponding arithmetic mean, a median, and/or a mode of the pixel intensities associated with the plurality of gated PET images.
[0065] Subsequently, at step 308, one or more transforms may be iteratively computed to align the gated PET images to the mean PET image based on at least one non-local displacement coherence penalty and/or at least one non-local velocity coherence penalty. Particularly, in one embodiment, the transforms may be computed for aligning the gated PET images to the evolving mean PET image via use of one or more NRR techniques. In one embodiment, the NRR
techniques may include a steepest descent minimization of the Mean Squared Error (MSE) between the evolving mean PET image and each of the gated PET images.
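As a hedged sketch of such a data term, one steepest-descent step on the MSE between a warped gate and the evolving mean follows the classical form below, where the descent direction is the residual scaled by the spatial gradient of the warped image; this is a generic demons-style update, not the specific optimizer of the present specification.

```python
# One steepest-descent step on the MSE data term of the registration.
import numpy as np

def mse_descent_step(warped_gate, mean_image, displacement, step_size=0.1):
    residual = warped_gate - mean_image
    gy, gx = np.gradient(warped_gate)         # spatial gradients of warped gate
    update = np.stack([residual * gy, residual * gx])
    return displacement - step_size * update  # move against the MSE gradient
```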
[0066] However, use of conventional NRR methods may not provide optimal motion correction in the resulting PET images. This is because, in conventional group-wise NRR methods, the mean PET image may converge at different stages. More specifically, the convergence of the mean PET image may vary owing to differences in the selection of time steps or the initialization of the transforms when iteratively computing the transforms for one or more of the gated PET images. Such variation is more pronounced for gated PET image data acquired via use of fewer than four to six gates. Thus, conventional group-wise NRR methods are often unstable and/or provide inconsistent performance in generating high SNR PET images that are suitable for clinical diagnosis.
[0067] According to aspects of the present specification, these shortcomings of the conventional group-wise NRR methods may be circumvented by use of the non-local displacement and/or velocity coherence penalties for constraining the NRR of the gated PET images. Particularly, suitable non-local displacement and/or velocity coherence penalties may be selected to constrain the transforms that may be iteratively computed for one or more of the gated PET images.
[0068] Accordingly, in one embodiment, the non-local displacement and/or velocity coherence penalties may be determined based on displacement and/or velocity coherence between a selected pixel and one or more pixels in non-local regions. The displacement and/or velocity coherence, in turn, may be determined based on determined clinical and/or anatomical information. In one embodiment, the determined clinical and/or anatomical information may aid in identifying how different types of motion such as respiration, heartbeat, motion of internal organs, external vibrations, and/or gross patient motion affects displacement and/or velocity of pixels corresponding to different non-local regions of the patient. The displacement and/or velocity information, thus identified, may be used to predict displacement and/or motion at a selected location, and in turn, constrain the transforms that are computed in each iteration for aligning the location in one or more of the gated PET images to the corresponding location in the evolving mean PET image. In particular, use of non-local displacement and/or velocity
information may aid in accurately predicting large motion of even small structures that conventional NRR methods often fail to detect and/or determine from the gated PET images.
[0069] Furthermore, in one embodiment, weighted non-local displacement and/or velocity coherence penalties may be used to provide suitable control over the deformation fields determined for aligning the gated PET images to the evolving mean PET image in each iteration. Accordingly, in one embodiment, the NRR framework may be represented, for example, using equation (1).
$$E_{nl}[\mu, u] = \sum_{i=1}^{N} \int_{\Omega} \left| f_i(\cdot + u_i) - \mu \right|^2 dx \;+\; \lambda \sum_{i=1}^{N} \iint_{\Omega \times \Omega} w(x, y) \left| u_i(x) - u_i(y) \right|^2 dx \, dy \;+\; \lambda_v \sum_{i=1}^{N-1} \iint_{\Omega \times \Omega} w_v(x, y) \left| v_i(x) - v_i(y) \right|^2 dx \, dy \qquad (1)$$
[0070] In equation (1), $E_{nl}$ corresponds to an objective function representative of the NRR framework, $\mu$ corresponds to the evolving mean PET image, and $u$ corresponds to the deformation fields that may be represented by $\{u_i\}_{i=1}^{N} = [u_1, u_2, \ldots, u_N]$. Also, $f_i(\cdot + u_i)$ is representative of the function $f_i(x + u_i(x))$ for every $x$. Further, $N$ corresponds to a selected number of PET gates represented by $\{f_i\}_{i=1}^{N}$, $\Omega$ corresponds to the domain of the pixels, and $i$ corresponds to a running index across the gates. Moreover, $w(x, y)$ and $w_v(x, y)$ correspond to weight functions representative of the weighted non-local displacement and velocity coherence penalties that may be used to identify one or more non-local pixels $y$ that may be correlated with a selected pixel $x$. Further, $\lambda$ and $\lambda_v$ correspond to scalar values that balance the weight functions, and $v_i$ corresponds to the velocity, which may also be represented by $[u_{i+1} - u_i]$.
[0071] In one embodiment, the group-wise NRR may be achieved by minimizing the objective function $E_{nl}$ using a time descent or a gradient descent technique. Specifically, in one embodiment, the objective function $E_{nl}$ may be minimized by iteratively updating the evolving mean PET image $\mu$ and the deformation fields $u$ using the time descent or gradient descent technique. Further, the objective function $E_{nl}$ may be discretized using semi-implicit finite differences in a multi-resolution framework to allow for efficient NRR of the gated PET images. Moreover, the objective function $E_{nl}$ may be represented as Euler-Lagrange equations, which include linear operators that can be advantageously used in standard implicit schemes that are employed in finite difference solvers. In addition, the objective function $E_{nl}$ may be constrained via use of the weighted non-local displacement and/or velocity coherence penalties that allow for a desired regularization or constraining of the deformation fields $u$, thereby providing a more stable NRR framework.
[0072] Particularly, in equation (1), the first regularization term, $\lambda \sum_{i} \iint_{\Omega \times \Omega} w(x, y) \left| u_i(x) - u_i(y) \right|^2 dx \, dy$, may correspond to the weighted non-local displacement coherence penalty that operates on the deformation fields $u$ and defines correlations in the deformation fields $u$ between non-local regions. Further, the second regularization term, represented by $\lambda_v \sum_{i} \iint_{\Omega \times \Omega} w_v(x, y) \left| v_i(x) - v_i(y) \right|^2 dx \, dy$, may correspond to the weighted non-local velocity coherence penalty, where the weighted non-local velocity coherence penalty defines the non-local spatial correlations of the velocity $v_i$. Specifically, the weighted non-local velocity coherence penalty may define a temporal trend of motion across non-local pixels, where the temporal trend of motion may be used to mitigate noise, large motion, and/or low contrast artifacts during the NRR of the gated PET images.
[0073] Additionally, in certain embodiments, the weighted non-local velocity coherence penalty may be selectively adjusted to be less restrictive on the deformation fields $u$ than the weighted non-local displacement coherence penalty. Particularly, in one embodiment, the weights corresponding to the non-local displacement and/or velocity coherence penalties may be adjusted via use of the scalar values $\lambda$ and $\lambda_v$ to balance the corresponding weight functions $w(x, y)$ and $w_v(x, y)$ in a desired manner. By way of example, the deformation fields $u$ may be adjusted via use of the scalar values $\lambda$ and $\lambda_v$ to allow for greater regularization of the displacement of the pixels as compared to the velocity of the pixels to minimize noisy artifacts in the resulting PET images.
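To make the two regularization terms of equation (1) concrete, the brute-force discretization below evaluates them over flattened pixel grids; it is quadratic in the number of pixels and therefore illustrative only for small images, with `lambda_d` and `lambda_v` playing the roles of the balancing scalars $\lambda$ and $\lambda_v$.

```python
# Direct discretization of the non-local displacement and velocity coherence
# penalties of equation (1); brute force, for illustration on small images.
import numpy as np

def nonlocal_penalty(field, weights):
    """sum_{x,y} w(x, y) |field(x) - field(y)|^2 over all pixel pairs."""
    flat = field.reshape(field.shape[0], -1)    # (components, n_pixels)
    diff = flat[:, :, None] - flat[:, None, :]  # pairwise differences
    return np.sum(weights * np.sum(diff ** 2, axis=0))

def coherence_terms(displacements, w, w_v, lambda_d, lambda_v):
    disp = lambda_d * sum(nonlocal_penalty(u, w) for u in displacements)
    velocities = [nxt - u for u, nxt in zip(displacements, displacements[1:])]
    vel = lambda_v * sum(nonlocal_penalty(v, w_v) for v in velocities)
    return disp + vel
```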
[0074] Accordingly, in one embodiment, a first range of weights may be assigned to the non-local displacement coherence penalty to constrain displacement of one or more selected pixels in a selected image to lie within a designated range of displacement values. Accordingly, at least one non-local region corresponding to the selected pixels in the selected image may be identified
based on known anatomical information. Subsequently, a value of the non-local displacement coherence penalty that reduces a difference in displacement of the one or more selected pixels and one or more other pixels corresponding to the non-local region may be determined.
[0075] Further, a second range of weights may be assigned to the non-local velocity coherence penalty to constrain the velocity of the selected pixels between consecutive image frames to lie within a designated range of velocity values. In one embodiment, the second range of weights may have values that are lower than the first range of weights. Selectively weighting the non-local displacement and/or velocity coherence penalties may allow for efficient control over the desired transformation of different regions in the gated PET images to the mean PET image. Accordingly, use of the weighted non-local displacement and/or velocity coherence penalties may aid in mitigating noise and/or artifacts caused by large motion, while preserving features of small structures in the resulting PET images.
[0076] Additionally, in a presently contemplated embodiment, use of the weight functions ??(?? ,??) and ????(?? ,??) may be optimized to allow for faster computation by use of Gaussian weights. In one example, the weight functions may be represented, for example, using Gaussian functions as defined in equations (2) and (3).
??(?? ,??)=????(|?? -??|) (2)
????(?? ,??)=??????(|?? -??|) (3)
where ???? and ?????? correspond to the Gaussian functions.
[0077] Moreover, when using Gaussian weights, minimization of the objective function Enl using a steepest descent technique entails simple convolutions that are fast to compute. Additionally, the steepest descent technique may employ linear operators that may be posed in simple implicit schemes for fast convergence, in turn, allowing for faster clinical diagnoses.
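The convolutional shortcut mentioned above can be seen in a few lines: for a normalized Gaussian weight, the gradient of the non-local coherence term with respect to the field reduces (up to a constant factor) to the field minus its Gaussian-blurred copy. The sketch below assumes scipy is available and is illustrative only.

```python
# With Gaussian weights, the coherence-penalty gradient is one convolution:
# grad ~ u - G_sigma * u, i.e. the field minus its Gaussian-smoothed copy.
import numpy as np
from scipy.ndimage import gaussian_filter

def coherence_gradient(field, sigma):
    """Gradient (up to a constant) of sum_{x,y} g_sigma(x-y)|u(x)-u(y)|^2."""
    blurred = np.stack([gaussian_filter(c, sigma) for c in field])
    return field - blurred  # vanishes wherever the field is locally coherent

# A regularization step then simply pulls the displacement field towards its
# Gaussian-weighted neighbourhood average:
#   u <- u - step * coherence_gradient(u, sigma)
```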
[0078] It may be noted that the embodiments described herein disclose use of the weighted non-local displacement and/or velocity coherence penalties. However, in certain embodiments, the weighted non-local displacement and/or velocity coherence penalties may be replaced with non-local total variation (NL-TV) terms and intensity dependent weights. Use of NL-TV terms
may allow for more accurate computation of the iterative transforms for aligning the gated PET images to the evolving mean PET image, albeit with a longer computation time. By way of example, $w(x, y) = g_{\sigma_f}(\lvert f_i(x) - f_i(y) \rvert) \, g_\sigma(\lvert x - y \rvert)$ may be chosen for a Gaussian $g_{\sigma_f}$, which in turn leads to regions with similar intensity moving coherently, thereby enhancing the capture of motion discontinuities.
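An intensity-dependent weight of the kind just described can be sketched as a bilateral-style product of two Gaussians, one on intensity difference and one on spatial distance; the function below is a hypothetical illustration with unnormalized Gaussians.

```python
# Bilateral-style non-local weight: pixels are coupled only when they are
# spatially close AND similar in intensity, so similar regions move coherently.
import numpy as np

def bilateral_weight(f, x, y, sigma_f, sigma_s):
    """w(x, y) for pixel index tuples x, y of a 2D image f."""
    intensity = np.exp(-((f[x] - f[y]) ** 2) / (2.0 * sigma_f ** 2))
    spatial = np.exp(-np.sum(np.subtract(x, y) ** 2) / (2.0 * sigma_s ** 2))
    return intensity * spatial
```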
[0079] Further, at step 310, an iterative check may be performed to determine if the mean PET image and/or the transforms have converged. In certain embodiments, the mean PET image may be identified to have converged when a similarity between the mean PET image and one or more of the plurality of gated PET images is equal to or greater than a prescribed threshold. Accordingly, in one embodiment, a matching metric that provides an indication of the similarity in intensities between the mean PET image and one or more of the gated PET images may be employed. Subsequently, the matching metric may be compared with the prescribed threshold. In one embodiment, the prescribed threshold may correspond to a user-selected or clinically prescribed value or percentage. Thus, if the matching metric corresponds to a value that is equal to or greater than the prescribed threshold value, the mean PET image and the transforms may be considered to have converged. Alternatively, if the mean PET image or the transforms do not change by a value that is equal to or greater than another prescribed threshold value, the mean PET image and the transforms may be considered to have converged. Control may then be passed to step 314.
[0080] However, at step 310, if it is determined that the mean PET image and/or the transforms have not converged, the mean PET image may be updated, as depicted by step 312. Particularly, in one embodiment, the mean PET image may be updated by averaging pixel intensities of the one or more transformed PET images generated at step 308. Subsequently, control may pass to step 308. According to aspects of the present specification, steps 308, 310, and 312 may be iteratively repeated until convergence of the iteratively updated mean PET image and/or the iteratively computed transforms.
[0081] Further, at step 314, one or more motion-corrected PET images corresponding to the target volume may be reconstructed based on the converged transforms. Generally, convergence of the mean PET image entails the spatial realignment and/or motion of pixels involved in the iterative transformations of the gated PET images to the mean PET image, which is representative of a high SNR PET image.
[0082] Alternatively, in certain embodiments, the motion-corrected PET images may be further averaged or processed to generate a median PET image. Additionally, any outlying data measurements corresponding to the averaged and/or median PET image may be identified and/or excluded, for example, using a suitable cost function. Subsequently, the one or more desired PET images having high SNR may be generated from the averaged and/or median image. In one embodiment, such high-SNR PET images may include minimal motion artifacts, preserve the size and standardized uptake value (SUV) of small structures such as tumors, and/or capture motion of low-contrast features. Accordingly, the desired PET images may find use, for example, in accurate detection, quantification, and/or localization of lesion regions that may not be easily detected in conventional PET images due to presence of motion artifacts and/or low contrast.
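One plausible realization of the outlier identification and exclusion described above is sketched below, assuming a median-absolute-deviation rule as a stand-in for the "suitable cost function"; the rejection rule and its parameter are illustrative assumptions.

import numpy as np

def robust_fused_image(warped_imgs, n_mad=3.0):
    """Illustrative per-voxel fusion of the motion-corrected images:
    take the voxel-wise median and discard measurements lying more
    than n_mad median-absolute-deviations from it before averaging.
    The MAD-based rejection rule is an assumed example, not the
    method prescribed by this specification."""
    stack = np.stack(warped_imgs, axis=0)
    median = np.median(stack, axis=0)
    mad = np.median(np.abs(stack - median), axis=0) + 1e-12
    inliers = np.abs(stack - median) <= n_mad * mad
    # Average only the inlying measurements at each voxel.
    return (stack * inliers).sum(axis=0) / inliers.sum(axis=0).clip(min=1)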
[0083] Further, at step 316, the motion-corrected PET images and/or the converged mean PET image may be displayed. Particularly, in one embodiment, the motion-corrected PET images and/or the converged mean PET image may be displayed on a display device, such as the output device 234 of FIG. 2, in near real-time. Additionally, in certain embodiments, clinical information, such as SUV and lesion volumes, that may be derived from the motion-corrected PET images and/or the converged mean PET image may be displayed to allow a medical practitioner to ascertain a medical condition of the patient with greater accuracy when compared to using conventional PET images. An exemplary performance of the present method may be illustrated via a comparison of sets of PET images generated using a conventional PET imaging technique, a conventional NRR technique, and an embodiment of the present method, as discussed in detail with reference to FIGs. 4A-4F.
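For context, a body-weight standardized uptake value of the kind mentioned above may be computed as in the following sketch; decay correction and unit handling are omitted, and the helper is purely illustrative rather than part of the described system.

def suv_body_weight(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Standard body-weight SUV: tissue activity concentration
    normalized by injected dose per gram of body weight (assuming a
    tissue density of roughly 1 g/ml).  Shown only to illustrate the
    kind of clinical parameter derivable from the motion-corrected
    images."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)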
[0084] FIG. 4A depicts a diagrammatical representation 400 of exemplary PET images of a cardio-thoracic region of a patient that are generated following conventional PET image reconstruction. Particularly, FIG. 4A illustrates gated PET images 402, 404, 406, and 408 that are generated using image data acquired at four different gates or gating intervals, Gate 1, Gate 2, Gate 3, and Gate 4, by a PET-CT imaging system. Specifically, the image data may be used to form gated PET sinogram data that is phase matched with cine CT image data employed for a corresponding PET attenuation correction. In certain embodiments, the image data is quantitatively corrected for random coincidences, dead-time, scatter, normalization, and cine CT-based attenuation correction. Subsequently, the corrected PET data is reconstructed using a conventional PET image reconstruction technique to generate initial PET images of a desired size.
[0085] As evident from the depictions of FIG. 4A, use of the conventional PET image reconstruction technique results in blurred PET images 402, 404, 406, and 408 that may not be suitable for use in certain clinical diagnoses without accurate motion correction. By way of example, as depicted in the gated PET image 408, the right lung lesion exhibits a large motion artifact relative to the corresponding size of the lesion. Additionally, the PET images 402, 404, 406, and 408 exhibit significant contrast variations. Moreover, clinically relevant features in a cardiac region are not consistently visible across all the gated PET images 402, 404, 406, and 408 that are generated using the conventional PET image reconstruction technique.
[0086] Further, FIG. 4B depicts a diagrammatical representation 410 of an exemplary mean PET image that is generated by averaging pixels corresponding to the gated PET images 402, 404, 406, and 408 of FIG. 4A. In the example illustrated in FIG. 4B, the mean PET image 410 is generated using the four gated PET images 402, 404, 406, and 408 of FIG. 4A. However, in certain other embodiments, a greater or smaller number of gated PET images may be used to generate the mean PET image 410. As evident from the depiction of FIG. 4B, even the mean PET image 410 exhibits significant blurring, and thus, may be unsuitable for clinical use.
[0087] Also, FIG. 4C depicts a diagrammatical representation 412 of exemplary PET images of a cardio-thoracic region of a patient. In one embodiment, registered PET images are generated by processing gated PET images using a conventional pair-wise NRR (PW-NRR) method. Particularly, FIG. 4C illustrates gated PET images 414, 416, 418, and 420 that are generated by processing gated PET images, such as the PET images 402, 404, 406, and 408 of FIG. 4A, using the conventional PW-NRR method.
[0088] As depicted in FIG. 4C, even the gated PET images 414, 416, 418, and 420 that are generated using the conventional PW-NRR method appear to include registration artifacts 422. Specifically, the gated PET image 414 depicts a split in a lesion along with streaking artifacts 422 at the lung interface. Typically, such streaking artifacts 422 are caused by large motion of the patient. Additionally, an effect of bias toward the reference is clearly depicted in the gated PET image 416, where distinct features in a cardiac region of the patient appear to have collapsed. Further, the gated PET image 420 also depicts large-scale artifacts that may typically be caused by significant contrast variations.
[0089] Moreover, FIG. 4D depicts a diagrammatical representation 424 of an exemplary mean PET image that is generated by averaging pixels corresponding to the gated PET images 414, 416, 418, and 420 of FIG. 4C. As evident from the depiction of FIG. 4D, the mean PET image 424 appears to have lost all definitive features that appear in the original gated PET images 402, 404, 406, and 408 of FIG. 4A.
[0090] FIG. 4E depicts a diagrammatical representation 426 of exemplary PET images of a cardio-thoracic region of a patient that are generated by processing gated PET images using an embodiment of the group-wise non-local (GW-NL) NRR method described with reference to FIG. 3. Particularly, FIG. 4E illustrates gated PET images 428, 430, 432, and 434 that are generated by processing gated PET images, such as the PET images 402, 404, 406, and 408 of FIG. 4A, using an embodiment of the GW-NL NRR method. As previously noted, the GW-NL NRR method is constrained via use of weighted non-local displacement and/or velocity coherence penalties for enhanced registration of the gated PET images.
[0091] As is evident from the depictions of FIG. 4E, the gated PET images 428, 430, 432, and 434 generated using the GW-NL NRR method exhibit significantly less blurring and depict even small structures. Particularly, the gated PET images 428, 430, 432, and 434 depict small structures such as a lesion with greater clarity as compared to PET images 414, 416, 418, and 420 of FIG. 4C that are generated using the conventional PW-NRR method.
[0092] Additionally, FIG. 4F depicts a diagrammatical representation 436 of an exemplary mean PET image that is generated by averaging pixels corresponding to the gated PET images 428, 430, 432, and 434 of FIG. 4E. As depicted in FIG. 4F, the mean PET image 436 also exhibits significantly less blurring, while still visualizing small structures. Use of the mean PET image 436, thus, may provide useful clinical indicators for an accurate diagnosis that may not be available in conventional PET images.
[0093] Embodiments of the present systems and methods, thus, allow for enhanced PET registration using non-local displacement and/or velocity coherence penalties. Use of the non-local displacement and/or velocity coherence penalties allows for efficient prediction of the displacement and/or velocity associated with large motion of a target region based on the corresponding displacement and/or velocity of one or more non-local regions in which effects of the large motion are easier to detect and/or measure. The velocity and/or displacement, thus determined using the non-local displacement and/or velocity coherence penalties, may aid in accurately predicting patient motion, and in turn, mitigating the image artifacts caused by the detected motion. Moreover, even small structures in the target volume, such as lesions or tumors, may be accurately reproduced in the resulting motion-corrected PET images via use of the non-local displacement and/or velocity coherence information. Accordingly, the resulting motion-corrected PET images may aid a medical practitioner in accurately distinguishing between different medical conditions, thereby improving patient outcomes.
[0094] It may be noted that the foregoing examples, demonstrations, and process steps that may be performed by certain components of the present systems, for example by the DAS 110, the processing subsystem 116, the DAS 208, specifically the processing unit 210 in the DAS 208, and/or image reconstruction unit 224 of FIGs. 1-2 may be implemented by suitable code on a processor-based system. To that end, the processor-based system, for example, may include a general-purpose or a special-purpose computer. It may also be noted that different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently.
[0095] Although specific features of embodiments of the present specification may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments, for example, to construct additional assemblies and methods for use in enhanced diagnostic imaging.
[0096] While only certain features of the present disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is,
therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

CLAIMS:

1. A method for imaging a target volume in a subject, comprising:
computing an evolving mean image by iteratively averaging a plurality of pixels corresponding to a plurality of gated images corresponding to the target volume;
iteratively computing one or more transforms for aligning one or more of the plurality of gated images to the evolving mean image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean image and the iteratively computed transforms;
generating one or more motion-corrected images corresponding to the target volume based on the converged transforms; and
displaying the one or more motion-corrected images, the converged mean image, or a combination thereof, on a display device.
2. The method of claim 1, further comprising receiving the plurality of gated images, wherein receiving the plurality of gated images comprises receiving a plurality of positron emission tomography images, a plurality of computed tomography images, a plurality of single photon emission computed tomography images, a plurality of magnetic resonance images, a plurality of X-ray images, a plurality of optical images, a plurality of ultrasound images, or combinations thereof.
3. The method of claim 1, wherein iteratively computing transforms comprises:
selecting one or more pixels in an image selected from the plurality of gated images;
identifying at least one non-local region corresponding to the one or more selected pixels in the selected image; and
determining the at least one non-local displacement coherence penalty that reduces a difference in displacement of the one or more selected pixels and one or more other pixels corresponding to the at least one non-local region.
4. The method of claim 1, wherein iteratively computing transforms comprises:
selecting one or more pixels in an image selected from the plurality of gated images;
identifying at least one non-local region corresponding to the one or more selected pixels in the selected image; and
determining the at least one non-local velocity coherence penalty that reduces a difference in velocity of the one or more selected pixels and one or more other pixels corresponding to the at least one non-local region.
5. The method of claim 1, further comprising selectively assigning weights to one or more of the at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty.
6. The method of claim 5, wherein selectively assigning weights comprises:
selecting one or more pixels in an image selected from the plurality of gated images;
assigning a first range of weights to the at least one non-local displacement coherence penalty for minimizing displacement of the one or more selected pixels over the plurality of gated images; and
assigning a second range of weights to the at least one non-local velocity coherence penalty for minimizing a velocity of the one or more selected pixels between two consecutive gated images in the plurality of gated images.
7. The method of claim 5, wherein selectively assigning weights comprises:
assigning a first range of Gaussian weights to the at least one non-local displacement coherence penalty; and
assigning a second range of Gaussian weights to the at least one non-local velocity coherence penalty, wherein the first range of Gaussian weights is higher than the second range of Gaussian weights.
8. The method of claim 5, further comprising:
selecting one or more non-local total variation terms as the at least one non-local displacement coherence penalty and the at least one non-local velocity coherence penalty; and
selectively assigning intensity dependent weights to the one or more selected non-local total variation terms.
9. The method of claim 1, wherein iteratively computing transforms comprises optimizing a determined objective function using a gradient descent algorithm, a steepest descent algorithm, or a combination thereof.
10. The method of claim 1, further comprising:
iteratively computing a similarity between one or more of the plurality of the gated images and the evolving mean image; and
identifying that the evolving mean image has converged upon determining that the computed similarity is greater than a prescribed threshold.
11. The method of claim 1, further comprising:
determining one or more clinical parameters based on the converged mean image; and
generating an alert if the one or more clinical parameters fall outside corresponding prescribed thresholds.
12. The method of claim 1, further comprising guiding a subsequent imaging scan of the target volume based on the one or more motion-corrected images.
13. A method for imaging a target volume in a subject, comprising:
computing an evolving mean positron emission tomography image by iteratively averaging a plurality of pixels corresponding to a plurality of gated positron emission tomography images corresponding to the target volume;
iteratively computing transforms for aligning one or more of the plurality of gated positron emission tomography images to the evolving mean positron emission tomography image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean positron emission tomography image and the iteratively computed transforms;
generating one or more motion-corrected positron emission tomography images corresponding to the target volume based on the converged transforms; and
displaying the one or more motion-corrected positron emission tomography images, the converged mean positron emission tomography image, or a combination thereof.
14. An imaging system, comprising:
a data acquisition subsystem configured to acquire image data corresponding to a target volume in a subject;
a processing unit operatively coupled to the data acquisition subsystem and configured to:
generate a plurality of gated images corresponding to the target volume;
compute an evolving mean image by iteratively averaging a plurality of pixels corresponding to the plurality of gated images;
iteratively compute transforms for aligning one or more of the plurality of gated images to the evolving mean image based on at least one non-local displacement coherence penalty and at least one non-local velocity coherence penalty until convergence of the evolving mean image and the iteratively computed transforms; and
generate one or more motion-corrected images corresponding to the target volume based on the converged transforms.
15. The imaging system of claim 14, wherein the imaging system is a positron emission tomography imaging system, a computed tomography imaging system, a single photon emission computed tomography system, a magnetic resonance imaging system, an X-ray imaging system, an optical imaging system, an ultrasound imaging system, or combinations thereof.
16. The imaging system of claim 14, further comprising an alerting subsystem communicatively coupled to the processing unit and configured to generate an alert if one or more clinical parameters determined based on the one or more motion-corrected images fall outside prescribed thresholds.