Abstract: Methods and systems for ultrasound imaging are disclosed. One or more candidate structures corresponding to a feature of interest are identified in each image frame in a plurality of image frames corresponding to a subject. Each candidate structure in each image frame is evaluated. At least one candidate structure is retained based on the evaluation. Correspondence between the retained candidate structure and the feature of interest is determined. The retained candidate structure is identified as the feature of interest based on the correspondence. A subset of image frames including the feature of interest is identified from the plurality of image frames. A correlation indicator representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane is computed for each image frame in the subset using a physics-based forward model. An optimal image frame is selected from the subset based on the correlation indicator. FIG. 2
METHOD AND SYSTEM FOR AUTOMATICALLY IDENTIFYING A FEATURE OF INTEREST IN A DESIRED SCAN PLANE
BACKGROUND
[0001] Embodiments of the present disclosure relate generally to diagnostic imaging, and more particularly to a method and system for identifying an optimal image frame including a feature of interest in a desired scan plane.
[0002] Medical diagnostic ultrasound is an imaging modality that employs ultrasound waves to probe the acoustic properties of biological tissues and produces corresponding images. Particularly, diagnostic ultrasound systems are used to provide an accurate visualization of muscles, tendons, and other internal organs to assess their size, structure and any pathological lesions using near real time tomographic images. Further, diagnostic ultrasound is used for determining movement, for example, corresponding to blood flow within the body. Additionally, ultrasound systems also find use in therapeutics where an ultrasound probe is used to guide interventional procedures such as biopsies or to track an interventional device.
[0003] Ultrasound systems also find extensive use in prenatal imaging. For example, ultrasound images may be used for assessing gestational age (GA) and weight of a fetus. Further, two-dimensional (2D) and/or three-dimensional (3D) ultrasound images may allow for measurement of specific features of fetal anatomy such as the head, abdomen, or femur. Measurement of the specific features, in turn, may be used in determination of the GA, assessment of growth patterns, and/or identification of anomalies in the fetus. By way of example, measurement of a length of the femur in the second and third trimesters of pregnancy provides a significant indication of fetal growth. In common clinical practice, the length of the femur is measured by moving an ultrasound transducer over the abdomen until the femur is visible in a desired scan plane. Specifically, a sonographer may strive to select a scan plane in which the femur is approximately normal to an ultrasound beam by repeatedly repositioning a transducer probe over a target region. Further, the length of the femur may be measured by selecting corresponding endpoints on a display. Subsequently, the GA corresponding to the measured length may be determined from standard Obstetric (OB) Tables. Use of accurate femur length measurements prevents dating errors, in turn, allowing for accurate diagnosis and prescription.
[0004] Acquiring an optimal image frame for accurate femur length measurement, however, is a challenging procedure. Generally, an accurate measurement of the femur length entails measurement of a proximal femur positioned at about 90 degrees to the ultrasound beam in an acquired image frame. Further, clinical guidelines stipulate the inclination of the femur in the image frame to be within 30 degrees of the face of an ultrasound transducer. In such an acquired frame, the femur typically appears as a uniformly bright and elongated object with sharp ends in the field of view.
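The 30-degree inclination guideline above can be checked numerically once the end points of the imaged femur are known. The following is a minimal sketch, assuming end points are given in image (x, y) pixel coordinates with the transducer face parallel to the x-axis; the function name and coordinate convention are illustrative, not taken from the disclosure.

```python
import math

def within_inclination_guideline(p1, p2, max_deg=30.0):
    """Return True if the segment p1-p2 is inclined no more than max_deg
    degrees from the horizontal (i.e., the face of the transducer)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    # Fold angles past 90 degrees back, since segment direction is irrelevant.
    return min(angle, 180.0 - angle) <= max_deg
```

For instance, a femur whose end points differ by 100 pixels horizontally and 20 pixels in depth is inclined about 11 degrees and would pass the check, regardless of which end point is listed first.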
[0005] Acquisition of an image frame that satisfies the stipulated guidelines, however, may be complicated due to imaging artifacts caused by near field haze resulting from subcutaneous fat layers, unpredictable patient movement, and/or ubiquitous speckle noise. Further, operator and/or system variability may also limit reproducibility of the fetal length measurements. For example, sub-optimal ultrasound image settings such as gain compensation and dynamic range may reduce an ability to visualize internal structures of the human body. Similarly, even small changes in positioning of the ultrasound transducer may lead to significant changes in the visualized image frame, thus leading to incorrect measurements.
[0006] Accurate ultrasound measurements, thus, typically entail meticulous and lengthy acquisitions by experienced sonographers. For example, certain conventional ultrasound imaging methods have been known to employ training algorithms and/or semi-automated methods that use image-derived characteristics for use in diagnosis and treatment. These conventional methods rely on the sonographer's selection of the optimal image frame from a plurality of image frames. In a conventional clinical workflow, the sonographer may continue to search for a better image frame even after identifying an acceptable image frame in the hope of determining measurements that are more accurate. However, upon failing to find a better image frame, the sonographer may have to scroll back to an originally acceptable frame manually, thus prolonging imaging time and hindering reproducibility. Ultrasound imaging using conventional methods and/or by a novice sonographer, therefore, may not allow for measurements suited for real-time diagnosis and treatment.
[0007] Moreover, performance of these conventional methods depends upon the skill and experience of the sonographer, thus limiting the availability of quality imaging services, for example, to large hospitals and urban areas. Scarcity of skilled and/or experienced sonographers in remote or rural regions, thus, may cause these regions to be poorly or under-served.
BRIEF DESCRIPTION
[0008] In accordance with exemplary aspects of the present disclosure, a method for ultrasound imaging is presented. The method includes identifying one or more candidate structures corresponding to a feature of interest in each image frame in a plurality of image frames corresponding to a subject. Further, the method includes evaluating each candidate structure in the one or more candidate structures in each image frame. Additionally, the method includes retaining at least one candidate structure from the one or more candidate structures based on the evaluation. Moreover, the method includes determining a correspondence between the at least one retained candidate structure and the feature of interest. The method further includes identifying the at least one retained candidate structure as the feature of interest based on the correspondence. Additionally, the method includes identifying a subset of image frames from the plurality of image frames, where the subset of image frames comprises the identified feature of interest. Furthermore, the method includes computing a correlation indicator for each image frame in the subset of image frames using a physics-based forward model, where the correlation indicator is representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane. The method also includes selecting an optimal image frame from the subset of image frames based on the correlation indicator.
[0009] In accordance with exemplary aspects of the present disclosure, another method for ultrasound imaging is disclosed. The method includes generating a plurality of binary images corresponding to a plurality of image frames. Further, the method includes identifying one or more candidate structures corresponding to a feature of interest in each binary image in the plurality of binary images using connected component analysis. The method also includes selecting a candidate structure from the one or more candidate structures in the binary image based on one or more characteristics corresponding to the one or more candidate structures in the binary image. Furthermore, the method includes determining end points of the selected candidate structure. Additionally, the method includes generating a first mask using the determined end points. The method further includes applying the first mask to the binary image to generate a second mask corresponding to the selected candidate structure. Moreover, the method includes applying the second mask to an image frame that corresponds to the binary image for identifying the feature of interest in the image frame.
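The candidate-selection steps above can be sketched in a few lines using connected-component labeling. This is a simplified illustration, not the claimed implementation: the elongation score, `min_area` parameter, and end-point heuristic (which assumes a mostly horizontal structure such as a well-positioned femur) are assumptions introduced here for clarity.

```python
import numpy as np
from scipy import ndimage

def select_candidate(binary_img, min_area=50):
    """Label connected components in a binary image and keep the most
    elongated blob as the candidate structure; return its mask or None."""
    labeled, n = ndimage.label(binary_img)
    best_label, best_score = 0, 0.0
    for lbl in range(1, n + 1):
        mask = labeled == lbl
        area = mask.sum()
        if area < min_area:
            continue
        ys, xs = np.nonzero(mask)
        # Elongation score: squared spatial extent relative to pixel count,
        # so long thin structures beat compact blobs of similar size.
        extent = np.hypot(xs.max() - xs.min(), ys.max() - ys.min())
        score = extent ** 2 / area
        if score > best_score:
            best_label, best_score = lbl, score
    return labeled == best_label if best_label else None

def end_points(mask):
    """Approximate end points as the extreme pixels along the x-axis
    (adequate for structures closer to horizontal than vertical)."""
    ys, xs = np.nonzero(mask)
    i_min, i_max = np.argmin(xs), np.argmax(xs)
    return (xs[i_min], ys[i_min]), (xs[i_max], ys[i_max])
```

The returned mask plays the role of the second mask in the paragraph above: applied to the original image frame, it isolates the pixels of the selected candidate structure.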
[0010] In accordance with exemplary aspects of the present disclosure, yet another method for ultrasound imaging is presented. The method includes receiving a plurality of image frames including a feature of interest. Additionally, the method includes receiving one or more characteristics corresponding to the feature of interest in each of the plurality of image frames, where the one or more characteristics comprise end points, length, depth, and/or intensity profile. The method also includes determining a reference template corresponding to the feature of interest using the one or more characteristics and a physics-based forward model. Moreover, the method includes computing a reference intensity profile of the feature of interest using the reference template. The method also includes determining corresponding correlations between the reference intensity profile and the intensity profile of the feature of interest in each of the plurality of image frames. Furthermore, the method includes identifying an optimal image frame based on a correlation indicator indicative of the determined correlation corresponding to each of the plurality of image frames.
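The profile-matching step above can be sketched as a normalized cross-correlation between each frame's extracted intensity profile and a reference profile. The Gaussian-smoothed rectangle used for `reference_profile` below is a simplified stand-in for the physics-based forward model described in the text, and all parameter values are illustrative assumptions.

```python
import numpy as np

def reference_profile(length_px, sigma=2.0):
    """Idealized intensity profile of a uniformly bright structure of the
    given length: a rectangle smoothed by a small Gaussian kernel."""
    x = np.arange(length_px + 20) - 10
    rect = ((x >= 0) & (x < length_px)).astype(float)
    kernel = np.exp(-0.5 * (np.arange(-8, 9) / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(rect, kernel, mode="same")

def correlation_indicator(profile, reference):
    """Normalized cross-correlation between a measured and a reference
    intensity profile; near 1.0 indicates a close match."""
    n = min(len(profile), len(reference))
    a, b = profile[:n], reference[:n]
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.dot(a, b) / n)

def select_optimal_frame(profiles, reference):
    """Return the index of the frame whose profile correlates best with
    the reference, along with all per-frame indicators."""
    scores = [correlation_indicator(p, reference) for p in profiles]
    return int(np.argmax(scores)), scores
```

In this sketch, the per-frame score list corresponds to the correlation indicators of the paragraph above, and the argmax corresponds to the identified optimal image frame.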
[0011] In accordance with exemplary aspects of the present disclosure, an imaging system is disclosed. The system includes an acquisition subsystem configured to obtain a plurality of image frames corresponding to a target region of a subject. The system further includes a processing unit in operative association with the acquisition subsystem. The processing unit is configured to identify one or more candidate structures corresponding to a feature of interest in each image frame in a plurality of image frames corresponding to a subject. The processing unit is further configured to evaluate each candidate structure in the one or more candidate structures in each image frame. The processing unit is also configured to retain at least one candidate structure from the one or more candidate structures based on the evaluation. Additionally, the processing unit is configured to determine a correspondence between the at least one retained candidate structure and the feature of interest. Further, the processing unit is configured to identify the at least one retained candidate structure as the feature of interest if the determined correspondence is greater than a determined threshold. The processing unit is also configured to identify a subset of image frames from the plurality of image frames, wherein the subset of image frames comprises the identified feature of interest. Moreover, the processing unit is configured to compute a correlation indicator for each image frame in the subset of image frames using a physics-based forward model, where the correlation indicator is representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane. Furthermore, the processing unit is configured to select an optimal image frame from the subset of image frames based on the correlation indicator.
DRAWINGS
[0012] These and other features and aspects of embodiments of the present technique will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0013] FIG. 1 is a schematic representation of an exemplary ultrasound imaging system, in accordance with aspects of the present disclosure;
[0014] FIG. 2 is a flow chart illustrating an exemplary method for ultrasound imaging, in accordance with aspects of the present disclosure;
[0015] FIG. 3 is a flow chart illustrating an exemplary method for detecting presence of a feature of interest in ultrasound images, in accordance with aspects of the present disclosure;
[0016] FIG. 4 is a graphical representation of exemplary receiver operating characteristics (ROC) curves corresponding to different image frames generated using the method of FIG. 3, in accordance with aspects of the present disclosure;
[0017] FIG. 5 is another graphical representation of exemplary ROC curves corresponding to different image frames generated using the method of FIG. 3, in accordance with aspects of the present disclosure;
[0018] FIG. 6 is a flow chart illustrating an exemplary method for identifying an optimal image frame including a feature of interest in a desired scan plane, in accordance with aspects of the present disclosure;
[0019] FIG. 7 is a schematic representation of geometry of a femur modeled as a finite edge, in accordance with aspects of the present disclosure; and
[0020] FIG. 8 is a top view of the finite edge depicted in FIG. 7, in accordance with aspects of the present disclosure.
DETAILED DESCRIPTION
[0021] The following description presents systems and methods for identifying an optimal image frame including a feature of interest in a desired scan plane. Particularly, certain embodiments illustrated herein describe the systems and the methods that detect a desired feature of interest in a plurality of image frames and identify an optimal image frame that includes the feature of interest in the desired scan plane. As used herein, the term "desired scan plane" may correspond to a cross-sectional slice of a target anatomy that satisfies clinical, user-defined, and/or application-specific guidelines to provide accurate and reproducible measurements of the feature of interest.
[0022] Further, embodiments of the present systems and methods allow for communication of the optimal image frame and accurate measurements of the feature of interest to a user for use in further diagnosis. The feature of interest, for example, may include anatomical features such as the femur or the humerus bone of a fetus, or an interventional device such as a catheter or needle within the body of a patient.
[0023] Specifically, embodiments described herein allow for identification of a feature of interest that corresponds to a long and linear structure having high acoustic impedance using edge diffraction formalism. As used herein, the term "long and linear structure" is used to refer to a feature of interest having a length that is at least ten times the wavelength of an incident acoustic wave used to image the feature of interest. Certain embodiments described herein may also allow communication of a correlation indicator indicative of a probability that each of the image frames generated in real-time provides measurements of the feature of interest that satisfy the clinical, user-defined, and/or application-specific guidelines.
[0024] Although the following description includes embodiments relating to medical diagnostic ultrasound imaging, these embodiments may also be implemented in other medical imaging systems. These systems, for example, may include optical imaging systems, and/or systems that monitor targeted drug and gene delivery. In certain embodiments, the present systems and methods may also be used during non-medical imaging, for example, during nondestructive testing of elastic materials that may be suitable for ultrasound imaging and/or security screening. An exemplary environment that is suitable for practising various implementations of the present system is described in the following sections with reference to FIG. 1.
[0025] FIG. 1 illustrates an exemplary ultrasound system 100 for providing robust and reproducible imaging performance. To that end, the system 100 may be configured as a console system or a cart-based system. Alternatively, the system 100 may be configured as a portable system, such as a hand-held, laptop-based, and/or a Smartphone-based system. Particularly, implementing the system 100 as a portable system may allow for pervasiveness of ultrasound imaging in rural regions, where skilled and experienced sonographers are typically in short supply.
[0026] In accordance with aspects of the present disclosure, the system 100 may be configured to automatically identify an optimal image frame from a plurality of image frames for determining accurate measurements corresponding to a feature of interest. Particularly, the system 100 may be configured to determine the measurements corresponding to the feature of interest using edge diffraction formalism. These measurements may then be used to assess a pathological condition of a subject.
[0027] For clarity, the present disclosure is described with reference to identifying an optimal image frame for accurate measurement of the length of the femur of a fetus. However, certain embodiments may allow for automatic identification of optimal image frames for measuring other long and linear features of interest such as the tibia or the humerus bone in a fetus. Embodiments of the present disclosure may also be employed for real-time detection, segmentation, and/or tracking of long and linear non-biological structures such as manufactured parts, catheters, or needles used during interventional procedures.
[0028] Conventional ultrasound systems employ previously determined information for identifying the features of interest, for example, using supervised learning and/or semi-automated segmentation, which may entail prolonged processing. However, embodiments of the present disclosure allow for fast and accurate measurements of the feature of interest using a forward model based on knowledge of anatomy and physics-based principles. The forward model allows for automatic detection of the feature of interest and identification of the optimal image frame that includes the feature of interest in a desired scan plane. As used herein, the term "optimal image frame" is used to refer to an image frame that is most suitable for determining measurements that are in accordance with one or more determined guidelines.
[0029] In one embodiment, the image frames may be acquired by imaging a target region of interest (ROI) of the subject. To that end, in certain embodiments, the system 100 may include transmit circuitry 102 that may be configured to generate a pulsed waveform to drive an array 104 of transducer elements 106 housed within a transducer probe 108. Particularly, the pulsed waveform drives the array 104 of transducer elements 106 to emit ultrasonic pulses into a body or volume of interest of the subject. At least a portion of the ultrasonic pulses generated by the transducer elements 106 back-scatter from the target ROI to produce echoes that return to the transducer array 104 and are received by a receive circuitry 110 for further processing.
[0030] In one embodiment, the receive circuitry 110 may be operatively coupled to a beam former 112 that may be configured to process the received echoes and output corresponding radio frequency (RF) signals. Although FIG. 1 illustrates the transducer array 104, the transmit circuitry 102, the receive circuitry 110, and the beam former 112 as distinct elements, in certain embodiments, one or more of these elements may be implemented together as an independent acquisition subsystem in the system 100. The acquisition subsystem may be configured to acquire image data corresponding to the subject, such as a patient, for further processing.
[0031] A processing unit 114 may be configured to receive and process the acquired image data, for example, the RF signals according to a plurality of selectable ultrasound imaging modes in near real-time and/or offline mode. To that end, the processing unit 114 may be operatively coupled to the beam former 112, the transducer probe 108, and/or the receive circuitry 110. In one example, the processing unit 114 may include devices such as one or more general-purpose or application-specific processors, digital signal processors, microcomputers, microcontrollers, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGA), or other suitable devices in communication with other components of the system 100.
[0032] In certain embodiments, the processing unit 114 may be configured to provide control and timing signals for configuring one or more imaging parameters for imaging the target ROI. By way of example, the imaging parameters may include a sequence of delivery of different pulses, frequency of the pulses, a time delay between two different pulses, intensity of the pulses, and/or other such imaging parameters. Particularly, in one embodiment, the processing unit 114 may be operatively coupled to a power source 116, for example through a communications link 117, to drive one or more components of the system 100. Accordingly, the power source 116 may include, for example, a fixed current outlet and/or a battery to provide drive voltage to the components of the system 100 for imaging the target ROI.
[0033] Moreover, in one embodiment, the processing unit 114 may be configured to store the delivery sequence, frequency, time delay, and/or beam intensity, for example, in a memory device 118 for use in imaging the target ROI.
The memory device 118 may include storage devices such as a random access memory, a read only memory, a disc drive, solid-state memory device, and/or a flash memory. In certain embodiments, the processing unit 114 may be configured to use the stored information for configuring the transducer elements 106 to direct one or more groups of pulse sequences toward the target ROI, for example, the fetus.
[0034] Further, the processing unit 114 may be configured to track the displacements in the target ROI caused in response to the incident pulses to determine corresponding characteristics. These characteristics, for example, may include size of the head, abdomen, or the femur that allow determination of GA, assessment of growth patterns, and identification of anomalies in the fetus. The displacements and characteristics, thus determined, may be stored in the memory device 118. Additionally, the displacements and/or the determined characteristics may be communicated to a user, such as a sonographer, for further assessment.
[0035] In certain embodiments, the processing unit 114 may also be coupled to one or more user input-output devices 120 for receiving commands and inputs from the user. The input-output devices 120, for example, may include devices such as a keyboard, a touch screen, a microphone, a mouse, a control panel, a display device 122, a foot switch, a hand switch, and/or a button. In one embodiment, the display device 122 may include a graphical user interface (GUI) for providing the user with configurable options for imaging desired regions of the subject. By way of example, the configurable options may include a selectable image frame, a selectable ROI, a desired scan plane, a delay profile, a designated pulse sequence, a desired pulse repetition frequency, and/or other suitable system settings to image the desired ROI. Additionally, the configurable options may further include a choice of diagnostic information to be communicated to the user. The diagnostic information, for example, may include a magnitude of strain and/or stiffness in the target ROI estimated from the received signals.
[0036] Accordingly, in one embodiment, the processing unit 114 may be configured to process the RF signal data to prepare image frames and to generate the requested diagnostic information based on user input. Particularly, the processing unit 114 may be configured to process the RF signal data to generate 2D, 3D, and/or four-dimensional (4D) datasets corresponding to different imaging modes. By way of example, the processing unit 114 may generate B-mode, color Doppler, power Doppler, M-mode, anatomical M-mode, strain, strain rate, and/or spectral Doppler image frames based on specific scanning and/or user-defined requirements.
[0037] In certain embodiments, the processing unit 114 may be configured to generate the image frames in real-time while scanning the target ROI and receiving corresponding echo signals. As used herein, the term "real-time" may be used to refer to an imaging rate from about 10 to about 30 image frames per second (fps) with a delay of less than 1 second. Also, in one embodiment, the processing unit 114 may be configured to customize the delay in reconstructing and rendering the image frames based on specific system and/or imaging requirements. Further, the processing unit 114 may be configured to process the RF signal data such that a resulting image is rendered, for example, at the rate of 10 fps on the associated display device 122 that is communicatively coupled to the processing unit 114.
[0038] In one embodiment, the display device 122 may be a local device. Alternatively, the display device 122 may be remotely located to allow a remotely located medical practitioner to track diagnostic information corresponding to the subject. In certain embodiments, the processing unit 114 may be configured to update the image frames on the display device 122 in an offline and/or delayed update mode. Particularly, the image frames may be updated in the offline mode based on the echoes received over a determined period of time. Alternatively, the processing unit 114 may be configured to dynamically update the image frames and sequentially display the updated image frames on the display device 122 as and when additional frames of ultrasound data are acquired.
[0039] With continued reference to FIG. 1, in certain embodiments, the system 100 may further include a video processor 124 that may be configured to digitize the received echoes and output a resulting digital video stream on the display device 122. In one embodiment, the video processor 124 may be configured to store the image frames corresponding to the target ROI for later review and analysis. Alternatively, the video processor 124 may be configured to communicate the image frames to a remote location for allowing the remotely located medical practitioner to diagnose a patient condition and/or to prescribe treatment.
[0040] Furthermore, in certain embodiments, the video processor 124 may be configured to display the video stream along with patient-specific diagnostic and/or therapeutic information in real-time while the patient is being imaged. Particularly, in one embodiment, the video processor 124 may be configured to automatically identify an image frame that depicts the feature of interest in a desired scan plane. The desired scan plane may correspond to an optimal scan plane that, in accordance with prescribed guidelines, is most suitable for measuring one or more desired characteristics of the feature of interest. For example, when imaging the femur, the desired scan plane may correspond to a scan plane where the femur is positioned at about 90 degrees to the ultrasound beam. Alternatively, a desired scan plane for tracking progress of an interventional device such as a catheter or a needle through the body of the patient may correspond to in-plane or out-of-plane guidance procedures.
[0041] In another embodiment, the video processor 124 may be configured to determine a correlation indicator representative of a probability that each of the image frames provides accurate measurements of the feature of interest in real time. The video processor 124 may be configured to determine the correlation indicator by extracting the feature of interest from the image frame and computing a correlation between an intensity profile of the feature of interest and a template having substantially the same length as the feature of interest. Particularly, in accordance with aspects of the present disclosure, the video processor 124 may be configured to generate the template using high frequency edge diffraction formalism. When using edge diffraction formalism, the femur is assumed to be representative of an edge with a length substantially larger than the wavelength of ultrasound. Moreover, in one embodiment, the video processor 124 may also be configured to identify the scan plane having the highest correlation with the template as the optimal scan plane.
[0042] Additionally, the video processor 124 may be configured to communicate the correlation indicator representative of the determined correlation between an image frame and the template to a user in real-time. In one embodiment, the video processor 124 may be configured to communicate the correlation indicator visually using the display device 122. For example, the correlation indicator may be represented using a color bar, a pie chart, and/or a number. Alternatively, the video processor 124 may be configured to communicate the correlation indicator using an audio and/or a video feedback. In one example, the audio feedback may include one or more beeps or speech in a language of choice.
[0043] In certain embodiments, the video processor 124 may be configured to provide the audio and/or video feedback to allow for a semi-automated user-based selection of an optimal image frame that includes the feature of interest in the desired scan plane. Alternatively, the system 100 may be configured to reinitiate scanning of the target region, automatically or based on user input, if the correlation indicator corresponding to none of the image frames is greater than a clinically acceptable threshold. For example, in one embodiment, the video processor 124 may be configured to 'auto-freeze' the image frame having the highest correlation with the template for measuring one or more characteristics of the feature of interest. Additionally, the video processor 124 may be configured to trigger automated measurements once the image frame having the optimal scan plane is identified.
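The auto-freeze-or-rescan behavior described above reduces to a simple threshold rule over the per-frame correlation indicators. The numeric threshold and function name below are assumptions for illustration; the disclosure does not specify a particular clinically acceptable value.

```python
def freeze_or_rescan(correlation_scores, threshold=0.8):
    """Return ('freeze', index_of_best_frame) when some frame's correlation
    indicator meets the assumed clinically acceptable threshold; otherwise
    return ('rescan', None) to indicate the target region should be rescanned."""
    if not correlation_scores:
        return "rescan", None
    best = max(range(len(correlation_scores)),
               key=correlation_scores.__getitem__)
    if correlation_scores[best] >= threshold:
        return "freeze", best
    return "rescan", None
```

A frame selected this way could then trigger the automated measurements mentioned above, while a rescan result could prompt the operator feedback loop instead.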
[0044] Such real-time identification of the optimal scan plane and/or the correlation indicator precludes switching back and forth between different image frames in a cine loop for identifying the optimal scan plane. Further, using edge diffraction formalism for determining the correlation indicator for each image frame allows for real-time detection of the feature of interest and a corresponding optimal scan plane. Embodiments of the present disclosure, thus, allow for a reduction in imaging time, while providing enhanced performance as compared to conventional training and segmentation based methods. An exemplary method for automatically identifying a feature of interest in a desired scan plane and computing a correlation indicator representative of suitability of the image frame to allow for accurate measurements is described in greater detail with reference to FIG. 2.
[0045] FIG. 2 illustrates a flow chart 200 depicting an exemplary method for ultrasound imaging. In the present disclosure, embodiments of the exemplary method may be described in a general context of computer executable instructions on a computing system or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0046] Additionally, embodiments of the exemplary method may also be practised in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0047] Further, in FIG. 2, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that are performed, for example, during generating a binary image, identifying the feature of interest, and/or correlation determination phases of the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0048] The order in which the exemplary method is described is not intended to be construed as a limitation and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein. For discussion purposes, the exemplary method will be described with reference to the elements of FIG. 1.
[0049] Embodiments of the present disclosure allow for automatic detection of a feature of interest and identification of the optimal image frame that includes the feature of interest in a desired scan plane. As previously noted, the desired scan plane may correspond to a cross-sectional slice of a target ROI that satisfies clinical, user-defined, and/or application-specific guidelines to provide accurate and reproducible measurements of the feature of interest.
[0050] Particularly, certain embodiments of the present method provide for automatic identification of an optimal image frame that allows for efficient measurement of a feature of interest corresponding to a long, bright, and linear structure such as a catheter, a needle, a pipe, or a long bone in the body such as the femur, humerus, or tibia. For clarity, the present method is described with reference to detection and identification of the femur bone in image frames corresponding to a fetus. However, it may be appreciated that other long, bright, and linear structures may similarly be identified using embodiments of the present method.
[0051] Further, in accordance with aspects of the present disclosure, embodiments of the present method employ a physics-based forward model for identifying the optimal image frame that includes the feature of interest in the desired scan plane. Particularly, the physics-based forward model may provide a representation of the behavior of ultrasound signals when imaging a long and linear structure of high acoustic impedance positioned within the desired scan plane and/or outside the desired scan plane. For example, the physics-based forward model may employ high frequency edge diffraction formalism that assumes the long and linear feature of interest to be representative of a low absorption edge with a length much larger than the wavelength of ultrasound.
[0052] Accordingly, the embodiment of the present method depicted in FIG. 2 is described with reference to the use of edge diffraction formalism for identifying a long and linear structure in the image frames. However, in certain embodiments, other suitable physics-based models may be similarly used to model the behavior of a feature of interest having any other shape, for example a spherical shape, for identifying the feature of interest disposed in a desired scan plane.
[0053] The method begins at step 202, where a plurality of image frames is received. In one embodiment, the image frames may be received from an acquisition subsystem, such as the ultrasound system 100 of FIG. 1, configured to acquire imaging data from a target ROI of a subject, for example, a fetus. Alternatively, the image frames may be received from a storage repository such as the memory device 118 of FIG. 1. The received image frames, for example, may include 2D, 3D, and/or 4D image data. Additionally, in certain embodiments, the image frames may correspond to cine loops that include a series of 2D images acquired over a determined period of time.
[0054] It may be desirable to determine presence of a feature of interest in the plurality of image frames. Accordingly, at step 204, one or more candidate structures corresponding to the feature of interest may be identified in each image frame in the plurality of image frames. In one embodiment, a processing subsystem such as the processing unit 114 of FIG. 1 may be configured to identify the candidate structures corresponding to the feature of interest such as the femur.
[0055] To that end, the processing unit 114 may apply a sequence of processing steps to each image frame in the plurality of image frames. In one example, the sequence may entail applying a sector mask to each image frame in the plurality of image frames to retain only a desired field of view. Further, a Frangi vesselness filter followed by k-means clustering may be applied to binarize the image frame. Subsequently, a connected component analysis may be performed on the image frame to identify the candidate structures corresponding to the femur. In certain embodiments, the candidate structures may be skeletonized. Moreover, branches of the skeletonized structures may be pruned to determine end points that may possibly correspond to the end points of the femur.
[0056] Once identified, it may be determined whether the candidate structure indeed corresponds to the feature of interest, thus establishing the presence of the feature of interest in the image frame. To that end, at step 206, each candidate structure in the one or more candidate structures may be evaluated. In one embodiment, the evaluation may entail ranking each candidate structure based on one or more corresponding characteristics. The characteristics may include, for example, aspect ratio, phase symmetry, intensity profile, size, location, and/or vesselness of the feature of interest. At least one candidate structure may be retained in the image frame based on the evaluation, as depicted by step 208. In one embodiment, for example, strength of a desired combination of the characteristics of the candidate structures may be ascertained to retain one or more best-ranked candidate structures as the possible femur.
[0057] Furthermore, at step 210, a correspondence between the retained candidate structure and the feature of interest may be determined. Generally, presence of other long and linear candidate structures such as skin or muscle tissues in an image frame may confound the identification of the femur. Accordingly, in certain embodiments, femur characteristics such as uniformity of brightness may be used to compute a metric for the retained candidate. Particularly, in one embodiment, the metric for each of the candidate structures in the image frame may be computed as the ratio of difference between the area of the largest connected component and the area of the other components in the foreground of the candidate structure to the total area of the candidate structure.
[0058] In an alternative embodiment, sharpness of the end points of the retained candidate structures may be used as a criterion to establish the correspondence with the femur. This criterion follows from the clinical observation that the ends of a properly imaged femur are sharp. Accordingly, in one example, one or more masks may be used to generate neighborhood regions around the end points for quantifying their sharpness as a computed metric. In one embodiment, the sharpness metric may be defined as a percentage overlap between an area of the candidate structure within the neighborhood region and the foreground of the image frame within the neighborhood region.
[0059] In embodiments that employ uniformity of brightness and/or sharpness of the end points as evaluation criteria, the computed metric may be indicative of the correspondence between the retained candidate structures and the femur. In one example, a higher value of the computed metric may be representative of a greater probability of the candidate structure being the femur.
[0060] Accordingly, at step 212, the retained candidate structure may be identified as the feature of interest based on the determined correspondence. In certain embodiments, the retained candidate structure whose correspondence with the feature of interest is greater than a determined threshold may be positively identified as the feature of interest. In one example, the threshold may be predefined during system setup, and/or may be determined based on user input, and/or historical information. Specifically, in one embodiment, the retained candidate structure having a suitably high value of the computed metric may be deemed to correspond to the femur, thus establishing presence of the femur in the image frame.
[0061] Further, at step 214, one or more segmentation masks may be generated to isolate the candidate structure identified as the feature of interest from the other structures in the image frame. To that end, the segmentation masks may be applied to the image frame corresponding to the binary image to generate a processed image frame that may be used to identify the feature of interest in the originally acquired grayscale image frame.
[0062] Additionally, at step 216, a subset of image frames may be identified from the plurality of image frames, where the subset of image frames includes the feature of interest identified at step 212. Steps 204, 206, 208, 210, 212, 214, and 216 for identifying the candidate structures, the feature of interest, and the subset of the image frames that include the feature of interest will be described in greater detail with reference to FIGS. 3-5.
[0063] Further, at step 218, a correlation indicator may be computed for each image frame in the subset of image frames using a physics-based forward model. Specifically, the correlation indicator may be representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane as determined using the physics-based forward model. Moreover, the physics-based forward model may define one or more reference characteristics exhibited by the feature of interest being imaged using ultrasound waves when positioned within a desired scan plane, for example, stipulated by clinical guidelines.
[0064] Accordingly, in one embodiment, a comparison between the reference characteristics and characteristics derived from the feature of interest in the image frame may aid in determining whether the feature of interest lies within the clinically acceptable scan plane. The correlation indicator may be representative of a result of such a comparison.
[0065] Further, at step 220, an optimal image frame may be selected from the subset of image frames based on the correlation indicator. In one embodiment, the optimal image frame may correspond to the image frame in which the characteristics of the feature of interest have the highest correlation with the reference characteristics. Moreover, as previously noted, the optimal image frame may be selected automatically, for example, based on the highest value of the correlation indicator. Alternatively, the correlation indicator may be communicated to the user to allow the user to select the optimal frame. In one embodiment, the correlation indicator may be communicated visually, for example, using a color bar, a pie chart, and/or a number. Alternatively, the correlation indicator may be communicated using an audio and/or video feedback, for example, as beeps, speech, and/or printed message.
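When the optimal image frame is selected automatically, the selection described above reduces to an argmax over the per-frame correlation indicators. The following minimal sketch illustrates that step; the function name and list-based interface are assumptions for illustration, not part of the disclosed system:

```python
def select_optimal_frame(correlation_indicators):
    """Return the index of the image frame with the highest correlation
    indicator. `correlation_indicators` holds one hypothetical score per
    frame in the subset (e.g. the values computed at step 218)."""
    if not correlation_indicators:
        raise ValueError("no candidate image frames")
    return max(range(len(correlation_indicators)),
               key=lambda i: correlation_indicators[i])
```

In a semi-automated workflow, the same scores would instead be rendered to the user (e.g. as a color bar or number) for manual selection.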
[0066] In certain embodiments, the processing unit 114 may be configured to 'auto-freeze' the image frame having the highest correlation. The optimal image frame may then be used for determining medically relevant measurements of the feature of interest. For example, in one embodiment, the processing unit 114 may be configured to trigger automated measurements of the feature of interest once the image frame having the feature of interest in the desired scan plane is identified.
[0067] The physics-based forward model, thus, allows for robust and reproducible measurements of the feature of interest in real-time without resorting to use of time-consuming manual selection and/or training algorithms that are used conventionally. An exemplary method for computing the correlation indicator using the physics-based forward model for use in identifying an optimal image frame in real-time will be described in greater detail with reference to FIG. 6. Also, an exemplary method for identifying the subset of the image frames that include the feature of interest and may be evaluated using the physics-based model will be described in greater detail with reference to FIG. 3.
[0068] FIG. 3 illustrates a flow chart 300 depicting an exemplary method for detecting presence of a feature of interest in ultrasound images. At step 302, a plurality of image frames are received. As previously noted, the received image frames may include 2D, 3D, 4D image data, or a cine loop that includes a series of 2D images acquired over a determined period of time. Moreover, the image frames may be received in real-time while imaging a patient or in an offline mode. Generally, these image frames may correspond to greyscale images.
[0069] Further, at step 304, a plurality of binary images corresponding to the plurality of image frames may be generated. In one embodiment, generating the binary images may include smoothing each image frame in the plurality of image frames. The smoothed image frame may then be processed by applying a feature enhancement filter to generate a filtered image frame. For example, in an embodiment where the femur is being imaged, a Frangi vesselness filter may be employed to enhance representation of long, bright, and tubular structures such as the femur in the filtered image frame.
[0070] Subsequently, thresholding may be applied to the filtered image frame to generate the binary image. It may be noted that the binary image may include one or more structures having a desired acoustic impedance in the foreground. Particularly, in one embodiment, k-means clustering may be applied to the filtered image frame to cluster pixels in the filtered image frame, for example, into five clusters. Further, pixels corresponding to, for example, a third cluster may be selected as corresponding to foreground regions, while all other pixels may be discarded. Subsequently, one or more small and unconnected components may be removed from the identified foreground regions using a minimum size and/or geometry-based morphological operation.
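The clustering-based thresholding described above can be sketched as follows. In practice a library routine would likely be used; this self-contained 1-D k-means over pixel intensities (clustering into k groups and retaining one cluster as foreground) is only an illustrative approximation, and all names, defaults, and the requirement of at least k distinct intensities are assumptions:

```python
import random

def kmeans_1d(values, k=5, iters=25, seed=0):
    """Simple 1-D k-means over pixel intensities (illustrative sketch).

    Assumes at least k distinct values. Returns (centers, labels) with
    centers kept sorted ascending so cluster indices order by brightness."""
    rng = random.Random(seed)
    centers = sorted(rng.sample(sorted(set(values)), k))
    for _ in range(iters):
        # Assign each value to its nearest center.
        labels = [min(range(k), key=lambda j: abs(v - centers[j])) for v in values]
        # Recompute each center as the mean of its assigned values.
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
        centers.sort()
    labels = [min(range(k), key=lambda j: abs(v - centers[j])) for v in values]
    return centers, labels

def binarize(values, keep_cluster=2, k=5):
    """Keep pixels of the chosen cluster (e.g. the third of five) as foreground."""
    _, labels = kmeans_1d(values, k)
    return [1 if lab == keep_cluster else 0 for lab in labels]
```

The small, unconnected foreground components mentioned above would then be removed from the resulting binary image by a morphological size filter.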
[0071] Moreover, at step 306, one or more candidate structures corresponding to a feature of interest may be identified in each binary image in the plurality of binary images using connected component analysis. In the connected component analysis, for example, an 8-neighborhood connected component labeling may be applied to the binary image to find one or more connected components representative of the candidate structures, for example, corresponding to the femur. Typically, the femur is assumed to be a bright and sharp-edged structure due to high acoustic impedance of bone relative to surrounding soft tissue. Accordingly, the connected component labeling is used to identify such bright and long candidate structures.
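The 8-neighborhood connected component labeling referenced above can be sketched with a breadth-first flood fill. The function below is an illustrative stand-in, not the disclosed implementation; the interface (binary image as nested lists of 0/1) is an assumption:

```python
from collections import deque

def label_components(binary, connectivity=8):
    """Label connected foreground components in a 2-D binary image.

    Returns (labels, count) where `labels` assigns 1..count to foreground
    pixels and 0 to background. Uses a BFS flood fill per component."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    if connectivity == 8:
        nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                (0, 1), (1, -1), (1, 0), (1, 1)]
    else:  # 4-neighborhood
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and labels[r][c] == 0:
                current += 1
                labels[r][c] = current
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in nbrs:
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```

With 8-connectivity, diagonally touching bright pixels merge into one candidate structure, which suits long inclined structures such as the femur.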
[0072] Additionally, in certain embodiments, one or more characteristics corresponding to the candidate structures may also be determined. The characteristics may be selected, for example, based on anatomy, tissue characterization in ultrasound, and/or scan geometry characteristics. Further, the characteristics, for example, may include aspect ratio, phase symmetry, intensity, size, location, and/or vesselness. In one embodiment, these characteristics may be used to assess correlation of each candidate structure with a known profile of the femur.
[0073] Further, at step 308, a candidate structure may be selected from the one or more candidate structures in the binary image based on one or more characteristics corresponding to the candidate structures. To that end, in one embodiment, skeletonized representations corresponding to the one or more candidate structures may be generated. Moreover, one or more branches of the skeletonized components may be removed. Additionally, one or more characteristics corresponding to each of the skeletonized component may be determined. As previously noted with reference to the candidate structures, the characteristics corresponding to the skeletonized components may include aspect ratio, phase symmetry, intensity, size, location, and/or vesselness of the candidate structures corresponding to the skeleton.
[0074] In certain embodiments, the determined characteristics may be used to compute a metric corresponding to each of the skeletonized components. In one example, computing the metric may entail determining a normalized sum of aspect ratio, phase symmetry, intensity, size, location, and/or vesselness of the candidate structures corresponding to each of the skeletons. Subsequently, a skeletonized component may be selected as a probable representative of the feature of interest based on the computed metric. In one embodiment, the candidate structure corresponding to the skeletonized component having the highest value of the computed metric may be selected.
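The normalized-sum metric for ranking skeletonized components might be sketched as below. Min-max normalization across candidates is assumed here so that no single characteristic dominates the sum; the dictionary interface and equal default weights are likewise assumptions for illustration:

```python
def rank_candidates(candidates, weights=None):
    """Rank candidate structures by a normalized sum of their characteristics.

    `candidates` maps a candidate id to a dict of characteristic scores
    (aspect ratio, phase symmetry, intensity, size, location, vesselness,
    ...). Each characteristic is min-max normalized across the candidates
    before summing. Returns candidate ids sorted best-first."""
    names = sorted({k for feats in candidates.values() for k in feats})
    lo = {n: min(f.get(n, 0.0) for f in candidates.values()) for n in names}
    hi = {n: max(f.get(n, 0.0) for f in candidates.values()) for n in names}

    def score(feats):
        total = 0.0
        for n in names:
            span = hi[n] - lo[n]
            norm = (feats.get(n, 0.0) - lo[n]) / span if span else 0.0
            total += (weights or {}).get(n, 1.0) * norm
        return total

    return sorted(candidates, key=lambda cid: score(candidates[cid]), reverse=True)
```

The top-ranked candidate would then be selected as the probable representative of the feature of interest.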
[0075] Further, at step 310, end points of the selected candidate structure in the binary image may be determined. In one embodiment, the end points of the selected candidate structure may correspond to the end points of the corresponding skeletonized component. A corresponding location of the selected candidate structure in the original greyscale image frame may then be determined based on the determined end points in the binary image frame.
[0076] Accordingly, at step 312, a first mask may be generated using the determined end points in the binary image. In one embodiment, the first mask may correspond to a parallelogram shaped mask. Specifically, the parallelogram shaped mask may be generated such that a first side of the parallelogram shaped mask may include a line connecting the end points of the selected candidate structure. Further, a height of the parallelogram shaped mask may be about one tenth of the first side.
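A parallelogram-shaped mask of the kind described, whose first side joins the two end points and whose height is about one tenth of that side, might be rasterized as follows. For robustness this sketch centers the mask on the end-point line (half the height on each side), a symmetric variant of the described geometry; distinct end points and (row, col) pixel coordinates are assumed:

```python
import math

def parallelogram_mask(shape, p1, p2, height_frac=0.1):
    """Binary mask of a parallelogram whose long side joins end points p1, p2.

    `shape` is (rows, cols); the mask height is `height_frac` of the side
    length, split symmetrically about the end-point line. Assumes p1 != p2."""
    rows, cols = shape
    (r1, c1), (r2, c2) = p1, p2
    dr, dc = r2 - r1, c2 - c1
    length = math.hypot(dr, dc)
    half_h = 0.5 * height_frac * length
    # Unit vectors along and perpendicular to the end-point line.
    ur, uc = dr / length, dc / length
    pr, pc = -uc, ur
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Project the pixel onto the along-side and across-side directions.
            vr, vc = r - r1, c - c1
            along = vr * ur + vc * uc
            across = vr * pr + vc * pc
            if 0 <= along <= length and abs(across) <= half_h:
                mask[r][c] = 1
    return mask
```

Applying such a mask to the binary image retains only pixels in a thin band around the selected candidate structure.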
[0077] The first mask, thus generated, may be applied to the binary image to generate a second mask corresponding to the selected candidate structure, as depicted by step 314. Particularly, in one embodiment, application of the first mask may allow for retention of the selected candidate structure in the binary image. The retained portion may then be used to generate the second mask.
[0078] At step 316, the second mask may be applied to an image frame that corresponds to the binary image for identifying the feature of interest in the image frame. In certain embodiments, the second mask may be multiplied with the originally acquired greyscale image frame corresponding to the binary image frame to identify the feature of interest in the greyscale image frame.
[0079] In one example, the embodiment of the method illustrated in FIG. 3 may allow for identification of the selected candidate structure as the femur in a subset of image frames in a cine loop. However, presence of skin and muscle tissues in a target ROI may confound accurate detection of the femur. Accordingly, in certain embodiments, the selected candidate structure may undergo further processing for preventing false-positive identification of a candidate structure as the femur.
[0080] As previously noted, the femur is assumed to be a bright and sharp-edged structure due to high acoustic impedance of bone relative to surrounding soft tissue. Accordingly, in certain embodiments, uniformity of the selected candidate structure may be determined to ascertain if the selected candidate structure is indeed the femur. Alternatively, it may be determined if the selected candidate structure includes sharp ends.
[0081] In one embodiment, an assessment of uniformity of the selected candidate structure may be performed by applying, for example, a two-level Otsu thresholding on the resulting greyscale image frame generated at step 316. In one example of Otsu thresholding, intensity of all pixels belonging to the brightest class may be set to unity, whereas all other pixels may be discarded. The thresholding may allow identification of structures having high acoustic impedance. Further, a connected component analysis may be performed in the foreground regions including the candidate structures to identify the largest connected component. A uniformity indicator (UI) may be computed for each candidate structure based on a size of the largest connected component in the binary image and a reference size of the feature of interest, for example, the femur. The uniformity indicator, for example, may be defined using equation (1):

UI = [size(largest connected component) − size(other connected components)] / size(feature of interest)     (1)
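Equation (1) can be computed directly from the component sizes produced by the connected component analysis. The sketch below is illustrative; the function signature and zero-valued fallback for empty input are assumptions:

```python
def uniformity_indicator(component_sizes, reference_size):
    """Uniformity indicator per equation (1): the largest connected
    component's size minus the combined size of all other components,
    normalized by a reference size for the feature of interest."""
    if not component_sizes or reference_size <= 0:
        return 0.0
    largest = max(component_sizes)
    others = sum(component_sizes) - largest
    return (largest - others) / reference_size
```

A candidate dominated by one large component thus scores near 1, while a fragmented candidate scores low or negative.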
[0082] In one example, a higher value of the UI may be representative of a greater probability of the candidate structure being the femur. In certain embodiments, a suitable threshold may be used to deem the candidate structures having a UI above the threshold as being the femur. Further, one or more characteristics corresponding to the identified femur may be determined. These characteristics include, for example, end points, length, depth, and/or intensity profile. By way of example, the intensity profile of the identified femur may be determined by calculating a mean of pixel intensities along each ultrasound beam assuming rectangular coordinates.
[0083] In certain embodiments, the computed UI may be communicated to the user in real-time to indicate suitability of the image frame. Furthermore, the UI computed for a plurality of image frames, such as image frames in a cine loop, may be used to determine a threshold for detecting the femur in subsequently acquired image frames. To that end, in one embodiment, receiver operating characteristics (ROC) curves corresponding to a cine loop may be generated based on the corresponding UI scores.
[0084] FIG. 4 depicts a graphical representation 400 of exemplary ROC curves for use in determining the uniformity indicator metric, where each of the curves represents a different cine loop. In FIG. 4, the ROC curve 402 is representative of a merged cine loop consisting of a plurality of cine loops. Also, the ROC curve 402 is representative of a change in the sensitivity relative to the specificity of the method of detection of the feature of interest, such as the method described with reference to FIG. 3, at different thresholds. As depicted in FIG. 4, use of the method of FIG. 3 results in a desirable separation between the image frames that include the femur component and those that do not include the femur component. The graphical representation 400, thus, allows for determining a suitable threshold for detecting presence of the femur in subsequently received and previously unseen image frames for a selected sensitivity and specificity.
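Selecting an operating threshold from ROC data of the kind shown in FIG. 4 might proceed by sweeping candidate thresholds over the UI scores and maximizing, for example, Youden's J statistic (sensitivity + specificity − 1). The criterion and interface below are assumptions for illustration, not the disclosed procedure:

```python
def roc_points(scores, labels):
    """(threshold, sensitivity, specificity) at each candidate threshold.

    `labels` are 1 for frames that truly contain the femur, 0 otherwise;
    `scores` are the corresponding UI (or overlap) values."""
    points = []
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        points.append((t, sens, spec))
    return points

def best_threshold(scores, labels):
    """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1."""
    return max(roc_points(scores, labels), key=lambda p: p[1] + p[2])[0]
```

The chosen threshold would then be applied to UI scores of subsequently received, previously unseen image frames.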
[0085] With returning reference to FIG. 3, in certain embodiments, instead of the uniformity assessment, sharpness of the ends of the selected candidate structure may be determined to ascertain if the selected candidate structure is indeed the femur. To that end, in one example, a 30×30 window around the end points determined at step 310 may be selected. Further, a two-level Otsu thresholding may be applied within the window to identify foreground regions. Particularly, foreground regions that overlap with the first mask generated at step 312 may be retained. Additionally, overlap scores may be computed, for example, using Dice's coefficient at both end points. The overlap scores from the two end points may then be summed to obtain an overall overlap score. In one example, a higher overlap score may be representative of a greater probability of the candidate structure being the femur. Particularly, in one embodiment, a suitable threshold may be used to deem the candidate structures having an overlap score above the threshold as being the femur.
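The Dice-based overlap score at the two end-point windows can be sketched as follows. Binary masks are represented as nested lists of 0/1, and the function names are assumptions for illustration:

```python
def dice(a, b):
    """Dice's coefficient between two binary masks of equal shape:
    2*|A ∩ B| / (|A| + |B|)."""
    inter = sum(x * y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    total = sum(x for ra in a for x in ra) + sum(y for rb in b for y in rb)
    return 2.0 * inter / total if total else 0.0

def sharpness_score(fg_win1, mask_win1, fg_win2, mask_win2):
    """Sum the Dice overlaps computed in the windows around the two end
    points (thresholded foreground vs. the retained first-mask region)."""
    return dice(fg_win1, mask_win1) + dice(fg_win2, mask_win2)
```

A candidate with sharply terminating bright ends yields high overlap in both windows and hence a high overall score.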
[0086] Furthermore, the overall overlap scores computed for a plurality of image frames, such as image frames in a cine loop, may be used to determine a threshold for detecting the femur in subsequently acquired image frames. To that end, ROC curves corresponding to each image frame may be generated based on the corresponding overall overlap scores.
[0087] FIG. 5 depicts a graphical representation 500 of exemplary ROC curves for use in determining the sharpness metric corresponding to end points of a candidate structure corresponding to a feature of interest. In FIG. 5, the ROC curve 502 is representative of a merged cine loop consisting of a plurality of cine loops. Also, the ROC curve 502 is representative of a change in the sensitivity relative to the specificity of the method of detection of the feature of interest, such as the method described with reference to FIG. 3, at different thresholds. As depicted in FIG. 5, use of the method of FIG. 3 results in a desirable separation between the image frames that include the femur component and those that do not include the femur component. The graphical representation 500, thus, allows for determining a suitable threshold for detecting presence of the femur in subsequently received and previously unseen image frames for a selected sensitivity and specificity.
[0088] FIGS. 3-5, thus, describe methods for detecting presence of a feature of interest such as the femur in a plurality of image frames, and in turn, identifying a subset of image frames including the femur from the plurality of image frames. One or more characteristics of the detected femur may then be measured, for example, to estimate the GA, assess growth patterns, and/or identify anomalies in the fetus. However, measuring the characteristics of the femur from any image frame may not be optimal. Generally, an accurate measurement of the femur length entails measurement of a proximal femur positioned at about 90 degrees to the ultrasound beam in an acquired image frame. Further, clinical guidelines stipulate the inclination of the femur in the image frame to be within 30 degrees of the face of an ultrasound transducer.
[0089] Although the methods of FIGS. 3-5 identify the subset of image frames that include the femur, identifying from this subset an optimal image frame that includes the femur positioned in a desired scan plane as stipulated by clinical guidelines may entail further processing.
[0090] FIG. 6 illustrates a flowchart 600 depicting an exemplary method for identifying an optimal image frame that includes the feature of interest in a desired scan plane for ultrasound imaging from a plurality of input image frames that includes the feature of interest. As previously noted, the optimal image frame may be an image frame that includes the feature of interest in the desired scan plane that is most suitable for determining measurements in accordance with one or more determined guidelines. For discussion purposes, the present embodiment is discussed with reference to an exemplary method for identifying an optimal image frame that includes the femur in a desired scan plane. However, it may be noted that optimal image frames for other features of interest may similarly be identified using the aspects of the present disclosure described herein.
[0091] In certain image frames, the femur may extend outside the desired scan plane and/or may not be parallel to the transducer face. Moreover, diffraction of the ultrasound waves may cause tapering of the end points of the femur. Such tapering off of the end points is observed to be different based on whether the femur is in the desired scan plane or outside the desired scan plane. Particularly, the femur appears smaller when disposed outside the desired scan plane, thus resulting in measurement errors. Selection of any image frame for femur measurements, thus, may not be optimal.
[0092] Embodiments of the present method, therefore, allow for selection of the optimal image frame by modeling the behavior of the ultrasound waves when incident on the femur using a physics-based forward model. The physics-based forward model, for example, based on edge diffraction formalism may allow for determining a correlation between an "ideal" femur and the femur detected in each image frame, which in turn, may aid in selecting the optimal image frame that allows for accurate ultrasound-based measurements.
[0093] Accordingly, at step 602, a plurality of image frames that include a feature of interest may be received. The image frames, for example, may include 2D, 3D, 4D image data, or a cine loop that includes a series of 2D images acquired over a determined period of time. The image frames are assumed to include the feature of interest. Accordingly, in one embodiment, the subset of image frames identified at step 216 of FIG. 2 may be received. Moreover, the image frames may be received in real-time while imaging a patient or in an offline mode.
[0094] Additionally, at step 604, one or more characteristics corresponding to the feature of interest in each of the plurality of image frames may be received. The received characteristics may include, for example, end points, length, depth, and/or intensity profile. In one embodiment, the received characteristics may correspond to the region of the image frame that includes the feature of interest, such as the feature of interest identified at step 316 of FIG. 3.
[0095] Further, at step 606, a template corresponding to the feature of interest may be determined using the one or more characteristics and the physics-based forward model. Particularly, the physics-based forward model may provide a representation of the behavior of ultrasound signals when imaging the feature of interest. For example, when an acoustic wave is incident on long and sharp edge-like objects with high acoustic impedance, such as the femur, there is diffraction. Accordingly, the physics-based forward model may characterize each point illuminated by the acoustic wave as a source in a plurality of directions.
[0096] Specifically, the physics-based model may employ high frequency edge diffraction formalism that assumes the feature of interest, for example the femur, to be representative of a finite low absorption edge with a length much larger than the wavelength of ultrasound. Particularly, in accordance with aspects of the present disclosure, the femur may be represented as an edge of a wedge having a wedge angle of about 2π and corresponding characteristics may be determined.
[0097] FIG. 7 illustrates a schematic representation 700 of geometry of the femur modeled as a finite edge. In the schematic representation 700, positions of a source S and a receiver R are indicated in cylindrical coordinates. Further, FIG. 8 illustrates a top view 800 of the edge depicted in FIG. 7 for allowing better characterization of the geometry of the edge. In FIGS. 7-8, sound paths via the wedge end points are indicated by solid lines. Also, the least-time sound path via the apex point z_apex (see FIG. 8) is depicted with a dashed line, whereas other sound paths are illustrated with dotted lines.
[0098] It may be noted that derivation of an impulse response for a finite edge is based on the argument that a local reaction at the edge to an impulsive incident wave is instantaneous. The impulse response for a finite edge, for example, may be defined using equations (2) and (3).
Where c is the speed of sound, v = /ΘW is the wedge index, m is the "source-to-edge point" distance, and / is the "edge point-to-receiver" distance. Further, (m + I) correspond to the sound path length or any point z along the edge, whereas P corresponds to the local geometry at the point of incidence on the edge.
[0099] In one embodiment, the integration range is between the two end points of the finite edge. Particularly, the line integral represents an integration of impulse contributions, which are delayed by the sound path length (m + l) for any point z along the edge. Spherical spreading from the source to the edge point, and from the edge point to the receiver, is indicated by a factor 1/(ml). The distances m and l are not involved in the β-terms and, accordingly, the sum of the β-terms may be indicative of a directivity function.
[0100] In an exemplary implementation, equations (2) and (3) may be representative of the physics-based forward model, which may be used to model a transducer face as a set of linearly placed discrete source and receiver points. Further, the femur may be assumed to be a set of diffraction or secondary point sources positioned along a linear edge parallel to or at an angle to the source and receiver line. In such a configuration, time of arrival of diffracted signals at the receiver from the source may be based on a travel path from the source to the receiver via the edge and the corresponding geometry.
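The time-of-arrival bookkeeping described above can be sketched as follows. This is a minimal illustration of the geometry only; the array sizes, positions, and function names are assumptions for the sketch, not values taken from the specification.

```python
import numpy as np

c = 1540.0  # assumed speed of sound in soft tissue, m/s

# Transducer face: linearly placed discrete source/receiver points at depth 0.
x = np.linspace(-0.02, 0.02, 64)                 # lateral positions, m
sources = np.stack([x, np.zeros_like(x)], axis=1)
receivers = sources.copy()                       # co-located source/receiver points

# 'Ideal' femur: diffraction (secondary) point sources along a linear edge
# parallel to the transducer face, here at an assumed depth of 40 mm.
xe = np.linspace(-0.015, 0.015, 128)
edge = np.stack([xe, np.full_like(xe, 0.04)], axis=1)

def arrival_times(src, rcv, edge_pts, c=c):
    """Time of arrival of the diffracted signal for each edge point:
    the travel path is source -> edge point (m) plus edge point -> receiver (l)."""
    m = np.linalg.norm(edge_pts - src, axis=1)   # source-to-edge distances
    l = np.linalg.norm(edge_pts - rcv, axis=1)   # edge-to-receiver distances
    return (m + l) / c

t = arrival_times(sources[0], receivers[0], edge)
t_apex = t.min()  # the earliest arrival corresponds to the least-time path
```

With the source and receiver co-located, the least-time path simply runs through the edge point nearest the transducer element, mirroring the apex-path construction of FIG. 8.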
[0101] With returning reference to step 606 of FIG. 6, the physics-based forward model may represent an 'ideal' femur as a set of diffraction point sources along a linear edge parallel to the transducer face. The physics-based forward model may use the edge diffraction formalism to determine the template corresponding to the ideal femur, for example, using equations (2) and (3) and the one or more received characteristics such as the length and the depth corresponding to the selected candidate structure.
[0102] Further, at step 608, a reference intensity profile of the feature of interest may be computed using the determined template. Specifically, the physics-based forward model may be used to compute the reference intensity profile based on relative locations of source, receiver, edge, and beam when imaging an ideal femur having a length that is substantially similar to the length of the candidate structure.
[0103] Corresponding correlations between the reference intensity profile and an intensity profile of the feature of interest in each of the plurality of image frames may be determined, as depicted by step 610. In one embodiment, the correlations may be determined, for example, using a linear correlation function between x and y, where x represents the reference intensity profile at each of the n points and y represents the intensity profile of the feature of interest at the same n points. According to certain aspects of the present disclosure, the higher the correlation, the greater the suitability of the image frame for use in determining measurements of the feature of interest that satisfy the clinical, user-defined, and/or application-specific guidelines.
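The linear correlation step can be sketched as a Pearson correlation over the n sampled points; the helper name below is an illustrative assumption, as the specification does not prescribe a particular implementation.

```python
import numpy as np

def correlation_indicator(reference_profile, frame_profile):
    """Pearson (linear) correlation between the model-derived reference
    intensity profile x and a frame's measured intensity profile y,
    both sampled at the same n points."""
    x = np.asarray(reference_profile, dtype=float)
    y = np.asarray(frame_profile, dtype=float)
    x = x - x.mean()
    y = y - y.mean()
    denom = np.sqrt((x ** 2).sum() * (y ** 2).sum())
    # Degenerate (constant) profiles carry no shape information.
    return float((x * y).sum() / denom) if denom else 0.0
```

A value near +1 indicates that the frame's intensity profile closely matches the ideal-femur template, i.e. the scan plane is close to the desired plane.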
[0104] At step 612, an optimal image frame may be selected based on a correlation indicator indicative of the determined correlation corresponding to each of the plurality of image frames. In one embodiment, the image frame having the highest correlation indicator may be identified as the optimal image frame. As previously noted, the optimal image frame may be selected automatically or based on user input. Accordingly, in certain embodiments, the correlation indicator may be communicated to the user. Specifically, the correlation indicator may be communicated to the user visually, for example, using a color bar, a pie chart, and/or a number. Alternatively, the correlation indicator may be communicated using audio and/or video feedback, for example, beeps, speech, and/or a printed message to allow for selection of the optimal image frame.
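For the automatic case, the selection described above reduces to taking the frame with the highest indicator; a minimal sketch, with an assumed helper name:

```python
def select_optimal_frame(indicators):
    """Return (index, value) of the frame with the highest correlation
    indicator, given one indicator per image frame."""
    best = max(range(len(indicators)), key=lambda i: indicators[i])
    return best, indicators[best]
```

In the user-driven alternative, the same per-frame indicators would instead be rendered (color bar, number, audio cue) so the user can pick the frame.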
[0105] Alternatively, the system may automatically identify and display the optimal image frame to the user. Additionally, automated measurements of the feature of interest may be triggered once the image frame having the feature of interest in the desired scan plane is identified. These measurements, in turn, may be used, for example, in the determination of the GA, assessment of growth patterns, identification of anomalies in a fetus, or for tracking of interventional devices.
[0106] Embodiments of the present disclosure, thus, provide systems and methods that allow accurate detection, segmentation, scoring of the scan plane and measurement of the feature of interest automatically and in real-time. Such automated and real-time detection of the desired scan plane and/or communication of the correlation indicator allow the user to avoid switching back and forth between different image frames in a cine loop for identifying the optimal image frame. Further, use of the physics-based forward model for determining the correlation indicator for each image frame allows for real-time identification of the desired scan plane for measuring the feature of interest.
[0107] Moreover, embodiments of the present disclosure allow for a reduction in imaging time, while providing enhanced performance as compared to conventional training and segmentation based methods. Particularly, the automated detection and scan plane identification allows for robust and reproducible measurements of the feature of interest irrespective of the skill and/or experience level of the user. The embodiments of the present methods and systems, thus, may also aid in making quality ultrasound imaging services available over large geographical areas including rural regions that are traditionally under-served owing to lack of trained sonographers.
[0108] It may be noted that the foregoing examples, demonstrations, and process steps that may be performed by certain components of the present systems, for example by the processing unit 114 and/or the video processor 124 of FIG. 1, may be implemented by suitable code on a processor-based system. To that end, the processor-based system, for example, may include a general-purpose or a special-purpose computer. It may also be noted that different implementations of the present disclosure may perform some or all of the steps described herein in different orders or substantially concurrently.
[0109] Additionally, the functions may be implemented in a variety of programming languages, including but not limited to Ruby, Hypertext Preprocessor (PHP), Perl, Delphi, Python, C, C++, or Java. Such code may be stored or adapted for storage on one or more tangible, machine-readable media, such as on data repository chips, local or remote hard disks, optical disks (that is, CDs or DVDs), solid-state drives, or other media, which may be accessed by the processor-based system to execute the stored code.
[0110] Although specific features of embodiments of the present disclosure may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments, for example, to construct additional assemblies and methods for use in diagnostic imaging.
[0111] While only certain features of the present disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
We Claim:
1. A method for ultrasound imaging, comprising:
identifying one or more candidate structures corresponding to a feature of interest in each image frame in a plurality of image frames corresponding to a subject;
evaluating each candidate structure in the one or more candidate structures in each image frame;
retaining at least one candidate structure from the one or more candidate structures based on the evaluation;
determining a correspondence between the at least one retained candidate structure and the feature of interest;
identifying the at least one retained candidate structure as the feature of interest based on the determined correspondence;
identifying a subset of image frames from the plurality of image frames, wherein the subset of image frames comprises the identified feature of interest;
computing a correlation indicator for each image frame in the subset of image frames using a physics-based forward model, wherein the correlation indicator is representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane; and
selecting an optimal image frame from the subset of image frames based on the correlation indicator.
2. A method for ultrasound imaging, comprising:
generating a plurality of binary images corresponding to a plurality of image frames;
identifying one or more candidate structures corresponding to a feature of interest in each binary image in the plurality of binary images using connected component analysis;
selecting a candidate structure from the one or more candidate structures in the binary image based on one or more characteristics corresponding to the one or more candidate structures in the binary image;
determining end points of the selected candidate structure;
generating a first mask using the determined end points;
applying the first mask to the binary image to generate a second mask corresponding to the selected candidate structure; and
applying the second mask to an image frame that corresponds to the binary image for identifying the feature of interest in the image frame.
3. The method of claim 2, wherein generating the plurality of binary images comprises: smoothing each image frame in the plurality of image frames; applying a feature enhancement filter to the smoothened image frame to generate a filtered image frame; identifying the one or more candidate structures in the filtered image frame using thresholding; and removing one or more unconnected components from the one or more candidate structures in the filtered image frame to generate a corresponding binary image.
4. The method of claim 2, wherein selecting the candidate structure comprises: generating skeletons corresponding to the one or more candidate structures; removing one or more branches of the skeletons; determining one or more characteristics corresponding to each of the skeletons; computing a metric corresponding to each of the skeletons based on the one or more characteristics; and selecting a skeleton as the feature of interest from the skeletons based on the computed metric.
5. The method of claim 4, wherein determining the one or more characteristics comprises determining an aspect ratio, phase symmetry, intensity, size, location, vesselness, or combinations thereof, corresponding to each of the skeletons.
6. The method of claim 5, wherein computing the metric corresponding to each of the skeletons comprises computing a normalized sum of the aspect ratio, the phase symmetry, the intensity, the size, the location, the vesselness, or combinations thereof, corresponding to each of the skeletons.
7. The method of claim 2, wherein identifying the one or more candidate structures comprises applying multilevel thresholding on the binary image.
8. The method of claim 2, wherein selecting the candidate structure from the one or more candidate structures comprises: computing a value of a uniformity indicator based on a size of a largest connected component in a foreground of a greyscale image corresponding to the binary image and a reference size of the feature of interest; and identifying the largest connected component as the feature of interest based on the computed value of the uniformity indicator.
9. The method of claim 8, further comprising identifying a connected component in the one or more candidate structures that has the highest value of the uniformity indicator as the feature of interest.
10. The method of claim 2, wherein selecting the candidate structure from the one or more candidate structures comprises: defining a neighborhood region around the end points of the selected candidate structure using the first mask; computing a sharpness metric corresponding to end points of the selected candidate structure based on a percentage overlap between an area of the selected candidate structure within the neighborhood region and a foreground of a greyscale image corresponding to the binary image within the neighborhood region; and identifying the selected candidate structure as the feature of interest based on the computed sharpness metric.
11. The method of claim 2, wherein generating the first mask comprises generating a parallelogram shaped mask, wherein a first side of the parallelogram shaped mask comprises a line connecting the end points of the selected candidate structure, and wherein the height of the parallelogram shaped mask is one tenth of the first side.
12. The method of claim 2, wherein applying the first mask to the binary image comprises: applying the parallelogram shaped mask to the binary image; and retaining a largest connected component in the binary image to generate the second mask.
13. A method for ultrasound imaging, comprising:
receiving a plurality of image frames comprising a feature of interest;
receiving one or more characteristics corresponding to the feature of interest in each of the plurality of image frames, wherein the one or more characteristics comprise end points, length, depth, intensity profile, or combinations thereof;
determining a reference template corresponding to the feature of interest using the one or more characteristics and a physics-based forward model;
computing a reference intensity profile of the feature of interest using the reference template;
determining corresponding correlations between the reference intensity profile and the intensity profile of the feature of interest in each of the plurality of image frames; and
identifying an optimal image frame based on a correlation indicator indicative of the determined correlation corresponding to each of the plurality of image frames.
14. The method of claim 13, wherein the feature of interest comprises a linear structure having high acoustic impedance.
15. The method of claim 13, wherein the feature of interest comprises a femur, a catheter, or a needle.
16. The method of claim 13, wherein determining the reference template corresponding to the feature of interest comprises modeling the feature of interest as a low absorption edge in a plane using edge diffraction formalism.
17. The method of claim 16, wherein using the edge diffraction formalism comprises modeling the low absorption edge as a linear structure having a length that is at least ten times the length of an ultrasound wave incident on the low absorption edge.
18. The method of claim 13, further comprising communicating the correlation indicator to a user, wherein communicating the correlation indicator comprises displaying the correlation indicator on a display, providing an audio output indicative of the correlation indicator, or a combination thereof.
19. The method of claim 13, wherein identifying the optimal image frame comprises selecting an image frame from the plurality of image frames that has the highest correlation between the reference intensity profile and an intensity profile of the feature of interest in the selected image frame.
20. The method of claim 19, further comprising communicating the optimal image frame to a user by providing a visual output, an audio output, or a combination thereof.
21. An imaging system, comprising:
an acquisition subsystem configured to obtain a plurality of image frames corresponding to a target region of a subject;
a processing unit in operative association with the acquisition subsystem and configured to:
identify one or more candidate structures corresponding to a feature of interest in each image frame in a plurality of image frames corresponding to a subject;
evaluate each candidate structure in the one or more candidate structures in each image frame;
retain at least one candidate structure from the candidate structures based on the evaluation;
determine a correspondence between the at least one retained candidate structure and the feature of interest;
identify the at least one retained candidate structure as the feature of interest if the determined correspondence is greater than a determined threshold;
identify a subset of image frames from the plurality of image frames, wherein the subset of image frames comprises the identified feature of interest;
compute a correlation indicator for each image frame in the subset of image frames using a physics-based forward model, wherein the correlation indicator is representative of a correlation between a scan plane corresponding to the image frame and a desired scan plane; and
select an optimal image frame from the subset of image frames based on the correlation indicator.
22. The imaging system of claim 21, wherein the imaging system comprises an ultrasound imaging system, a contrast enhanced ultrasound imaging system, an optical imaging system, or combinations thereof.