Abstract: A medical imaging apparatus for capturing images of a subject is disclosed. The medical imaging apparatus includes an image capturing unit configured to capture a live image stream of an object associated with the subject. An image selection processor is communicably coupled to the image capturing unit. The image selection processor is configured to receive a plurality of image frames associated with the object. The plurality of image frames is of the live image stream. A stable image frame is selected from the plurality of image frames based on one or more image selection parameters. A memory is communicably coupled to the image capturing unit and the image selection processor and is configured to store the plurality of image frames. FIG. 2
MEDICAL IMAGING APPARATUS AND METHOD FOR IDENTIFYING STABLE IMAGES OF OBJECT
TECHNICAL FIELD
[0001] The subject matter disclosed herein relates to a medical imaging apparatus for capturing images of a subject. More specifically, the invention relates to identifying stable images from a plurality of images captured by the medical imaging apparatus.
BACKGROUND OF THE INVENTION
[0002] Medical imaging systems are used in different applications to image different regions or areas (e.g. different organs) of patients or other objects. For example, an ultrasound imaging system may be utilized to generate an image of organs, vasculature, heart, or other portions of the body. Ultrasound imaging systems are generally located at a medical facility, for example, a hospital or imaging center. The ultrasound imaging system includes an ultrasound probe placed on a portion of the subject's body to capture images of objects (e.g. organs) in the subject. The images may be presented as a live streaming video of an organ to a user. The live streaming video is buffered in the ultrasound imaging system in what are referred to as cine loops. A cine loop represents a first in, first out circular image buffer that captures image data displayed in real-time to the user. The user may freeze the live stream by entering a freeze command at a user interface of the medical imaging system, using a foot switch, a hand switch, a touch gesture or a voice command. The cine loop includes multiple images of the organ in different orientations. The orientations vary based on the position of the ultrasound probe with respect to the object that is being scanned. The cine loops are held in a main memory of the ultrasound imaging system and can then be replayed by the user to review the images frame by frame. While reviewing the cine loop, the user may select a stable image by manually going through the cine buffer and selecting the most stable and visually clear image. Usually the ultrasound imaging system presents the last image that was captured at the time the freeze command is received. However, the act of activating the freeze often disturbs the user's concentration on the capture, so the user ends up seeing a blurred image or an image from a slightly moved probe position. Hence the user is required to manually go through a few frames in the buffered cine loop to bring the right frame up on the screen.
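The cine loop described above can be sketched as a fixed-capacity, first-in first-out buffer. The following Python sketch is purely illustrative; the class, its methods, and the freeze handling are hypothetical and not part of the disclosed apparatus:

```python
from collections import deque

class CineLoop:
    """First-in, first-out circular buffer holding the most recent frames."""

    def __init__(self, capacity):
        # A deque with maxlen silently discards the oldest frame when full,
        # mimicking the circular image buffer of a cine loop.
        self._frames = deque(maxlen=capacity)
        self.frozen = False

    def push(self, frame):
        if not self.frozen:          # new frames are ignored once frozen
            self._frames.append(frame)

    def freeze(self):
        """Triggered by e.g. a foot switch, hand switch, touch gesture or voice command."""
        self.frozen = True

    def frame_at(self, index):
        """Replay support: review buffered frames one by one."""
        return self._frames[index]

    def __len__(self):
        return len(self._frames)

# A 6-frame buffer fed 10 frames retains only the 6 most recent ones.
loop = CineLoop(capacity=6)
for i in range(10):
    loop.push(i)
loop.freeze()
loop.push(99)                        # ignored: the stream is frozen
```

After the freeze, the user would step through `frame_at(0)` to `frame_at(5)` to locate the most stable image, exactly the manual review the background describes.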
[0003] Hence, there is a need for an improved medical imaging apparatus for identifying stable images accurately.
BRIEF DESCRIPTION OF THE INVENTION
[0004] The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.
[0005] In an embodiment, a medical imaging apparatus for capturing images of a subject is disclosed. The medical imaging apparatus includes an image capturing unit configured to capture a live image stream of an object associated with the subject. An image selection processor is communicably coupled to the image capturing unit. The image selection processor is configured to receive a plurality of image frames associated with the object. The plurality of image frames is of the live image stream. A stable image frame is selected from the plurality of image frames based on one or more image selection parameters. A memory is communicably coupled to the image capturing unit and the image selection processor and is configured to store the plurality of image frames.
[0006] In another embodiment, a system for managing a plurality of image frames of an object captured by a medical imaging apparatus is disclosed. The system includes an image selection module configured to receive a plurality of image frames associated with the object. The plurality of image frames is of a live image stream captured using the medical imaging apparatus. A stable image frame is selected from the plurality of image frames based on one or more image selection parameters. The stable image frame is presented to a user by a presentation module.
[0007] In yet another embodiment, a method for managing a plurality of image frames of an object captured by a medical imaging apparatus is disclosed. The method includes receiving a plurality of image frames associated with the object of a subject. The plurality of image frames is of a live image stream captured using the medical imaging apparatus. The method further includes selecting a stable image frame from the plurality of image frames based on one or more image selection parameters, and presenting the stable image frame to a user.
[0008] Various other features, objects, and advantages of the invention will be made apparent to those skilled in the art from the accompanying drawings and detailed description thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIGURE 1 illustrates an ultrasound imaging system that directs ultrasound energy pulses into an object, typically a human body, and creates an image of the body based upon the ultrasound energy reflected from the tissue and structures of the body in accordance with an embodiment;
[0010] FIGURE 2 is a schematic illustration of a medical imaging apparatus used on a patient to capture a plurality of image frames of objects in a patient in accordance with an embodiment;
[0011] FIGURE 3 is a schematic illustration of identifying a stable image frame from the plurality of image frames in accordance with an embodiment;
[0012] FIGURE 4 is a schematic illustration of a medical imaging system for identifying a stable image frame from a plurality of image frames associated with an object in accordance with an embodiment;
[0013] FIGURE 5 is a flow diagram of a method for managing a plurality of image frames of an object captured by a medical imaging apparatus in accordance with an embodiment; and
[0014] FIGURE 6 is a flow diagram of a method for managing a plurality of image frames of an object captured by a medical imaging apparatus in accordance with another embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0015] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
[0016] As discussed in detail below, embodiments of the invention include a medical imaging apparatus for identifying stable images from a plurality of images captured of an object associated with a subject.
[0017] Although the various embodiments are described with respect to an ultrasound imaging system, the various embodiments may be utilized with any suitable medical imaging system, for example, X-ray, computed tomography, single photon emission computed tomography, magnetic resonance imaging, or the like.
[0018] FIG. 1 shows an ultrasound imaging system 100 that directs ultrasound energy pulses into an object, typically a human body, and creates an image of the body based upon the ultrasound energy reflected from the tissue and structures of the body.
[0019] The ultrasound imaging system 100 comprises a probe 102 (i.e. an image acquisition unit) that includes a transducer array having a plurality of transducer elements. The probe 102 and the ultrasound imaging system 100 may be physically connected, such as through a cable, or they may be in communication through a wireless technique. The transducer array can be one-dimensional (1-D) or two-dimensional (2-D).
A 1-D transducer array comprises a plurality of transducer elements arranged in a single dimension and a 2-D transducer array comprises a plurality of transducer elements arranged across two dimensions namely azimuthal and elevation. The number of transducer elements and the dimensions of transducer elements may be the same in the azimuthal and elevation directions or different. Further, each transducer element can be configured to function as a transmitter 108 or a receiver 110. Alternatively, each transducer element can be configured to act both as a transmitter 108 and a receiver 110.
[0020] The ultrasound imaging system 100 further comprises a pulse generator 104 and a transmit/receive switch 106. The pulse generator 104 is configured for generating and supplying excitation signals to the transmitter 108 and the receiver 110. The transmitter 108 is configured for transmitting ultrasound beams, along a plurality of transmit scan lines, in response to the excitation signals. The term "transmit scan lines" refers to spatial directions on which transmit beams are positioned at some time during an imaging operation. The receiver 110 is configured for receiving echoes of the transmitted ultrasound beams. The transmit/receive switch 106 is configured for switching transmitting and receiving operations of the probe 102.
[0021] The ultrasound imaging system 100 further comprises a transmit beamformer 112 and a receive beamformer 114. The transmit beamformer 112 is coupled through the transmit/receive (T/R) switch 106 to the probe 102. The transmit beamformer 112 receives pulse sequences from the pulse generator 104. The probe 102, energized by the transmit beamformer 112, transmits ultrasound energy into a region of interest (ROI) in a patient's body. As is known in the art, by appropriately delaying the waveforms applied to the transmitter 108 by the transmit beamformer 112, a focused ultrasound beam may be transmitted.
[0022] The probe 102 is also coupled, through the T/R switch 106, to the receive beamformer 114. The receiver 110 receives ultrasound energy from a given point within the patient's body at different times. The receiver 110 converts the received ultrasound energy to transducer signals which may be amplified, individually delayed and then accumulated by the receive beamformer 114 to provide a receive signal that represents
the received ultrasound levels along a desired receive line ("transmit scan line" or "beam"). The receive signals are image data that can be processed to obtain images i.e. ultrasound images of the region of interest in the patient's body. The receive beamformer 114 may be a digital beamformer including an analog-to-digital converter for converting the transducer signals to digital values. As known in the art, the delays applied to the transducer signals may be varied during reception of ultrasound energy to effect dynamic focusing. The process of transmission and reception is repeated for multiple transmit scan lines to create an image frame for generating an image of the region of interest in the patient's body.
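The delay-and-sum accumulation performed by the receive beamformer 114 can be illustrated with a simplified sketch. The function below operates on toy per-element sample lists; the delay values and signals are illustrative assumptions, not the apparatus's actual beamforming implementation:

```python
def delay_and_sum(element_signals, delays_in_samples, weights=None):
    """Align each transducer element's signal by its focusing delay,
    then accumulate into a single receive signal for one scan line."""
    n = len(element_signals[0])
    if weights is None:
        weights = [1.0] * len(element_signals)
    out = [0.0] * n
    for sig, d, w in zip(element_signals, delays_in_samples, weights):
        for t in range(n):
            src = t - d              # delayed sample index for this element
            if 0 <= src < n:
                out[t] += w * sig[src]
    return out

# Two elements whose echoes arrive one sample apart: after advancing the
# second channel by one sample, the pulses add coherently at index 1.
ch0 = [0.0, 1.0, 0.0, 0.0]
ch1 = [0.0, 0.0, 1.0, 0.0]
beam = delay_and_sum([ch0, ch1], delays_in_samples=[0, -1])
```

In a real system the delays vary per element and per receive depth (dynamic focusing), and the summation would run over many more channels and samples.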
[0023] In an alternative system configuration, different transducer elements are employed for transmitting and receiving. In that configuration, the T/R switch 106 is not included, and the transmit beamformer 112 and the receive beamformer 114 are connected directly to the respective transmit or receive transducer elements.
[0024] The receive signals from the receive beamformer 114 are applied to a signal processing unit 116, which processes the receive signals for enhancing the image quality and may include routines such as detection, filtering, persistence and harmonic processing. The output of the signal processing unit 116 is supplied to a scan converter 118. The scan converter 118 creates a data slice from a single scan plane. The data slice is stored in a slice memory and then is passed to a display unit 120, which processes the scan converted image data so as to display an image of the region of interest in the patient's body.
[0025] In one embodiment, high resolution is obtained at each image point by coherently combining the receive signals thereby synthesizing a large aperture focused at the point. Accordingly, the ultrasound imaging system 100 acquires and stores coherent samples of receive signals associated with each receive beam and performs interpolations (weighted summations, or otherwise), and/or extrapolations and/or other computations with respect to stored coherent samples associated with distinct receive beams to synthesize new coherent samples on synthetic scan lines that are spatially distinct from the receive scan lines and/or spatially distinct from the transmit scan lines and/or both.
The synthesis or combination function may be a simple summation or a weighted summation operation, but other functions may be used as well. The synthesis function includes linear or nonlinear functions and functions with real or complex, spatially invariant or variant component beam weighting coefficients. The ultrasound imaging system 100 then in one embodiment detects both acquired and synthetic coherent samples, performs a scan conversion, and displays or records the resulting ultrasound image.
[0026] Ultrasound data is typically acquired in image frames, each image frame representing a sweep of an ultrasound beam emanating from the face of the transducer array. A 1-D transducer array produces 2-D rectangular or pie-shaped sweeps, each sweep being represented by a series of data points. Each of the data points are, in effect, a value representing the intensity of an ultrasound reflection at a certain depth along a given transmit scan line. On the other hand, the 2-D transducer array allows beam steering in two dimensions as well as focus in the depth direction. This eliminates the need to physically move the probe 102 to translate focus for the capture of a volume of ultrasound data to be used to render 3-D images.
[0027] One method to generate real-time 3-D scan data sets is to perform multiple sweeps wherein each sweep is oriented in a different scan plane. The transmit scan lines of every sweep are typically arrayed across the probe's 102 "lateral" dimension. The planes of the successive sweeps in an image frame are rotated with respect to each other, e.g. displaced in the "elevation" direction, which is typically orthogonal to the lateral dimension. Alternatively, successive sweeps may be rotated about a centerline of the lateral dimension. In general, each scan frame comprises a plurality of transmit scan lines allowing the interrogation of a 3-D scan data set representing a scan volume of some predetermined shape, such as a cube, a sector, a frustum, or a cylinder.
[0028] In one exemplary embodiment, each scan frame represents a scan volume in the shape of a sector. Therefore the scan volume comprises multiple sectors. Each sector comprises a plurality of beam positions, which may be divided into sub sectors. Each sub sector may comprise an equal number of beam positions. However, it is not necessary for
the sub sectors to comprise an equal number of beam positions. Further, each sub sector comprises at least one set of beam positions and each beam position in a set of beam positions is numbered in sequence. Therefore, each sector comprises multiple sets of beam positions indexed sequentially on a predetermined rotation.
[0029] A plurality of transmit beam sets is generated from each sector. Further, each transmit beam set comprises one or more simultaneous transmit beams depending on the capabilities of the ultrasound imaging system 100. The term "simultaneous transmit beams" refers to transmit beams that are part of the same transmit event and that are in flight in overlapping time periods. Simultaneous transmit beams do not have to begin precisely at the same instant or to terminate precisely at the same instant. Similarly, simultaneous receive beams are receive beams that are acquired from the same transmit event, whether or not they start or stop at precisely the same instant.
[0030] The transmit beams in each transmit beam set are separated by the plurality of transmit scan lines wherein each transmit scan line is associated with a single beam position. Thus, the multiple transmit beams are separated in space such that they do not have significant interference effects.
[0031] The transmit beamformer 112 can be configured for generating each transmit beam set from beam positions having the same index value. Thus, beam positions with matching index value, in each sub sector, can be used for generating multiple simultaneous transmit beams that form a single transmit beam set. In one embodiment, at least two consecutive transmit beam sets are generated from beam positions not indexed sequentially. In an alternative embodiment, at least a first transmit beam set and a last transmit beam set, in a sector, are not generated from neighboring beam positions.
[0032] FIG. 2 is a schematic illustration of a medical imaging apparatus 200 used on a patient 202 to capture a plurality of image frames 204 of objects in the patient 202 in accordance with an embodiment. FIG. 3 is a schematic illustration of identifying a stable image frame from the plurality of image frames 204 in accordance with an embodiment. It may be noted that hereinafter FIG. 2 and FIG. 3 will be concurrently described. The
medical imaging apparatus 200 may include an imaging unit 206 used to capture the plurality of image frames 204 of the objects. For instance, an ultrasound imaging apparatus includes an ultrasound probe that is used on the patient's body to receive signals associated with live image data of objects. These signals are received and processed by an image capturing unit 208 to generate the plurality of image frames 204 presented as part of a live image stream to the user. Considering the previous example, when the ultrasound probe is moved on an abdomen of a patient, a live image stream including multiple image frames of an internal portion of the abdomen is presented to the user. Each image frame may present an image of the internal portion of the abdomen captured at a particular instance. In an embodiment the imaging unit 206 includes an accelerometer 210 configured to identify position and time information associated with each image frame of the plurality of image frames. Position information indicates an orientation (such as an angle of orientation, or an ultrasonic signal or beam angle) with respect to the object while capturing an image frame. The time information indicates the time or instance when the image frame is captured. The position and time information of the plurality of image frames are stored in a memory 212. In an embodiment the plurality of image frames 204 may be part of a cine loop. The cine loop is a collection of image frames of an object captured over a predefined interval. Multiple cine loops are stored in the medical imaging apparatus 200 and can be viewed by a user at any stage. The objects in the patient 202 include organs and tissues in the patient's body. The plurality of image frames may be only a portion of the live image stream or may be the complete live image stream. The plurality of image frames 204 is filtered by a filtering processor 214 to remove any disturbances. The disturbances may be speckles in the plurality of image frames 204. However it may be envisioned that disturbances other than speckles may be present in the plurality of image frames 204 and the filtering processor 214 may be configured to remove these disturbances as well. Further, in other embodiments, the filtering processor 214 may include separate filtering processors for removing different disturbances in the plurality of image frames 204.
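One common technique for suppressing speckle-like disturbances is median filtering. The 1-D sketch below is a hypothetical stand-in for the 2-D filtering a processor such as the filtering processor 214 might perform; the disclosure does not specify the filter actually used:

```python
def median_filter_1d(samples, window=3):
    """Replace each sample with the median of its neighborhood,
    suppressing isolated speckle-like outliers while preserving edges."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        neighborhood = sorted(samples[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out

# A single speckle-like spike (200) on an otherwise smooth scan line
# is removed, while the surrounding values are essentially preserved.
line = [10, 11, 200, 12, 11]
filtered = median_filter_1d(line)
```

A production system would apply a 2-D (or adaptive) filter over each frame, but the principle of replacing outliers by neighborhood statistics is the same.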
[0033] The filtered plurality of image frames 204 is processed by an image selection processor 216 to select a stable image frame 218. The stable image frame 218 is selected
based on one or more image selection parameters such as predefined measurements of an object and its elements, a predefined shape of an object and its elements, and a predefined imaging plane. Considering the example of an internal structure of a fetal head, multiple predefined measurements, for example biparietal diameter (BPD), head circumference (HC), femur length (FL), and abdominal curvature, are used for selecting a stable image frame of the internal structure. Further, the predefined shape of elements in the fetal head may be an ellipse shape of the fetal head frame, a butterfly shape of the thalami, an empty box shape of the cavum, and a straight line shape of the falx. In this example the fetal head is the object and the elements include the fetal head frame, thalami, cavum and falx. Moreover, the stable image frame 218 is also selected based on a predefined imaging plane. The predefined imaging plane is a plane in which the stable image frame 218 presents the internal structure of the fetal head in a clear form. The images can be acquired from different scanning planes such as, but not limited to, an axial plane, a transventricular plane, a transthalamic plane, a transcerebellar plane, a coronal plane, a sagittal plane and a mid-sagittal plane. The plurality of image frames 204 are also compared with a predefined image template associated with the object. An image frame that correlates with the predefined image template is identified from the plurality of image frames 204 and designated as the stable image frame 218. An image frame is considered to correlate when it is the closest to the predefined image template. The predefined image template is selected from a set of predefined image templates 220 stored in the memory 212. Each predefined image template may be associated with a different object, for example an organ or a body portion of a patient.
The one or more image selection parameters are associated with the predefined image template.
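The template comparison described above can be sketched as a normalized correlation between each candidate frame and the predefined template, with the highest-scoring frame taken as the correlating one. The tiny 1-D "frames" below are hypothetical stand-ins for real 2-D image data:

```python
import math

def correlation(frame, template):
    """Pearson-style normalized correlation: 1.0 means a perfect match."""
    n = len(frame)
    mf = sum(frame) / n
    mt = sum(template) / n
    num = sum((f - mf) * (t - mt) for f, t in zip(frame, template))
    den = math.sqrt(sum((f - mf) ** 2 for f in frame) *
                    sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def best_match(frames, template):
    """Return the frame that correlates most strongly with the template."""
    return max(frames, key=lambda f: correlation(f, template))

template = [0, 1, 4, 1, 0]           # idealized intensity profile of the object
frames = [
    [0, 0, 1, 0, 0],                 # weak echo
    [0, 1, 4, 1, 0],                 # matches the template exactly
    [4, 1, 0, 1, 4],                 # inverted profile
]
stable = best_match(frames, template)
```

For real frames the correlation would be computed over 2-D pixel arrays (or over extracted features), but the "closest to the template wins" logic is the same.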
[0034] In order to identify the stable image frame, or to identify a correlation between an image frame and the predefined image template, one or more image parameters of each image frame of the plurality of image frames 204 are analyzed by the image selection processor 216. The one or more image parameters include but are not limited to a measurement of elements in the object, a shape of the elements and an imaging plane of each image frame. The one or more image parameters are compared with the
one or more image selection parameters. If these image parameters match the one or more selection parameters, then the stable image frame is selected. Continuing the previous example, multiple image frames of an internal structure of a fetal head may be obtained. Image parameters of an image frame may be compared with multiple image selection parameters. The image parameters include measurements of the fetal head and its elements, the shape of the fetal head and its elements, and an imaging plane. The current measurement and shape of the fetal head and its elements in the image frame are compared with the one or more image selection parameters, i.e. the predefined measurement and predefined shape of the fetal head and its elements, to identify whether the image frame is a stable image frame.
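The parameter matching in the preceding paragraph can be sketched as a tolerance check of measured parameters against the predefined selection parameters. The parameter names, units, tolerance values, and dictionary layout below are illustrative assumptions only:

```python
def matches_selection(frame_params, selection_params, tolerances):
    """Return True when every measured numeric parameter is within its
    tolerance of the predefined selection parameter, and the categorical
    parameters (shape, imaging plane) agree exactly."""
    for name, expected in selection_params.items():
        measured = frame_params.get(name)
        if measured is None:
            return False
        if isinstance(expected, (int, float)):
            if abs(measured - expected) > tolerances.get(name, 0.0):
                return False
        elif measured != expected:
            return False
    return True

# Hypothetical fetal-head parameters (millimetres) and selection targets.
selection = {"bpd_mm": 55.0, "hc_mm": 210.0,
             "head_shape": "ellipse", "plane": "transthalamic"}
tol = {"bpd_mm": 2.0, "hc_mm": 5.0}

good = {"bpd_mm": 54.2, "hc_mm": 212.5,
        "head_shape": "ellipse", "plane": "transthalamic"}
bad = {"bpd_mm": 61.0, "hc_mm": 212.5,
       "head_shape": "ellipse", "plane": "transthalamic"}
```

A frame passing this check would be selected as the stable image frame; the actual parameters and tolerances used by the apparatus are not specified in the disclosure.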
[0035] In an embodiment the image selection processor 216 may be configured to identify the stable image frame using the predefined image template based on visual similarity, and subsequently a comparison between the one or more image parameters of the stable image frame and the one or more image selection parameters is performed. However it may be envisioned that the process of identifying the stable image frame can be performed in any other order. The stable image frame 218 is identified and presented to the user on a display device 222 communicably connected to the medical imaging apparatus 200.
[0036] In another embodiment the image selection processor 216 analyzes the position and time information of the plurality of image frames 204 and determines position and time information of the stable image frame 218 from the plurality of image frames 204. This position and time information assists the image selection processor 216 to locate the stable image frame in a faster manner. This is explained in further detail in conjunction with FIG. 4.
[0037] In the event a stable image frame is not identified, an image frame closer to the predefined image template is identified by the image selection processor 216 and presented to the user. In an embodiment the image frame is identified as closer to the predefined image template based on its degree of closeness to the predefined image template. This is represented by a degree of closeness of the image parameters with respect to the image selection parameters. For instance, an image frame may be selected if the deviation of its fetal head frame measurement from the predefined measurement of the fetal head frame in the predefined image template is smaller than that of the other image frames. Here the image frame is selected as the stable image frame. It should be appreciated that selection of the image frame based on one image parameter is explained as an example, and thus one or more image parameters may be compared with their respective image selection parameters for identifying the stable image frame.
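The fallback described above, selecting the frame whose measurement deviates least from the predefined template when no frame fully matches, can be sketched as follows; scoring on a single hypothetical measurement is a deliberate simplification:

```python
def closest_frame(frames, predefined_measurement, key="hc_mm"):
    """When no frame satisfies all selection parameters, fall back to the
    frame whose measurement deviates least from the predefined template."""
    return min(frames, key=lambda f: abs(f[key] - predefined_measurement))

# Hypothetical candidate frames with head-circumference measurements (mm).
candidates = [
    {"id": 1, "hc_mm": 230.0},
    {"id": 2, "hc_mm": 212.0},   # nearest to the predefined 210 mm
    {"id": 3, "hc_mm": 190.0},
]
fallback = closest_frame(candidates, predefined_measurement=210.0)
```

In practice several parameters would be combined into one closeness score, but the minimum-deviation principle is as shown.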
[0038] FIG. 4 is a schematic illustration of a medical imaging system 400 for identifying a stable image frame from a plurality of image frames associated with an object in accordance with an embodiment. The plurality of image frames is part of a live image stream captured using an imaging unit. For example, in the case of an ultrasound imaging system, an ultrasound probe may be used to capture images (in the form of image frames) of an object. So when the ultrasound probe is moved on an abdomen of a patient, a live image stream including images of an internal portion of the abdomen is presented to a user. The medical imaging system 400 includes a filtering module 402 for filtering disturbances in the plurality of image frames. The disturbances may be speckles in the plurality of image frames. However it may be envisioned that disturbances other than speckles may be present in the plurality of image frames and the filtering module 402 may be configured to remove the speckles and other disturbances. Further, in other embodiments, the filtering module 402 may include separate filtering modules for removing different types of disturbances in the plurality of image frames. In an embodiment the medical imaging system 400 includes an accelerometer 404 configured to identify position and time information associated with each image frame of the plurality of image frames. Position information indicates an orientation (such as an angle of orientation, or an ultrasonic signal or beam angle) with respect to the object while capturing an image frame. The time information indicates the time or instance when the image frame is captured. The position and time information of the plurality of image frames are stored.
[0039] The plurality of image frames is then analyzed by an image selection module 406 for selecting a stable image frame. The stable image frame is selected based on one or more image selection parameters such as predefined measurements of an object and its elements, a predefined shape of an object and its elements, and a predefined imaging plane. The predefined imaging plane is a plane in which the stable image frame presents the internal structure of the object, for example a fetal head, in a clear form. A comparison module 408 compares the plurality of image frames with a predefined image template associated with the object. Based on this comparison an image identification module 410 identifies an image frame correlating with the predefined image template from the plurality of image frames. The correlating image frame is designated as the stable image frame by the image identification module 410. An image frame is considered to correlate when it is the closest to the predefined image template. The predefined image template is selected from a set of predefined image templates. Each predefined image template may be associated with a different object, for example an organ or a body portion of a patient. The one or more image selection parameters are associated with the predefined image template.
[0040] In another scenario, the comparison module 408 analyzes one or more image parameters of each image frame of the plurality of image frames to identify a correlation between an image frame and the predefined image template. The one or more image parameters are compared with the one or more image selection parameters. If these image parameters match the one or more selection parameters, then the stable image frame is identified by the image identification module 410. In an embodiment the comparison module 408 sends comparison information associated with the one or more image parameters to the image identification module 410 to identify the match. In another embodiment the comparison module 408 may be configured to determine the comparison information and identify the match. The match between the image frame and the predefined image template is communicated to the image identification module 410 for identifying or designating the image frame as the stable image frame.
[0041] In another embodiment the image selection module 406 analyzes the position and time information of the plurality of image frames and determines position and time information of the stable image frame from the plurality of image frames. This position and time information assists the image selection module 406 to locate the stable image frame in a faster manner.
[0042] In the event a stable image frame is not identified, an image frame closer to the predefined image template is identified by the image identification module 410 and presented to the user. In an embodiment the image frame is identified as closer to the predefined image template based on its degree of closeness to the predefined image template by the comparison module 408. The degree of closeness is represented by a degree of closeness of the image parameters with respect to the image selection parameters. For instance the comparison module 408 determines a degree of closeness of a measurement of a fetal head frame with respect to a predefined measurement of the fetal head frame in the predefined image template. If the deviation in the measurement is smaller than that of the other image frames, then the comparison module 408 identifies the image frame as the stable image frame. The image identification module 410 then designates the image frame as the stable image frame. In another embodiment the comparison module 408 is configured to compare the measurement of the fetal head frame and the predefined measurement of the fetal head frame and send comparison information to the image identification module 410. The image identification module 410 then determines the degree of closeness in the measurement from the comparison information and consequently identifies the stable image frame. The stable image frame is presented to the user by a presentation module 412. The stable image frame can be used by the user for analyzing the object, i.e. the fetal head frame.
[0043] FIG. 5 is a flow diagram of a method 500 for managing a plurality of image frames of an object captured by a medical imaging apparatus in accordance with an embodiment. At step 502, the plurality of image frames associated with the object is received at the medical imaging apparatus. The plurality of image frames is captured by an imaging unit (for example, an ultrasound probe) connected to the medical imaging apparatus and is part of a live image stream. At step 504, the medical imaging apparatus analyzes the plurality of image frames to select a stable image frame based on one or more image selection parameters. The one or more image selection parameters include predefined measurements of an object and its elements, a predefined shape of an object and its elements, and a predefined imaging plane. For example, in the case of an internal structure of a fetal head, multiple predefined measurements, such as biparietal diameter (BPD), head circumference (HC), femur length (FL), and abdominal curvature, are used for selecting a stable image frame of the internal structure. Further, the predefined shapes of elements in the fetal head may be an ellipse shape of the fetal head frame, a butterfly shape of the thalami, an empty box shape of the cavum, and a straight line shape of the falx. In this example, the fetal head is the object and the elements include the fetal head frame, thalami, cavum, and falx. At step 506, the stable image frame is presented to the user through a display device connected to the medical imaging apparatus.
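The selection parameters for the fetal-head example could be organised as a simple template structure. The parameter and shape names follow the text; the grouping into a dictionary and the imaging-plane value are assumptions of this sketch.

```python
# Minimal sketch of how the image selection parameters of step 504 might be
# organised for the fetal-head example. The dictionary layout and the
# "transthalamic" plane name are assumptions, not from the specification.
FETAL_HEAD_TEMPLATE = {
    "object": "fetal head",
    "imaging_plane": "transthalamic",  # illustrative plane name
    "measurements": ["BPD", "HC", "FL", "abdominal curvature"],
    "element_shapes": {
        "fetal head frame": "ellipse",
        "thalami": "butterfly",
        "cavum": "empty box",
        "falx": "straight line",
    },
}

def selection_parameters(template: dict) -> list:
    """Flatten a template into the list of parameters a frame must satisfy:
    its measurements, its element shapes, and its imaging plane."""
    return (template["measurements"]
            + list(template["element_shapes"])
            + [template["imaging_plane"]])
```

A frame analysed at step 504 would be checked against every entry this function returns.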
[0044] FIG. 6 is a flow diagram of a method 600 for managing a plurality of image frames of an object captured by a medical imaging apparatus in accordance with another embodiment. At block 602, the plurality of image frames associated with the object is received at the medical imaging apparatus. The plurality of image frames is captured by an imaging unit connected to the medical imaging apparatus and is part of a live image stream.
[0045] The plurality of image frames is compared with a predefined image template associated with the object at block 604. Based on the comparison, an image frame that correlates with the predefined image template is identified from the plurality of image frames at block 606. The correlating image frame is the stable image frame, or is designated as the stable image frame. An image frame is considered as correlating when it is closest to the predefined image template. The predefined image template is selected from a set of predefined image templates, each of which may be associated with a different object, for example an organ or a body portion of a patient. The one or more image selection parameters are associated with the predefined image template.
[0046] To identify the stable image frame, or more specifically to identify a correlation between an image frame and the predefined image template, one or more image parameters of each image frame of the plurality of image frames are analyzed at block 608. The one or more image parameters include, but are not limited to, measurements of elements in the object, shapes of the elements, and an imaging plane of each image frame. The one or more image parameters are compared with the one or more image selection parameters at block 610, and a check is performed at block 612 to determine a match. If the image parameters match the one or more image selection parameters, the image frame is selected from the plurality of image frames as the stable image frame at block 614. Subsequently, at block 616, the stable image frame is presented to the user.
[0047] In an embodiment, the stable image frame may first be identified using the predefined image template based on visual similarity, and a comparison between the one or more image parameters of the stable image frame and the one or more image selection parameters may subsequently be performed. However, it may be envisioned that the process of identifying the stable image frame can be performed in any other order. The identified stable image frame is presented to the user.
[0048] In the event a stable image frame is not identified, an image frame closer to the predefined image template is identified from the plurality of image frames and presented to the user at block 618. In an embodiment, the image frame is identified as closer to the predefined image template based on its degree of closeness to the predefined image template, represented by the closeness of its image parameters with respect to the image selection parameters. For instance, an image frame may be selected if the deviation of a measurement of a fetal head frame from a predefined measurement of the fetal head frame in the predefined image template is smaller than that of the other image frames. Here, the image frame is selected as the stable image frame. It may be noted that selection of the image frame based on one image parameter is explained as an example; one or more image parameters may be compared with their respective image selection parameters for identifying the stable image frame.
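Blocks 610 through 618 together form an exact-match pass followed by a closest-match fallback, which can be sketched in a few lines. The function names and data layout are assumptions made for illustration.

```python
# Illustrative sketch of blocks 610-618: try an exact parameter match first;
# if no frame matches, fall back to the frame with the smallest deviation
# from the template. Names and the metric are assumptions of this sketch.
def select_stable_frame(frames, template, tolerance=0.05):
    """frames: list of (frame_id, measurements) pairs.
    Return (frame_id, exact) where exact indicates a true match."""
    def deviation(measurements):
        return sum(abs(measurements[k] - v) / v
                   for k, v in template.items() if k in measurements)

    # Blocks 610-614: look for a frame matching every parameter in tolerance.
    for frame_id, measurements in frames:
        if all(k in measurements and abs(measurements[k] - v) <= tolerance * v
               for k, v in template.items()):
            return frame_id, True

    # Block 618: no exact match, so present the closest frame instead.
    best_id, _ = min(frames, key=lambda f: deviation(f[1]))
    return best_id, False
```

The boolean flag distinguishes a genuine stable frame (block 614) from the fallback "closer" frame (block 618), so the presentation step can report which case occurred.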
[0049] The methods 500 and 600 can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM), or any other computer readable storage medium. Although the methods for managing a plurality of image frames of an object captured by a medical imaging apparatus are explained with reference to the flow charts of FIGS. 5 and 6, other ways of implementing the methods can be employed. For example, the order of execution of the method steps may be changed, and/or some of the method steps may be changed, eliminated, divided, or combined. Further, the method steps may be executed sequentially or simultaneously.
[0050] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
We Claim:
1. A medical imaging apparatus comprising:
an image capturing unit configured to capture a live image stream of an object;
an image selection processor communicably coupled to the image capturing unit, wherein the image selection processor is configured to:
receive a plurality of image frames associated with the object, wherein the plurality of image frames is of the live image stream;
select a stable image frame from the plurality of image frames based on at least one image selection parameter; and
a memory communicably coupled to the image capturing unit and the image selection processor, the memory configured to store the plurality of image frames.
2. The medical imaging apparatus of claim 1, further comprising a filtering processor for filtering disturbances in the plurality of image frames.
3. The medical imaging apparatus of claim 1, wherein the image selection processor is further configured to:
compare the plurality of image frames with a predefined image template associated with the object; and
identify an image frame from the plurality of image frames that correlates with the predefined image template, wherein the image frame correlating with the predefined image template is the stable image frame.
4. The medical imaging apparatus of claim 3, wherein the memory is further configured to store a set of predefined image templates, wherein the predefined image template is of the set of predefined image templates.
5. The medical imaging apparatus of claim 3, wherein the at least one image selection parameter is associated with the predefined image template, the at least one image selection parameter comprises measurements and shape of elements in the object, and an imaging plane.
6. The medical imaging apparatus of claim 4, wherein the image selection processor is further configured to:
analyze image parameters of each image frame of the plurality of image frames; and
determine a match between image parameters of the image frame correlating with the predefined image template and the at least one image selection parameter.
7. The medical imaging apparatus of claim 1, wherein the image capturing unit comprises an accelerometer configured to identify and store position and time information associated with each image frame of the plurality of image frames, wherein the position information indicates orientation of the image capturing unit with respect to the object while capturing the image frame and the time information indicates time instance of capturing the image frame, wherein the position and time information are stored in the memory.
8. The medical imaging apparatus of claim 7, wherein the image selection processor is further configured to employ position and time information associated with the stable image frame for selection of the stable image frame.
9. The medical imaging apparatus of claim 1, wherein the medical imaging apparatus is an ultrasound imaging apparatus.
10. A system for managing a plurality of image frames of an object captured by a medical imaging apparatus, the system comprising:
an image selection module configured to:
receive a plurality of image frames associated with the object, wherein the plurality of image frames is of a live image stream captured using an image capturing unit of the medical imaging apparatus;
select a stable image frame from the plurality of image frames based on at least one image selection parameter; and
a presentation module for presenting the stable image frame.
11. The system of claim 10, wherein the image selection module comprises:
a comparison module for comparing the plurality of image frames with a predefined image template associated with the object; and
an image identification module for identifying an image frame from the plurality of image frames that correlates with the predefined image template, wherein the image frame correlating with the predefined image template is the stable image frame.
12. The system of claim 11, wherein the at least one image selection parameter is associated with the predefined image template, the at least one image selection parameter comprises measurements and shape of elements in the object, and an imaging plane.
13. The system of claim 11, wherein the image identification module is further configured to:
analyze image parameters of each image frame of the plurality of image frames; and
determine a match between image parameters of the image frame correlating with the predefined image template and the at least one image selection parameter.
14. The system of claim 10, further comprising a filtering module for filtering disturbances in the plurality of image frames.
15. The system of claim 10, further comprising an accelerometer module configured to identify and store position and time information associated with each image frame of the plurality of image frames,
wherein the position information indicates an orientation of the image capturing unit with respect to the object while capturing the image frame and the time information indicates a time instance of capturing the image frame, wherein the position and time information are stored in a memory of the medical imaging apparatus.
16. The system of claim 15, wherein the image selection module is further configured to employ the position and time information associated with the stable image frame for selection of the stable image frame.
17. A method for managing a plurality of image frames of an object captured by a medical imaging apparatus, the method comprising:
receiving a plurality of image frames associated with the object, wherein the plurality of image frames is of a live image stream captured using the medical imaging apparatus;
selecting a stable image frame from the plurality of image frames based on at least one image selection parameter; and
presenting the stable image frame to a user.
18. The method of claim 17, wherein selecting the stable image frame comprises:
comparing the plurality of image frames with a predefined image template associated with the object; and
identifying an image frame from the plurality of image frames that correlates with the predefined image template, wherein the image frame correlating with the predefined image template is the stable image frame.
19. The method of claim 18, wherein selecting the stable image frame further comprises:
analyzing image parameters of each image frame of the plurality of image frames; and
determining a match between image parameters of the image frame correlating with the predefined image template and the at least one image selection parameter, wherein the at least one image selection parameter is associated with the predefined image template, the at least one image selection parameter comprises measurements and shape of elements in the object, and an imaging plane.
20. The method of claim 17, further comprising storing a set of predefined image templates, wherein the predefined image template is of the set of predefined image templates.