Abstract: A medical image management system for medical image storage, retrieval and analysis is disclosed. The medical image management system includes an image decoding module configured to receive one or more medical images of a subject. The one or more medical images are correlated with a medical image template. An image presentation module is configured to present the one or more medical images with anatomy information based on the correlation between the one or more medical images and the medical image template. FIG. 1
MEDICAL IMAGING MANAGEMENT SYSTEM FOR MEDICAL IMAGE STORAGE, RETRIEVAL AND ANALYSIS
TECHNICAL FIELD
[0001] The subject matter disclosed herein relates to medical imaging. More specifically, the subject matter relates to a medical image management system for the storage, retrieval and analysis of medical images.
BACKGROUND OF THE INVENTION
[0002] Medical imaging apparatus are used in different applications to generate images of different regions or areas (e.g. different organs) of patients or other objects. Different types of medical imaging apparatus are available, including, for example, ultrasound imaging systems, X-ray systems, computed tomography (CT) systems, single photon emission computed tomography (SPECT) systems, magnetic resonance (MR) imaging systems, and the like. To diagnose a critical condition of an anatomy of a patient, multiple images and image slices of the anatomy from different perspectives (such as angles and positions with respect to the patient's body) may be captured. A technician or doctor needs to view all of these images and image slices to identify the critical condition. Reviewing these images manually may be time consuming because not all images or image slices may show the critical condition. For example, a tumor growth in the brain and its condition may not be visible in every image or image slice. However, to identify the images and image slices that do show the condition of the tumor growth, the technician needs to review all of the images and image slices.
[0003] Moreover, in order to examine a critical condition of the anatomy of the patient, the technician may suggest taking images of the anatomy using different imaging techniques such as X-ray, CT and MR imaging. The images captured using different imaging techniques may be compared for the same patient over a period of time, or against a patient with a similar diagnosis, to analyze the critical condition in detail. On some occasions hard copies of the images are taken and manually compared by the technician, which consumes considerable time because the images to be compared first need to be identified. For instance, CT images and MRI images may be taken of an anatomy of the patient. The technician needs to determine which CT image should be compared against which MRI image to analyze the anatomy from a particular perspective or angle, thereby increasing the burden on the technician. The technician needs to quickly narrow down to the anatomy or critical condition in a particular image or in a set of images (i.e. CT and/or MRI images) and retrieve the images showing the critical condition. The set of images may include CT and/or MRI images of other patients as well.
[0004] Thus there is a need for an improved medical image management system for medical image storage, retrieval and analysis.
BRIEF DESCRIPTION OF THE INVENTION
[0005] The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.
[0006] As discussed in detail below, embodiments of the invention include a medical image management system for image storage, retrieval and analysis. The medical image management system includes an image decoding module configured to receive one or more medical images of a subject. The one or more medical images are correlated with a medical image template. An image presentation module is configured to present the one or more medical images with anatomy information based on the correlation between the one or more medical images and the medical image template.
[0007] In another embodiment a method of managing medical images for medical image retrieval is disclosed. The method includes receiving one or more medical images of a subject; correlating the one or more medical images with a medical image template; and presenting the one or more medical images with anatomy information based on the correlation between the one or more medical images and the medical image template.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIGURE 1 illustrates a medical image management system for image storage, retrieval and analysis in accordance with an embodiment;
[0009] FIGURE 2 is a schematic illustration of a workflow for selecting a medical image template in accordance with an embodiment;
[0010] FIGURE 3 illustrates a process of appending a plurality of labels in a medical image in accordance with an exemplary embodiment;
[0011] FIGURE 4 is a process of filtering labels in the medical image in accordance with an exemplary embodiment;
[0012] FIGURE 5 illustrates a process for viewing a desired point of interest in a particular portion of the anatomy in the output medical image in accordance with an exemplary embodiment;
[0013] FIGURE 6 illustrates a process of enabling a user to identify physiological conditions of some portions of the anatomy and mark them in accordance with an exemplary embodiment;
[0014] FIGURE 7 is a process of enabling a user to identify a medical image based on a physiological condition at a portion of the anatomy in accordance with an exemplary embodiment;
[0015] FIGURE 8 illustrates a process of searching and retrieving one or more medical images based on a degree of portion of a desired point of interest in the anatomy in accordance with an exemplary embodiment;
[0016] FIGURE 9 illustrates a process of enabling a user for retrieving a medical image based on a desired point of interest in accordance with an exemplary embodiment;
[0017] FIGURE 10 illustrates a process of comparing medical images of a subject captured at different stages in accordance with an exemplary embodiment;
[0018] FIGURE 11 illustrates a process of comparing medical images of one or more subjects captured using different imaging techniques in accordance with an exemplary embodiment; and
[0019] FIGURE 12 illustrates a method for managing one or more medical images in a medical image management system in accordance with an embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0020] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
[0021] To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. One or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be standalone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
[0022] As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising" or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
[0023] A medical image management system for image storage, retrieval and analysis of images is disclosed. The medical image management system includes an image decoding module configured to receive one or more medical images of a subject or subjects. The one or more medical images are correlated with a medical image template. An image presentation module is configured to present the one or more medical images with anatomy information based on the correlation between the one or more medical images and the medical image template.
[0024] Various embodiments of the invention provide a medical image management system 100 as shown in FIG. 1. The medical image management system 100 receives a medical image 102 for storage. The medical image 102 may be associated with an anatomy of a patient(s) or subject(s). Medical images may be captured using various medical imaging systems such as, but not limited to, a magnetic resonance (MR) imaging system, a computed tomography (CT) imaging system, an X-ray system, a positron emission tomography (PET) imaging system, and an ultrasound imaging system. The medical image 102 may be an MR image, a CT image, an X-ray image, an ultrasound image, a PET image, or an image captured using another medical imaging system.
[0025] The medical image 102 is processed by an image decoding module 104. The image decoding module 104 analyzes the received medical image 102. The medical image 102 may be compared and correlated with a medical image template by the image decoding module 104. The medical image template may be an image of an anatomy, for example a head of a patient showing internal organs such as the nasal tract, brain, skull and so on. The medical image template may be, for example, a two dimensional (2-D) image template and/or a three dimensional (3-D) image template. However it may be appreciated that the medical image template is not restricted to these, and other multidimensional medical image templates may be within the scope of this disclosure. A template module 106 stores multiple medical image templates in memory(s) 108. The medical image templates, such as a medical image template 200, a medical image template 202 and a medical image template 204, may be stored in an image repository 206 present in the memory(s) 108 as shown in FIG. 2. Nomenclature associated with each part of the anatomy may be appended in the image. The template module 106 appends a plurality of labels indicating nomenclatures in each medical image template. Considering a head of the patient, parts of the head (i.e. the anatomy) such as the pituitary, spine and cerebellum (i.e. nomenclatures) are marked in a medical image template of the head. Thus separate labels including or indicating pituitary, spine and cerebellum are appended onto the medical image template. Similarly the template module 106 appends a plurality of labels to medical image templates of different anatomical parts of a subject. These medical image templates, along with the plurality of labels, are stored in the memory(s) 108.
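The label storage performed by the template module can be sketched as a small in-memory repository. The class, field and identifier names below are illustrative assumptions chosen for exposition, not part of the disclosed system:

```python
# Minimal sketch of template storage with appended labels (illustrative names).
from dataclasses import dataclass, field

@dataclass
class Label:
    name: str   # nomenclature, e.g. "cerebellum"
    x: int      # position of the labeled portion in the template
    y: int

@dataclass
class ImageTemplate:
    template_id: str
    anatomy: str                        # e.g. "head"
    labels: list = field(default_factory=list)

class TemplateRepository:
    """Stores medical image templates with their appended labels."""
    def __init__(self):
        self._templates = {}

    def add(self, template):
        self._templates[template.template_id] = template

    def get(self, template_id):
        return self._templates.get(template_id)

# Building a head template with three labels, as in the example above.
head = ImageTemplate("202", "head")
head.labels.append(Label("pituitary", 40, 55))
head.labels.append(Label("spine", 50, 90))
head.labels.append(Label("cerebellum", 60, 70))

repo = TemplateRepository()
repo.add(head)
```

In a deployed system the repository would of course be backed by persistent storage such as the image repository 206; the dictionary here only illustrates the template-to-labels association.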
[0026] While correlating the medical image 102 with different medical image templates, a medical image template, for example the medical image template 202, that maps with the medical image 102 is identified by an image recognition module 110. The medical image template 202 is then selected as shown in FIG. 2. In an embodiment the medical image template is mapped with the medical image 102 using a pixel based comparison technique. However it may be appreciated that a medical image may be mapped with medical image templates using different techniques known in the art. During this mapping process, the image recognition module 110 maps an anatomy of the medical image 102 with the anatomy of one or more medical image templates to obtain the correct medical image template. Then one or more labels in the identified medical image template are inserted into the anatomy in the medical image 102 using a labeling module 112. Each label may be inserted into a portion of the anatomy in the medical image 102 that maps with a portion of the anatomy in the identified medical image template 202. This is further explained in detail in conjunction with FIG. 3. The medical image 102 with the one or more labels is presented to a user by an image presentation module 114.
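One simple form of the pixel based comparison mentioned above is a sum-of-squared-differences score between the image and each candidate template, with the lowest score winning. This is only a sketch under that assumption; the disclosure does not fix a particular metric, and the toy 3x3 grids stand in for full images:

```python
# Illustrative pixel-based correlation: choose the template whose pixels are
# closest to the input image under a sum-of-squared-differences (SSD) score.
def ssd(image, template):
    """Sum of squared differences between two equally sized pixel grids."""
    return sum((p - q) ** 2
               for row_i, row_t in zip(image, template)
               for p, q in zip(row_i, row_t))

def best_template(image, templates):
    """Return the id of the template that best maps onto the image."""
    return min(templates, key=lambda tid: ssd(image, templates[tid]))

# Toy 3x3 grayscale grids standing in for full medical images.
image = [[10, 10, 200], [10, 200, 10], [200, 10, 10]]
templates = {
    "200": [[200, 10, 10], [10, 200, 10], [10, 10, 200]],
    "202": [[10, 10, 190], [10, 210, 10], [190, 10, 10]],  # near match
}
assert best_template(image, templates) == "202"
```

A production implementation would typically use normalized cross-correlation or registration rather than raw SSD, since medical images differ in intensity scale and alignment.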
[0027] In an embodiment the image recognition module 110 may be configured to compare the one or more medical images of the subject with multiple medical images of one or more different subjects. The different subjects selected for comparison may have a physiological condition similar to that of the subject. This comparison may assist the technician or doctor in analyzing the physiological condition of the subject more effectively, due to the availability of a history of medical images pertaining to the same physiological condition. A selected different subject may have been treated by a first doctor, and thus a second doctor analyzing the subject can also refer to the analysis of the first doctor due to the similarity in the physiological conditions.
[0028] The user may be able to view a desired point of interest and navigate to the desired point of interest in the medical image 102. In this case, for example, the desired point of interest may be viewed in response to receiving user inputs in the form of keywords like "zoom to" or "show". The user input may also be the pointing of a cursor to identify the point of interest in the anatomy, which may pop up to display the point of interest and associated labels. The user may provide user input that is processed by the image recognition module 110 to identify the desired point of interest in the anatomy presented in the medical image 102. The user input may be, but is not limited to, a text based input, a cursor based input, a gesture input or a voice based input. The medical image 102 is processed, for example filtered or refined, to present the desired point of interest, and one or more labels associated with the desired point of interest are retrieved from the medical image template corresponding to the medical image 102. The one or more labels are presented along with the desired point of interest in the medical image 102 to the user. In an embodiment the user may be allowed to provide user input, i.e. user queries, which may be processed by an image search module 116 to retrieve a medical image. The user queries may be one or more keywords for retrieving the medical image. In an instance the user may provide the keyword "cerebellum", a set of user specific keywords, or a combination of anatomy nomenclatures, and the image search module 116 filters a medical image of a head, i.e. zooms to the cerebellum area in an anatomy of the head. When magnified, other portions in the cerebellum area and their labels may be presented to the user. The image recognition module 110 enables the user to control or vary a granularity of the labels presented in the medical image or a desired point of interest in the medical image. In an instance the labels may be presented based on their hierarchy to reduce confusion due to cluttering. For example, initially only a label associated with a cerebellum of a subject may be displayed in a medical image. However, when the area of the cerebellum is zoomed by the user, the labels associated with sub-portions of the cerebellum are also presented. Thus, as the cerebellum area is zoomed incrementally, the labels associated with the sub-portions are displayed in an incremental manner. Conversely, when the cerebellum area is zoomed out by the user, the labels of the sub-portions of the cerebellum disappear incrementally based on the level of zooming of the medical image.
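The zoom-dependent label granularity described above can be modeled by giving each label a hierarchy depth and showing only labels whose depth does not exceed the current zoom level. This is a sketch under that assumption; the depth values and sub-portion names are illustrative:

```python
# Sketch of zoom-dependent label granularity: each label carries a hierarchy
# depth, and only labels at or above the current zoom level are shown.
def visible_labels(labels, zoom_level):
    """Return label names whose depth does not exceed the zoom level."""
    return [name for name, depth in labels if depth <= zoom_level]

# Depth 0: top-level structure; depth 1+: sub-portions revealed on zoom-in.
labels = [("cerebellum", 0), ("pons", 1), ("vermis", 1), ("flocculus", 2)]

assert visible_labels(labels, 0) == ["cerebellum"]
assert visible_labels(labels, 1) == ["cerebellum", "pons", "vermis"]
```

Zooming out simply lowers `zoom_level`, which makes the deeper labels disappear in the incremental manner the paragraph describes.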
[0029] In an embodiment the image recognition module 110 is configured to retrieve a portion of the desired point of interest based on a user input. In this case the user input specifies a degree of the portion of the desired point of interest to be viewed by a user. Considering an example, the user input may indicate that twenty percent of the eye portion of the head anatomy is to be shown when a desired point of interest associated with the head anatomy is presented. The image recognition module 110 then filters the medical image of the head anatomy to present twenty percent of the eye.
[0030] Further, the medical image management system 100 is configured to determine a physiological condition of an anatomy in the medical image 102. Here a condition detection module 118 is configured to receive the medical image 102 and compare it against the corresponding medical image template to detect any change in a physiological condition of one or more regions in the anatomy. The corresponding medical image template may represent the anatomy with ideal physiological conditions. These changes in physiological conditions can be detected based on a pixel based comparison technique. In an embodiment the image decoding module 104 may be configured to receive user input and process it to present one or more regions having different physiological conditions (for example an inflammation, a tumor growth, etc.) as compared to the ideal physiological conditions of the anatomy. In an embodiment the image decoding module 104 may be configured to construct the anatomy of the subject in the form of a medical image model (for example a 2-D and/or 3-D model) and present it to the user. The medical image model may be constructed using multiple medical images (such as the medical image 102) of the subject. These medical images may be 2-D images or 3-D images. The medical image model may present the change in the physiological condition clearly to the user, thereby assisting the user in determining a degree of change in the physiological condition. By comparing a 3-D medical model with a 3-D image template, the system can analyze the size and proportions of the organs; any organ out of proportion, or a physiological condition such as a tumor not found in the regular image template, is communicated to the user.
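The proportion check against the ideal template can be illustrated by comparing per-region areas from a segmented image against the template's reference areas and flagging regions that deviate beyond a tolerance. The region names, areas and the 20% tolerance below are assumptions made for the sketch, not values from the disclosure:

```python
# Sketch of the condition detection idea: compare per-region areas in the
# subject's image against the template's "ideal" areas and flag any region
# whose size deviates beyond a tolerance (e.g. a swollen, out-of-proportion organ).
def flag_regions(measured_areas, ideal_areas, tolerance=0.2):
    """Return names of regions whose measured area deviates more than
    `tolerance` (as a fraction) from the template's ideal area."""
    flagged = []
    for region, ideal in ideal_areas.items():
        measured = measured_areas.get(region, 0)
        if ideal and abs(measured - ideal) / ideal > tolerance:
            flagged.append(region)
    return flagged

ideal = {"cerebellum": 100, "pons": 40, "pituitary": 10}
measured = {"cerebellum": 150, "pons": 42, "pituitary": 10}  # enlarged cerebellum

assert flag_regions(measured, ideal) == ["cerebellum"]
```

A region present in the image but absent from the template (such as a tumor mass) would likewise show up as a deviation, which is the case the paragraph says is communicated to the user.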
[0031] The user may be able to insert marks on the anatomy of the medical image. To this end the labeling module 112 may receive user input, and one or more marks may be input, i.e. appended, onto the anatomy of the medical image 102. A mark indicates a physiological state of a portion of the anatomy. The physiological state may be, for example, critical, severe, normal and so on. It may be appreciated that marks may indicate any other physiological information associated with the anatomy besides the physiological state of the anatomy. In an embodiment the user may be enabled to modify any existing marks already appended to the anatomy of the medical image 102. The labeling module 112 facilitates these modifications based on user input. In case the medical image 102 is already stored in the medical image management system 100 and needs to be retrieved to make the modifications, a user input including a keyword associated with the mark is provided. The image recognition module 110 retrieves the medical image 102 and presents a portion of the anatomy associated with or indicating the mark. The user may provide user input for modifying the mark. The user input may be, but is not limited to, a text based input, a cursor based input, a gesture input or a voice based input.
[0032] In another embodiment the image decoding module 104 is further configured to compare the medical image 102 with multiple medical images of one or more subjects. The medical image 102 is compared with medical images of subjects having a similar medical condition, for instance a head injury. This comparison is performed to determine whether the medical conditions are the same; if so, the treatment given to a subject having the same medical condition can be given to the subject associated with the medical image 102. In yet another embodiment the image decoding module 104 may be configured to compare the medical image 102 with multiple medical images of the subject indicating different physiological conditions. The multiple medical images of the subject may be taken at different periods of time. For instance a medical image showing a current medical condition of the subject can be compared with multiple past medical images showing the progress made while providing medication to the subject.
[0033] In another scenario the medical image 102 may be compared with multiple medical images associated with different imaging techniques. The medical image 102 of an anatomy may be associated with an imaging technique and multiple medical images of the same anatomy may be captured using another imaging technique. Thereafter these medical images are compared against each other to diagnose a medical condition of the subject.
[0034] FIG. 3 illustrates a workflow of comparing a medical image 300 with a medical image template 302 in accordance with an embodiment. The medical image 300 may be received as an input from the user. A medical image template 302 may be identified that matches the medical image 300. The medical image template 302 is shown to include three labels indicating different portions of an anatomy in the medical image template 302. The three labels are A, B and C, associated with different nomenclatures of portions of the anatomy. The labels A, B and C may indicate the pituitary, spinal cord and cerebellum respectively in the anatomy of the head. Portions of the anatomy corresponding to the portions labeled A, B and C are identified and the labels are appended onto the medical image 300 to obtain an output medical image 304. The output medical image 304 is then presented to the user. The user may be a technician, a medical expert, a doctor and so on. The output medical image 304 helps the user identify the portions of the anatomy in the medical image 300.
[0035] The user may be able to filter the portions in the output medical image 304 that are desired by the user. As shown in FIG. 4, a user input 400 is received at the medical image management system 100 indicating the portions of the anatomy that are of interest to the user in accordance with an embodiment. The user input 400 may include keywords indicating the nomenclatures of the portions that are of interest. The keywords may be, for instance, cerebellum and spinal cord, which may be used to filter the output medical image 304. After the filtering process the label 'A' is removed from the output medical image 304 to obtain a filtered medical image 402 showing the labels 'B' and 'C'. The filtered medical image 402 is presented to the user and displays the portions, i.e. the cerebellum and spinal cord, in the anatomy. In an embodiment the user may be enabled to view a desired point of interest in a particular portion of the anatomy in the output medical image 304 as shown in FIG. 5 in accordance with an embodiment. As shown in FIG. 5 the user may provide a user input 500 to present the desired point of interest (POI) in the particular portion of the anatomy. The desired POI is a specific portion of the anatomy that the user may prefer to magnify and view in more detail. The user input 500 includes a nomenclature associated with the label 'B' indicating the desired POI, i.e. the cerebellum. The desired POI is of interest to the user. In an embodiment the user input 500 may be submitted by allowing the user to select the desired POI, i.e. the label 'B', using a window 502. Based on the user input 500 the desired POI is magnified as shown by a medical image 504. Thereafter multiple labels 'B', 'D' and 'E' associated with different portions of the desired POI are retrieved from the medical image template 302. The label 'D' indicates the pons in the desired POI. The labels 'B', 'D' and 'E' are appended to the portions of the desired POI and presented to the user as shown in a final medical image 506.
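The keyword filtering of FIG. 4 reduces to keeping only the labels whose nomenclature matches the user's keywords. The function and mapping below are an illustrative sketch of that step, using the A/B/C labels from the example:

```python
# Sketch of the FIG. 4 filtering step: keep only labels whose nomenclature
# matches the user's keywords; all other labels are removed from the output.
def filter_labels(labels, keywords):
    """labels: mapping of label letter -> nomenclature; keywords: wanted names."""
    wanted = {k.lower() for k in keywords}
    return {letter: name for letter, name in labels.items()
            if name.lower() in wanted}

labels = {"A": "pituitary", "B": "spinal cord", "C": "cerebellum"}
kept = filter_labels(labels, ["cerebellum", "spinal cord"])

assert kept == {"B": "spinal cord", "C": "cerebellum"}  # label 'A' removed
```

In a fuller system the same keyword match would also drive which image regions are rendered, not just which labels are drawn.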
[0036] While analyzing a medical image, the medical image management system 100 enables the user to identify physiological conditions of some portions of the anatomy and mark them as illustrated in FIG. 6. As illustrated in FIG. 6, specifically in a medical image 600, the user provides different marks for a portion having the label 'A'. The different marks may include, but are not limited to, critical and severe. These marks are defined by the user. In a scenario these marks may be part of a menu that may be presented when a right click function is performed by the user using the cursor. The user can type any text (shown in FIG. 6) that is desired by the user to define a mark. If the physiological condition of the portion having the label 'A' is critical, then the user can provide a mark 602 indicating critical. Similarly a mark 604 indicating 'severe' can also be given by the user. These markings help the user (such as a medical expert) to conveniently identify the portions that are in a critical or severe physiological condition. In an embodiment the user can perform a right click to present a menu including multiple marks associated with different physiological conditions. The user can select a mark from the menu and the selected mark is appended to the portion. Further, in another embodiment, a menu option 606, i.e. 'type text', enables the user to create a mark indicating the physiological condition of the portion by typing some text. In an embodiment the medical image management system 100 enables the user to provide or append a mark on a portion of the anatomy indicating a physiological condition of the portion. For example a mark 608 may indicate condition A for the portion as shown in a medical image 610. The condition A may be for an injured portion of the anatomy. The user may right click using a cursor, resulting in the presentation of a menu option indicating different conditions such as the condition A.
The cursor can be used to select the condition A for appending a mark on the portion of the anatomy indicating this condition.
[0037] Similarly, multiple marks may be provided on different portions of the anatomy representing different physiological conditions. As illustrated in FIG. 7, a mark 700 indicating condition B may be appended to a different portion of the anatomy. Multiple medical images, for instance the medical image 610 and a medical image including the mark 700, may be stored in the image repository 206. The medical image management system 100 enables the user to identify a medical image based on a physiological condition at a portion of the anatomy. A user input 702 may be received at the medical image management system 100 to identify a medical image. The user input 702 may include one or more keywords indicating one or more physiological conditions. In an instance the user input 702 may include one or more keywords associated with the anatomy or a desired portion of the anatomy. Thus it may be envisioned that a user input for retrieving medical images may include multiple keywords in different combinations decided by the user. For example a user input may be a combination of keywords associated with a physiological condition and an anatomy. Another user input may be a combination of keywords associated with a physiological condition and a desired portion of the anatomy. In still another scenario a user input may be a combination of keywords associated with a physiological condition and a name of a patient or a subject. Based on the user input 702 for identifying a medical image depicting the condition B, the medical image management system 100 retrieves the medical image 610 from the image repository 206 and presents it to the user. The medical image 610 including the mark 700 (indicating "condition B") is retrieved intuitively based on the user input at any instance from the medical image management system 100.
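The mark-based retrieval of FIG. 7 amounts to a keyword search over the marks attached to each stored image. The repository contents and image ids below are illustrative assumptions for the sketch:

```python
# Sketch of the mark-based retrieval of FIG. 7: each stored image carries its
# user marks; a keyword query returns the ids of images whose marks match.
def search_by_mark(images, query):
    """images: id -> list of mark strings; query: keyword like 'condition B'."""
    q = query.lower()
    return [img_id for img_id, marks in images.items()
            if any(q in mark.lower() for mark in marks)]

repository = {
    "610": ["condition A"],
    "612": ["condition B"],
    "614": ["critical", "condition A"],
}

assert search_by_mark(repository, "condition B") == ["612"]
```

A combined query (condition plus anatomy, or condition plus patient name, as the paragraph describes) would simply intersect the results of several such keyword lookups.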
[0038] The medical image management system 100 also enables a user to search for medical images based on a degree of a portion of a desired point of interest in the anatomy. FIG. 8 illustrates searching and retrieving one or more medical images based on a degree of a portion of a desired point of interest in the anatomy in accordance with an exemplary embodiment. The image repository 206 includes multiple images such as a medical image 800, a medical image 802, a medical image 804, a medical image 806, a medical image 808, a medical image 810, a medical image 812 and a medical image 814. These medical images, as shown, are images of a head of a subject taken at different angles and/or positions. The images taken may cover or focus on different portions of an anatomy of the head. Each image may present different levels of anatomical information of the head. For instance, if a user input includes 25 percent of the eye and 50 percent of the pons, then the medical image 810 is presented to the user by the medical image management system 100. The 25 percent eye represents a degree of a portion of a desired POI, i.e. the eye, that is to be presented. The 50 percent pons represents a degree of a portion of a desired POI, i.e. the pons, that is to be presented. The user input is used to identify the medical image 810. The medical image 810 is also compared against the medical image template 302 to confirm whether the medical image 810 is the correct image for the user input received. The medical image template 302 acts as a reference image for deciding the medical image 810. The medical image 810 as shown in FIG. 8 presents 25 percent of the eye and 50 percent of the pons. Further, another user input includes 25 percent pons and 50 percent cerebellum. The medical image management system 100 retrieves the medical image 814 and compares it with the medical image template to confirm whether the medical image 814 is the correct image. The medical image 814 shows 25 percent pons and 50 percent cerebellum.
These medical images are retrieved and reviewed by the user to examine physiological condition of different anatomical portions of the head. The medical image management system 100 is also configured to identify and retrieve one or more medical images from the image repository 206 storing the medical image 800, the medical image 802, the medical image 804, the medical image 806, the medical image 808, the medical image 810, the medical image 812 and the medical image 814 as illustrated in FIG. 9. A user input may include a desired anatomy or POI in the anatomy that the user wants to review. Based on the user input the medical image 804, the medical image 806 and the medical image 814 are retrieved and presented to the user. All medical images are compared with the medical image template 302 to identify the medical image 804, the medical image 806 and the medical image 814.
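The degree-of-portion search of FIG. 8 can be modeled by recording, per stored image, what fraction of each point of interest it shows, and matching a query that states the required coverage. The coverage numbers below follow the 25-percent-eye / 50-percent-pons example; the data layout is an assumption for the sketch:

```python
# Sketch of the FIG. 8 retrieval: each stored image records what fraction of
# each point of interest it shows; a query asks for minimum coverage per POI.
def match_degree(images, wanted):
    """Return ids of images whose coverage meets every requested degree.
    images: id -> {poi: percent shown}; wanted: {poi: required percent}."""
    return [img_id for img_id, coverage in images.items()
            if all(coverage.get(poi, 0) >= pct for poi, pct in wanted.items())]

repository = {
    "808": {"eye": 10, "pons": 50},
    "810": {"eye": 25, "pons": 50},
    "814": {"pons": 25, "cerebellum": 50},
}

assert match_degree(repository, {"eye": 25, "pons": 50}) == ["810"]
```

The confirmation step against the medical image template 302 described above would then verify, against the reference anatomy, that the selected image really does cover the requested fractions.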
[0039] Turning now to FIG. 10, which illustrates comparing medical images of a subject captured at different stages in accordance with an exemplary embodiment. Medical images 1000, 1002, 1004, 1006, 1008, 1010, 1012 and 1014 may be associated with a subject and taken in the fifth week. Thereafter, medical images 1016, 1018, 1020, 1022, 1024, 1026, 1028 and 1030 may be associated with the subject and taken in the fifteenth week. A user input can be submitted to the medical image management system 100 for comparing medical images captured at different stages, for instance a medical image captured in the fifth week and a medical image captured in the fifteenth week. Considering an example, the medical image 1012, having a critical "condition A" marked, may be compared with the medical image 1028, also having the critical "condition A" marked. The medical image 1012 and the medical image 1028 are captured at the same orientation or angle but at different stages. The two medical images are compared to determine the changes that occurred from the fifth week to the fifteenth week. The changes may be an improvement in the physiological condition of the anatomy of the subject. In another embodiment a medical image of a subject may be compared with a medical image of another subject. In this case the subjects may have a similar physiological condition associated with their anatomies.
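One simple way to realize the stage-to-stage comparison above, assuming the two images are already co-registered (captured at the same orientation, as the paragraph states) and represented as arrays, is a pixel-wise change map. The change threshold and the "fraction changed" indicator are illustrative choices, not taken from the patent.

```python
import numpy as np

def change_map(earlier, later, threshold=0.1):
    """Pixel-wise change between two co-registered medical images of the
    same anatomy captured at different stages (e.g. week 5 vs. week 15).
    Returns a boolean mask of pixels whose intensity changed by more than
    `threshold`, plus the fraction of changed pixels as a crude
    progression indicator."""
    diff = np.abs(later.astype(float) - earlier.astype(float))
    mask = diff > threshold
    return mask, float(mask.mean())
```

For the example in the text, passing the week-5 image 1012 as `earlier` and the week-15 image 1028 as `later` would highlight the region around "condition A" where intensities have changed.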
[0040] FIG. 11 illustrates comparing medical images of one or more subjects captured using different imaging techniques in accordance with an exemplary embodiment. As illustrated, a medical image 1100, a medical image 1102 and a medical image 1104 may be captured using different imaging techniques, such as a PET image, a CT image and an MRI image. These medical images may be associated with a subject and the same anatomical location. The medical images are compared against each other to gather more details on a physiological condition of the anatomical location. In another scenario, the medical image 1100, the medical image 1102 and the medical image 1104 may be associated with different subjects having an identical physiological condition, such as a head injury at the same anatomical location in the head. This comparison is performed to determine whether the physiological conditions of multiple subjects are the same and whether a medication given to one subject can be given to another subject. The different subjects selected for comparison may have a physiological condition similar to that of the subject. This comparison may assist the technician or doctor in analyzing the physiological condition of the subject more effectively, owing to the presence of a history of medical images pertaining to the same physiological condition. A selected different subject may have been treated by a first doctor, and thus a second doctor analyzing the subject can also refer to the analysis of the first doctor due to the similarity in the physiological conditions.
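To view images of the same anatomical location from different techniques (e.g. the PET, CT and MRI images 1100, 1102 and 1104) side by side, one common presentation is a weighted composite of co-registered images. The patent only says the images "are compared"; the equal-weight blend below is an assumed visualization, not the claimed comparison method.

```python
import numpy as np

def fuse_modalities(images, weights=None):
    """Combine co-registered images of the same anatomical location
    acquired with different imaging techniques into one weighted
    composite for review. Defaults to equal weights."""
    stack = np.stack([img.astype(float) for img in images])
    if weights is None:
        weights = np.full(len(images), 1.0 / len(images))
    # Contract the weight vector against the modality axis of the stack.
    return np.tensordot(np.asarray(weights), stack, axes=1)
```

Non-uniform weights (e.g. emphasizing the PET image when reviewing metabolic activity) are a one-line change via the `weights` argument.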
[0041] Turning now to FIG. 12, which illustrates a method 1200 for managing one or more medical images in a medical image management system in accordance with an embodiment. The medical image management system receives one or more medical images of a subject at block 1202. The one or more medical images are correlated with a medical image template at block 1204. The medical image template may be an image of an anatomy, for example a head of a patient showing internal organs such as the nasal tract, brain, skull and so on. In an embodiment the medical image template may be associated with anatomies in a normal physiological condition. Multiple medical image templates may be stored in a memory of the medical image management system. Nomenclature associated with each part of the anatomy may be appended in the medical image template; that is, a plurality of labels indicating nomenclatures may be appended in each medical image template. Considering a head of the patient, parts of the head (i.e. the anatomy) such as the pituitary, spinal cord and cerebellum (i.e. the nomenclatures) are marked in a medical image template of the head. Thus separate labels including or indicating the pituitary, spinal cord and cerebellum are appended onto the medical image template. Similarly, a plurality of labels is appended to the medical image templates of different anatomical parts of a subject. These medical image templates, along with the plurality of labels, are stored in the memory. Based on the correlation between the one or more medical images and the medical image template, the one or more medical images are presented with anatomy information at block 1206. The plurality of labels present in the medical image template is appended onto the one or more medical images to show the anatomy information.
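Blocks 1204–1206 can be sketched as follows. The template record format (labels stored with normalized anchor positions inside the template image) and the label names are hypothetical; the sketch only illustrates how labels appended to a stored template could be mapped onto a received medical image once the correlation step has matched image to template.

```python
# Hypothetical template store: each label carries a normalized (x, y)
# anchor position within the template image.
TEMPLATES = {
    "head": {"labels": [("pituitary", (0.50, 0.55)),
                        ("cerebellum", (0.45, 0.75)),
                        ("spinal cord", (0.50, 0.95))]},
}

def present_with_anatomy(image_shape, template_name):
    """Blocks 1204-1206 sketched: after a medical image is correlated
    with a stored template, the template's labels are placed on the
    image by scaling their normalized anchors to the image's pixel
    grid, yielding (label, (col, row)) pairs for display."""
    height, width = image_shape
    template = TEMPLATES[template_name]
    return [(name, (round(x * width), round(y * height)))
            for name, (x, y) in template["labels"]]
```

For a 100×200-pixel image correlated with the "head" template, this returns the three labels with pixel coordinates scaled to that image, which a presentation module could then draw as overlays.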
[0042] The various embodiments and/or components, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor. As used herein, the term "computer" or "module" may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above
examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term "computer". The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
[0043] The methods described in conjunction with the figures can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM), or any other computer readable storage medium or storage media. Although the method of projecting images onto one or more walls and a ceiling using in-built image projecting units in a medical imaging apparatus is explained with reference to the flow chart of the figures, other methods of implementing the method can be employed. For example, the order of execution of the method steps may be changed, and/or some of the method steps described may be changed, eliminated, divided or combined. Further, the method steps may be executed sequentially or simultaneously for projecting images onto one or more walls and a ceiling using in-built image projecting units in a medical imaging apparatus.
[0044] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
We Claim:
1. A medical image management system comprising:
an image decoding module configured to:
receive at least one medical image of a subject; correlate the at least one medical image with a medical image template; and
an image presentation module configured to present the at least one medical image with anatomy information based on the correlation between the at least one medical image and the medical image template.
2. The medical image management system of claim 1, wherein the image decoding
module comprises:
a template module configured to:
store a plurality of medical image templates; and
append a plurality of labels in each medical image template, the plurality of labels being associated with an anatomy in each medical image template; and
a labeling module configured to insert at least one label associated with anatomy in a medical image of the at least one medical image based on a plurality of labels in a medical image template corresponding to the medical image.
3. The medical image management system of claim 2, further comprising an image
recognition module configured to:
map the anatomy in the medical image with anatomy in at least one medical image template; and
identify labels in the at least one medical image template correlating with the anatomy in the medical image.
4. The medical image management system of claim 3, wherein the image recognition module is further configured to:
process user input for identifying a desired point of interest in the medical image; and
present labels associated with the desired point of interest based on at least one medical image template corresponding to the medical image.
5. The medical image management system of claim 4, wherein the image recognition module is further configured to vary a granularity of labels presented in the desired point of interest of the medical image.
6. The medical image management system of claim 4, wherein the image recognition module comprises an image search module configured to:
receive a user input for retrieving the medical image; and retrieve the medical image based on the user input.
7. The medical image management system of claim 6, wherein the user input specifies a degree of portion of the desired point of interest to be viewed by a user, the image recognition module is further configured to retrieve a portion of the desired point of interest mapping with the degree of portion specified in the user input.
8. The medical image management system of claim 3, wherein the image decoding module further comprises a condition detection module configured to detect a physiological condition of at least one region in the anatomy of the medical image based on a mapping between the anatomy in the medical image and anatomy in the at least one medical image template.
9. The medical image management system of claim 3, wherein the labeling module is further configured to:
input at least one mark on the anatomy of the medical image based on user input, a mark of the at least one mark indicating a physiological state of a portion of the anatomy; and
modify the at least one label and the at least one mark based on user input.
10. The medical image management system of claim 9, wherein the image recognition module is further configured to:
identify a medical image from the at least one medical image based on a user input; and
retrieve a portion of an anatomy in the medical image associating with the mark.
11. The medical image management system of claim 1, wherein the image recognition module is further configured to compare at least one of:
at least one medical image of a subject with a plurality of medical images of at least one different subject;
at least one medical image of the subject with a plurality of medical images of the subject, wherein the plurality of medical images is associated with a plurality of physiological conditions of the subject;
at least one medical image of the subject with a plurality of medical images of the subject, wherein the plurality of medical images is associated with a plurality of imaging techniques; and
wherein the image presentation module is configured to present the at least one medical image of the subject based on the comparison.
12. A method of managing medical images for medical image retrieval, the method
comprising:
receiving at least one medical image of a subject;
correlating the at least one medical image with a medical image template; and
presenting the at least one medical image with anatomy information based on the correlation between the at least one medical image and the medical image template.
13. The method of claim 12 further comprises:
storing a plurality of medical image templates;
appending a plurality of labels in each medical image template, the plurality of labels is associated with an anatomy in each medical image template; and
inserting at least one label associated with anatomy in a medical image of the at least one medical image based on a plurality of labels in a medical image template corresponding to the medical image.
14. The method of claim 13 further comprises:
mapping the anatomy in the medical image with anatomy in at least one medical image template; and
identifying labels in the at least one medical image template correlating with the anatomy in the medical image.
15. The method of claim 14 further comprises:
processing user input for identifying a desired point of interest in the medical image, wherein the user input specifies a degree of portion of the desired point of interest to be viewed by a user; and
presenting labels associated with the desired point of interest based on at least one medical image template corresponding to the medical image, wherein the desired point of interest maps with the degree of portion specified in the user input.
16. The method of claim 14 further comprises detecting a physiological condition of at least one region in the anatomy of the medical image based on a mapping between
the anatomy in the medical image and anatomy in the at least one medical image template.
17. The method of claim 14 further comprises:
inputting at least one mark on the anatomy of the medical image based on user input, a mark of the at least one mark indicating a physiological state of a portion of the anatomy;
modifying the at least one label and the at least one mark based on user input;
identifying a medical image from the at least one medical image based on a user input; and
retrieving a portion of an anatomy in the medical image associating with the mark.
18. The method of claim 12, further comprises:
comparing at least one of:
at least one medical image of a subject with a plurality of medical images of at least one different subject;
at least one medical image of the subject with a plurality of medical images of the subject, wherein the plurality of medical images is associated with a plurality of physiological conditions of the subject; and
at least one medical image of the subject with a plurality of medical images of the subject, wherein the plurality of medical images is associated with a plurality of imaging techniques; and
presenting the at least one medical image of the subject based on the comparison.
| # | Name | Date |
|---|---|---|
| 1 | 2871-CHE-2013 FORM-18 28-06-2013.pdf | 2013-06-28 |
| 2 | 2871-CHE-2013-ASSIGNMENT WITH VERIFIED COPY [18-03-2025(online)].pdf | 2025-03-18 |
| 3 | 2871-CHE-2013-RELEVANT DOCUMENTS [28-09-2023(online)].pdf | 2023-09-28 |
| 4 | 2871-CHE-2013 FORM-5 28-06-2013.pdf | 2013-06-28 |
| 5 | 2871-CHE-2013-FORM-16 [18-03-2025(online)].pdf | 2025-03-18 |
| 6 | 2871-CHE-2013-RELEVANT DOCUMENTS [29-09-2022(online)].pdf | 2022-09-29 |
| 7 | 2871-CHE-2013 FORM-2 28-06-2013.pdf | 2013-06-28 |
| 8 | 2871-CHE-2013-POWER OF AUTHORITY [18-03-2025(online)].pdf | 2025-03-18 |
| 9 | 2871-CHE-2013-US(14)-HearingNotice-(HearingDate-18-12-2020).pdf | 2021-10-17 |
| 10 | 2871-CHE-2013-IntimationOfGrant24-06-2021.pdf | 2021-06-24 |
| 11 | 2871-CHE-2013 FORM-1 28-06-2013.pdf | 2013-06-28 |
| 12 | 2871-CHE-2013-PatentCertificate24-06-2021.pdf | 2021-06-24 |
| 13 | 2871-CHE-2013 DRAWINGS 28-06-2013.pdf | 2013-06-28 |
| 14 | 2871-CHE-2013-Annexure [18-12-2020(online)].pdf | 2020-12-18 |
| 15 | 2871-CHE-2013 DESCRIPTION (COMPLETE) 28-06-2013.pdf | 2013-06-28 |
| 16 | 2871-CHE-2013-Written submissions and relevant documents [18-12-2020(online)].pdf | 2020-12-18 |
| 17 | 2871-CHE-2013 CORRESPONDENCE OTHERS 28-06-2013.pdf | 2013-06-28 |
| 18 | 2871-CHE-2013 CLAIMS 28-06-2013.pdf | 2013-06-28 |
| 19 | 2871-CHE-2013-Correspondence to notify the Controller [20-11-2020(online)].pdf | 2020-11-20 |
| 20 | 2871-CHE-2013 ABSTRACT 28-06-2013.pdf | 2013-06-28 |
| 21 | Correspondence by Agent_Notarized Assignment_29-04-2019.pdf | 2019-04-29 |
| 22 | 2871-CHE-2013 FORM-1 30-08-2013.pdf | 2013-08-30 |
| 23 | 2871-che-2013-ABSTRACT [23-04-2019(online)].pdf | 2019-04-23 |
| 24 | 2871-CHE-2013 CORRESPONDENCE OTHERS 30-08-2013.pdf | 2013-08-30 |
| 25 | 2871-che-2013-CLAIMS [23-04-2019(online)].pdf | 2019-04-23 |
| 26 | 2871-che-2013-COMPLETE SPECIFICATION [23-04-2019(online)].pdf | 2019-04-23 |
| 27 | abstract2871-CHE-2013.jpg | 2014-06-27 |
| 28 | 2871-CHE-2013-FER.pdf | 2018-10-23 |
| 29 | 2871-che-2013-CORRESPONDENCE [23-04-2019(online)].pdf | 2019-04-23 |
| 30 | 2871-che-2013-DRAWING [23-04-2019(online)].pdf | 2019-04-23 |
| 31 | 2871-CHE-2013-FORM-26 [06-12-2018(online)].pdf | 2018-12-06 |
| 32 | 2871-che-2013-FER_SER_REPLY [23-04-2019(online)].pdf | 2019-04-23 |
| 33 | 2871-CHE-2013-PETITION UNDER RULE 137 [04-04-2019(online)].pdf | 2019-04-04 |
| 34 | 2871-che-2013-OTHERS [23-04-2019(online)].pdf | 2019-04-23 |
| 35 | 2871-CHE-2013-PETITION UNDER RULE 137 [04-04-2019(online)]-1.pdf | 2019-04-04 |
| 36 | Search_20-08-2018.pdf | |