
Medical Imaging System And Method For Measuring Medical Image

Abstract: A medical imaging system for measuring a medical image is disclosed. The medical imaging system comprises a display unit for presenting one or more images of an object. An image capturing unit is configured to capture multiple gestures made by a user at a distance from the display unit. An image measurement unit, communicably coupled to the image capturing unit, is configured to generate measurement events based on one or more gestures of the multiple gestures. Based on the measurement events, the image measurement unit performs measurement of the object. The measurement events correspond to steps associated with measurement of the object. FIG. 2


Patent Information

Application #:
Filing Date: 28 December 2012
Publication Number: 19/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-08-31
Renewal Date:

Applicants

GENERAL ELECTRIC COMPANY
1 RIVER ROAD, SCHENECTADY, NEW YORK 12345

Inventors

1. KRISHNA KOMMU, MOHAN
JOHN F WELCH TECHNOLOGY CENTRE, WHITEFIELD ROAD, HOODI VILLAGE, WHITEFIELD ROAD
2. JOSHI, BHUVNESH
JOHN F WELCH TECHNOLOGY CENTRE, WHITEFIELD ROAD, HOODI VILLAGE, WHITEFIELD ROAD

Specification

MEDICAL IMAGING SYSTEM AND METHOD FOR MEASURING MEDICAL IMAGE

TECHNICAL FIELD

[0001] The subject matter disclosed herein relates to a medical imaging system. More specifically, the subject matter relates to a system and a method for measuring a medical image in the medical imaging system.

BACKGROUND OF THE INVENTION

[0002] Medical imaging systems are used in different applications to generate images of different regions or areas (e.g. different organs) of patients or other objects. Different types of medical imaging systems are available, including, for example, X-ray, computed tomography, single photon emission computed tomography, and magnetic resonance imaging systems. The medical imaging system generates medical images of the organs of the patient or other objects, and measurements need to be taken on these medical images. For example, a fetal image may be obtained and measured to determine the fetal length. The measurement of the fetal length is critical because it is a parameter that indicates the growth rate and health state of the fetus. This is done by selecting two points on the fetal image using an input device, such as a mouse or any other input device, and then measuring the fetal length. Various other techniques may also be used, such as creating shapes (for example a circle or a parabola) on the medical image and measuring the length between two points in the medical image. However, all these techniques can become cumbersome for an end user of the medical imaging system because the user may have to handle many processes or sub-devices of the medical imaging system for capturing the medical images.

[0003] Thus there is a need for a system that facilitates measurement of the fetal image in a manner more convenient for the end user of a medical imaging system.

BRIEF DESCRIPTION OF THE INVENTION

[0004] The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.

[0005] As discussed in detail below, embodiments of the invention include a medical imaging system for measuring a medical image. The medical imaging system comprises a display unit for presenting one or more images of an object. An image capturing unit is configured to capture multiple gestures made by a user at a distance from the display unit. An image measurement unit, communicably coupled to the image capturing unit, is configured to generate measurement events based on one or more gestures of the multiple gestures. Based on the measurement events, the image measurement unit performs measurement of the object. The measurement events correspond to steps associated with measurement of the object.

[0006] In another embodiment, a method of measuring a medical image in a medical imaging system is provided. The method includes presenting one or more medical images of an object on a display unit of the medical imaging system; capturing multiple gestures made by a user at a distance from the display unit; generating measurement events based on at least one gesture of the multiple gestures; and measuring the object based on the measurement events. The measurement events correspond to steps associated with measurement of the object.

BRIEF DESCRIPTION OF THE DRAWINGS

[0007] FIGURE 1 illustrates a medical imaging system for performing an imaging procedure on a subject;

[0008] FIGURE 2 is a schematic illustration of a medical imaging system for measuring a medical image in accordance with an embodiment;

[0009] FIGURE 3 is a schematic illustration of a medical imaging system capable of performing measurements on an object based on eye gestures in accordance with an exemplary embodiment;

[0010] FIGURE 4 is a schematic illustration of a medical imaging system capable of performing measurements on the object based on hand gestures in accordance with an exemplary embodiment;

[0011] FIGURE 5 is a schematic illustration of a medical imaging system capable of performing measurements on the object based on hand gestures using a finger cap worn on a finger of the user's hand in accordance with an exemplary embodiment; and

[0012] FIGURE 6 illustrates a flow diagram of a method for measuring a medical image in a medical imaging system in accordance with an embodiment.

DETAILED DESCRIPTION OF THE INVENTION

[0013] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.

[0014] To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. One or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be standalone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package,
and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.

[0015] As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising" or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.

[0016] A medical imaging system for measuring a medical image is disclosed. The medical imaging system comprises a display unit for presenting one or more images of an object. An image capturing unit is configured to capture multiple gestures made by a user at a distance from the display unit. An image measurement unit, communicably coupled to the image capturing unit, is configured to generate measurement events based on one or more gestures of the multiple gestures. Based on the measurement events, the image measurement unit performs measurement of the object. The measurement events correspond to steps associated with measurement of the object.

[0017] Various embodiments of the invention provide a medical imaging system 100 as shown in FIG. 1. The medical imaging system 100 may be any type of system that uses an acquisition component 102 for acquiring images. The medical imaging system 100 may be, for example, an ultrasound imaging system, an X-ray system, a computed tomography imaging system, a magnetic resonance imaging system, a single photon emission computed tomography system, an electrocardiography system, a positron emission tomography (PET) system or a multi-modal imaging system. However, the various embodiments are not limited to the medical imaging systems mentioned above or to imaging systems for imaging human subjects.

[0018] The medical imaging system 100 includes the acquisition component 102 configured to acquire image data (e.g., ultrasound image data). In the case of an ultrasound imaging system, the acquisition component 102 may be an ultrasound probe 102 for scanning or otherwise imaging an object or volume of interest. An "ultrasound probe" includes a plurality of transducers for sending a plurality of ultrasound signals to a region of interest on a human subject to acquire images. In another example the acquisition component 102 may be an array of "X-ray detectors" in a gantry of a computed tomography imaging system. In the case of a positron emission tomography system, the acquisition component 102 may be "an array of image detectors" in a PET scanner. The acquisition component 102 is operatively connected to an image processing component 104. The image processing component 104 is any type of image processor capable of processing image data acquired using the acquisition component 102. The image processing component 104 is also operatively coupled to a display component 106. The display component 106, which may be a controller, configures or formats the processed image data for display on a display screen 108. The display screen 108 may be any type of screen capable of displaying images, graphics, text, etc. For example, the display screen 108 may be a cathode ray tube (CRT) screen, a liquid crystal display (LCD) screen or a plasma screen, among others.
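The acquisition-processing-display chain of FIG. 1 can be sketched as three cooperating components. The class names and the trivial "processing" step below are illustrative assumptions, not details from the specification:

```python
class AcquisitionComponent:
    """Stand-in for the acquisition component 102 (e.g. an ultrasound probe)."""

    def acquire(self):
        # Dummy raw image data in place of real ultrasound echoes.
        return [[0.0, 0.5], [1.0, 0.5]]


class ImageProcessor:
    """Stand-in for the image processing component 104."""

    def process(self, raw):
        # Trivial 'processing': map normalized samples to 8-bit pixel values.
        return [[int(v * 255) for v in row] for row in raw]


class DisplayComponent:
    """Stand-in for the display component 106 formatting data for the screen 108."""

    def format_for_display(self, pixels):
        return "\n".join(" ".join(f"{p:3d}" for p in row) for row in pixels)


raw = AcquisitionComponent().acquire()
pixels = ImageProcessor().process(raw)
frame = DisplayComponent().format_for_display(pixels)
```

Any real system would replace each stage with hardware drivers and image-reconstruction code; the point is only the one-directional data flow from acquisition through processing to display.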

[0019] A processor 110 (e.g., computer) or other processing unit controls the various operations within the medical imaging system 100. For example, the processor 110 may receive user inputs from a user interface 112 and display requested image data or adjust the settings for the displayed image data. For example, a user may provide inputs or settings to change the image displayed or the display properties of the display screen 108.

[0020] FIG. 2 illustrates a medical imaging system 200 for measuring a medical image in accordance with an embodiment. The medical imaging system 200 may be a portable system, a mobile system, a handheld system or any other medical imaging system. The medical imaging system 200 may include an image acquisition unit, such as the acquisition component 102, for acquiring at least one image (i.e. a medical image) of the object. The at least one image is presented on a display unit 202. A user of the medical imaging system 200 may need to take measurements on the object. For instance, an image of a fetus (i.e. the object) may need to be measured to determine the fetal length. This can be performed by taking a measurement between two or more points on an image of the object. However, it may be contemplated that the measurements can be taken on an object in any other way or form other than measuring between multiple points on the object.

[0021] The user performs multiple gestures in front of the medical imaging system 200. These gestures are captured by an image capturing unit 204. The gestures may be performed using a body portion such as, but not limited to, a limb or an eye of the user. The gestures may include multiple predefined movements that can be performed using the body portion. This is further explained in detail in conjunction with FIG. 3, FIG. 4 and FIG. 5. The image capturing unit 204 may be a camera, such as a video camera, that captures the gestures of the user. In an embodiment the captured gestures may be stored in a memory 206. An image measurement unit 208 analyzes these gestures to generate one or more measurement events for measuring the image. During analysis, each gesture may be associated with one or more measurement events. The measurement events correspond to steps involved in performing measurements on the object. The one or more measurement events may be associated with a predefined movement in a gesture. The measurement on the object may be performed, for instance, by determining a length between two points on the image. The measurement event may be, for example, drawing a line between two points on the image, drawing a circle covering multiple points or two points on the image, measuring a distance between the two points, or registering, confirming or selecting a point. It may be envisioned that other measurement events may be utilized for performing measurement on the object and are within the scope of this disclosure. The measurement events may be selected from a set of measurement events stored in the memory 206.
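The association between predefined movements and measurement events described above can be sketched as a simple lookup table. The movement and event names below are illustrative assumptions, not terminology from the specification:

```python
# Hypothetical mapping table relating predefined movements (as detected
# from captured gestures) to the measurement events they trigger.
MEASUREMENT_EVENT_MAP = {
    "blink": ["register_point"],
    "gaze_dwell": ["register_point"],
    "point_finger_dwell": ["register_point"],
    "move_along_line": ["draw_line", "measure_distance"],
}


def events_for_movement(movement: str) -> list[str]:
    """Return the measurement events associated with a predefined movement.

    Unrecognized movements map to no events, so stray motions are ignored.
    """
    return MEASUREMENT_EVENT_MAP.get(movement, [])
```

A table like this could live in the memory 206, with the image measurement unit 208 performing the lookup for each detected movement.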

[0022] Based on the selected measurement events, the image measurement unit 208 performs measurements on the object. For instance, a measurement event may be drawing a line between two points on a fetus to measure a femur diaphysis length. The measurements taken are displayed on the display unit 202 in the form of, for example, a numerical value in millimeters. However, it may be contemplated that the measurements performed and the measured values may be presented in any other form.

[0023] FIG. 3 illustrates a medical imaging system 300 capable of performing measurements on an object 302 based on eye gestures in accordance with an exemplary embodiment. The medical imaging system 300 presents the object 302 through a display unit 304 (i.e. user interface). The display unit 304 may be a touch sensitive display screen or a cathode ray tube (CRT) screen. A user operating the medical imaging system 300 uses the user's eye 306 to perform multiple gestures. It may be noted that the eye 306 may be a single eye or two eyes of the user; however, it is described as a single eye for ease of description. Considering an example, if a distance between two points, i.e. a point 308 and a point 310, needs to be measured, then initially these points are determined and finalized. The user may focus or gaze at the point 308 and perform a gesture for confirming the point 308. The gesture performed may be, for example but not limited to, blinking the eye 306, gazing at the point 308 for a predefined time, moving the eye 306 closer to the display unit 304, or moving the eye 306 away from the display unit 304. A camera 312 may capture the gesture of the eye 306. The camera 312 may be positioned above the display unit 304 as shown in FIG. 3 or may be positioned at any other location on the medical imaging system 300 for conveniently capturing the gestures of the user. The gesture is analyzed to detect predefined movements such as, for example, blinking, gazing, and moving the eye 306 closer to or away from the display unit 304. The predefined movements may be stored in a memory (for example the memory 206) of the medical imaging system 300. Each predefined movement may have one or more measurement events for performing the measurement on the object. In an embodiment a mapping table may be stored in the memory that represents a relationship between each predefined movement and the one or more measurement events. The predefined movement in the gesture, for example gazing at the point 308 for a predefined time, is associated with a measurement event, i.e. registering or confirming the point 308 as the first point. The point 308 may be registered and stored in the medical imaging system 300. The user then shifts focus or gaze to the point 310 for a predefined time to confirm the selection. Thus the point 310 is selected as the second point. The medical imaging system 300 may measure the length between these points. This may be accomplished by performing one or more measurement events such as drawing a line between the two points (i.e. the point 308 and the point 310) on the image, measuring a distance between the two points, or drawing a circle covering the two points on the image. The distance between the point 308 and the point 310 is represented as 'L' in FIG. 3. The distance 'L' may have a numerical value and may be displayed on the display unit 304 in various forms.
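The dwell-based confirmation described above (gazing at a point for a predefined time to register it) can be sketched as follows. The gaze-sample format, dwell time, and tolerance radius are illustrative assumptions:

```python
import math
from dataclasses import dataclass


@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # gaze x-coordinate on the display (pixels)
    y: float  # gaze y-coordinate on the display (pixels)


def dwell_point(samples, dwell_time=1.0, radius=10.0):
    """Return (x, y) of the first point the gaze dwells on, else None.

    A point is confirmed when consecutive samples stay within `radius`
    pixels of the start of the run for at least `dwell_time` seconds.
    """
    run_start = 0
    for i, s in enumerate(samples):
        anchor = samples[run_start]
        if math.hypot(s.x - anchor.x, s.y - anchor.y) > radius:
            run_start = i  # gaze moved away; restart the dwell run
        elif s.t - anchor.t >= dwell_time:
            return (anchor.x, anchor.y)
    return None
```

Running the same detector again after the first point is registered would yield the second point, after which the distance measurement event can be executed.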

[0024] Turning now to FIG. 4, which illustrates the medical imaging system 300 capable of performing measurements on the object 302 based on hand gestures in accordance with an exemplary embodiment. As explained in conjunction with FIG. 3, the distance between two points such as the point 308 and the point 310 may need to be measured. The user may use the user's hand 400 to make gestures for performing the measurements on the object 302. The user may point a finger 402 at the point 308 for a predefined time, which is considered as a gesture. The camera 312 captures this gesture and detects the pointing of the finger 402 as a predefined movement. This predefined movement, performed for the predefined time, is associated with a measurement event such as registering or confirming the point 308. Similarly the point 310 may also be selected or registered based on a similar gesture using the hand 400. It may be noted that other kinds of gestures may be performed using the hand 400, for instance moving the finger 402 closer to the points (i.e. the point 308 and the point 310), moving the finger 402 away from the points, rotating the finger 402, or waving the hand 400, and these gestures are within the scope of this disclosure. Once the point 308 and the point 310 are registered, a measurement event such as measuring the distance between these points is performed. The distance between the points may be represented by 'L' and a numerical value for the distance may be displayed on the display unit 304.

[0025] In another instance the finger 402 may be covered with a finger cap 500 and used for performing gestures before the medical imaging system 300, as illustrated in FIG. 5 in accordance with an embodiment. As shown in FIG. 5, the finger 402 along with the finger cap 500 may be used by the user to perform gestures. The finger cap 500 may be held or positioned in front of the medical imaging system 300 so as to select or confirm the point 308 and the point 310. The camera 312 may be configured to capture the gestures and distinguish the finger cap 500 from the hand 400 and the finger 402. In an embodiment the camera 312 may be able to distinguish the finger cap 500 based on its color. The finger cap 500 may have different colors. The user may position the finger cap 500 to focus on the point 308 and then rotate the finger 402 along with the finger cap 500. The camera 312 detects the rotational movement of the finger cap 500 and the medical imaging system 300 confirms this as a gesture. The predefined movement associated with the gesture is a rotational movement of the finger cap 500, which indicates selection of the point 308 by the user. In a similar way the user uses the finger 402 and the finger cap 500 for selecting the point 310. In order to find a distance between the two points the user may move the finger 402 along a line joining the point 308 and the point 310. The movement of the finger 402 may be identified by the camera 312 by detecting the movement of the finger cap 500. The medical imaging system 300 analyzes this gesture and executes a measurement event such as drawing a line joining the point 308 and the point 310, which is displayed on the display unit 304. The distance between the two points is then measured and presented to the user. It may be noted that the distance between the two points on the object (such as a fetus) as displayed on the display unit 304 may not be an exact length; the distance or the length may be appropriately adjusted based on a predefined scale to represent the exact length or distance.
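A minimal sketch of distinguishing the finger cap 500 by its color, assuming (both assumptions, not details from the specification) that the cap is a bright distinctive color and that frames arrive as rows of RGB tuples:

```python
def is_finger_cap_pixel(rgb, cap_rgb=(220, 40, 40), tol=60):
    """Return True if an RGB pixel is close enough to the cap's color.

    cap_rgb is a hypothetical bright-red cap; tol is a per-channel tolerance.
    """
    return all(abs(c - ref) <= tol for c, ref in zip(rgb, cap_rgb))


def cap_centroid(frame, **kwargs):
    """Locate the finger cap in a frame (a list of rows of RGB tuples)
    by averaging the coordinates of matching pixels; None if not found."""
    hits = [(x, y)
            for y, row in enumerate(frame)
            for x, px in enumerate(row)
            if is_finger_cap_pixel(px, **kwargs)]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)
```

Tracking the centroid from frame to frame would give the cap's trajectory, from which rotational or linear movements could be classified as predefined movements.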

[0026] In another embodiment a point on the finger cap 500 may be used by the camera 312 to determine whether the user is performing a gesture. The user may position the finger cap 500 to focus on the point 308 and then rotate the finger 402 along with the finger cap 500. The camera 312 tracks movement of the point on the finger cap 500 to detect the rotational movement of the finger cap 500. The point on the finger cap 500 may be a dot having a color. The medical imaging system 300 identifies that the point on the finger cap 500 is aligned to the point 308 and its movement indicates the selection of the point 308. A predefined movement in this case may be a rotational movement of the point on the finger cap 500 with respect to the point 308. The user may use the same
technique to select the point 310. To find a distance between the two points the user may move the finger 402 in a direction similar to a line joining the point 308 and the point 310. The movement of the point on the finger cap 500 may be identified by the camera 312. The medical imaging system 300 analyzes this gesture and executes a measurement event such as drawing a line joining the point 308 and the point 310 which is displayed in the display unit 304. The distance is then measured between the two points and presented to the user.
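The scale adjustment mentioned in paragraph [0025] (converting an on-screen distance into the represented physical length) can be sketched as a single conversion; the `mm_per_pixel` calibration value is a hypothetical input that a real system would derive from its imaging geometry:

```python
import math


def measure_length_mm(p1, p2, mm_per_pixel):
    """Convert the on-screen distance between two registered points
    into a physical length using a predefined scale (mm per pixel)."""
    pixels = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return pixels * mm_per_pixel
```

The returned value corresponds to the numerical distance 'L' presented on the display unit 304.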

[0027] Referring to FIG. 3, FIG. 4 and FIG. 5, various embodiments of performing measurements on the object by a medical imaging system based on gestures are described. These embodiments are merely exemplary, and hence other forms of gestures performed using a body portion of the user are within the scope of this disclosure.

[0028] FIG. 6 illustrates a flow diagram of a method 600 for measuring a medical image in a medical imaging system in accordance with an embodiment. The medical imaging system is used to capture one or more medical images of an object. The medical imaging system is operated by a user, for example a technician or a medical expert. The object may be an anatomy of a subject (i.e. a human). The medical images are presented on a display unit of the medical imaging system at step 602. Measurements need to be taken on the medical images, and thus the user may perform multiple gestures in front of the medical imaging system. The gestures may be performed at a distance from the display unit. A camera in the medical imaging system captures the gestures at step 604. The gestures are performed using a body portion such as an eye or a limb of the user. The gestures may include multiple predefined movements that can be performed using the body portion. The medical imaging system analyzes these gestures to generate one or more measurement events for measuring the image at step 606. Each gesture may be associated with one or more measurement events. The measurement events correspond to steps involved in performing measurements on the object. The measurement events may be associated with a predefined movement in a gesture. The gesture is analyzed to detect predefined movements, which may be, for example, blinking, gazing, and moving the user's eye closer to or away from the display unit. Based on the generated measurement events the medical imaging system performs measurements on the object. The object is measured, based on the measurement events that correspond to steps associated with measurement of the object, at step 608. The measurement event may be, for example, drawing a line between two points on the image, drawing a circle covering multiple points or two points on the image, measuring a distance between the two points, or registering, confirming or selecting a point.
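The four steps of method 600 can be sketched as one pipeline function. The callable parameters below are hypothetical stand-ins for the display unit, the image capturing unit, and the image measurement unit:

```python
def measure_medical_image(present, capture_gestures, events_for, measure):
    """Sketch of method 600: present the image (step 602), capture gestures
    (step 604), generate measurement events (step 606), and measure the
    object (step 608). Each argument is a stand-in callable."""
    present()                             # step 602: show the medical image
    gestures = capture_gestures()         # step 604: capture user gestures
    events = []
    for g in gestures:                    # step 606: gestures -> events
        events.extend(events_for(g))
    return measure(events)                # step 608: measure the object
```

Passing in real components (camera driver, gesture classifier, measurement engine) rather than lambdas would turn the sketch into the flow of FIG. 6.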

[0029] The various embodiments and/or components, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.

[0030] As used herein, the term "computer" or "module" may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term "computer".

[0031] The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.

[0032] The methods described in conjunction with FIG. 6 can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM), or any other computer readable storage medium or storage media. Although the method for measuring a medical image in a medical imaging system is explained with reference to the flow chart of FIG. 6, other ways of implementing the method can be employed. For example, the order of execution of the method steps may be changed, and/or some of the method steps described may be changed, eliminated, divided or combined. Further, the method steps may be executed sequentially or simultaneously for measuring a medical image in a medical imaging system.

[0033] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

We Claim:

1. A medical imaging system comprising:

a display unit for presenting at least one image of an object;

an image capturing unit configured to capture a plurality of gestures made by a user at a distance from the display unit; and

an image measurement unit communicably coupled to the image capturing unit, wherein the image measurement unit is configured to:

generate measurement events based on at least one gesture of the plurality of gestures;

perform measurement on the object based on the measurement events,

wherein the measurement events correspond to steps associated with measurement on the object.

2. The medical imaging system of claim 1, wherein the at least one image of the object comprises at least one medical image of an anatomy.

3. The medical imaging system of claim 1, wherein the image capturing unit is a camera.

4. The medical imaging system of claim 1, wherein the plurality of gestures is associated with movement of a body portion of a user.

5. The medical imaging system of claim 4, wherein the image measurement unit is further configured to generate the measurement events by:

detecting predefined movements of the body portion performed by the user from the at least one gesture; and

determining the measurement events associated with the predefined movements from a set of measurement events.

6. The medical imaging system of claim 4, wherein the body portion is one of an eye and a limb of the user.

7. A method of measuring a medical image in a medical imaging system, the method comprising:

presenting at least one medical image of an object in a display unit of the medical imaging system;

capturing a plurality of gestures made by a user at a distance from the display unit;

generating measurement events based on at least one gesture of the plurality of gestures; and

measuring the object based on the measurement events, wherein the measurement events correspond to steps associated with measurement of the object.

8. The method of claim 7, wherein the plurality of gestures is associated with movement of a body portion of a user.

9. The method of claim 8, wherein generating the measurement events comprises:

detecting predefined movements of the body portion performed by the user from the at least one gesture; and

determining the measurement events associated with the predefined movements from a set of measurement events.

10. The method of claim 8, wherein the body portion is one of an eye and a limb of the user.

Documents

Application Documents

# Name Date
1 5486-CHE-2012 FORM-5 28-12-2012.pdf 2012-12-28
2 5486-CHE-2012 FORM-2 28-12-2012.pdf 2012-12-28
3 5486-CHE-2012 FORM-18 28-12-2012..pdf 2012-12-28
4 5486-CHE-2012 FORM-1 28-12-2012.pdf 2012-12-28
5 5486-CHE-2012 DRAWINGS 28-12-2012.pdf 2012-12-28
6 5486-CHE-2012 CORRESPONDENCE OTHERS 28-12-2012.pdf 2012-12-28
7 5486-CHE-2012 DESCRIPTION (COMPLETE) 28-12-2012.pdf 2012-12-28
8 5486-CHE-2012 CLAIMS 28-12-2012.pdf 2012-12-28
9 5486-CHE-2012 ABSTRACT 28-12-2012.pdf 2012-12-28
10 5486-CHE-2012 CORRESPONDENCE OTHERS 06-03-2013.pdf 2013-03-06
11 5486-CHE-2012 FORM-1 06-03-2013.pdf 2013-03-06
12 abstract5486-CHE-2012.jpg 2014-05-15
13 5486-CHE-2012 FORM-3 30-06-2014.pdf 2014-06-30
14 5486-CHE-2012 CORRESPONDENCE OTHERS 30-06-2014.pdf 2014-06-30
15 5486-CHE-2012-Form 3-210915.pdf 2015-12-01
16 5486-CHE-2012-Correspondence-210915.pdf 2018-03-16
17 5486-CHE-2012-FER.pdf 2018-10-16
18 5486-CHE-2012-FORM-26 [06-12-2018(online)].pdf 2018-12-06
19 5486-CHE-2012-PETITION UNDER RULE 137 [04-04-2019(online)].pdf 2019-04-04
20 5486-che-2012-ABSTRACT [15-04-2019(online)].pdf 2019-04-15
21 5486-che-2012-CLAIMS [15-04-2019(online)].pdf 2019-04-15
22 5486-che-2012-COMPLETE SPECIFICATION [15-04-2019(online)].pdf 2019-04-15
23 5486-che-2012-CORRESPONDENCE [15-04-2019(online)].pdf 2019-04-15
24 5486-che-2012-DRAWING [15-04-2019(online)].pdf 2019-04-15
25 5486-che-2012-FER_SER_REPLY [15-04-2019(online)].pdf 2019-04-15
26 5486-che-2012-OTHERS [15-04-2019(online)].pdf 2019-04-15
27 5486-CHE-2012-FORM 3 [16-04-2019(online)].pdf 2019-04-16
28 5486-CHE-2012-FORM-26 [16-04-2019(online)].pdf 2019-04-16
29 5486-CHE-2012-IntimationOfGrant31-08-2022.pdf 2022-08-31
30 5486-CHE-2012-PatentCertificate31-08-2022.pdf 2022-08-31
31 5486-CHE-2012-ASSIGNMENT WITH VERIFIED COPY [18-03-2025(online)].pdf 2025-03-18
32 5486-CHE-2012-FORM-16 [18-03-2025(online)].pdf 2025-03-18
33 5486-CHE-2012-POWER OF AUTHORITY [18-03-2025(online)].pdf 2025-03-18

Search Strategy

1 5486CHE2012searchstrategy_05-10-2018.pdf

ERegister / Renewals

3rd: 18 Apr 2023 (covers 28/12/2014 to 28/12/2015)
4th: 18 Apr 2023 (covers 28/12/2015 to 28/12/2016)
5th: 18 Apr 2023 (covers 28/12/2016 to 28/12/2017)
6th: 18 Apr 2023 (covers 28/12/2017 to 28/12/2018)
7th: 18 Apr 2023 (covers 28/12/2018 to 28/12/2019)
8th: 18 Apr 2023 (covers 28/12/2019 to 28/12/2020)
9th: 18 Apr 2023 (covers 28/12/2020 to 28/12/2021)
10th: 18 Apr 2023 (covers 28/12/2021 to 28/12/2022)
11th: 18 Apr 2023 (covers 28/12/2022 to 28/12/2023)
12th: 27 Dec 2023 (covers 28/12/2023 to 28/12/2024)
13th: 24 Dec 2024 (covers 28/12/2024 to 28/12/2025)