Abstract: A method for tracking a primary object (PO) in the presence of one or more secondary objects (SO) having a profile similar to that of the PO. The method includes detecting the PO from a captured image; initiating an object tracking window (OTW) for tracking the detected PO; determining and storing multiple object tracking parameters and the trajectory of the tracked PO; determining the number of objects in the OTW and interrupting tracking of the PO upon determining that the number of objects is greater than one and upon detection of an SO in the OTW; altering the size of the OTW; determining multiple object tracking parameters of the detected SO and the detected PO; determining a match for identifying the PO by comparing the determined object tracking parameters of the SO and the PO with the stored object tracking parameters of the PO; and relocking the matched object in the OTW as the PO for tracking of the PO.
Claims:
We claim:
1. A method for tracking a primary object in presence of at least one secondary object comprising a similar profile as that of said primary object, said method comprising:
detecting said primary object from a captured image;
initiating an object tracking window for tracking said detected primary object;
tracking said primary object in said object tracking window;
determining and storing a plurality of object tracking parameters of said tracked primary object, wherein said object tracking parameters are contrast difference, covariance values, and object geometry;
determining and storing trajectory of said primary object;
detecting number of objects in said object tracking window;
detecting at least one secondary object along with said primary object in said object tracking window;
interrupting said tracking of said primary object upon detection of said secondary object along with said primary object in said object tracking window;
altering size of said object tracking window upon interrupting said tracking of said primary object;
determining said plurality of object tracking parameters of said detected secondary objects and said plurality of object tracking parameters of said primary object;
determining a match for identifying said primary object by comparing said determined object tracking parameters of said secondary objects and said determined object tracking parameters of said primary object with said stored object tracking parameters of said primary object; and
relocking said matched object in said object tracking window as said primary object for reviving tracking of said primary object.
2. The method of claim 1, further comprising identifying said primary object wherein at least one of said determined object tracking parameters of said primary object and at least one of said determined object tracking parameters of said secondary object match with said corresponding stored object tracking parameters of said primary object, said identifying of said primary object comprises:
viewing said object tracking window comprising said atleast one secondary object and said primary object in a direction based on said future trajectory prediction of said primary object;
detecting that said number of objects in said object tracking window is equal to one;
determining a match for identifying said primary object by comparing said determined plurality of object tracking parameters of said detected primary object in said object tracking window and said stored plurality of object tracking parameters of said primary object; and
relocking said matched object in said object tracking window as said primary object for reviving tracking of said primary object.
3. The method of claim 1, wherein said initiating of said object tracking window for tracking said detected primary object is one of manual and automatic.
4. The method of claim 1, wherein relocking of said matched object comprises:
comparing said determined object tracking parameters of said primary object and said determined object tracking parameters of said secondary object in said object tracking window with stored object tracking parameters of said primary object;
determining a match for identifying said primary object based on said comparison of said determined object tracking parameters of said primary object and said determined object tracking parameters of said secondary object; and
identifying said primary object based on said determined match upon determining said match to said stored object tracking parameters.
5. The method of claim 1, wherein said detection of number of objects in said object tracking window is based upon object size, object maximum speed, and total number of pixels of object movement in a plurality of captured image frames, wherein said number of objects in said object tracking window for relocking is one primary object.
6. The method of claim 1, wherein said contrast difference is difference between a background and said corresponding primary object and said corresponding secondary objects, said contrast difference is utilized for relocking said primary object by identifying said secondary object in place of said primary object in said object tracking window.
7. The method of claim 1, wherein said covariance values provide similarity between pixels depicting said objects in successive image frames, said covariance value is utilized as a measure of match for relocking said primary object by identifying said primary object with said secondary object present in said object tracking window.
8. The method of claim 1, wherein said object geometry is obtained by measuring diagonal distance of said primary object and said secondary object, said object geometry is utilized for relocking said primary object by identifying said primary object present with said secondary object in said object tracking window.
9. The method of claim 1, wherein said trajectory of object provides position of an object in successive image frames.
10. The method of claim 1, wherein said interrupting said tracking avoids consecutive and continuous hops in the process of tracking between the primary object and the secondary object in the object tracking window.
11. An object tracking system, said system comprising:
an image capturing block for capturing a sequence of image frames;
a video and tracking processing block for processing said captured sequence of image frames and tracking a primary object in presence of a plurality of secondary objects comprising a similar profile as that of said primary object; and
a display device for displaying said tracked primary object.
12. The system of claim 11, wherein said video and tracking processing block comprises a memory unit and a processor.
13. The system of claim 12, wherein said memory unit stores number of objects detected in object tracking window, trajectory of said primary object, and object tracking parameters of said primary object and said at least one secondary object, said object tracking parameters are contrast difference, covariance values, and object geometry.
14. The system of claim 13, wherein said processor compares object tracking parameters of a primary object and one or more secondary objects.
15. The system of claim 13, wherein said processor determines an object as said primary object based on said comparison of said object tracking parameters of said primary object and said secondary object for relocking.
Dated this 18th day of January, 2019
FOR BHARAT ELECTRONICS LIMITED
(By their Agent)
(D.Manoj Kumar) (IN/PA 2110)
KRISHNA & SAURASTRI ASSOCIATES LLP
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
SINGLE OBJECT TRACKING IN PRESENCE OF MULTIPLE OBJECTS OF SAME PROFILE
BHARAT ELECTRONICS LIMITED
OUTER RING ROAD, NAGAVARA, BANGALORE- 560045,
KARNATAKA, INDIA
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.
TECHNICAL FIELD
The present disclosure relates generally to a method for identifying and tracking objects by processing images. More specifically, it relates to a method for tracking a single object in the presence of multiple objects of similar profile.
BACKGROUND
Electro-optical (EO) sensor based object tracking systems, methods for real-time object tracking, apparatus and devices are widely available. Most EO based object tracking systems work on the principle of background modelling and subtraction, correlation based object tracking, feature point extraction and matching, change detection using the R, G, B colour space, and image segmentation methods. A common criterion for separating the background in most object trackers is the contrast difference between object and background. Such object tracking systems perform well in cases of single object tracking, or in cases of multiple objects with different profiles, where the objects can be distinctly separated and a track lock on a desired object can be maintained. Correlation based object tracking may also be used; however, it is applicable only to slow moving targets because of its high computational requirement. The object tracking systems may fail when multiple objects of a similar type are identified by the EO sensor. In the presence of multiple objects similar to the tracked object, the system may jump, or divert, to track one of the multiple objects whose profile is similar to that of the tracked desired object. The object tracking system can continue to divert the tracking from one object to another object of similar profile, depending upon the number of objects with a similar profile inside the object tracking window. The situation is further complicated if multiple objects, with a profile similar to the profile of the tracked desired object, completely cover the desired object. The object tracking system performs well in tracking a desired object in the case of single object tracking, or in the case of multiple objects with different geometries where the objects are distinguishable.
However, the object tracking system fails when multiple objects whose geometry is similar to that of the primary object, and which therefore may not be distinguishable, are captured by the EO sensor.
The prior art US 7684592B2 proposes a method for a real-time object tracking system in which color pixel information is used for object tracking; this cannot cater to the black-and-white images that come from a thermal imager camera. The method proposed in the prior art may not be used for single object tracking in the presence of similar types of multiple objects in a tracking window.
The prior art US 8538082 proposes a change detection method to detect at least one change in an image and to generate a 3-D spatio-temporal volume of the changes in the image. The method further includes converting the 3-D spatio-temporal image into a 2-D image using the Hough transform and extracting a 2-D band in the 2-D spatio-temporal image. The method in the prior art can be useful for single targets as well as multiple targets where all the targets have different geometric profiles. However, the method proposed in the prior art may not identify a single target among multiple targets with a profile similar to that of the single target. Also, 3-D image processing based methods have a high computational requirement for target tracking.
The prior art WO 2010005251A9 proposes a method for feature point extraction and matching that can cater to the identification of multiple objects with different features. However, the method in the prior art may not work on overlapping objects of a similar type.
The prior art US 6035067 proposes a method of image segmentation for tracking. However, the method in the prior art may not work for similar types of multiple objects in a tracking window, because the image segmentation matches at multiple places.
Therefore, there is a need for a method to identify a single target among multiple targets with a profile similar to that of the single target.
SUMMARY
An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below.
A method is provided for tracking a primary object in the presence of at least one secondary object comprising a similar profile as that of the primary object. The method comprises detecting the primary object from a captured image, initiating an object tracking window, and employing a process of tracking the primary object in the object tracking window. The method further comprises determining and storing a plurality of object tracking parameters of the tracked primary object, where the object tracking parameters are contrast difference, covariance values, and object geometry. The method further comprises determining and storing the trajectory of the primary object and continuously determining the number of objects in the object tracking window. The method further comprises detecting the secondary object along with the primary object in the object tracking window and interrupting the tracking of the primary object upon detection of the secondary object along with the primary object in the object tracking window. The method further comprises altering the size of the object tracking window upon interrupting the tracking of the primary object and determining the object tracking parameters of the detected secondary object along with the primary object. The method further comprises determining a match for identifying the primary object by comparing the determined object tracking parameters of the secondary objects and the determined object tracking parameters of the primary object with the stored object tracking parameters of the primary object. The method further comprises relocking the matched object in the object tracking window as the primary object for beginning the process of tracking of the matched object.
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
BRIEF DESCRIPTION OF DRAWINGS
The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 illustrates a method for tracking a single object in presence of a plurality of objects comprising a similar profile as that of the single object.
FIG. 2 exemplarily illustrates an image based object tracking system.
FIG. 3A exemplarily illustrates a single object tracking in a limited size object tracking window.
FIG. 3B exemplarily illustrates multiple objects in the object tracking window.
FIG. 3C exemplarily illustrates an increase in size of the object tracking window during presence of multiple objects in the object tracking window.
FIG. 3D exemplarily illustrates relocking of single object in the object tracking window.
FIGS. 4A-4B illustrate the object relocking procedure in the presence of multiple objects of a similar type, such as at least one secondary object falling in the tracking window.
Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may have not been drawn to scale. For example, the dimensions of some of the elements in the figure may be exaggerated relative to other elements to help to improve understanding of various exemplary embodiments of the present disclosure. Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION OF DRAWINGS
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention are provided for illustration purpose only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
FIGS. 1 through 4B, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way that would limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged communications system. The terms used to describe various embodiments are exemplary. It should be understood that these are provided to merely aid the understanding of the description, and that their use and definitions, in no way limit the scope of the invention. Terms first, second, and the like are used to differentiate between objects having the same terminology and are in no way intended to represent a chronological order, unless where explicitly stated otherwise. A set is defined as a non-empty set including at least one element.
Those skilled in this technology can make various alterations and modifications without departing from the scope and spirit of the invention. Therefore, the scope of the invention shall be defined and protected by the following claims and their equivalents.
FIGS. 1-4B are merely representational and are not drawn to scale. Certain portions thereof may be exaggerated, while others may be minimized. FIGS. 1-4B illustrate various embodiments of the invention that can be understood and appropriately carried out by those of ordinary skill in the art.
In the foregoing detailed description of embodiments of the invention, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the invention require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description of embodiments of the invention, with each claim standing on its own as a separate embodiment.
FIG. 1 illustrates a method for tracking a primary object in the presence of one or more secondary objects comprising a similar profile as that of the primary object. The method comprises detecting 101 the primary object from a captured image and initiating 102 an object tracking window for tracking the detected primary object. The initiation of the object tracking window is one of manual and automatic. The method further comprises tracking 103 the primary object in the object tracking window. Step 401 of FIG. 4A may also be referred to for clarity. The method further comprises determining and storing 104 multiple object tracking parameters of the tracked primary object. Steps 402-406 of FIG. 4A may also be referred to for clarity. The object tracking parameters are contrast difference, covariance values, and object geometry. The contrast difference is the difference between a background and the corresponding primary object and the corresponding secondary objects. The covariance values provide the similarity between pixels depicting the objects in successive image frames; the covariance values form a covariance matrix that is computed and determined for a limited size object tracking window. The object geometry is obtained by measuring the diagonal distance, length, and width of the primary object and the secondary object. The trajectory of an object provides the position of the object at intervals of the camera video frame rate, that is, in successive image frames. The trajectory of the primary object is determined and stored 105. Based on the stored trajectory of the primary object, a future object trajectory prediction is made. The method further comprises detecting 106 the number of objects in the object tracking window. The number of objects in the object tracking window is determined based upon object size, object maximum speed, and the total number of pixels of object movement in multiple captured image frames.
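For illustration only, two of the object tracking parameters described above may be sketched in Python; the function names and the simple mean-intensity contrast measure are assumptions for this sketch and do not form part of the claimed method.

```python
import math

def object_geometry(width, height):
    # Object geometry as the diagonal distance of the object's
    # bounding box, per the description above.
    return math.sqrt(width ** 2 + height ** 2)

def contrast_difference(object_pixels, background_pixels):
    # Contrast difference as the absolute difference between the mean
    # object intensity and the mean background intensity (assumed measure).
    mean_obj = sum(object_pixels) / len(object_pixels)
    mean_bg = sum(background_pixels) / len(background_pixels)
    return abs(mean_obj - mean_bg)
```

For example, a 3 x 4 pixel bounding box yields a diagonal (object geometry) of 5 pixels.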
The method further comprises detecting 107 at least one secondary object along with the primary object in the object tracking window. During the process of tracking, the detection of the number of objects is monitored at the input image frame rate. A periodic object detection is employed, and step 407 of FIG. 4A may be referred to for clarity. The method further comprises interrupting 108 the tracking of the primary object upon detection of the secondary object along with the primary object in the object tracking window. The interruption of the tracking refers to breaking the tracking of the primary object. However, if no new objects are detected, the tracking process continues without interruption. Steps 408-409 and steps 412-413 of FIG. 4A may also be referred to for clarity. Upon detection of at least one new object, such as a secondary object, in the object tracking window in front of the primary object, the process of tracking of the primary object is interrupted or broken. Interrupting the tracking avoids consecutive and continuous hops in the process of tracking between the primary object and the secondary object in the object tracking window. Upon interrupting the tracking, following the future object trajectory prediction may avoid diversion of the tracking to a secondary object instead. The method further comprises altering 109 the size of the object tracking window upon interrupting the tracking of the primary object, by increasing the size of the object tracking window by about 80 percent of the original size. The altered object tracking window comprises the secondary objects along with the primary object, to handle manoeuvring and direction change of the primary object during the time of interruption in the process of tracking, thereby preventing the detected secondary objects and the primary object from going out of the object tracking window area. Step 411 of FIG. 4A may also be referred to for clarity.
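The window enlargement step above may be sketched as follows; this is an illustrative sketch only, assuming an axis-aligned window given as (x, y, width, height) and an enlargement about the window centre, neither of which is specified by the claims.

```python
def enlarge_window(x, y, w, h, factor=0.8):
    # Grow the object tracking window by `factor` (about 80 percent here)
    # while keeping it centred on the same point, so that a manoeuvring
    # primary object and nearby secondary objects stay inside the window.
    new_w = w * (1 + factor)
    new_h = h * (1 + factor)
    new_x = x - (new_w - w) / 2
    new_y = y - (new_h - h) / 2
    return new_x, new_y, new_w, new_h
```

For example, a 10 x 10 window at (10, 10) becomes an 18 x 18 window at (6, 6).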
The secondary objects may partially or fully mask the visibility of the primary object. The method further comprises determining 110 multiple object tracking parameters of the detected secondary objects and the multiple object tracking parameters of the primary object. Steps 414-415 of FIG. 4B may be referred to for clarity. The determined object tracking parameters are compared with the stored object tracking parameters. The stored object tracking parameters are those of the primary object that was tracked prior to the detection of the secondary objects in the object tracking window. The comparison process includes comparing the determined object tracking parameters of each of the detected secondary objects, and also of the detected primary object, with the stored object tracking parameters. The number of objects is determined after background modelling and comparing thresholds.
The method further comprises determining 111 a match for identifying the primary object by comparing the determined object tracking parameters of the secondary objects and the determined object tracking parameters of the primary object with the stored object tracking parameters of the primary object. The method further comprises relocking 112 the matched object in the object tracking window as the primary object for beginning the process of tracking of the matched object; however, the relocking may be initiated only after a single object, that is, the primary object, is covered in the object tracking window. Steps 416-418 of FIG. 4B may be referred to for more clarity. If one or more determined object tracking parameters of both the secondary object and the primary object in the object tracking window match the stored object tracking parameters of the primary object, then no object in the object tracking window is relocked. That is, if more than one object, such as one or more secondary objects and the primary object, are the determined matches, then the primary object may be considered for tracking based on the stored future object trajectory prediction. The object tracking window comprising the matched secondary object and the primary object shall be viewed in a direction based on the future trajectory prediction of the primary object. The object tracking window is continuously viewed and the number of objects in the object tracking window is determined. Upon determining the number of objects in the object tracking window as one, the determined object tracking parameters of the object in the object tracking window, that is, the primary object, are compared with the stored object tracking parameters of the primary object. The match is determined for identifying the primary object. Steps 416-421 of FIG. 4B may be referred to for more clarity. The number of objects in the object tracking window for relocking is one, namely the primary object.
The contrast difference is utilized for relocking the primary object by identifying the secondary objects, which are false objects, in place of the primary object in the object tracking window. The covariance value is utilized as a measure of match and the object geometry is also utilized as a matching parameter for relocking the primary object by identifying the primary object in the object tracking window.
If the number of objects in the object tracking window is more than one and the object geometry, the contrast difference, and the covariance values all match, then no object may be relocked. After detecting multiple objects whose determined object tracking parameters match the stored object tracking parameters, secondary objects, which are false objects, might otherwise be relocked for tracking. To avoid tracking secondary objects, relocking is not initiated until only a single object is covered in the object tracking window. However, if the determined object tracking parameters of the single object in the object tracking window do not match the stored object tracking parameters, then the single object in the object tracking window may not be relocked. Therefore, until a single object, that is, a primary object with a match in object geometry, contrast difference, and covariance values, is obtained in the object tracking window, the relocking procedure may not be initiated. The future object trajectory prediction based upon the previous trajectory, the matching of the object geometry, the contrast difference, and the covariance matrix, and the object tracking window enlargement are required for smooth relocking of the primary object in the object tracking window.
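The relocking decision described above may be sketched as follows; this is an illustrative sketch only, in which the parameter dictionaries, the single shared tolerance, and the function name `try_relock` are assumptions not specified by the disclosure.

```python
def try_relock(window_objects, stored_params, tolerance):
    # Relock only when exactly one object remains in the object tracking
    # window AND all of its determined parameters match the stored
    # primary-object parameters; otherwise defer relocking.
    if len(window_objects) != 1:
        return None  # multiple (or no) candidates: keep waiting
    candidate = window_objects[0]
    if all(abs(candidate[key] - stored_params[key]) <= tolerance
           for key in stored_params):
        return candidate
    return None
```

A real tracker would likely use a separate tolerance per parameter (geometry, contrast, covariance); the single `tolerance` here only keeps the sketch short.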
The covariance matrix for relocking the object of interest is calculated as given in equation (1). The expectation value of the object in the track window in the recent frame is calculated as given in equation (2), and the expectation value of the object in the track window in the next successive frame is calculated as given in equation (3). The derived covariance matrix is used for relocking the object of interest after all the similar types of objects falling in the track window have cleared.
COV(X, Y) = E[(X − µ_X)(Y − µ_Y)]   (1)
µ_X = E[X] = (Σ_(i=1)^N X_i) / N   (2)
µ_Y = E[Y] = (Σ_(i=1)^N Y_i) / N   (3)
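Equations (1) to (3) may be sketched directly in Python for illustration; the function names `expectation` and `covariance` are assumptions for this sketch only, and the inputs stand for pixel samples X and Y from the track window in successive frames.

```python
def expectation(values):
    # Equations (2) and (3): E[X] as the arithmetic mean over N samples.
    return sum(values) / len(values)

def covariance(xs, ys):
    # Equation (1): COV(X, Y) = E[(X - mu_X)(Y - mu_Y)],
    # with the expectation again taken as the mean over N paired samples.
    mu_x, mu_y = expectation(xs), expectation(ys)
    return sum((x - mu_x) * (y - mu_y)
               for x, y in zip(xs, ys)) / len(xs)
```

A high covariance between the stored primary-object pixels and a candidate's pixels indicates similarity, and so serves as the measure of match for relocking described above.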
Consider a case wherein multiple objects, such as one or more secondary objects and also the primary object, are detected in the object tracking window, and the secondary objects have a similar object geometry to that of the primary object. If the object tracking parameters of all the objects in the object tracking window match the stored object tracking parameters, then none of the objects in the object tracking window may be relocked, and the tracking may depend upon the future object trajectory prediction.
FIG. 2 exemplarily illustrates an image based object tracking system. The image based object tracking system comprises an image capturing block 201, a video and tracking processing block 202, and a display device 205. The image based object tracking system performs background modelling, background subtraction, correlation based object tracking, change detection based object tracking, image segmentation based object tracking, and color pixel based object tracking. The image capturing block 201 comprises electro-optical (EO) sensors for capturing a sequence of image frames. The EO sensors are one of, for example, a CCD (Charge Coupled Device) camera, an IR (Infrared) camera, and a low light camera. The video and tracking processing block 202 comprises a processor 204 for performing the method for tracking a single object in the presence of a plurality of objects comprising a similar profile as that of the single object, as disclosed in the detailed description of FIG. 1. The processor 204 receives the sequence of image frames and applies background subtraction and a thresholding technique to the sequence of image frames for separating the background and the object, to clear the background and track the primary object only. The processor 204 continuously detects the number of objects in the object tracking window after subtracting the background; contemporary methods known in the art may be used for the same. The processor 204 computes the object tracking parameters of the primary object and the secondary objects, such as the number of objects detected in the object tracking window, the contrast difference, the covariance values, the object geometry, and the trajectory of the object along with the future object trajectory prediction, as disclosed in the detailed description of FIG. 1.
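The background subtraction and thresholding step performed by the processor 204 may be sketched as follows; this is an illustrative sketch only, assuming grayscale frames represented as nested lists of intensities and a fixed per-pixel threshold, neither of which the disclosure prescribes.

```python
def detect_foreground(frame, background, threshold):
    # Per-pixel background subtraction followed by thresholding: a pixel
    # that differs from the background model by more than `threshold`
    # is marked foreground (1); all remaining pixels are background (0).
    return [[1 if abs(pixel - model) > threshold else 0
             for pixel, model in zip(frame_row, model_row)]
            for frame_row, model_row in zip(frame, background)]
```

Connected regions of foreground pixels in the resulting mask would then be counted to give the number of objects in the object tracking window.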
The video and tracking processing block 202 further comprises a memory unit 203 for storing the computed object tracking parameters of the primary object and one or more secondary objects, such as the number of objects detected in the object tracking window, the contrast difference, the covariance values and covariance matrix, the object geometry, and the trajectory of the object along with the future object trajectory prediction. The covariance value of the tracked primary object is stored in the memory unit 203. The covariance values between the first tracked primary object and successive frames are stored in the memory unit 203. The processor 204 halts the tracking process upon detection of the secondary objects along with the primary object, where the secondary objects have a similar profile as that of the primary object. The processor 204, based on the future object trajectory prediction, considers an object as the primary object and determines and compares its object tracking parameters with the stored object tracking parameters. The processor 204 resumes tracking of the required object after detecting the primary object only in the object tracking window, with the detected object tracking parameters matching the stored object tracking parameters of the primary object. The processor 204 relocks the primary object for tracking after determining the match, as disclosed in the detailed description of FIG. 1. The object trajectory, as disclosed in the detailed description of FIG. 1, is stored in the memory unit 203 of the video and tracking processing block 202. The memory unit 203 stores the object trajectory of the primary object at intervals of about one second during tracking. The processor 204 utilizes the stored information about the object trajectories of the primary object and provides the future object trajectory prediction. The processor 204 generates a predicted trajectory based upon the stored trajectory of primary object tracking.
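The future object trajectory prediction from stored positions may be sketched as follows; this is an illustrative sketch only, assuming a simple constant-velocity extrapolation from the two most recent stored (x, y) positions, which is one possible prediction scheme rather than the one claimed.

```python
def predict_next_position(trajectory):
    # Constant-velocity extrapolation: repeat the last observed step,
    # using the two most recent positions stored in the memory unit.
    (x1, y1), (x2, y2) = trajectory[-2], trajectory[-1]
    return (2 * x2 - x1, 2 * y2 - y1)
```

During an interruption, the tracker would look for the primary object in the direction of the predicted position rather than locking onto a secondary object.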
The processor 204 orients the object tracking window in the direction indicated by the future object trajectory prediction for identifying the primary object. The number of objects in the object tracking window is continuously determined. After determining the number of objects as one, the processor 204 compares the determined object tracking parameters of the detected primary object in the object tracking window with the stored object tracking parameters of the primary object. The processor 204 determines the match for identifying the primary object by the comparison. The processor 204 relocks the object as the primary object for tracking upon determining the match. The display device 205 displays the process of tracking of the primary object within the object tracking window 206.
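The comparison of the determined object tracking parameters against the stored parameters, and the resulting relock decision, might look as follows. The parameter keys, the relative-difference metric, and the tolerance value are illustrative choices, not taken from the specification.

```python
def match_score(candidate, stored):
    """Worst-case relative deviation of a candidate object's tracking
    parameters from the stored parameters of the primary object.
    Lower is a better match; 0.0 is an exact match."""
    keys = ("contrast_difference", "covariance", "object_geometry")
    diffs = []
    for k in keys:
        ref = stored[k]
        diffs.append(abs(candidate[k] - ref) / (abs(ref) + 1e-9))
    return max(diffs)

def relock(candidates, stored, tol=0.15):
    """Return the index of the best-matching candidate if its score is
    within tolerance, else None (no relock, keep searching)."""
    scored = [(match_score(c, stored), i) for i, c in enumerate(candidates)]
    best, idx = min(scored)
    return idx if best <= tol else None
```

In practice each parameter would carry its own units and weighting (and the covariance comparison would involve a matrix, not a scalar); the scalar form keeps the matching logic visible.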
FIGS. 3A–3D exemplarily illustrate tracking of one or more objects in the object tracking window 206 within the display device 205. FIG. 3A exemplarily illustrates single object detection and tracking in the limited size object tracking window 206. FIG. 3B exemplarily illustrates multiple objects in the object tracking window 206, where the multiple objects are multiple secondary objects along with the primary object. The secondary objects have a profile similar to the primary object. FIG. 3C exemplarily illustrates an increase in size of the object tracking window 206 during presence and detection of multiple objects in the object tracking window. The object tracking window 206 is enlarged in the presence of multiple objects to prevent the objects from going out of the coverage area of the object tracking window due to object manoeuvring. FIG. 3D exemplarily illustrates relocking of a single object in the object tracking window 206.
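The enlargement of the object tracking window in the presence of multiple objects, and its restoration once a single object is relocked, can be sketched as below. The 1.5x scale factor and the centre-anchored resizing are assumptions; the specification only states that the window is enlarged and later returned to normal tracking.

```python
def adjust_window(window, n_objects, scale=1.5, base=None):
    """Resize the object tracking window (x, y, w, h) about its centre.
    With more than one detected object the window grows by `scale`,
    so manoeuvring objects stay inside its coverage area; with a
    single relocked object and a known `base` (w, h) it is restored."""
    x, y, w, h = window
    cx, cy = x + w / 2, y + h / 2          # centre stays fixed
    if n_objects > 1:
        w, h = w * scale, h * scale        # enlarge (FIG. 3C)
    elif base is not None:
        w, h = base                        # restore after relock (FIG. 3D)
    return (cx - w / 2, cy - h / 2, w, h)
```

A deployed tracker would also clamp the window to the frame boundaries; that detail is omitted here for brevity.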
It is understood that the above description is intended to be illustrative, and not restrictive. It is intended to cover all alternatives, modifications and equivalents as may be included within the spirit and scope of the invention as defined in the appended claims. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the terms “comprising” and “wherein,” respectively.
| # | Name | Date |
|---|---|---|
| 1 | 201941002308-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |
| 2 | 201941002308-STATEMENT OF UNDERTAKING (FORM 3) [18-01-2019(online)].pdf | 2019-01-18 |
| 3 | 201941002308-AMENDED DOCUMENTS [04-10-2024(online)].pdf | 2024-10-04 |
| 4 | 201941002308-FORM 1 [18-01-2019(online)].pdf | 2019-01-18 |
| 5 | 201941002308-FORM 13 [04-10-2024(online)].pdf | 2024-10-04 |
| 6 | 201941002308-DRAWINGS [18-01-2019(online)].pdf | 2019-01-18 |
| 7 | 201941002308-POA [04-10-2024(online)].pdf | 2024-10-04 |
| 8 | 201941002308-DECLARATION OF INVENTORSHIP (FORM 5) [18-01-2019(online)].pdf | 2019-01-18 |
| 9 | 201941002308-COMPLETE SPECIFICATION [18-01-2019(online)].pdf | 2019-01-18 |
| 10 | 201941002308 Reply From Defence.pdf | 2023-06-17 |
| 11 | 201941002308-Response to office action [24-08-2022(online)].pdf | 2022-08-24 |
| 12 | 201941002308-FORM-26 [04-07-2019(online)].pdf | 2019-07-04 |
| 13 | Correspondence by Agent_Power of Attorney_15-07-2019.pdf | 2019-07-15 |
| 14 | 201941002308-Defence-22-08-2022.pdf | 2022-08-22 |
| 15 | 201941002308-Proof of Right (MANDATORY) [18-07-2019(online)].pdf | 2019-07-18 |
| 16 | 201941002308-CLAIMS [25-07-2022(online)].pdf | 2022-07-25 |
| 17 | 201941002308-COMPLETE SPECIFICATION [25-07-2022(online)].pdf | 2022-07-25 |
| 18 | Correspondence by Agent_Form 1_26-07-2019.pdf | 2019-07-26 |
| 19 | 201941002308-DRAWING [25-07-2022(online)].pdf | 2022-07-25 |
| 20 | 201941002308-FORM 18 [24-12-2020(online)].pdf | 2020-12-24 |
| 21 | 201941002308-FER.pdf | 2022-01-26 |
| 22 | 201941002308-FER_SER_REPLY [25-07-2022(online)].pdf | 2022-07-25 |
| 23 | 201941002308-OTHERS [25-07-2022(online)].pdf | 2022-07-25 |
| 24 | 201941002308-PETITION UNDER RULE 137 [25-07-2022(online)].pdf | 2022-07-25 |
| 1 | SearchStrategy201941002308E_25-01-2022.pdf | |