ABSTRACT
SYSTEM AND METHOD FOR SEARCHING PRE-SELECTED SIMILAR TARGETS IN AN IMAGE
A method for searching similar targets in an image comprises the following steps: selecting a target of interest from the image; identifying the type of target selected; extracting multiple textural features, such as Haralick features, entropy statistics from gradient matrices and statistical geometrical features, from the pre-selected target; extracting an identical set of features from the portion of the image within a sliding window, of the same size as the target pattern, which is moved over the entire image; identifying first the possible areas of presence of similar targets by comparing the Haralick features between the target image and the image portion within the sliding window; and comparing the other textural features and also the range filter output value between the target image and the image portion within the sliding window to make a final decision on the presence of similar targets in the image.
To be published: Figure No. 1
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(SEE SECTION 10, RULE 13)
SYSTEM AND METHOD FOR SEARCHING PRE-SELECTED SIMILAR TARGETS IN AN IMAGE
BHARAT ELECTRONICS LIMITED
WITH ADDRESS:
OUTER RING ROAD, NAGAVARA, BANGALORE 560045, KARNATAKA, INDIA
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.
TECHNICAL FIELD
[0001] The present disclosure relates to a method and a system for searching similar instances of a target pattern that is pre-selected in an image.
[0002] Pattern searching typically involves locating one or more regions of interest in an image, and then recognizing the patterns in those regions of interest. In images containing a variety of textures, or in gray-scale images, variations in brightness make it difficult to locate patterns accurately under different lighting conditions.
[0003] There have been certain endeavors for automatically searching instances of a target pattern in a candidate image. US Patent number US7657126 entitled “System and Method for Search Portions of Objects in Images and Features Thereof” deals with a method which enables searching of objects in images by determining one or more search criteria corresponding to a portion of the object in a user-selected image. The search criteria corresponding to the portion of the object include specifying a visual characteristic such as colour, shape, pattern or texture of the object. In addition to the visual characteristics, the method also selects an alternative value for the visual characteristic of the selected portion of the object, which includes specifying colour and shape criteria for the selected portion and assigning weights to the specified colour and shape criteria.
[0004] US Patent number US7492957 entitled “Using Run Length Encoding to Detect Target Images” describes a method for detecting a target image within a candidate image. An image detection manager extracts run length encoding data from the candidate image and distinguishes between the foreground and background of the candidate image and target image by taking into account an interval of scale factors for matching colour runs in the foreground and run lengths in the background. By treating the background pixels as wildcards, the image detection manager utilizes fuzzy colour matching functionality to compare rows of the run length encoding data from the candidate image to rows of run length encoding data from the target image, and determines whether the target is present in the candidate image.
[0005] US publication number US20170053169A1 entitled “Object Detection and Analysis via Unmanned Air Vehicle” mentions locating objects of interest, such as people, vehicles, products, logos, fires and other detectable objects, in image data captured using one or more unmanned air vehicles. This includes a process wherein background and motion are accounted for in order to identify foreground objects, which are then analyzed using trained classifiers.
[0006] Indian Patent Application number 201641000111 entitled “Texture based Land Cover Classification of Aerial Imagery” mentions a method for classifying gray level aerial imagery into multiple predefined land cover classes using texture analysis. The method involves classifying the pixels in aerial imagery into generic land cover types and comprises the steps of image enhancement, selection of training samples, texture statistics extraction, codebook generation for each land cover class and mapping of the pixels into the land cover classes.
[0007] However, all the aforesaid existing arts are mainly related to finding instances of a specific target pattern in a candidate image. There is therefore a need for a method and system for searching similar instances of a target pattern that is pre-selected in an image.
SUMMARY
[0008] This summary is provided to introduce concepts of the invention related to searching similar targets in an image. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0009] In accordance with an embodiment of the present invention, there is provided a computer implemented method for searching similar targets in an image. The method comprises: selecting a target of interest from an image; identifying the type of target selected; extracting multiple sets of textural features from the selected target of interest; moving a window of size equal to the size of the selected target of interest over the image; extracting identical sets of textural features from a portion of the image within the window moved over the image; measuring Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window; identifying possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features; measuring Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window; measuring Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and determining the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
[0010] In accordance with another embodiment of the present invention, there is provided a system for searching similar targets in an image. The system comprises: an image sensor configured to capture/acquire an image; a display unit configured to display the image; a target selection device configured to enable an operator to select a target of interest from the image; and a target search processor in communication with the image sensor, the display unit and the target selection device. The target search processor is configured to: identify a type of target selected; extract multiple sets of textural features from the selected target of interest; move a window of size equal to the size of the selected target of interest over the image; extract identical sets of textural features from a portion of the image within the window moved over the image; measure Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window; identify possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features; measure Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window; measure Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and determine the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0011] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0012] Figure 1 illustrates a block diagram depicting a system for searching similar targets in an image, according to an exemplary embodiment of the present disclosure.
[0013] Figure 2 illustrates a flow chart depicting the steps involved in a method for searching similar targets in an image, according to another exemplary embodiment of the present disclosure.
[0014] Figure 3 illustrates a flow chart depicting the steps involved in extracting a feature of a target in the image, according to another exemplary embodiment of the present disclosure.
[0015] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0016] The various embodiments of the present disclosure describe a method for searching similar targets in an image and a system therefor. Identifying the type of target selected and using multiple types of textural features form the basis of the method.
[0017] Further, the various embodiments of the present disclosure also describe a system for searching similar targets in an image. The system can be a part of a surveillance system in which an operator selects a target of interest present in the image and the method locates all the similar instances of the selected target, i.e. all the image patterns similar to the target selected.
[0018] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure may be practiced without these details. One skilled in the art will recognize that embodiments of the present disclosure, some of which are described below, may be incorporated into a number of systems.
[0019] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present disclosure and are meant to avoid obscuring the present disclosure.
[0020] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0021] In contrast to the prior art, where the objective is to find instances of a specific target pattern in a candidate image, in the present invention the operator selects a target of interest present in the image and the system, by implementing the method, detects all the similar instances of the selected target. Based on multiple textural feature types, the present invention accurately detects all the patterns that are similar to the selected target pattern, with very few false alarms.
[0022] Figure 1 illustrates a block diagram depicting the system 10 for searching similar targets in an image, according to an exemplary embodiment of the present disclosure. The system 10 is typically a computer based system comprising at least an image sensor 1, a target search processor 2, a display unit 3 and a target selection device 4. The image acquired by the image sensor 1 is given as an input to the target search processor 2, which carries out the search for similar targets in the input image. The input image is also displayed on the display unit 3, from which the operator selects a target of interest by means of the target selection device 4. The target pattern of interest selected by the operator is also given as an input to the target search processor 2.
[0023] Figure 2 illustrates a flow chart depicting the steps involved in the method for searching similar targets in an image, according to another exemplary embodiment of the present disclosure. The method is implemented by the system 10 illustrated in Figure 1. With reference to Figures 1 and 2, the input image 21 is the image in which the operator selects a target of interest. The type of target selected by the operator is then identified 22. From the selected target image pattern of interest, multiple sets of textural features are extracted 23. Figure 3 illustrates a flow chart depicting the steps involved in extracting the sets of textural features of the selected target in the image, according to another exemplary embodiment of the present disclosure. Range filtering is first performed on the target pattern/image 31 and the standard deviation of the range filter output is used to determine the type of target selected 32.
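By way of illustration only, the following Python sketch shows one possible realisation of steps 31 and 32 (range filtering and target-type identification). The 3x3 neighbourhood, the standard-deviation threshold and the "textured"/"smooth" labels are assumptions made for the example; the specification does not fix these values.

```python
# A minimal sketch of steps 31-32, assuming an 8-bit gray-scale patch,
# a 3x3 neighbourhood and an illustrative standard-deviation threshold.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def range_filter(patch: np.ndarray, size: int = 3) -> np.ndarray:
    """Local range (max - min) over a size x size neighbourhood."""
    patch = patch.astype(np.float64)
    return maximum_filter(patch, size=size) - minimum_filter(patch, size=size)

def identify_target_type(patch: np.ndarray, std_threshold: float = 20.0) -> str:
    """Label the selected target from the standard deviation of its
    range-filter output (hypothetical labels and threshold)."""
    sigma = float(np.std(range_filter(patch)))
    return "textured" if sigma > std_threshold else "smooth"
```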
[0024] Next, a set of ten statistical Haralick features is extracted from the target pattern. A Gray Level Co-occurrence Matrix is first constructed for the target image based on the pair-wise spatial co-occurrences of pixels separated by unit distance in each of the four directions 0, 45, 90 and 135 degrees 33. A set of ten statistical features such as angular second moment, contrast, homogeneity, dissimilarity, entropy, sum average, sum variance, sum entropy, difference variance and difference entropy is calculated for each of the co-occurrence matrices, and the average over the four directions is used to form a feature vector 34. The second set of textural features extracted from the target image is the set of statistical geometrical features 35. This set of features is based on the statistics of geometrical attributes of connected regions in a sequence of binary images obtained from the target image. For each binary image, geometrical attributes such as the number of connected regions of 1-valued pixels, the number of connected regions of 0-valued pixels and their irregularity are obtained. Each attribute is further characterised using four statistics, namely its maximum value, average value, sample mean and sample standard deviation, which constitutes a total of sixteen features.
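A minimal Python sketch of steps 33 and 34 follows, assuming an 8-bit input quantised to 64 grey levels before the co-occurrence matrix is built, and following one common reading of Haralick's sum/difference statistics; the ten features are averaged over the four directions as described above. (A sketch of the statistical geometrical features of step 35 is given separately further below.)

```python
import numpy as np
from skimage.feature import graycomatrix

def haralick_vector(patch: np.ndarray, levels: int = 64) -> np.ndarray:
    """Ten Haralick statistics of the GLCM (distance 1, four directions),
    averaged over the directions as in step 34."""
    # Quantise the 8-bit patch to `levels` grey levels (assumption).
    q = (patch.astype(np.float64) * (levels - 1) / 255.0).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    i, j = np.indices((levels, levels))
    eps = 1e-12
    feats = []
    for a in range(4):                       # one feature set per direction
        p = glcm[:, :, 0, a]
        asm = np.sum(p ** 2)
        contrast = np.sum((i - j) ** 2 * p)
        homogeneity = np.sum(p / (1.0 + (i - j) ** 2))
        dissimilarity = np.sum(np.abs(i - j) * p)
        entropy = -np.sum(p * np.log(p + eps))
        # Marginal distributions of i + j and |i - j|.
        p_sum = np.bincount((i + j).ravel(), weights=p.ravel())
        p_diff = np.bincount(np.abs(i - j).ravel(), weights=p.ravel())
        k_sum = np.arange(p_sum.size)
        k_diff = np.arange(p_diff.size)
        sum_avg = np.sum(k_sum * p_sum)
        sum_var = np.sum((k_sum - sum_avg) ** 2 * p_sum)
        sum_ent = -np.sum(p_sum * np.log(p_sum + eps))
        diff_avg = np.sum(k_diff * p_diff)
        diff_var = np.sum((k_diff - diff_avg) ** 2 * p_diff)
        diff_ent = -np.sum(p_diff * np.log(p_diff + eps))
        feats.append([asm, contrast, homogeneity, dissimilarity, entropy,
                      sum_avg, sum_var, sum_ent, diff_var, diff_ent])
    return np.mean(np.asarray(feats), axis=0)  # average over the four directions
```

The feature order in the returned vector follows the listing in the text; the choice of 64 quantisation levels trades quantisation detail against robustness and is an assumption, not a value taken from the specification.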
[0025] The third and final set of features extracted from the target image is the set of entropy values of the co-occurrence matrices derived from the target pattern gradient matrix in each of the four directions. From the target image, a gradient matrix is constructed 36 in each of the four directions using pixels separated by a distance of fifteen. From each of the four directional gradient matrices, a co-occurrence matrix is obtained in the horizontal direction and an entropy statistic is calculated from that co-occurrence matrix 37. The entropy values obtained from the gradient matrices in the four directions form the final feature vector.
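The sketch below illustrates steps 36 and 37. The fifteen-pixel displacement and the four directions follow the text, while quantising the gradient to 32 levels before building the horizontal co-occurrence matrix is an assumption for the example.

```python
import numpy as np
from skimage.feature import graycomatrix

# Displacement of 15 pixels along 0, 45, 90 and 135 degrees (row, col offsets).
_OFFSETS = {0: (0, 15), 45: (-15, 15), 90: (-15, 0), 135: (-15, -15)}

def _directional_gradient(img: np.ndarray, dy: int, dx: int) -> np.ndarray:
    """Absolute difference between pixels separated by the offset (dy, dx)."""
    h, w = img.shape
    y0, y1 = max(0, -dy), min(h, h - dy)
    x0, x1 = max(0, -dx), min(w, w - dx)
    return np.abs(img[y0:y1, x0:x1] - img[y0 + dy:y1 + dy, x0 + dx:x1 + dx])

def gradient_entropy_vector(patch: np.ndarray, levels: int = 32) -> np.ndarray:
    """Entropy of the horizontal co-occurrence matrix of each directional
    gradient matrix (the patch must be larger than 15 pixels on each side)."""
    img = patch.astype(np.float64)
    entropies = []
    for dy, dx in _OFFSETS.values():
        grad = _directional_gradient(img, dy, dx)
        q = (np.clip(grad, 0, 255) * (levels - 1) / 255.0).astype(np.uint8)
        # Co-occurrence in the horizontal direction (distance 1, angle 0).
        glcm = graycomatrix(q, distances=[1], angles=[0],
                            levels=levels, symmetric=True, normed=True)
        p = glcm[:, :, 0, 0]
        entropies.append(float(-np.sum(p * np.log(p + 1e-12))))
    return np.array(entropies)              # one entropy value per direction
```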
[0026] Following the target image feature extraction, a window of size equal to the selected target pattern is moved over the entire image 24. From the image portion within every window, the set of ten statistical Haralick features, the set of sixteen statistical geometrical features and the entropy statistics from the gradient matrices are extracted 25. The Euclidean distance between the co-occurrence matrix features extracted from the target image and from the image portion within the window is computed 26, and the possible areas of presence of similar targets are first identified by applying a threshold on this Euclidean distance 27. This is followed by the computation of the Euclidean distance between the statistical geometrical features 28 and of the Euclidean distance between the gradient matrix entropies extracted from the target image and from the image portion within the moving window 29.
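One possible realisation of the window scan and first-stage screening of steps 24 to 27 is sketched below. It re-uses the hypothetical haralick_vector helper from the earlier sketch; the stride and the threshold are caller-chosen assumptions (the text implies every window position, which a stride of 1 reproduces).

```python
import numpy as np

def candidate_locations(image: np.ndarray, target: np.ndarray,
                        haralick_threshold: float, stride: int = 1):
    """Return (row, col, distance) for windows whose Haralick feature vector
    lies within the given threshold of the target's vector (steps 24-27)."""
    th, tw = target.shape
    f_target = haralick_vector(target)          # first set of features
    candidates = []
    for y in range(0, image.shape[0] - th + 1, stride):
        for x in range(0, image.shape[1] - tw + 1, stride):
            window = image[y:y + th, x:x + tw]
            d1 = float(np.linalg.norm(haralick_vector(window) - f_target))
            if d1 < haralick_threshold:         # coarse screening (step 27)
                candidates.append((y, x, d1))
    return candidates
```

In practice the second and third feature sets (steps 28 and 29) would only be computed for the locations returned here, keeping the heavier statistical geometrical features out of the inner loop; the target's features would likewise be computed once and cached.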
[0027] In addition to the above mentioned sets of features, range filtering is also performed on the image portion within every window and the standard deviation is computed from the range filter output 31, 32. Depending upon the type of target identified, different thresholds are applied on the Euclidean distance computed between the statistical geometrical features and on the difference between the co-occurrence matrix entropy features of the target image and of the window image portion. A final decision that the selected target is present within the moving window is made if the Euclidean distance between the statistical geometrical features, the range filter standard deviation difference between the target image and the image portion within the moving window, the co-occurrence matrix entropy difference between the target image and the image portion within the moving window, and the Euclidean distance between the gradient matrix entropy values are each less than their pre-defined thresholds 30.
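The final decision of step 30 could then be sketched as below. The threshold names and values are placeholders that would be tuned per target type (step 32), as the text gives no numerical values; the co-occurrence matrix entropy is taken as the fifth element of the hypothetical haralick_vector output, and all helpers are the illustrative sketches introduced above, not the patented implementation.

```python
import numpy as np

def is_similar(window: np.ndarray, target: np.ndarray, thresholds: dict) -> bool:
    """All four comparisons of step 30 must fall below their thresholds."""
    d_sgf = float(np.linalg.norm(sgf_vector(window) - sgf_vector(target)))
    d_rng = abs(float(np.std(range_filter(window))) -
                float(np.std(range_filter(target))))
    # Entropy is the 5th Haralick statistic in the sketch's feature order.
    d_ent = abs(haralick_vector(window)[4] - haralick_vector(target)[4])
    d_grd = float(np.linalg.norm(gradient_entropy_vector(window) -
                                 gradient_entropy_vector(target)))
    return (d_sgf < thresholds["sgf"]
            and d_rng < thresholds["range_std"]
            and d_ent < thresholds["cooc_entropy"]
            and d_grd < thresholds["gradient_entropy"])
```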
[0028] Thus, the present invention in accordance with the flow charts illustrated in Figures 2 and 3 and as explained above, provides a computer implemented method for searching similar targets in an image. The method comprises: at step 21 - selecting a target of interest from an image; at step 22 - identifying the type of target selected; at step 23 - extracting multiple sets of textural features from the selected target of interest; at step 24 - moving a window of size equal to the size of the selected target of interest over the image; at step 25 - extracting identical sets of textural features from a portion of the image within the window moved over the image; at step 26 - measuring Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window; at step 27 - identifying possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features; at step 28 - measuring Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window; at step 29 - measuring Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and at step 30 - determining the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
[0029] In accordance with the method, the size of the window being moved over the image is equal to the size of the selected target.
[0030] In accordance with the method, identical sets of textural features are extracted from the portion of the image within every window and the Euclidean distances are computed between the first, second and third sets of textural features extracted from the target of interest and the portion of the image within every window.
[0031] In accordance with the method, the first set of textural features include co-occurrence matrix statistical features, the second set of textural features include statistical geometrical features, and the third set of textural features include gradient matrix entropy values.
[0032] In accordance with the method, the step of extracting the multiple sets of textural features from the selected target of interest includes: at step 31 - performing range filtering on the selected target of interest; at step 32 - computing standard deviation of the range filter output; at step 33 - computing a co-occurrence matrix for the selected target of interest in four angular directions; at step 34 - extracting from the computed co-occurrence matrix in each of the four angular directions, the first set of co-occurrence matrix statistical features selected from angular second moment, contrast, homogeneity, dissimilarity, entropy, sum average, sum variance, sum entropy, difference variance and difference entropy; at step 35 - extracting the second set of statistical geometrical features from the selected target of interest, wherein the statistical geometrical features are based on statistics of geometrical attributes of connected regions in a sequence of binary images obtained from the selected target of interest; at step 36 - computing a gradient matrix in four angular directions; and at step 37 - computing a co-occurrence matrix in horizontal direction from the computed gradient matrix in each of the four angular directions and calculating the third set of gradient matrix entropy values from the computed co-occurrence matrix in horizontal direction.
[0033] In accordance with the method, the type of target is identified based on the standard deviation value; the four angular directions are selected from 0, 45, 90 and 135 degrees; and for each binary image, the geometrical attributes include the number of connected regions of 1-valued pixels, 0-valued pixels and the irregularities of said pixels.
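For completeness, the statistical geometrical features of step 35 could be sketched as below, assuming binary images obtained by thresholding every sixteenth grey level, a size-weighted irregularity measure in the style of Chen et al., and the convention that the sample mean and sample standard deviation treat each attribute-versus-threshold function as a distribution over the thresholds; none of these choices is fixed by the text.

```python
import numpy as np
from scipy.ndimage import label

def _irregularity(binary: np.ndarray) -> float:
    """Size-weighted mean irregularity of the connected regions of 1-valued
    pixels (a Chen et al. style definition; an assumption here)."""
    lab, n = label(binary)
    if n == 0:
        return 0.0
    total, weight = 0.0, 0.0
    for r in range(1, n + 1):
        ys, xs = np.nonzero(lab == r)
        cy, cx = ys.mean(), xs.mean()
        rmax = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2).max()
        irgl = (1.0 + np.sqrt(np.pi) * rmax) / np.sqrt(ys.size) - 1.0
        total += ys.size * irgl
        weight += ys.size
    return total / weight

def sgf_vector(patch: np.ndarray, step: int = 16) -> np.ndarray:
    """Sixteen statistical geometrical features: four attributes
    (NOC1, NOC0, IRGL1, IRGL0) x four statistics each (step 35)."""
    thresholds = np.arange(step, 256, step)
    attributes = {"noc1": [], "noc0": [], "irgl1": [], "irgl0": []}
    for t in thresholds:
        b = (patch >= t).astype(np.uint8)
        attributes["noc1"].append(label(b)[1])      # regions of 1-valued pixels
        attributes["noc0"].append(label(1 - b)[1])  # regions of 0-valued pixels
        attributes["irgl1"].append(_irregularity(b))
        attributes["irgl0"].append(_irregularity(1 - b))
    feats = []
    for g in attributes.values():
        g = np.asarray(g, dtype=np.float64)
        w = g / (g.sum() + 1e-12)        # treat g(t) as a distribution over t
        mean_t = np.sum(thresholds * w)
        std_t = np.sqrt(np.sum((thresholds - mean_t) ** 2 * w))
        feats += [g.max(), g.mean(), mean_t, std_t]
    return np.asarray(feats)             # 4 attributes x 4 statistics = 16
```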
[0034] In accordance with the method, depending upon the type of target identified, thresholds applied on each Euclidean distance computed between the second set of textural features and the third set of textural features are different than the threshold applied on the Euclidean distance computed between the first set of textural features, and the similar target is determined to be present in the portion of the image in the window if each Euclidean distance between the first, second and third set of textural features is less than its corresponding threshold.
[0035] Referring now to Figure 1 together with Figures 2 and 3, the system 10 for searching similar targets in an image as provided by the present invention, comprises the target search processor 2 in communication with the image sensor 1, the display unit 3 and the target selection device 4. The target search processor 2 is configured to: identify a type of target selected; extract multiple sets of textural features from the selected target of interest; move a window of size equal to the size of the selected target of interest over the image; extract identical sets of textural features from a portion of the image within the window moved over the image; measure Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window; identify possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features; measure Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window; measure Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and determine the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
[0036] In order to extract multiple sets of textural features from the selected target of interest, the processor 2 is configured to: perform range filtering on the selected target of interest; compute standard deviation of the range filter output; compute a co-occurrence matrix for the selected target of interest in four angular directions; extract from the computed co-occurrence matrix in each of the four angular directions, the first set of co-occurrence matrix statistical features selected from angular second moment, contrast, homogeneity, dissimilarity, entropy, sum average, sum variance, sum entropy, difference variance and difference entropy; extract the second set of statistical geometrical features from the selected target of interest, wherein the statistical geometrical features are based on statistics of geometrical attributes of connected regions in a sequence of binary images obtained from the selected target of interest; compute a gradient matrix in four angular directions; and compute a co-occurrence matrix in horizontal direction from the computed gradient matrix in each of the four angular directions and calculate the third set of gradient matrix entropy values from the computed co-occurrence matrix in horizontal direction.
[0037] The presently disclosed method is an unsupervised method of target identification and is applicable to any type of target selected. Advantageously, the method can be applied on both gray scale and colour images of any scene, and is not limited to UAV images alone.
[0038] The foregoing description of the invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the invention.
CLAIMS
WE CLAIM:
1. A computer implemented method for searching similar targets in an image, the method comprising:
selecting a target of interest from an image;
identifying the type of target selected;
extracting multiple sets of textural features from the selected target of interest;
moving a window of size equal to the size of the selected target of interest over the image;
extracting identical sets of textural features from a portion of the image within the window moved over the image;
measuring Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window;
identifying possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features;
measuring Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window;
measuring Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and
determining the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
2. The method as claimed in claim 1, wherein the size of the window being moved over the image is equal to the size of the selected target.
3. The method as claimed in claim 1, wherein the identical sets of textural features are extracted from the portion of the image within every window and the Euclidean distances are computed between the first, second and third sets of textural features extracted from the target of interest and the portion of the image within every window.
4. The method as claimed in claim 1, wherein,
the first set of textural features include co-occurrence matrix statistical features,
the second set of textural features include statistical geometrical features, and
the third set of textural features include gradient matrix entropy values.
5. The method as claimed in claims 1 and 4, wherein the step of extracting the multiple sets of textural features from the selected target of interest includes:
performing range filtering on the selected target of interest;
computing standard deviation of the range filter output;
computing a co-occurrence matrix for the selected target of interest in four angular directions;
extracting from the computed co-occurrence matrix in each of the four angular directions, the first set of co-occurrence matrix statistical features selected from angular second moment, contrast, homogeneity, dissimilarity, entropy, sum average, sum variance, sum entropy, difference variance and difference entropy;
extracting the second set of statistical geometrical features from the selected target of interest, wherein the statistical geometrical features are based on statistics of geometrical attributes of connected regions in a sequence of binary images obtained from the selected target of interest;
computing a gradient matrix in four angular directions; and
computing a co-occurrence matrix in horizontal direction from the computed gradient matrix in each of the four angular directions and calculating the third set of gradient matrix entropy values from the computed co-occurrence matrix in horizontal direction.
6. The method as claimed in claims 1 and 5, wherein the type of target is identified based on the standard deviation value.
7. The method as claimed in claim 5, wherein the four angular directions are selected from 0, 45, 90 and 135 degrees.
8. The method as claimed in claim 5, wherein, for each binary image, the geometrical attributes include the number of connected regions of 1-valued pixels, 0-valued pixels and the irregularities of said pixels.
9. The method as claimed in claim 1, wherein, depending upon the type of target identified, thresholds applied on each Euclidean distance computed between the second set of textural features and the third set of textural features are different than the threshold applied on the Euclidean distance computed between the first set of textural features, and
the similar target is determined to be present in the portion of the image in the window if each Euclidean distance between the first, second and third set of textural features is less than its corresponding threshold.
10. A system 10 for searching similar targets in an image, said system 10 comprising:
an image sensor 1 configured to capture/acquire an image;
a display unit 3 configured to display the image;
a target selection device 4 configured to enable an operator to select a target of interest from the image; and
a target search processor 2 in communication with said image sensor 1, said display unit 3 and said target selection device 4, said target search processor configured to:
identify the type of target selected;
extract multiple sets of textural features from the selected target of interest;
move a window of size equal to the size of the selected target of interest over the image;
extract identical sets of textural features from a portion of the image within the window moved over the image;
measure Euclidean distance between a first set of textural features extracted from the selected target of interest and from the portion of the image within the window;
identify possible areas of presence of the target of interest by applying a threshold on the Euclidean distance measured between the first set of textural features;
measure Euclidean distance between a second set of textural features extracted from the selected target of interest and the portion of the image within the window;
measure Euclidean distance between a third set of textural features extracted from the target of interest and the portion of the image within the window; and
determine the presence or absence of a similar target of interest within the portion of the image within the window based on each of the Euclidean distances between the first, second and third set of textural features.
11. The system as claimed in claim 10, wherein
the first set of textural features include co-occurrence matrix statistical features,
the second set of textural features include statistical geometrical features, and
the third set of textural features include gradient matrix entropy values.
12. The system as claimed in claims 10 and 11, wherein, to extract multiple sets of textural features from the selected target of interest, the processor 2 is configured to:
perform range filtering on the selected target of interest;
compute standard deviation of the range filter output;
compute a co-occurrence matrix for the selected target of interest in four angular directions;
extract from the computed co-occurrence matrix in each of the four angular directions, the first set of co-occurrence matrix statistical features selected from angular second moment, contrast, homogeneity, dissimilarity, entropy, sum average, sum variance, sum entropy, difference variance and difference entropy;
extract the second set of statistical geometrical features from the selected target of interest, wherein the statistical geometrical features are based on statistics of geometrical attributes of connected regions in a sequence of binary images obtained from the selected target of interest;
compute a gradient matrix in four angular directions; and
compute a co-occurrence matrix in horizontal direction from the computed gradient matrix in each of the four angular directions and calculate the third set of gradient matrix entropy values from the computed co-occurrence matrix in horizontal direction.
Dated this 1st day of April, 2019
FOR BHARAT ELECTRONICS LIMITED
(By their Agent)
D. MANOJ KUMAR (IN/PA-2110)
KRISHNA & SAURASTRI ASSOCIATES LLP
| # | Name | Date |
|---|---|---|
| 1 | 201941013040-FORM 1 [04-01-2019(online)].pdf | 2019-01-04 |
| 2 | 201941013040-PROVISIONAL SPECIFICATION [04-01-2019(online)].pdf | 2019-01-04 |
| 3 | 201941013040-DRAWINGS [04-01-2019(online)].pdf | 2019-01-04 |
| 4 | 201941013040-FORM-26 [28-06-2019(online)].pdf | 2019-06-28 |
| 5 | Correspondence by Agent _Power Of Attorney_08-07-2019.pdf | 2019-07-08 |
| 6 | 201941013040-Proof of Right (MANDATORY) [11-07-2019(online)].pdf | 2019-07-11 |
| 7 | Correspondence by Agent_Form1_22-07-2019.pdf | 2019-07-22 |
| 8 | 201941013040-COMPLETE SPECIFICATION [06-12-2019(online)].pdf | 2019-12-06 |
| 9 | 201941013040-DRAWING [06-12-2019(online)].pdf | 2019-12-06 |
| 10 | 201941013040-ENDORSEMENT BY INVENTORS [06-12-2019(online)].pdf | 2019-12-06 |
| 11 | 201941013040-FORM 3 [06-12-2019(online)].pdf | 2019-12-06 |
| 12 | 201941013040-CORRESPONDENCE-OTHERS [06-12-2019(online)].pdf | 2019-12-06 |
| 13 | 201941013040-FORM 18 [10-02-2021(online)].pdf | 2021-02-10 |
| 14 | SearchE_20-01-2022.pdf | |
| 15 | 201941013040-FER.pdf | 2022-01-27 |
| 16 | 201941013040-CLAIMS [21-07-2022(online)].pdf | 2022-07-21 |
| 17 | 201941013040-COMPLETE SPECIFICATION [21-07-2022(online)].pdf | 2022-07-21 |
| 18 | 201941013040-FER_SER_REPLY [21-07-2022(online)].pdf | 2022-07-21 |
| 19 | 201941013040-Response to office action [21-12-2022(online)].pdf | 2022-12-21 |
| 20 | 201941013040-IntimationOfGrant06-12-2023.pdf | 2023-12-06 |
| 21 | 201941013040-PatentCertificate06-12-2023.pdf | 2023-12-06 |
| 22 | 201941013040-PROOF OF ALTERATION [04-10-2024(online)].pdf | 2024-10-04 |
| 23 | 201941013040-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |