Abstract: A method of segmenting objects from a background of a depth map (24), including steps of generating (30) a plurality of vertices (V) of a mathematical graph by combining spatial coordinates of an image element of the plurality of image elements with its corresponding depth value (D), generating (34) a two-dimensional similarity matrix (K), obtained from applying at least two similarity measures to pairs of distinct vertices (Vi, Vj), generating (36) a two-dimensional edge matrix (E), and defining (38) at least one object imaged in the depth map (24) by clustering the pairs of vertices (Vi, Vj) of the plurality of vertices (V) for which the edge matrix element has a non-zero value; and an image segmentation device (10) for segmenting objects from a background of a depth map (24), configured to store at least one program code comprising converted method steps of any embodiment of the method, and wherein a processor unit (14) is configured to execute the at least one program code. (Fig. 2)
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
The Patent Rules, 2003
COMPLETE SPECIFICATION
[See Section 10 and Rule 13]
1. TITLE OF THE INVENTION
OBJECT SEGMENTATION IN DEPTH MAPS
2. APPLICANTS
(a) ifm Engineering Private Limited
(b) an Indian company,
(c) of Unit 14, Plot 59, Amchi Colony 1, N. D. A. Road, Bawdhan, Pune-411 021, Maharashtra, INDIA,
3. PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
The invention pertains to a method of segmenting objects from a background of a depth map and an image segmentation device employing such method.
BACKGROUND OF THE INVENTION
It is known to employ time-of-flight cameras for generating images comprising pixels, wherein depth information, e.g. a value for the distance of a pixel to an object or a portion of an object imaged by the pixel, is associated with each pixel. Such depth images, also known as depth maps, depict depth variations in an imaged scene and are extensively used in many applications, such as object (hand, gesture) detection and recognition.
Performance of such applications is strongly influenced by the applied method of segmentation of objects imaged in a depth map. Segmentation methods are known in the art of image processing, for instance from Fernand Meyer, "Topographic distance and watershed lines", Signal Processing 38, p. 113-125, (1994). Software modules comprising a Watershed algorithm are commercially available.
SUMMARY OF THE INVENTION
It is therefore an object of the invention to provide a robust and reliable method for object segmentation in depth maps.
In one aspect of the present invention, the object is achieved by a method of segmenting objects from a background of a depth map, presented in the following. The depth map comprises optical image data of a plurality of image elements and a depth value at each image element of the plurality of image elements. The method includes steps of
generating a plurality of vertices of a mathematical graph by combining spatial coordinates of an image element of the plurality of image elements with its corresponding depth value, wherein each vertex of the plurality of vertices is uniquely assigned to one image element of the plurality of image elements,
generating a two-dimensional similarity matrix, wherein each element of the similarity matrix is uniquely assigned to two distinct vertices of the plurality of vertices, and is
determined by combining values obtained from applying at least two similarity measures to pairs of distinct vertices of the plurality of vertices, wherein each similarity measure of the at least two similarity measures is indicative of a similarity of values of the pairs of distinct vertices of the plurality of vertices with regard to the same parameter, and wherein the at least two similarity measures are indicative of the similarity with regard to at least two distinct parameters,
generating a two-dimensional edge matrix, wherein each element of the edge matrix is uniquely assigned to a pair of two distinct vertices of the plurality of vertices, and has a non-zero value if the element of the similarity matrix that is assigned to the pair of two distinct vertices of the plurality of vertices is equal to or exceeds a pre-determined threshold value, and has a value of substantially zero otherwise, and
defining at least one object imaged in the depth map by clustering the pairs of vertices of the plurality of vertices for which the edge matrix element has a non-zero value.
The phrase "substantially zero", as used in this application, shall particularly be understood as being small in comparison to the non-zero value.
The invention is based on employing concepts of mathematical graph theory for clustering of pixels which eventually leads to object segmentation. A graph G is formed by a pair of sets (V,E), wherein V is a set consisting of the plurality of vertices, and the set E contains the edges of the graph. Pairs of distinct vertices, which a large value for the element of the similarity matrix is uniquely assigned to, are more likely to be connected by an edge of the graph than pairs of distinct vertices with a small value for their assigned similarity matrix element.
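For illustration only, such a graph G = (V, E) can be represented by a list of vertices together with a square edge matrix. The following sketch is a hypothetical NumPy representation, not mandated by the invention; it shows four vertices of the form [x, y, D], of which the first three are mutually similar and therefore connected:

```python
import numpy as np

# Four hypothetical vertices of the form [x, y, D]; vertices 0-2 are
# spatially close with similar depth values, vertex 3 is remote in both.
V = np.array([
    [0.0, 0.0, 1.2],
    [1.0, 0.0, 1.3],
    [0.0, 1.0, 1.2],
    [5.0, 5.0, 4.0],
])

# Edge matrix E: a non-zero entry connects a pair of sufficiently
# similar vertices; vertices 0-2 form one connected component (one
# object), the isolated vertex 3 another.
E = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
])
```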
One advantage of the method lies in an improved immunity to noise. Another advantage lies in that the method implicitly finds out a number of segments present in the depth map, so that the number of segments in the depth map is not required as an input parameter.
It is further advantageous that algorithms already known in the field of graph theory may be employed for efficient object segmentation.
Preferably, the method steps to be conducted are converted into a computer program code that is implementable in a digital memory unit of a computer device and is executable by a processor unit of the computer device. In this way, convenient times for execution of the method can be accomplished.
In a preferred embodiment, the method further comprises a preceding step of retrieving the depth map from a time-of-flight camera. Such cameras are commercially
available, for instance time-of-flight cameras comprising at least one photonic mixer device (PMD), so that depth maps can readily be provided.
In another preferred embodiment, the step of generating the two-dimensional similarity matrix includes that one similarity measure of the at least two similarity measures is indicative of the similarity of values of the two distinct vertices of the plurality of vertices with regard to a parameter given by a spatial distance between the two image elements assigned to the two distinct vertices. In this way, the higher probability of two distinct vertices that are spatially close to be part of the same object of the depth map can be exploited for the step of clustering of pairs of vertices.
In yet another preferred embodiment, the step of generating the two-dimensional similarity matrix includes that one similarity measure of the at least two similarity measures is indicative of the similarity of values of the two distinct vertices of the plurality of vertices with regard to a parameter given by a difference between the depth values of the two image elements assigned to the two distinct vertices. In this way, the higher probability of two distinct vertices having similar depth values to be part of the same object of the depth map can be exploited for the step of clustering of pairs of vertices.
In one embodiment, the step of determining elements of the two-dimensional similarity matrix includes applying at least one similarity measure of the at least two similarity measures that comprises a Gaussian kernel. Gaussian kernels are known to quickly fall off with increasing distance to the position of the center of the peak. By suitably selecting parameters of the Gaussian kernel, the similarity measure can readily be adjusted according to the nature of the object of the depth map to be segmented.
In one embodiment, the method further comprises a step of adjusting a spread of the Gaussian kernel in dependence of a priori knowledge on the object to be segmented, wherein the step is carried out prior to determining the elements of the similarity matrix. In this way, a desired result for segmentation of an object of the depth map can quickly be accomplished.
Preferably, the step of determining elements of the two-dimensional similarity matrix includes multiplying the at least two similarity measures. By that, the higher probability of two distinct vertices that are similar in two distinct ways to be part of the same object of the depth map can be exploited for the step of clustering of pairs of vertices.
In a preferred embodiment, the step of clustering pairs of vertices of the plurality of vertices includes applying a graph-connected component algorithm to the edge matrix. The graph-connected component algorithm iteratively finds pairs of vertices which
are directly or indirectly, i.e. through a path, connected to each other and provides these pairs of vertices as an output. Graph-connected component algorithms are known from the field of graph manipulation and are described for instance in John Hopcroft and Robert Tarjan, "Efficient Algorithms for Graph Manipulation", Communications of the ACM 16, p. 372-378 (1973), which hereby shall be incorporated by reference. One advantage of this embodiment lies in that the number of connected vertices need not be specified initially.
In another preferred embodiment, the step of generating the two-dimensional similarity matrix comprises sampling vertices from the depth map in a uniform sampling interval and determining elements of the similarity matrix for the sampled vertices only.
Further, the step of clustering the pairs of vertices of the plurality of vertices comprises applying a graph-connected component algorithm and labeling the remaining vertices of the plurality of vertices using nearest neighbor criteria.
In this way, the step of generating the two-dimensional similarity matrix can be carried out in an efficient way by reducing a number of vertices of the mathematical graph.
Advantageously, a size of the uniform sampling interval is adjusted based on a size of the object to be segmented. For large objects, the depth map can be sampled more sparsely, resulting in a very efficient segmentation method.
Preferably, the labeling of the remaining vertices is carried out by employing a median filter.
It is another object of the invention to provide an image segmentation device for segmenting objects from a background of the depth map, including at least one digital memory unit, at least one processor unit, and
a data link for exchanging data between the at least one digital memory unit and the at least one processor unit.
The digital memory unit is configured to store at least one program code comprising converted steps of any embodiment of the method disclosed herein, and the processor unit is configured to execute the at least one program code.
In yet another aspect of the present invention, a software module is provided for carrying out steps of any embodiment of the disclosed method of segmenting objects from a background of a depth map. The method steps to be conducted are converted into a program code of the software module, wherein the program code is implementable in a digital memory unit and is executable by a processor unit.
The software module can enable a robust and reliable execution of the method and can allow for a fast modification of method steps.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter. Such embodiments do not necessarily represent the full scope of the invention, however, and reference is made therefore to the claims herein for interpreting the scope of the invention.
In the drawings:
Fig. 1 schematically shows an image segmentation device in accordance with the invention for segmenting objects from a background of a depth map,
Fig. 2 is a flowchart of an embodiment of a method in accordance with the invention of segmenting objects from the background of the depth map pursuant to Fig. 1,
Fig. 3(a) to 3(f) illustrate results of a performance test of an embodiment of the method in accordance with the invention, and
Figs. 4(a) to 4(d) show results of another performance test of the embodiment of the method pursuant to Fig. 3.
DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 schematically shows an image segmentation device 10 in accordance with the invention for segmenting objects from a background of a depth map 24, and a time-of-flight camera 20 comprising a photonic mixer device (PMD) with a plurality of image elements formed by pixels. The image segmentation device 10 and the time-of-flight camera 20 are connected by a data transmission line 22 for transmitting depth map data acquired by the time-of-flight camera 20.
The image segmentation device 10 includes a digital memory unit 12 designed as a random access memory (RAM), a processor unit 14, and several data links 18 for exchanging data between the digital memory unit 12 and the processor unit 14.
In the following, an embodiment of a method of segmenting objects from a background of a depth map 24 is described. A flow chart of the embodiment of the method is given in Fig. 2. In preparation of operating the image segmentation device 10, it shall be understood that all involved units and devices are in an operational state and configured as illustrated in Fig. 1.
In order to be able to carry out the embodiment of the method of segmenting objects, the image segmentation device 10 comprises a software module 16 that is implemented in the digital memory unit 12. The method steps 26-38 to be conducted are converted into a program code of the software module 16 and are executable by the processor unit 14 of the image segmentation device 10.
After the depth map 24 has been acquired by the time-of-flight camera 20, data representing the depth map 24 are transferred upon request by the image segmentation device 10 via the data transmission line 22 to the image segmentation device 10 in a preceding step 26 of the method (Fig. 2). The data representing the depth map 24 include the optical image data of the plurality of pixels and the depth values D of each pixel of the plurality of pixels. Data concerning positions P of the pixels of the plurality of pixels in the form of spatial coordinates may be transferred together with the optical image data and the depth values D, or they may be stored in the digital memory unit 12 or another permanent digital memory unit of the image segmentation device 10.
In a first step 28 of the method, the data representing the depth map 24 are uniformly sampled with a constant sampling interval.
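By way of illustration, such uniform sampling might be carried out as follows, assuming the depth map 24 is available as a two-dimensional NumPy array; the function name sample_depth_map and the parameter interval are hypothetical:

```python
import numpy as np

def sample_depth_map(depth_map, interval):
    # Uniformly sample the depth map with a constant sampling interval;
    # for large objects a larger interval (sparser sampling) suffices.
    ys, xs = np.mgrid[0:depth_map.shape[0]:interval,
                      0:depth_map.shape[1]:interval]
    return xs.ravel(), ys.ravel(), depth_map[ys, xs].ravel()
```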
In a next step 30 then, a plurality of vertices V of a mathematical graph is generated from the sampled data by combining spatial coordinates of an image element of the plurality of image elements with its corresponding depth value Di, wherein each vertex Vi of the plurality of vertices V is uniquely assigned to one image element of the plurality of image elements.
The vertex Vi is thus formed according to

Vi = [xi, yi, Di]

wherein xi and yi denote the x-coordinate and the y-coordinate of the image element i, i.e. the pixel i, respectively, and Di represents the depth value at the image element i. In this way, the number of vertices Vi is equal to the number of image elements sampled from the depth map 24.
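A minimal sketch of step 30 under the same assumptions, column-stacking the sampled coordinates and depth values so that row i of the result is the vertex Vi:

```python
import numpy as np

def build_vertices(xs, ys, depths):
    # Step 30: each vertex Vi = [xi, yi, Di] combines the spatial
    # coordinates of one sampled image element with its depth value Di.
    return np.column_stack([xs, ys, depths])
```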
Next, a two-dimensional similarity matrix K is generated. Each element of the similarity matrix K is uniquely assigned to two distinct vertices Vi, Vj (i ≠ j) of the plurality of vertices V and is determined by combining values obtained from applying two similarity measures to all potential pairs of distinct vertices Vi, Vj of the plurality of vertices V. As will be presented in the following, the two similarity measures are indicative of the similarity with regard to two distinct parameters.
The first applied similarity measure space_similarity(i,j) is indicative of a similarity of pairs of distinct vertices Vi, Vj with regard to a same parameter given by a spatial distance between the two image elements assigned to the two distinct vertices Vi, Vj and includes a Gaussian kernel:

space_similarity(i,j) = exp(-||Pi - Pj||² / σ1²) (1)

Herein, Pi and Pj indicate the positions of vertices Vi and Vj, respectively, expressed by their x-coordinate and y-coordinate, ||x|| denotes the Euclidean norm of x, and σ1 denotes a spread of the Gaussian kernel over space. A high value of the similarity measure space_similarity(i,j) indicates that the image elements are spatially close.
The second applied similarity measure depth_similarity(i,j) is indicative of a similarity of pairs of distinct vertices Vi, Vj (i ≠ j) with regard to a same parameter given by a difference between the depth values Di, Dj of the two image elements assigned to the two distinct vertices Vi, Vj and also includes a Gaussian kernel:

depth_similarity(i,j) = exp(-(Di - Dj)² / σ2²) (2)

Herein, σ2 denotes a spread of the Gaussian kernel over the depth value D. A high value of the similarity measure depth_similarity(i,j) indicates that the image elements have similar depth values Di, Dj. Values of the two similarity measures space_similarity(i,j) and depth_similarity(i,j) are calculated for all potential pairs of distinct vertices Vi, Vj of the plurality of vertices V in the next step 32 of the method.
Values for the spreads σ1, σ2 of the Gaussian kernels are adjusted prior to determining the elements K(i,j) of the similarity matrix K in dependence of a priori knowledge on the object to be segmented.
For instance, if a user intends to segment a smooth object, the value of σ2 should be kept small while the value of σ1 can be set to a principally arbitrary, but still reasonable value. If the user intends to segment objects which are spatially distinct and are at approximately the same depth, then the value of σ1 should be kept small with σ2 set to an arbitrary, still reasonable value.
In the next step 34, elements K(i,j) of the two-dimensional similarity matrix K are then obtained by multiplying the value of the first similarity measure and the value of the second similarity measure, both values corresponding to the same pair of distinct vertices Vi, Vj:

K(i,j) = space_similarity(i,j) * depth_similarity(i,j) (3)
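As an illustration of steps 32 and 34, both measures and their product can be evaluated for all potential vertex pairs at once. The following sketch assumes the kernel form of equations (1) and (2) and the vertex array built above; NumPy, SciPy and the function name similarity_matrix are assumptions of this sketch, not part of the invention:

```python
import numpy as np
from scipy.spatial.distance import cdist

def similarity_matrix(V, sigma1, sigma2):
    P = V[:, :2]   # positions Pi = (xi, yi)
    D = V[:, 2:]   # depth values Di, kept two-dimensional for cdist
    # Equations (1) and (2): Gaussian kernels over the spatial distance
    # and over the depth difference of every pair of vertices.
    space_sim = np.exp(-cdist(P, P, 'sqeuclidean') / sigma1 ** 2)
    depth_sim = np.exp(-cdist(D, D, 'sqeuclidean') / sigma2 ** 2)
    # Equation (3): element-wise product of the two similarity measures.
    return space_sim * depth_sim
```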
In the next step 36 of the method, a threshold criterion is applied to the similarity matrix elements K(i,j) to generate a two-dimensional edge matrix E. Each element E(i,j) of the edge matrix E is uniquely assigned to a pair of two distinct vertices Vi, Vj of the plurality of vertices V, and has a non-zero value if the element K(i,j) of the similarity matrix K is equal to or exceeds a pre-determined threshold value t, and has a value of zero otherwise:

E(i,j) = 1 if K(i,j) ≥ t, and E(i,j) = 0 otherwise (4)
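A minimal sketch of the threshold criterion of step 36, continuing the example above; the function name edge_matrix is hypothetical:

```python
import numpy as np

def edge_matrix(K, t):
    # Equation (4): an edge connects two vertices where K(i,j) is equal
    # to or exceeds the threshold t; self-edges are suppressed.
    E = (K >= t).astype(np.uint8)
    np.fill_diagonal(E, 0)
    return E
```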
In the next step 38 of the method, a graph-connected component algorithm, for instance an algorithm described in the above-mentioned paper by John Hopcroft and Robert Tarjan, is applied to the edge matrix E. The graph-connected component algorithm iteratively finds and labels the vertices Vi, Vj that are directly or indirectly (i.e. through a path) connected to each other, and outputs all such connected components.
In a final step 40 of the method, median filtering is used to assign the remaining, unlabeled vertices to a nearest cluster label. The clustering finally leads to defining objects imaged by the depth map 24.
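Steps 38 and 40 might then be sketched as follows, with the routine scipy.sparse.csgraph.connected_components standing in for the Hopcroft-Tarjan algorithm cited above, and a grid-block assignment serving as the nearest-neighbor criterion; grid_shape, full_shape and interval are hypothetical parameters of this sketch:

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.sparse.csgraph import connected_components

def segment(E, grid_shape, full_shape, interval):
    # Step 38: vertices connected directly or through a path receive the
    # same cluster label; the number of clusters is found implicitly.
    n_clusters, labels = connected_components(E, directed=False)
    grid = labels.reshape(grid_shape)
    # Nearest-neighbor labeling: every unsampled pixel inherits the
    # label of the closest sampled vertex (here: of its grid block).
    full = np.kron(grid, np.ones((interval, interval), dtype=labels.dtype))
    full = full[:full_shape[0], :full_shape[1]]
    # Step 40: median filtering assigns stray pixels at block borders to
    # the prevailing nearby cluster label.
    return median_filter(full, size=interval)
```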
Figs. 3(a) to 3(f) illustrate results of a performance test of an embodiment of the method in accordance with the invention. The method is applied to depth maps 241, 242, 243 of a scenario including several planes where there is gradual depth variation within a plane and sharp depth variations across different planes.
Figs. 3(a), 3(c) and 3(e) show depth maps 241, 242, 243 taken from the "Middlebury" (Middlebury College, Vermont (USA)) dataset (D. Scharstein and R. Szeliski, "A taxonomy and evaluation of dense two-frame stereo correspondence algorithms", International Journal of Computer Vision 47: p. 7-42 (2002)). As can be noticed, neighboring points that belong to the same plane have similar depth values. Therefore, to segment planes in this scenario, the spread of the Gaussian kernel σ2 has been set to a small value. There is no stringent constraint on a value for σ1, and it can be set to an arbitrary but still reasonable value. In this embodiment, σ1 has been set to 30 and σ2 to 5. However, very similar results were obtained when values of σ1 were varied in the range between 10 and 100.
Figs. 3(b), 3(d) and 3(f) show the corresponding plane segmentation output. Each plane is represented with a unique gray scale. It can be noted that, as a graph-connected component does not necessarily need a direct edge between a pair of pixels (vertices) for them to be labeled in the same cluster, pixels which are spatially distinct are also assigned to the same plane.
Figs. 4(a) to 4(d) depict results of another performance test of the embodiment of the method pursuant to Fig. 3. In this test, a zero mean Gaussian noise of standard deviation 10 was manually added to a depth map 244. A watershed segmentation algorithm from MATLAB® has been applied to the depth map 244 for comparison.
The depth map 244 contains a single plane and it is therefore expected that a plane segmentation approach assigns an identical label to every pixel. For the noise-free depth map 244 shown in Fig. 4(a), the watershed segmentation algorithm and the method in accordance with the invention performed equally well and, as expected, assigned a unique label to all pixels.
Fig. 4(b) shows the input depth map 244 with noise added. Fig. 4(c) shows the segmentation result of the watershed algorithm. Each segment is represented with a unique gray scale. It can be seen that, due to the presence of noise, the watershed algorithm wrongly divides the image into a large number of segments. Fig. 4(d) shows the result of applying the method in accordance with the invention, which still assigns an identical label to all the pixels of the depth map. As a consequence of the graph-based method, which performs global segmentation based on local similarity measures, a higher level of robustness in the presence of noise is achieved compared to the watershed approach. Also, the method in accordance with the invention implicitly finds the number of segments present in the depth map data, as opposed to methods such as region growing and K-means clustering, where the number of segments is required as an input.
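For illustration, the noise condition of this test can be reproduced as follows; the synthetic ramp standing in for depth map 244 is an assumption of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen arbitrarily

# A single smooth plane with gradual depth variation, standing in for
# the depth map 244 of the test.
depth_map = np.tile(np.linspace(100.0, 150.0, 256), (256, 1))

# Zero-mean Gaussian noise of standard deviation 10, as added in the
# second part of the test (Fig. 4(b)).
noisy_map = depth_map + rng.normal(loc=0.0, scale=10.0, size=depth_map.shape)
```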
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not
indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
REFERENCE SYMBOL LIST
10 image segmentation device
12 digital memory unit
14 processor unit
16 software module
18 data link
20 time-of-flight camera
22 data transmission line
24 depth map
26 step of retrieving data
28 step of uniformly sampling
30 step of generating vertices
32 step of calculating space and depth similarity measures
34 step of generating similarity matrix
36 step of applying threshold criterion
38 step of applying graph-connected component algorithm
40 step of using median filtering
D depth value
E edge matrix
K similarity matrix
P position of image element
t threshold value
Vi vertex i
We claim:
1. A method of segmenting objects from a background of a depth map (24), the depth
map (24) comprising optical image data of a plurality of image elements and a depth
value (D) at each image element of the plurality of image elements, the method including
steps of
generating (30) a plurality of vertices (V) of a mathematical graph by combining spatial coordinates of an image element of the plurality of image elements with its corresponding depth value (D), wherein each vertex (Vi) of the plurality of vertices (V) is uniquely assigned to one image element of the plurality of image elements,
generating (34) a two-dimensional similarity matrix (K), wherein each element (K(i,j)) of the similarity matrix (K) is uniquely assigned to two distinct vertices (Vi, Vj) of the plurality of vertices (V), and is determined by combining values obtained from applying at least two similarity measures to pairs of distinct vertices (Vi, Vj) of the plurality of vertices (V), wherein each similarity measure of the at least two similarity measures is indicative of a similarity of values of the pairs of distinct vertices (Vi, Vj) of the plurality of vertices (V) with regard to the same parameter, and wherein the at least two similarity measures are indicative of the similarity with regard to at least two distinct parameters,
generating (36) a two-dimensional edge matrix (E), wherein each element of the edge matrix is uniquely assigned to a pair of two distinct vertices (Vi, Vj) of the plurality of vertices (V), and has a non-zero value if the element of the similarity matrix (K) is equal to or exceeds a pre-determined threshold value (t), and has a value of substantially zero otherwise, and
defining (38) at least one object imaged in the depth map (24) by clustering the pairs of vertices (Vi, Vj) of the plurality of vertices (V) for which the edge matrix element has a non-zero value.
2. The method as claimed in claim 1, wherein the method steps to be conducted are converted into a computer program code that is implementable in a digital memory unit (12) of a computer device (10) and is executable by a processor unit (14) of the computer device (10).
3. The method as claimed in claim 1 or 2, further comprising a preceding step (26) of retrieving the depth map (24) from a time-of-flight camera (20).
4. The method as claimed in any one of the preceding claims, wherein the step of generating (34) the two-dimensional similarity matrix (K) includes that one similarity measure of the at least two similarity measures is indicative of the similarity of values of the two distinct vertices (Vi, Vj) of the plurality of vertices (V) with regard to a parameter given by a spatial distance between the two image elements assigned to the two distinct vertices (Vi, Vj).
5. The method as claimed in any one of the preceding claims, wherein the step of generating (34) the two-dimensional similarity matrix (K) includes that one similarity measure of the at least two similarity measures is indicative of the similarity of values of the two distinct vertices (Vi, Vj) of the plurality of vertices (V) with regard to a parameter given by a difference between the depth values (Di, Dj) of the two image elements assigned to the two distinct vertices (Vi, Vj).
6. The method as claimed in any one of the preceding claims, wherein the step of determining (34) elements (K(i,j)) of the two-dimensional similarity matrix (K) includes applying at least one similarity measure of the at least two similarity measures that comprises a Gaussian kernel.
7. The method as claimed in claim 6, further comprising a step of adjusting a spread of the Gaussian kernel in dependence of a priori knowledge on the object to be segmented, wherein the step is carried out prior to the step of determining (34) the elements (K(i,j)) of the similarity matrix (K).
8. The method as claimed in any one of the preceding claims, wherein the step of determining (34) elements (K(i,j)) of the two-dimensional similarity matrix (K) includes multiplying the at least two similarity measures.
9. The method as claimed in any one of the preceding claims, wherein the step of clustering (38) pairs of vertices (Vi, Vj) of the plurality of vertices (V) includes applying a graph-connected component algorithm to the edge matrix (E).
10. The method as claimed in any one of the preceding claims, wherein
the step of generating (34) the two-dimensional similarity matrix (K) comprises sampling vertices (Vi, Vj) from the depth map (24) in a uniform sampling interval and determining elements (K(i,j)) of the similarity matrix (K) for the sampled vertices only, and wherein
the step of clustering (38) the pairs of vertices (Vi, Vj) of the plurality of vertices (V) comprises applying a graph-connected component algorithm and labeling the remaining vertices of the plurality of vertices (V) using nearest neighbor criteria.
11. An image segmentation device (10) for segmenting objects from a background of a
depth map (24), including
at least one digital memory unit (12),
at least one processor unit (14), and
a data link (18) for exchanging data between the at least one digital memory unit (12)
and the at least one processor unit (14), wherein the digital memory unit (12) is configured to store at least one program code comprising converted method steps of the method as claimed in any one of claims 1 to 10, and wherein the processor unit (14) is configured to execute the at least one program code.
12. A software module (16) for carrying out the method as claimed in any one of claims 1
to 10, wherein the method steps to be conducted are converted into a program code of the
software module (16), and wherein the program code is implementable in a digital memory
unit (12) and is executable by a processor unit (14).