Abstract: A system and apparatus for evaluating the reliability of a watermarking technique for a video sequence, said system and apparatus comprising: a) a calibrating means adapted to calibrate said system and apparatus using a test original video sequence and a test attacked video sequence in compliance with predetermined degrees of quality; b) an attacking means adapted to attack an embedded watermark in said video sequence, said attack selected from a group of attacks; and c) an evaluating means including a comparator means for comparing the test original video sequence with the test attacked video sequence, and the watermark in the test original video sequence with the watermark in the test attacked video sequence, to evaluate the reliability of a watermarking technique and give a reliability score.
FORM - 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
PROVISIONAL
Specification
(See section 10 and rule 13)
WATERMARKING IN A VIDEO CODING DECODING
TATA CONSULTANCY SERVICES LIMITED
an Indian Company
of Bombay House, 24, Homi Mody Street, Mumbai 400 001,
Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION.
Field of invention:
This invention relates to computer systems.
In particular, this invention relates to computer systems in both wireless and wireline communication systems.
Still particularly, this invention relates to a hardware system of decoding of LDPC-coded bitstream in both wireless and wireline communication systems.
In general, the system in accordance with this invention can be used in any decoder for communication systems where regular LDPC codes have been used. The decoder system exploits the projective geometry regularity in a certain subset of Tanner graphs. However, following the industry practice of joint code-decoder design, the system is designed specifically with LDPC codes arising out of projective planes in mind. The decoder is fully generic in the sense that it can be used to decode codes of any length, provided the length is of the type (p^s + 1), where p is a prime number and s is any positive integer.
Background of invention:
Introduction:
It is known that digital data transmission suffers from channel influence such
as:
1. Severe (multipath) fading in terrestrial mobile radio communications.
Summary of the Invention:
In accordance with this invention, a novel method and apparatus are presented to measure and evaluate an attack by comparing the attacked video with the original watermarked video stream.
Usually any binary image or any text message is used as the embedding information during watermarking. In the method in accordance with this invention, the inserted binary image and/or text message is compared with the retrieved binary image and/or text message using multifactorial methods. The obtained score is compared with the results obtained by the Mean Opinion Score (MOS), which is purely based on human vision psychology (HVS), and finally a fuzzy membership value is assigned to each class of watermark.
The proliferation of digital storage and Internet-based multimedia applications threatens the video content industry through illegal and unauthorized copying of multimedia data. This increases the need for devising copyright protection and authentication measures. As a consequence of these requirements, a well-defined watermarking attack evaluation technique is the need of the day.
Any watermarking scheme can be evaluated by its performance measured in terms of its complexity and robustness against attacks. An attack on a watermarking system can be defined as a technique to remove or change the hidden data in the video bitstream. Hence, in any watermarking evaluation technique, it is essential to concentrate on attacks on the watermarking scheme by simulating the process of an end user trying to remove or destroy the hidden information embedded into the video stream. Moreover, the evaluation method should also provide a measure of goodness of the watermarking scheme in terms of its robustness and the resultant video quality after the attack.
In the system in accordance with this invention, a measure is devised that can be used to evaluate the attack as well as the watermarking scheme. The watermarking scheme and the attacking methods are evaluated in three ways.
First, a comparison is done of the attacked stream with the watermarked stream. A total of seventeen-odd parameters are used to measure the goodness of the video quality, which are then unified using a multifactorial approach. To judge the attacked streams, users, typically twenty (15 men and 5 women), are asked to rate the attacked video quality into 4 classes as per their perception.
Secondly, the retrieved and embedded binary images and texts are compared using statistical parameters.
Finally, the goodness of the method is assessed in terms of the observations made.
The invention therefore envisages a method of using different pixel-based metrics (10 parameters) for finding the video distortion.
Typically, the method involves using different feature-based scores (3 for image and 2 for text) for checking the retrieved text or image quality.
The method envisages evaluating the goodness of an attack in three different ways, viz. comparison of the attacked stream against the original watermarked stream, comparison of the retrieved binary image against the original binary image, and comparison of the retrieved text against the original text.
The method therefore uses a multifactorial approach to unify the measures of goodness from these three comparisons, arriving at a unified set of goodness values.
The method further involves using the mean opinion score to classify the unified results into adjective factors depicting the perceptual quality.
In accordance with a preferred embodiment of the invention, the method involves combining any of the aforesaid steps with one another to arrive at a conclusion for the overall measure of goodness.
The method of this invention can specifically be used for evaluation in the context of an H.264 based compressed-domain watermarking scheme.
Further, a similar evaluation methodology can be used for any video-watermarking scheme applied in the context of any compressed or uncompressed video.
The invention will now be described with reference to the accompanying drawings, in which
Figure 1 shows Table 1: List of Attacks;
Figure 2 shows Table 2: Decision making process based on different
parameters;
Figure 3 shows Table 3: Values in Video Quality Matrix After Attack;
Figure 4 shows Table 4: Evaluation of Video Quality After Attack;
Figure 5 shows Table 5: Evaluation of retrieved binary image against
original binary image;
Figure 6 shows Table 6: Evaluation of retrieved Text against original Text;
and
Figure 7 shows Table 7: Conclusion of the evaluation.
Detailed Description of Invention:
An embedded watermark message is normally either textual information or a binary image. Usually a logo is embedded using image watermarking, and information like a name, IP address, time stamp, etc. is embedded using text watermarking.
An attack can be evaluated by how well it removes the watermarked data without hampering the original video data. So the goodness of an attack should be characterized by:
1. Distortion of the video sequence after the attack.
2. Error in the embedded information retrieved by the watermark detector after the attack.
Moreover, the robustness of the watermarking method can also be judged by comparing the original video data with the watermarked video data. The metrics for judging the goodness of a watermarking scheme are listed below:
Pixel Based Metrics for video:
A total of 10 measures to judge the video quality are as follows:
Difference Distortion Metrics:
1. Average Absolute Difference (AAD)
2. Mean Square Error (MSE)
3. Normalised Mean Square Error (NMSE)
4. Laplacian Mean Square Error (LMSE)
5. Signal to Noise Ratio (SNR)
6. Peak Signal to Noise Ratio (PSNR)
7. Image Fidelity (IF)
Others:
8. Structural Content (SC)
9. Global Sigma Signal to Noise Ratio (GSSNR)
10. Histogram Similarity (HS)
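As a sketch of how such frame-level metrics are computed, three of the difference distortion metrics above (AAD, MSE, PSNR) are shown below in their standard formulations; the function names and the 8-bit peak value of 255 are assumptions for this illustration, not definitions taken from the specification:

```python
import numpy as np

def aad(orig, test):
    """Average Absolute Difference between two frames."""
    return float(np.mean(np.abs(orig.astype(float) - test.astype(float))))

def mse(orig, test):
    """Mean Square Error between two frames."""
    diff = orig.astype(float) - test.astype(float)
    return float(np.mean(diff * diff))

def psnr(orig, test, peak=255.0):
    """Peak Signal to Noise Ratio in dB (infinite for identical frames)."""
    m = mse(orig, test)
    return float("inf") if m == 0 else 10.0 * np.log10(peak * peak / m)

# A 4x4 grey frame with one distorted pixel.
frame = np.full((4, 4), 128, dtype=np.uint8)
noisy = frame.copy()
noisy[0, 0] = 138
print(aad(frame, noisy))  # 10/16 = 0.625
print(mse(frame, noisy))  # 100/16 = 6.25
```

For a video, such per-frame values would be averaged over the sequence before being scored.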
There are other parameters like Maximum Difference, Norm, Average Absolute Difference, L-Norm, Normalised Cross-Correlation, Correlation Quality, Sigma Signal to Noise Ratio, Sigma to Error Ratio, etc., which are used as metrics for distortion in the image domain, but it was found that they do not convey much information as far as video quality is concerned.
Feature based score for retrieved image:
The retrieved image/message can also be judged to be good or bad by checking some of its features. The features that can be tested are listed below:
Check for images:
1. Centroid deviation of 1s and 0s - In most cases binary images are used as the watermark image. The checking algorithm is as follows:
Compute the centroid of the 1s and 0s for the image that has been used as the watermark,
Compute the same for the retrieved image,
Now find the deviation for the black pixels and the white pixels.
2. Run length feature - If an image of some text is inserted as the watermark, the run length of black/white pixels is a feature by which it can be recognized. The checking algorithm is as follows:
Compute the run length of white/black (0/1) for each row of the original image,
Compute the run length of white/black (0/1) for each row of the retrieved image.
3. Crossing count feature - If an image of some text is inserted as the watermark, the number of 0-to-1 or 1-to-0 transitions is an interesting feature. The checking algorithm can be:
Compute the 0-to-1 and 1-to-0 transitions for each row of the original image,
Compute the 0-to-1 and 1-to-0 transitions for each row of the retrieved image.
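The three image checks above can be sketched as follows, assuming the binary images are NumPy arrays of 0/1 pixels; the helper names are hypothetical and not taken from the specification:

```python
import numpy as np

def centroid(img, value):
    """Centroid (row, col) of all pixels equal to `value`."""
    rows, cols = np.nonzero(img == value)
    return rows.mean(), cols.mean()

def centroid_deviation(orig, retr, value):
    """Euclidean distance between the centroids of `value` pixels."""
    r0, c0 = centroid(orig, value)
    r1, c1 = centroid(retr, value)
    return ((r0 - r1) ** 2 + (c0 - c1) ** 2) ** 0.5

def run_lengths(row):
    """Lengths of consecutive runs of equal pixels in one row."""
    boundaries = np.nonzero(np.diff(row))[0] + 1
    edges = np.concatenate(([0], boundaries, [len(row)]))
    return np.diff(edges).tolist()

def crossing_count(row):
    """Number of 0-to-1 plus 1-to-0 transitions in one row."""
    return int(np.sum(row[1:] != row[:-1]))

row = np.array([0, 0, 1, 1, 1, 0, 1])
print(run_lengths(row))     # [2, 3, 1, 1]
print(crossing_count(row))  # 3
```

Comparing these per-row features between the original and retrieved watermark images gives the deviations used in the evaluation.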
Check for text message:
1. Hamming distance between texts - In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols differ. It measures the number of substitutions required to change one string into the other, or the number of errors that transformed one string into the other.
2. Levenshtein distance - In information theory and computer science, the Levenshtein distance or edit distance between two strings is given by the minimum number of operations needed to transform one string into the other, where an operation is an insertion, deletion, or substitution of a single character. It is named after Vladimir Levenshtein, who considered this distance in 1965. It is useful in applications that need to determine how similar two strings are, such as spell checkers. It can be considered a generalization of the Hamming distance, which is used for strings of the same length and only considers substitution edits. There are also further generalizations of the Levenshtein distance that consider, for example, exchanging two characters as an operation, like in the Damerau-Levenshtein distance algorithm.
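Both distances can be sketched in a few lines; the implementation below is a standard dynamic-programming formulation, shown for illustration rather than reproduced from the specification:

```python
def hamming(a, b):
    """Hamming distance: differing positions of two equal-length strings."""
    if len(a) != len(b):
        raise ValueError("Hamming distance needs equal-length strings")
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    """Levenshtein (edit) distance via row-by-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

print(hamming("karolin", "kathrin"))     # 3
print(levenshtein("kitten", "sitting"))  # 3
```

Note how the Levenshtein distance handles strings of unequal length, which the Hamming distance cannot.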
Method of Evaluation:
A single-point decision can be arrived at from the feature values described above. For the analysis, all the well-known attacks are considered, the list of which is given in Table 1.
There are three different ways of evaluation -
1. Evaluation of attacked stream against original watermarked stream
2. Evaluation of retrieved binary image against original binary image
3. Evaluation of retrieved Text against original Text
In the following sections we describe each of the evaluation methods in detail and then try to derive a single measure of goodness based on the results of the evaluation through these three methods.
Evaluation of attacked stream against original watermarked stream: In this scheme, each frame of the attacked video sequence is evaluated against the corresponding frame of the watermarked video sequence. Each frame is judged using 10 different parameters (AAD, GSSNR, LMSE, MSE, PSNR, HS, IF, NMSE, SC, SNR). But these parameters by themselves give mere values from which it is not easy to conclude anything. Hence there is a need for a multifactorial formulation where the different parameters are combined to make a decision.
The evaluation method of assigning fuzzy value based on multifactorial analysis is given in detail below -
1. Compare identical image files to get these parameter values and treat this data set as best case data.
2. Compare two completely different image files to get the same parameter values and treat this data set as worst-case data.
3. Compare the original and a compressed-decompressed bitstream and get the values. We treat this data set as the average case.
4. These three sets of parameters are used as benchmark figures.
5. All parameters are assigned values between 0 (for the worst) and 5 (for the best) based on these observed values.
6. From these three sets of values it is observed that some parameter values (like AAD, GSSNR, LMSE, MSE, PSNR) vary largely across the three types of cases, while the other values do not vary so abruptly.
7. In this method the formula VALUE = (AAD + GSSNR + LMSE + MSE + PSNR) * 3 + HS + IF + NMSE + SC + SNR is used, where the multiple factors are combined into a single value using the multifactorial approach; each term denotes the 0-5 score assigned to the corresponding parameter in step 5.
8. 20 users (15 men and 5 women) are requested to judge the attacked and the original watermarked video sequences based on their perception. This judgement is purely based on human vision psychology (HVS). All these opinions are summed up in a Mean Opinion Score (MOS).
9. This test is performed on the basis of 14 test streams.
10. A fuzzy value is assigned to the parameter "Cqual" based on VALUE so that it matches the result obtained from HVS. The method used is:
IF (VALUE >= 90), Cqual = "Excellent",
IF (VALUE >= 80), Cqual = "Good",
IF (VALUE >= 75), Cqual = "Average",
IF (VALUE >= 70), Cqual = "Bad",
ELSE Cqual = "Poor"
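Steps 7 and 10 can be sketched together as follows, assuming each parameter has already been assigned its 0-5 score from step 5; the function names are hypothetical:

```python
def unified_value(scores):
    """Multifactorial VALUE from per-parameter scores in [0, 5].

    `scores` maps each parameter name to its assigned 0-5 score; the five
    strongly varying parameters are weighted by 3, as in the formula above.
    """
    heavy = ("AAD", "GSSNR", "LMSE", "MSE", "PSNR")
    light = ("HS", "IF", "NMSE", "SC", "SNR")
    return sum(scores[p] for p in heavy) * 3 + sum(scores[p] for p in light)

def cqual(value):
    """Fuzzy quality class for VALUE (maximum 100 when every score is 5)."""
    if value >= 90: return "Excellent"
    if value >= 80: return "Good"
    if value >= 75: return "Average"
    if value >= 70: return "Bad"
    return "Poor"

scores = {p: 5 for p in ("AAD", "GSSNR", "LMSE", "MSE", "PSNR",
                         "HS", "IF", "NMSE", "SC", "SNR")}
v = unified_value(scores)
print(v, cqual(v))  # 100 Excellent
```

With the 3x weighting, VALUE spans 0 to 100, which matches the thresholds of the fuzzy assignment in step 10.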
Evaluation of retrieved binary image against original binary image: In this scheme each frame of the attacked video sequence is evaluated against the corresponding frame of the watermarked video sequence. Each frame is judged using parameters like bit error, deviation of the centroid and difference in crossing count. The method is given in detail below:
1. Compute the Euclidean distance (d) between the centroids of the black pixels of the retrieved and original binary images.
2. If the resolution of the embedded binary image is height (h) × width (w), the deviation parameter (de) is computed as:
3. Bit error (be) is the number of bits differing between the retrieved and original binary images, represented as a percentage.
4. If c is the difference in the 0-to-1 crossing counts of the original and retrieved binary images, the crossing count error (ce) is defined as:
5. The error (e) in the retrieved image is defined as:
6. The conclusion can be drawn as follows:
If e < 0.5, Cimg is Excellent
If e > 0.5, Cimg is Good
If e > 5, Cimg is Medium
If e > 10, Cimg is Bad
If e > 15, Cimg is Poor
7. This rating is based on the MOS already defined.
Evaluation of retrieved Text against original Text:
For evaluating the goodness of the retrieved text message against the original message, two distance parameters are considered: the Hamming distance (h) and the Levenshtein distance (l).
The mean error (te) is computed as:
The conclusion can be drawn as follows:
If te < 0.5, Ctxt is Excellent
If te > 0.5, Ctxt is Good
If te > 1, Ctxt is Medium
If te > 3, Ctxt is Bad
If te > 5, Ctxt is Poor
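The adjective rating above can be sketched as a simple threshold mapper, evaluated from the largest threshold downwards (a minimal sketch; the computation of te itself is not reproduced here, and the function name is hypothetical):

```python
def ctxt(te):
    """Adjective rating Ctxt for the text retrieval mean error te."""
    if te > 5:
        return "Poor"
    if te > 3:
        return "Bad"
    if te > 1:
        return "Medium"
    if te > 0.5:
        return "Good"
    return "Excellent"

print(ctxt(0.2))  # Excellent
print(ctxt(2))    # Medium
```

Checking the largest threshold first ensures each error value falls into exactly one band.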
Aggregation of the above three methods for single point decision making: In descriptive terms, the aggregation can be performed using the concept that if there is no significant degradation in video quality and the retrieved watermarked information does not contain significant errors, then the watermarking scheme has a high measure of goodness. The aggregation methodology is summarized in Table 2 of the accompanying drawings.
The results are obtained by applying some attacks of StirMark on the watermarking scheme. Table 3 shows the values of the different image-quality-indicating parameters used to compute VALUE. In Table 4 the adjective factors are populated from VALUE. Similarly, Table 5 and Table 6 show the conclusion on the retrieved binary image against the original binary image and the conclusion on the retrieved text message, respectively. Table 7 shows the conclusion about the goodness of the watermarking method and the attack technique.
Although the invention has been described in terms of particular embodiments and applications, one of ordinary skill in the art, in light of this teaching, can generate additional embodiments and modifications without departing from the spirit of or exceeding the scope of the claimed invention. Accordingly, it is to be understood that the drawings and descriptions herein are proffered by way of example to facilitate comprehension of the invention and should not be construed to limit the scope thereof.
Dated this 5th day of February, 2007.