
Method And Apparatus For Automatic Shot Boundary Detection From Compressed Domain Features Using Fuzzy Multifactor Based Approach

Abstract: A system and a method for shot boundary detection using compressed domain information have been disclosed. The system uses the compressed domain information to extract features of an input video stream and further applies a fuzzy multifactorial approach to determine frames representing shot boundaries. As the system utilizes the compressed domain information, it saves time, minimizes complex computations and thus detects shot boundaries in real time.


Patent Information

Application #: 2625/MUM/2009
Filing Date: 13 November 2009
Publication Number: 06/2012
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2022-09-13

Applicants

TATA CONSULTANCY SERVICES LIMITED.
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI-400 021, MAHARASHTRA, INDIA

Inventors

1. CHATTOPADHYAY TANUSHYAM
TATA CONSULTANCY SERVICES, BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO.A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONICS COMPLEX, SECTOR-V, KOLKATA-700091, WEST BENGAL, INDIA.
2. AYAN CHAKI
TATA CONSULTANCY SERVICES, BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO.A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONICS COMPLEX, SECTOR-V, KOLKATA-700091, WEST BENGAL, INDIA.

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See Section 10; Rule 13)
SYSTEM FOR AUTOMATIC SHOT BOUNDARY DETECTION
FROM COMPRESSED DOMAIN FEATURES USING FUZZY
MULTIFACTOR BASED APPROACH
TATA CONSULTANCY SERVICES LTD.,
an Indian Company of Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400 021,
Maharashtra, India.
The following specification particularly describes the invention and the manner in which
it is to be performed

FIELD OF THE INVENTION
The present invention relates to the field of telecommunication and multimedia.
Particularly, the present invention relates to the field of automatic shot boundary detection.
DEFINITIONS OF TERMS USED IN THE SPECIFICATION
The expression 'P-frame' in this specification relates to a frame in a compressed video which is decompressed using data from previous frames.
The expression 'I-frame' in this specification relates to a frame in a compressed video where no reference to either a previous or a later frame is required. These frames generally mark the beginning or end of a video.
The term 'macroblock' in this specification relates to the smallest segment of a frame that is processed as a single unit during coding.
The expression 'P_Skip' in this specification relates to a skipped macroblock in which information about the macroblock is not processed, as the image for that macroblock is taken directly from the same position of a previously displayed frame.
BACKGROUND OF THE INVENTION AND PRIOR ART
Browsing a digital video library can be very tedious, especially with an ever-expanding collection of multimedia materials. Locating intended information effectively and efficiently presents a great challenge to researchers in the field of video retrieval. The very first step in video retrieval is to identify the shot boundaries. Shot boundary detection on an embedded platform is an additional challenge because of the speed and memory constraints of the hardware.
Shot boundaries can be categorized in two ways. A hard cut is a drastic change between two frames, whereas a soft cut includes fade-in, fade-out, wipe, dissolve and the like.
The major automatic shot boundary detection techniques can be classified into five categories: pixel based, statistics based, histogram based, feature based and transform based. In the pixel based approach, a pixel-wise difference is computed for two or more frames. The number of pixels (c) having a difference in intensity value greater than some threshold is computed, and a shot boundary is concluded to exist if (c) is greater than some second threshold. Other techniques include a 3x3 averaging filter to reduce noise and camera motion effects, and computing chromatic images by dividing the change in gray level of each pixel between frames by the gray level of that pixel in the second frame. These techniques can detect gradual transitions such as wipe, dissolve and fades. However, the pixel based approaches are highly sensitive to camera motion and noise. Moreover, automatic threshold selection is slow and manually adjusting the threshold is unlikely to be practical.
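As an illustration of the pixel based approach described above, the following minimal Python sketch (not part of the patent; the function name and both threshold values are hypothetical) counts the pixels whose intensity change exceeds a first threshold and declares a hard cut when that count exceeds a second threshold:

```python
import numpy as np

def pixel_diff_boundary(frame_a, frame_b, t_intensity=30, t_count_ratio=0.5):
    """Flag a hard cut when the fraction of pixels whose intensity
    changes by more than t_intensity exceeds t_count_ratio."""
    # widen dtype before subtracting so uint8 values do not wrap around
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    changed = np.count_nonzero(diff > t_intensity)
    return changed > t_count_ratio * frame_a.size

# two synthetic 8x8 grayscale frames: identical vs. completely different
same = np.full((8, 8), 100, dtype=np.uint8)
cut = np.full((8, 8), 200, dtype=np.uint8)
print(pixel_diff_boundary(same, same))  # False
print(pixel_diff_boundary(same, cut))   # True
```

The dtype widening before subtraction matters: differencing uint8 frames directly would wrap negative results around modulo 256.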
In the statistics based approach, the image is split into regions and statistical information such as the mean and standard deviation is obtained for each such region. This approach can overcome the problem of noise sensitivity, but it is slow due to the complexity of the statistical formulas and generates many false alarms.
In the histogram based approach, statistics on color and gray level histograms are used to detect the shot boundary. A feature based approach has also been proposed, where the shot boundary is detected by sudden changes in edges and gradual transitions are identified by looking at the values of the entering and exiting edge percentages.
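The gray-level histogram comparison mentioned above can likewise be sketched in a few lines; the bin count and the normalisation by frame size here are illustrative assumptions, not values from the specification:

```python
import numpy as np

def histogram_diff(frame_a, frame_b, bins=16):
    """Sum of absolute bin-wise differences of gray-level histograms,
    normalised by frame size; large values suggest a shot boundary."""
    h_a, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
    h_b, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
    return np.abs(h_a - h_b).sum() / frame_a.size

# identical frames score 0.0; fully disjoint content scores 2.0
same = np.full((8, 8), 50, dtype=np.uint8)
other = np.full((8, 8), 200, dtype=np.uint8)
print(histogram_diff(same, same))   # 0.0
print(histogram_diff(same, other))  # 2.0
```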
However, all of the above mentioned approaches are slow and thus not capable of meeting real time criteria. To achieve a fast detection technique, a DCT based approach was proposed along with the use of motion vectors of an MPEG stream. A clustering and post-filtering based approach was also proposed, which gave fairly high accuracy without producing many false positives. However, these techniques use complex algorithms such as clustering and thus reduce performance in terms of speed.
From a survey of the state of the art it is found that region based comparisons, motion vectors and running differences produce good results. The problem with these methods is the complexity of the algorithm. Some methods for detecting special effects like wipe and dissolve from an MPEG stream can be found in the following patent documents: US7050115B2, US6940910B2, US20020027616A1, EP1132812B1, and EP1132812A1. United States Patent Application US20070030584A1 claims the detection of commercials in the compressed domain. Indian Patent Applications 2124/MUM/2008 and 1825/MUM/2008 deal with detection of shot boundary in the pixel domain, whereas 1945/MUM/2008 deals with shot boundary detection using compressed domain information; however, these are heuristic based approaches.
Therefore, there is felt a need for a system for shot boundary detection which utilizes the region based comparisons approach but involving minimum complex operations. Also, there is felt a need for a system which detects the shot boundaries in real time.
OBJECT OF THE INVENTION
It is an object of the present invention to provide an efficient and fast shot boundary detection system which detects shot boundaries in real time.
It is another object of the present invention to provide a shot detection system which uses compressed domain features of H.264 video codec to detect the shot boundaries of video sequences.
It is still another object of the present invention to provide a high performance shot detection system which involves minimum complex computations.
SUMMARY OF THE INVENTION
The present invention envisages a system for automatic shot boundary detection using a compressed input video stream, wherein the compressed input video stream is divided into sets of I frames and P frames, each one of the frames being divided into MacroBlocks (MBs), the I frames being adapted to contain intra-coded (I) MBs and the P-frames being adapted to contain at least one of macroblocks selected from the group consisting of intra-coded (I) MBs, inter-coded MBs and skipped (P_Skip) MBs, said system being characterized by:
• an information retrieval unit adapted to retrieve the compressed domain information from an encoded input video stream;
• a feature extraction unit adapted to receive the compressed domain information and extract at least one feature from at least one frame from the set of frames;
• assignment means adapted to assign a fuzzy confidence score to the extracted features; and
• a decision unit adapted to receive the confidence scores and further adapted to apply the fuzzy confidence scores in a fuzzy multifactorial technique, the decision unit having:

■ retrieving means adapted to determine the number of frames, the number of features extracted per frame and the confidence scores assigned to each of the extracted features;
■ matrix construction means adapted to receive the number of frames, the number of extracted features per frame and the confidence score for the extracted features of each frame, and arrange the received information into an evaluation matrix with confidence scores arranged as the columns of the matrix;
■ mapping means adapted to create an Additive Standard Multifactorial function based on the evaluation matrix and further adapted to apply the Additive Standard Multifactorial function to create a row matrix, wherein each element of the row matrix represents a membership score;
■ thresholding means adapted to compare each of said membership scores with a predetermined threshold and further adapted to determine frames representing shot boundaries.
Typically, the compressed input video stream is compressed in the H.264 video format.
Preferably, the information retrieval unit is adapted to retrieve compressed domain information selected from the group of information consisting of frame layout information and DC value of transformed luma component.
In accordance with the present invention, the feature extraction unit is adapted to extract said plurality of features comprising number of I macroblocks in a P frame, number of P_Skip macroblocks in a P frame and the DC component of integer transformed luma co-efficient.
Further, the feature extraction unit includes a computational means adapted to compute the number of said features.
In accordance with the present invention, there is provided a method for shot boundary detection using the compressed domain features of an input video, the method comprises the following steps:
a) segmenting the input video into a set of I frames and P frames, wherein each of the frames is divided into MacroBlocks (MBs), I frames contain intra-coded (I) MBs and P-frames contain intra-coded (I), inter-coded or skipped (P_Skip) MBs;
b) retrieving at least one compressed domain feature from the set of frames;
c) calculating first order difference for the extracted compressed domain feature of DC coefficients of integer transformed residual part of luma components in the P frames/field;
d) calculating the number of I MBs in a P frame/field;
e) calculating the number of P_Skip MBs in a P frame/field;
f) assigning a fuzzy confidence score for all the frames based on the above calculated features in steps (c), (d) and (e);
g) constructing an evaluation matrix based on the confidence scores assigned in step (f);
h) constructing an Additive Standard Multifactorial (ASM) function based on the evaluation matrix defined in step (g); and
i) deciding whether the frame under inspection is a shot boundary or not based on the temporal continuity and the ASM defined in step (h).

Typically, the step of calculating the first order difference for the DC coefficients of the integer transformed residual part of the luma component includes the step of calculating the first order difference of DC coefficients of the luma value using an integer transformation on MBs of size 4x4.
Preferably, the step of calculating the first order difference for the extracted compressed domain feature of DC coefficients of the integer transformed residual part of luma components further includes the steps of computing the first order difference for the DC luma value with neighboring sub blocks, counting the number of sub blocks greater than a predetermined threshold and applying a median filtering technique to smoothen the data.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The present invention will now be described with reference to the accompanying drawings, in which
FIGURE 1 illustrates a schematic of the system for shot boundary detection in accordance with the present invention; and
FIGURE 2 shows an application of the shot boundary detection system in accordance with the present invention.
DETAILED DESCRIPTION
The drawings and the description thereto are merely illustrative of a Shot Boundary detection system and only exemplify the system of the invention and in no way limit the scope thereof.
The prior art techniques for shot boundary detection involved complex techniques whose computations reduced the speed of the system and required a specialized computing setup. Therefore, to provide an efficient system which detects shot boundaries in real time with minimum complexity, the present invention envisages a shot boundary detection system which readily uses compressed domain features computed during encoding of a video, thus saving time and minimizing the complexity of the system.
In accordance with one aspect of the present invention, the proposed system obtains features from the compressed domain, thus avoiding the complex operations needed to derive those features. Further, the system performs block level comparisons using at least two consecutive frames, uses the DC coefficient of the transformed luma component for extracting the major information of a block, detects high motion of an object in a frame by calculating the number of I macroblocks, and checks for frame similarities by calculating the number of Skip macroblocks. So, the DC value of the transformed luma coefficient and the number of I and Skip MBs in a P frame give the region based information. Since these features can be obtained during the process of decoding, no additional time is required to obtain them.
The proposed system uses the H.264 compressed video stream / UYVY video stream for performing shot boundary detection along with the compressed domain features obtained during encoding of the video stream, and finally a fuzzy multifactorial based approach is used to identify the shot boundaries.
Referring to the accompanying drawings, FIGURE 1 shows a schematic of the system 100 for shot boundary detection. The system 100 includes the following components:
• an information retrieval unit 102 to retrieve the compressed domain information from an encoded input video stream;
• a feature extraction unit 104 adapted to receive the compressed domain information and extract a plurality of features from at least one frame from the set of frames;
• assignment means 106 adapted to assign a confidence score to the extracted features; and

• a decision unit 108 adapted to receive the confidence scores and further adapted to apply a fuzzy multifactorial technique and determine if the frame under inspection is a shot boundary or not.
The information retrieval unit 102 is adapted to receive the compressed video stream and retrieve the compressed domain information, including the compressed video frame layout, that is, the frame arrangement showing the sequence of the I and P frames in the encoded video, and the DC value of the transformed luma component.
The compressed input video stream is divided into sets of I frames and P frames, each one of the frames being divided into MacroBlocks (MBs), the I frames being adapted to contain intra-coded (I) MBs and the P-frames being adapted to contain at least one of macroblocks selected from the group consisting of intra-coded (I) MBs, inter-coded MBs and skipped (P_Skip) MBs.
The retrieved compressed domain information is given to the feature extraction unit 104 which uses this information to extract at least three features from the set of frames including the number of I macroblocks (MBs) in a P frame, the number of macroblocks coded as P_Skip in P frame; and DC component of integer transformed luma coefficients.
These extracted features are then assigned a fuzzy confidence score by the assignment means 106 and given to the decision unit 108 to determine whether the frame is a shot boundary or not.

In accordance with the present invention, the detailed working of the feature extraction unit 104 and the decision unit 108 is explained hereinafter.
For simplicity of implementation, a single slice shall be considered per frame, and thus slice and frame are identical. The baseline profile of H.264 video supports two types of frames for coding, namely I-frames and P-frames. I-frames contain intra-coded macroblocks (MBs). In each MB, every 16x16 or 4x4 luma region and every 8x8 chroma region is predicted from previously-coded samples in the same slice. P-frames may contain intra-coded, inter-coded or skipped MBs. Inter-coded MBs in a P slice are predicted using a number of previously coded pictures, using motion compensation with quarter-sample (luma) motion vector accuracy.
The prediction mode based feature extraction by the feature extraction unit 104 involves computation of the number of I MBs (intra-coded MBs) in a P frame/field. It is observed that in P frames, P mode or skipped mode is selected if there is temporal redundancy between two frames, and I mode is selected when there is no correlation between two consecutive frames. So, the number of MBs coded in 'I' prediction mode gives an indication of a frame's correlation with the previous frame.
The feature extraction unit 104 includes computational means (not shown in the figures) for performing prediction mode based feature extraction. The computational means performs the following steps:

• counting the 'I' predicted MBs in a P frame and storing the count for the i-th frame in an array a_I;
• finding the factor value α1(i) for the i-th frame, indicating the confidence score of that frame to be a shot boundary, as:

α1(i) = a_I(i) / (WD_MB x HT_MB)

where WD_MB and HT_MB indicate the width and height of the frame in macroblock units.

Now, the value of α1(i) lies between 0 and 1 because 0 ≤ a_I(i) ≤ WD_MB x HT_MB. A value of α1(i) = 1 indicates that there is no similarity between two consecutive frames and thus it is a shot boundary; conversely, α1(i) = 0 indicates that there is no significant difference between two consecutive frames and thus it is not a candidate frame for a shot boundary.
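The intra-mode factor described above reduces to a single normalising division; the following sketch (the function name and frame dimensions are hypothetical) illustrates it:

```python
def alpha_intra(i_mb_count, wd_mb, ht_mb):
    """alpha_1: fraction of macroblocks in a P frame coded in intra (I)
    mode. 1.0 suggests no similarity with the previous frame (shot
    boundary candidate); 0.0 suggests strong temporal similarity."""
    return i_mb_count / (wd_mb * ht_mb)

# hypothetical frame of 11 x 9 macroblocks (99 MBs in total)
print(alpha_intra(99, 11, 9))  # 1.0: every MB intra-coded
print(alpha_intra(0, 11, 9))   # 0.0: fully predicted frame
```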
The computational means of the feature extraction unit 104 further computes the value of the DC component of transformed luma coefficient based feature in accordance with the present invention.
Typically, in H.264, 4x4 integer transformation is used which is different from the 8x8 DCT (Discrete Cosine Transform) transformation of MPEG (Moving Picture Experts Group) series video codec.

Integer transformation reduces the problems of round-off and floating point realization on a fixed point DSP (Digital Signal Processor). This feature is used based on the observation that the number of high energy components in the DC coefficients varies drastically across a shot boundary. Even in the case of a gradual transition, this feature works very well.
The method of extracting the DC component of transformed luma coefficient based features by the computational means of the feature extraction unit comprises the following steps:
a) getting the luma DC value (dcl) for each 4x4 sub block from the information retrieval unit 102;
b) computing the first order difference (d(dcl)) of dcl with the neighboring sub blocks in the x and y directions;
c) counting the number of sub blocks for which d(dcl) is greater than an experimentally obtained threshold value; let the number of such sub blocks for the i-th frame be denoted as a_dc(i);
d) applying median filtering on a_dc(i);
e) finding the DC luma based factor value α2(i) for the i-th frame, indicating the confidence score of that frame to be a shot boundary, as:

α2(i) = a_dc(i) / (T_dc x WD_MB x HT_MB)

where WD_MB and HT_MB indicate the width and height of the frame in macroblock units and T_dc is a threshold factor. The threshold factor in the denominator is computed using statistical analysis of the behavior of this feature across some shot boundaries.
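Steps (b) to (d) above can be sketched as follows; the grid layout of per-sub-block DC values and the filter window size are assumptions for illustration, not details fixed by the specification:

```python
import numpy as np

def count_dc_jumps(dc, threshold):
    """Count neighbouring 4x4 sub-block pairs whose luma DC values
    differ by more than the threshold (first order difference taken
    in the x and y directions over a 2-D grid of DC values)."""
    dx = np.abs(np.diff(dc.astype(np.int32), axis=1))
    dy = np.abs(np.diff(dc.astype(np.int32), axis=0))
    return int(np.count_nonzero(dx > threshold) + np.count_nonzero(dy > threshold))

def median_smooth(counts, k=3):
    """Median-filter the per-frame counts to suppress spurious spikes,
    padding the ends by edge replication."""
    pad = k // 2
    padded = np.pad(np.asarray(counts, dtype=float), pad, mode="edge")
    return [float(np.median(padded[i:i + k])) for i in range(len(counts))]

# a 4x4 grid of DC values with a vertical edge between columns 1 and 2
dc = np.zeros((4, 4))
dc[:, 2:] = 100
print(count_dc_jumps(dc, 50))        # 4: one jump per row
print(median_smooth([0, 100, 0]))    # the isolated spike is removed
```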

The computational means then computes the number of P_Skip MBs in a P frame/field. A P frame can contain inter-coded MBs, intra-coded MBs or skipped MBs. When a MB is predicted as a skipped MB, no further data is sent for that macroblock. The decoder calculates a vector for the skipped macroblock and reconstructs the macroblock using motion-compensated prediction from the first reference picture in list 0. Thus, for better coding efficiency, whenever temporally adjacent MBs are similar they are predicted as SKIP. Specifically, a larger number of MBs predicted as P_SKIP indicates greater similarity with temporally adjacent frames, while at shot boundaries the number of MBs predicted as P_SKIP drops drastically.
The method for extracting P_SKIP count based features by the computational means comprises the following steps:
a) counting the number of MBs predicted as P_SKIP in a P frame and storing the count for the i-th frame in an array a_S;
b) finding the factor value α3(i) for the i-th frame, indicating the confidence score of that frame to be a shot boundary, as:

α3(i) = 1 - a_S(i) / (WD_MB x HT_MB)

where WD_MB and HT_MB indicate the width and height of the frame in macroblock units. Now, the value of α3(i) lies between 0 and 1 because 0 ≤ a_S(i) ≤ WD_MB x HT_MB. A value of α3(i) = 0 indicates that all MBs in that frame have no significant difference between two consecutive frames and thus it is not a candidate frame for a shot boundary; conversely, α3(i) = 1 indicates that there is no similarity between two consecutive frames and thus it is a shot boundary.

These computed features along with their confidence scores are given to the decision unit 108. The decision unit 108 includes retrieving means 110 adapted to determine the number of frames, the number of features extracted per frame and the confidence scores assigned to each of the extracted features, and to pass this information to a matrix construction means 112 which constructs an m x n evaluation matrix based on the m factors and n frames.
Let the evaluation matrix (V) created by the matrix construction means 112 be defined as

V = [v_ij], i = 1, ..., m; j = 1, ..., n

where v_ij represents the confidence score for the i-th factor with the j-th frame. Here three factors are used, so m is 3 and n is the number of frames in the video. The evaluation matrix is thus formed from the factor values, with the i-th row holding the scores α_i(j) of one factor across all frames.
Now, from the evaluation matrix V it is difficult to obtain any solution of the decision-making problem. So a mapping means 114 creates an Additive Standard Multifactorial (ASM) mapping function to map each m-dimensional column vector into a one-dimensional scalar, i.e.

ASM: [0, 1]^m -> [0, 1]

The ASM is applied on V to obtain the multifactorial evaluation as

v_j = ASM(v_1j, v_2j, ..., v_mj), for j = 1, ..., n

In accordance with this invention, the ASM mapping function is a simple arithmetic average over the m factors and thus it is defined as

ASM(v_1j, ..., v_mj) = (1/m) * (v_1j + v_2j + ... + v_mj)
Using the aforementioned technique, a decision-making row matrix is obtained in which each element represents the confidence score of membership of the candidate frame being a shot boundary. The decision unit 108 further employs thresholding means 116 to compare each membership confidence score with a predetermined threshold value to shortlist the frames which represent shot boundaries.
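A minimal sketch of the evaluation matrix and the arithmetic-average ASM described above (the factor values below are made up for illustration; the variable names are not from the specification):

```python
import numpy as np

# alpha_1, alpha_2, alpha_3 scores for five frames (hypothetical values);
# frame index 1 and 4 are intended as shot-boundary candidates
a1 = np.array([0.1, 0.9, 0.2, 0.1, 0.8])
a2 = np.array([0.2, 0.8, 0.1, 0.2, 0.7])
a3 = np.array([0.0, 1.0, 0.1, 0.0, 0.9])

V = np.vstack([a1, a2, a3])  # m x n evaluation matrix, m = 3 factors
scores = V.mean(axis=0)      # ASM as arithmetic average -> row matrix

print(scores)  # membership score of each frame being a shot boundary
```

Averaging each column of V collapses the three per-frame factors into the single membership score that the thresholding means then compares against the threshold.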
In this process two threshold values, T_l and T_h, are used depending upon the user requirement. Experiments suggest that if T_l is used as the threshold value there are no misses and thus a recall rate of 1 is achieved, but the precision parameter becomes 0.89; on the other hand, if T_h is selected as the threshold value, the recall rate comes down to 0.92 but the precision becomes 1. However, when the default threshold value of (T_l + T_h)/2 is used, a recall rate of 1.0 and a precision rate of 0.96 is achieved. The values used for T_l and T_h are 0.4 and 0.6 respectively; this multi-threshold approach helps to achieve better accuracy in some cases.
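The dual-threshold decision can be sketched as below; resolving scores that fall between the two thresholds against the mid value is an assumption for illustration, since the specification does not spell out an in-between rule:

```python
def classify(score, t_low=0.4, t_high=0.6):
    """Decide shot boundary membership using the two thresholds.
    Scores at or above t_high are boundaries and scores below t_low
    are not; the in-between band is resolved against the default mid
    threshold (t_low + t_high) / 2 (an illustrative assumption)."""
    if score >= t_high:
        return "boundary"
    if score < t_low:
        return "not boundary"
    mid = (t_low + t_high) / 2.0
    return "boundary" if score >= mid else "not boundary"

print(classify(0.7))   # boundary
print(classify(0.45))  # not boundary (below the 0.5 mid threshold)
```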
In accordance with the present invention, there is provided a method for shot boundary detection using the compressed domain features of an input video, the method comprises the following steps as seen in FIGURE 3:
a) segmenting the input video into a set of I frames and P frames, wherein each of the frames are divided into MacroBlocks (MBs), I frames contain intra-coded (I) MBs and P-frames contain intra-coded (I), inter-coded or skipped (P_Skip) MBs, 1000;
b) extracting at least one compressed domain feature from the set of frames, 1002;
c) calculating first order difference for the extracted compressed domain feature of DC coefficients of integer transformed residual part of luma components in the P frames/fields, 1004;
d) calculating the number of I MBs in a P frame/field, 1006;
e) calculating the number of P_Skip MBs in a P frame/field, 1008;
f) assigning a confidence score for all the frames based on the above mentioned features in (c), (d) and (e), 1010;
g) constructing an evaluation matrix based on the confidence scores described in (f), 1012;
h) constructing an Additive Standard Multifactorial (ASM) function based on the evaluation matrix defined in (g) , 1014; and
i) deciding whether the frame under inspection is shot boundary or not based on the temporal continuity and the ASM defined in (h) , 1016.

TECHNICAL ADVANTAGES
The technical advantages of the present invention include providing a system for shot boundary detection which efficiently and quickly detects shot boundaries in real time.
The present invention makes use of the already encoded compressed domain properties of the input videos to determine the shot boundaries, thus minimizing the time, effort and computational resources involved in shot boundary detection.
Moreover, the present invention uses a multi-factorial approach for shot boundary detection, extracting the encoded features, performing block level region based comparisons and then applying a fuzzy multi-factorial based approach to identify the shot boundaries.
While considerable emphasis has been placed herein on the components and component parts of the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other changes in the preferred embodiment as well as other embodiments of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.

WE CLAIM:
1. A system for automatic shot boundary detection using a compressed input video stream, wherein the compressed input video stream is divided into sets of I frames and P frames, each one of the frames being divided into MacroBlocks (MBs), the I frames being adapted to contain intra-coded (I) MBs and the P-frames being adapted to contain at least one of macroblocks selected from the group consisting of intra-coded (I) MBs, inter-coded MBs and skipped (P_Skip) MBs, said system being characterized by:
• an information retrieval unit adapted to retrieve the compressed domain information from an encoded input video stream;
• a feature extraction unit adapted to receive said compressed domain information and extract a plurality of features from at least one frame from the set of frames;
• assignment means adapted to assign a confidence score to the extracted features; and
• a decision unit adapted to receive the confidence scores and further adapted to apply said fuzzy confidence scores in a fuzzy multifactorial technique, said decision unit having:

■ retrieving means adapted to determine the number of frames, the number of features extracted per frame and the confidence scores assigned to each of the extracted features;
■ matrix construction means adapted to receive the number of frames, the number of extracted features per frame and the confidence score for the extracted features of each frame, and arrange the received information into an evaluation matrix with confidence scores arranged as the columns of the matrix;
■ mapping means adapted to create an Additive Standard Multifactorial function based on said evaluation matrix and further adapted to apply the Additive Standard Multifactorial function to create a row matrix, wherein each of the elements of said row matrix represents a membership score;
■ thresholding means adapted to compare said membership score with a predetermined threshold and further adapted to determine frames representing shot boundaries.

2. The system as claimed in claim 1, wherein said compressed input video stream is compressed in the H.264 video format.
3. The system as claimed in claim 1, wherein said information retrieval unit is adapted to retrieve compressed domain information selected from the group of information consisting of frame layout information and DC value of transformed luma component.
4. The system as claimed in claim 1, wherein said feature extraction unit is adapted to extract said plurality of features comprising number of I macroblocks in a P frame, number of P_Skip macroblocks in a P frame and the DC component of integer transformed luma co-efficient.

5. The system as claimed in claim 4, wherein said feature extraction unit includes a computational means adapted to compute the number of each of said features.
6. A method for shot boundary detection using the compressed domain features of an input video, the method comprises the following steps:
a) segmenting the input video into a set of I frames and P frames, wherein each of the frames are divided into MacroBlocks (MBs), I frames contain intra-coded (I) MBs and P-frames contain intra-coded (I), inter-coded or skipped (P_Skip) MBs;
b) retrieving at least one compressed domain feature from said set of frames;
c) calculating first order difference for the extracted compressed domain feature of DC coefficients of integer transformed residual part of luma components in the P frames/field;
d) calculating the number of I MBs in a P frame/field;
e) calculating the number of P_Skip MBs in a P frame/field;
f) assigning a fuzzy confidence score for all the frames based on the above calculated features in steps (c), (d) and (e);
g) constructing an evaluation matrix based on the confidence scores assigned in step (f);
h) constructing an Additive Standard Multifactorial (ASM) function based on the evaluation matrix defined in step (g); and
i) deciding whether the frame under inspection is shot boundary or not based on the temporal continuity and the ASM defined in step (h).

7. The method as claimed in claim 6, wherein the step of calculating first order difference for the DC coefficients of integer transformed residual part of luma component includes the step of calculating the first order difference of DC coefficients of Luma value using an integer transformation on MBs of size 4x4.
8. The method as claimed in claim 7, wherein the step of calculating first order difference for the extracted compressed domain feature of DC coefficients of integer transformed residual part of luma components further includes the steps of computing the first order difference for DC luma value with neighboring sub blocks, counting the number of sub blocks greater than a predetermined threshold and applying a median filtering technique to smoothen the data.

Documents

Application Documents

# Name Date
1 2625-MUM-2009-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30
2 2625-MUM-2009-PatentCertificate13-09-2022.pdf 2022-09-13
3 2625-MUM-2009-IntimationOfGrant13-09-2022.pdf 2022-09-13
4 2625-MUM-2009-Written submissions and relevant documents [19-08-2022(online)].pdf 2022-08-19
5 2625-MUM-2009-Response to office action [18-08-2022(online)].pdf 2022-08-18
6 2625-MUM-2009-Correspondence to notify the Controller [02-08-2022(online)].pdf 2022-08-02
7 2625-MUM-2009-US(14)-HearingNotice-(HearingDate-04-08-2022).pdf 2022-07-19
8 2625-MUM-2009-US(14)-ExtendedHearingNotice-(HearingDate-09-09-2021).pdf 2021-10-03
9 2625-MUM-2009-US(14)-HearingNotice-(HearingDate-02-09-2021).pdf 2021-10-03
10 2625-MUM-2009-Written submissions and relevant documents [24-09-2021(online)].pdf 2021-09-24
11 2625-MUM-2009-FORM-26 [01-09-2021(online)].pdf 2021-09-01
12 2625-MUM-2009-Correspondence to notify the Controller [01-09-2021(online)].pdf 2021-09-01
13 2625-MUM-2009-Response to office action [31-08-2020(online)].pdf 2020-08-31
14 2625-MUM-2009-ABSTRACT [22-05-2019(online)].pdf 2019-05-22
15 2625-MUM-2009-CLAIMS [22-05-2019(online)].pdf 2019-05-22
16 2625-MUM-2009-DRAWING [22-05-2019(online)].pdf 2019-05-22
17 2625-MUM-2009-FER_SER_REPLY [22-05-2019(online)].pdf 2019-05-22
18 2625-MUM-2009-FORM-26 [22-02-2019(online)].pdf 2019-02-22
19 2625-MUM-2009-FER.pdf 2019-01-25
20 2625-MUM-2009-FORM 18(10-4-2013).pdf 2018-08-10
21 2625-MUM-2009-CORRESPONDENCE(10-4-2013).pdf 2018-08-10
22 2625-MUM-2009-FORM 1(5-2-2010).pdf 2018-08-10
23 2625-MUM-2009-CORRESPONDENCE(5-2-2010).pdf 2018-08-10
24 2625-mum-2009-form 1.pdf 2018-08-10
25 2625-mum-2009-form 2.pdf 2018-08-10
26 2625-mum-2009-form 2(title page).pdf 2018-08-10
27 2625-mum-2009-form 3.pdf 2018-08-10
28 2625-mum-2009-form 26.pdf 2018-08-10
29 2625-mum-2009-correspondence.pdf 2018-08-10
30 2625-mum-2009-description(provisional).pdf 2018-08-10
31 2625-mum-2009-drawing.pdf 2018-08-10
32 abstract1.jpg 2018-08-10
33 Other Patent Document [07-10-2016(online)].pdf 2016-10-07
34 2625-MUM-2009-FORM 5(11-11-2010).pdf 2010-11-11
35 2625-MUM-2009-FORM 2(TITLE PAGE)-(11-11-2010).pdf 2010-11-11
36 2625-mum-2009-form 2(11-11-2010).pdf 2010-11-11
37 2625-mum-2009-form 2 (11-11-2010).doc 2010-11-11
38 2625-MUM-2009-DRAWING(11-11-2010).pdf 2010-11-11
39 2625-MUM-2009-DESCRIPTION(COMPLETE)-(11-11-2010).pdf 2010-11-11
40 2625-MUM-2009-CORRESPONDENCE(11-11-2010).pdf 2010-11-11
41 2625-MUM-2009-CLAIMS(11-11-2010).pdf 2010-11-11
42 2625-MUM-2009-ABSTRACT(11-11-2010).pdf 2010-11-11

Search Strategy

1 2625MUM2009_24-01-2019.pdf

ERegister / Renewals

3rd: 29 Nov 2022 (from 13/11/2011 to 13/11/2012)
4th: 29 Nov 2022 (from 13/11/2012 to 13/11/2013)
5th: 29 Nov 2022 (from 13/11/2013 to 13/11/2014)
6th: 29 Nov 2022 (from 13/11/2014 to 13/11/2015)
7th: 29 Nov 2022 (from 13/11/2015 to 13/11/2016)
8th: 29 Nov 2022 (from 13/11/2016 to 13/11/2017)
9th: 29 Nov 2022 (from 13/11/2017 to 13/11/2018)
10th: 29 Nov 2022 (from 13/11/2018 to 13/11/2019)
11th: 29 Nov 2022 (from 13/11/2019 to 13/11/2020)
12th: 29 Nov 2022 (from 13/11/2020 to 13/11/2021)
13th: 29 Nov 2022 (from 13/11/2021 to 13/11/2022)
14th: 29 Nov 2022 (from 13/11/2022 to 13/11/2023)
15th: 03 Nov 2023 (from 13/11/2023 to 13/11/2024)
16th: 04 Nov 2024 (from 13/11/2024 to 13/11/2025)
17th: 30 Oct 2025 (from 13/11/2025 to 13/11/2026)