Abstract: The present invention envisages a system and a method for shot boundary detection, particularly for hard cut detection. This invention proposes a three-tier approach for hard cuts. In the first tier, a sliding window consisting of three consecutive frames is formed, where the third frame is the pivotal frame. In the second tier, their intermediate differences are computed and the RGB co-occurrence matrices of these frame differences are calculated at different pixel distances. Finally, in the third tier, a set of texture features is extracted from these RGB co-occurrence matrices corresponding to the frame differences in order to identify the shot-frames and non-shot frames, and K-means clustering is used to classify them.
FORM -2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
PROVISIONAL
Specification
(See Section 10 and rule 13)
SHOT BOUNDARY DETECTION BASED ON CO-OCCURRENCE MATRICES
TATA CONSULTANCY SERVICES LTD.,
an Indian Company of Nirmal Building, 9th floor, Nariman Point, Mumbai 400 021, Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION
Field of the Invention:
This invention relates to the field of analyzing video content.
Background of the Invention:
The recent internet boom and the ever-expanding assemblage of multimedia material threaten to inundate us with a vast accumulation of unorganized digital video data. As digital video libraries grow, retrieving a particular video becomes an increasingly tedious task. To ease this mammoth task, significant research effort is being devoted to the effective retrieval and management of visual information. Each video sequence can be considered a set of still images, and the number of images in such a set runs to hundreds or more. It is well understood that the better the representation scheme of the video content, the faster and more accurate the retrieval of data will be. One good representation scheme is to index the video data, i.e. to look for information in the indexed data. Indexing videos manually is a time-consuming venture. By utilizing various image processing techniques, computer scientists therefore came up with the idea of shot detection, the practice of grouping frames into shots. A shot designates a contiguous sequence of temporal video frames recorded by an uninterrupted camera operation. Video shot boundary detection algorithms need to perform under the constraints of camera motion, object motion and varying lighting conditions. Video shot boundaries also vary in appearance; they can be an abrupt temporal change (hard cut), a smooth temporal change (fade or dissolve) or a wipe.
Presently, a great deal of ongoing research is focused on automatic content indexing of videos. This includes various techniques for shot detection in the presence of hard cuts, gradual transitions and wipes. The features that have been used to determine the region of interest (ROI) for shot detection are mainly grayscale values, color spaces, color histograms, color anglograms, or transform coefficients such as DFT, DCT and wavelet coefficients. Existing spatial feature domains include a single pixel per feature, a rectangular block, an arbitrarily shaped region or the whole frame. For the detection of shots, the established methods implement fixed threshold parameters, adaptive thresholding, probabilistic detection, trained classifiers or heuristics. The major methods that have been used for shot boundary detection are pixel differences, statistical differences, histogram comparisons, edge differences, phase correlation based differences, compression differences and motion vectors.
The objective of this invention is the detection of hard cuts (shot boundaries) in video sequences.
In particular, this invention envisages a novel way of estimating the difference between frames of an input video in each colour plane.
In particular, this invention envisages a novel way of detecting shot boundaries (hard cuts) in a video using its texture features.
In particular, this invention envisages a way of detecting shot boundaries (cuts) in a video using K-means clustering.
Further, this invention provides means for automatic detection of cuts from an input video sequence.
Summary of the Invention:
Content-based video indexing and retrieval traces back to the elementary video structures, such as a shot or a scene. In the field of automatic shot boundary detection, a number of techniques and methods exist. However, most of the proposed methods share the necessity of a threshold value, which is used as a reference for detecting scene changes. The dynamic re-estimation of this threshold parameter remains the most challenging issue for the existing shot boundary detection algorithms. We have proposed a threshold-independent methodology for hard cut detection.
In this invention, a three-tier method is proposed for hard cut detection. In the first tier, a sliding window consisting of three consecutive frames is formed, where the third one is the pivotal frame. In the second tier, their intermediate differences are computed and the RGB co-occurrence matrices at different pixel distances are calculated for the frame differences. Finally, in the third tier, a set of texture features is extracted from these RGB co-occurrence matrices corresponding to the frame differences in order to identify the shot-frames and non-shot frames, and K-means clustering is used to classify them.
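The K-means classification used in the third tier can be sketched as a plain two-cluster routine over the per-frame texture-feature vectors. The NumPy-only implementation and the deterministic initialisation below are illustrative assumptions, not the reference method of this specification.

```python
import numpy as np

def kmeans_two_clusters(features, iters=20):
    """Two-centroid K-means over per-frame texture-feature vectors.

    `features` is an (N, d) array, one row per pivotal frame. Returns a
    0/1 label per frame; the cluster holding the large frame-difference
    feature vectors is the natural candidate for the shot-frame class.
    """
    X = np.asarray(features, dtype=float)
    norms = np.linalg.norm(X, axis=1)
    # Deterministic initialisation (an assumption): the smallest- and
    # largest-norm samples serve as the two starting centroids.
    centroids = np.stack([X[norms.argmin()], X[norms.argmax()]])
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each vector to its nearest centroid ...
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # ... and move each centroid to the mean of its members.
        for k in range(2):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    return labels
```

Because the two clusters are recomputed from the data itself, no fixed decision threshold is needed, which is the point of the threshold-independent methodology described above.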
Brief Description of the Accompanying Drawings:
The invention will now be described with reference to the accompanying drawings, in which:
Figure 1 shows the sliding window for use in the method in accordance with this invention; and
Figure 2 shows the frame difference in accordance with this invention.
Detailed Description of Invention:
The invention uses a set of feature computations from the RGB co-occurrence matrices' statistics, defined at various pixel displacement distances. It integrates the statistical findings in a training set and implements a trained classifier to identify shot-frames and non-shot frames. The study led to a fast and robust algorithm for hard cut detection.
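The co-occurrence statistics mentioned above can be sketched as follows: a co-occurrence matrix of one colour plane at a given pixel distance, followed by a handful of Haralick-style texture statistics. The quantisation to 8 levels, the horizontal-only offset, and the particular feature set (energy, contrast, entropy, homogeneity) are illustrative assumptions; the specification does not fix them here.

```python
import numpy as np

def cooccurrence(plane, distance=1, levels=8):
    """Co-occurrence matrix of one colour plane at a horizontal pixel
    distance `distance`, after quantising 0-255 values into `levels` bins.
    Returns a normalised (levels x levels) joint-probability table."""
    q = (np.asarray(plane, dtype=float) * levels / 256.0).astype(int)
    q = np.clip(q, 0, levels - 1)
    C = np.zeros((levels, levels), dtype=float)
    # Count ordered pairs of values separated by `distance` pixels.
    left, right = q[:, :-distance], q[:, distance:]
    for a, b in zip(left.ravel(), right.ravel()):
        C[a, b] += 1
    return C / C.sum()

def texture_features(C):
    """Haralick-style statistics of a normalised co-occurrence matrix:
    energy, contrast, entropy and homogeneity (an assumed feature set)."""
    i, j = np.indices(C.shape)
    energy = float((C ** 2).sum())
    contrast = float(((i - j) ** 2 * C).sum())
    entropy = float(-(C[C > 0] * np.log2(C[C > 0])).sum())
    homogeneity = float((C / (1.0 + np.abs(i - j))).sum())
    return np.array([energy, contrast, entropy, homogeneity])
```

On a frame-difference plane, a near-uniform (low-difference) input concentrates the matrix mass near the diagonal, while a hard cut spreads it out, which is what the texture statistics capture.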
In Figure 1, a sliding window is defined consisting of three frames: n(i-2), n(i-1) and n(i), where n(i) is the pivotal frame. So, the objective is to determine whether n(i) is a shot-frame or a non-shot frame.
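The sliding window of Figure 1 can be sketched as iteration over consecutive frame triples; the frame sequence is assumed to be an indexable list, which is an implementation detail not fixed by the specification.

```python
def sliding_windows(frames):
    """Yield (n(i-2), n(i-1), n(i)) triples over a frame sequence; the
    last element of each triple is the pivotal frame under test."""
    for i in range(2, len(frames)):
        yield frames[i - 2], frames[i - 1], frames[i]
```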
For this, the sum of absolute differences (SAD) is computed over a block of size m x n between the frames in the R, G and B planes independently. These differences are named frame-differences (fd), i.e. the frame-difference matrix of the (i-1)st and ith frames is defined as:

fd(i-1, i) = SUM over k = 1..m, l = 1..n of | p(i-1)(k, l) - p(i)(k, l) |

where p(i)(k, l) represents the pixel value at index position (k, l) of the ith frame.
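The per-plane absolute difference and the block-wise SAD described above can be sketched as below; the function names and the block-tiling scheme (non-overlapping m x n blocks, with any ragged border cropped) are illustrative assumptions.

```python
import numpy as np

def frame_difference(prev_frame, frame):
    """Per-plane absolute difference |p(i-1)(k,l) - p(i)(k,l)| of two
    RGB frames, forming one difference image per colour plane."""
    a = np.asarray(prev_frame, dtype=float)
    b = np.asarray(frame, dtype=float)
    return np.abs(a - b)  # shape (H, W, 3)

def block_sad(fd_plane, m, n):
    """Sum of absolute differences over each non-overlapping m x n block
    of one plane, computed independently for the R, G and B planes."""
    H, W = fd_plane.shape
    tiles = fd_plane[:H - H % m, :W - W % n].reshape(H // m, m, W // n, n)
    return tiles.sum(axis=(1, 3))
```

Usage: `block_sad(frame_difference(n_prev, n_cur)[:, :, 0], m, n)` would give the SAD map of the R plane for one window position.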
| # | Name | Date |
|---|---|---|
| 1 | 2124-MUM-2008-FORM 1(23-12-2008).pdf | 2008-12-23 |
| 2 | 2124-MUM-2008-CORRESPONDENCE(23-12-2008).pdf | 2008-12-23 |
| 3 | 2124-MUM-2008-FORM 18(18-11-2010).pdf | 2010-11-18 |
| 4 | 2124-MUM-2008-CORRESPONDENCE(18-11-2010).pdf | 2010-11-18 |
| 5 | Other Document [22-12-2016(online)].pdf | 2016-12-22 |
| 6 | Examination Report Reply Recieved [22-12-2016(online)].pdf | 2016-12-22 |
| 7 | Description(Complete) [22-12-2016(online)].pdf_324.pdf | 2016-12-22 |
| 8 | Description(Complete) [22-12-2016(online)].pdf | 2016-12-22 |
| 9 | Correspondence [22-12-2016(online)].pdf | 2016-12-22 |
| 10 | Claims [22-12-2016(online)].pdf | 2016-12-22 |
| 11 | Abstract [22-12-2016(online)].pdf | 2016-12-22 |
| 12 | 2124-MUM-2008-Written submissions and relevant documents (MANDATORY) [14-03-2018(online)].pdf | 2018-03-14 |
| 13 | 2124-MUM-2008-Amendment Of Application Before Grant - Form 13 [14-03-2018(online)].pdf | 2018-03-14 |
| 14 | 2124-MUM-2008-ABSTRACT (1-10-2009).pdf | 2018-08-09 |
| 15 | 2124-MUM-2008-CLAIMS (1-10-2009).pdf | 2018-08-09 |
| 16 | 2124-MUM-2008-CORRESPONDENCE(1-10-2009).pdf | 2018-08-09 |
| 17 | 2124-MUM-2008-Correspondence-221216.pdf | 2018-08-09 |
| 18 | 2124-mum-2008-correspondence.pdf | 2018-08-09 |
| 19 | RTOA 2124_25 nov.pdf | 2018-08-09 |
| 20 | 2124-MUM-2008-DESCRIPTION(COMPLETE)-(1-10-2009).pdf | 2018-08-09 |
| 21 | Provisonal 2124 amended track+Clean.pdf | 2018-08-09 |
| 22 | CS_Marked+Cleandocx.pdf | 2018-08-09 |
| 23 | Claims Track+Clean.pdf | 2018-08-09 |
| 24 | 2124-mum-2008-description(provisional).pdf | 2018-08-09 |
| 25 | abstract1.jpg | 2018-08-09 |
| 26 | 2124-MUM-2008-DRAWING (1-10-2009).pdf | 2018-08-09 |
| 27 | 2124-mum-2008-drawing.pdf | 2018-08-09 |
| 28 | Abstract track+Clean.pdf | 2018-08-09 |
| 29 | 2124-mum-2008-form 1.pdf | 2018-08-09 |
| 30 | 2124-MUM-2008_POA.pdf | 2018-08-09 |
| 31 | 2124-mum-2008-form 2(1-10-2009).pdf | 2018-08-09 |
| 32 | 2124-MUM-2008_EXAMREPORT.pdf | 2018-08-09 |
| 33 | 2124-MUM-2008-FORM 2(TITLE PAGE)- (1-10-2009).pdf | 2018-08-09 |
| 34 | 2124-MUM-2008-HearingNoticeLetter.pdf | 2018-08-09 |
| 35 | 2124-mum-2008-form 2(title page).pdf | 2018-08-09 |
| 36 | 2124-MUM-2008-FORM 5 (1-10-2009).pdf | 2018-08-09 |
| 37 | 2124-mum-2008-form 3.pdf | 2018-08-09 |
| 38 | 2124-mum-2008-form 26.pdf | 2018-08-09 |
| 39 | 2124-mum-2008-form 2.pdf | 2018-08-09 |
| 40 | 2124-MUM-2008-2. Marked Copy under Rule 14(2) (MANDATORY) [31-10-2018(online)].pdf | 2018-10-31 |
| 41 | 2124-MUM-2008-Retyped Pages under Rule 14(1) (MANDATORY) [31-10-2018(online)].pdf | 2018-10-31 |
| 42 | 2124-MUM-2008-IntimationOfGrant31-10-2018.pdf | 2018-10-31 |
| 43 | 2124-MUM-2008-PatentCertificate31-10-2018.pdf | 2018-10-31 |
| 44 | 2124-MUM-2008-RELEVANT DOCUMENTS [23-03-2019(online)].pdf | 2019-03-23 |
| 45 | 2124-MUM-2008-RELEVANT DOCUMENTS [29-03-2020(online)].pdf | 2020-03-29 |
| 46 | 2124-MUM-2008-RELEVANT DOCUMENTS [30-09-2021(online)].pdf | 2021-09-30 |
| 47 | 2124-MUM-2008-RELEVANT DOCUMENTS [26-09-2022(online)].pdf | 2022-09-26 |
| 48 | 2124-MUM-2008-RELEVANT DOCUMENTS [28-09-2023(online)].pdf | 2023-09-28 |