Abstract: The main object of the present invention is to remove, from the video data sent from the broadcaster's end, the redundant data of color information that can be reconstructed at the receiving end, thereby further compressing the video data at the broadcaster's end. This and other objects of the invention can be achieved by reducing the amount of color information being sent in I-frames, for lowering the data pumped onto the network. The present invention also provides a system for regenerating the video data. Thus the present invention, in one preferred embodiment, provides a method for effective utilization of bandwidth when broadcasting video data, comprising the steps of: restricting transmission of redundant data of color information in adjacent frames of the video signal; receiving the video frames in a video processing unit of a device such as a digital television or a set-top-box; regenerating the data of color information at the receiving end, which was not transmitted, with the aid of the color information of key frames and the correlation between adjacent frames; and colorizing all grayscale frames taking the key-frame as reference, and storing the data in the buffer of the video processing unit.
The invention relates to a method and system for effective utilization of bandwidth when broadcasting video data. In particular, it relates to exploring the possibilities of further compressing the video data sent from the broadcaster's end, i.e. removing the information that can be reconstructed at the receiving side. The video signal sent from the broadcaster comprises a luminance signal and a chrominance signal. The luminance signal carries the brightness information of a picture or frame. The chrominance signal contains the color information of a frame. According to the present invention it is possible to lower the amount of data pumped onto the network by reducing the amount of color information being sent in I-frames. The invention also suggests a mechanism to regenerate that data at the receiving end.
Document US 5,523,786 A discloses a color sequential camera in which the chrominance components can be captured at a lower temporal rate than the luminance components.
The R and G sources of the RGB light source of this camera can be activated in combination to generate a luminance light beam. Separately, the R and B sources can provide separate chrominance light beams. An image sensor is provided to produce a color sequential signal having a sequence of luminance and chrominance image components, by capturing the image light reflected by a subject. The chrominance image components can be captured at a lower temporal rate than the luminance image components by activating the light sources so as to alternate the chrominance light beams between the luminance light beams. The document also allows capture of the chrominance components at a lower spatial resolution than the luminance image components by binning the sensor photosites together for capturing the chrominance image.
The main object of the present invention is to remove, from the video data sent from the broadcaster's end, the redundant data of color information that can be reconstructed at the receiving end, thereby further compressing the video data at the broadcaster's end.
This and other objects of the invention can be achieved by reducing the amount of color information being sent in I-frames, for lowering the data pumped onto the network.
The present invention also provides a system for regenerating the video data.
Thus the present invention, in one preferred embodiment, provides a method for effective utilization of bandwidth when broadcasting video data, comprising the steps of: restricting transmission of redundant data of color information in adjacent frames of the video signal; receiving the video frames in a video processing unit of a device such as a digital television or a set-top-box; regenerating the data of color information at the receiving end, which was not transmitted, with the aid of the color information of key frames and the correlation between adjacent frames; and colorizing all grayscale frames taking the key-frame as reference, and storing the data in the buffer of the video processing unit.
The invention will now be described in detail with the help of the figures of the accompanying drawings, in which
Figure 1 shows the picture types in MPEG display order.
Figure 2 shows in block diagram form the method of the present invention.
Figure 3 shows in flow chart form the signal processing of the present invention at the receiver.
The present invention describes a way to avoid sending redundant data of color information in adjacent frames of a video signal, and enables us to reconstruct a color video signal using the minimum required color information sent from the broadcaster along with the video frames. This has the additional advantage that instead of broadcasting both the luminance information and the color information of a frame, we can broadcast just the luminance signal, thereby lowering the overall bandwidth usage.
In general, as shown in Figure 1, there are three major picture types found in typical video compression designs, i.e. intra coded pictures, predicted pictures, and bi-directional predictive pictures. They are also commonly referred to as I-frames, P-frames, and B-frames. The I-frames are intra coded, i.e. they can be reconstructed without any reference to other frames. The P-frames are forward predicted from the last I-frame or P-frame, i.e. they cannot be reconstructed without the data of another frame (I or P). The B-frames are both forward predicted and backward predicted from the last / next I-frame or P-frame, i.e. two other frames are necessary to reconstruct them. P-frames and B-frames are referred to as inter coded frames.
That means we must buffer at least three frames: one for forward prediction and one for backward prediction. The third buffer contains the frame currently being decoded. As the figure shows, the frame for backward prediction follows the predicted frame. That would require suspending the decoding of B-frames till the next I- or P-frame appears. But fortunately the display order is not the coding order. The frames appear in the MPEG data stream in such an order that the referred frames precede the referring frames.
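For illustration (the code is not part of the original disclosure), a minimal Python sketch of this reordering follows, assuming a simple GOP in which every B-frame references only the nearest preceding and following I- or P-frame; the frame labels are invented for the example.

```python
# A minimal sketch of MPEG display-to-coding-order reordering, under the
# assumption that each B-frame references only its neighbouring I/P frames.
def display_to_coding_order(frames):
    """Reorder display-order frames so reference frames precede B-frames."""
    coding, pending_b = [], []
    for f in frames:
        if f.startswith("B"):
            pending_b.append(f)       # B-frames wait for their backward reference
        else:
            coding.append(f)          # I/P frame: emit it first ...
            coding.extend(pending_b)  # ... then the B-frames that reference it
            pending_b = []
    coding.extend(pending_b)          # trailing B-frames (open GOP edge case)
    return coding

display = ["I1", "B2", "B3", "P4", "B5", "B6", "P7"]
print(display_to_coding_order(display))
# ['I1', 'P4', 'B2', 'B3', 'P7', 'B5', 'B6']
```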
Often, I-frames are used for random access and as references for the decoding of other pictures. Intra refresh periods of a half-second are common in such applications as digital television broadcast and DVD storage.
Although the general problem of adding chromatic values to a grayscale image has no exact, objective solution, a few approaches suggest ways to do so.
The following points describe the embodiments of the present invention in their preferred form:
The broadcaster can reduce the amount of information sent, i.e. remove the color information from all the frames lying between two I-frames which are t seconds apart from each other. These two I-frames can be chosen depending upon the desired compression and can have B-, P- and even I-frames in between them. These two frames will serve as KEY FRAMES for the frames lying between them.
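A minimal broadcaster-side sketch of this idea follows (in Python, for illustration only). Frames are assumed to be dictionaries holding a type tag, a luminance plane ('Y') and chrominance planes ('Cb', 'Cr'); the field names, frame rate and key-frame spacing rule are illustrative assumptions, not part of the disclosure.

```python
# Sketch: keep full color only on key I-frames t seconds apart; strip
# chrominance from everything in between.
def strip_chrominance(frames, fps=25, t=2):
    """Drop Cb/Cr from every frame except key I-frames t seconds apart."""
    key_interval = int(fps * t)
    out = []
    for i, frame in enumerate(frames):
        is_key = frame["type"] == "I" and i % key_interval == 0
        if is_key:
            out.append(frame)                   # key frame: keep full color
        else:
            out.append({"type": frame["type"],  # in-between frame: luminance only
                        "Y": frame["Y"]})
    return out
```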
At the receiving end we have to regenerate the information which has been deliberately removed at the broadcaster's end. Two contiguous frames do not differ much with respect to their color information. Using the color information of the key frames and taking advantage of the correlation between adjacent frames, we can recover this color information.
The following steps elucidate the method involved in colorizing the other frames using forward and backward colorization.
While receiving video frames, it should be checked whether a frame is an I-frame with color information. If we receive any such frame it can be marked as KEY-FRAME PREV. Processing of frames and storing them in a buffer can be continued while checking for the next I-frame with color information. If we receive the next such I-frame it can be marked as KEY-FRAME NEXT. Now the data of the buffer can be pushed to the colorizing module. Using KEY-FRAME PREV and KEY-FRAME NEXT, all grayscale frames can be colorized (recovering their color information). These frames can then be pushed into the video buffer.
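A minimal receiver-side sketch of this loop, assuming the same dictionary-based frame representation as above; the generator interface and the colorize() callback are illustrative assumptions, while the KEY-FRAME PREV / KEY-FRAME NEXT roles and the buffer handoff follow the description.

```python
# Sketch: detect colored I-frames, buffer the grayscale frames between them,
# and hand each span to the colorizing module once both key frames are known.
def receive_and_colorize(frame_stream, colorize):
    key_prev, buffer = None, []
    for frame in frame_stream:
        if frame["type"] == "I" and "Cb" in frame:        # colored I-frame
            if key_prev is not None:                      # frame is KEY-FRAME NEXT
                yield from colorize(key_prev, frame, buffer)
            yield frame                                   # key frames pass through
            key_prev, buffer = frame, []                  # NEXT becomes PREV
        else:
            buffer.append(frame)                          # grayscale: hold in buffer
    # Frames after the last key frame remain uncolorized in this sketch.
```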
We can calculate the size of the buffer required taking the following points into consideration (a back-of-the-envelope example follows the list):
- the number of frames to be displayed per second;
- the time required for processing one grayscale frame;
- the size should be such that the video buffer does not get starved.
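The sizing example below uses assumed numbers (25 fps, 8 ms of processing per grayscale frame, key frames 2 seconds apart); none of these figures come from the disclosure.

```python
import math

fps = 25               # frames displayed per second (assumed)
t_proc = 0.008         # seconds to colorize one grayscale frame (assumed)
key_interval_s = 2.0   # seconds between colored key frames (assumed)

frames_per_span = int(fps * key_interval_s)   # grayscale frames held at once: 50
span_proc_time = frames_per_span * t_proc     # time to colorize one span: 0.4 s
# The video buffer must cover playback for as long as colorizing takes,
# or it starves; round up to whole frames and add the span itself.
min_buffer_frames = frames_per_span + math.ceil(span_proc_time * fps)
print(min_buffer_frames)                      # 50 + 10 = 60 frames
```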
Figure 2 is a self-explanatory block diagram of the present invention for avoiding the sending of redundant data of color information, thereby reducing the overall bandwidth usage. The stages M1-M5 in Figure 2 elaborate the steps, described below, for the reconstruction of the color data at the receiving side.
M1: The video processing unit of the device (digital television, STB etc.) will keep on receiving video frames, keeping a check on the number of colored I-frames received. All the frames will be stored in an intermediate buffer.
M2: This stage corresponds to the process of identifying key-frames and storing them.
M3: Then the colorizing module reads the data from the intermediate buffer.
M4: All grayscale frames are colored taking the key-frames as reference.
M5: Finally, the data is pushed to the buffer of the video processing unit of the device.
Figure 3 is a flow diagram of the present invention for the processing of the signal at the receiving end for the reconstruction of the color data.
Colorization is a term used to describe a computerized process for adding color to black and white pictures, movies or television programs by replacing the scalar value stored at each pixel of the grayscale image by a vector in a three dimensional color space with luminance, saturation and hue, or simply RGB. Since different colors may carry the same luminance value but vary in hue and/or saturation, the problem of colorization has no inherently "correct" solution.
Due to these ambiguities, human interaction usually plays a large role. The colorization method described here takes advantage of a morphological distance transformation and a predictive algorithm for determining the color information of a pixel, using two colored frames as reference frames.
Flow estimation: In a video clip, there exists correlation between adjacent frames. In many cases, the background will remain unchanged while an object is moving. Given two neighbouring frames, I_k and I_{k+1}, each pixel p1(x, y) in I_{k+1} is assumed to come from a pixel p2(x + u, y + v) in I_k:

I_{k+1}(x, y) = I_k(x + u, y + v)

where u describes the horizontal velocity of the pixel and v describes the vertical velocity of the pixel.
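The description does not fix a flow estimation method; one simple possibility is block matching, sketched below, which searches a small window in I_k for the patch that best matches the neighbourhood of (x, y) in I_{k+1}. The 8x8 patch, the search radius and the use of numpy are implementation assumptions, and (x, y) is assumed to lie well inside the frame.

```python
import numpy as np

def estimate_flow(I_k, I_k1, x, y, patch=4, radius=4):
    """Return (u, v) such that I_{k+1}(x, y) ~= I_k(x + u, y + v)."""
    target = I_k1[y - patch:y + patch, x - patch:x + patch]   # 8x8 neighbourhood
    best, best_uv = np.inf, (0, 0)
    for v in range(-radius, radius + 1):
        for u in range(-radius, radius + 1):
            cand = I_k[y + v - patch:y + v + patch, x + u - patch:x + u + patch]
            if cand.shape != target.shape:
                continue                                      # window left the frame
            sad = np.abs(cand.astype(int) - target.astype(int)).sum()
            if sad < best:                                    # keep lowest SAD match
                best, best_uv = sad, (u, v)
    return best_uv
```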
Color similarity: Similarly, there exists a correlation between adjacent frames with respect to color values. Just as in video compression techniques, the in-between frames can be processed with the two nearby key frames. The temporally nearer the target frame is to a key frame, the more similar the target frame is to that key frame. Say there are f + b frames between two key frames, and the target frame is f frames from the first key frame and b frames from the second key frame. Thus the similarity coefficient of the first key frame is b/(f+b) and that of the second key frame is f/(f+b).
We have:

R = Rf * Cf + Rb * Cb

where R is the target frame result, Rf the result of forward colorization, Cf the similarity coefficient of the first key frame, Rb the result of backward colorization, and Cb the similarity coefficient of the second key frame.
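A direct transcription of this blend follows; Rf and Rb are assumed to be chrominance arrays (e.g. numpy arrays) produced by the forward and backward passes.

```python
def blend_colorizations(Rf, Rb, f, b):
    """R = Rf*Cf + Rb*Cb with Cf = b/(f+b) and Cb = f/(f+b)."""
    Cf = b / (f + b)   # first key frame weight: nearer target => larger weight
    Cb = f / (f + b)   # second key frame weight
    return Rf * Cf + Rb * Cb

# E.g. a target frame 1 frame after the first key and 3 before the second:
# Cf = 3/4 and Cb = 1/4, so the nearer (first) key frame dominates.
```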
The forward colorization and backward colorization are identical except that they take the foregoing key frame and the latter key frame as reference, respectively. The procedure of colorization is described below:
Step 1 Take the key frames from the buffer.
Step 2 Compare the illumination of the pixels at the same coordinate (x, y). If the difference is lower than a given threshold E, the luminances of the two pixels are regarded as unchanged, i.e. the pixel has not moved. Under this circumstance, the chromatic information a(x, y) and b(x, y) of the pixel in the reference image is transferred to the target image directly. If the illumination difference is larger than E, we can trace the pixel and transfer the color of the reference image to the target one, keeping the illumination unchanged.
Step 3 If there is no matching pixel in the whole reference image, we can synthesize its a and b values with bilinear interpolation of nearby pixels.
The threshold E plays an important role in the matching procedure. If it is too small, normal pixels are regarded as noise spots. If it is too large, two pixels of different illumination will be deemed identical, and we get a result image of poor quality.
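A per-pixel sketch of Steps 1-3 under the threshold E follows, reusing estimate_flow from the flow estimation sketch above. Frames are assumed to be in a Lab-like form (an L luminance plane, and a/b chrominance planes for the reference key frame); the value E = 10 is illustrative, and a local mean stands in for the bilinear interpolation of Step 3.

```python
import numpy as np

def colorize_pixel(L_ref, a_ref, b_ref, L_tgt, x, y, E=10):
    """Transfer chroma for the target pixel (x, y) from the reference key frame."""
    if abs(int(L_tgt[y, x]) - int(L_ref[y, x])) < E:
        return a_ref[y, x], b_ref[y, x]        # Step 2: pixel did not move
    u, v = estimate_flow(L_ref, L_tgt, x, y)   # Step 2: trace the moved pixel
    yy, xx = y + v, x + u
    if 0 <= yy < L_ref.shape[0] and 0 <= xx < L_ref.shape[1]:
        return a_ref[yy, xx], b_ref[yy, xx]    # transfer color, keep luminance
    # Step 3: no match found; a local chroma mean approximates the
    # bilinear interpolation of nearby pixels described in the text.
    ys, xs = slice(max(y - 1, 0), y + 2), slice(max(x - 1, 0), x + 2)
    return a_ref[ys, xs].mean(), b_ref[ys, xs].mean()
```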
Thus, bandwidth usage can be lowered considerably by the present invention. It is also possible to get more channels in the same bandwidth. The present invention also allows a reduction in the cost of developing black-and-white or grayscale media into color: only a few black-and-white or grayscale frames are required to be colored, and a television or any other processing apparatus can perform the coloring of the rest of the frames. The invention also allows a reduction of network overheads.
WE CLAIM
1. A method for effective utilization of bandwidth when broadcasting video data, comprising the steps of:
- restricting transmission of redundant data of color information in
adjacent frames of video signal;
- receiving the video frames in a video processing unit of a device
like a digital television or a set-top-box;
- regenerating the data of color information at the receiving end,
which was not transmitted, with the aid of the color information of key
frames and correlation between adjacent frames; and
- colorizing all grayscale frames taking the key-frame as reference,
and storing the data in the buffer of the video processing unit.
2. The method as claimed in claim 1, wherein the video frames are checked
if an I-frame with color is received and if so it can be marked as a key
frame.
3. The method as claimed in claim 2, wherein processed frames are stored in
a buffer and sent to a colorizing module.
4. The method as claimed in claim 1, wherein the step of colorization
comprises the steps:
- taking the key-frame from the buffer;
- comparing the illumination of the pixels at a given coordinate; and
- synthesizing, in the event of there being no matching pixels in the
whole reference image, the a and b values with bilinear interpolation
of nearby pixels.
5. The method as claimed in claim 1, wherein the size of the required buffer
is selected on the basis of the number of frames to be displayed per second
and the time required for processing one grayscale frame, and the selected
size is such that the video buffer does not get starved.
6. A method for effective utilization of bandwidth when broadcasting video
data, substantially as herein described and illustrated in the figures of the
accompanying drawings.