Abstract: The present invention provides a method and system for on-the-fly encoding and streaming of video data in a peer-to-peer video sharing environment. Suppose a user of a recipient device wishes to watch video data available with a source device, or the user of the source device wishes to share the video data with the user of the recipient device. In this scenario, the source device determines the format of the video file corresponding to the requested video data and encodes the video data in a streaming friendly format. Substantially simultaneously, the source device starts streaming the already encoded portion of the video data without waiting for the entire video data to be encoded. The recipient device continuously receives the encoded portion of the video data and instantaneously plays video corresponding to the received video data on a display screen while the remaining portion of the video data is being encoded and streamed by the source device. Figure 2
RELATED APPLICATIONS
Benefit is claimed to:
a) Indian Provisional Application No. 1200/CHE/2012 titled "CODEC-AGNOSTIC ON-THE-FLY ENCODING AND STREAMING OF VIDEO DATA IN A PEER-TO-PEER VIDEO SHARING ENVIRONMENT" by Quikast Technologies Pvt. Ltd., filed on 28th March 2012, which is herein incorporated in its entirety by reference for all purposes; and
b) Indian Provisional Application No. 1443/CHE/2012 titled "METHOD AND SYSTEM FOR ENCODING AND STREAMING MEDIA AT INCREMENTAL BITRATES" by Quikast Technologies Pvt. Ltd., filed on 10th April 2012, which is herein incorporated in its entirety by reference for all purposes.
FIELD OF THE INVENTION
The present invention relates to the field of peer-to-peer video sharing systems, and more particularly relates to on-the-fly encoding and streaming of video data in a peer-to-peer video sharing environment.
BACKGROUND OF THE INVENTION
With the increasing availability of high bandwidth networks, video-on-demand applications are gaining popularity on global digital communications networks such as the Internet as well as the private and corporate digital communication internal networks commonly referred to as Intranets.
Peer-to-peer file sharing technologies are being rapidly adopted to distribute digital information (e.g., multi-media content such as video files). Peer-to-peer technology improves overall system reliability by allowing one or more peer computers to serve as a source of digital information requested by another peer computer. Thus, the risk of not being able to obtain digital information due to a server being non-operational (whether due to the server itself or one or more network connections) is mitigated.
Transfer of video data over digital networks includes encoding of video data and its transmission over the global digital communications networks. One of the known methods of streaming video data is a progressive streaming method in which a recipient device progressively downloads video data from a source device and starts playback of the downloaded video data. In the progressive streaming method, video quality does not dynamically adapt to network bandwidth at the recipient device. Further, the progressive streaming method is not secure since the entire video file is saved at the recipient device. More importantly, the progressive streaming method does not allow the user of the recipient device to seek to any point in the video file unless the download of video data up to that point is complete.
Another known method enables download of a video file on the recipient device using streaming protocols such as RTMP. In this method, video fragments of the video file are dynamically served on demand over a persistent connection. However, in this method, the source device has to maintain a persistent connection for streaming video files over such streaming protocols. Hence, this method suffers from scalability and reliability issues. In yet another known method, the source device dynamically streams video data to the recipient device over Hyper Text Transfer Protocol (HTTP) connections. In this method, a video in a source format is encoded into a streaming friendly format at different bitrates, with the meta-information being placed at the beginning of the video file before delivering the video file on the recipient device. Sometimes, the meta-information is provided in a manifest file accompanying the video file, where the meta-information informs the recipient device about the different qualities, i.e., bitrate information associated with the encoded video. However, in the current HTTP based streaming methods, the user has to wait until the encoder in the source device converts the entire video file into a streaming friendly format before delivering the video file for playing back the video on the recipient device.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
Figure 1 is a block diagram of an exemplary peer-to-peer video sharing system for sharing video data between two peer devices, according to one embodiment.
Figure 2 is a flow diagram illustrating an exemplary method of sharing video data in a peer-to-peer video sharing environment, according to one embodiment.
Figure 3 is a flow diagram illustrating an exemplary method of sharing video data in a peer-to-peer video sharing environment, according to another embodiment.
Figure 4A is a schematic representation depicting a process of simultaneous encoding and streaming of video data where meta-information is pre-pended with respective fragments of encoded video data, according to one embodiment.
Figure 4B is a schematic representation depicting a process of simultaneous encoding and streaming of a portion of video data as requested by a recipient device, according to another embodiment.
Figure 5 is a schematic representation depicting various playback scenarios when video data is encoded in an incremental bitrate format.
Figure 6 is a block diagram of an exemplary source device, such as those shown in Figure 1, showing various components for implementing embodiments of the present subject matter.
Figure 7 is a block diagram of an exemplary recipient device, such as those shown in Figure 1, showing various components for implementing embodiments of the present subject matter.
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
DETAILED DESCRIPTION OF THE INVENTION
The present invention provides a method and system for on-the-fly encoding and streaming of video data in a peer-to-peer video sharing environment. In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
The terms 'video data' and 'video content' are used interchangeably throughout the document.
Figure 1 is a block diagram of an exemplary peer-to-peer video sharing system 100 for sharing video data between two peer devices, according to one embodiment. In Figure 1, the system 100 includes a source device 102 and a recipient device 104 connected to the source device 102 via the network 106.
Suppose a user of the recipient device 104 wishes to watch a video content available with the source device 102, or the user of the source device 102 wishes to share the video content with the user of the recipient device 104. Also consider that the video content is stored on the source device 102 in a format not suitable for streaming or watching on the recipient device 104.
According to one embodiment, the source device 102 determines the format of the video file corresponding to the requested video data and encodes the video data in a streaming friendly format. Substantially simultaneously, the source device 102 starts streaming the already encoded portion of the video data without waiting for the entire video data to be encoded. The recipient device 104 continuously receives the encoded portion of the video data and instantaneously plays video corresponding to the received video data on a display screen while the remaining portion of the video data is being encoded and streamed by the source device 102. This eliminates the need for the recipient device 104 to wait until the entire video data is encoded into a streaming friendly format and then streamed to the recipient device 104.
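As an illustration only, the pipelined behavior described above can be sketched as a producer/consumer pair, where `encode` is a stand-in for a real video encoder and a queue plays the role of the network connection between the two devices (all names here are hypothetical):

```python
import queue
import threading

def encode(raw_chunks):
    """Stand-in encoder: converts each raw chunk into a streaming friendly chunk."""
    for chunk in raw_chunks:
        yield f"encoded({chunk})"

def source_device(raw_chunks, out_queue):
    # Hand each encoded chunk to the stream immediately, without
    # waiting for the whole file to finish encoding.
    for encoded_chunk in encode(raw_chunks):
        out_queue.put(encoded_chunk)
    out_queue.put(None)  # end-of-stream marker

def recipient_device(in_queue):
    played = []
    while True:
        encoded_chunk = in_queue.get()
        if encoded_chunk is None:
            break
        played.append(encoded_chunk)  # "play" each chunk as soon as it arrives
    return played

stream = queue.Queue()
raw = ["c0", "c1", "c2"]
producer = threading.Thread(target=source_device, args=(raw, stream))
producer.start()
result = recipient_device(stream)  # playback overlaps encoding
producer.join()
print(result)
```

The point of the sketch is only the overlap: the recipient begins consuming while the source is still producing, rather than after the full file is encoded.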
Now, consider that the user of the recipient device 104 wishes to watch video content from a new seek position when video data is being streamed by the source device 102. When the user seeks to a new position on the recipient device 104, the source device 102 determines a start point corresponding to the new seek position selected by the user. Accordingly, the source device 102 skips undesired portion of video data and resumes encoding video data from the new start point in a streaming friendly format. Substantially simultaneously, the source device 102 streams the encoded portion of video data from the new start point while the remaining video data from the new start point is being encoded. When the recipient device 104 receives the encoded video data from the new seek position, the recipient device 104 instantaneously plays video corresponding to the encoded video data on a display screen while the remaining video data from the new seek position is being encoded and streamed by the source device 102. This enables the user of the recipient device 104 to seek to a new position and instantaneously watch video from the new seek position without waiting for encoding and streaming undesired portion of video data by the source device 102. Also, this method saves resources required for re-encoding the entire video.
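A minimal sketch of the seek behavior described above, assuming fixed-duration chunks and a stand-in encoder (both assumptions for illustration; a real encoder would typically resume from the nearest key frame before the seek point):

```python
def encode_from_seek(chunks, chunk_duration_s, seek_position_s):
    """Skip the undesired chunks before the seek point and encode only
    from the new start point onward, yielding each encoded chunk as
    soon as it is ready."""
    start_index = int(seek_position_s // chunk_duration_s)
    for chunk in chunks[start_index:]:
        yield f"encoded({chunk})"

chunks = ["c0", "c1", "c2", "c3", "c4"]
# Seek to 25 s with 10 s chunks: c0 and c1 are skipped, never re-encoded.
streamed = list(encode_from_seek(chunks, 10, 25))
print(streamed)  # encoding resumes at c2
```

Skipping rather than re-encoding the earlier chunks is what saves the resources mentioned above.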
In accordance with the above embodiments, the source device 102 encodes and streams the video data to the recipient device 104 at a single bitrate based on current bandwidth. In some embodiments, the single bitrate is selected so as to closely match a current available bandwidth. That is, the source device 102 encodes and streams the video data to the recipient device 104 at a bitrate matching the current bandwidth. For example, if the current available bandwidth is 1.2Mbps, then the source device 102 encodes the video data at a bitrate of 1Mbps with 10 to 20% tolerance so as to match the current available bandwidth without affecting user experience due to delayed buffering of video data at the recipient device 104.
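One possible way to sketch this bitrate selection, assuming a hypothetical bitrate ladder and a fixed ~15% headroom standing in for the stated 10 to 20% tolerance:

```python
def select_bitrate(bandwidth_kbps, ladder_kbps=(256, 512, 1000, 1500),
                   headroom=0.15):
    """Pick the highest ladder bitrate that, with ~15% headroom, still
    fits within the measured bandwidth. Ladder values and headroom are
    illustrative assumptions, not values fixed by the description."""
    fitting = [b for b in ladder_kbps if b * (1 + headroom) <= bandwidth_kbps]
    # Fall back to the lowest quality if nothing fits the link.
    return max(fitting) if fitting else min(ladder_kbps)

print(select_bitrate(1200))  # 1.2 Mbps available -> encode at 1000 kbps
```

This reproduces the example in the text: with 1.2 Mbps available, the source encodes at 1 Mbps so buffering delays do not affect the user experience.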
Alternatively, if the bandwidth is constantly fluctuating, the source device 102 encodes and streams the video data in an incremental bitrate format. That is, the source device 102 structures the video data in such a way that initial blocks or bytes of data in a fragment corresponds to a low resolution video with subsequent blocks adding up to resolution progressively. As a result, the recipient device 104 can play the received video data at low resolution or high resolution based on current bandwidth. In other words, if only a portion of fragment corresponding to the encoded portion of the video data is buffered at the recipient device 104 prior to playing the buffered portion of fragment, the recipient device 104 plays the video corresponding to the received video data at a low resolution. However, if the entire fragment corresponding to the encoded portion of the video data is buffered prior to playback of the buffered fragment, the recipient device 104 plays the video corresponding to the received fragment at maximum resolution. This is possible as the source device 102 has encoded the video data in an incremental bitrate format.
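A simplified sketch of how a recipient might map the buffered portion of an incremental-bitrate fragment to a playable quality; the `(bitrate, size)` layer layout and the layer sizes are illustrative assumptions, not a defined wire format:

```python
def playable_quality(fragment_layers, buffered_bytes):
    """A fragment laid out in incremental-bitrate order: each
    (bitrate_kbps, size_bytes) layer refines the previous ones.
    Playback quality is the highest bitrate whose layers are all
    fully buffered; None means not even the base layer arrived."""
    quality = None
    consumed = 0
    for bitrate_kbps, size_bytes in fragment_layers:
        consumed += size_bytes
        if buffered_bytes >= consumed:
            quality = bitrate_kbps
        else:
            break
    return quality

layers = [(512, 4000), (1000, 3000), (1500, 3000)]  # assumed layer sizes
print(playable_quality(layers, 5000))   # only the 512 kbps base layer fits
print(playable_quality(layers, 9000))   # base plus first refinement layer
```

Because the initial bytes always form a complete low-resolution layer, a partially buffered fragment is still playable instead of being wasted.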
In the manner described above, the source device 102 encodes and streams video data to the recipient device 104 on-the-fly so that the video can be instantaneously played back at the recipient device 104 in parallel with encoding and streaming of remaining portion of video data. Since, the video data is encoded on-the-fly, the source device 102 need not pre-process video data and store multiple copies of video data encoded at different bitrates so as to match available bandwidth.
Figure 2 is a flow diagram 200 illustrating an exemplary method of sharing video data in a peer-to-peer video sharing system, according to one embodiment. When a user of the recipient device 104 selects a video file he or she wishes to download from the source device 102, then at step 202, the recipient device 104 sends a request for video data to the source device 102. In one embodiment, the video data requested by the recipient device 104 may correspond to a portion of video. In another embodiment, the video data requested by the recipient device 104 may correspond to entire video.
At step 204, the recipient device 104 determines current bandwidth between the source device 102 and the recipient device 104. At step 205, the recipient device 104 notifies the current bandwidth to the source device 102. At step 206, the source device 102 determines bitrate at which the video data is to be encoded so as to match the current bandwidth. In some embodiments, during downloading the video data, the recipient device 104 continuously determines whether there is network fluctuation or user activity on the recipient device 104. If any network fluctuation or user activity is detected, the recipient device 104 computes current available bandwidth and notifies the current available bandwidth to the source device 102. Accordingly, the source device 102 selects a different bitrate for encoding remaining video data to be streamed to the recipient device 104.
At step 208, the source device 102 determines duration (e.g., start point and end point) and format of the video data requested by the recipient device 104. At step 210, the source device 102 encodes video data into a streaming friendly format at the determined bitrate. It is appreciated that the source device 102 encodes the video data using one or more encoding techniques well known in the art.
At step 212, the source device 102 generates a fragment from the encoded portion of video data and associated meta-information (e.g., fragment duration, key frame locations, etc.). At step 214, the source device 102 streams the fragment to the recipient device 104 while encoding of remaining portion of video data is being performed. In some embodiments, the source device 102 streams the fragment in one or more data packets.
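Step 212 can be sketched as follows; the 4-byte length prefix and the JSON meta-information layout are illustrative assumptions made for this example rather than the format actually used:

```python
import json

def build_fragment(encoded_bytes, duration_s, keyframe_offsets):
    """Prepend a small meta-information header (fragment duration and
    key frame locations, as in step 212) to the encoded payload."""
    meta = json.dumps({"duration": duration_s,
                       "keyframes": keyframe_offsets}).encode()
    header = len(meta).to_bytes(4, "big")  # 4-byte length prefix (assumed)
    return header + meta + encoded_bytes

def parse_fragment(fragment):
    """Recipient side: read the length prefix, decode the meta-information,
    and return it together with the encoded payload."""
    meta_len = int.from_bytes(fragment[:4], "big")
    meta = json.loads(fragment[4:4 + meta_len])
    payload = fragment[4 + meta_len:]
    return meta, payload

frag = build_fragment(b"\x01\x02\x03", 4.0, [0, 2])
meta, payload = parse_fragment(frag)
print(meta, payload)
```

Carrying the meta-information with each fragment is what lets the recipient start decoding without waiting for a whole-file header.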
At step 216, the recipient device 104 processes the fragment corresponding to the encoded portion of the video data. At step 218, the recipient device 104 plays video corresponding to the processed fragment on a display screen. The steps 210 to 218 are performed till the requested video data is encoded and streamed to the recipient device 104. According to the present invention, the recipient device 104 need not wait till the entire video data is encoded for playback of video. Instead, the recipient device 104 can instantaneously play video using the received video data while remaining video data is being encoded and streamed.
Figure 3 is a flow diagram 300 illustrating an exemplary method of sharing video data in a peer-to-peer video sharing system, according to another embodiment. When a user of the recipient device 104 selects a video file he or she wishes to stream and watch from the source device 102, then at step 302, the recipient device 104 sends a request for video data to the source device 102. In one embodiment, the video data requested by the recipient device 104 may correspond to a portion of video. In another embodiment, the video data requested by the recipient device 104 may correspond to the entire video. The request may contain input parameters such as screen resolution, refresh rate, aspect ratio, hardware capabilities, etc.
At step 303, the recipient device 104 computes a current bandwidth between the source device 102 and the recipient device 104. At step 304, the recipient device 104 indicates the current bandwidth to the source device 102. At step 305, the source device 102 computes variance in the current bandwidth based on fluctuations in the current bandwidth. At step 306, the source device 102 determines bitrates suitable for encoding the requested video data in an incremental bitrate format. In some embodiments, the recipient device 104 continuously monitors fluctuations in the current available bandwidth during streaming of the video data and notifies the fluctuations in the current bandwidth to the source device 102. The source device 102 computes a variance in the current available bandwidth based on fluctuations in the current available bandwidth. Further, the source device 102 determines different bitrates for encoding the requested video data in an incremental bitrate format. In one exemplary implementation, the source device 102 determines a first bitrate value and a second bitrate value for encoding video data in an incremental bitrate format, where the first bitrate value is the minimum bitrate (e.g., 512 kbps) and the second bitrate value is the maximum bitrate value (e.g., 1 Mbps). In another exemplary implementation, the source device 102 may select more than two bitrate values (e.g., 512 kbps, 1 Mbps, and 1.5 Mbps) for encoding the video data in the incremental bitrate format if the variance in the current available bandwidth is large.
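A toy sketch of this variance-driven choice between two and three bitrates; the variance threshold and ladder values below are assumptions made for illustration, not values specified by the description:

```python
from statistics import pvariance

def choose_ladder(bandwidth_samples_kbps, high_variance_threshold=40000):
    """Return a min/max bitrate pair for a steady link, or a wider
    three-bitrate ladder when the measured bandwidth variance is large.
    The threshold and the ladder values are illustrative assumptions."""
    variance = pvariance(bandwidth_samples_kbps)
    if variance > high_variance_threshold:
        return (512, 1000, 1500)  # wide ladder for a fluctuating link
    return (512, 1000)            # min/max pair for a steadier link

print(choose_ladder([950, 1000, 1050]))       # low variance -> two bitrates
print(choose_ladder([400, 1500, 600, 1400]))  # high variance -> three bitrates
```

The idea matches steps 303 to 306: bandwidth samples flow from the recipient, and the source widens the bitrate set only when fluctuations warrant it.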
At step 307, the source device 102 determines duration (e.g., start point and end point) and format of the video data requested by the recipient device 104. At step 308, the source device 102 encodes the video data in an incremental bitrate format based on the determined bitrates. In some embodiments, the source device 102 encodes the video data in such a way that initial blocks of video data correspond to low resolution with subsequent blocks adding up to the quality of video progressively. In one exemplary implementation, the source device 102 encodes video data in the manner described in the following steps:
1) Block preparation- In this step, the video data is broken down into scanned coded samples of luminance and chrominance, and then regrouped and sampled into 8x8 pixel blocks.
2) Discrete Cosine Transform - In this step, each pixel block in the time domain is transformed into the frequency domain. Thus, each pixel block in the frequency domain consists of low frequency components and high frequency components. The lower frequency components contain most of the signal information and need to be preserved. The higher frequency components contain detailed entropy information and need to be preserved for playing high-definition media.
3) Quantization - Low-pass and corresponding band-pass filters for the chosen incremental bitrates are applied on each pixel block in the frequency domain to separate the frequency components of each pixel block into distinct, mutually exclusive frequency ranges.
4) Compression - The output data from the low-pass and band-pass filters is compressed using well known techniques like Huffman encoding, Run Length Encoding, etc.
5) Concatenation - As a final step, the compressed data is concatenated back to back to obtain encoded portion of video data.
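The five steps above can be sketched in heavily simplified one-dimensional form. Real codecs operate on 8x8 two-dimensional blocks and entropy-code each band (Huffman, RLE); both are omitted here, and the band boundaries are arbitrary assumptions:

```python
import math

def dct_1d(block):
    """Toy 1-D DCT-II over one block of samples (real codecs use 8x8
    2-D blocks and normalization factors, omitted here)."""
    n = len(block)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(block))
            for k in range(n)]

def encode_incremental(blocks, band_edges=(2, 5)):
    """Steps 1-5 in miniature: transform each block, separate the
    coefficients into low / band-pass / high frequency ranges (the
    filtering stage), and concatenate the bands so that the earliest
    data carries the low frequencies. Compression is omitted."""
    lo, mid = band_edges
    low_band, mid_band, high_band = [], [], []
    for block in blocks:
        coeffs = dct_1d(block)
        low_band.append(coeffs[:lo])     # base quality: most signal energy
        mid_band.append(coeffs[lo:mid])  # refinement layer
        high_band.append(coeffs[mid:])   # full-detail layer
    # Concatenation: base layer first so partial delivery is still playable.
    return low_band + mid_band + high_band

encoded = encode_incremental([[10, 12, 14, 12, 10, 8, 6, 8]])
print(len(encoded))  # one list per band for the single input block
```

Ordering the concatenated output base-layer-first is what makes the bitstream "incremental": a truncated prefix still decodes to a low-resolution picture.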
At step 310, the source device 102 generates a fragment from the encoded portion of video data and associated meta-information. At step 312, the source device 102 streams the fragment corresponding to the encoded portion of video data to the recipient device 104 while encoding of remaining portion of video data is being performed. At step 314, the recipient device 104 buffers the fragment corresponding to the encoded portion of the video data. Prior to playing video, at step 316, the recipient device 104 determines whether the fragment is fully received and buffered prior to start of playback.
If the recipient device 104 determines that fragment is fully received and buffered prior to playback, then at step 318, the recipient device 104 decodes the buffered fragment and plays video corresponding to the decoded fragment at a maximum resolution on the display screen. However, if the recipient device 104 determines that the fragment is partially received and buffered prior to the playback, then at step 320, the recipient device 104 decodes the partially buffered fragment and plays video corresponding to the decoded fragment data at a resolution lower than the maximum resolution on the display screen. This is possible as fragment received from the source device 102 contains a portion of video data encoded in an incremental bitrate format. Thus, even if part of the fragment is received, the recipient device 104 is able to play the video at a low clarity as initial blocks of portion of video data in the received fragment correspond to low resolution with subsequent blocks adding to quality of the video. The steps 306 to 320 are performed in parallel till the requested video data is encoded and streamed to the recipient device 104. Thus, the recipient device 104 need not wait till entire video data is encoded for playback of video. Instead, the recipient device 104 can instantaneously play video corresponding to the received video data while remaining video data is being encoded and streamed.
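Steps 316 to 320 reduce to a simple predicate on how much of the fragment arrived before playback starts; the concrete resolution labels below are illustrative assumptions:

```python
def playback_resolution(fragment_size, buffered_size,
                        max_resolution="1080p", fallback_resolution="480p"):
    """Steps 316-320 as a predicate: a fully buffered fragment decodes
    at the maximum resolution, a partially buffered one at a lower
    resolution. The resolution labels are assumptions for illustration."""
    if buffered_size >= fragment_size:
        return max_resolution      # step 318: full fragment buffered
    return fallback_resolution     # step 320: partial fragment buffered

print(playback_resolution(10000, 10000))  # full fragment -> max quality
print(playback_resolution(10000, 4000))   # partial fragment -> lower quality
```

The predicate only works because the fragment is incremental-bitrate encoded: a partial prefix is still a decodable low-resolution layer.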
Figure 4A is a schematic representation depicting a process of simultaneous encoding and streaming of video data where meta-information is pre-pended with respective fragment corresponding to encoded video data, according to one embodiment. As depicted in Figure 4A, the source device 102 simultaneously encodes and streams video data associated with requested video. As a result, video associated with each fragment can be instantly played back on the recipient device 104 without waiting for the encoding of the entire video file.
Figure 4B is a schematic representation depicting a process of simultaneous encoding and streaming of a portion of video data as requested by the recipient device 104, according to another embodiment. As depicted in Figure 4B, the source device 102 simultaneously encodes and streams only the desired portion of the video file based on a new seek position selected by the user so that the video can be instantaneously played back from the selected seek position. This can be achieved by simultaneously encoding and streaming only the desired portion of the video file while skipping the unprocessed video data and also providing meta-information with the desired portion of the video file to the recipient device 104.
Figure 5 is a schematic representation 500 depicting various playback scenarios when video data is encoded and streamed in an incremental bitrate format. Particularly, in Figure 5, various playback scenarios under fluctuating bandwidth are depicted with respect to playback timeline and buffer timeline.
Consider that at time T1, the position of playback head 502A is at the beginning of a fragment a3 and the buffered media pointer 504A is pointing at the position till which the part of fragment a3 containing only the 512 kbps quality of encoded video data is buffered. In such a case, the recipient device 104 starts playback with 512 kbps quality. However, due to lower network bandwidth, if the buffering of the remaining portion of the fragment a3 gets delayed compared to the normal playback time, the recipient device 104 discards the rest of the fragment a3 beyond the quality at which the media is played at that instant. Further, if the network bandwidth suddenly improves and fragment data of 1 Mbps is buffered before the playback head 502A reaches the position of the buffered media pointer 504A, the recipient device 104 automatically plays the video at a quality of 1 Mbps, higher than the 512 kbps quality.
In another scenario, consider that at time T2, the position of the playback head 502B is at the beginning of the fragment a4 and the buffered media pointer 504B is pointing at the position till which the part of fragment a4 containing only the 1 Mbps quality of video data is buffered. Hence, the recipient device 104 starts playback of video with a quality of 1 Mbps. But, if the bandwidth drops and buffering of the remaining portion of the fragment a4 gets delayed compared to the normal playback time, the recipient device 104 discards the remaining portion of the fragment a4 beyond the quality at which the video is played at that instant.
In yet another scenario, consider that at time T3, the position of the playback head 502C is at the beginning of the fragment a5 and the buffered media pointer 504C is pointing at the end of fragment a5 since the entire fragment a5 is already buffered. Hence, the recipient device 104 starts playback of video with the maximum quality of 1.5 Mbps. Substantially simultaneously, the next fragment a6 is buffered in the background, and playback quality scaling, if needed, happens only beyond the fragment a5.
In reality, the buffer pointer needs to be ahead of the playback pointer by around 30 seconds for an interrupt-free playback. The ability to use partially received fragments rather than discarding the partially received fragments helps minimize wastage and re-transmission of video data. This, in turn, helps the recipient device 104 to maintain the buffer more deterministically.
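The buffer-lead rule of thumb above can be expressed as a small check, with the 30-second target taken from the figure just mentioned:

```python
def buffer_lead_ok(playback_position_s, buffered_until_s, target_lead_s=30):
    """True when the buffered-media pointer is at least target_lead_s
    ahead of the playback head, the stated condition for
    interrupt-free playback."""
    return (buffered_until_s - playback_position_s) >= target_lead_s

print(buffer_lead_ok(100, 135))  # 35 s lead -> lead is sufficient
print(buffer_lead_ok(100, 120))  # 20 s lead -> lead is insufficient
```

A recipient could use such a check to decide when to request a lower bitrate and rebuild its lead.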
Figure 6 is a block diagram of the source device 102, such as those shown in Figure 1, showing various components for implementing embodiments of the present subject matter. In Figure 6, the source device 102 includes a processor 602, memory 604, a read only memory (ROM) 606, a communication interface 608, a bus 610, a display 612, an input device 614, and a cursor control 616.
The processor 602, as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a graphics processor, a digital signal processor, or any other type of processing circuit. The processor 602 may also include embedded controllers, such as generic or programmable logic devices or arrays, application specific integrated circuits, single-chip computers, smart cards, and the like.
The memory 604 may be volatile memory and non-volatile memory. For example, the memory 604 includes an on-the-fly encoding and streaming module 618 for encoding video in a streaming friendly format and streaming the encoded portion of video data in parallel with encoding of the remaining portion of video data, according to one or more embodiments described above. A variety of computer-readable storage media may be stored in and accessed from the memory elements. Memory elements may include any suitable memory device(s) for storing data and machine-readable instructions, such as read only memory, random access memory, erasable programmable read only memory, electrically erasable programmable read only memory, hard drive, removable media drive for handling compact disks, digital video disks, diskettes, magnetic tape cartridges, memory cards, and the like.
Embodiments of the present subject matter may be implemented in conjunction with modules, including functions, procedures, data structures, and application programs, for performing tasks, or defining abstract data types or low-level hardware contexts. The on-the-fly encoding and streaming module 618 may be stored in the form of machine-readable instructions on any of the above-mentioned storage media and may be executed by the processor 602. For example, a computer program may include machine-readable instructions capable of encoding video in a streaming friendly format and streaming the encoded portion of video data in parallel with encoding of the remaining portion of video data, according to the teachings and herein described embodiments of the present subject matter. In one embodiment, the computer program may be included on a compact disk-read only memory (CD-ROM) and loaded from the CD-ROM to a hard drive in the non-volatile memory.
The bus 610 acts as an interconnect between various components of the source device 102. The components such as the ROM 606, the communication interface 608, the display 612, the input device 614, and the cursor control 616 are well known to the person skilled in the art and hence the explanation thereof is omitted.
Figure 7 is a block diagram of the recipient device 104, such as those shown in Figure 1, showing various components for implementing embodiments of the present subject matter. In Figure 7, the recipient device 104 includes a processor 702, memory 704, a read only memory (ROM) 706, a communication interface 708, a bus 710, a display 712, an input device 714, and a cursor control 716.
The processor 702, as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a graphics processor, a digital signal processor, or any other type of processing circuit. The processor 702 may also include embedded controllers, such as generic or programmable logic devices or arrays, application specific integrated circuits, single-chip computers, smart cards, and the like.
The memory 704 may be volatile memory and non-volatile memory. For example, the memory 704 includes a playback module 718 for instantaneously playing video corresponding to the encoded portion of video data received from the source device 102 on the display 712, according to one or more embodiments described above. A variety of computer-readable storage media may be stored in and accessed from the memory elements. Memory elements may include any suitable memory device(s) for storing data and machine-readable instructions, such as read only memory, random access memory, erasable programmable read only memory, electrically erasable programmable read only memory, hard drive, removable media drive for handling compact disks, digital video disks, diskettes, magnetic tape cartridges, memory cards, and the like.
Embodiments of the present subject matter may be implemented in conjunction with modules, including functions, procedures, data structures, and application programs, for performing tasks, or defining abstract data types or low-level hardware contexts. The playback module 718 may be stored in the form of machine-readable instructions on any of the above-mentioned storage media and may be executed by the processor 702. For example, a computer program may include machine-readable instructions capable of instantaneously playing video corresponding to the encoded portion of video data received from the source device 102, according to the teachings and herein described embodiments of the present subject matter. In one embodiment, the computer program may be included on a compact disk-read only memory (CD-ROM) and loaded from the CD-ROM to a hard drive in the non-volatile memory.
The bus 710 acts as an interconnect between various components of the recipient device 104. The components such as the ROM 706, the communication interface 708, the display 712, the input device 714, and the cursor control 716 are well known to the person skilled in the art and hence the explanation thereof is omitted.
The present embodiments have been described with reference to specific example embodiments; it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. Furthermore, the various devices, modules, and the like described herein may be enabled and operated using hardware circuitry, for example, complementary metal oxide semiconductor based logic circuitry, firmware, software and/or any combination of hardware, firmware, and/or software embodied in a machine readable medium. For example, the various electrical structure and methods may be embodied using transistors, logic gates, and electrical circuits, such as application specific integrated circuit.
We Claim:
1. A method of sharing video data in a peer-to-peer video sharing environment, comprising:
encoding video data of a first format in a second format by a source device; and
substantially simultaneously streaming the encoded portion of video data to a recipient device while encoding of the remaining video data is in progress.
2. The method of claim 1, wherein substantially simultaneously streaming the encoded portion of the video data comprises:
generating a fragment from the encoded portion of the video data and associated meta-information; and
substantially simultaneously streaming the fragment corresponding to encoded portion of video data to the recipient device while encoding of remaining portion of the video data is in progress.
3. The method of claim 1, wherein encoding the video data of the first format in the second format comprises:
monitoring current bandwidth between the source device and the recipient device;
determining a bitrate for encoding the video data closely matching the current bandwidth; and
encoding the video data of the first format in the second format at the determined bitrate.
4. The method of claim 1, wherein encoding the video data of the first format in the second format comprises:
determining bitrates required for encoding the video data based on variance in current available bandwidth; and
encoding the video data in an incremental bitrate format based on the determined bitrates.
5. The method of claim 1, wherein encoding the video data of the first format in the second format comprises:
receiving a request for sharing a part of the video data from a new seek position from the recipient device while the encoding of the video data is in progress; and
encoding the part of the video data of the first format in the second format from the new seek position.
6. The method of claim 1, wherein the second format is a streaming friendly format and is playable at the recipient device.
7. A method of sharing video data in a peer-to-peer video sharing environment, comprising:
determining bitrates for encoding video data based on variance in current available bandwidth;
encoding the video data in an incremental bitrate format based on the determined bitrates; and
substantially simultaneously streaming the encoded portion of the video data to a recipient device while encoding of the remaining video data is in progress.
8. The method of claim 7, wherein encoding the video data in the incremental bitrate format comprises:
receiving a request for sharing a part of the video data from the recipient device from a new seek position while encoding and streaming of the video data is in progress; and
encoding the part of the video data from the new seek position in the incremental bitrate format based on the determined bitrates.
9. The method of claim 7, wherein substantially simultaneously streaming the encoded portion of the video data comprises:
generating a fragment from the encoded portion of video data and associated meta-information; and
substantially simultaneously streaming the fragment corresponding to the encoded portion of video data to the recipient device while encoding of the remaining portion of video data is in progress.
10. An apparatus comprising:
a processor; and
a memory coupled to the processor, wherein the memory includes an on-the-fly encoding and streaming module configured for:
encoding video data of a first format in a second format; and
substantially simultaneously streaming the encoded portion of video data to a recipient device while encoding of the remaining video data is in progress.
11. The apparatus of claim 10, wherein in substantially simultaneously streaming the encoded portion of the video data, the on-the-fly encoding and streaming module is configured for generating a fragment from the encoded portion of the video data and associated meta-information; and
substantially simultaneously streaming the fragment corresponding to encoded portion of video data to the recipient device while encoding of remaining portion of the video data is in progress.
12. The apparatus of claim 10, wherein in encoding the video data of the first format in the second format, the on-the-fly encoding and streaming module is configured for:
monitoring current bandwidth with the recipient device;
determining a bitrate for encoding the video data closely matching the current bandwidth; and
encoding the video data of the first format in the second format at the determined bitrate.
13. The apparatus of claim 10, wherein, in encoding the video data of the first format in the second format, the on-the-fly encoding and streaming module is configured for:
determining bitrates required for encoding the video data based on variance in current available bandwidth; and
encoding the video data in an incremental bitrate format based on the determined bitrates.
14. The apparatus of claim 10, wherein, in encoding the video data of the first format in the second format, the on-the-fly encoding and streaming module is configured for:
receiving a request for sharing a part of the video data from a new seek position from the recipient device while the encoding of the video data is in progress; and
encoding the part of the video data of the first format in the second format from the new seek position.
15. A method of playing video data shared by a peer device in a peer-to-peer video sharing environment, comprising:
sending a request for video data to the peer device;
receiving and buffering at least a part of a fragment corresponding to a portion of the video data encoded in an incremental bitrate format from the peer device, wherein initial blocks of the fragment correspond to low quality video data with subsequent blocks adding to the quality progressively;
determining whether the fragment is fully received and buffered prior to the start of the playback of the at least the part of the fragment;
playing video corresponding to the fragment at a maximum resolution if the fragment is fully received and buffered prior to the start of the playback of the buffered fragment; and
playing video corresponding to at least the part of the fragment at a resolution lower than the maximum resolution if the fragment is partially received and buffered prior to the start of the playback of the partially buffered fragment.
16. The method of claim 15, further comprising:
discarding fragment data of the fragment received after the start of the playback of the corresponding portion of the fragment when at least the part of the fragment is received and buffered.
17. An apparatus comprising:
a processor; and
a memory coupled to the processor, wherein the memory comprises a playback module configured for:
sending a request for video data to a peer device;
receiving and buffering at least a part of a fragment corresponding to a portion of the video data encoded in an incremental bitrate format from the peer device, wherein initial blocks of the fragment correspond to low quality video data with subsequent blocks adding to the quality progressively;
determining whether the fragment is fully received and buffered prior to the start of the playback of the at least the part of the fragment;
playing video corresponding to the fragment at a maximum resolution if the fragment is fully received and buffered prior to the start of the playback of the buffered fragment; and
playing video corresponding to at least the part of the fragment at a resolution lower than the maximum resolution if the fragment is partially received and buffered prior to the start of the playback of the partially buffered fragment.
18. The apparatus of claim 17, wherein the playback module is configured for discarding fragment data of the fragment received after the start of the playback of the corresponding portion of the fragment when at least the part of the fragment is received and buffered.
19. A system comprising:
a source device configured for:
encoding video data of a first format in a second format;
substantially simultaneously streaming the encoded portion of video data to a recipient device; and
the recipient device configured for:
playing video corresponding to the encoded portion of video data while remaining video data is being encoded and streamed by the source device.
20. The system of claim 19, wherein the source device is configured for encoding the video data of the first format in the second format at a bitrate substantially matching current bandwidth.
21. The system of claim 19, wherein the source device is configured for encoding the video data of the first format in the second format at an incremental bitrate in such a manner that initial blocks of the encoded portion of video data correspond to low quality video data with subsequent blocks adding to the quality progressively.
22. The system of claim 21, wherein the recipient device is configured for:
receiving and buffering at least a part of a fragment corresponding to the encoded portion of the video data;
determining whether the fragment is fully received and buffered prior to the start of the playback of the at least the part of the fragment;
playing video corresponding to the fragment at a maximum resolution if the fragment is fully received and buffered prior to the start of the playback of the buffered fragment; and
playing video corresponding to at least the part of the fragment at a resolution lower than the maximum resolution if the fragment is partially received and buffered prior to the start of the playback of the partially buffered fragment.
23. A non-transitory computer-readable storage medium having instructions stored therein that, when executed by a peer device, cause the peer device to perform a method comprising:
encoding video data of a first format in a second format;
substantially simultaneously streaming the encoded portion of video data to a recipient device; and
playing video corresponding to the encoded portion of video data while remaining video data is being encoded and streamed by a source device.
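As an illustrative sketch of the recipient-side behaviour recited in claims 15 and 22 (a fragment encoded at an incremental bitrate is played at the maximum resolution only when fully buffered before playback starts, and otherwise at a lower resolution), the resolution decision might look as follows. The helper name and the proportional-scaling rule are assumptions for illustration, not part of the claimed method.

```python
def playback_resolution(buffered_blocks, total_blocks, max_resolution):
    """Pick a playback resolution for one incremental-bitrate fragment.

    Initial blocks carry a low-quality rendition and later blocks refine it,
    so a partially buffered fragment is still playable, just at a resolution
    below the maximum.  Blocks arriving after playback of this fragment has
    started would simply be discarded by the caller (claims 16 and 18).
    """
    if buffered_blocks >= total_blocks:
        return max_resolution  # fragment fully buffered: maximum resolution
    # Partially buffered: scale quality with the fraction of blocks on hand
    # (a hypothetical policy; any monotone quality mapping would do).
    return max_resolution * buffered_blocks // total_blocks
```

For example, with a four-block fragment and a 1080-line maximum, two buffered blocks would yield a 540-line rendition under this assumed policy.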
| # | Name | Date |
|---|---|---|
| 1 | abstract1200-CHE-2012.jpg | 2014-02-26 |
| 2 | Power of Authority.pdf | 2012-04-02 |
| 3 | 1200-CHE-2012 ABSTRACT 28-03-2013.pdf | 2013-03-28 |
| 4 | 1200-CHE-2012 CLAIMS 28-03-2013.pdf | 2013-03-28 |
| 5 | 1200-CHE-2012 CORRESPONDENCE OTHERS 28-03-2013.pdf | 2013-03-28 |
| 6 | 1200-CHE-2012 DESCRIPTION(COMPLETE) 28-03-2013.pdf | 2013-03-28 |
| 7 | 1200-CHE-2012 DRAWINGS 28-03-2013.pdf | 2013-03-28 |
| 8 | 1200-CHE-2012 POWER OF ATTORNEY 28-03-2013.pdf | 2013-03-28 |
| 9 | 1200-CHE-2012 OTHER PATENT DOCUMENT 28-03-2013.pdf | 2013-03-28 |
| 10 | 1200-CHE-2012 FORM-2 28-03-2013.pdf | 2013-03-28 |
| 11 | 1200-CHE-2012 FORM-1 28-03-2013.pdf | 2013-03-28 |