
Method Of Rendering 360 Degree Video Content On A Client Device And System Thereof

Abstract: The present disclosure relates to a method of rendering 360-degree video content on a client device and a system thereof. The method includes obtaining the 360-degree video content in a cube-map projection format. The method includes converting the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters. Each of the WDM video streams represents a viewing angle range at a maximum resolution. A resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content. The method includes generating a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions. The method includes transmitting at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.


Patent Information

Application #
Filing Date
01 February 2018
Publication Number
01/2020
Publication Type
INA
Invention Field
COMMUNICATION
Status
Email
mail@lexorbis.com
Parent Application

Applicants

SHILPMIS TECHNOLOGIES PVT. LTD.
SHILP MAITRI HOUSE, BHATAR CHAR RASTA, SURAT-395 007, GUJARAT, INDIA.

Inventors

1. SHILPMIS TECHNOLOGIES PVT. LTD.
SHILP MAITRI HOUSE, BHATAR CHAR RASTA, SURAT-395 007, GUJARAT, INDIA.

Specification

DESC:DESCRIPTION
TECHNICAL FIELD
The present invention generally relates to processing 360-degree video content, and more particularly, to a method of rendering 360-degree video content on a client device and a system thereof.
BACKGROUND
With the advent of technology, there has been an exponential penetration of high-end smartphones, alongside other dedicated or standalone devices, that allow a majority of users to experience 360-degree video content. The 360-degree video content can include videos and virtual reality (VR) videos. The 360-degree video content is typically optimized or processed using various techniques to ensure the highest possible quality of viewing experience on high-end smartphones and dedicated devices. However, such optimization techniques are either limited to predefined or pre-recorded 360-degree video content or performed as post-processing during live streaming of 360-degree video content. In addition, such optimization techniques are specific to the standalone devices and to platform-specific applications available on the high-end smartphones.
Further, transmission of the 360-degree video content is limited by the available bandwidth and hardware support of high-end smartphones and standalone devices. As a result, the quality of viewing experience becomes poor on high-end smartphones if sufficient bandwidth is not available. Likewise, the quality of viewing experience is poor on mid-range smartphones, as mid-range smartphones lack the necessary hardware support in order to keep their cost lower. Furthermore, live streaming or on-demand streaming of the 360-degree video content on a smartphone or dedicated device is designed as an Ultra High Definition (UHD) extrapolation of the traditional production pipeline. As such, mid-range smartphones consume a considerable amount of bandwidth, power, and other resources while rendering/displaying the 360-degree video content, compared to regular 2-Dimensional (2D) video content. Other factors, such as inefficient spherical projection (equirectangular), ultra-high resolution of the overall 360-degree video content, the requirement of an acceptable resolution for the 90-degree viewport, and non-standard frame sizes of 360-degree video content, result in inefficient processing, transcoding, and playback of the 360-degree video content on mid-range smartphones. This further leads to a poor viewing experience of the 360-degree video content on mid-range smartphones, since mid-range smartphones do not support more than 2K resolution during playback.
Thus, there exists a need for a solution that enables rendering of the 360-degree video content on any device, without consuming excess bandwidth and irrespective of the type of the device.
SUMMARY
This summary is provided to introduce a selection of concepts in a simplified format that are further described in the detailed description of the present disclosure. This summary is not intended to identify key or essential inventive concepts of the claimed subject matter, nor is it intended for determining the scope of the claimed subject matter. In accordance with the purposes of the disclosure, the present disclosure as embodied and broadly described herein, describes a method of rendering 360-degree video content on a client device and a system thereof.
In accordance with an embodiment, the method includes obtaining the 360-degree video content in a cube-map projection format. The method includes converting the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters. Each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content. The method includes generating a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions. The method includes transmitting at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.
In accordance with the embodiment, the system is disclosed. The system includes a receiving unit, a stream generating unit, and a rendering unit. The receiving unit is to obtain the 360-degree video content in a cube-map projection format. The stream generating unit is to convert the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters. Each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content. The stream generating unit is to generate a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions. The stream generating unit is to transmit at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.
In accordance with the embodiment, the method includes receiving a user-input indicative of accessing the 360-degree video content on the client device. The method includes accessing a manifest file from a database in response to the user-input, the manifest file comprising access links corresponding to a plurality of WDM video streams at multiple resolutions. The method includes determining a viewpoint of the user. The method includes determining a data corresponding to a plurality of device parameters associated with the client device. The method includes selecting a link of a WDM video stream from the manifest file corresponding to the viewpoint based on the data such that the WDM video stream is closest to the viewpoint of the user. The method includes accessing the WDM video stream corresponding to the selected link from the database. The method includes rendering the WDM video stream on a display.
In accordance with the embodiment, the client device is disclosed. The client device includes a receiving unit and a rendering unit. The receiving unit is to receive a user-input indicative of accessing the 360-degree video content on the client device. The rendering unit is to access a manifest file from a database in response to the user-input. The manifest file comprises access links corresponding to a plurality of WDM video streams at multiple resolutions. The rendering unit is to determine a viewpoint of the user. The rendering unit is to determine a data corresponding to a plurality of device parameters associated with the client device. The rendering unit is to select a link of a WDM video stream from the manifest file corresponding to the viewpoint based on the data such that the WDM video stream is closest to the viewpoint of the user. The rendering unit is to access the WDM video stream corresponding to the selected link from the database. The rendering unit is to render the WDM video stream on a display.
The advantages of the present invention include, but are not limited to, enabling rendering of the 360-degree video content on the client device by creating multiple peripheral views or WDM video streams with varying qualities for every possible viewpoint or orientation of the user. Each peripheral view is at a resolution supported by the client device but lesser than the higher resolution of the 360-degree video content. This enhances the viewing experience without compromising on the quality of the 360-degree video content. In addition, consumption of bandwidth, power, and other resources is reduced without the requirement of complex or specific hardware support at the client device. Thus, the 360-degree video content can be rendered on any device, without consuming excess bandwidth and/or resource(s) and without restriction to a type of the device.
These aspects and advantages will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
BRIEF DESCRIPTION OF FIGURES
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIGURE 1 illustrates an example environment to render 360-degree video content on a client device, in accordance with an example embodiment of the present disclosure.
FIGURE 2 schematically illustrates a system to render the 360-degree video content on the client device, in accordance with the example embodiment of the present disclosure.
FIGURE 3 illustrates a process flow for converting a cube-map projection format of the 360-degree video content to a plurality of weighted density maps (WDM) video streams, in accordance with the example embodiment of the present disclosure.
FIGURE 4 illustrates an example layout of cube-map projection format of one frame of the 360-degree video content, in accordance with the example embodiment of the present disclosure.
FIGURE 5 illustrates an example cube-map projection format of one frame of the 360-degree video content as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure.
FIGURES 6A and 6B illustrate first example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure.
FIGURES 7A and 7B illustrate example WDM video streams converted from the cube-map projection format as shown in FIGURE 6A and 6B, respectively, in accordance with the example embodiment of the present disclosure.
FIGURES 8A, 8B, 8C, and 8D illustrate second example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure.
FIGURES 9A, 9B, 9C, and 9D illustrate third example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure.
FIGURE 10 schematically illustrates the client device to render the 360-degree video content, in accordance with the example embodiment of the present disclosure.
FIGURE 11 illustrates a process flow for rendering the 360-degree video content based on the WDM video stream on the client device, in accordance with the example embodiment of the present disclosure.
FIGURES 12-16 illustrate flow diagrams of a method of rendering the 360-degree video on the client device, in accordance with the example embodiment of the present disclosure.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
FIGURE 1 illustrates an example environment 100 to render 360-degree video content on a client device, in accordance with an example embodiment of the present disclosure. The environment 100 includes a system 102 that obtains 360-degree video content in a cube-map projection format from a content source 104. In one implementation, the system 102 can be a standalone server. In another implementation, the system 102 can be part of an existing server capable of processing 360-degree video content. Examples of the 360-degree video content include, but are not limited to, a video obtained from stitching a plurality of videos shot at different angles, a video shot from a 360-degree video capturing device, a virtual reality (VR) video obtained from stitching a plurality of VR videos shot at different angles, a VR video shot from a 360-degree video capturing device, a stereo 360-degree video, a mono 360-degree video, etc. The 360-degree video content is one of a real-time 360-degree video content and a pre-recorded 360-degree video content. Examples of the content source 104 include, but are not limited to, a 360-degree video capturing device, a mobile device such as a smartphone capable of creating 360-degree video content, a server storing pre-recorded 360-degree video contents, a broadcast network system capable of broadcasting live 360-degree content and/or streaming on-demand 360-degree video content, etc. The system 102 is connected to the content source 104 over a wireless network (represented by the arrow). Examples of the wireless network include, but are not limited to, a cloud-based network, Wi-Fi, etc.
The system 102 includes video processing module(s)/unit(s) 106 to process the 360-degree video content in the cube-map projection format and obtain a plurality of weighted density map (WDM) video streams and a manifest file. Each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content. The manifest file comprises access links corresponding to the plurality of WDM video streams at multiple resolutions.
At least one of the plurality of WDM video streams is then streamed over a network 108 to a client device 110 based on the manifest file, a viewpoint of a user of the client device 110, and parameters of the client device 110 to render the 360-degree video content on the client device 110. The network 108 is a content distribution network responsible for distributing content to various client devices. The client device 110 is any electronic device capable of rendering video content. The client device 110 may or may not support a resolution of the 360-degree video content. The resolution of the 360-degree video content can be 4K, Ultra HD, etc., and the resolution supported by the client device 110 can be 4K, Ultra HD, 2K, Full HD, etc. Examples of the client device 110 include, but are not limited to, mid-range smartphones, high-end smartphones, standalone VR devices, laptops, tablets, etc.
The client device 110 includes a video playing module(s)/unit(s) 112 for receiving the plurality of WDM video streams and displaying the plurality of WDM video streams on the client device 110. In one implementation, the video playing module(s) 112 can be embedded in a web-browser. In another implementation, the video playing module(s) 112 can be embedded in a mobile-based application downloaded onto the client device 110. In yet another implementation, the video playing module(s) 112 can be embedded in a web-based application downloaded onto the client device 110. Accordingly, the video playing module(s) 112 receives a user-input indicative of accessing the 360-degree video content. In response, the video playing module(s) 112 accesses the manifest file. The video playing module(s) 112 then determines the viewpoint of the user of the client device 110 and the device parameters of the client device 110. Based on the determined viewpoint and the device parameters, the video playing module(s) 112 selects from the manifest file a link of a WDM video stream which is closest to the viewpoint of the user. The video playing module(s) 112 then accesses the WDM video stream and renders the WDM video stream on the client device 110.
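The closest-stream selection step above can be sketched as follows. This is a minimal illustration, assuming the manifest is exposed to the player as a mapping from (start, end) viewing-angle ranges in degrees to stream URLs and the viewpoint is a single yaw angle; the disclosure does not fix either representation, and all names here are hypothetical.

```python
def select_stream(manifest: dict, viewpoint_deg: float) -> str:
    """Pick the access link whose viewing-angle range centre is closest
    (by circular angular distance) to the user's current viewpoint."""
    def centre(angle_range):
        start, end = angle_range
        span = (end - start) % 360 or 360   # handles wrap-around ranges like (270, 90)
        return (start + span / 2) % 360

    def angular_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)              # shortest way around the circle

    best = min(manifest, key=lambda r: angular_dist(centre(r), viewpoint_deg))
    return manifest[best]

# Assumed two-stream manifest matching the factor-two example in the text
manifest = {(0, 180): "wdm-1.m3u8", (180, 360): "wdm-2.m3u8"}
print(select_stream(manifest, 45))   # -> wdm-1.m3u8
print(select_stream(manifest, 200))  # -> wdm-2.m3u8
```

In a real player, the viewpoint would be re-evaluated continuously so the client can switch streams as the user turns.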
FIGURE 2 schematically illustrates the system 102 to render 360-degree video content on the client device 110, in accordance with the example embodiment of the present disclosure. The system 102 includes the video processing module(s) 106, a processor 202, a memory 204, a communication interface unit 206, resource(s) 208, and data 210. The communication interface unit 206 enables transmission and reception of data between the system 102 and the content source 104, and between the system 102 and the client device 110. The resource(s) 208 can refer to units necessary for proper functioning of the system 102 apart from the processor 202 and the memory 204. The processor 202, the memory 204, the communication interface unit 206, the resource(s) 208, and the video processing module(s) 106 are communicatively coupled with each other. The data 210 serves, amongst other things, as a repository for storing data processed, received, and generated by the video processing module(s) 106 or during interactions between any of the aforementioned components.
In an implementation, the video processing module(s) 106 can include a receiving unit 212, a stream generating unit 214, and a transcoding unit 216. The receiving unit 212, the stream generating unit 214, and the transcoding unit 216 are in communication with each other. The video processing module(s) 106 amongst other things include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The video processing module(s) 106 may also be implemented as, signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulate signals based on operational instructions.
Further, the video processing module(s) 106 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 202, a state machine, a logic array or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks or, the processing unit can be dedicated to perform the required functions. In another aspect of the present disclosure, the video processing module(s) 106 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities.
The operation of the system 102 will now be explained using FIGURE 3 to FIGURES 9A-9D. For the sake of brevity, details that are already explained in FIGURE 1 and FIGURE 2 are not repeated herein.
FIGURE 3 illustrates a process flow 300 for converting a cube-map projection format of the 360-degree video content to a plurality of weighted density maps (WDM) video streams, in accordance with the example embodiment of the present disclosure. Accordingly, the receiving unit 212 obtains the 360-degree video content in the cube-map projection format 302 from the content source 104. In one implementation, the receiving unit 212 directly obtains the 360-degree video content in the cube-map projection format 302. In another implementation, the receiving unit 212 obtains a 360-degree video content in an equirectangular projection format 302-A. In such implementation, the receiving unit 212 converts the 360-degree video content in the equirectangular projection format 302-A to the 360-degree video content in the cube-map projection format 302 using techniques as known in the art.
The stream generating unit 214 then receives the 360-degree video content in the cube-map projection format 302 from the receiving unit 212. The stream generating unit 214 converts the cube-map projection format 302 into a plurality of weighted density map (WDM) video streams 304 (represented as WDM-1, WDM-2, WDM-3, …WDM-N) based on a first set of parameters 306 and a second set of parameters 308. Each of the WDM video streams 304 represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams 304 is lesser than a resolution of the 360-degree video content. The first set of parameters 306 includes the resolution of the 360-degree video content and the content type of the 360-degree video content. The stream generating unit 214 obtains the first set of parameters 306 from the content source 104. The second set of parameters 308 includes the period of viewing the 360-degree video content, the resolution of a display of the client device 110, the device type of the client device 110, the processing capability of the client device 110, and the bandwidth available to the client device 110. The stream generating unit 214 obtains the second set of parameters 308 from the client device 110. The stream generating unit 214 obtains the first set of parameters 306 and the second set of parameters 308 using techniques as known in the art.
To convert the cube-map projection format 302 into the WDM video streams 304, the stream generating unit 214 dynamically determines a factor based on the first set of parameters 306 and the second set of parameters 308. The factor is an integer that segments a circular representation of the 360-degree video content into a specific number of divisions. Examples of the factor can be two (2), three (3), four (4), and six (6). In an example, the resolution of the 360-degree video content is 4K and the resolution supported by the client device 110 is 2K. In such an example, the factor can be determined as two (2), which segments the circular representation of the 360-degree video content into two halves. In another example, the resolution of the 360-degree video content is 4K and the resolution supported by the client device 110 is 2K. In such an example, the factor can instead be determined as four (4), which segments the circular representation of the 360-degree video content into four divisions.
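The exact decision logic for the factor is not spelled out in the disclosure, so the following is only a hypothetical sketch. It assumes the resolution ratio between content and display, together with available bandwidth (both members of the parameter sets described above), drive the choice; the thresholds are illustrative.

```python
def determine_factor(content_width: int, device_width: int,
                     low_bandwidth: bool = False) -> int:
    """Pick an integer factor (2, 4, or 6 here) that segments the
    circular representation of the 360-degree content into that many
    viewing-angle divisions. Thresholds are illustrative assumptions."""
    ratio = content_width / device_width    # e.g. 3840 / 1920 = 2.0
    if ratio <= 1:
        return 2    # device already supports the full content resolution
    if low_bandwidth:
        return 6    # narrower divisions -> smaller individual streams
    return 4 if ratio > 1.5 else 2

# 4K (3840-wide) content on a 2K (1920-wide) display, as in the examples above
print(determine_factor(3840, 1920))                      # -> 4
print(determine_factor(3840, 1920, low_bandwidth=True))  # -> 6
```

Either result is consistent with the text, which notes the same content/device pairing can yield a factor of two or four depending on the remaining parameters.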
The stream generating unit 214 then dynamically determines a layout for the plurality of WDM video streams 304 based on the first set of parameters 306 and the second set of parameters 308. The layout comprises a plurality of rectangular areas arranged in a configuration. The stream generating unit 214 also dynamically determines the viewing angle range for each of the WDM video streams 304 based on the factor, the first set of parameters 306, and the second set of parameters 308. In one implementation, the viewing angle ranges can be distinct from each other. In another implementation, the viewing angle ranges can overlap with each other. The stream generating unit 214 then generates the plurality of WDM video streams 304 in accordance with the layout such that the number of WDM video streams that are generated is equal to the factor. Thus, the factor determines how many WDM video streams 304 are to be converted from the cube-map projection format 302.
In an example, the resolution of the 360-degree video content is 4K and the resolution supported by the client device 110 is 2K. In such an example, the factor can be determined as two (2), which segments the circular representation of the 360-degree video content into two halves. As such, two WDM video streams are generated such that the first WDM video stream (WDM-1) represents a viewing angle range of 0-degree to 180-degree and the second WDM video stream (WDM-2) represents a viewing angle range of 180-degree to 360-degree.
In another example, the resolution of the 360-degree video content is 4K and the resolution supported by the client device 110 is 2K. In such an example, the factor can be determined as four (4), which segments the circular representation of the 360-degree video content into four overlapping divisions. As such, four WDM video streams are generated such that the first WDM video stream (WDM-1) represents a viewing angle range of 0-degree to 180-degree, the second WDM video stream (WDM-2) represents a viewing angle range of 180-degree to 360-degree, the third WDM video stream (WDM-3) represents a viewing angle range of 90-degree to 270-degree, and the fourth WDM video stream (WDM-4) represents a viewing angle range of 270-degree to 90-degree.
In yet another example, the resolution of the 360-degree video content is 4K and the resolution supported by the client device 110 is 2K. In such an example, the factor can be determined as four (4), which segments the circular representation of the 360-degree video content into four non-overlapping divisions. As such, four WDM video streams are generated such that the first WDM video stream (WDM-1) represents a viewing angle range of 0-degree to 90-degree, the second WDM video stream (WDM-2) represents a viewing angle range of 90-degree to 180-degree, the third WDM video stream (WDM-3) represents a viewing angle range of 180-degree to 270-degree, and the fourth WDM video stream (WDM-4) represents a viewing angle range of 270-degree to 360-degree.
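The viewing angle ranges in the three examples above follow a simple pattern that can be sketched as below: non-overlapping ranges are 360/factor degrees wide, while overlapping ranges span two adjacent divisions. This is an illustrative reconstruction, not a formula given in the disclosure; note the overlapping ranges come out in rotational order rather than in the WDM-1…WDM-4 order used in the text.

```python
def angle_ranges(factor: int, overlap: bool = False):
    """Return a (start, end) viewing-angle range in degrees for each of
    the `factor` WDM streams. With overlap=True each range spans two
    adjacent divisions, matching the overlapping example above."""
    step = 360 // factor                  # angular offset between stream centres
    width = 2 * step if overlap else step # overlapping ranges are twice as wide
    return [((i * step) % 360, ((i * step + width) % 360) or 360)
            for i in range(factor)]

print(angle_ranges(2))        # -> [(0, 180), (180, 360)]
print(angle_ranges(4))        # -> [(0, 90), (90, 180), (180, 270), (270, 360)]
print(angle_ranges(4, True))  # -> [(0, 180), (90, 270), (180, 360), (270, 90)]
```

The `or 360` keeps a range ending at the seam expressed as 360 rather than 0, while wrap-around ranges such as (270, 90) are left as-is.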
Now, to generate each of the WDM video streams 304, the stream generating unit 214 selects cube-map faces of the cube-map projection format 302 corresponding to the viewing angle range. The selected cube-map faces include full cube-map faces and/or portions of cube-map faces that fall within the viewing angle range. The stream generating unit 214 then maps the selected cube-map faces into one or more rectangular areas of the layout such that a combined resolution of the selected cube-map faces is substantially close to the maximum resolution.
Thereafter, the stream generating unit 214 selects remaining cube-map faces of the cube-map projection format 302. The remaining cube-map faces do not correspond to or not fall within the viewing angle range. The stream generating unit 214 maps the remaining cube-map faces into one or more further rectangular areas of the layout such that a combined resolution of the remaining cube-map faces is substantially lower than the maximum resolution.
In one implementation, the mapping includes cropping the selected cube-map faces to fill the one or more rectangular areas. In one implementation, the mapping includes resizing the selected cube-map faces to fill the one or more rectangular areas. In one implementation, the mapping includes rearranging pixel data in the selected cube-map faces to fill the one or more rectangular areas. In one implementation, the mapping includes both cropping and resizing. Thus, the stream generating unit 214 may perform at least one of cropping cube-map face(s), resizing cube-map face(s), and rearranging pixel data of cube-map face(s) to map the cube-map face(s) into the rectangular area(s).
Now, as described earlier, the layout for the WDM video stream 304 comprises the plurality of rectangular areas arranged in a configuration such that cube-map faces of the cube-map projection format 302 are mapped into the rectangular areas in accordance with the viewing angle ranges. The cube-map projection format has six (6) faces corresponding to the front, back, top, bottom, left, and right faces, respectively, with each of the faces being a planar square. In one example configuration, the six cube-map faces may be packed into two (2) rows, a first row and a second row. The cube-map faces (a complete cube-map face and/or a portion of a cube-map face) corresponding to the viewing angle range are mapped to rectangular areas in the first row at nearly the maximum resolution. The remaining cube-map faces (a complete cube-map face and/or a portion of a cube-map face) are mapped to rectangular areas in the second row. Thus, the cube-map faces corresponding to the viewing angle range are rendered at a higher resolution than the remaining cube-map faces on the client device 110. This reduces consumption of bandwidth, power, etc., on the client device 110 while rendering the WDM video streams 304 as compared to the rendering of the 360-degree video content.
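The two-row packing described above can be sketched as a function assigning each face a rectangle. The per-face widths and the reduced second-row height below are illustrative assumptions (only the first-row sizes are given later, in the FIGURE 6A example); the disclosure does not specify the second-row geometry.

```python
def pack_two_rows(row1, row2, row1_height=960, row2_height=120):
    """row1/row2 are lists of (face_name, width) pairs. In-range faces
    share the full-resolution first row; remaining faces share a
    low-resolution second row. Returns face -> (x, y, width, height)."""
    layout, x = {}, 0
    for face, w in row1:                     # first row: y = 0, full height
        layout[face] = (x, 0, w, row1_height)
        x += w
    x = 0
    for face, w in row2:                     # second row: below the first row
        layout[face] = (x, row1_height, w, row2_height)
        x += w
    return layout

# First-row widths taken from the FIGURE 6A example (F1 at 960x960,
# F5R/F6L at 480x960); second-row widths are hypothetical.
layout = pack_two_rows(
    [("F5R", 480), ("F1", 960), ("F6L", 480)],
    [("F2", 480), ("F3", 480), ("F4", 480), ("F5L", 240), ("F6R", 240)])
print(layout["F1"])  # -> (480, 0, 960, 960)
```

Both rows tile the full 1920-pixel frame width, so the overall stream stays at a standard frame size while the out-of-range faces occupy only a thin strip.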
In one example, the factor is two and the first WDM video stream represents a viewing angle range of 0-degree to 180-degree. As such, all frames of the 360-degree video content in the cube-map faces corresponding to or falling within the viewing angle range of 0-degree to 180-degree are mapped into the three rectangular areas. In the example, a maximum resolution supported by the client device 110 is 1920x1080. As such, the combined resolution of the selected cube-map faces when mapped into the rectangular areas can be 1920x960.
In another example, the factor is four and the first WDM video stream represents a viewing angle range of 0-degree to 180-degree. As such, all frames of the 360-degree video content in the cube-map faces corresponding to or falling within the viewing angle range of 0-degree to 180-degree are mapped into one rectangular area. In the example, a maximum resolution supported by the client device 110 is 1920x1080. As such, the combined resolution of the selected cube-map faces when mapped into the rectangular area can be 1920x960.
Further, in one implementation, the stream generating unit 214 selects media content. The media content can be a 2-Dimensional (2D) image, a 2D video, text, etc., that is required to be rendered in addition to the 360-degree video content on the client device 110. The stream generating unit 214 then maps the media content into a further rectangular area of the layout.
Referring to FIGURE 4, an example layout 402 of one frame of the 360-degree video content in the cube-map projection format 302 is illustrated, in accordance with the example embodiment of the present disclosure. The frame in the cube-map projection format 302 has a resolution of 1920x1080. The layout includes six cube-map faces, i.e., front face (F1), back face (F2), top face (F3), bottom face (F4), left face (F5), and right face (F6). The left face and the right face further include left portions F5L and F6L and right portions F5R and F6R, respectively. Referring to FIGURE 5, an example cube-map projection format 502 of the frame as shown in FIGURE 4 is illustrated, in accordance with the example embodiment of the present disclosure.
FIGURES 6A and 6B illustrate a first example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure. In the example, the factor is determined as two (2). As such, two WDM video streams 304-1 and 304-2 are generated having layout 602-1 and layout 602-2, respectively. The viewing angle range for the WDM video stream 304-1 is determined as 0-degree to 180-degree. The viewing angle range for the WDM video stream 304-2 is determined as 180-degree to 360-degree.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the front face F1, the right portion of the left face F5R, and the left portion of the right face F6L, corresponding to the viewing angle range 0-degree to 180-degree for the WDM video stream 304-1. Referring to FIGURE 6A, the stream generating unit 214 then maps the selected front face F1, right portion of the left face F5R, and left portion of the right face F6L to rectangular area 604, rectangular area 606, and rectangular area 608, respectively, of the layout 602-1. The resolution of the mapped front face F1 is 960x960. The resolution of each of the mapped right portion of the left face F5R and left portion of the right face F6L is 480x960. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1920x960.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the bottom face F4, the left portion of the left face F5L, the right portion of the right face F6R, the top face F3, and the back face F2, which do not correspond to the viewing angle range 0-degree to 180-degree. Referring again to FIGURE 6A, the stream generating unit 214 then maps the selected bottom face F4, left portion of the left face F5L, right portion of the right face F6R, top face F3, and back face F2 to rectangular area 610, rectangular area 612, rectangular area 614, rectangular area 616, and rectangular area 618, respectively, of the layout 602-1. The resolution of the mapped bottom face F4 is 640x120. The resolution of each of the mapped left portion of the left face F5L, right portion of the right face F6R, top face F3, and back face F2 is 320x120. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1920x960.
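The two-row packing of layout 602-1 described above can be sketched as a table of rectangle sizes. This is a hypothetical representation for illustration; the disclosure specifies only the per-face resolutions, and the dictionary-based model below is an assumption.

```python
# Illustrative model of layout 602-1 (FIGURE 6A): each entry maps a
# cube-map face or face portion to the pixel size (width, height) of
# its rectangular area.

LAYOUT_602_1 = {
    # first row: faces inside the 0-180 degree viewing angle range,
    # kept at near-maximum resolution
    "F1":  (960, 960),   # front face -> rectangular area 604
    "F5R": (480, 960),   # right portion of left face -> area 606
    "F6L": (480, 960),   # left portion of right face -> area 608
    # second row: remaining faces, down-scaled
    "F4":  (640, 120),   # bottom face -> area 610
    "F5L": (320, 120),   # left portion of left face -> area 612
    "F6R": (320, 120),   # right portion of right face -> area 614
    "F3":  (320, 120),   # top face -> area 616
    "F2":  (320, 120),   # back face -> area 618
}

first_row = ["F1", "F5R", "F6L"]
second_row = ["F4", "F5L", "F6R", "F3", "F2"]
print(sum(LAYOUT_602_1[f][0] for f in first_row))   # 1920 (x 960 high)
print(sum(LAYOUT_602_1[f][0] for f in second_row))  # 1920 (x 120 high)
```

Note that both rows are 1920 pixels wide and the row heights sum to 1080, so the whole WDM layout fits within the 1920x1080 maximum resolution of the client device in this example.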
FIGURE 7A illustrates the example WDM video stream 304-1 converted from the cube-map projection format as shown in FIGURE 6A. As illustrated, the combined resolution of the front face F1, right portion of the left face F5R, and left portion of the right face F6L mapped to the rectangular areas (represented by dashed lines) is higher than the combined resolution of the remaining cube-map faces. Thus, the cube-map faces corresponding to the viewing angle range are rendered at a higher resolution than the remaining cube-map faces on the client device 110. This reduces consumption of bandwidth, power, network speed, etc., on the client device 110.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the back face F2, the right portion of the right face F6R, and the left portion of the left face F5L, corresponding to the viewing angle range 180-degree to 360-degree for the WDM video stream 304-2. Referring to FIGURE 6B, the stream generating unit 214 then maps the back face F2, right portion of the right face F6R, and left portion of the left face F5L to rectangular area 604, rectangular area 606, and rectangular area 608, respectively, of the layout 602-2. The resolution of the mapped back face F2 is 960x960. The resolution of each of the mapped right portion of the right face F6R and left portion of the left face F5L is 480x960. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1920x960.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the bottom face F4, the left portion of the right face F6L, the right portion of the left face F5R, the top face F3, and the front face F1, which do not correspond to the viewing angle range 180-degree to 360-degree. Referring again to FIGURE 6B, the stream generating unit 214 then maps the selected bottom face F4, left portion of the right face F6L, right portion of the left face F5R, top face F3, and front face F1 to rectangular area 610, rectangular area 612, rectangular area 614, rectangular area 616, and rectangular area 618, respectively, of the layout 602-2. The resolution of the mapped bottom face F4 is 640x120. The resolution of each of the mapped left portion of the right face F6L, right portion of the left face F5R, top face F3, and front face F1 is 320x120. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1920x960.
FIGURE 7B illustrates the example WDM video stream 304-2 converted from the cube-map projection format as shown in FIGURE 6B. As illustrated, the combined resolution of the back face F2, right portion of the right face F6R, and left portion of the left face F5L mapped to the rectangular areas (represented by dashed lines) is higher than the combined resolution of the remaining cube-map faces. Thus, the cube-map faces corresponding to the viewing angle range are rendered at a higher resolution than the remaining cube-map faces on the client device 110. This reduces consumption of bandwidth, power, network speed, etc., on the client device 110.
FIGURES 8A, 8B, 8C, and 8D illustrate a second example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure. In the example, the factor is determined as four (4). As such, four WDM video streams 304-1, 304-2, 304-3, and 304-4 are generated having layout 802-1, layout 802-2, layout 802-3, and layout 802-4, respectively. The viewing angle range for the WDM video stream 304-1 is determined as 0-degree to 180-degree in the clockwise direction. The viewing angle range for the WDM video stream 304-2 is determined as 180-degree to 360-degree in the anti-clockwise direction. The viewing angle range for the WDM video stream 304-3 is determined as 90-degree to 270-degree in the clockwise direction. The viewing angle range for the WDM video stream 304-4 is determined as 270-degree to 90-degree in the anti-clockwise direction.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the front face F1, the portion of the left face F5(4/6), and the portion of the right face F6(4/6), corresponding to the viewing angle range 0-degree to 180-degree for the WDM video stream 304-1. Referring to FIGURE 8A, the stream generating unit 214 then maps the selected front face F1, portion of the left face F5(4/6), and portion of the right face F6(4/6) to rectangular area 804, rectangular area 806, and rectangular area 808, respectively, of the layout 802-1. The resolution of the mapped front face F1 is 720x720. The resolution of each of the mapped portion of the left face F5(4/6) and portion of the right face F6(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the back face F2, the top face F3, the bottom face F4, the portion of the left face F5(2/6), and the portion of the right face F6(2/6), which do not correspond to the viewing angle range 0-degree to 180-degree. Referring again to FIGURE 8A, the stream generating unit 214 then maps the selected back face F2, top face F3, bottom face F4, portion of the left face F5(2/6), and portion of the right face F6(2/6) to rectangular area 810, rectangular area 812, rectangular area 814, rectangular area 816, and rectangular area 818, respectively, of the layout 802-1. The resolution of the mapped back face F2 is 720x360. The resolution of each of the mapped top face F3 and bottom face F4 is 480x360. The resolution of each of the mapped portion of the left face F5(2/6) and portion of the right face F6(2/6) is 240x540. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the back face F2, the portion of the right face F6(4/6), and the portion of the left face F5(4/6), corresponding to the viewing angle range 180-degree to 360-degree for the WDM video stream 304-2. Referring to FIGURE 8B, the stream generating unit 214 then maps the selected back face F2, portion of the right face F6(4/6), and portion of the left face F5(4/6) to rectangular area 804, rectangular area 806, and rectangular area 808, respectively, of the layout 802-2. The resolution of the mapped back face F2 is 720x720. The resolution of each of the mapped portion of the right face F6(4/6) and portion of the left face F5(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the front face F1, the top face F3, the bottom face F4, the portion of the right face F6(2/6), and the portion of the left face F5(2/6), which do not correspond to the viewing angle range 180-degree to 360-degree. Referring again to FIGURE 8B, the stream generating unit 214 then maps the selected front face F1, top face F3, bottom face F4, portion of the right face F6(2/6), and portion of the left face F5(2/6) to rectangular area 810, rectangular area 812, rectangular area 814, rectangular area 816, and rectangular area 818, respectively, of the layout 802-2. The resolution of the mapped front face F1 is 720x360. The resolution of each of the mapped top face F3 and bottom face F4 is 480x360. The resolution of each of the mapped portion of the right face F6(2/6) and portion of the left face F5(2/6) is 240x540. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the right face F6, the portion of the front face F1(4/6), and the portion of the back face F2(4/6), corresponding to the viewing angle range 90-degree to 270-degree for the WDM video stream 304-3. Referring to FIGURE 8C, the stream generating unit 214 then maps the selected right face F6, portion of the front face F1(4/6), and portion of the back face F2(4/6) to rectangular area 804, rectangular area 806, and rectangular area 808, respectively, of the layout 802-3. The resolution of the mapped right face F6 is 720x720. The resolution of each of the mapped portion of the front face F1(4/6) and portion of the back face F2(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the left face F5, the top face F3, the bottom face F4, the portion of the front face F1(2/6), and the portion of the back face F2(2/6), which do not correspond to the viewing angle range 90-degree to 270-degree. Referring again to FIGURE 8C, the stream generating unit 214 then maps the selected left face F5, top face F3, bottom face F4, portion of the front face F1(2/6), and portion of the back face F2(2/6) to rectangular area 810, rectangular area 812, rectangular area 814, rectangular area 816, and rectangular area 818, respectively, of the layout 802-3. The resolution of the mapped left face F5 is 720x360. The resolution of each of the mapped top face F3 and bottom face F4 is 480x360. The resolution of each of the mapped portion of the front face F1(2/6) and portion of the back face F2(2/6) is 240x540. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the left face F5, the portion of the back face F2(4/6), and the portion of the front face F1(4/6), corresponding to the viewing angle range 270-degree to 90-degree for the WDM video stream 304-4. Referring to FIGURE 8D, the stream generating unit 214 then maps the selected left face F5, portion of the back face F2(4/6), and portion of the front face F1(4/6) to rectangular area 804, rectangular area 806, and rectangular area 808, respectively, of the layout 802-4. The resolution of the mapped left face F5 is 720x720. The resolution of each of the mapped portion of the back face F2(4/6) and portion of the front face F1(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the right face F6, the top face F3, the bottom face F4, the portion of the back face F2(2/6), and the portion of the front face F1(2/6), which do not correspond to the viewing angle range 270-degree to 90-degree. Referring again to FIGURE 8D, the stream generating unit 214 then maps the selected right face F6, top face F3, bottom face F4, portion of the back face F2(2/6), and portion of the front face F1(2/6) to rectangular area 810, rectangular area 812, rectangular area 814, rectangular area 816, and rectangular area 818, respectively, of the layout 802-4. The resolution of the mapped right face F6 is 720x360. The resolution of each of the mapped top face F3 and bottom face F4 is 480x360. The resolution of each of the mapped portion of the back face F2(2/6) and portion of the front face F1(2/6) is 240x540. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
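The face selection across the four factor-four streams described above can be summarized in a small table. This sketch is illustrative only: the dictionary keys, tuple layout, and helper function are assumptions, while the face identifiers and (width, height) sizes follow FIGURES 8A-8D.

```python
# Hypothetical summary of the factor-four face selection: each WDM
# stream keeps one full cube-map face and two 4/6 face portions at
# high resolution; values are (face identifier, (width, height)).

HIGH_RES_FACES = {
    "0-180":   [("F1", (720, 720)), ("F5(4/6)", (480, 720)), ("F6(4/6)", (480, 720))],
    "180-360": [("F2", (720, 720)), ("F6(4/6)", (480, 720)), ("F5(4/6)", (480, 720))],
    "90-270":  [("F6", (720, 720)), ("F1(4/6)", (480, 720)), ("F2(4/6)", (480, 720))],
    "270-90":  [("F5", (720, 720)), ("F2(4/6)", (480, 720)), ("F1(4/6)", (480, 720))],
}

def combined_width(stream_key):
    """Combined width of the high-resolution faces of one stream."""
    return sum(width for _face, (width, _height) in HIGH_RES_FACES[stream_key])

for key in HIGH_RES_FACES:
    print(key, combined_width(key), "x 720")  # each stream yields 1680 x 720
```

Each stream thus carries a 1680x720 high-resolution region for its viewing angle range, with the remaining faces packed at lower resolutions.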
FIGURES 9A, 9B, 9C, and 9D illustrate a third example layout of WDM video streams converted from the cube-map projection format as shown in FIGURE 4, in accordance with the example embodiment of the present disclosure. In the example, the factor is determined as four (4). As such, four WDM video streams 304-1, 304-2, 304-3, and 304-4 are generated having layout 902-1, layout 902-2, layout 902-3, and layout 902-4, respectively. The viewing angle range for the WDM video stream 304-1 is determined as 0-degree to 180-degree in the clockwise direction. The viewing angle range for the WDM video stream 304-2 is determined as 180-degree to 360-degree in the clockwise direction. The viewing angle range for the WDM video stream 304-3 is determined as 90-degree to 270-degree in the clockwise direction. The viewing angle range for the WDM video stream 304-4 is determined as 270-degree to 90-degree in the clockwise direction.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the front face F1, the portion of the left face F5(4/6), and the portion of the right face F6(4/6), corresponding to the viewing angle range 0-degree to 180-degree for the WDM video stream 304-1. Referring to FIGURE 9A, the stream generating unit 214 then maps the selected front face F1, portion of the left face F5(4/6), and portion of the right face F6(4/6) to rectangular area 904, rectangular area 906, and rectangular area 908, respectively, of the layout 902-1. The resolution of the mapped front face F1 is 720x720. The resolution of each of the mapped portion of the left face F5(4/6) and portion of the right face F6(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the back face F2, the top face F3, the bottom face F4, the portion of the left face F5(2/6), and the portion of the right face F6(2/6), which do not correspond to the viewing angle range 0-degree to 180-degree. Referring again to FIGURE 9A, the stream generating unit 214 then maps the selected back face F2, top face F3, bottom face F4, portion of the left face F5(2/6), and portion of the right face F6(2/6) to rectangular area 910, rectangular area 912, rectangular area 914, rectangular area 916, and rectangular area 918, respectively, of the layout 902-1. The resolution of the mapped back face F2 is 320x360. The resolution of each of the mapped top face F3 and bottom face F4 is 360x360. The resolution of each of the mapped portion of the left face F5(2/6) and portion of the right face F6(2/6) is 240x540. Media content, represented by J, is mapped to rectangular area 920 of the layout 902-1. The resolution of the media content is 640x360. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the back face F2, the portion of the right face F6(4/6), and the portion of the left face F5(4/6), corresponding to the viewing angle range 180-degree to 360-degree for the WDM video stream 304-2. Referring to FIGURE 9B, the stream generating unit 214 then maps the selected back face F2, portion of the right face F6(4/6), and portion of the left face F5(4/6) to rectangular area 904, rectangular area 906, and rectangular area 908, respectively, of the layout 902-2. The resolution of the mapped back face F2 is 720x720. The resolution of each of the mapped portion of the right face F6(4/6) and portion of the left face F5(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the front face F1, the top face F3, the bottom face F4, the portion of the right face F6(2/6), and the portion of the left face F5(2/6), which do not correspond to the viewing angle range 180-degree to 360-degree. Referring again to FIGURE 9B, the stream generating unit 214 then maps the selected front face F1, top face F3, bottom face F4, portion of the right face F6(2/6), and portion of the left face F5(2/6) to rectangular area 910, rectangular area 912, rectangular area 914, rectangular area 916, and rectangular area 918, respectively, of the layout 902-2. The resolution of the mapped front face F1 is 320x360. The resolution of each of the mapped top face F3 and bottom face F4 is 360x360. The resolution of each of the mapped portion of the right face F6(2/6) and portion of the left face F5(2/6) is 240x540. Media content, represented by J, is mapped to rectangular area 920 of the layout 902-2. The resolution of the media content is 640x360. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the right face F6, the portion of the front face F1(4/6), and the portion of the back face F2(4/6), corresponding to the viewing angle range 90-degree to 270-degree for the WDM video stream 304-3. Referring to FIGURE 9C, the stream generating unit 214 then maps the selected right face F6, portion of the front face F1(4/6), and portion of the back face F2(4/6) to rectangular area 904, rectangular area 906, and rectangular area 908, respectively, of the layout 902-3. The resolution of the mapped right face F6 is 720x720. The resolution of each of the mapped portion of the front face F1(4/6) and portion of the back face F2(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the left face F5, the top face F3, the bottom face F4, the portion of the front face F1(2/6), and the portion of the back face F2(2/6), which do not correspond to the viewing angle range 90-degree to 270-degree. Referring again to FIGURE 9C, the stream generating unit 214 then maps the selected left face F5, top face F3, bottom face F4, portion of the front face F1(2/6), and portion of the back face F2(2/6) to rectangular area 910, rectangular area 912, rectangular area 914, rectangular area 916, and rectangular area 918, respectively, of the layout 902-3. The resolution of the mapped left face F5 is 320x360. The resolution of each of the mapped top face F3 and bottom face F4 is 360x360. The resolution of each of the mapped portion of the front face F1(2/6) and portion of the back face F2(2/6) is 240x540. Media content, represented by J, is mapped to rectangular area 920 of the layout 902-3. The resolution of the media content is 640x360. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
Now, the stream generating unit 214 selects the cube-map faces, i.e., the left face F5, the portion of the back face F2(4/6), and the portion of the front face F1(4/6), corresponding to the viewing angle range 270-degree to 90-degree for the WDM video stream 304-4. Referring to FIGURE 9D, the stream generating unit 214 then maps the selected left face F5, portion of the back face F2(4/6), and portion of the front face F1(4/6) to rectangular area 904, rectangular area 906, and rectangular area 908, respectively, of the layout 902-4. The resolution of the mapped left face F5 is 720x720. The resolution of each of the mapped portion of the back face F2(4/6) and portion of the front face F1(4/6) is 480x720. Thus, the combined resolution of the selected cube-map faces in the viewing angle range is 1680x720.
Thereafter, the stream generating unit 214 selects the remaining cube-map faces, i.e., the right face F6, the top face F3, the bottom face F4, the portion of the back face F2(2/6), and the portion of the front face F1(2/6), which do not correspond to the viewing angle range 270-degree to 90-degree. Referring again to FIGURE 9D, the stream generating unit 214 then maps the selected right face F6, top face F3, bottom face F4, portion of the back face F2(2/6), and portion of the front face F1(2/6) to rectangular area 910, rectangular area 912, rectangular area 914, rectangular area 916, and rectangular area 918, respectively, of the layout 902-4. The resolution of the mapped right face F6 is 320x360. The resolution of each of the mapped top face F3 and bottom face F4 is 360x360. The resolution of each of the mapped portion of the back face F2(2/6) and portion of the front face F1(2/6) is 240x540. Media content, represented by J, is mapped to rectangular area 920 of the layout 902-4. The resolution of the media content is 640x360. Thus, the combined resolution of the selected cube-map faces not corresponding to the viewing angle range is less than 1680x720.
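The distinguishing feature of the third example layout, the additional rectangular area reserved for 2D media content, can be sketched for layout 902-1 as follows. The dictionary model is an illustrative assumption; the face identifiers and (width, height) sizes follow FIGURE 9A.

```python
# Illustrative model of layout 902-1 (FIGURE 9A), including the media
# content rectangle "J" (rectangular area 920).

LAYOUT_902_1 = {
    # faces inside the 0-180 degree viewing angle range, high resolution
    "F1": (720, 720), "F5(4/6)": (480, 720), "F6(4/6)": (480, 720),
    # remaining faces, reduced resolution
    "F2": (320, 360), "F3": (360, 360), "F4": (360, 360),
    "F5(2/6)": (240, 540), "F6(2/6)": (240, 540),
    # additional rectangular area 920 reserved for 2D media content
    "J": (640, 360),
}

in_range = ["F1", "F5(4/6)", "F6(4/6)"]
combined = (sum(LAYOUT_902_1[f][0] for f in in_range), LAYOUT_902_1["F1"][1])
print(combined)             # (1680, 720) -- high-resolution region
print(LAYOUT_902_1["J"])    # (640, 360) -- media content rectangle
```

Compared with the second example layout, this layout trades some resolution of the out-of-range faces (e.g., the back face F2 at 320x360 instead of 720x360) for the 640x360 media content area.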
Referring to FIGURE 3 again, the stream generating unit 214 further generates a manifest file 310 comprising access links corresponding to the plurality of WDM video streams 304 at multiple resolutions. The manifest file 310 can be generated using techniques as known in the art. Further, the stream generating unit 214 embeds an identifier in each of the plurality of WDM video streams 304. The identifier indicates the viewing angle range of the WDM video stream 304. The identifier can be any form of code or mark overlaid or superimposed on one corner of the WDM video stream 304. In one example, the identifier can be a visual colour-coded mark overlaid or superimposed on one corner of the WDM video stream 304. Referring to FIGURE 7A, a black colour-coded mark 702-1 is overlaid on the WDM video stream 304-1 to indicate that the WDM video stream 304-1 represents the viewing angle range 0-degree to 180-degree. Referring to FIGURE 7B, a white colour-coded mark 702-2 is overlaid on the WDM video stream 304-2 to indicate that the WDM video stream 304-2 represents the viewing angle range 180-degree to 360-degree. In another example, the identifier can be a textual mark overlaid or superimposed on one corner of the WDM video stream 304.
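The colour-coded identifier overlay can be sketched as below. This is a minimal sketch, assuming a frame modelled as rows of (R, G, B) tuples and a fixed corner mark size; the function and constant names are hypothetical, while the black/white coding for the two factor-two streams follows FIGURES 7A and 7B.

```python
# Black mark for the 0-180 degree stream (FIGURE 7A), white mark for
# the 180-360 degree stream (FIGURE 7B); other ranges would need
# further colour codes.
MARK_COLOURS = {
    (0, 180):   (0, 0, 0),        # black colour-coded mark 702-1
    (180, 360): (255, 255, 255),  # white colour-coded mark 702-2
}

def overlay_angle_mark(frame, viewing_range, size=16):
    """Overlay the viewing-angle identifier on the top-left corner
    of a frame (a list of rows of (R, G, B) tuples)."""
    colour = MARK_COLOURS[viewing_range]
    for y in range(min(size, len(frame))):
        for x in range(min(size, len(frame[y]))):
            frame[y][x] = colour
    return frame

# a tiny 32x32 grey frame for demonstration
frame = [[(128, 128, 128) for _ in range(32)] for _ in range(32)]
marked = overlay_angle_mark(frame, (180, 360))
print(marked[0][0])    # (255, 255, 255) -- white identifier pixel
print(marked[31][31])  # (128, 128, 128) -- untouched pixel
```

In a production pipeline this overlay would be applied per frame before encoding, so the identifier survives transcoding.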
The stream generating unit 214 then provides the plurality of WDM video streams 304 to the transcoding unit 216. The transcoding unit 216 transcodes the plurality of WDM video streams 304 using any of the transcoding techniques as known in the art. The transcoding unit 216 then stores the plurality of transcoded WDM video streams 304 and the manifest file 310 in a database 312 for transmission to the client device 110 in response to accessing the 360-degree video content on the client device 110. The database 312 can be external to the system 102 or internal to the system 102, such as part of the memory 204. Thus, in response to accessing the 360-degree video content on the client device 110, the stream generating unit 214 transmits at least one WDM video stream from the plurality of WDM video streams 304 to the client device 110 based on the manifest file, the second set of parameters 308, and a viewpoint of a user of the client device 110 for rendering the 360-degree video content on the client device 110.
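One plausible shape for the manifest file 310, with access links for each WDM video stream at multiple resolutions, is sketched below. The field names, URL pattern, and base URL are assumptions for illustration only; the disclosure does not fix a manifest format, and a real deployment would likely follow DASH or HLS manifest conventions.

```python
import json

def build_manifest(streams, resolutions, base_url):
    """Build a hypothetical manifest: for each stream (keyed by a
    stream id mapped to its viewing angle range), emit one access
    link per available resolution."""
    return {
        "streams": [
            {
                "id": stream_id,
                "viewing_angle_range": angle_range,
                "renditions": [
                    {"resolution": res, "url": f"{base_url}/{stream_id}_{res}.mp4"}
                    for res in resolutions
                ],
            }
            for stream_id, angle_range in streams.items()
        ]
    }

manifest = build_manifest(
    {"WDM-1": [0, 180], "WDM-2": [180, 360]},
    ["1920x1080", "1280x720"],
    "https://example.com/streams",  # hypothetical base URL
)
print(json.dumps(manifest, indent=2))
```

The client can then pick a rendition by matching its display resolution and available bandwidth against the listed renditions.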
Further, in one implementation, the stream generating unit 214 collects a plurality of user-parameters corresponding to viewing of one or more 360-degree video contents on the client device 110 for a predetermined time period. The one or more 360-degree video contents include the 360-degree video content. In an example, the predetermined time period is 2 months. The time period can be determined based on the content type of the one or more 360-degree video contents. The plurality of user-parameters includes a preferred viewpoint of the user, a preferred content type of the 360-degree video content, and a preferred period of viewing. The stream generating unit 214 collects the plurality of user-parameters using techniques as known in the art. The stream generating unit 214 stores the plurality of user-parameters in the database 312 for later use.
Upon expiry of the predetermined time period, the stream generating unit 214 then generates a plurality of further WDM video streams from the cube-map projection format 302 based on the first set of parameters 306, the second set of parameters 308, and the plurality of user-parameters, in a manner as described earlier. This enables generation of multiple WDM video streams of various contents to cater to the varied interests of the user. The process of collecting user-parameters and generating WDM video streams accordingly continues as per requirement.
FIGURE 10 schematically illustrates the client device 110 to render the 360-degree video content, in accordance with the example embodiment of the present disclosure. The client device 110 includes the video playing module(s) 112, a processor 1002, a memory 1004, a display 1006, a communication interface unit 1008, resource(s) 1010, and data 1012. The communication interface unit 1008 enables transmission and reception of data between the system 102 and the client device 110. The resource(s) 1010 can refer to units necessary for proper functioning of the client device 110 apart from the processor 1002 and the memory 1004. The processor 1002, the memory 1004, the display 1006, the communication interface unit 1008, the resource(s) 1010, and the video playing module(s) 112 are communicatively coupled with each other. The data 1012 serves, amongst other things, as a repository for storing data processed, received, and generated by the video playing module(s) 112 or during interactions between any of the aforementioned components.
In an implementation, the video playing module(s) 112 can include a receiving unit 1014 and a rendering unit 1016. The receiving unit 1014 and the rendering unit 1016 are in communication with each other. The video playing module(s) 112, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular data types. The video playing module(s) 112 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
Further, the video playing module(s) 112 can be implemented in hardware, as instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 1002, a state machine, a logic array, or any other suitable device capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks, or the processing unit can be dedicated to performing the required functions. In another aspect of the present disclosure, the video playing module(s) 112 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities.
The operation of the client device 110 will now be explained using FIGURE 11. For the sake of brevity, details that are already explained previously are not repeated herein.
FIGURE 11 illustrates a process flow 1100 for rendering the 360-degree video content based on the WDM video streams on the client device 110, in accordance with the example embodiment of the present disclosure. Accordingly, the receiving unit 1014 receives a user-input 1102 indicative of accessing the 360-degree video content on the client device 110. The 360-degree video content can be accessed via a web-link through a browser, a link through a mobile-based application, or through a VR device. The user-input 1102 can be a touch-based input, a non-touch-based input, or an input from an input device communicatively coupled with the client device 110. Examples of the non-touch-based input include, but are not limited to, waving of a hand while wearing a VR glove sensor, pointing of a finger while wearing a VR glove sensor, etc. Examples of the input device include, but are not limited to, a stylus, a mouse, a keyboard, a joystick, etc.
In response to the user-input 1102, the rendering unit 1016 accesses the manifest file 310 from the database 312. To this end, the rendering unit 1016 transmits a query for the manifest file 310 to the system 102 through the network 108. The system 102 then fetches the manifest file 310 from the database 312 and transmits it to the rendering unit 1016 through the network 108.
Upon accessing the manifest file 310, the rendering unit 1016 determines a viewpoint 1104 of a user. The viewpoint 1104 of the user can be determined based on one of (a) an orientation of the client device 110, (b) a head-movement of the user, (c) an eye-movement of the user, and (d) a touch-input of the user. The rendering unit 1016 determines the viewpoint 1104 using techniques as known in the art. Further, the rendering unit 1016 determines data corresponding to a plurality of device parameters 1106 of the client device 110. The device parameters 1106 include a resolution of the display 1006, a device type of the client device 110, a processing capability of the client device 110, and a bandwidth available to the client device 110. The rendering unit 1016 obtains the device parameters 1106 of the client device 110 using techniques as known in the art.
Upon determining the viewpoint 1104 and the device parameters 1106, the rendering unit 1016 selects a link of a WDM video stream from the manifest file 310 corresponding to the viewpoint based on the data, such that the selected WDM video stream is closest to the viewpoint of the user. The rendering unit 1016 then accesses the WDM video stream corresponding to the selected link from the database 312. To this end, the rendering unit 1016 transmits a query for the WDM video stream to the system 102. The system 102 then fetches the WDM video stream from the database 312 and transmits the WDM video stream to the rendering unit 1016. The rendering unit 1016 then renders the WDM video stream on the display 1006 using techniques as known in the art.
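One way the "closest to the viewpoint" selection could be realized is by comparing the user's viewpoint angle against the centre of each stream's viewing angle range. This is a sketch under stated assumptions: the disclosure only requires that the closest stream be selected, and the angular-distance rule, the `range` field name, and the stream list structure below are all hypothetical.

```python
def angular_distance(a, b):
    """Shortest angular distance between two angles, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_stream(manifest_streams, viewpoint_deg):
    """Pick the stream whose viewing angle range is centred closest
    to the user's viewpoint angle (in degrees)."""
    def centre(angle_range):
        start, end = angle_range
        span = (end - start) % 360 or 360  # handle wrap-around, e.g. 270-90
        return (start + span / 2) % 360
    return min(manifest_streams,
               key=lambda s: angular_distance(centre(s["range"]), viewpoint_deg))

streams = [{"id": "WDM-1", "range": (0, 180)},
           {"id": "WDM-2", "range": (180, 360)}]
print(select_stream(streams, 45)["id"])   # WDM-1 (centre 90 is closest)
print(select_stream(streams, 200)["id"])  # WDM-2 (centre 270 is closest)
```

With a factor of four, the same rule would choose among four overlapping ranges, so a head turn of roughly 90 degrees is enough to trigger a switch to a better-aligned stream.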
In an example, the viewpoint is determined as ‘0-degree to 180-degree’. Thereafter, the data corresponding to the device parameters is determined. The data can indicate that the resolution of the display 1006 is 2K and the available bandwidth is 10 Mbps. Accordingly, the rendering unit 1016 selects the link to the WDM video stream (WDM-1) for display and accesses the WDM video stream (WDM-1) from the database 312. The rendering unit 1016 then renders the WDM video stream (WDM-1) on the display 1006.
Further, in one implementation, the rendering unit 1016 determines a further viewpoint of the user. The rendering unit 1016 determines further data corresponding to the plurality of device parameters associated with the client device 110. The rendering unit 1016 selects a link of a further WDM video stream from the manifest file 310 corresponding to the further viewpoint based on the further data such that the further WDM video stream is closest to the further viewpoint of the user. The rendering unit 1016 accesses the further WDM video stream corresponding to the selected link from the database. To this end, the rendering unit 1016 transmits a query for the further WDM video stream to the system 102 through the network 108. The system 102 then fetches the further WDM video stream from the database 312 and transmits the further WDM video stream to the rendering unit 1016 through the network 108. The rendering unit 1016 renders the further WDM video stream on the display 1006 using techniques as known in the art.
In the above example, the further viewpoint is determined as ‘180-degree to 360-degree’ based on a head-movement of the user. Thereafter, further data corresponding to the device parameters is determined. The further data can indicate that the resolution of the display 1006 is 2K and the available bandwidth is 25 Mbps. Accordingly, the rendering unit 1016 selects the link to the WDM video stream (WDM-2) for display and accesses the WDM video stream (WDM-2) from the database 312. The rendering unit 1016 then renders the WDM video stream (WDM-2) on the display 1006.
Further, prior to rendering the further WDM video stream, the rendering unit 1016 rotates a three-dimensional object onto which the 360-degree video content is texture-mapped based on the identifier embedded in the selected further WDM video stream. The three-dimensional object can be any of cube, trapezium, cylinder, etc., as deemed appropriate for texture-mapping the 360-degree video content. The rendering unit 1016 then applies texture of the 360-degree video content on the rotated three-dimensional object. The rotation of the three-dimensional object and the application of the texture can be performed using techniques as known in the art. The rendering unit 1016 then renders the selected further WDM video stream on the display 1006 using techniques as known in the art. This enables seamless display of different WDM video streams on the display 1006 without any glitch, thereby resulting in enhanced user-experience while reducing consumption of bandwidth, power, resource(s), etc.
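The rotation step described above can be sketched as a yaw rotation of the texture-mapped cube. This sketch assumes, purely for illustration, that the embedded identifier encodes the start of the stream's viewing-angle range in degrees; the disclosure does not define the identifier's encoding.

```python
import math

# Minimal sketch, assuming the embedded identifier encodes the start of the
# stream's viewing-angle range in degrees. The yaw matrix rotates the cube
# about the vertical axis so the stream's high-resolution faces line up with
# the user's current view before the texture is applied.

def yaw_rotation_matrix(identifier_deg):
    """3x3 rotation about the vertical (y) axis by the stream's base angle."""
    t = math.radians(identifier_deg)
    return [[math.cos(t), 0.0, math.sin(t)],
            [0.0, 1.0, 0.0],
            [-math.sin(t), 0.0, math.cos(t)]]

def rotate(matrix, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(matrix[r][c] * v[c] for c in range(3)) for r in range(3)]

# Switching from WDM-1 (0-180) to WDM-2 (180-360) turns the cube half-way
# round, so a point on the +x axis ends up on the -x axis:
m = yaw_rotation_matrix(180)
print(rotate(m, [1.0, 0.0, 0.0]))
```

A renderer would hand this matrix to its graphics pipeline as the model transform of the cube; only the orientation changes, the texture coordinates stay fixed, which is what allows the switch between streams to appear seamless.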
FIGURES 12-14 illustrate flow diagrams of a method 1200 of rendering the 360-degree video on the client device, in accordance with the example embodiment of the present disclosure. The method 1200 may be implemented in the system 102 using components thereof, as described above. Further, for the sake of brevity, details of the present disclosure that are explained in detail in the description of FIGURE 1 to FIGURES 9A-9B are not explained in detail in the description of FIGURE 12. Referring to FIGURE 12, at block 1202, the method 1200 includes obtaining the 360-degree video content in a cube-map projection format.
At block 1204, the method 1200 includes converting the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters. Each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content. The resolution of the plurality of WDM video streams is supported by the client device. The first set of parameters is obtained from a content source and includes resolution of the 360-degree video content and content type of the 360-degree video content. The second set of parameters are obtained from the client device and include preferred viewpoint of the user, period of viewing the 360-degree video content, resolution of a display of the client device, device type of the client device, processing capability of the client device and bandwidth available to the client device.
At block 1206, the method 1200 includes generating a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions.
At block 1208, the method 1200 includes transmitting at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.
Further, the method 1200 includes embedding an identifier in each of the plurality of WDM video streams, the identifier indicating the viewing angle range of the WDM video stream. The method 1200 includes transcoding the plurality of WDM video streams. The method 1200 includes storing the plurality of trans-coded WDM video streams and the manifest file in a database for transmitting to the client device in response to accessing the 360-degree video content on the client device.
The step of converting the cube-map projection format into the plurality of WDM video streams as indicated at block 1204 in FIGURE 12 includes further steps. Referring to FIGURE 13, at block 1302, the method 1200 includes dynamically determining a factor based on the first set of parameters and the second set of parameters. At block 1304, the method 1200 includes dynamically determining a layout for the plurality of WDM video streams based on the first set of parameters and the second set of parameters. The layout comprises a plurality of rectangular areas. At block 1306, the method 1200 includes generating the plurality of WDM video streams in accordance with the layout such that a number of the generated WDM video streams is equal to the factor.
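The factor and layout determination above can be sketched as follows. The thresholds, the split between a main area and a reduced strip, and all numeric choices are hypothetical assumptions for illustration; the disclosure leaves the actual rules implementation-specific.

```python
# Hypothetical sketch of the "factor" and layout determination; all
# thresholds and proportions here are assumptions, not the actual rules.

def determine_factor(content_resolution, display_resolution, bandwidth_mbps):
    """More WDM streams (finer viewing-angle ranges) when the gap between
    the source resolution and what the device can consume is large."""
    ratio = content_resolution / display_resolution
    if ratio >= 4 and bandwidth_mbps < 15:
        return 6   # one stream per cube face
    if ratio >= 2:
        return 4
    return 2       # front/back hemispheres only

def determine_layout(factor, stream_width, stream_height):
    """Split each stream's frame into rectangular areas: one large area for
    the high-resolution faces, the rest sharing a reduced strip below it."""
    main = {"x": 0, "y": 0, "w": stream_width, "h": stream_height * 3 // 4}
    strip_h = stream_height - main["h"]
    n_rest = 6 - (6 // factor)        # remaining cube faces, downscaled
    rest_w = stream_width // max(n_rest, 1)
    rest = [{"x": i * rest_w, "y": main["h"], "w": rest_w, "h": strip_h}
            for i in range(n_rest)]
    return [main] + rest

factor = determine_factor(4096, 2048, 10)
layout = determine_layout(factor, 2048, 1536)
print(factor, len(layout))
```

The point of the sketch is the dependency structure: both the factor and the layout are pure functions of the first and second sets of parameters, so they can be recomputed whenever either set changes.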
The step of generating the plurality of WDM video streams as indicated at block 1306 in FIGURE 13 includes further steps. Referring to FIGURE 14, at block 1402, the method 1200 includes selecting cube-map faces of the cube-map projection format corresponding to the viewing angle range. At block 1404, the method 1200 includes mapping the selected cube-map faces into one or more rectangular areas of the layout such that a combined resolution of the selected cube-map faces is substantially closer to the maximum resolution. The mapping comprises at least one of: cropping the selected cube-map faces to fill the one or more rectangular areas; resizing the selected cube-map faces to fill the one or more rectangular areas; and rearranging pixel data in the selected cube-map faces to fill the one or more rectangular areas.
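Of the three mapping operations above, the resize branch can be sketched with nearest-neighbour sampling on plain nested lists. This is an illustrative stand-in only; a real encoder would operate on decoded video frames, and the data structures here are assumptions.

```python
# Sketch of the "resizing" branch of the mapping step, using
# nearest-neighbour sampling on 2D lists standing in for pixel buffers.

def resize_face(face, out_w, out_h):
    """Resize one cube-map face (a 2D list of pixels) to fill a rectangular
    area of the WDM layout."""
    in_h, in_w = len(face), len(face[0])
    return [[face[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

def map_faces_to_rects(faces, rects):
    """Place each selected face into its rectangle; returns (rect, pixels)
    pairs ready to be composited into the WDM frame."""
    return [(rect, resize_face(face, rect["w"], rect["h"]))
            for face, rect in zip(faces, rects)]

# A 4x4 front face downscaled into a 2x2 rectangle of the layout; each
# "pixel" is its (row, col) coordinate so the sampling is visible.
front = [[(y, x) for x in range(4)] for y in range(4)]
placed = map_faces_to_rects([front], [{"x": 0, "y": 0, "w": 2, "h": 2}])
print(placed[0][1])
```

Cropping and pixel rearrangement would follow the same pattern: the mapping never changes which faces carry full resolution, only how their pixels are packed into the layout's rectangles.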
Further, in one implementation, the method 1200 may include additional steps. Accordingly, at block 1406, the method 1200 includes selecting remaining cube-map faces of the cube-map projection format. At block 1408, the method 1200 includes mapping the remaining cube-map faces into one or more further rectangular areas of the layout such that a combined resolution of the remaining cube-map faces is substantially lower than the maximum resolution.
Further, in one implementation, the method 1200 may include additional steps. Accordingly, the method 1200 includes selecting a media content. The method 1200 includes mapping the media content into a further rectangular area of the layout.
Further, in one implementation, the method 1200 includes collecting a plurality of user-parameters corresponding to viewing of one or more 360-degree video contents on the client device for a predetermined time period. The one or more 360-degree video contents include the 360-degree video content. The method 1200 includes converting the cube-map projection format into a plurality of further WDM video streams based on the first set of parameters, the second set of parameters, and the plurality of user-parameters upon expiry of the predetermined time period.
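One of the collected user-parameters is the preferred viewpoint; its aggregation over the collection window could be sketched as below. The bucketing scheme and log format are assumptions made for illustration.

```python
from collections import Counter

# Hedged sketch: aggregate viewpoints logged over the predetermined time
# period to find the user's preferred viewing-angle range. The log format
# (a flat list of viewpoint angles in degrees) is an assumption.

def preferred_viewpoint(view_log, bucket_deg=90):
    """Bucket logged viewpoints into fixed angle ranges and return the
    most-watched range as (start, end) in degrees."""
    buckets = Counter((int(v) % 360) // bucket_deg for v in view_log)
    start = buckets.most_common(1)[0][0] * bucket_deg
    return (start, start + bucket_deg)

log = [10, 35, 40, 275, 50, 80, 85]
print(preferred_viewpoint(log))
```

The resulting range would feed back into the conversion step, so that the further WDM video streams spend their resolution budget on the angles the user actually watches.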
FIGURES 15-16 illustrate flow diagrams of a method 1500 of rendering the 360-degree video on the client device 110, in accordance with the example embodiment of the present disclosure. The method 1500 may be implemented in the client device 110 using components thereof, as described above. Further, for the sake of brevity, details of the present disclosure that are explained in detail in the description of FIGURE 10 to FIGURE 11 are not explained in detail in the description of FIGURE 15. Referring to FIGURE 15, at block 1502, the method 1500 includes receiving a user-input indicative of accessing the 360-degree video content on the client device.
At block 1504, the method 1500 includes accessing a manifest file from a database in response to the user-input. The manifest file comprises access links corresponding to a plurality of WDM video streams at multiple resolutions.
At block 1506, the method 1500 includes determining a viewpoint of the user. At block 1508, the method 1500 includes determining data corresponding to a plurality of device parameters associated with the client device.
At block 1510, the method 1500 includes selecting a link of a WDM video stream from the manifest file corresponding to the viewpoint based on the data such that the WDM video stream is closest to the viewpoint of the user. At block 1512, the method 1500 includes accessing the WDM video stream corresponding to the selected link from the database. At block 1514, the method 1500 includes rendering the WDM video stream on a display.
Upon rendering the 360-degree video content on the client device as indicated at block 1514 in FIGURE 15, the method 1500 includes further steps. Referring to FIGURE 16, at block 1602, the method 1500 includes determining a further viewpoint of the user. At block 1604, the method 1500 includes determining a further data corresponding to the plurality of device parameters associated with the client device.
At block 1606, the method 1500 includes selecting a link of a further WDM video stream from the manifest file corresponding to the further viewpoint based on the further data such that the further WDM video stream is closest to the further viewpoint of the user. At block 1608, the method 1500 includes accessing the further WDM video stream corresponding to the selected link from the database.
At block 1610, prior to rendering the further WDM video stream, the method 1500 includes rotating a three-dimensional object onto which the 360-degree video content is texture-mapped based on an identifier embedded in the selected further WDM video stream. At block 1612, the method 1500 includes applying texture of the 360-degree video content on the rotated three-dimensional object. At block 1614, upon applying the texture, the method 1500 includes rendering the selected further WDM video stream on the display of the client device.
Thus, the present disclosure enables rendering of the 360-degree video content on the client device by creating multiple peripheral views, or WDM video streams, with varying qualities for every possible viewpoint or orientation of the user. Each peripheral view is at a resolution supported by the client device but lesser than the higher resolution of the 360-degree video content. This enhances viewing experience without compromising on quality of the 360-degree video content. In addition, consumption of bandwidth, power, and other resources is reduced without requiring complex or specific hardware support at the client device. Thus, the 360-degree video content can be rendered on any device, without consuming excess bandwidth and/or resource(s) and without being restricted to a type of the device.
While specific language has been used to describe the present disclosure, any limitations arising on account thereto, are not intended. As would be apparent to a person in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. Clearly, the present disclosure may be otherwise variously embodied, and practiced within the scope of the following claims.
CLAIMS
WE CLAIM:
1. A method of rendering 360-degree video content on a client device, the method comprising:
obtaining the 360-degree video content in a cube-map projection format;
converting the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters, each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content;
generating a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions; and
transmitting at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.

2. The method as claimed in claim 1, wherein the resolution of the plurality of WDM video streams is supported by the client device.

3. The method as claimed in claim 1, wherein:
the first set of parameters are obtained from a content source and include resolution of the 360-degree video content and content type of the 360-degree video content; and
the second set of parameters are obtained from the client device and include period of viewing the 360-degree video content, resolution of a display of the client device, device type of the client device, processing capability of the client device, and bandwidth available to the client device.

4. The method as claimed in claim 1, wherein the converting comprises:
dynamically determining a factor based on the first set of parameters and the second set of parameters;
dynamically determining a layout for the plurality of WDM video streams based on the first set of parameters and the second set of parameters, the layout comprising a plurality of rectangular areas; and
generating the plurality of WDM video streams in accordance with the layout such that a number of the generated WDM video streams is equal to the factor.

5. The method as claimed in claim 4, wherein generating each of the plurality of WDM video streams comprises:
selecting cube-map faces of the cube-map projection format corresponding to the viewing angle range; and
mapping the selected cube-map faces into one or more rectangular areas of the layout such that a combined resolution of the selected cube-map faces is substantially closer to the maximum resolution.

6. The method as claimed in claim 5, wherein the mapping comprises at least one of:
cropping the selected cube-map faces to fill the one or more rectangular areas;
resizing the selected cube-map faces to fill the one or more rectangular areas; and
rearranging pixel data in the selected cube-map faces to fill the one or more rectangular areas.

7. The method as claimed in claim 5, wherein generating each of the plurality of WDM video streams further comprises:
selecting remaining cube-map faces of the cube-map projection format; and
mapping the remaining cube-map faces into one or more further rectangular areas of the layout such that a combined resolution of the remaining cube-map faces is substantially lower than the maximum resolution.

8. The method as claimed in claim 7, wherein generating each of the plurality of WDM video streams further comprises:
selecting a media content; and
mapping the media content into a further rectangular area of the layout.

9. The method as claimed in claim 1, further comprising:
embedding an identifier in each of the plurality of WDM video streams, the identifier indicating the viewing angle range of the WDM video stream;
trans-coding the plurality of WDM video streams; and
storing the plurality of trans-coded WDM video streams and the manifest file in a database for transmitting to the client device in response to accessing the 360-degree video content on the client device.

10. The method as claimed in claim 1, further comprising:
collecting a plurality of user-parameters corresponding to viewing of one or more 360-degree video contents on the client device for a predetermined time period, the one or more 360-degree video contents including the 360-degree video content; and
converting the cube-map projection format into a plurality of further WDM video streams based on the first set of parameters, the second set of parameters, and the plurality of user-parameters upon expiry of the predetermined time period.

11. The method as claimed in claim 10, wherein the plurality of user-parameters includes preferred viewpoint of the user, preferred content type of the 360-degree video content, and preferred period of viewing.

12. A method of rendering a 360-degree video content on a client device, the method comprising:
receiving a user-input indicative of accessing the 360-degree video content on the client device;
accessing a manifest file from a database in response to the user-input, the manifest file comprising access links corresponding to a plurality of WDM video streams at multiple resolutions;
determining a viewpoint of the user;
determining a data corresponding to a plurality of device parameters associated with the client device;
selecting a link of WDM video stream from the manifest file corresponding to the viewpoint based on the data such that the WDM video stream is closest to the viewpoint of the user;
accessing the WDM video stream corresponding to the selected link from the database; and
rendering the WDM video stream on a display.

13. The method as claimed in claim 12, further comprising:
determining a further viewpoint of the user;
determining a further data corresponding to the plurality of device parameters associated with the client device;
selecting a link of a further WDM video stream from the manifest file corresponding to the further viewpoint based on the further data such that the further WDM video stream is closest to the further viewpoint of the user;
accessing the further WDM video stream corresponding to the selected link from the database; and
rendering the further WDM video stream on the display.

14. The method as claimed in claim 13, further comprising:
prior to rendering the further WDM video stream, rotating a three-dimensional object onto which the 360-degree video content is texture-mapped based on an identifier embedded in the further WDM video stream; and
applying texture of the 360-degree video content on the rotated three-dimensional object.

15. The method as claimed in claim 13, wherein the plurality of device parameters includes resolution of the display, device type of the client device, processing capability of the client device, and bandwidth available to the client device.

16. A system to render 360-degree video content on a client device, the system comprising:
a receiving unit to obtain the 360-degree video content in a cube-map projection format; and
a stream generating unit to:
convert the cube-map projection format into a plurality of weighted density map (WDM) video streams based on a first set of parameters and a second set of parameters, each of the WDM video streams represents a viewing angle range at a maximum resolution, and a resolution of each of the WDM video streams is lesser than a resolution of the 360-degree video content;
generate a manifest file comprising access links corresponding to the plurality of WDM video streams at multiple resolutions; and
transmit at least one WDM video stream from the plurality of WDM video streams to the client device based on the manifest file, the second set of parameters, and a viewpoint of a user of the client device for rendering the 360-degree video content on the client device in response to accessing the 360-degree video content on the client device.

17. The system as claimed in claim 16, wherein the resolution of the plurality of WDM video streams is supported by the client device.

18. The system as claimed in claim 16, wherein:
the first set of parameters are obtained from a content source and include resolution of the 360-degree video content and content type of the 360-degree video content; and
the second set of parameters are obtained from the client device and include period of viewing the 360-degree video content, resolution of a display of the client device, device type of the client device, processing capability of the client device, and bandwidth available to the client device.

19. The system as claimed in claim 16, wherein to convert the cube-map projection format into the plurality of WDM video streams, the stream generating unit is to:
dynamically determine a factor based on the first set of parameters and the second set of parameters;
dynamically determine a layout for the plurality of WDM video streams based on the first set of parameters and the second set of parameters, the layout comprising a plurality of rectangular areas; and
generate the plurality of WDM video streams in accordance with the layout such that a number of the generated WDM video streams is equal to the factor.

20. The system as claimed in claim 19, wherein to generate each of the plurality of WDM video streams, the stream generating unit is to:
select cube-map faces of the cube-map projection format corresponding to the viewing angle range; and
map the selected cube-map faces into one or more rectangular areas of the layout such that a combined resolution of the selected cube-map faces is substantially closer to the maximum resolution.
21. The system as claimed in claim 20, wherein to map the selected cube-map faces, the stream generating unit is to perform at least one of:
crop the selected cube-map faces to fill the one or more rectangular areas;
resize the selected cube-map faces to fill the one or more rectangular areas; and
rearrange pixel data in the selected cube-map faces to fill the one or more rectangular areas.

22. The system as claimed in claim 20, wherein to generate each of the plurality of WDM video streams, the stream generating unit is to further:
select remaining cube-map faces of the cube-map projection format; and
map the remaining cube-map faces into one or more further rectangular areas of the layout such that a combined resolution of the remaining cube-map faces is substantially lower than the maximum resolution.

23. The system as claimed in claim 21, wherein to generate each of the plurality of WDM video streams, the stream generating unit is to further:
select a media content; and
map the media content into a further rectangular area of the layout.

24. The system as claimed in claim 16, further comprising a transcoding unit, wherein:
the stream generating unit is to embed an identifier in each of the plurality of WDM video streams, the identifier indicating the viewing angle range of the WDM video stream; and
the transcoding unit is to:
transcode the plurality of WDM video streams; and
store the plurality of trans-coded WDM video streams and the manifest file in a database for transmitting to the client device in response to accessing the 360-degree video content on the client device.

25. The system as claimed in claim 16, wherein the stream generating unit is to further:
collect a plurality of user-parameters corresponding to viewing of one or more 360-degree video contents on the client device for a predetermined time period, the one or more 360-degree video contents including the 360-degree video content; and
convert the cube-map projection format into a plurality of further WDM video streams based on the first set of parameters, the second set of parameters, and the plurality of user-parameters upon expiry of the predetermined time period.

26. The system as claimed in claim 25, wherein the plurality of user-parameters includes preferred viewpoint of the user, preferred content type of the 360-degree video content, and preferred period of viewing.

27. A client device for rendering a 360-degree video content, the client device comprising:
a receiving unit to receive a user-input indicative of accessing the 360-degree video content on the client device; and
a rendering unit to:
access a manifest file from a database in response to the user-input, the manifest file comprising access links corresponding to a plurality of WDM video streams at multiple resolutions;
determine a viewpoint of the user;
determine a data corresponding to a plurality of device parameters associated with the client device;
select a link of WDM video stream from the manifest file corresponding to the viewpoint based on the data such that the WDM video stream is closest to the viewpoint of the user;
access the WDM video stream corresponding to the selected link from the database; and
render the WDM video stream on a display.

28. The client device as claimed in claim 27, wherein the rendering unit is to further:
determine a further data corresponding to the plurality of device parameters associated with the client device;
select a link of a further WDM video stream from the manifest file corresponding to the further viewpoint based on the further data such that the further WDM video stream is closest to the further viewpoint of the user;
access the further WDM video stream corresponding to the selected link from the database; and
render the further WDM video stream on the display.

29. The client device as claimed in claim 28, wherein the rendering unit is to further:
prior to rendering the further WDM video stream, rotate a three-dimensional object onto which the 360-degree video content is texture-mapped based on an identifier embedded in the further WDM video stream; and
apply texture of the 360-degree video content on the rotated three-dimensional object.

30. The client device as claimed in claim 27, wherein the plurality of device parameters includes resolution of the display, device type of the client device, processing capability of the client device, and bandwidth available to the client device.

Documents

Application Documents

# Name Date
1 201821003862-CLAIMS [01-02-2022(online)].pdf 2022-02-01
2 201821003862-FORM28-010218.pdf 2018-08-11
3 201821003862-FER_SER_REPLY [01-02-2022(online)].pdf 2022-02-01
4 201821003862-Form 2(Title Page)-010218.pdf 2018-08-11
5 201821003862-OTHERS [01-02-2022(online)].pdf 2022-02-01
6 201821003862-Form 1-010218.pdf 2018-08-11
7 201821003862-FER.pdf 2021-11-26
8 201821003862-DRAWING [01-02-2019(online)].pdf 2019-02-01
9 201821003862-CORRESPONDENCE-OTHERS [01-02-2019(online)].pdf 2019-02-01
10 201821003862-CORRECTED PAGES [02-09-2020(online)].pdf 2020-09-02
11 201821003862-EVIDENCE FOR REGISTRATION UNDER SSI [02-09-2020(online)].pdf 2020-09-02
12 201821003862-COMPLETE SPECIFICATION [01-02-2019(online)].pdf 2019-02-01
13 201821003862-RELEVANT DOCUMENTS [23-04-2019(online)].pdf 2019-04-23
14 201821003862-FORM 18 [02-09-2020(online)].pdf 2020-09-02
15 201821003862-FORM FOR STARTUP [02-09-2020(online)].pdf 2020-09-02
16 201821003862-FORM 13 [23-04-2019(online)].pdf 2019-04-23
17 201821003862-AMENDED DOCUMENTS [23-04-2019(online)].pdf 2019-04-23
18 201821003862-ORIGINAL UR 6(1A) FORM 26-300419.pdf 2019-09-24
19 Abstract1.jpg 2019-06-10

Search Strategy

1 SearchHistoryE_23-11-2021.pdf