Abstract: The present disclosure relates to a method 400 and system 102 for concurrently providing an auxiliary content window 216 during streaming of a main content. In one example, a first content data 212 associated with an auxiliary content 210 may be obtained. The auxiliary content window 216 may then be generated based on the first content data 212. The auxiliary content window 216 may thereafter be caused to be displayed as a target content window. Thereafter, a second content data 214 associated with the auxiliary content 210 may be obtained. Based on the second content data 214, the target content window may be caused to be displayed. [FIG. 3 and FIG. 4]
FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
“SYSTEMS AND METHODS FOR CONCURRENTLY PROVIDING AUXILIARY CONTENT WINDOW DURING STREAMING OF A MAIN CONTENT”
We, Star India Private Limited, an Indian national, of Star House, Urmi Estate, 95, Ganapatrao Kadam Marg, Lower Parel West, Mumbai, Maharashtra 400013, India.
The following specification particularly describes the invention and the manner in which it is to be performed:
SYSTEMS AND METHODS FOR CONCURRENTLY PROVIDING AUXILIARY CONTENT WINDOW DURING STREAMING OF A MAIN CONTENT
TECHNICAL FIELD
[0001]
The present invention relates to streaming of media content. More specifically, the present invention relates to systems and methods for concurrently providing an auxiliary content window during streaming of a main content.
BACKGROUND
[0002]
Streaming media platforms that provide media content over the internet have become increasingly popular in the last few years. With an increasing number of people using digital platforms to consume media content, the requirements of the consumers are also increasing. For example, a user, while consuming a media content on a digital platform, may wish to consume another media content, or a part of or information about another media content, simultaneously with the media content that the user is consuming. Both contents may be of the user's interest, and the user may therefore wish to consume both contents at the same time. In an instance, the user may be watching a movie while a sports match of the user's interest is going on, and the user may wish to know the updates of the sports match while watching the movie.
[0003]
In another example, the user may not be aware of an ongoing sports match and may therefore have already planned to watch a movie. However, if the user is informed about the sports match happening at the same time, when the user visits the digital platform to watch the movie, the user may wish to be updated about the score and/or the highlights of the sports match.
[0004]
Furthermore, there can be many such instances where the user, for one reason or another, may wish to consume data from different events at the same time. For example, a live sports match may have many dull moments which may not be of the user's interest, and the user may want to watch the sports match only when something interesting is happening in it. For this purpose, the user may wish to be updated on the progress (score, key moments, etc.) of the ongoing sports match while the user is consuming some other content, for example, while watching some other media content.
[0005]
The existing solutions are not able to provide the functionalities mentioned above. Hence, there is a need for a solution that is able to update the information of an event, customised based on the user's interest, while the user is consuming a media content or any other content on the digital platform or on any application on the user device. One attempt to solve such problems was made by providing platforms that may be able to play two different media streams simultaneously in the same window when the user so wishes, such as a static or floating picture-in-picture functionality or a split-screen functionality. Another attempt was made by providing solutions that may be able to play a media content while the user is browsing through another content, such as a web page.
[0006]
However, such attempts are not able to suggest to a user information customised based on the user's interest, and they also require some manual input from the user to present the content. Further, the attempted solutions are not able to provide the functionality of playing a partial stream of one content and a partial stream of another content together, for example, playing the audio of one content and the video of another content simultaneously. Also, the existing solutions are not able to provide updates of an event, including a live streaming event, to the user automatically and customised based on the user's interest.
[0007]
In order to solve the above and other related inherent problems, there exists an imperative need for a solution that automatically provides an auxiliary content to the user, based on the user's interest, during streaming of a main content.
OBJECTIVES OF THE INVENTION
[0008]
This section is provided to introduce certain objectives and aspects of the present invention in a simplified format that are further elaborated upon in the subsequent paragraphs of the detailed description of the present disclosure.
[0009]
In order to overcome at least a few of the problems of the known solutions as provided in the previous section, an objective of the present disclosure is to substantially reduce the limitations and/or drawbacks of the prior art as described herein above.
[0010]
An objective of the disclosure is to provide methods and systems for concurrently providing an auxiliary content window during streaming of a main content.
[0011]
Another objective of the disclosure is to provide methods and systems for analysing the user's interest from the user's historical data and accordingly suggesting auxiliary content to the user for consumption.
[0012]
Yet another objective of the present disclosure is to provide methods and systems that can provide an option to the user for playing the audio from one media content and the video from another media content.
[0013]
Yet another objective of the present disclosure is to provide methods and systems for automatically presenting to the user information pertaining to an auxiliary event of the user's interest and updating such information dynamically.
[0014]
Yet another objective of the present disclosure is to provide methods and systems that can provide a content window in which the content presented to a user is customised for the user, in terms of the number of windows, the information presented, etc.
[0015]
Yet another objective of the present disclosure is to provide methods and systems for providing to a user updates related to a live stream content when the user is presented with some targeted content pushed by the digital platform, such as, but not limited to, advertisement(s).
[0016]
Yet another objective of the present disclosure is to provide the updates related to the live stream in audio/text format in case of a delay in rendering the live stream video due to an unstable or low-bandwidth network connection.
[0017]
Yet another objective of the present disclosure is to provide an auxiliary content to the user in case the user is waiting for the completion of an action related to a primary content, for example, downloading and/or buffering of the primary content.
SUMMARY
[0018]
This section is provided to introduce certain aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
[0019]
An aspect of the present disclosure may relate to a method for concurrently providing an auxiliary content window during streaming of a main content. The method may include obtaining a first content data associated with an auxiliary content. Based on the first content data, the method may include generating an auxiliary content window. Thereafter, the method may include causing to display the auxiliary content window as a target content window. The method may then include obtaining a second content data associated with the auxiliary content. Based on the second content data, the method may include causing to display the target content window.
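By way of a purely illustrative, non-limiting sketch (not part of the claimed subject matter), the sequence of steps above may be expressed in Python; all names here (ContentWindow, obtain_content_data, etc.) are hypothetical assumptions, not any real API:

```python
# Hypothetical sketch of the five-step method described above; every
# identifier is illustrative only.

class ContentWindow:
    """Minimal stand-in for the auxiliary/target content window."""
    def __init__(self, attributes):
        self.attributes = dict(attributes)  # e.g. {"text": "...", "image": "..."}
        self.displayed = False

    def display(self):
        # In a real client this would render the window alongside the main content.
        self.displayed = True
        return self

def obtain_content_data(source, kind):
    # Stand-in for fetching first/second content data from a content
    # server or repository; here it simply reads from a dict.
    return source[kind]

def provide_auxiliary_window(source):
    # Step 1: obtain first content data associated with the auxiliary content.
    first = obtain_content_data(source, "first")
    # Step 2: generate the auxiliary content window from the first content data.
    window = ContentWindow(first)
    # Step 3: cause the auxiliary content window to be displayed as the
    # target content window.
    window.display()
    # Step 4: obtain second content data associated with the auxiliary content.
    second = obtain_content_data(source, "second")
    # Step 5: based on the second content data, cause the target content
    # window to be displayed (here, after updating its attributes).
    window.attributes.update(second)
    window.display()
    return window
```

A simulated source such as `{"first": {"text": "IND 120/2"}, "second": {"text": "IND 131/2"}}` would yield a displayed window carrying the updated score.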
[0020]
In an exemplary aspect of the present disclosure, the auxiliary content may be based on one of a user's historical data, a set of user preferences, one or more user profile attributes, a set of preferences related to a user cohort, and a combination thereof.
[0021]
In an exemplary aspect of the present disclosure, the one or more user profile attributes may include at least one of a gender, geographical location, preferred language, or a combination thereof.
[0022]
In an exemplary aspect of the present disclosure, each of the first content data and the second content data may include one or more content attributes associated with the auxiliary content. The one or more content attributes of each of the first content data and the second content data may include one of a text, an audio, a video, an image, or a combination thereof.
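As a non-limiting illustration (the field names and values are assumptions for this sketch only), content data carrying such attributes could be modelled as follows:

```python
# Illustrative (assumed) representation of "content data" carrying one or
# more content attributes, as described above.

FIRST_CONTENT_DATA = {
    "text":  "Score: IND 120/2 (15.3 ov)",
    "image": "thumbnail_0153.png",
}

SECOND_CONTENT_DATA = {
    "text":  "Score: IND 131/2 (17.0 ov)",
    "audio": "commentary_clip_17.aac",
}

ALLOWED_ATTRIBUTES = {"text", "audio", "video", "image"}

def valid_content_data(data):
    # Content data is treated as valid if it carries at least one attribute
    # and every attribute is one of text, audio, video, or image
    # (or a combination thereof).
    return bool(data) and set(data) <= ALLOWED_ATTRIBUTES
```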
[0023]
In an exemplary aspect of the present disclosure, the method may further include receiving a user selection corresponding to at least one of the one or more content attributes associated with one of the first content data and the second content data. Based on the user selection and the respective first and second content data, the method may include generating the auxiliary content window.
[0024]
In an exemplary aspect of the present disclosure, the method may further include periodically obtaining the first content data and the second content data based on a pre-defined time interval.
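Such periodic obtaining could be sketched, under assumed names and with an injectable fetch callable, as a simple polling loop:

```python
import time

def poll_content_data(fetch, interval_seconds, max_polls):
    # Hypothetical polling loop: obtain content data periodically based on
    # a pre-defined time interval. `fetch` stands in for a call to a
    # content server or content repository.
    results = []
    for _ in range(max_polls):
        results.append(fetch())
        time.sleep(interval_seconds)
    return results
```

In practice the interval and termination condition would come from configuration rather than a fixed `max_polls`.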
[0025]
In an exemplary aspect of the present disclosure, the first content data and the second content data may be obtained from one of a content server, a content repository, and a combination thereof.
[0026]
In an exemplary aspect of the present disclosure, pursuant to obtaining the first and the second content data associated with the auxiliary content, the method may include processing the obtained first content data and the obtained second content data. Based on the processed first and second content data, the method may include generating the respective auxiliary content window.
[0027]
In an exemplary aspect of the present disclosure, based on the second content data, the method may further include updating the auxiliary content window. Thereafter, the method may include causing to display the updated auxiliary content window as the target content window.
[0028]
In an exemplary aspect of the present disclosure, the method may further include causing to display the auxiliary content window as the target content window on a display device.
[0029]
In an exemplary aspect of the present disclosure, the method may further include determining one or more vacant portions on a display device during streaming of the main content. Thereafter, the method may include causing to display the auxiliary content window on the determined one or more vacant portions.
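One assumed way to determine such vacant portions, shown only as a sketch, is to treat the letter-box or pillar-box bars left over when the main content's aspect ratio differs from the screen's:

```python
def vacant_portions(screen_w, screen_h, video_w, video_h):
    # Hypothetical sketch: when the main content is letter-boxed or
    # pillar-boxed, the bars around the scaled video are vacant portions
    # where an auxiliary content window could be placed.
    scale = min(screen_w / video_w, screen_h / video_h)
    shown_w, shown_h = video_w * scale, video_h * scale
    bars = []
    side = (screen_w - shown_w) / 2   # pillar-box width on each side
    top = (screen_h - shown_h) / 2    # letter-box height at top/bottom
    if side > 0:
        bars.append(("left", side, screen_h))
        bars.append(("right", side, screen_h))
    if top > 0:
        bars.append(("top", screen_w, top))
        bars.append(("bottom", screen_w, top))
    return bars
```

For example, a 4:3 video on a 16:9 screen leaves two vertical pillar-box bars available for the auxiliary content window.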
[0030]
In an exemplary aspect of the present disclosure, the method may further include providing, via the target content window, one or more indications related to the auxiliary content, based on at least one of a user preference and an update related to the auxiliary content. The one or more indications comprise at least one of one or more audio indications and one or more colour indications. Also, each of the one or more indications is one of a manually configurable notification and an automatically configurable notification.
[0031]
In an exemplary aspect of the present disclosure, the method may further include causing to automatically adjust the target content window based on a user preference for the auxiliary content.
[0032]
In an exemplary aspect of the present disclosure, the method may further include causing to swap a display of the target content window and the main content based on a pre-defined user action on the target content window.
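The swap behaviour may be sketched as follows, assuming (purely for illustration) that the pre-defined user action is a double tap on the target content window:

```python
def swap_on_action(state, action, trigger="double_tap"):
    # Hypothetical sketch: swap the display of the target content window
    # and the main content when a pre-defined user action (assumed here to
    # be a double tap) occurs on the target window. `state` maps window
    # roles to the content currently shown in them.
    if action == trigger:
        state["main"], state["target"] = state["target"], state["main"]
    return state
```

Any other action leaves the current arrangement unchanged.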
[0033]
Another aspect of the present disclosure may relate to a system for concurrently providing an auxiliary content window during streaming of a main content. The system may include a processor, and a content retrieval unit coupled to the processor. In operation, the content retrieval unit may obtain a first content data associated with an auxiliary content. Based on the first content data, the content retrieval unit may generate an auxiliary content window. Thereafter, the content retrieval unit may cause to display the auxiliary content window as a target content window. Thereafter, the content retrieval unit may obtain a second content data associated with the auxiliary content. Based on the second content data, the content retrieval unit may cause to display the target content window.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034]
The accompanying drawings, which are incorporated herein and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Also, the embodiments shown in the figures are not to be construed as limiting the disclosure, but the possible variants of the method and system according to the disclosure are illustrated herein to highlight the advantages of the disclosure. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components or circuitry commonly used to implement such components.
[0035]
FIG. 1 illustrates an exemplary computing environment with an exemplary system for concurrently providing an auxiliary content window during streaming of a main content, in accordance with an exemplary implementation of the present disclosure;
[0036]
FIG. 2 illustrates a block diagram of an exemplary computing device for concurrently providing an auxiliary content window during streaming of a main content, in accordance with an exemplary implementation of the present disclosure;
[0037]
FIG. 3 depicts a diagram of an exemplary auxiliary content window during streaming of an exemplary main content, in accordance with an exemplary implementation of the present disclosure; and
[0038]
FIG. 4 illustrates a flowchart depicting an example method for concurrently providing an auxiliary content window during streaming of a main content, in accordance with an exemplary implementation of the present disclosure.
[0039]
The foregoing shall be more apparent from the following more detailed description of the disclosure.
DETAILED DESCRIPTION
[0040]
In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter may each be used independently of one another or with any combination of other features. An individual feature may not address any of the problems discussed above or might address only some of the problems discussed above.
[0041]
The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure as set forth.
[0042]
Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail.
[0043]
Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations may be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure.
[0044]
The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive, in a manner similar to the term “comprising” as an open transition word, without precluding any additional or other elements.
[0045]
As used herein, a “processing unit” or “processor” or “operating processor” includes one or more processors, wherein a processor refers to any logic circuitry for processing instructions. A processor may be a general-purpose processor, a special purpose processor, a conventional processor, a digital signal processor, a plurality of microprocessors, one or more microprocessors in association with a DSP (Digital Signal Processing) core, a controller, a microcontroller, Application Specific Integrated Circuits, Field Programmable Gate Array circuits, any other type of integrated circuit, etc. The processor may perform signal coding, data processing, input/output processing, and/or any other functionality that enables the working of the system according to the present disclosure. More specifically, the processor or processing unit is a hardware processor. Furthermore, to execute certain operations, the processing unit/processor as disclosed in the present disclosure may include one or more Central Processing Units (CPU) and one or more Graphics Processing Units (GPU), selected based on said certain operations. Furthermore, a graphics processing unit (GPU) is a specialized electronic circuit designed to rapidly manipulate and alter a memory to accelerate the creation of images in a frame buffer intended for output to a display device.
[0046]
As used herein, “storage unit” or “memory unit” refers to a machine or computer-readable medium including any mechanism for storing information in a form readable by a computer or similar machine. For example, a computer-readable medium includes read-only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices or other types of machine-accessible storage media. The storage unit can be any type of storage unit, such as cloud or CDN (content delivery network) storage, public, shared, private, or telecommunications operator-based storage, or any other type of storage known in the art or that may be developed in the future. The storage unit stores at least the data that may be required by one or more units of the server/system/user device to perform their respective functions.
[0047]
A ‘smart computing device’ or ‘user device’ refers to any electrical, electronic or electromechanical equipment, or a combination of one or more of the above devices. Smart computing devices may include, but are not limited to, a mobile phone, a smartphone, a pager, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, a smart television, gaming consoles, media streaming devices, or any other computing device as may be obvious to a person skilled in the art. In general, a smart computing device is a digital, user-configured, computer-networked device that can operate autonomously. A smart computing device is one of the appropriate systems for storing data and other private/sensitive information.
[0048]
A “smartphone” is one type of “smart computing device” that refers to a mobile wireless cellular connectivity device that allows end users to use services on 2G, 3G, 4G, 5G or other upcoming generations of mobile broadband Internet connections, with an advanced mobile operating system which combines features of a personal computer operating system with other features useful for mobile or handheld use. These smartphones can access the Internet, have a touchscreen user interface, can run third-party apps including capturing images, and may be camera phones possessing high-speed mobile broadband internet with video calling, hotspot functionality, motion sensors, mobile payment mechanisms and/or enhanced security features with alarm and alert in emergency situations.
[0049]
As used herein, “a target content window” may be an overlay content window or a shared screen content window. The overlay content window may refer to a visualisation in the form of a content being superimposed over another content on a user interface. The shared screen content window may refer to a portion of a user interface which is utilised for displaying one or more contents while one or more other portions of the user interface are utilised for displaying one or more other contents. The target content window may be provided on the user interface for providing different forms of contents concurrently in events such as while watching a movie or while sharing a screen, etc. Also, the target content window may be customisable automatically or manually based on a user preference.
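The geometric difference between the two window types can be sketched as follows; the corner placement, split side, and size fraction are assumptions for illustration only:

```python
def place_target_window(mode, screen_w, screen_h, frac=0.25):
    # Hypothetical sketch of the two target content window types: an
    # overlay window superimposed over the main content (assumed here to
    # sit in the bottom-right corner), or a shared-screen window that
    # takes a dedicated split of the screen (assumed here to be the
    # right-hand side). Returns the window rectangle in pixels.
    if mode == "overlay":
        w, h = int(screen_w * frac), int(screen_h * frac)
        return {"x": screen_w - w, "y": screen_h - h, "w": w, "h": h}
    if mode == "shared":
        w = int(screen_w * frac)
        # The main content would keep the remaining screen width.
        return {"x": screen_w - w, "y": 0, "w": w, "h": screen_h}
    raise ValueError("mode must be 'overlay' or 'shared'")
```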
[0050]
To address the problems mentioned in the background section, methods and systems for concurrently providing an auxiliary content window during streaming of a main content are described. As would be appreciated, the methods and systems of the present subject matter provide the user with a media content simultaneously with another media content that the user is consuming. Further, the user's historical data is analysed, the user's interests are understood based on the same, and a content window comprising information related to a content of the user's interest, customised for the particular user, is thereafter automatically presented to the user. The user may be given an option to switch the video and audio of both media contents when desired.
[0051]
The present subject matter is further described with reference to the accompanying figures. Wherever possible, the same reference numerals are used in the figures and the following description to refer to the same or similar parts. It should be noted that the description and figures merely illustrate principles of the present subject matter. It is thus understood that various arrangements may be devised that, although not explicitly described or shown herein, encompass the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and examples of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0052]
The manner in which the exemplary auxiliary content window is concurrently provided during streaming of a main content is explained in detail with respect to FIGS. 1-4. It is to be noted that the drawings of the present subject matter shown here are for illustrative purposes and are not to be construed as limiting the scope of the subject matter claimed.
[0053]
Referring to FIG. 1, an exemplary computing environment 100 with an exemplary system 102 is shown. The system 102 may be used for concurrently providing an auxiliary content window during streaming of a main content, in accordance with an implementation of the present subject matter. The auxiliary content window may be associated with an auxiliary content. Examples of such auxiliary content and main content may include, but are not limited to, any multimedia content such as an image, an audio, a video, an animation, and a combination thereof.
[0054]
The system 102, in an example, may be any system capable of receiving user inputs, processing them, and displaying output information based on the received user inputs. Examples of such a system 102 may include, but are not limited to, a personal computer, a handheld computer, a mobile device, and a portable computer.
[0055]
As depicted in FIG. 1, the system 102 further includes a processor 104 and a display device 106. The processor 104 may be a dedicated special-function processor or a general-purpose processor which may be used by the system 102, in conjunction with other elements of the computing environment 100, to implement the features of the present disclosure. The display device 106 may be coupled to the system 102 (as a standalone display) or may be integrated within the system 102 (as a display panel of a laptop or a tablet computer).
[0056]
Continuing further, the system 102 may be connected to a centralized server 108 over a network 110. The centralized server 108 may include a content retrieval unit 112.
[0057]
As used herein, the centralized server 108 may be a set of one or more systems, which may be responsible, either individually or collectively, for performing a set of functions to enable the processing of one or more sets of data that may be received from the system 102, for concurrently providing an auxiliary content window during streaming of a main content. Further, the centralized server 108 may be one of a local server, a remote server, or a combination thereof. Alternatively, the centralized server 108 may also be implemented as a network-based, cloud-based, or software-based server. Further, in an implementation, the centralized server 108 may also communicate with one or more content libraries or one or more third party services for fetching the content. Further, it may be noted that the centralized server 108 may receive the main content from one source while the auxiliary content may be fetched from another source; the contents are not necessarily required to be received from a single or common source.
[0058]
Further, the content retrieval unit 112 may be implemented as a software component that may be configured to analyse one or more sets of data, on the basis of instructions received from the processor 104 of the system 102. Alternatively, the present disclosure also encompasses that the content retrieval unit 112 may also be a general or specific function hardware component, such as a processor, or a combination of both software and hardware components.
[0059]
It is pertinent to note that, as depicted in FIG. 1, the content retrieval unit 112 may be present in the centralized server 108, which in turn may be in communication with the system 102; however, it may be noted that such implementation is not to be construed to limit the scope of the present subject matter in any manner. In another example, the content retrieval unit 112 may be a part of the system 102 itself, which would be further explained later in conjunction with FIG. 2.
[0060]
Further, as used herein, the network 110 may be one of a local area network (LAN) or a wide area network (WAN), such as the internet. The system 102 and the centralized server 108, as embodied herein, may be connected to the network 110 using at least one of a wired connection and a wireless connection, or a combination thereof. A wired connection may comprise an ethernet connection, an optical fibre connection, or any other wired network connection as may be known to a person ordinarily skilled in the art. Further, a wireless connection may comprise a connection based on at least one of the following wireless communication technologies: Wi-Fi, Li-Fi, Radio Frequencies, Infrared, Cellular Communication (3G, 4G, 5G), Bluetooth, Near Field Communication (NFC), satellite communication and/or any other wireless communication technology as may be known to a person ordinarily skilled in the art.
[0061]
The manner in which the system 102 may concurrently provide an auxiliary content window during streaming of a main content, in conjunction with the other elements of the present disclosure, is illustrated with further details in conjunction with FIGs. 2-4.
[0062]
Referring to FIG. 2, a block diagram of an exemplary computing device 200 for concurrently providing an auxiliary content window during streaming of a main content, in accordance with an implementation of the present subject matter, is shown. As depicted in FIG. 2, the computing device 200 may be coupled to a display device 202.
[0063]
The computing device 200 may be implemented as any computing device with a display capable of rendering the main content and the auxiliary content. In one example, the computing device 200 may be implemented as the system 102 as explained in FIG. 1.
[0064]
In another example, the display device 202 may be implemented as a standalone display coupled to the computing device 200. For example, the computing device 200 may be implemented as a portable computer, which may in turn be coupled to the display device 202. Examples of such a display device 202 may include, but are not limited to, an LED display, a projector, and/or any other display device capable of rendering media content based on received signals.
[0065]
In yet another example, the display device 202 may be integrated within the computing device 200. In such cases, examples of the computing device 200 may include, but are not limited to, a smartphone, a smartwatch, an AR/VR headset, a laptop or desktop computer, a handheld tablet computer, a computer-based kiosk, and/or any other computing device as may be known to a person ordinarily skilled in the art.
[0066]
The computing device 200 may be used for concurrently providing an auxiliary content window during streaming of a main content. The auxiliary content window may be associated with an auxiliary content. Examples of such auxiliary content and main content may include, but are not limited to, any multimedia content such as an image, a video, an audio, an animation, and a combination thereof.
[0067]
Further, it may be noted that the present example and the foregoing description have been explained in the context of providing a single auxiliary content window during streaming of a main content; however, this is done only for the sake of clarity and ease of understanding. The methods and systems of the present subject matter may be used for providing any number of auxiliary content windows during streaming of the main content, and such examples would also lie within the scope of the present subject matter.
[0068]
As depicted in FIG. 2, the computing device 200 may include processor(s) 104 and unit(s) 204 which may be connected to the processor 104. The unit 204 may further comprise sub-components for implementing one or more features of the present disclosure, such as a content retrieval unit 112 and other unit(s) 208, wherein any or all of the sub-components may also be connected to the processor 104. It may be noted that the other unit(s) 208 may perform any function that may be ancillary to the content retrieval unit 112 for concurrently providing the auxiliary content window during streaming of the main content.
[0069]
Further, the computing device 200 may be configured to obtain and store one or more sets of data 206, which may further comprise an auxiliary content 210, a first content data 212, a second content data 214, an auxiliary content window 216, and other data 218. It may be noted that the other data 218 may comprise any data that may be ancillary and/or additional to the other sets of data comprised in the data 206, any or all of which may be stored or obtained by the computing device 200 to concurrently provide the auxiliary content window during streaming of the main content.
[0070]
In operation, a user (not depicted in FIG. 2) may be operating the computing device 200 and may be consuming a multimedia content, referred to as a main content. The main content may be streaming on the display device 202. Such main multimedia content may include, for example, an audio, a video, an image, an animation, and a combination thereof. In one example, the user may be consuming the multimedia content on a digital platform. In another example, the digital platform may include a User Interface, where the user may be provided with a content catalogue. The user may select one of the contents from the content catalogue. In yet another example, the user, using his or her credentials, may log in to the digital platform and may start consuming the multimedia content (the main content). It may be noted that all such implementations are only exemplary and in no manner limit the scope of the present subject matter. The user may be consuming the main content, and the same may be streaming on the display device 202, in any other manner or using any techniques known to a person skilled in the art. All such variations would lie within the scope of the present subject matter.
[0071]
Continuing further, while the user is streaming and consuming the main content, in one example, there may be an auxiliary content 210 that the user may intend to consume. In one example, such auxiliary content 210 may be associated with an auxiliary multimedia that the user intends to consume. In another example, such auxiliary content 210 may be associated with an auxiliary event. However, it may be noted that the auxiliary content 210 may include any other content, and all such examples would lie within the scope of the present subject matter.
[0072]
For example, the user may be streaming a movie available on a digital platform. While the user is streaming the movie, there may be an auxiliary multimedia, say, an audio or a video, stored in a content repository or a content server (not depicted in FIG. 2) that the user intends to consume. Or there may be an auxiliary event, say, a news event, a podcast, or a cricket match, which may be occurring. While the user is streaming and consuming the main content, i.e., the movie, the user may wish to concurrently consume the auxiliary content 210 associated with the auxiliary multimedia or the auxiliary event. It may be noted that the aforementioned examples of the main content and the auxiliary content 210 are only exemplary and have been referred to only for ease of understanding. Such examples should not be construed to limit the scope of the present subject matter in any manner.
[0073]
Continuing further, in one example, when the user is aware of the auxiliary multimedia or the occurrence of the auxiliary event and intends to consume the main content and the auxiliary content 210 (associated with the auxiliary multimedia or the auxiliary event) simultaneously, the user may manually select the corresponding auxiliary multimedia/auxiliary event. In another example, the content retrieval unit 112 may select the auxiliary multimedia or auxiliary event automatically based on one of a user's historical data, a set of user preferences, one or more user profile attributes, a set of preferences related to a user cohort, and a combination thereof. In yet another example, the digital platform on which the user may be streaming the main content may have data corresponding to the user, referred to as a user profile. In such cases, the user profile may include, for example, at least one of a gender, a geographical location, a preferred language, or a combination thereof. The content retrieval unit 112 may, in such cases, select the auxiliary multimedia or auxiliary event based on the user profile data.
[0074]
In yet another example, the content retrieval unit 112 may select the auxiliary multimedia or auxiliary event based on a set of channels and media contents that the user had previously consumed. In yet another example, the user may be travelling to a location, and the content retrieval unit 112 may select the auxiliary multimedia or auxiliary event based on the geographical location of the user. For yet another example, the user may provide his/her interests and preferred genres to the digital platform. Based on such user interests, the content retrieval unit 112 may select the auxiliary multimedia/auxiliary event.
[0075]
It may again be noted that the aforementioned scenarios of selection of the auxiliary multimedia or auxiliary event are only exemplary and should not be construed to limit the scope of the present subject matter. The content retrieval unit 112 may select the auxiliary multimedia or auxiliary event in any other manner, and all such examples would lie within the scope of the present subject matter.
[0076]
Continuing further, after the auxiliary multimedia or auxiliary event has been selected, during the streaming of the main content, in one example, the user may be provided with a notification that the auxiliary multimedia is available or that the auxiliary event has commenced or is about to commence. The user may opt for consumption of the auxiliary content 210 associated with the auxiliary multimedia/auxiliary event while the main content is being streamed. In another example, the providing of the auxiliary content 210 associated with the auxiliary multimedia/auxiliary event may be initiated at a pre-defined time. In yet another example, the content retrieval unit 112 may automatically initiate the providing of the auxiliary content 210 associated with the auxiliary event as soon as the auxiliary event commences. It may again be noted that all such variations are only exemplary, and the auxiliary content 210 may be provided in any other manner as well.
[0077]
Continuing further, for concurrently providing the auxiliary content 210 during the streaming of the main content, the content retrieval unit 112 may obtain a first content data 212 associated with the auxiliary content 210. In one example, the computing device 200 may be in communication with at least one of a content server and a content repository. In such cases, the content retrieval unit 112 may obtain the first content data 212 from at least one of the content server, the content repository, and a combination thereof.
[0078]
In another example, the content retrieval unit 112 may periodically obtain the first content data 212 based on a pre-defined time interval. Such a time interval may be indicative of the frequency at which the user intends to consume the auxiliary content 210.
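By way of a non-limiting sketch only, the periodic obtaining described above may be illustrated as follows; the class name, the monotonic-clock handling, and the return convention are assumptions of this sketch and are not part of the specification:

```python
import time


class PeriodicContentFetcher:
    """Polls a content source for auxiliary content data at a
    pre-defined interval (hypothetical helper, not named in the
    specification)."""

    def __init__(self, fetch_fn, interval_seconds):
        self.fetch_fn = fetch_fn
        self.interval = interval_seconds
        self._last_fetch = None  # timestamp of the last successful fetch

    def poll(self, now=None):
        """Return fresh content data if the pre-defined interval has
        elapsed since the last fetch, otherwise return None."""
        now = time.monotonic() if now is None else now
        if self._last_fetch is None or now - self._last_fetch >= self.interval:
            self._last_fetch = now
            return self.fetch_fn()
        return None
```

A caller would invoke `poll()` on each refresh tick; only every `interval_seconds` does a fetch actually occur.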
[0079]
In another example, the first content data 212 may include one or more content attributes associated with the auxiliary content 210. Examples of such content attributes include, but are not limited to, a text, an audio, a video, an image, an animation, and a combination thereof, associated with the auxiliary content 210. Such content attributes are only exemplary, and the first content data 212 may include any other content attributes. All such exemplary content attributes would lie within the scope of the present subject matter.
[0080]
For example, in the case of the auxiliary content 210 being associated with the auxiliary event, and the auxiliary event being, say, a cricket match, the first content data 212 may be a video feed of the cricket match at a time instance, textual data corresponding to scores of the cricket match, an audio commentary, a set of user-configurable selective scores, etc. Any other types of first content data 212 associated with the auxiliary event may also be possible and would lie within the scope of the present subject matter.
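Purely for illustration, the exemplary content attributes above may be held in a simple container; the attribute names, sample values, and URLs below are hypothetical and are not defined by the specification:

```python
from dataclasses import dataclass, field


@dataclass
class ContentData:
    """Illustrative container for the first content data (212)."""
    text: str = ""       # e.g. textual score updates
    audio_url: str = ""  # e.g. audio commentary stream (hypothetical URL)
    video_url: str = ""  # e.g. video feed at a time instance (hypothetical URL)
    animations: list = field(default_factory=list)


first_content_data = ContentData(
    text="IND 145/2 (16.3 ov)",
    audio_url="https://example.com/commentary",
    video_url="https://example.com/feed",
)
```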
[0081]
Continuing further, based on the first content data 212, the content retrieval unit 112 may generate an auxiliary content window 216. In one example, the first content data 212 may be in a different format. In such cases, the content retrieval unit 112 may process the obtained first content data. Based on the processed first content data, the content retrieval unit 112 may then generate the auxiliary content window 216.
[0082]
For example, in the context of the auxiliary event being a cricket match, in one example, the first content data 212 may be an audio commentary of the match. The content retrieval unit 112 may convert the audio commentary of the match into text data, or extract some relevant information from the audio commentary and convert the relevant information into text data, and thereafter generate the auxiliary content window 216.
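The extraction of relevant information from the commentary may be sketched, in a deliberately simplified and non-limiting form, as a keyword filter over a speech-to-text transcript; a production system would use proper speech recognition and language processing, and the keyword list below is an assumption of this sketch:

```python
import re

# Hypothetical set of scoring-event keywords for a cricket match.
RELEVANT = ("six", "four", "wicket", "fifty", "century")


def extract_updates(transcript):
    """Keep only commentary sentences that mention a scoring event,
    to be rendered as text in the auxiliary content window."""
    sentences = re.split(r"[.!?]\s*", transcript)
    return [s.strip() for s in sentences
            if s and any(k in s.lower() for k in RELEVANT)]
```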
[0083]
It may be noted that the aforementioned scenario is only exemplary and should not be construed to limit the scope of the present subject matter.
[0084]
In another example, the content retrieval unit 112 may generate the auxiliary content window 216 based on all the content attributes associated with the first content data 212. In another example, the user may be able to select the manner in which the user intends to consume the auxiliary content 210 and may select the content attributes associated with the first content data 212 for generation of the auxiliary content window 216. In such cases, the content retrieval unit 112 may receive a user selection corresponding to at least one of the one or more content attributes associated with the first content data 212. Based on the user selection, the content retrieval unit 112 may then generate the auxiliary content window 216.
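The attribute selection described above may be sketched as a simple filter; the dictionary shape and the attribute names are assumptions of this non-limiting illustration:

```python
def apply_user_selection(content_data, selected):
    """Keep only the content attributes the user selected for
    generating the auxiliary content window (216)."""
    return {k: v for k, v in content_data.items() if k in selected}
```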
[0085]
As would be noted and appreciated, such methods and systems may be beneficial when the user wishes to consume the main content in one manner and the auxiliary content 210 in another manner. For example, the user may keep watching the video of the main content and may select to consume only the audio of the auxiliary content 210.
[0086]
For another example, the user may be interested in watching a musical concert (main content) as well as a sports match (auxiliary content 210). The user may not be interested in the video of the musical concert but only in the video of the sports match, and likewise may be interested in the audio of the musical concert but not in the audio of the sports match. In such a case, for the purposes of this example, the user may switch to the video of the sports match and the audio of the musical concert.
[0087]
For another example, in the case of the auxiliary event being, say, a cricket match, it may be possible that the user intends to consume the video feed associated with the cricket match, commentary associated with the cricket match, textual updates of the score of the cricket match, animations associated with selective score updates, etc.
[0088]
Continuing further, the content retrieval unit 112 may then display the auxiliary content window 216 as a target content window on the display device 202. An exemplary manner in which the content retrieval unit 112 may display the auxiliary content window 216 on the display device 202 is depicted in FIG. 3. FIG. 3 depicts a diagram of an exemplary auxiliary content window, such as auxiliary content window 216, during streaming of an exemplary main content, in accordance with an exemplary implementation of the present disclosure.
[0089]
As depicted in FIG. 3, the computing device 300 may be a mobile phone device. The display device may be integrated within the mobile phone device. The main content, which is streaming on the display of the mobile phone, is depicted by 302. During the streaming of the main content 302, as per the present example, the content retrieval unit 112 may display the auxiliary content window 304 as a target content window. In another example, the content retrieval unit 112 may display the auxiliary content window 304 as a target widget on the user interface of the digital platform on which the main content may be streamed.
[0090]
It may be noted that the auxiliary content window 304 as depicted in FIG. 3 is only exemplary and shown only for the purpose of illustrating the present example. The main content 302 may be streaming on the display device of the computing device 300 in any other manner, and the auxiliary content window 304 may also be displayed in any other manner. All such examples would lie within the scope of the present subject matter.
[0091]
In another example, based on the first content data 212 and the corresponding content attributes, the content retrieval unit 112 may use graphical contents, such as one or more of text, animation, video, sound, etc., for displaying the auxiliary content window 216. For example, an animation, or a sound, or a combination of both, may be used by the content retrieval unit 112 to generate and display the auxiliary content window 216.
[0092]
For example, in the context of the auxiliary event being a cricket match, and the first content data 212 being highlights of the cricket match, when a player scores a ‘six’ in the cricket match, the content retrieval unit 112, in one example, may generate and display an animation related to the event that ‘a player scores a six in the cricket match’ as the auxiliary content window. Such animation may be obtained by the content retrieval unit 112 while obtaining the first content data 212.
[0093]
Continuing further, it may also be noted that the auxiliary content window 216/304 may be of any shape, size, and dimensions. The content retrieval unit 112, based on the user preferences, may vary the shape, size, dimensions, transparency, colour, layout, and any other attributes of the auxiliary content window 216/304. All such variations and examples would lie within the scope of the present subject matter.
[0094]
In another example, for causing the auxiliary content window 216 to be displayed on the display device 202, the content retrieval unit 112 may determine one or more vacant portions on the display device 202 during streaming of the main content. For example, it may be possible that the main content, while being streamed, leaves certain portions of the display device 202 vacant. The content retrieval unit 112 may identify and determine such vacant portions using any techniques known to a person skilled in the art. Such exemplary techniques are not explained here for the sake of brevity and would be well understood by a person skilled in the art. Thereafter, the content retrieval unit 112 may display the auxiliary content window 216/304 on the determined one or more vacant portions.
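One simple, non-limiting technique for determining vacant portions is to compute the letterbox or pillarbox bars left over when the main content is scaled to fit the display; the specification deliberately leaves the detection technique open, and this sketch covers only the aspect-ratio case:

```python
def vacant_portions(display_w, display_h, content_w, content_h):
    """Return (x, y, w, h) rectangles left vacant when the main content
    is letterboxed or pillarboxed onto the display."""
    scale = min(display_w / content_w, display_h / content_h)
    w, h = round(content_w * scale), round(content_h * scale)
    x, y = (display_w - w) // 2, (display_h - h) // 2
    rects = []
    if y > 0:  # horizontal bars above and below the content
        rects += [(0, 0, display_w, y),
                  (0, y + h, display_w, display_h - y - h)]
    if x > 0:  # vertical bars left and right of the content
        rects += [(0, 0, x, display_h),
                  (x + w, 0, display_w - x - w, display_h)]
    return rects
```

For instance, 16:9 content on a 4:3 display yields two horizontal bars into which the auxiliary content window could be placed.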
[0095]
Continuing further, the content retrieval unit 112 may then obtain a second content data 214 associated with the auxiliary content 210. The second content data 214 refers to the content data associated with the auxiliary content 210 at a later time instance, pursuant to obtaining the first content data 212 and generating the auxiliary content window 216 based on the first content data 212.
[0096]
The second content data 214 may be similar to the first content data 212, and the content retrieval unit 112 may accordingly process the second content data 214 in a manner similar to that of the first content data 212. For example, the content retrieval unit 112 may obtain the second content data 214 from at least one of the content server, the content repository (not depicted in FIG. 2), or a combination thereof.
[0097]
In another example, the content retrieval unit 112 may periodically obtain the second content data 214 based on a pre-defined time interval. Such a time interval may be indicative of the frequency at which the user intends to update the information of the auxiliary content 210.
[0098]
Continuing further, in a manner similar to the first content data 212, the second content data 214 may also include one or more content attributes associated with the auxiliary content 210, such as a text, an audio, a video, an image, an animation, and a combination thereof, associated with the auxiliary content 210.
[0099]
Continuing further, based on the second content data 214, the content retrieval unit 112 may update the auxiliary content window 216.
[0100]
In one example, in cases where the second content data 214 may be in a different format, the content retrieval unit 112 may process the obtained second content data. Based on the processed second content data, the content retrieval unit 112 may update the auxiliary content window 216.
[0101]
In another example, the content retrieval unit 112 may update the auxiliary content window 216 based on all the content attributes associated with the second content data 214. In another example, the user may be able to select the manner in which the user intends to consume the auxiliary content 210 and may select the content attributes associated with the second content data 214 for updating the auxiliary content window 216. In such cases, the content retrieval unit 112 may receive a user selection corresponding to at least one of the one or more content attributes associated with the second content data 214. Based on the user selection, the content retrieval unit 112 may then update the auxiliary content window 216.
[0102]
Thereafter, the content retrieval unit 112 may cause the display device 202 to display the updated auxiliary content window 216 as the target content window.
[0103]
It will be appreciated by a person skilled in the art that the present disclosure is not limited to displaying a single auxiliary content window 216 or a single updated auxiliary content window 216; depending on a use case, more than one auxiliary content window 216 or more than one updated auxiliary content window 216 may be displayed. Thus, in an implementation of the present disclosure, there may be more than one instance of the auxiliary content window 216 and the target content window provided on the display device 202. For example, a first auxiliary content (for example, a cricket match) is provided in one of the auxiliary content windows 216, and a second auxiliary content (for example, election results) is provided in another auxiliary content window 216.
[0104]
Further, in an implementation, the method encompasses causing a display of the target content window and the main content to be swapped based on a pre-defined user action on the target content window. For instance, it may be noted that a user selection, such as a tap or a long press, leads to interaction with the auxiliary content widgets in a defined manner. For example, on the user selection, the auxiliary content window may expand to be displayed as the main content. Further, the existing main content may be minimized and then converted into the auxiliary content window.
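The swap described above may be sketched as follows; the layout representation and the set of recognised actions are assumptions of this non-limiting illustration:

```python
def swap_on_action(layout, action):
    """On a pre-defined user action (a tap or long press in this
    sketch), promote the auxiliary content window to the main area and
    minimize the existing main content into the auxiliary window."""
    if action in ("tap", "long_press"):
        layout = dict(layout)  # leave the caller's layout untouched
        layout["main"], layout["auxiliary"] = layout["auxiliary"], layout["main"]
    return layout
```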
[0105]
As would be noted and appreciated, the methods and systems of the present subject matter dynamically obtain the content data associated with the auxiliary content 210 in real time, based on the user's interests, and accordingly provide the dynamically updated auxiliary content window 216.
[0106]
For example, in the context of the auxiliary event being the cricket match, the content retrieval unit 112 may have a set of user preferences indicating that the user is interested in watching a particular player, say Player A, playing. Also, for example, based on the content data associated with the cricket match, the auxiliary content window 216 may be generated based on Player A. The auxiliary content window 216 may be generated, displayed, and updated when Player A comes to the field to play, or when Player A scores on the field, etc. In an implementation, such moments related to Player A that are of the user's interest, or any such instances, may be obtained as content data and may be provided to the user in the form of the auxiliary content window 216.
[0107]
In another example, the user may switch to a media stream associated with the auxiliary content 210 on clicking the auxiliary content window 216. The media stream may show the corresponding auxiliary multimedia, or the auxiliary event related to which the auxiliary content 210 is shown in the auxiliary content window 216.
[0108]
For delivering the main content, a manifest file may be provided to the user based on the request. Then, for delivering the auxiliary content, a second manifest file may be provided by the centralized server 108. The second manifest file may also be provided along with the first manifest file as an updated first manifest file, or as a sub-manifest file. This allows simultaneous streaming of both the main content and the auxiliary content. As may be known, the manifest file may refer to a list of multimedia files, or a playlist, which may comprise one or more uniform resource locators (URLs) for a content in different renditions for users/user devices, which may be stored in a content delivery network connected with the centralized server 108. Further, the auxiliary content 210, when provided as a widget, keeps polling the centralized server 108 or content delivery networks for updates. The centralized server 108 then keeps pushing the updates in the form of auxiliary content based on the received URLs in the manifest. Further, the auxiliary content window 216, in the form of the widget, may have different colours, and a colour of the widget may be changed to indicate to the user an event of being disconnected from the centralized server 108. Further, another colour may be used to indicate to the user an event of reconnecting to the centralized server 108. It would be appreciated by a person skilled in the art that the utilisation of a colour for providing an indication shall not be construed to be limiting, and other such techniques for providing an indication may be utilised.
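The widget behaviour described above, polling via URLs taken from a manifest and changing colour on disconnection and reconnection, may be sketched as follows; the manifest shape, the example URL, and the colour names are assumptions of this non-limiting illustration:

```python
class AuxiliaryWidget:
    """Sketch of the auxiliary content widget: it holds rendition URLs
    from a manifest and updates its colour on connection-state changes."""

    def __init__(self, manifest):
        self.urls = manifest.get("renditions", [])  # URLs per rendition
        self.colour = "neutral"

    def on_poll_result(self, connected):
        """Update the widget colour based on the latest poll outcome."""
        if not connected:
            self.colour = "red"    # indicate disconnection to the user
        elif self.colour == "red":
            self.colour = "green"  # indicate reconnection to the user
```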
[0109]
Further, in an implementation, the method encompasses providing, via the target content window, one or more indications related to the auxiliary content (210), based on at least one of a user preference and an update related to the auxiliary content (210). The one or more indications comprise at least one of one or more audio indications and one or more colour indications. Also, each notification from the one or more indications is one of a manually configurable notification and an automatically configurable notification. More specifically, as also disclosed above, the target content window may provide the auxiliary content, which may be an audio, a textual representation, or a visual representation. The target content window may provide a sound indication, a colour light indication, or a combination thereof for the auxiliary content, based on the set of user preferences, historical data, etc. For example, the indication may be provided in case of an event, or an update of the event, according to the auxiliary content.
[0110]
Further, the indication may be predefined or configurable by the user; for example, the user may configure that the indication of a particular update should be a sound or a light. For example, a user may configure that the indication should be a Sound X when a particular cricketer hits a boundary. In another example, the user may configure that a Y colour light should be indicated when a particular team crosses 50 runs.
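Such user-configurable indications may be sketched as a rule table; the rule fields are assumptions of this sketch, and the Sound X / Y-colour values merely mirror the examples above:

```python
# Hypothetical rule table mapping auxiliary-content events to indications.
INDICATION_RULES = [
    {"event": "boundary", "player": "Player A",
     "indication": ("sound", "Sound X")},
    {"event": "team_crosses_50",
     "indication": ("light", "Y colour")},
]


def indications_for(event):
    """Return the configured indications triggered by an event update."""
    out = []
    for rule in INDICATION_RULES:
        if rule["event"] != event.get("event"):
            continue
        if "player" in rule and rule["player"] != event.get("player"):
            continue
        out.append(rule["indication"])
    return out
```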
[0111]
In an implementation of the present disclosure, the method encompasses causing the target content window to be automatically adjusted based on a user preference for the auxiliary content (210). For instance, the target content window may have one or more dimensions which may be configurable based on the user preference for the auxiliary content (210). The target content window may be maximized on a particular user-preferred event provided by the second content data based on the auxiliary content. For example, the target overlay content is maximized in the event of a highlight, such as a boundary, being provided in the auxiliary content.
[0112]
In another implementation of the present disclosure, the auxiliary content and the auxiliary content window 216 may be personalised and tailored based on one or more specific sets of preferences of a user or a cohort of multiple users.
[0113]
FIG. 4 illustrates a flowchart depicting an example method 400 for concurrently providing an auxiliary content window during streaming of a main content, to be implemented in an exemplary computing device, in accordance with an exemplary implementation of the present subject matter. The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the aforementioned method, or an alternative method. Furthermore, method 400 may be implemented by a processing resource or computing device(s) through any suitable hardware, non-transitory machine-readable instructions, or a combination thereof.
[0114]
It may also be understood that method 400 may be performed by the programmed system 102 or computing device 200 as depicted in FIGS. 1 and 2 respectively. Furthermore, the method 400 may be executed based on instructions stored in a non-transitory computer readable medium, as will be readily understood. The non-transitory computer readable medium may include, for example, digital memories, magnetic storage media, such as one or more magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media. Although the method 400 is described below with reference to the computing device 200 as described above, other suitable systems for the execution of these methods can also be utilized. Additionally, implementation of this method is not limited to such examples.
[0115]
At block 402, a first content data associated with an auxiliary content may be obtained.
[0116]
At block 404, based on the first content data, an auxiliary content window may be generated.
[0117]
At block 406, the auxiliary content window may be caused to be displayed as a target content window.
[0118]
At block 408, a second content data associated with the auxiliary content may be obtained.
[0119]
At block 410, based on the second content data, the target content window may be caused to be displayed.
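Blocks 402 to 410 may be sketched end to end as follows; each block is passed in as a callable so that the control flow mirrors the flowchart without fixing any particular implementation, and the callable names are assumptions of this non-limiting illustration:

```python
def method_400(fetch_first, generate_window, display, fetch_second, update_window):
    """End-to-end sketch of blocks 402-410 of method 400."""
    first = fetch_first()                    # block 402: obtain first content data
    window = generate_window(first)          # block 404: generate auxiliary window
    display(window)                          # block 406: display as target window
    second = fetch_second()                  # block 408: obtain second content data
    display(update_window(window, second))   # block 410: display updated target window
    return window
```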
[0120]
Although examples for the present disclosure have been described in language specific to structural features and/or methods, it should be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed and explained as examples of the present disclosure.
We Claim:
1. A method (400) for concurrently providing an auxiliary content window (216) during streaming of a main content, the method (400) comprising:
obtaining (402) a first content data (212) associated with an auxiliary content (210);
generating (404) an auxiliary content window (216) based on the first content data (212);
causing to display (406) the auxiliary content window (216) as a target content window;
obtaining (408) a second content data (214) associated with the auxiliary content (210); and
causing to display (410) the target content window based on the second content data (214).
2. The method (400) as claimed in claim 1, wherein the auxiliary content (210) is based on one of a user’s historical data, a set of user preferences, one or more user profile attributes, a set of preferences related to a user cohort, and a combination thereof.
3. The method (400) as claimed in claim 2, wherein the one or more user profile attributes comprise at least one of a gender, geographical location, preferred language, or a combination thereof.
4. The method (400) as claimed in claim 1, wherein each of the first content data (212) and the second content data (214) comprises one or more content attributes associated with the auxiliary content (210), wherein the one or more content attributes of each of the first content data (212) and the second content data (214) comprises one of a text, an audio, a video, an image, and a combination thereof.
5. The method (400) as claimed in claim 4, further comprising:
receiving a user selection corresponding to at least one of the one or more content attributes associated with one of the first content data (212) and the second content data (214); and
based on the user selection and the respective first content data (212) and the second content data (214), generating the auxiliary content window (216).
6. The method (400) as claimed in claim 1, further comprising:
periodically obtaining the first content data (212) and the second content data (214) based on a pre-defined time interval.
7. The method (400) as claimed in claim 1, wherein the first content data (212) and the second content data (214) are obtained from one of a content server, a content repository, and a combination thereof.
8. The method (400) as claimed in claim 1, wherein pursuant to obtaining the first content data (212) and the second content data (214) associated with the auxiliary content (210), the method (400) further comprises:
processing the obtained first content data and the obtained second content data; and
based on the processed first content data and the processed second content data, generating the respective auxiliary content windows (216).
9. The method (400) as claimed in claim 1, further comprising:
based on the second content data (214), updating the auxiliary content window (216); and
causing to display the updated auxiliary content window (216) as the target content window.
10. The method (400) as claimed in claim 1, further comprising: causing to display the auxiliary content window (216) as the target content window on a display device (202).
11. The method (400) as claimed in claim 1, further comprising:
determining one or more vacant portions on a display device (202) during streaming of the main content; and
causing to display the auxiliary content window (216) on the determined one or more vacant portions.
12. The method (400) as claimed in claim 1, further comprising:
providing via the target content window, one or more indications related to the auxiliary content (210), based on at least one of a user preference and an update related to the auxiliary content (210), wherein:
the one or more indications comprise at least one of one or more audio indications and one or more colour indications, and
each notification from the one or more indications is one of a manually configurable notification and an automatically configurable notification.
13. The method (400) as claimed in claim 1, further comprising:
causing to automatically adjust the target content window based on a user preference for the auxiliary content (210).
14. The method (400) as claimed in claim 1, further comprising:
causing to swap a display of the target content window and the main content based on a pre-defined user action on the target content window.
15. A system (102) for concurrently providing an auxiliary content window (216) during streaming of a main content, the system (102) comprising:
a processor (104); and
a content retrieval unit (112) coupled to the processor (104), wherein the content retrieval unit (112) is to:
obtain a first content data (212) associated with an auxiliary content (210);
generate an auxiliary content window (216) based on the first content data (212);
cause to display the auxiliary content window (216) as a target content window;
obtain a second content data (214) associated with the auxiliary content (210); and
cause to display the target content window based on the second content data (214).
Dated this 29th day of August 2023
| # | Name | Date |
|---|---|---|
| 1 | 202321057976-STATEMENT OF UNDERTAKING (FORM 3) [29-08-2023(online)].pdf | 2023-08-29 |
| 2 | 202321057976-PROVISIONAL SPECIFICATION [29-08-2023(online)].pdf | 2023-08-29 |
| 3 | 202321057976-POWER OF AUTHORITY [29-08-2023(online)].pdf | 2023-08-29 |
| 4 | 202321057976-FORM 1 [29-08-2023(online)].pdf | 2023-08-29 |
| 5 | 202321057976-FIGURE OF ABSTRACT [29-08-2023(online)].pdf | 2023-08-29 |
| 6 | 202321057976-DRAWINGS [29-08-2023(online)].pdf | 2023-08-29 |
| 7 | 202321057976-Proof of Right [20-02-2024(online)].pdf | 2024-02-20 |
| 8 | 202321057976-ORIGINAL UR 6(1A) FORM 1 & 26-020524.pdf | 2024-05-06 |
| 9 | 202321057976-PA [18-06-2024(online)].pdf | 2024-06-18 |
| 10 | 202321057976-ASSIGNMENT DOCUMENTS [18-06-2024(online)].pdf | 2024-06-18 |
| 11 | 202321057976-8(i)-Substitution-Change Of Applicant - Form 6 [18-06-2024(online)].pdf | 2024-06-18 |
| 12 | 202321057976-FORM-5 [28-08-2024(online)].pdf | 2024-08-28 |
| 13 | 202321057976-FORM 18 [28-08-2024(online)].pdf | 2024-08-28 |
| 14 | 202321057976-ENDORSEMENT BY INVENTORS [28-08-2024(online)].pdf | 2024-08-28 |
| 15 | 202321057976-DRAWING [28-08-2024(online)].pdf | 2024-08-28 |
| 16 | 202321057976-CORRESPONDENCE-OTHERS [28-08-2024(online)].pdf | 2024-08-28 |
| 17 | 202321057976-COMPLETE SPECIFICATION [28-08-2024(online)].pdf | 2024-08-28 |
| 18 | Abstract 1.jpg | 2024-09-04 |
| 19 | 202321057976-RELEVANT DOCUMENTS [12-06-2025(online)].pdf | 2025-06-12 |
| 20 | 202321057976-FORM 13 [12-06-2025(online)].pdf | 2025-06-12 |
| 21 | 202321057976-FORM-26 [09-09-2025(online)].pdf | 2025-09-09 |
| 22 | 202321057976-ORIGINAL UR 6(1A) FORM 26-220925.pdf | 2025-09-25 |
| 23 | 202321057976-ORIGINAL UR 6(1A) FORM 26-031125.pdf | 2025-11-04 |