Abstract: Disclosed is a system (130) and a method (400) for recommending a video resolution to a user. The method comprises controlling the user device (110) to stream the video at a first resolution level, transmitting a control signal to set a bitrate for streaming the video based on network parameters during the streaming of the video, and receiving performance parameters of the video. The method further comprises reiteratively controlling an adjustment of a resolution of the video to multiple resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criterion. Furthermore, the method comprises identifying, as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels, and recommending to the user the identified video resolution to stream the video. FIG. 4
FORM 2
THE PATENTS ACT, 1970 (39 OF 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
SYSTEM AND METHOD FOR RECOMMENDING A VIDEO RESOLUTION TO A USER
Jio Platforms Limited, an Indian company, having registered address at Office -101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[0001] The embodiments of the present disclosure generally relate to the field of adaptive streaming in communication networks. More particularly, the present disclosure relates to a system and a method for recommending video resolution based on real-time network performance to enhance user experience.
BACKGROUND OF THE INVENTION
[0002] The subject matter disclosed in the background section should not be assumed or construed to be prior art merely due to its mention in the background section. Similarly, any problem statement mentioned in the background section or its association with the subject matter of the background section should not be assumed or construed to have been previously recognized in the prior art.
[0003] In the era of digital media, with the increased usage of smart devices, more users are accessing video content online. To this end, various Over-The-Top (OTT) and media content platforms are now prevalently used by the users for entertainment or educational purposes. The benefits associated with on-demand videos, including a vast array of available content, have revolutionized how people watch television shows, movies, live events, and educational materials.
[0004] To this end, the users expect high-quality video playback that is smooth and free from interruptions while streaming the video. Regardless of the location, i.e., at home, on the go, or in public spaces, the users expect consistent and reliable video quality and anticipate a seamless viewing experience, without dealing with the frustration of buffering delays, sudden drops in resolution, lags, and the like.
[0005] However, one of the primary issues faced by conventional video streaming services is the variability of network performance. Networks are susceptible to issues like fluctuating bandwidths, latency, and packet loss, which severely impact video playback quality. These issues manifest as buffering delays, reduced resolution, and increased latency, leading to a suboptimal viewing experience. The buffering delays refer to interruptions in the video playback while preloading media content, causing frustration among the users. The reduced resolution refers to a reduction in the number of pixels of the media content, which decreases the level of detail and clarity in the media content, thereby degrading the viewing experience of the users, while the latency affects real-time interaction with live streams or video conferencing applications. Thus, efforts have been made in the past to measure the network performance during the video streaming.
[0006] Heretofore, existing solutions to measure the network performance during the video streaming rely on generic network testing tools that do not adequately capture the specific challenges associated with video data transmission. Further, such network testing tools measure the overall bandwidth or the latency but fail to provide insights into how the overall bandwidth and the latency directly affect the quality of the video streaming. Additionally, traditional video testing methods fail to dynamically adapt to varying network conditions and do not provide the real-time feedback necessary for optimizing video playback performance.
[0007] Further, the existing solutions often involve testing the video quality by using multiple Uniform Resource Locators (URLs), each linked to different resolution levels of the same video content. The URLs are individually tested to evaluate the performance of various resolution levels under specific network conditions. However, this approach is inefficient as it requires additional time, bandwidth, and computational resources to retrieve and compare multiple versions of the same video. Moreover, switching between the URLs does not effectively account for real-time fluctuations in the network performance, leading to inaccurate assessments and suboptimal resolution recommendations.
[0008] In light of these limitations, there lies a need for a solution that facilitates recommending a video resolution to the users based on real-time network performance.
SUMMARY OF THE INVENTION
[0009] The following embodiments present a simplified summary in order to provide a basic understanding of some aspects of the disclosed invention. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
[0010] In an embodiment, disclosed herein is a method for recommending a video resolution to a user. The method includes receiving, by an acquisition module from a user device, an input including a Uniform Resource Locator (URL) of a video. The method further includes controlling, by a streaming module, in response to the input, the user device to stream the video at a first resolution level. Further, the method includes transmitting, by a processing module, a control signal to set a bitrate for streaming the video based on a plurality of network parameters during the streaming of the video. Further, the method includes receiving, by the acquisition module, one or more performance parameters of the video. The one or more performance parameters of the video are detected by the user device during the streaming of the video at the set bitrate. Furthermore, the method includes reiteratively controlling an adjustment of a resolution of the video to multiple resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criterion. Furthermore, the method includes identifying, by an identification module, as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels, and recommending, by a recommendation module to the user, the identified video resolution to stream the video.
[0011] In one or more implementations, the input is received by the acquisition module to perform one or more video tests on the video to identify the video resolution supported by a network.
[0012] In one or more implementations, the plurality of network parameters includes at least one of a bandwidth, a latency, and a packet loss during the streaming of the video.
[0013] In one or more implementations, the one or more performance parameters include at least one of a buffering duration, a rebuffering frequency, and a freezing ratio of the video.
[0014] In one or more implementations, the one or more performance parameters satisfy the threshold criteria when at least one of the following holds: the buffering duration is less than a specific duration, the rebuffering frequency is less than a specific frequency, or the freezing ratio is less than a specific ratio.
[0015] According to another aspect of the present disclosure, disclosed is a system for recommending a video resolution to a user. The system comprises an acquisition module, a streaming module, a processing module, an identification module, and a recommendation module. The acquisition module is configured to receive, from a user device, an input including a Uniform Resource Locator (URL) of a video. The streaming module is configured to control the user device to stream the video at a first resolution level in response to the input. The processing module is configured to transmit a control signal to set a bitrate for streaming the video based on a plurality of network parameters during the streaming of the video. The acquisition module is further configured to receive one or more performance parameters of the video. The one or more performance parameters of the video are detected by the user device during the streaming of the video at the set bitrate. The processing module is further configured to reiteratively control an adjustment of a resolution of the video to multiple resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criterion. The identification module is configured to identify, as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels. The recommendation module is configured to recommend to the user the identified video resolution to stream the video.
[0016] In one or more implementations, the acquisition module is configured to receive the input to perform one or more video tests on the video to identify the video resolution supported by a network.
[0017] According to yet another aspect of the present disclosure, disclosed herein is a user device that comprises a transceiver module and a computational module. The transceiver module is configured to receive, via a User Interface (UI), an input including a Uniform Resource Locator (URL) of a video for performing one or more video tests and receive, from a server, a first control signal to stream the video at a first resolution level. The transceiver module is further configured to receive, from the server, a second control signal to set a bitrate for streaming the video at the first resolution level. The computational module is configured to calculate one or more performance parameters of the video during the streaming of the video at the set bitrate. The transceiver module is further configured to reiteratively transmit the one or more performance parameters to the server and receive a control signal from the server to adjust a resolution of the video to one or more second resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criterion. The transceiver module is further configured to receive, from the server, a recommendation to stream the video at a specific resolution. The specific resolution is an immediate resolution level lower than a previously adjusted resolution level among the one or more second resolution levels.
[0018] In one or more implementations, the computational module is further configured to calculate, as the one or more performance parameters, a frequency of freezing of the video, a duration of the freezing of the video, and a freezing ratio during the streaming of the video.
BRIEF DESCRIPTION OF DRAWINGS
[0019] Various embodiments disclosed herein will become better understood from the following detailed description when read with the accompanying drawings. The accompanying drawings constitute a part of the present disclosure and illustrate certain non-limiting embodiments of inventive concepts disclosed herein. Further, components and elements shown in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. For consistency and ease of understanding, similar components and elements are annotated by reference numerals in the exemplary drawings.
[0020] FIG. 1 illustrates a block diagram depicting a network environment for recommending a video resolution to a user, in accordance with an embodiment of the present disclosure.
[0021] FIG. 2 illustrates a block diagram depicting a system architecture of a server, in accordance with an exemplary embodiment of the present disclosure.
[0022] FIG. 3 illustrates a block diagram depicting a system architecture of a user device, in accordance with an exemplary embodiment of the present disclosure.
[0023] FIG. 4 illustrates a flowchart depicting a method for recommending the video resolution to the user, in accordance with an embodiment of the present disclosure.
[0024] FIG. 5 illustrates a schematic block diagram depicting a computing system for recommending the video resolution to the user, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
[0025] Inventive concepts of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of one or more embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Further, the one or more embodiments disclosed herein are provided to describe the inventive concept thoroughly and completely, and to fully convey the scope of each of the present inventive concepts to those skilled in the art. Furthermore, it should be noted that the embodiments disclosed herein are not mutually exclusive concepts. Accordingly, one or more components from one embodiment may be tacitly assumed to be present or used in any other embodiment.
[0026] The following description presents various embodiments of the present disclosure. The embodiments disclosed herein are presented as teaching examples and are not to be construed as limiting the scope of the present disclosure. The present disclosure should in no way be limited to the illustrative implementations, drawings, and techniques illustrated below, including the exemplary design and implementation illustrated and described herein, but may be modified, omitted, or expanded upon without departing from the scope of the present disclosure.
[0027] The following description contains specific information pertaining to embodiments in the present disclosure. The detailed description uses the phrases “in some embodiments” or “some implementations”, which may each refer to one or more or all of the same or different embodiments or implementations. The term “some” as used herein is defined as “one, or more than one, or all.” Accordingly, the terms “one,” “more than one,” “more than one, but not all”, or “all” would all fall under the definition of “some.” In view of the same, the terms, for example, “in an embodiment” or “in an implementation” refer to one embodiment or one implementation, and the term, for example, “in one or more embodiments” refers to “at least one embodiment, or more than one embodiment, or all embodiments.” Further, the term, for example, “in one or more implementations” refers to “at least one implementation, or more than one implementation, or all implementations.”
[0028] The term “comprising”, when utilized, means “including, but not necessarily limited to;” it specifically indicates open-ended inclusion of the so-described one or more listed features or elements in a combination, unless otherwise stated with limiting language. Furthermore, to the extent that the terms “include,” “has,” “have,” “contains,” and other similar words are used in the detailed description, such terms are intended to be inclusive in a manner similar to the term “comprising”.
[0029] In the following description, for the purposes of explanation, various specific details are set forth to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features.
[0030] The description provided herein discloses exemplary embodiments only and is not intended to limit the scope, applicability, or configuration of the present disclosure. Rather, the foregoing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing any of the exemplary embodiments. Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it may be understood by one of ordinary skill in the art that the embodiments disclosed herein may be practiced without these specific details.
[0031] The terminology used herein is for the purpose of describing embodiments only and is not intended to be limiting of the disclosure. As used herein in the description, the singular forms "a", "an", and "the" include plural forms unless the context of the invention indicates otherwise.
[0032] The terminology and structure employed herein are for describing, teaching, and illuminating some embodiments and their specific features and elements and do not limit, restrict, or reduce the scope of the present disclosure. Accordingly, unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having ordinary skill in the art.
[0033] An object of the present disclosure is to provide a system and a method for performing real-time video tests to recommend an optimal video resolution to a user.
[0034] Another object of the present disclosure is to provide a system and a method that ensures that the user may seamlessly switch to a video resolution that provides a buffer-less video experience by dynamically adjusting video quality in response to network conditions.
[0035] Yet another object of the present disclosure is to provide a system and a method that continuously monitors and adjusts video resolution levels to prevent excessive buffering, stalling, or freezing, thereby improving user satisfaction.
[0036] Thus, the present disclosure provides the system and the method that performs video testing to determine the optimal video resolution supported by a network. The method involves initiating a video test upon receiving a user input and using a single video source with multiple resolutions and bitrates to facilitate an adaptive streaming. By assessing the network conditions including available bandwidth, a latency, and a packet loss, an initial adaptive bitrate is selected that ensures smooth playback. As the video streams, performance metrics such as the buffering, the stalling, and the freezing are continuously monitored. The present disclosure ultimately determines the optimal video resolution the network may support and communicates the optimal video resolution to the user, thereby enhancing video streaming experience.
[0037] Several key terms used in the description play pivotal roles in facilitating the system functionality. In order to facilitate an understanding of the description, the key terms are defined below.
[0038] The “adaptive streaming” in the entire disclosure may refer to a video delivery technique that dynamically adjusts the quality of a video stream in real-time based on an available network bandwidth and device capabilities. The adaptive streaming is used to ensure a smooth transition between different quality levels by continuously monitoring the network conditions and selecting the most suitable resolution.
[0039] The “video resolution” in the entire disclosure may refer to a number of pixels displayed in each frame of the video, which determines clarity of the video. The higher resolutions, such as 1080p or 4K, provide better visual quality but require more bandwidth.
[0040] The “optimal video resolution” in the entire disclosure may refer to a highest video resolution that may be sustained without the buffering, ensuring an uninterrupted viewing experience based on available network conditions at any given time. The optimal video resolution is determined through an adaptive streaming process that dynamically evaluates and selects the best possible quality level within a short time frame.
[0041] A “Uniform Resource Locator (URL)” in the entire disclosure may refer to a web address that specifies a location of a resource, such as the video, on the internet. The URL typically includes the protocol, such as Hypertext Transfer Protocol or Hypertext Transfer Protocol Secure (HTTP/HTTPS), the domain name, and the file path to access the resource.
[0042] A “bitrate” in the entire disclosure may refer to an amount of data transmitted per second during the video streaming, usually measured in kilobits per second (kbps) or megabits per second (Mbps). A higher bitrate generally results in better video quality but requires more network bandwidth.
[0043] The “buffering” in the entire disclosure may refer to a process of preloading video data into memory before the playback to ensure smooth streaming. When the video is buffered, a portion of the video is temporarily stored to prevent interruptions due to network fluctuations.
[0044] A “rebuffering” in the entire disclosure may refer to a process which occurs when the video playback pauses due to insufficient buffered data, requiring additional data to be loaded before resuming the playback. The rebuffering is usually caused by slow network speeds or high bandwidth demands.
[0045] The “stalling” in the entire disclosure may refer to an unintended pause in the video playback due to a depletion of buffered content. The stalling is a user-perceived interruption that occurs when the video streaming cannot keep up with playback speed.
[0046] A “freezing ratio” in the entire disclosure may refer to a metric that quantifies a proportion of time the video remains frozen during the video playback due to network issues and is calculated as a total freezing duration divided by a total video playback duration and is expressed as a percentage. A lower freezing ratio indicates a smoother streaming experience.
[0047] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings. FIG. 1 through FIG. 5, discussed below, and the one or more embodiments used to describe the principles of the present disclosure are by way of illustration only and should not be construed in any way to limit the scope of the present disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged system or device.
[0048] FIG. 1 illustrates a block diagram depicting a network environment 100 for recommending the video resolution to the user, in accordance with an embodiment of the present disclosure.
[0049] The embodiment of the network environment 100 shown in FIG. 1 is for illustration only. Other embodiments of the network environment 100 may be used without departing from the scope of this disclosure.
[0050] As shown in FIG. 1, the network environment 100 may include a user device 110, a network 120, a server 130, and a database 150. The server 130 communicates with each of the user device 110, and the database 150 via the network 120.
[0051] The user device 110 may include a User Interface (UI) 110-1, a communication unit 110-2, a processor 110-3 (may also be referred to as “one or more primary processors 110-3”) and a memory 110-4. The user may initiate a video testing process via the UI 110-1. The communication unit 110-2 within the user device 110 may enable communication of the user device 110 with the server 130 for data exchange. In one or more embodiments, one or more applications may be installed on the user device 110 to communicate with the server 130. The processor 110-3 may execute operating system instructions stored in the memory 110-4 in order to control the overall operation of the user device 110.
[0052] The processor 110-3 is configured to execute programs and instructions stored in the memory 110-4. The processor 110-3 is further configured to move data into or out of the memory 110-4 as required by an executing process. The processor 110-3 may also be configured to execute the one or more applications based on an operating system or in response to signals received from the gNBs or an operator. The processor 110-3 may also be coupled to an I/O interface, which provides the user device 110 with an ability to connect to other devices, such as laptop computers and handheld computers. The I/O interface may act as a communication path between the above-described user device components and the processor 110-3. The processor 110-3 may include an intelligent hardware device including a general-purpose processor, such as, for example, and without limitation, a Central Processing Unit (CPU), an Application Processor (AP), a dedicated processor, or the like, a graphics-only processing unit such as a Graphics Processing Unit (GPU), a microcontroller, a Field-Programmable Gate Array (FPGA), a programmable logic device, a discrete hardware component, or any combination thereof.
[0053] The memory 110-4 may include any type of computer-readable medium usable by a computer or the processor 110-3, such as a Read-Only Memory (ROM), a Random-Access Memory (RAM), a flash memory, a removable storage drive, a Hard Disc Drive (HDD), a solid-state memory, a magnetic storage drive, a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), and/or an Electrically Erasable Programmable Read-Only Memory (EEPROM). In an aspect, for example, the memory 110-4 may be a non-transitory computer-readable storage medium that stores one or more computer-executable codes or instructions.
[0054] The user device 110 may include, but is not limited to, smartphones, tablets, laptops, desktop computers, Personal Digital Assistants (PDAs), smartwatches, or any other computing device capable of network connectivity. Further, the present disclosure is applicable to 4th Generation (4G), 5th Generation (5G), 6th Generation (6G), or any future mobile communication standards, including scenarios where a Set-Top Box (STB) is connected to the 4G network through Customer Premises Equipment (CPE), or where the 5G and the 6G networks are accessed via Fixed Wireless Access (FWA) solutions, covering various bands and carriers used by telecommunications operators. The CPE is equipment kept at a user's physical location rather than on a service provider's premises. Further, the FWA refers to a high-speed broadband internet connection technology that uses wireless signals to provide internet services to fixed locations. The user may include, but is not limited to, an end consumer accessing a video content on a personal device, a field engineer testing network conditions for quality assessment, a network administrator monitoring video streaming performance across different locations, or an automated system making data-driven decisions for optimizing the video resolution.
[0055] The network 120 enables communication between components of the network environment 100. The network 120 may include suitable logic, circuitry, and interfaces that may be configured to provide several network ports and several communication channels for transmission and reception of data related to operations of various entities of the network environment 100. Each network port may correspond to a virtual address (or a physical machine address) for transmission and reception of the communication data. For example, the virtual address may be an Internet Protocol version 4 (IPv4) address (or an IPv6 address) and the physical address may be a Media Access Control (MAC) address. The network 120 may be associated with an application layer for implementation of communication protocols based on one or more communication requests from the various entities of the network environment 100. The communication data may be transmitted or received via the communication protocols. Examples of the communication protocols may include, but are not limited to, Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Domain Name System (DNS) protocol, Common Management Interface Protocol (CMIP), Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof. In some aspects of the present disclosure, the communication data may be transmitted or received via at least one communication channel of several communication channels in the network 120. The communication channels may include, but are not limited to, a wireless channel, a wired channel, or a combination of wireless and wired channels thereof. The wireless or wired channel may be associated with a data standard which may be defined by one of a Local Area Network (LAN), a Personal Area Network (PAN), a Wireless Local Area Network (WLAN), a Wireless Sensor Network (WSN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), a satellite network, the Internet, an optical fiber network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof. Aspects of the present disclosure are intended to include or otherwise cover any type of communication channel, including known, related art, and/or later developed technologies.
[0056] The server 130 (hereinafter also referred to as the “system 130”) functions as a central entity responsible for storing, processing, and delivering the video content to the user device 110 over the network 120. The server 130 utilizes adaptive streaming techniques to dynamically adjust the video quality based on the network conditions. The server 130 responds to requests from the user device 110, processes bandwidth availability, and ensures seamless data transmission. The server 130 facilitates communication between the user device 110 and the network 120, ensuring efficient content delivery.
[0057] The server 130 may be a network of computers, a software framework, or a combination thereof, that may provide a generalized approach to create a server implementation. Examples of the server 130 may include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems. The server 130 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a Personal Home Page (PHP) framework, or any web-application framework.
[0058] The server 130 may include a processor 140 (may also be referred to as “one or more processors 140”), a communication interface 142, and a memory 144. The processor 140 may include various processing circuitry/modules and communicates with the memory 144 and the communication interface 142. The processor 140 is configured to execute the instructions stored in the memory 144 and to perform various processes. The processor 140 may include the intelligent hardware device including the general-purpose processor, such as, for example, and without limitation, the CPU, the AP, the dedicated processor, or the like, the graphics-only processing unit such as the GPU, the microcontroller, the FPGA, the programmable logic device, the discrete hardware component, or any combination thereof.
[0059] The communication interface 142 may be configured to enable the server 130 to communicate with various entities of the network environment 100 via the network 120. Examples of the communication interface 142 may include, but are not limited to, a modem, a network interface such as an Ethernet card, a communication port, and/or a Personal Computer Memory Card International Association (PCMCIA) slot and card, an antenna, a Radio Frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Coder-Decoder (CODEC) chipset, a Subscriber Identity Module (SIM) card, and a local buffer circuit. It will be apparent to a person of ordinary skill in the art that the communication interface 142 may include any device and/or apparatus capable of providing wireless or wired communications between the server 130 and various other entities of the network environment 100.
[0060] The memory 144 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the processor 140 for executing various operations. Examples of the memory 144 may include but are not limited to, the ROM, the RAM, the flash memory, the removable storage drive, the HDD, the solid-state memory, the magnetic storage drive, the PROM, the EPROM, and/or the EEPROM.
[0061] The database 150 may store information related to historical network performance data, including bandwidth availability, the latency, and packet loss statistics for different users and locations. In an implementation, the database 150 may store user preferences, past streaming behaviors, and device capabilities to personalize the recommendations. The information may be accessed and updated by both the user device 110 and the server 130 as part of the video testing and recommendation process. The database 150 may correspond to, but is not limited to, a relational database, a non-relational database, or an in-memory database depending on implementation requirements.
[0062] Although FIG. 1 illustrates one example of the network environment 100, various changes may be made to FIG. 1. For example, the network environment 100 may include any number of user devices and servers in any suitable arrangement. Further, in another example, the network environment 100 may include any number of components in addition to the components shown in FIG. 1. Further, various components in FIG. 1 may be combined, further subdivided, or omitted and additional components may be added according to particular needs.
[0063] FIG. 2 illustrates a block diagram depicting a system architecture of the server 130, in accordance with an exemplary embodiment of the present disclosure.
[0064] The server 130 may include the processor 140, the communication interface 142, the memory 144, processing modules 202, and an Input-Output (I/O) interface 204. Each of the components of the server 130 is communicatively coupled to each other via a first communication bus 206.
[0065] The processing modules 202 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the server 130. In non-limiting examples, described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing module(s) 202 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor 140 may execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing modules 202. In such examples, the server 130 may also comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the server 130 and the processing resource. In other examples, the processing modules 202 may be implemented using electronic circuitry.
[0066] The I/O interface 204 may include suitable logic, circuitry, interfaces, and/or codes that may be configured to receive input(s) and present (or display) output(s) on the server 130. For example, the I/O interface 204 may have an input interface (not shown) and an output interface (not shown). The input interface may be configured to enable the user to provide input(s) to trigger (or configure) the server 130 for performing data processing operation(s) for identifying the video resolution. Examples of the input interface may include, but are not limited to, a touch interface, a mouse, a keyboard, a motion recognition unit, a gesture recognition unit, a voice recognition unit, or the like. The output interface may be configured to display (or present) output(s) generated (or provided) by the server 130 such as, but not limited to, the video resolution supported by the network. In some aspects of the present disclosure, the output interface may provide the output(s) based on an instruction provided by the user of the server 130, by way of the input interface. Examples of the output interface may include, but are not limited to, a digital display, an analog display, a touch screen display, an appearance of a desktop, and/or illuminated characters. Aspects of the present disclosure are intended to include or otherwise cover any type of the input interface and output interface in the I/O interface 204, including known, related art, and/or later developed technologies without deviating from the scope of the present disclosure.
[0067] According to an example embodiment, the processing modules 202 may include an acquisition module 210, a streaming module 212, a processing module 214, an identification module 216, and a recommendation module 218. Various modules of the processing modules 202 may be communicatively coupled to each other by way of a second communication bus 208.
[0068] Referring to FIG. 2, the acquisition module 210 is configured to receive, from the user device 110, the input including the URL of the video. The streaming module 212 is configured to control the user device 110 to stream the video at a first resolution level in response to the input. The processing module 214 is configured to transmit a control signal to set the bitrate for streaming the video based on a plurality of network parameters during the streaming of the video. The acquisition module 210 is further configured to receive one or more performance parameters of the video. The one or more performance parameters of the video are detected by the user device 110 during the streaming of the video at the set bitrate. The processing module 214 is further configured to reiteratively control the adjustment of the resolution of the video to multiple resolution levels higher than the first resolution level until the performance parameters of the video violate a threshold criterion. The identification module 216 is configured to identify, as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels. The recommendation module 218 is configured to recommend, to the user via the UI 110-1, the identified video resolution to stream the video.
[0069] Although FIG. 2 illustrates one example of the server 130, various changes may be made to FIG. 2. Further, the server 130 may include any number of components in addition to those shown in FIG. 2, without deviating from the scope of the present disclosure. Further, various components in FIG. 2 may be combined, further subdivided, or omitted and additional components may be added according to particular needs.
[0070] FIG. 3 illustrates a block diagram depicting a system architecture of the user device 110, in accordance with an exemplary embodiment of the present disclosure.
[0071] The user device 110 may include the UI 110-1, the communication unit 110-2, the processor 110-3 (may also be referred to as “one or more primary processors 110-3”), the memory 110-4, and the processing modules 300. Each of the components of the user device 110 is communicatively coupled to each other via a first communication bus 302.
[0072] The processing modules 300 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the user device 110. In non-limiting examples, described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing module(s) 300 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor 110-3 may execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing modules 300. In such examples, the user device 110 may also comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the user device 110 and the processing resource. In other examples, the processing modules 300 may be implemented using electronic circuitry.
[0073] According to an example embodiment, the processing modules 300 may include a transceiver module 110-5, and a computational module 110-6. Various modules of the processing modules 300 may be communicatively coupled to each other by way of a second communication bus 304.
[0074] Referring to FIG. 3, the transceiver module 110-5 is configured to receive the input including the URL of the video for performing the video tests and receive, from the server 130, the first control signal to stream the video at the first resolution level. The transceiver module 110-5 is further configured to receive, from the server 130, a second control signal to set the bitrate for streaming the video at the first resolution level. The computational module 110-6 is configured to calculate the performance parameters of the video during the streaming of the video at the set bitrate. The transceiver module 110-5 is further configured to reiteratively transmit the performance parameters to the server 130 and receive the control signal from the server 130 to adjust the resolution of the video to second resolution levels higher than the first resolution level until the performance parameters of the video violate the threshold criterion. The transceiver module 110-5 is further configured to receive, from the server 130, the recommendation to stream the video at a specific resolution. The specific resolution is the immediate resolution level lower than the previously adjusted resolution level among the second resolution levels.
[0075] The computational module 110-6 is further configured to calculate a frequency of freezing of the video, a duration of the freezing of the video, and the freezing ratio during the streaming of the video. The freezing ratio represents a percentage of the total video streaming time that is affected by video freezing. For instance, if the total video streaming duration is 10 minutes (600 seconds), the number of freezing events is 5, and the duration of each freezing event is 2.4 seconds, then the total freezing duration is 12 seconds (5 × 2.4). The freezing ratio is then calculated as (12/600) × 100, which is equivalent to 2%.
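In a non-limiting illustration, the above calculation may be sketched as follows; the function name and inputs below are illustrative assumptions used only to restate the arithmetic of this example and are not part of the disclosed modules:

```python
def freezing_ratio(freeze_durations_sec, total_playback_sec):
    """Return the freezing ratio as a percentage of the total playback time."""
    total_freezing_sec = sum(freeze_durations_sec)
    return (total_freezing_sec / total_playback_sec) * 100.0

# Example from the paragraph above: 5 freezing events of 2.4 seconds each
# during a 600-second (10-minute) stream -> 12 seconds frozen -> 2%.
print(freezing_ratio([2.4] * 5, 600))  # 2.0
```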
[0076] FIG. 4 illustrates a flowchart depicting the method 400 for recommending the video resolution to the user, in accordance with an embodiment of the present disclosure. The method 400 comprises a series of operation steps indicated by blocks 402 through 414. The method 400 starts at block 402. The user initiates the video test through the UI 110-1 on the user device 110.
[0077] At block 402, the acquisition module 210 may receive, from the user device 110, the input including the URL of the video for performing the video tests. The acquisition module 210 may receive the input when the user selects the video on his user device 110 to play on a streaming platform. In a non-limiting example, the user may open a video streaming application on a smartphone, smart TV, or a web browser and select a 4K video from an online library. The acquisition module 210 receives the video URL and forwards the URL to the streaming module 212 for initiating the video playback.
[0078] In one or more embodiments, upon receiving the video URL, the acquisition module 210 may first identify all available resolution levels supported for the received video. In a non-limiting example, the acquisition module 210 may retrieve metadata from the streaming platform corresponding to the received video to identify all the available resolution levels supported for the received video.
[0079] At block 404, the streaming module 212 may control the user device 110 to begin streaming the video at the first resolution level. In an implementation, the first resolution level may also be pre-determined based on, but not limited to, default settings, historical user preferences, or the network conditions. In a non-limiting example, the streaming module 212 may initiate the video playback at 1080p resolution, assuming that the user’s past viewing history and current network bandwidth support the said resolution.
[0080] At block 406, the processing module 214 may transmit the control signal to set the bitrate for streaming the video based on the plurality of network parameters, such as the bandwidth, the latency, and the packet loss during the streaming of the video. In an implementation, the user device 110 continuously monitors the plurality of network parameters during the streaming of the video. The user device 110 periodically transmits measured network parameters to the processing module 214. The processing module 214 then transmits the control signal to set the bitrate for streaming the video based on the plurality of network parameters. The selected bitrate may ensure a balance between the video quality and network efficiency. In a non-limiting example, if the network bandwidth is 10 Mbps, the latency is 30 ms, and the packet loss is 0.5%, the processing module 214 may transmit the control signal to set the bitrate to 5 Mbps to avoid excessive buffering.
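By way of a non-limiting sketch, one possible bitrate-selection heuristic consistent with the above example is shown below; the back-off factors, latency and loss thresholds, and the function name are illustrative assumptions and do not represent the exact logic of the processing module 214:

```python
def select_bitrate_kbps(bandwidth_kbps, latency_ms, packet_loss_pct):
    """Pick a conservative streaming bitrate from measured network parameters.

    Illustrative heuristic: start from roughly half of the measured bandwidth
    to leave headroom, and back off further when latency or packet loss is high.
    """
    bitrate = 0.5 * bandwidth_kbps
    if latency_ms > 100:        # assumed high-latency threshold
        bitrate *= 0.8
    if packet_loss_pct > 1.0:   # assumed high-loss threshold
        bitrate *= 0.7
    return int(bitrate)

# Example from the paragraph above: 10 Mbps bandwidth, 30 ms latency, and
# 0.5% packet loss yield roughly 5 Mbps (5000 kbps).
print(select_bitrate_kbps(10_000, 30, 0.5))  # 5000
```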
[0081] At block 408, the acquisition module 210 may receive the performance parameters of the video. The performance parameters of the video such as the buffering duration, the rebuffering frequency, and the freezing ratio are detected by the user device 110 during the streaming of the video at the set bitrate. In an implementation, the user device 110 continuously detects the performance parameters during the video playback and transmits corresponding data related to the performance parameters to the acquisition module 210. The acquisition module 210 analyzes the received performance parameters. These performance parameters may help determine whether the current video resolution is sustainable under the prevailing network conditions. In a non-limiting example, if the buffering duration has exceeded 3 seconds and the rebuffering frequency is 3 times within a minute, the acquisition module 210 may identify a degradation in video streaming quality.
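A minimal, non-limiting sketch of such a threshold check is given below; the specific limits (3 seconds of buffering and 2 rebuffering events per minute, taken from the examples in this description, and a 5% freezing ratio, which is an additional assumption) are illustrative defaults rather than mandated values:

```python
def violates_threshold(buffering_duration_s, rebuffering_per_min, freezing_ratio_pct,
                       max_buffering_s=3.0, max_rebuffering_per_min=2,
                       max_freezing_ratio_pct=5.0):
    """Return True when any monitored performance parameter breaches its limit."""
    return (buffering_duration_s >= max_buffering_s
            or rebuffering_per_min >= max_rebuffering_per_min
            or freezing_ratio_pct >= max_freezing_ratio_pct)

# Example from the paragraph above: buffering beyond 3 seconds together with
# three rebuffering events per minute indicates degraded streaming quality.
print(violates_threshold(3.2, 3, 1.5))  # True
```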
[0082] At block 410, the processing module 214 may reiteratively control the adjustment of the resolution level of the video while monitoring the performance parameters of the video. The processing module 214 may attempt resolution levels higher than the first resolution level until the performance parameters violate the threshold criterion. In a non-limiting example, if the video was initially streaming at 1080p resolution, the processing module 214 may increase the resolution to 1440p. However, after switching the resolution to 1440p, the buffering duration increases to 3.5 seconds, exceeding the threshold criterion, which requires the buffering duration to be less than 3 seconds. The rebuffering now occurs three times per minute, exceeding the threshold criterion, which requires the rebuffering frequency to be less than 2 times per minute.
[0083] Since the performance parameters exceed their respective threshold criteria, the processing module 214 may determine that the 1440p resolution is unsustainable under current network conditions.
[0084] At block 412, the identification module 216 may identify the immediate resolution level lower than the previously adjusted resolution level among the multiple resolution levels. If the identification module 216 detects that a lower resolution level provides better playback stability than a previously attempted higher resolution, the identification module 216 finalizes the lower resolution for continued video streaming.
In a non-limiting example, if the identification module 216 identifies that 1080p is the immediate resolution level satisfying the threshold criterion and provides more stable video playback than the 1440p resolution, which is the previously adjusted resolution level, then the identification module 216 identifies 1080p as the optimal video resolution for the user.
[0085] At block 414, the recommendation module 218 may present the identified video resolution to the user. In an implementation, the user may be given an option to override the video recommendation manually, or the system may automatically adjust the video playback to the recommended video resolution. In a non-limiting example, a pop-up notification on the video player may inform the user: “Your network conditions support a stable 1080p resolution. Switching to 1080p resolution for a smoother playback.” In another non-limiting example, if automatic adjustment is enabled, the system directly switches the video resolution to the 1080p resolution without user intervention.
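To summarize blocks 402 through 414, the reiterative resolution adjustment may be sketched, in a non-limiting manner, as the following loop. The resolution ladder and the two callables (one that streams at a given level and reports measured performance parameters, and one that applies a threshold check such as the sketch above) are illustrative assumptions rather than the literal implementation of the modules 210 to 218:

```python
def recommend_resolution(resolution_ladder, stream_and_measure, violates_threshold):
    """Step up a resolution ladder until playback degrades, then return the
    last resolution level that streamed within the threshold criteria.

    resolution_ladder   ordered levels, e.g. ["720p", "1080p", "1440p", "2160p"]
    stream_and_measure  callable: level -> dict of measured performance parameters
    violates_threshold  callable: dict of performance parameters -> bool
    """
    recommended = resolution_ladder[0]
    for level in resolution_ladder:
        params = stream_and_measure(level)   # blocks 404-408: stream and monitor
        if violates_threshold(params):       # block 410: threshold check fails
            break                            # this level is unsustainable
        recommended = level                  # block 412: last sustainable level
    return recommended                       # block 414: recommended to the user
```

In this sketch, the loop exits at the first level whose measured parameters violate the criteria, so the returned value corresponds to the immediate lower resolution level described at block 412.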
[0086] FIG. 5 illustrates a schematic block diagram of a computing system 500 for recommending the video resolution to the user, in accordance with an embodiment of the present disclosure.
[0087] The computing system 500 includes a network 502, a network interface 504, a processor 506 (similar in functionality to the processor 140 of FIG. 1), an Input/Output (I/O) interface 508 and a non-transitory computer readable storage medium 510 (hereinafter may also be referred to as the “storage medium 510” or the “storage media 510”).
[0088] The network interface 504 includes wireless network interfaces such as Bluetooth, Wi-Fi, Worldwide Interoperability for Microwave Access (WiMAX), General Packet Radio Service (GPRS), or Wideband Code Division Multiple Access (WCDMA) or wired network interfaces such as Ethernet, Universal Serial Bus (USB), or Institute of Electrical and Electronics Engineers-864 (IEEE-864).
[0089] The processor 506 may include various processing circuitry/modules and communicate with the storage medium 510 and the I/O interface 508. The processor 506 is configured to execute instructions stored in the storage medium 510 and to perform various processes. The processor 506 may include an intelligent hardware device including a general-purpose processor, such as, for example, and without limitation, the CPU, the AP, the dedicated processor, or the like, the graphics-only processing unit such as the GPU, the microcontroller, the FPGA, the programmable logic device, the discrete hardware component, or any combination thereof. The processor 506 may be configured to execute computer-readable instructions 510-1 stored in the storage medium 510 to cause the server 130 to perform various functions.
[0090] The storage medium 510 stores a set of instructions i.e., computer program instructions 510-1 (hereinafter may also be referred to as instructions 510-1) required by the processor 506 for controlling its overall operations.
[0091] The storage media 510 may include an electronic storage medium, a magnetic storage medium, an optical storage medium, a quantum storage medium, or the like. For example, the storage media 510 may include, but are not limited to, hard drives, floppy diskettes, optical disks, ROMs, RAMs, EPROMs, EEPROMs, flash memory, magnetic or optical cards, solid-state memory devices, or other types of physical media suitable for storing electronic instructions. In one or more embodiments, the storage media 510 includes a Compact Disk-Read Only Memory (CD-ROM), a Compact Disk-Read/Write (CD-R/W), and/or a Digital Video Disc (DVD).
[0092] In one or more implementations, the storage medium 510 stores computer program code configured to cause the computing system 500 to perform at least a portion of the processes and/or methods. Accordingly, in at least one implementation, the computing system 500 performs the method for recommending the video resolution to the user.
[0093] Embodiments of the present disclosure have been described above with reference to flowchart illustrations of methods and systems according to embodiments of the disclosure, and/or procedures, algorithms, steps, operations, formulae, or other computational depictions, which may also be implemented as computer program products. In this regard, each block or step of the flowchart, and combinations of blocks (and/or steps) in the flowchart, as well as any procedure, algorithm, step, operation, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code. As will be appreciated, any such computer program instructions may be executed by one or more computer processors, including without limitation a general-purpose computer or special purpose computer, or other programmable processing apparatus to perform a group of operations comprising the operations or blocks described in connection with the disclosed method.
[0094] Further, these computer program instructions, such as embodied in computer-readable program code, may also be stored in one or more computer-readable memory or memory devices (for example, the memory 144 or the storage medium 510) that can direct a computer processor or other programmable processing apparatus to function in a particular manner, such that the instructions 510-1 stored in the computer-readable memory or memory devices produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s).
[0095] It will further be appreciated that the term “computer program instructions” as used herein refers to one or more instructions that can be executed by the one or more processors (for example, the processor 140 or the processor 506) to perform one or more functions as described herein. The instructions 510-1 may also be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely.
[0096] Now, referring to the technical abilities and advantageous effect of the present disclosure, operational advantages that may be provided by one or more embodiments may include that the system and the method dynamically adjusts the video resolution based on real-time network conditions and the performance parameters, thereby ensuring that the users experience high-quality video while maintaining smooth video playback without excessive buffering.
[0097] Further, the system utilizes a single video URL for the video streaming, eliminating a need to switch between multiple URLs with different resolution levels. This approach significantly reduces bandwidth consumption and computational overhead while enabling real-time adaptation of the video resolution based on the network conditions. By dynamically adjusting the resolution within a single stream, the invention ensures a seamless viewing experience without the interruptions caused by buffering delays or manual URL switching.
[0098] The system prevents unnecessary high-resolution streaming in conditions where the network cannot support the high-resolution streaming, thereby reducing data consumption, which is crucial for mobile networks or the users with limited data plans, ensuring efficient use of available bandwidth. Further, a minimization of freezing events ensures a seamless viewing experience, particularly in live streaming and video-on-demand services.
[0099] Furthermore, the system ensures a consistent video streaming experience across multiple streaming platforms. Streaming service providers may optimize server load and Content Delivery Network (CDN) resources by adjusting the video resolution dynamically. This reduces network congestion, improves streaming efficiency, and enhances overall service quality.
[0100] Those skilled in the art will appreciate that the methodology described herein in the present disclosure may be carried out in other specific ways than those set forth herein in the above disclosed embodiments without departing from essential characteristics and features of the present invention. The above-described embodiments are therefore to be construed in all aspects as illustrative and not restrictive.
[0101] Further, using the embodiments described above, a performance metrics report may be generated faster and human involvement may be reduced, thereby saving time and resources and enabling faster detection and resolution of bottlenecks in the network.
[0103] The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Any combination of the above features and functionalities may be used in accordance with one or more embodiments.
[0104] In the present disclosure, each of the embodiments has been described with reference to numerous specific details which may vary from embodiment to embodiment. The foregoing description of the specific embodiments disclosed herein reveals the general nature of the embodiments so fully that others may, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications are intended to be comprehended within the meaning of the disclosed embodiments. It is to be understood that the phraseology and terminology employed herein are for the purpose of description and not of limitation.
LIST OF REFERENCE NUMERALS
[0105] The following list is provided for convenience and in support of the drawing figures and as part of the text of the specification, which describe innovations by reference to multiple items. Items not listed here may nonetheless be part of a given embodiment. For better legibility of the text, a given reference number is recited near some, but not all, recitations of the referenced item in the text. The same reference number may be used with reference to different examples or different instances of a given item. The list of reference numerals is:
100 - Network environment
110 - User device
110-1 - User Interface (UI) of the user device 110
110-2 - Communication unit of the user device 110
110-3 - Processor of the user device 110
110-4 - Memory of the user device 110
110-5 - Transceiver module of the user device 110
110-6 - Computational module of the user device 110
120 - Network
130 - Server
140 - Processor of the server 130
142 - Communication interface of the server 130
144 - Memory of the server 130
150 - Database
202 - Processing modules of the server 130
204 - I/O Interface of the server 130
206 - First communication bus of the server 130
208 - Second communication bus of the server 130
210 - Acquisition module
212 - Streaming module
214 - Processing module
216 - Identification module
218 - Recommendation module
220 - Determination module
300 - Processing modules of the user device 110
302 - First communication bus of the user device 110
304 - Second communication bus of the user device 110
400 - Method for recommending a video resolution
402-414 - Operation steps to perform the method 400
500 - Computing system
502 - Network
504 - Network Interface
506 - Processor
508 - I/O Interface
510 - Non-transitory computer-readable storage medium
510-1 - Instructions
CLAIMS:
We Claim:
1. A method (400) for recommending a video resolution to a user, the method comprising:
receiving, by an acquisition module (210) from a user device (110), an input including a Uniform Resource Locator (URL) of a video;
controlling, by a streaming module (212), in response to the input, the user device to stream the video at a first resolution level;
transmitting, by a processing module (214), a control signal to set a bitrate for streaming the video based on a plurality of network parameters during the streaming of the video;
receiving, by the acquisition module (210), one or more performance parameters of the video, wherein the one or more performance parameters of the video are detected by the user device (110) during the streaming of the video at the set bitrate;
reiteratively controlling, by the processing module (214), an adjustment of a resolution of the video to multiple resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criteria;
identifying, by an identification module (216), as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels; and
recommending, by a recommendation module (218) to the user, the identified video resolution to stream the video.
2. The method (400) as claimed in claim 1, wherein the input is received by the acquisition module (210) to perform one or more video tests on the video to identify the video resolution supported by a network.
3. The method (400) as claimed in claim 1, wherein the plurality of network parameters includes at least one of a bandwidth, a latency, and a packet loss during the streaming of the video.
4. The method (400) as claimed in claim 1, wherein the one or more performance parameters includes at least one of a buffering duration, a rebuffering frequency, and a freezing ratio of the video.
5. The method (400) as claimed in claim 4, wherein the one or more performance parameters satisfy the threshold criteria when at least one of:
the buffering duration is less than a specific duration;
the rebuffering frequency is less than a specific frequency; or
the freezing ratio is less than a specific ratio.
6. A system (130) for recommending a video resolution to a user, the system (130) comprising:
an acquisition module (210) configured to receive, from a user device (110), an input including a Uniform Resource Locator (URL) of a video;
a streaming module (212) configured to control the user device to stream the video at a first resolution level in response to the input;
a processing module (214) configured to transmit a control signal to set a bitrate for streaming the video based on a plurality of network parameters during the streaming of the video, wherein:
the acquisition module (210) is further configured to receive one or more performance parameters of the video, wherein the one or more performance parameters of the video are detected by the user device (110) during the streaming of the video at the set bitrate; and
the processing module (214) is further configured to reiteratively control an adjustment of a resolution of the video to multiple resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criteria;
an identification module (216) configured to identify, as the video resolution, an immediate resolution level lower than a previously adjusted resolution level among the multiple resolution levels; and
a recommendation module (218) configured to recommend to the user the identified video resolution to stream the video.
7. The system (130) as claimed in claim 6, wherein the acquisition module (210) is configured to receive the input to perform one or more video tests on the video to identify the video resolution supported by a network.
8. The system (130) as claimed in claim 6, wherein the plurality of network parameters includes at least one of a bandwidth, a latency, and a packet loss during the streaming of the video.
9. The system (130) as claimed in claim 6, wherein the one or more performance parameters includes at least one of a buffering duration, a rebuffering frequency, and a freezing ratio of the video.
10. The system (130) as claimed in claim 9, wherein the one or more performance parameters satisfy the threshold criteria when at least one of:
the buffering duration is less than a specific duration;
the rebuffering frequency is less than a specific frequency; or
the freezing ratio is less than a specific ratio.
11. A user device (110), comprising:
a transceiver module (110-5) configured to:
receive, via a User Interface (UI) (110-1), an input including a Uniform Resource Locator (URL) of a video for performing one or more video tests;
receive, from a server, a first control signal to stream the video at a first resolution level; and
receive, from the server, a second control signal to set a bitrate for streaming the video at the first resolution level; and
a computation module (110-6) configured to calculate one or more performance parameters of the video during the streaming of the video at the set bitrate,
wherein the transceiver module (110-5) is further configured to:
reiteratively transmit the one or more performance parameters to the server and receive a control signal from the server to adjust a resolution of the video to one or more second resolution levels higher than the first resolution level until the one or more performance parameters of the video violate a threshold criteria; and
receive, from the server, a recommendation to stream the video at a specific resolution, wherein the specific resolution is an immediate resolution level lower than a previously adjusted resolution level among the one or more second resolution levels.
12. The user device as claimed in claim 11, wherein the computation module (110-6) is further configured to calculate, as the one or more performance parameters, a frequency of freezing of the video, a duration of the freezing of the video, and a freezing ratio during the streaming of the video.